


EclipseMetrics

This page describes the metrics used in the Eclipse quality model.

The complete list of metrics can be seen on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html

About metrics

Before diving into software metrics, one may wish to have a look at some important papers about measurement:

  • Fenton's now famous column: Software measurement: A necessary scientific basis[1].
  • Kaner's paper about software metrics: Software engineering metrics: What do they measure and how do we know?[2].
  • Basili's Goal Question Metric approach[3], further enhanced by Linda Westfall's 12 steps[4].


Identifying repositories

We assume here that a software project has at least the following data repositories available:

  • An SCM (Software Configuration Management) tool. This gives access to source code at specific dates and also provides useful meta-information, such as authors or the date and number of commits.
  • Some communication channel, both for developers and users (the two communities identified in our Quality Requirements). Projects may use forums (web), news (nntp), or mailing lists.
  • Process-related information from the Eclipse Foundation infrastructure.
  • Tracker for bugs and change requests. In some cases, the project tracker is used to manage requirements.
  • The web site, wiki, and articles about the project constitute meaningful published information, revealing how the project communicates with the outside world.


Note: The automotive working group has some similar needs for their tool qualification task force.


Source code metrics

Code metrics can be computed at different levels: SLOC can be summed up at the highest level (application), while Halstead's measures make sense at the function level only. Because of this distinction, file-level and function-level metrics are considered at their respective levels and computed as indexes.
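As a sketch of this distinction (using made-up file and function names and values, not actual dashboard data), an additive metric like SLOC can simply be summed at the application level, while a function-level metric like Halstead volume is combined into an index:

```python
# Sketch: aggregating metrics at their respective levels.
# All names and values below are hypothetical sample data.

# File-level metric: SLOC is additive, so the application-level value
# is the sum over all files.
sloc_per_file = {"Parser.java": 420, "Lexer.java": 310, "Main.java": 95}
app_sloc = sum(sloc_per_file.values())

# Function-level metric: Halstead volume only makes sense per function,
# so at the file level we compute an index (here, the mean) instead of a sum.
halstead_volume = {"parse": 812.5, "peek": 44.2, "advance": 61.8}
halstead_index = sum(halstead_volume.values()) / len(halstead_volume)

print(app_sloc)                   # 825
print(round(halstead_index, 1))   # 306.2
```

Summing Halstead volumes across functions would reward large files, which is why an index (mean, median, or a distribution-based score) is used instead.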

See the complete list of code metrics extracted from SonarQube on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html#repo_SonarQube .


Rule-based metrics

Rules are anti-patterns that threaten one or more characteristics of the code (maintainability, stability, changeability...). From a quality-improvement standpoint, they can be interpreted as coding practices and show how conventional good practices are implemented in the project.

We rely on a set of well-known rule-checking tools: FindBugs and PMD. Each rule is tagged with one or more quality attributes, and the violations are counted on the artefacts.
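A minimal sketch of this counting, assuming hypothetical rule names, tags, and violations (not actual FindBugs or PMD identifiers):

```python
# Sketch: counting rule violations per quality attribute.
# Rules, tags and violations below are made-up examples.
from collections import Counter

# Each rule is tagged with one or more quality attributes.
rule_tags = {
    "EmptyCatchBlock": ["maintainability", "stability"],
    "UnusedLocalVariable": ["maintainability"],
    "EqualsHashCode": ["stability"],
}

# Violations reported on artefacts, as (artefact, rule) pairs.
violations = [
    ("Parser.java", "EmptyCatchBlock"),
    ("Parser.java", "UnusedLocalVariable"),
    ("Lexer.java", "EqualsHashCode"),
]

per_attribute = Counter()
for _artefact, rule in violations:
    for attribute in rule_tags[rule]:
        per_attribute[attribute] += 1

print(dict(per_attribute))  # {'maintainability': 2, 'stability': 2}
```

Because a rule can carry several tags, one violation may contribute to several quality attributes, as EmptyCatchBlock does above.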

See the complete list of rule-checking metrics on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html#repo_RuleChecking .


Communication metrics

Communication metrics target the project's communication media: either a mailing list or a forum (Eclipse's forums run on FUDforum).

In our case these metrics are retrieved from the Eclipse dashboard, which uses Grimoire.

We assume there is at least one communication channel for developers and another one for users (often referred to as the dev and user mailing lists). Most Eclipse projects use a mailing list for developers and online forums for users. All mailing list metrics are prefixed with MLS_DEV (for the developer mailing list) or MLS_USR (for the user mailing list).

See the complete list of metrics retrieved from Grimoire on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html#repo_Grimoire .


Configuration Management metrics

Configuration management metrics are prefixed with SCM_. They are retrieved from the Eclipse dashboard, which uses Grimoire.

Metrics can be retrieved either from a Subversion (svn log -v --xml) or a Git (git log --stat=4096) repository. It should be noted that metrics may not have the same ranges or values across tools: as an example, the number of commits in Git does not have exactly the same meaning as in Subversion, because of the branching philosophy often used with the former. Similarly, the identification of fix-related commits may be affected by the commit conventions used in the project.

See the complete list of metrics retrieved from Grimoire on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html#repo_Grimoire .


Issues Tracking System metrics

ITS metrics are prefixed with ITS_. They are retrieved from the Eclipse dashboard, which uses Grimoire.

See the complete list of metrics retrieved from Grimoire on the dashboard documentation page: http://dashboard.castalia.camp/documentation/metrics.html#repo_Grimoire .


Process metrics

The Eclipse Foundation is in the process of setting up a repository for process-related information[5].

Wayne Beaton is in charge of this initiative and some results are already available. It should be noted, however, that it is still in development and things may change (e.g. the information scope, the format used, or the way to retrieve information). The JSON file for a project can be downloaded from the following URL: http://projects.eclipse.org/json/project/<project.name>

Let's take the Papyrus project as an example: its JSON file can be downloaded from http://projects.eclipse.org/json/project/modeling.mdt.papyrus
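A minimal sketch of building that URL and reading the result; the payload below is a made-up sample rather than the actual PMI schema, which may differ:

```python
# Sketch: building the PMI URL for a project and reading its JSON file.
# The payload is a hypothetical minimal sample; the real file would be
# fetched over HTTP from projects.eclipse.org.
import json

def project_json_url(project_name):
    # URL pattern given in the text: /json/project/<project.name>
    return "http://projects.eclipse.org/json/project/" + project_name

url = project_json_url("modeling.mdt.papyrus")
print(url)

# Parsing a (hypothetical) downloaded payload:
payload = '{"projects": {"modeling.mdt.papyrus": {"title": "Papyrus"}}}'
data = json.loads(payload)
print(data["projects"]["modeling.mdt.papyrus"]["title"])  # Papyrus
```

Since the PMI is still in development, any consumer of this file should tolerate missing keys and format changes.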

Intellectual Property is of primary importance to the Eclipse Foundation. As such, all contributions should have an IP agreement registered. IP logs can be fetched in XML format from the following URL: http://www.eclipse.org/projects/xml/project_ip_data.php?id=<project.name> . For Papyrus, the IP log can be retrieved from http://www.eclipse.org/projects/xml/project_ip_data.php?id=modeling.mdt.papyrus
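A similar sketch for the IP log; the XML element and attribute names below (iplog, cq, state) are assumptions for illustration, not the real schema:

```python
# Sketch: building the IP-log URL and parsing a (hypothetical) response.
# Element and attribute names in the sample are made up for illustration.
import xml.etree.ElementTree as ET

def ip_log_url(project_id):
    # URL pattern given in the text: project_ip_data.php?id=<project.name>
    return ("http://www.eclipse.org/projects/xml/project_ip_data.php?id="
            + project_id)

print(ip_log_url("modeling.mdt.papyrus"))

# Parsing a made-up downloaded document:
sample = ("<iplog>"
          "<cq id='4242' state='approved'/>"
          "<cq id='4243' state='pending'/>"
          "</iplog>")
root = ET.fromstring(sample)
approved = [cq.get("id") for cq in root.findall("cq")
            if cq.get("state") == "approved"]
print(approved)  # ['4242']
```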

See the complete list of metrics retrieved from the PMI on the dashboard documentation: http://dashboard.castalia.camp/documentation/metrics.html#repo_PMI


References

  1. Paper available at http://www.researchgate.net/publication/3187630_Software_measurement_a_necessary_scientific_basis/file/5046351b05e6b37ee6.pdf
  2. The full document is available at http://testingeducation.org/a/metrics2004.pdf
  3. For more information on Basili's Goal Question Metric approach, see http://en.wikipedia.org/wiki/GQM.
  4. For more information on Linda Westfall's 12 steps, see the full paper at http://floors-outlet.com/specs/spec-t-1-20111117171200.pdf or read a short summary at http://maisqual.squoring.com/wiki/index.php/12_Steps_to_Useful_Software_Metrics
  5. See http://projects.eclipse.org
