
metric.Result

Metric data for specified metric algorithm

Description

A metric.Result object contains the metric data for a specified metric algorithm that traces to the specified unit or component.

Creation

Description


metric_result = metric.Result creates a handle to a metric.Result object.

Alternatively, if you collect results by executing a metric.Engine object, using the getMetrics function on the engine object returns the collected metric.Result objects in an array.
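For instance, the typical end-to-end workflow looks like this (a minimal sketch; 'RequirementsPerTestCase' stands in for any supported metric identifier, and a project must already be open):

```matlab
% Create a metric engine for the currently open project.
metric_engine = metric.Engine();

% Collect results for one metric, then retrieve them.
% 'RequirementsPerTestCase' is just an example identifier.
execute(metric_engine,'RequirementsPerTestCase');
results = getMetrics(metric_engine,'RequirementsPerTestCase');

% results is an array of metric.Result objects.
disp(class(results))
```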

Properties


MetricID

Metric identifier for the metric algorithm that calculated the results, returned as a string.

Example: 'TestCasesPerRequirementDistribution'

Artifacts

Project artifacts for which the metric is calculated, returned as a structure or an array of structures. For each artifact that the metric analyzed, the returned structure contains these fields:

  • UUID — Unique identifier of the artifact.

  • Name — Name of the artifact.

  • Type — Type of artifact.

  • ParentUUID — Unique identifier of the file that contains the artifact.

  • ParentName — Name of the file that contains the artifact.

  • ParentType — Type of file that contains the artifact.
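For example, assuming results is an array of metric.Result objects returned by getMetrics, you can read these fields directly (a sketch; the values depend on your project):

```matlab
% Display the first analyzed artifact and the file that contains it.
art = results(1).Artifacts(1);
fprintf('%s (%s) in %s (%s)\n', art.Name, art.Type, ...
    art.ParentName, art.ParentType)
```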

Value

Value of the metric result for the specified algorithm and artifacts, returned as an integer, string, double vector, or structure. For a list of model testing metrics and their result values, see Model Testing Metrics.
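Because the class of Value varies by metric, code that processes results generically may need to branch on the returned type. A minimal sketch, assuming results holds metric.Result objects returned by getMetrics:

```matlab
val = results(1).Value;
if isstruct(val)
    disp(fieldnames(val))      % distribution-style results
elseif isnumeric(val)
    disp(num2str(val(:).'))    % scalar or vector results
else
    disp(val)                  % string results
end
```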

Scope

Scope of the metric results, returned as a structure. The scope is the unit or component for which the metric collected results. The structure contains these fields:

  • UUID — Unique identifier of the unit or component.

  • Name — Name of the unit or component.

  • Type — Type of unit or component.

  • ParentUUID — Unique identifier of the file that contains the unit or component.

  • ParentName — Name of the file that contains the unit or component.

  • ParentType — Type of file that contains the unit or component.
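For example, to list the unit or component that each result traces to (assuming results is an array of metric.Result objects returned by getMetrics):

```matlab
for n = 1:length(results)
    fprintf('%s (%s)\n', results(n).Scope.Name, results(n).Scope.Type)
end
```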

UserData

User data provided by the metric algorithm, returned as a string.

Examples


Collect Metric Data on Design Artifacts

Use a metric.Engine object to collect metric data on the design artifacts in a project.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Collect results for the metric "slcomp.OverallCyclomaticComplexity" by executing the metric engine. For more information on the metric, see Model Maintainability Metrics.

execute(metric_engine,'slcomp.OverallCyclomaticComplexity');

Use the function getMetrics to access the results. Assign the array of result objects to the results variable.

results = getMetrics(metric_engine,'slcomp.OverallCyclomaticComplexity');

Access the metric results data by using the properties of the metric.Result objects in the results array.

for n = 1:length(results)
    disp(['Model: ',results(n).Scope.Name])
    disp(['  Overall Design Cyclomatic Complexity: ',num2str(results(n).Value)])
end

Model: db_Controller
  Overall Design Cyclomatic Complexity: 1
Model: db_LightControl
  Overall Design Cyclomatic Complexity: 4
Model: db_ThrottleController
  Overall Design Cyclomatic Complexity: 4
Model: db_ControlMode
  Overall Design Cyclomatic Complexity: 22
Model: db_DriverSwRequest
  Overall Design Cyclomatic Complexity: 9

For more information on how to collect metrics for design artifacts, see Collect Model Maintainability Metrics Programmatically.

Collect Metric Data on Requirements-Based Testing Artifacts

Collect metric data on the requirements-based testing artifacts in a project. Then, access the data by using the metric.Result objects.

Open the project. At the command line, type dashboardCCProjectStart.

dashboardCCProjectStart

Create a metric.Engine object for the project.

metric_engine = metric.Engine();

Update the trace information for metric_engine to ensure that the artifact information is up to date.

updateArtifacts(metric_engine)

Collect results for the metric 'RequirementsPerTestCase' by using the execute function on the metric.Engine object.

execute(metric_engine,'RequirementsPerTestCase');

Use the function getMetrics to access the results. Assign the array of result objects to the results variable.

results = getMetrics(metric_engine,'RequirementsPerTestCase');

Access the metric results data by using the properties of the metric.Result objects in the array.

for n = 1:length(results)
    disp(['Test Case: ',results(n).Artifacts(1).Name])
    disp(['  Number of Requirements: ',num2str(results(n).Value)])
end

Version History

Introduced in R2020b