Collect and Explore Metric Data by Using the Metrics Dashboard

The Metrics Dashboard collects and integrates quality metric data from multiple Model-Based Design tools to provide you with an assessment of your project quality status. To open the dashboard:

  • From a model editor window, select Analysis > Metrics Dashboard.

  • At the command line, enter metricsdashboard(system). The system can be either a model name or a block path to a subsystem. The system cannot be a Configurable Subsystem block.

You can collect metric data by using the dashboard or programmatically by using the slmetric.Engine API. When you open the dashboard, if you have previously collected metric data for a particular model, the dashboard populates from existing data in the database.
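As a sketch, a programmatic collection session with the slmetric.Engine API might look like the following. The model name sldemo_mdlref_basic is only an example shipped with Simulink; substitute your own model name or subsystem block path.

```matlab
% Example only: collect metric data programmatically with slmetric.Engine.
% 'sldemo_mdlref_basic' is a shipped example model; use your own model here.
metric_engine = slmetric.Engine();

% Set the analysis root; this can be a model name or a block path
% to a subsystem (but not a Configurable Subsystem block).
setAnalysisRoot(metric_engine, 'Root', 'sldemo_mdlref_basic');

% Collect all metrics for the analysis root.
execute(metric_engine);

% Retrieve the results for one metric by its identifier.
res_col = getMetrics(metric_engine, 'mathworks.metrics.SimulinkBlockCount');
```

When you later open the Metrics Dashboard for the same model, it populates from the data this collection stored in the database.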

If you want to use the dashboard to collect (or recollect) metric data, in the toolbar:

  • Use the Options menu to specify whether to include model references and libraries in the data collection.

  • Click All Metrics. If you do not want to collect metrics that require compiling the model, click Non-Compile Metrics.

The Metrics Dashboard provides the system name and a data collection timestamp. If there were issues during data collection, click the alert icon to see warnings.

The dashboard contains widgets that provide visualization of metric data in these categories: size, modeling guideline compliance, and architecture. To explore the data in more detail, click an individual metric widget. For your selected metric, a table displays the value, aggregated value, and measures (if applicable) at the model component level. From the table, the dashboard provides traceability and hyperlinks to the data source so that you can get detailed results and recommended actions for troubleshooting issues. When exploring drill-in data, note that:

  • The Metrics Dashboard calculates metric data per component. A component can be a model, subsystem, chart, or MATLAB Function block.

  • To sort the results by value or aggregated value, click the corresponding value column header.

  • The metric data that is collected quantifies the overall system, including instances of the same model. For aggregated values, the metric engine aggregates data from each instance of a model in the referencing hierarchy. For example, if the same model is referenced twice in the system hierarchy, its block count contributes twice to the overall system block count.

  • If a subsystem, chart, or MATLAB Function block uses a parameter or is flagged for an issue, then the parameter count or issue count is increased for the parent component.

  • The Metrics Dashboard analyzes all variants.
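The value and aggregated value that the drill-in table displays can also be read programmatically. A minimal sketch, assuming the example model sldemo_mdlref_basic and the block count metric:

```matlab
% Example only: inspect per-component Value and AggregatedValue.
metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root', 'sldemo_mdlref_basic');
execute(metric_engine, 'mathworks.metrics.SimulinkBlockCount');

res_col = getMetrics(metric_engine, 'mathworks.metrics.SimulinkBlockCount');
for n = 1:length(res_col)
    results = res_col(n).Results;
    for m = 1:length(results)
        % Value counts blocks in this component alone.
        % AggregatedValue also includes the component's children and, for
        % referenced models, every instance in the referencing hierarchy.
        fprintf('%s: Value = %d, AggregatedValue = %d\n', ...
            results(m).ComponentPath, results(m).Value, ...
            results(m).AggregatedValue);
    end
end
```

Because aggregation counts every instance, a model referenced twice in the hierarchy contributes its block count twice to the aggregated value, as described above.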

Size

This table lists the Metrics Dashboard widgets that provide an overall picture of the size of your system. When you drill into a widget, this table also lists the detailed information available.

  • Blocks
    Metric: Simulink block count (mathworks.metrics.SimulinkBlockCount)
    Drill-in data: Number of blocks by component

  • Models
    Metric: Model file count (mathworks.metrics.ModelFileCount)
    Drill-in data: Number of model files by component

  • Files
    Metric: File count (mathworks.metrics.FileCount)
    Drill-in data: Number of model and library files by component

  • MATLAB LOC
    Metric: Effective lines of MATLAB code (mathworks.metrics.MatlabLOCCount)
    Drill-in data: Effective lines of code, in MATLAB Function blocks and in MATLAB functions in Stateflow, by component

  • Stateflow LOC
    Metric: Effective lines of code for Stateflow blocks (mathworks.metrics.StateflowLOCCount)
    Drill-in data: Effective lines of code for Stateflow blocks by component

  • System Interface
    Metrics: Input and Output count (mathworks.metrics.ExplicitIOCount); Parameter count (mathworks.metrics.ParameterCount)
    Drill-in data: Number of inputs and outputs by component (includes trigger ports); number of parameters by component

Modeling Guideline Compliance

For this particular system, the model compliance widgets indicate the level of compliance with industry standards and guidelines. This table lists the Metrics Dashboard widgets related to modeling guideline compliance and the detailed information available when you drill into the widget.

  • High Integrity Compliance
    Metric: Model Advisor standards check compliance - High Integrity (mathworks.metrics.ModelAdvisorCheckCompliance.hisl_do178)
    Drill-in data: For each component, the percentage of checks passed and the status of each check. Integrates with the Model Advisor for more detailed results.

  • MAAB Compliance
    Metric: Model Advisor standards check compliance - MAAB (mathworks.metrics.ModelAdvisorCheckCompliance.maab)
    Drill-in data: For each component, the percentage of checks passed and the status of each check. Integrates with the Model Advisor for more detailed results.

  • High Integrity Check Issues
    Metric: Model Advisor standards issues - High Integrity (mathworks.metrics.ModelAdvisorCheckIssues.hisl_do178)
    Drill-in data: Number of compliance check issues by component (see the Note below). Components without issues or aggregated issues are not listed.

  • MAAB Check Issues
    Metric: Model Advisor standards issues - MAAB (mathworks.metrics.ModelAdvisorCheckIssues.maab)
    Drill-in data: Number of compliance check issues by component (see the Note below). Components without issues or aggregated issues are not listed.

  • Code Analyzer Warnings
    Metric: Warnings from MATLAB Code Analyzer (mathworks.metrics.MatlabCodeAnalyzerWarnings)
    Drill-in data: Code Analyzer warnings by component

  • Diagnostic Warnings
    Metric: Simulink diagnostic warning count (mathworks.metrics.DiagnosticWarningsCount)
    Drill-in data: Simulink diagnostic warnings by component

Note

An issue with a compliance check that analyzes configuration parameters adds to the issue count for the model that fails the check.
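The compliance metrics in this table can also be collected by their identifiers. A minimal sketch, assuming the example model sldemo_mdlref_basic:

```matlab
% Example only: collect the MAAB compliance metric programmatically.
metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root', 'sldemo_mdlref_basic');

% Note: compliance metrics run Model Advisor checks, which closes any
% open Model Advisor session for the model.
execute(metric_engine, 'mathworks.metrics.ModelAdvisorCheckCompliance.maab');

res_col = getMetrics(metric_engine, ...
    'mathworks.metrics.ModelAdvisorCheckCompliance.maab');
```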

Architecture

These widgets provide a view of your system architecture:

  • The Library Reuse widget is a percentage scale that shows the percentage of subsystems that are candidates for reuse. Orange indicates potential reuse. Green indicates actual reuse.

  • The other system architecture widgets use a value scale. For each value range for a metric, a colored bar indicates the number of components that fall within that range. Darker colors indicate more components.

This table lists the Metrics Dashboard widgets related to architecture and the detailed information available when you select the widget.

  • Library Reuse
    Metrics (directly): Clone detection (mathworks.metrics.CloneDetection); Library content (mathworks.metrics.LibraryContent)
    Metrics (indirectly, for percentage values): MATLAB Function count (mathworks.metrics.MatlabFunctionCount); Chart count (mathworks.metrics.StateflowChartCount); Subsystem count (mathworks.metrics.SubSystemCount)
    Drill-in data: Number of clones per component, broken down into clone patterns; number of components involved in a library, excluding clones. Click the Open Conversion Tool button to open the Identifying Modeling Clones tool for more detailed results.

  • Model Complexity
    Metric: Cyclomatic complexity (mathworks.metrics.CyclomaticComplexity)
    Drill-in data: Model complexity by component

  • Blocks
    Metric: Simulink block count (mathworks.metrics.SimulinkBlockCount)
    Drill-in data: Number of blocks by component

  • Stateflow LOC
    Metric: Effective lines of code for Stateflow blocks (mathworks.metrics.StateflowLOCCount)
    Drill-in data: Effective lines of code for Stateflow blocks by component

  • MATLAB LOC
    Metric: Effective lines of MATLAB code (mathworks.metrics.MatlabLOCCount)
    Drill-in data: Effective lines of code, in MATLAB Function blocks and in MATLAB functions in Stateflow, by component
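The architecture metrics in this table can be collected together by passing a list of metric identifiers to the engine. A sketch, again assuming the example model sldemo_mdlref_basic:

```matlab
% Example only: collect two architecture metrics in one pass.
metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root', 'sldemo_mdlref_basic');

% A cell array of metric IDs restricts collection to those metrics.
execute(metric_engine, {'mathworks.metrics.CyclomaticComplexity', ...
                        'mathworks.metrics.CloneDetection'});

% Each returned collection corresponds to one requested metric ID.
res_col = getMetrics(metric_engine, ...
    {'mathworks.metrics.CyclomaticComplexity', ...
     'mathworks.metrics.CloneDetection'});
```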

Dashboard Limitations

When using the Metrics Dashboard, note these considerations:

  • The analysis root for the Metrics Dashboard cannot be a Configurable Subsystem block.

  • The Model Advisor, a tool that the Metrics Dashboard uses for data collection, cannot have more than one open session per model. For this reason, when the dashboard collects data, it closes an existing Model Advisor session.

  • If you use an sl_customization.m file to customize Model Advisor checks, these customizations can change your dashboard results. For example, if you hide Model Advisor checks that the dashboard uses to collect metrics, the dashboard does not collect results for those metrics.

  • When the dashboard collects metrics that require a model compilation, the software changes to a temporary folder. Because of this folder change, relative path dependencies in your model can become invalid.

  • The Metrics Dashboard does not support self-modifying masked library blocks. Analysis of these components might be incomplete.

  • The Metrics Dashboard does not count as issues the MAAB checks that do not apply to blocks, such as checks that warn about font formatting or file names. As a result, the MAAB Check Issues widget might report zero issues while the MAAB Compliance widget still reports failed checks. For more information about these issues, click the MAAB Compliance widget.
