You can use model metrics to assess whether your model and code comply with size, complexity, and readability requirements. As you test your model against requirements, you can use metrics to assess the status and quality of your requirements-based testing activities. You can use the model metric API to create your own model metrics, compute model and testing metrics, and export metric data. To visualize model metric data and compliance status for your model, use the Metrics Dashboard. Use the Model Testing Dashboard to view metric data on the completeness of requirements, test cases, and test results for your model. To get started, see Collect and Explore Metric Data by Using the Metrics Dashboard and Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
|Collect metric data on models or model components|
|Set metadata for custom metrics|
|Metric data for specified model component and metric algorithm|
|Metric data for specified model metric|
| Details about instances of |
|Access metric data threshold results|
|Specify metric data categories and custom metric families|
| Specify metric and |
|Object for holding metric result thresholds|
|Specify categorical metric data ranges|
|Specify metric data threshold values|
|Object containing information on Metrics Dashboard layout and widgets|
| Widget for holding |
|Object for holding custom Metrics Dashboard widgets|
| Widget for holding |
|Create object for holding Metrics Dashboard customizations|
|Object for holding Actual/Potential Reuse, System Interface, or System Info widgets|
|Obtain file path and name of XML file containing active Metrics Dashboard custom configuration|
|Activate custom configuration for metric engine to use|
|Activate custom metric dashboard layout|
|Obtain file path and name of XML file containing active Metrics Dashboard layout|
|Create new metric class for a custom model metric|
|Register a custom model metric with the model metric repository|
|Unregister a custom model metric from the model metric repository|
|Update available model metrics|
Collect and view metric data for quality assessment.
Configure compliance metrics, add metric thresholds, and customize Metrics Dashboard layout.
Use the model metric API to programmatically collect metrics for a model, such as subsystem and block counts.
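That workflow can be sketched as follows, assuming the shipped example model `vdp` (substitute your own model) and the documented block-count metric identifier:

```matlab
% Collect a block-count metric for a model programmatically.
metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root', 'vdp', 'RootType', 'Model');
execute(metric_engine, 'mathworks.metrics.SimulinkBlockCount');
res_col = getMetrics(metric_engine, 'mathworks.metrics.SimulinkBlockCount');

% Print the block count for each analyzed component.
for n = 1:length(res_col)
    for m = 1:length(res_col(n).Results)
        fprintf('%s: %d blocks\n', ...
            res_col(n).Results(m).ComponentPath, ...
            res_col(n).Results(m).Value);
    end
end
```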
Model metrics provided by MathWorks that return metric data on your model for size, complexity, readability, and compliance with standards and guidelines.
Options for defining model metric data aggregation and returning aggregated model metric results.
Create a new model metric by using the slmetric.metric.createNewMetricClass function and defining the metric algorithm.
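A minimal sketch of that sequence, assuming a hypothetical class name; the generated class file must be edited to implement the metric algorithm before registration:

```matlab
% Scaffold a new metric class file (class name is hypothetical).
className = 'myCustomBlockCount';
slmetric.metric.createNewMetricClass(className);

% Edit the generated class file to define the metric algorithm,
% then register the metric so the metric engine can use it:
slmetric.metric.registerMetric(className);
slmetric.metric.refresh();   % update the list of available metrics
```

To remove the metric later, call slmetric.metric.unregisterMetric with the same class name.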
You can use the Metrics Dashboard to enable subsystem reuse by identifying exact graphical clones across a model hierarchy.
Assess model size, complexity, and readability by using the Model Advisor to run model metric checks.
This example shows how to collect model metric data by using the Metrics Dashboard, explore detailed compliance results, and fix compliance issues by using the Model Advisor.
This example shows how to use the model metrics API to collect model metric data for your model, and then explore the results by using the Metrics Dashboard.
Use a continuous integration workflow to investigate whether your model violates metric threshold values.
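A continuous integration job can fail the build when a metric exceeds its threshold. This sketch assumes a hypothetical model name and an example threshold value:

```matlab
% CI-style check: error out if cyclomatic complexity exceeds a threshold.
metric_engine = slmetric.Engine();
setAnalysisRoot(metric_engine, 'Root', 'myModel', 'RootType', 'Model');
execute(metric_engine, 'mathworks.metrics.CyclomaticComplexity');
res_col = getMetrics(metric_engine, 'mathworks.metrics.CyclomaticComplexity');

threshold = 30;   % example threshold value
for n = 1:length(res_col)
    for m = 1:length(res_col(n).Results)
        assert(res_col(n).Results(m).Value <= threshold, ...
            'Complexity threshold exceeded in %s', ...
            res_col(n).Results(m).ComponentPath);
    end
end
```

Because assert raises an error on failure, most CI systems report the job as failed when any component violates the threshold.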
Assess the complexity of your system in model-based design.
Evaluate the status and quality of model testing in your project.
Fix model testing quality issues by using the Model Testing Dashboard.
Model testing metrics provided by MathWorks that return metric data on your model for implementing and testing requirements.
Set up and manage a project that uses the Model Testing Dashboard.
Use a script to assess the quality of your requirements-based testing.
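A minimal sketch of such a script, assuming a project already set up for the Model Testing Dashboard and writing the report to a folder name chosen for illustration:

```matlab
% Collect model testing metrics for the current project and
% generate an HTML report of the results.
metric_engine = metric.Engine();
execute(metric_engine);   % collect the model testing metrics

generateReport(metric_engine, 'Type', 'html-file', ...
    'Location', fullfile(pwd, 'testingReport'));
```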
Use the Model Testing Dashboard to analyze the completeness and quality of requirements-based testing activities in accordance with the ISO 26262 standard.