Collecting Requirements-Based Testing Metrics Using Continuous Integration
Requirements-based testing metrics allow you to assess the status and quality of your requirements-based testing activities. You can visualize the results by using the Model Testing Dashboard and integrate metric collection by using continuous integration workflows. Continuous collection of these metrics helps you to monitor the progression and quality of a project. This example uses GitLab® to host the project source and Jenkins® to build and test the project as well as archive the results.
This example relies on:
MATLAB® projects
The Simulink® Test™ Manager test harness
Set Up the Project in Source Control
Create a GitLab project to host your MATLAB project under source control. For more information, see https://docs.gitlab.com/ee/index.html.
Install the Git Client.
Set up a branching workflow. Using GitLab, from the main branch, create a temporary branch for implementing changes to the model files. Integration engineers can use Jenkins test results to decide whether to merge a temporary branch into the main branch. For more information, see https://git-scm.com/book/en/v2/Git-Branching-Branching-Workflows.
Under Settings > Repository, protect the main branch by enforcing the use of merge requests when developers want to merge their changes into the main branch.
Under Settings > Integrations, add a webhook to the URL of your Jenkins project. The webhook triggers a build job on Jenkins.
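The branching workflow above can be sketched in a throwaway local repository. The repository contents, file names, and branch names below are illustrative, not part of the example project:

```shell
set -e
# Throwaway local repo demonstrating the temporary-branch workflow:
# commit to main, then branch off for a model change.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "you@example.com"
git config user.name "Developer"
echo "model v1" > cruise_control.slx
git add cruise_control.slx
git commit -qm "Add model"
git branch -M main
# Create a temporary branch for the model change
git checkout -qb update-model
echo "model v2" > cruise_control.slx
git commit -qam "Update model"
# One commit now awaits review and merge into main
git log --oneline main..update-model
```

In the real workflow, pushing the temporary branch to GitLab and opening a merge request triggers the Jenkins build through the webhook, and the integration engineer uses the test results to decide whether to merge.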
Add the Project
This example uses the example project for the dashboard. To create a working copy of the project, at the command line, enter:
Using Git™, add all of the project files, along with the script files attached to this example, to the main branch. The attached scripts run the tests and collect the metrics.
Derived Artifact Filtering
Collecting metrics generates files that you typically do not want checked into source control. Git allows you to ignore files by adding filters to a text file named .gitignore located in the root directory of the repository. You can add the sample .gitignore file attached to this example, which filters out the generated files that do not need to be added to source control. For more information on .gitignore files, see https://git-scm.com/docs/gitignore.
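As a rough sketch, a .gitignore for this workflow might contain entries such as these. The derived directory is where the collection script in this example writes its metric results; slprj is the standard Simulink cache folder. The attached sample file may differ:

```
# Generated metric results (recreated by the collection scripts)
derived/
# Simulink cache and code generation folders
slprj/
```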
Set Up the Project in the Continuous Integration Tool
The continuous integration tool automates the building and testing of the project. Many different tools can be used to automatically generate requirements-based testing results by following the same general steps. In this example, use Jenkins as the automation tool. To run the example, you must install the GitLab and MATLAB plugins for Jenkins.
Creating the Project
The CI tool needs access to the source control repository of the project so that it can detect changes and fetch the project to build. Jenkins provides a Freestyle project, a generic project template that works with any source control management (SCM) system. In the Freestyle project, add the source control information so that Jenkins can access the hosted project.
Click New Item, fill in the name, and choose Freestyle project. Or, for an existing Freestyle Jenkins project, click Configure.
Click the Source Code Management tab and specify the URL of your GitLab repository in the Repository URL field.
Click the Build Triggers tab and select Build when a change is pushed to GitLab.
Click the Build Environment tab, select Use MATLAB version, and provide the MATLAB root to specify the MATLAB version for the build.
Building and Testing
The MATLAB plugin for Jenkins lets you specify MATLAB commands and configure testing directly, without using the operating system command line. In this example, a single build step opens the project, initializes the metrics infrastructure, runs the tests, and collects the results.
In the Build section, select Add build step > Run MATLAB Command. In the Command field, enter:
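As a sketch, the Command field can contain the same sequence used by the -batch command-line invocation at the end of this example. Here, runTests and collectModelTestingResults are the script and function provided with this example's attached files:

```matlab
% Open the project, initialize the model testing results, run the tests,
% and collect the metric results (same sequence as the command-line
% invocation shown at the end of this example)
openProject(pwd);
collectModelTestingResults();
runTests();
collectModelTestingResults();
```

Calling collectModelTestingResults before the tests initializes the metric infrastructure; calling it again afterward collects the metric data from the new test results.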
Archiving and Consuming Metric Results
Metric results can be archived during the build step and then reimported into MATLAB when you want to review them. In this example, the result collection script stores the metric data in the derived directory. Because some of the metrics rely on exported Simulink Test results, include the exported .mldatx files in the archive.
To archive results for later review, configure the CI system to export these files:
All files located in the derived directory
All test results exported to .mldatx files
For this example, use the Jenkins provided post-build action to archive artifacts produced during the build.
Click the Post-build Actions tab and click Add post-build action. Choose Archive the artifacts. Enter the path to the derived directory to archive all files saved to that directory.
Click the Save button to save and close the configuration.
Running a Build Job in Jenkins
Jenkins is now configured to execute a new build job each time new changes to the project have been committed to the GitLab repository. You can also manually run a build by clicking on Build Now in the Jenkins project page.
Reviewing the Archived Results in MATLAB
Jenkins stores the files generated and archived for each successful build. You can view these files individually or download them together in a single ZIP file. To view the results in MATLAB:
Get the version of the project that was used to generate the results from source control.
Get the archived metric results from the archived location.
Download and copy or extract the derived directory and all of its files into the root directory of the project.
Download the archived exported Simulink Test result files and copy or extract these files into the project.
Open the project in MATLAB and open the Model Testing Dashboard. The dashboard displays the results generated from the CI build.
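As a sketch, the review step can also be done programmatically. The project path below is a placeholder, and modelTestingDashboard requires Simulink Check:

```matlab
% Open the local copy of the project that matches the tested revision,
% then open the Model Testing Dashboard, which loads the metric results
% from the derived directory. The path below is a placeholder.
openProject("C:\work\dashboardProjectCopy");
modelTestingDashboard
```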
Alternative CI Integration Using Command Line
If you use a different automation tool, you can alternatively integrate testing through the command line. Run the tests and collect the metrics by passing the appropriate commands to MATLAB at the operating system command line, using the -batch startup option.
For example, when you use this command, MATLAB opens the project, initializes the model testing results, runs all of the tests, collects the model metrics, and then shuts down.
matlab -c %LICENSE_PATH% -nosplash -logfile output.log -batch "openProject(pwd);collectModelTestingResults();runTests();collectModelTestingResults(); exit;"