You can use Simulink® Test™ to author, manage, and execute tests for Simulink models and generated code. You can author tests from scratch, import existing test data and harness models, and organize tests using the Test Manager. You can execute tests in model, software-in-the-loop (SIL), processor-in-the-loop (PIL), and hardware-in-the-loop (HIL) modes, control parameters, and iterate over parametric values. You can run test cases individually, in batch, or as a filtered subset of the test file. You can also run the same tests back-to-back in multiple releases of MATLAB®.
Results include a concise pass/fail summary for elements in your test hierarchy, including iterations, test cases, test suites, and the test file. Visualization tools help you drill down into individual data sets to determine, for example, the time and cause of a particular failure. Coverage results from Simulink Coverage™ help quantify the extent to which your model or code is tested.
For example, you can:
Compare results between your model and generated code by running back-to-back equivalence tests between different environments, such as model simulation, SIL, PIL, and HIL execution.
Optimize your model or code by iterating over parametric values or configuration parameters.
Start testing on a unit level by using test harnesses, and reuse those tests as you scale up to the integration and system level.
Run models that contain test vectors and assessments inside the Simulink block diagram.
Simulink Test includes a comprehensive programmatic interface for writing test scripts, and you can integrate Simulink Test files with MATLAB tests using the MATLAB Unit Test framework.
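As a minimal sketch of the script-based workflow, the following uses functions from the Simulink Test programmatic interface to load, run, and close a test session. The test file name is hypothetical; adapt it to your project.

```matlab
% Sketch of a scripted test run (file name 'myTests.mldatx' is hypothetical).

% Load a test file into the Test Manager.
tf = sltest.testmanager.load('myTests.mldatx');

% Run all open test files and capture the results object.
results = sltest.testmanager.run;

% Clear and close the Test Manager session when done.
sltest.testmanager.clear;
sltest.testmanager.close;
```

The returned results object can be inspected programmatically or exported for reporting; consult the `sltest.testmanager` reference for your release for the exact result-object methods.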
When you author a test, you define test inputs, signals of interest, signal pass/fail tolerances, iterations over parametric values, and assessments for simulation behavior. You can author test input vectors in several ways:
Graphically, such as with the Signal Editor
From data sets, such as Excel® or MAT files
As a sequence of test steps that progresses according to time or logical conditions
You can define assessments to indicate when functional requirements are not met. These assessments follow from your design requirements or your test plan. You can define assessments in several ways:
With a structured assessment language. The structured language helps you assess complex timing behavior, such as two events that must happen within a certain time frame. It also helps you identify conflicts between requirements.
Use verify statements in a Test Assessment or Test Sequence block. For information on how to set up the blocks in your model, see Assess Model Simulation Using verify Statements.
With blocks in the Model Verification block library.
With tolerances you set on the simulation data output. Tolerances define the acceptable delta from baseline data or another simulation.
With a custom criteria script that you author using MATLAB.
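As a hedged illustration of the last option, a custom criteria script runs after simulation and can apply MATLAB Unit Test qualifications to logged data. In the sketch below, the signal name, expected limit, and message are hypothetical; the `sltest_simout` and `test` variables are the ones the Test Manager provides to custom criteria scripts.

```matlab
% Custom criteria sketch (signal name 'speed' and the limit are hypothetical).
% The Test Manager supplies sltest_simout (the simulation output) and
% test (a qualification object) to the custom criteria script.

% Extract the final value of a logged signal from the simulation output.
lastValue = sltest_simout.get('speed').Values.Data(end);

% Apply a pass/fail qualification against the requirement limit.
test.verifyLessThanOrEqual(lastValue, 100, ...
    'Final speed must not exceed the 100 km/h limit');
```

A failing qualification marks the test case as failed in the Test Manager, alongside any tolerance or verify-statement results.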
You can use existing test data and test models with Simulink Test. For example, if you have data from field testing, you can test your model or code by mapping the data to your test case. If you have existing test models that use Model Verification blocks, you can organize those tests and manage results in the Test Manager.
Using Simulink Design Verifier™, you can generate test cases that achieve test objectives or increase model or code coverage. You can generate test cases from the Test Manager, or from the Simulink Design Verifier interface. Either way, you can include the generated test cases with your original tests to create a test file that achieves complete coverage. You can also link the new test cases to additional requirements.
You can control test execution modes from the Test Manager. For example, you can:
Run tests in multiple releases of MATLAB. Multiple-release testing lets you use recent test data while executing your model in its production version.
Run back-to-back tests to verify generated code. You can run the same test in model, SIL, and PIL mode and compare numeric results to demonstrate code-model equivalence.
Run HIL tests to verify systems running on real-time hardware, using verify statements in your model that help you determine whether functional requirements are met.
Decrease test time by running tests in parallel using Parallel Computing Toolbox™, or by running a filtered subset of your test file.
When reporting your test results, you can set report properties that match your development environments. For example, reporting can depend on whether tests passed or failed, and reports can include data plots, coverage results, and requirements linked to your test cases. You can create and store custom MATLAB figures that render with a report. Reporting options persist with your test file, so they run every time you execute a test.
A MATLAB Report Generator™ license provides additional customization options, including:
Creating reports from a Microsoft® Word or PDF template
Assembling reports using custom objects that aggregate individual results
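To show how report properties can be set programmatically, the sketch below passes report options to `sltest.testmanager.report`. The file names are hypothetical, and the option names shown are assumptions to verify against the documentation for your release.

```matlab
% Reporting sketch (file names are hypothetical; confirm option names
% against your release's sltest.testmanager.report documentation).

% Import a saved results file into the Test Manager.
results = sltest.testmanager.importResults('myResults.mldatx');

% Generate a PDF report with signal plots and coverage results.
sltest.testmanager.report(results, 'testReport.pdf', ...
    'Author', 'Test Team', ...                 % assumption: option name
    'IncludeSimulationSignalPlots', true, ...  % assumption: option name
    'IncludeCoverageResult', true);            % assumption: option name
```

With a MATLAB Report Generator license, the same results can instead be assembled through custom report objects or a Word or PDF template, as described above.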