Introduction to the Test Manager

The Test Manager in Simulink® Test™ helps you to automate Simulink model testing and organize large sets of tests. You perform model tests in the Test Manager using test cases in which you specify the criteria that determine a pass-fail outcome. After you run a test, you can view and share the results.

Start the Test Manager

You can start the Test Manager from a model or from the MATLAB® command prompt.

  • To start the Test Manager from a model, select Analysis > Test Manager.

  • To start the Test Manager from the command prompt, enter: sltestmgr.
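
Both commands below open the Test Manager; sltest.testmanager.view is the programmatic form you can call from scripts and functions. This is a minimal sketch, and the test file name is a placeholder for one of your own.

    % Open the Test Manager from the command prompt
    sltestmgr

    % Equivalent call for use in scripts and functions
    sltest.testmanager.view

    % Load an existing test file into the Test Manager (placeholder name)
    sltest.testmanager.load('myTests.mldatx');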

Create Tests and Understand the Test Hierarchy

In the Test Manager, you create test files, which contain one or more test suites that each contain one or more test cases.

To create a test file, in the Test Manager, select New > Test File. Name the file and click Save.

The test files and their contents appear in the Test Browser pane.

Each new test file contains a test suite, New Test Suite 1, which contains a test case, New Test Case 1. You can rename test suites and test cases in the Test Browser. For example, a single test file can contain two test suites that each contain several test cases.

Add test suites and test cases to the test file hierarchy using the New menu. Use test suites to group related test cases. For each test case, specify details such as the model under test, the simulation outputs to capture, and parameter overrides to apply. Run the test case in the Test Manager, and view results in the Results and Artifacts pane.
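
You can build the same hierarchy programmatically with the sltest.testmanager API. The sketch below is a minimal example; the file, suite, and test case names are placeholders, and myModel stands for your model under test.

    % Create a test file; it starts with a default test suite and test case
    tf = sltest.testmanager.TestFile('myTests.mldatx');

    % Add a test suite and a baseline test case to it
    ts = createTestSuite(tf, 'Regression Suite');
    tc = createTestCase(ts, 'baseline', 'Throttle Response');

    % Specify the model under test
    setProperty(tc, 'Model', 'myModel');

    % Save the test file so the hierarchy appears in the Test Browser pane
    saveToFile(tf);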

For baseline and equivalence test cases, you can specify tolerances for the simulation outputs that determine pass or fail. For more information on setting tolerances, see Apply Tolerances to Test Criteria.

From the New menu in the Test Manager, you can create these types of test cases:

  • Baseline — A baseline test is a type of comparison test. For a baseline test, you first generate a baseline set of simulation outputs to use as the basis for comparison. Running the test then compares the outputs of a new simulation to that baseline. As with equivalence tests, you can specify tolerances that define a range of values within which the test still passes; that is, the results are considered equivalent even if they are not identical. You can set absolute, relative, leading, or lagging tolerances in the Baseline Criteria section of the test case (see the sketch after this list). See Test Model Output Against a Baseline.

  • Equivalence — An equivalence test in Test Manager compares signal outputs from two simulations. You can specify tolerances that help the test determine whether the results are equivalent. Set tolerances in the Equivalence Criteria section of the test case. See Test Two Simulations for Equivalence.

  • Simulation — A simulation test checks that a simulation runs without errors, including model assertions. See Test a Simulation for Run-Time Errors.

  • Real-Time Test — A baseline, equivalence, or simulation test that runs on the target hardware. See Test Models in Real Time.

  • Test Manager Generated Tests — Tests that you do not need to configure:

    • Test File from Model, which generates a test file and uses Signal Builder blocks and test harnesses in the model as the basis for generating test cases. See Generate Test Cases from Model Components.

    • Test for Subsystem, which generates a test harness for the subsystem you select and generates a test case to run on the test harness. See Generate Test for a Subsystem.
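
The sketch referenced in the Baseline item above shows one way to capture a baseline and apply a tolerance programmatically. It assumes the test case tc created in the earlier sketch; the baseline file name and tolerance value are placeholders.

    % Capture the current simulation output as the baseline data set
    % (the final argument updates the test case to use this baseline)
    baseline = captureBaselineCriteria(tc, 'throttle_baseline.mat', true);

    % Allow small numeric differences when comparing later runs
    baseline.AbsTol = 1e-3;

    % Run the baseline test case
    result = run(tc);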

View Test Results

Tests can pass or fail. A test passes if all the criteria defined in the test case are satisfied within the specified tolerances; if any criterion is not satisfied, the test fails. After a test runs, its results appear in the Results and Artifacts pane. Each test result has a summary page that highlights the outcome of the test: passed, failed, or incomplete. The results also include the simulation output, and you can inspect the signal data from that output in the data inspector view. To view a result in the data inspector view, select it.
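
When you run tests programmatically, the returned result objects carry the same passed, failed, or incomplete outcome that the summary page shows. A minimal sketch, assuming the result variable from the baseline sketch above:

    % Check the outcome of the test case result
    result.Outcome

    % Open the Test Manager to browse the result interactively
    sltest.testmanager.view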

Share Results

After you run tests and analyze the results, you can share the test results with others or archive them. To share results that you can reload later in the Test Manager, export them to a file. To archive the results in a document, generate a report, which can include the test outcome, the test summary, and the criteria used for test comparisons. See Export Test Results and Generate Reports.
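
A minimal command-line sketch of both options, assuming the test files to run are already loaded in the Test Manager; the file names, title, and author are placeholders.

    % Run all test files currently loaded in the Test Manager
    resultSet = sltest.testmanager.run;

    % Export the results to a file that you can reopen in the Test Manager
    sltest.testmanager.exportResults(resultSet, 'results.mldatx');

    % Generate a report document from the results
    sltest.testmanager.report(resultSet, 'testReport.pdf', ...
        'Title', 'Regression results', 'Author', 'Test team');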

Compare Test Files

You can use the Compare command in the File section of the MATLAB toolstrip to compare two test files. Comparing is useful for finding the differences between similar test files, for example, to check whether they contain the same test cases and whether those test cases are configured identically.

  1. From the File section of the MATLAB toolstrip, click Compare.

  2. In the First file or folder box, enter the name of the first test file that you want to compare. Test files use the .mldatx format.

  3. In the Second file or folder box, enter the second test file that you want to compare.

  4. For Comparison type, select Simulink Test File Comparison. Then click Compare.

    A comparison report highlights where one file specifies information that the other file does not. For example, newbaseline.mldatx includes a test suite that the other file does not contain.
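
You can also start the comparison from the command line with visdiff, which opens the Comparison Tool; it is assumed here that visdiff selects the Simulink Test file comparison for .mldatx test files, and both file names are placeholders.

    % Open the Comparison Tool on two test files
    visdiff('baseline.mldatx', 'newbaseline.mldatx');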
