Model-Based Testing

What Is Model-Based Testing?

Model-based testing is a systematic method for generating test cases from models of system requirements. It lets you evaluate requirements independently of algorithm design and development.

Model-based testing involves:

  • Creating a model of system requirements for testing
  • Generating test data from this requirements model
  • Verifying your design algorithm against the generated test cases
Design and test workflow diagram. Design workflow: requirements → design algorithms → simulate algorithms → validate outputs. Test workflow: requirements → create requirements model → autogenerate test cases (feeding the simulate algorithms step) → simulate requirements model → validate outputs.

Generate tests from requirements using model-based testing.
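
To make these three steps concrete, here is a minimal, tool-agnostic MATLAB sketch. The requirement, threshold, test inputs, and function handles are invented for illustration; they stand in for an executable requirements model and a design algorithm under development.

    % Requirements model: an executable statement of the expected behavior.
    % Example requirement: "The output shall saturate at 100 for inputs above 100."
    reqModel = @(u) min(u, 100);

    % Design under test (placeholder for the real design algorithm)
    design = @(u) min(u, 100);

    % Generate test data from the requirements model, including boundary values
    testInputs = [-50 0 99 100 101 250];

    % Verify the design against the requirements model
    expected = arrayfun(reqModel, testInputs);
    actual   = arrayfun(design,   testInputs);
    assert(isequal(expected, actual), 'Design does not satisfy the requirement.');

In a real workflow, the requirements model is typically a Simulink model or Requirements Table rather than a function handle, and test generation is automated rather than hand-picked.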

Model-Based Testing Using Simulink

In model-based testing, you use requirements models to generate test cases that verify your design. This process also helps automate other verification tasks and streamlines the review process by linking test cases and verification objectives to high-level test requirements. With Requirements Toolbox™, you can author requirements directly within Simulink® or exchange requirements with third-party requirements tools. You can formalize requirements and analyze them for consistency, completeness, and correctness using the Requirements Table.
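
As a minimal sketch of authoring requirements programmatically with the Requirements Toolbox API (the requirement set name, ID, and requirement text are illustrative assumptions):

    % Create a requirement set and add a functional requirement to it
    rs  = slreq.new('CruiseControlReqs');   % hypothetical requirement set name
    req = add(rs, 'Id', 'R1', ...
              'Summary', 'Engage cruise control', ...
              'Description', ['The controller shall engage only when ' ...
                              'vehicle speed is at least 30 km/h.']);
    save(rs);

Requirements in the set can then be linked to model elements and test cases (for example with slreq.createLink), so verification status rolls up to the high-level requirements.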

Using Simulink Test™, you manage test cases and execute them systematically to confirm that your design meets requirements. To increase the quality of generated test cases beyond traditional stochastic and heuristic methods, you can generate tests with Simulink Design Verifier™, which uses formal analysis techniques. With Simulink Coverage™, you can use model and code coverage metrics to assess the completeness of your model-based testing effort. These metrics can reveal missing requirements and unintended functionality.
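
The sketch below outlines how test generation and coverage measurement might be driven from the MATLAB command line, assuming a design model named myController is on the path; the model name, coverage objectives, and report name are placeholders, not a prescribed setup.

    model = 'myController';          % hypothetical design model
    load_system(model);

    % Generate test cases with Simulink Design Verifier (formal analysis)
    opts = sldvoptions;
    opts.Mode = 'TestGeneration';            % generate tests rather than prove properties
    opts.ModelCoverageObjectives = 'MCDC';   % target modified condition/decision coverage
    opts.SaveHarnessModel = 'on';            % create a harness for the generated cases
    [status, sldvFiles] = sldvrun(model, opts);

    % Measure model coverage achieved in simulation with Simulink Coverage
    covSpec = cvtest(model);
    covSpec.settings.decision  = 1;
    covSpec.settings.condition = 1;
    covSpec.settings.mcdc      = 1;
    covData = cvsim(covSpec);
    cvhtml('coverage_report', covData);      % HTML report highlighting coverage gaps

Generated test cases can then be imported into the Simulink Test Manager for managed, repeatable execution against the requirements.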

To incorporate hardware and production code into model-based testing, you can compare the outputs of simulation with data collected during software-in-the-loop (SIL) and processor-in-the-loop (PIL) testing, or in real time during hardware-in-the-loop (HIL) testing. You can use Simulink Test to help manage this equivalence testing workflow.
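
A minimal sketch of such a back-to-back comparison, assuming a normal-mode run and a SIL run of the same test case have already been logged with the same time steps (the variable names, Dataset logging format, and tolerance are assumptions):

    % simOutNormal: output of a normal-mode simulation of the design model
    % simOutSIL:    output of a SIL run of the generated code for the same inputs
    yNormal = simOutNormal.yout{1}.Values;   % timeseries of the first logged output
    ySIL    = simOutSIL.yout{1}.Values;

    % Back-to-back (equivalence) check within an absolute tolerance
    tol     = 1e-6;
    maxDiff = max(abs(yNormal.Data - ySIL.Data), [], 'all');
    assert(maxDiff <= tol, ...
        'SIL output deviates from simulation by %g (tolerance %g)', maxDiff, tol);

In practice, Simulink Test equivalence test cases automate this comparison, including tolerance handling and reporting.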

See also: formal verification, requirements traceability, Simulink Design Verifier, Simulink Coverage, Requirements Toolbox, Simulink Test