Component Verification

Using component verification, you can test a design component in your model with one of these approaches:

  • System analysis. Verify the component within the context of the model that contains it. By systematically simulating the closed-loop controller, you can test the control algorithms together with the rest of the control system model.

  • Component analysis. Verify the component in isolation from the rest of the system. Testing a component as a standalone unit gives you a high level of confidence in the component algorithm.

    Verifying standalone components provides several advantages:

    • You can use the analysis to focus on portions of the design that you cannot test because of the physical limitations of the system being controlled.

    • For open-loop simulations, you can test the plant model without feedback control.

    • You can use this approach when the model is not yet available or when you need to simulate a control system model in accelerated mode for performance reasons.

Simulink Verification and Validation Tools for Component Verification

By isolating the component to verify, and by using the tools that the Simulink® Verification and Validation™ software provides, you can create test cases that expand the scope of testing for large models. You can:

  • Achieve 100% model coverage — If certain model components do not record 100% coverage, the top-level model cannot achieve 100% coverage. By verifying these components individually, you can create test cases that fully specify the component interface, allowing the component to record 100% coverage.

  • Debug the component — To verify that each model component satisfies the specified design requirements, you can create test cases that check that specific components perform as designed.

  • Test the robustness of the component — To verify that a component handles unexpected inputs and calculations properly, you can create test cases that generate unexpected data, and then test the error-handling capabilities of the component.
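For example, model coverage can be recorded programmatically with the Simulink coverage commands. The following is a minimal sketch; the model name `myComponentModel` is a placeholder for your own component model.

```matlab
% Record decision coverage for a component model (sketch; model name is hypothetical)
testObj = cvtest('myComponentModel');      % create a coverage test specification
testObj.settings.decision = 1;             % enable decision coverage
[covData, simOut] = cvsim(testObj);        % simulate the model and collect coverage
cvhtml('coverageReport', covData);         % generate an HTML coverage report
```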

Workflow for Component Verification

You can take either of two approaches to component verification.

  1. Choose your approach for component verification:

    • For closed-loop simulations, verify a component within the context of its container model by logging the signals to that component and storing them in a data file. If those signals do not constitute a complete test suite, generate a harness model and add or modify the test cases in the Signal Builder.

    • For open-loop simulations, verify a component independently of the container model by extracting the component from its container model and creating a harness model for the extracted component. Add or modify test cases in the Signal Builder and log the signals to the component in the harness model.

  2. Prepare the component for verification.

  3. Create and log test cases. You can also merge the test case data into a single data file.

    The data file contains the test case data for simulating the component. If you cannot achieve the expected results with a certain set of test cases, add new test cases or modify existing test cases in the data file, and merge them into a single data file.

    Continue adding or modifying test cases until you achieve a test suite that satisfies the goals of your analysis.
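One way to log test cases and merge them into a single data file is with signal logging and a MAT-file. This is a sketch only; the harness model name `componentHarness` and file name are placeholders, and it assumes signal logging is enabled for the signals feeding the component.

```matlab
% Log signals from two test runs and merge them into one data file (sketch)
out1 = sim('componentHarness', 'SignalLogging', 'on', 'SignalLoggingName', 'logsout');
run1 = out1.logsout;                       % Simulink.SimulationData.Dataset

% ... add or modify test cases in the Signal Builder, then rerun ...
out2 = sim('componentHarness', 'SignalLogging', 'on', 'SignalLoggingName', 'logsout');
run2 = out2.logsout;

save('componentTestCases.mat', 'run1', 'run2');   % single data file for the test suite
```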

  4. Execute the test cases in software-in-the-loop or processor-in-the-loop mode.

  5. After you have a complete test suite, you can:

    • Simulate the model and execute the test cases to:

      • Record coverage.

      • Record output values to make sure that you get the expected results.

    • Invoke the Code Generation Verification (CGV) API to execute the generated code for the model that contains the component in simulation, software-in-the-loop (SIL), or processor-in-the-loop (PIL) mode.

        Note: When you execute a model in different modes, use the CGV API to verify the numerical equivalence of the results. For more information about the CGV API, see Programmatic Code Generation Verification (Embedded Coder).
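As a sketch of the CGV workflow, the following executes a model in normal simulation mode and in SIL mode and compares the results. The model name `myModel` is a placeholder, and the `cgv.CGV` class requires Embedded Coder.

```matlab
% Execute the model in simulation and SIL modes, then compare results (sketch)
simObj = cgv.CGV('myModel', 'Connectivity', 'sim');
simObj.run();
simData = simObj.getOutputData(1);         % output data for the first input set

silObj = cgv.CGV('myModel', 'Connectivity', 'sil');
silObj.run();
silData = silObj.getOutputData(1);

% Names of signals whose values match or mismatch between the two runs
[matchNames, mismatchNames] = cgv.CGV.compare(simData, silData);
```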

Verify a Component Independently of the Container Model

Use component analysis to verify:

  • Model blocks

  • Atomic subsystems

  • Stateflow® atomic subcharts

  1. Depending on the type of component, take one of the following actions:

    • Model blocks — Open the referenced model.

    • Atomic subsystems — Extract the contents of the subsystem into its own Simulink model.

    • Atomic subcharts — Extract the contents of the Stateflow atomic subchart into its own Simulink model.

  2. Create a harness model for:

    • The referenced model

    • The extracted model that contains the contents of the atomic subsystem or atomic subchart

  3. Add or modify test cases in the Signal Builder in the harness model.

  4. Log the input signals from the Signal Builder to the test unit.

  5. Repeat steps 3 and 4 until you are satisfied with the test suite.

  6. Merge the test case data into a single file.

  7. Depending on your goals, take one of the following actions:

    • Execute the test cases to:

      • Record coverage.

      • Record output values and make sure that they equal the expected values.

    • Invoke the Code Generation Verification (CGV) API to execute the test cases in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode on the generated code for the model that contains the component.

If the test cases do not achieve the expected results, repeat steps 3 through 5.
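Steps 3 through 5 can be driven programmatically with the `signalbuilder` function. A sketch, assuming a harness model named `componentHarness` containing a Signal Builder block named `Inputs` (both names are placeholders):

```matlab
% Run every test group defined in a Signal Builder block (sketch)
sbBlock = 'componentHarness/Inputs';
[~, ~, ~, groupNames] = signalbuilder(sbBlock);     % list the test groups
for k = 1:numel(groupNames)
    signalbuilder(sbBlock, 'activegroup', k);       % select test group k
    out = sim('componentHarness');                  % simulate with that group active
    % ... record coverage or compare outputs against expected values ...
end
```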

Verify a Model Block in the Context of the Container Model

Use system analysis to:

  • Verify a Model block in the context of the block's container model.

  • Analyze a closed-loop controller.

  1. Log the input signals to the component by simulating the container model, or analyze the model by using the Simulink Design Verifier™ software.

  2. If you want to add test cases to your test suite or modify existing test cases, create a harness model with the logged signals.

  3. Add or modify test cases in the Signal Builder in the harness model.

  4. Log the input signals from the Signal Builder to the test unit.

  5. Repeat steps 3 and 4 until you are satisfied with the test suite.

  6. Merge the test case data into a single file.

  7. Depending on your goals, do one of the following:

    • Execute the test cases to:

      • Record coverage.

      • Record output values and make sure that they equal the expected values.

    • Invoke the Code Generation Verification (CGV) API to execute the test cases in software-in-the-loop (SIL) or processor-in-the-loop (PIL) mode on the generated code for the model.

If the test cases do not achieve the expected results, repeat steps 3 through 5.
