Loading results from an Excel sheet and launching Simulink Design Verifier again to produce a new Simulink Design Verifier Report

I am having some trouble creating new Simulink Design Verifier Reports that also include all satisfied objectives.
When I run my unit tests on vector or matrix inputs, there are some cases where, no matter how much time I leave the verifier running, it will never converge to a result, forcing me to stop the tests and manually update all the inputs so that they reach all branches of my code. Running the tests and looking at the test results, I can get 100% coverage on all of my code, but the problem is that I would also like a Simulink Design Verifier HTML report that lists all the satisfied objectives. When I test simpler functions with scalar inputs, such a report is generated automatically, so how do I re-run the harness creator with the updated inputs?
I have already tried increasing the heap memory and running intermediate tests, no dice. I am currently working in R2020a.

Accepted Answer

Devendra Bhave on 11 Feb 2021 (edited 11 Feb 2021)
Hi Marco,
SLDV supports extending existing test cases to achieve 100% coverage.
Let's call the set of test cases you authored manually M (stands for manually-written).
SLDV can generate an additional set of test cases A (stands for auto-generated) such that the set M + A gives you 100% coverage. This workflow is called "Test Case Extension" and produces a detailed SLDV report with the test cases and coverage details.
Test extension offers benefits over writing test cases manually. It treats test cases in M as a hint and discovers many new test cases by tweaking them. That means instead of writing 10 test cases manually, you simply write 2 test cases and let SLDV discover the remaining ones.
Section "Test Case Extension" in SLDV documentation describes this workflow.
These are the links for your quick reference.
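A minimal sketch of what this could look like programmatically, assuming a model named 'myModel' and existing test data saved in a MAT-file 'myTests.mat' (both names are placeholders; the sldvData in the MAT-file can come from a previous SLDV run or from logged simulations):

% Sketch only: extend existing test cases to reach full coverage.
% 'myModel' and 'myTests.mat' are placeholder names.
opts = sldvoptions;
opts.Mode = 'TestGeneration';
opts.ModelCoverageObjectives = 'MCDC';  % match the coverage metric in use
opts.ExtendExistingTests = 'on';        % use existing cases as the starting point
opts.ExistingTestFile = 'myTests.mat';  % previously saved test data (sldvData format)
opts.SaveReport = 'on';                 % produce the SLDV HTML report
[status, files] = sldvrun('myModel', opts);

The report produced by this run should then account for the objectives satisfied by the combined set M + A.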

More Answers (1)

Devendra Bhave on 7 Feb 2021
Hi Marco,
I understand your query as follows:
You are running SLDV test generation analysis on your model to get 100% coverage, and you aim to get an SLDV report showing the full coverage. SLDV is unable to cover all objectives automatically, so you wrote a few unit test cases yourself in an Excel file to obtain 100% coverage. You verified that your test cases indeed cover all objectives, but you need the SLDV report to state that all objectives have been covered.
It is not clear from your query how you are importing the Excel-based test cases into SLDV, or how you are measuring coverage for your test cases. Are you using other MathWorks products, like Simulink Test?
I suggest you contact MathWorks support. This will help me understand your requirements better and offer you the best guidance on how to obtain the SLDV report you seek.
  2 Comments
Marco Montanaro on 9 Feb 2021
Hello Devendra,
"You are running SLDV test generation analysis on your model to get 100% coverage, and you aim to get an SLDV report showing the full coverage. SLDV is unable to cover all objectives automatically, so you wrote a few unit test cases yourself in an Excel file to obtain 100% coverage. You verified that your test cases indeed cover all objectives, but you need the SLDV report to state that all objectives have been covered."
Thanks for putting my query into better terms. Yes, that's exactly what I wish to do. Basically, after running a preliminary test batch, SLDV runs out of the allowed computational time and asks me if I want a results sheet and a test harness produced, which I agree to. The Simulink Design Verifier Report it outputs shows all the objectives for the function, both the completed ones and those that were not reached before the maximum allowed execution time. SLDV also outputs an Excel sheet containing the input data for all generated tests.
I then manually change said inputs in the generated harness and run them through the Simulink Test app. When the tests are run, the Results page lists the executed functions alongside the coverage percentage according to the selected coverage metrics (in my case, I had set up SLDV to use MCDC). If some branches of the functions being run weren't reached by the test harness, Simulink Test highlights all missing cases (for example, an if statement like "A && B" where A never changes in the input data, meaning situations where A assumes a different value are never tested). Using that information, I modify the input data again until all branches of all functions are covered.
Now, having hopefully made my MO clearer: when I do finally reach 100% coverage for all the functions under test, the Results page allows me to export HTML files containing proof that all functions have been tested accurately. The problem is that what I wish to have is the same output as the Simulink Design Verifier Report, meaning I wish for Simulink to also show me all objectives being fully satisfied, as well as the input data used for the tests. I've tried to work around this by changing Design Verifier's settings so that it takes as input data a matrix containing my doctored inputs, but for some reason I can't seem to find a way to feed it the right kind of input.
Basically, what I'm asking is: is there a way to reproduce the Simulink Design Verifier Report after editing the generated harness, or alternatively, a way to feed a custom harness to Design Verifier so that it automatically generates a new harness and a Simulink Design Verifier Report with all objectives satisfied? For the record, I attached a screenshot of the Simulink Design Verifier Report I'm referring to, for a function that had all of its requirements satisfied. [Screenshot not preserved.]
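For reference, a minimal sketch of one way this might be done with the test-extension workflow described in the accepted answer: simulate the hand-edited harness once to capture its inputs in sldvData form via sldvlogsignals, then re-run SLDV with extension enabled so that it regenerates the harness and the report. 'my_fn_harness' and 'my_model' are placeholder names, and this assumes the harness was generated by SLDV:

% Sketch only: log the edited harness inputs and feed them back to SLDV.
loggedData = sldvlogsignals('my_fn_harness');  % simulate the harness, capture its inputs
save('edited_tests.mat', 'loggedData');        % store in a MAT-file SLDV can read
opts = sldvoptions;
opts.Mode = 'TestGeneration';
opts.ModelCoverageObjectives = 'MCDC';
opts.ExtendExistingTests = 'on';               % start from the logged cases
opts.ExistingTestFile = 'edited_tests.mat';
opts.SaveReport = 'on';                        % regenerate the SLDV HTML report
sldvrun('my_model', opts);

If the logged cases already achieve full coverage, SLDV should have little or nothing left to discover, and the new report should show all objectives as satisfied.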
By the way, and this might be of help for some other issue: SLDV seems to have a lot of trouble dealing with arrays that have to be tested element by element in for/while loops.

