Polyspace® verification cannot automatically produce quality code at the end of the development process. However, if you integrate Polyspace verification into your development process, Polyspace verification helps you to measure the quality of your code, identify issues, and ultimately achieve your own quality goals.
To implement Polyspace verification within your development process, you must perform each of the following steps:
Define your quality goals.
Define a process to match your quality goals.
Apply the process to assess the quality of your code.
Improve the process.
Before you can verify whether your code meets your quality goals, you must define those goals. This process involves choosing the type of verification you want to perform and defining software quality levels for your application.
Before using Polyspace products to verify your code, you must decide what type of software verification you want to perform. There are two approaches to code verification that result in slightly different workflows:
Robustness Verification – Prove software does not generate run-time errors under all verification conditions.
Contextual Verification – Prove software does not generate run-time errors under normal working conditions.
Note: Some verification processes may incorporate both robustness and contextual verification. For example, developers may perform robustness verification on individual files early in the development cycle, while writing the code. Later, the team may perform contextual verification on larger software components.
Robustness Verification. Robustness verification proves that the software does not generate run-time errors under all verification conditions, including "abnormal" conditions for which it was not designed. This can be thought of as "worst case" verification.
By default, Polyspace software assumes you want to perform robustness verification. In a robustness verification, Polyspace software:
Assumes function inputs are full range
Initializes global variables to full range
Automatically stubs missing functions
While this approach ensures that the software works under all verified conditions, it can lead to orange checks (unproven code) in your results. You must then manually inspect these orange checks in accordance with your software quality goals.
Contextual Verification. Contextual verification proves that the software works under predefined working conditions. This limits the scope of the verification to specific variable ranges, and verifies the code within these ranges.
When performing contextual verification, you use Polyspace options to reduce the number of orange checks. For example, you can:
Use Data Range Specifications (DRS) to specify the ranges for your variables, thereby limiting the verification to these cases. For more information, see Inputs & Stubbing.
Create a detailed main program to model the call sequence, instead of using the default main generator. For more information, see Manually Generating a Main.
Provide manual stubs that emulate the behavior of missing functions, instead of using the default automatic stubs. For more information, see Manual vs. Automatic Stubbing.
The software quality level you define determines which Polyspace options you use, and which results you must review.
You define the quality levels for your application, from level SQL-1 (lowest) to level SQL-4 (highest). Each quality level consists of a set of software quality criteria that represent a certain quality threshold. For example:
Software Quality Levels
| Criteria | SQL-1 | SQL-2 | SQL-3 | SQL-4 |
| --- | --- | --- | --- | --- |
| Document static information | X | X | X | X |
| Review all red checks | X | X | X | X |
| Review all gray checks | X | X | X | X |
| Review first criteria level for orange checks | | X | X | X |
| Review second criteria level for orange checks | | | X | X |
| Perform dataflow analysis | | | X | X |
| Review third criteria level for orange checks | | | | X |
In the example above, the quality criteria include:
Static Information – Includes information about the application architecture and the structure of each module and file. Full verification of your application requires the documentation of static information.
Red checks – Represent errors that occur every time the code is executed.
Gray checks – Represent unreachable code.
Orange checks – Indicate unproven code, meaning a run-time error may occur.
Dataflow analysis – Identifies errors such as non-initialized variables and variables that are written but not subsequently read. This can include inspection of:
Application call tree
Read/write accesses to global variables
Shared variables and their associated concurrent access protection
Once you have defined your quality goals, you must define a process that allows you to meet those goals. Defining this process involves actions both within and outside Polyspace software, such as selecting analysis options that match your goals and establishing review procedures for your team.
Once you have defined a process that meets your quality goals, it is up to your development team to apply it consistently to all software components.
This process includes:
Running a Polyspace verification for each software component as it is written.
Reviewing verification results consistently. See Results Management.
Saving review comments for each component, so they are available for future review. See Add Review Comments to Results.
Performing additional verifications on each component, as defined by your quality goals.
Once you review initial verification results, you can assess both the overall quality of your code, and how well the process meets your requirements for software quality, development time, and cost restrictions.
Based on these factors, you may want to take actions to modify your process. These actions may include:
Reassessing your quality goals.
Changing your development process to produce code that is easier to verify.
Changing Polyspace analysis options to improve the precision of the verification.
Changing Polyspace options to change how verification results are reported.
For more information, see Reduce Orange Checks.