Overall HDL Filter Code Optimization

Optimize for HDL

By default, generated HDL code is bit-compatible with the numeric results produced by the original filter object. The Optimize for HDL option instead generates HDL code optimized for clock speed or space requirements. This optimization causes the coder to:

  • Make tradeoffs concerning data types.

  • Avoid extra quantization.

  • Generate code that produces numeric results that are different from the results produced by the original filter object.

To optimize generated code for clock speed or space requirements:

  1. Select Optimize for HDL in the Filter architecture pane of the Generate HDL dialog box.

  2. Consider setting an error margin for the generated test bench. The error margin is the number of least significant bits the test bench ignores when comparing the results. To set an error margin:

    1. Select the Test Bench pane in the Generate HDL dialog box. Then click the Configuration tab.

    2. Set the Error margin (bits) field to an integer that indicates the maximum acceptable number of bits of difference in the numeric results.

  3. Continue setting other options or click Generate to initiate code generation.

Command-Line Alternative: Use the generatehdl function with the property OptimizeForHDL to enable these optimizations.
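The command-line workflow above might be sketched as follows. This is an illustrative example only: the filter coefficients are placeholders, and it assumes a fixed-point direct-form FIR filter object created with dfilt.dffir.

```matlab
% Sketch: enable HDL optimizations from the command line.
% Requires Filter Design HDL Coder; coefficients are illustrative.
b  = fir1(15, 0.25);         % example lowpass FIR coefficients
Hd = dfilt.dffir(b);         % direct-form FIR filter object
Hd.Arithmetic = 'fixed';     % use fixed-point arithmetic

% 'on' allows data-type tradeoffs, so numeric results may differ
% from those of the original filter object.
generatehdl(Hd, 'OptimizeForHDL', 'on');
```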

Set Error Margin for Test Bench

Customizations that provide optimizations can generate code whose numeric results differ from the results produced by the original filter object. These options include:

  • Optimize for HDL

  • FIR adder style set to Tree

  • Add pipeline registers for FIR, asymmetric FIR, and symmetric FIR filters

If you use these options, consider setting an error margin for the generated test bench to account for differences in numeric results. The error margin is the number of least significant bits the test bench ignores when comparing the results. To set an error margin:

  1. Select the Test Bench pane in the Generate HDL dialog box.

  2. Within the Test Bench pane, click the Configuration tab.

  3. For fixed-point filters, the Error margin (bits) field has a default value of 4. To change the error margin, enter an integer in this field.

Command-Line Alternative: Use the generatehdl function with the property ErrorMargin to set the comparison tolerance.
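As a sketch of the command-line alternative, the ErrorMargin property can be combined with the optimization options described above. The coefficients below are placeholders, and the GenerateHDLTestbench property name is an assumption about how test bench generation is requested in your release:

```matlab
% Sketch: generate optimized HDL plus a test bench that tolerates
% a 4-bit difference when comparing numeric results.
b  = fir1(15, 0.25);         % example lowpass FIR coefficients
Hd = dfilt.dffir(b);
Hd.Arithmetic = 'fixed';

generatehdl(Hd, ...
    'OptimizeForHDL', 'on', ...
    'GenerateHDLTestbench', 'on', ...  % assumed property name
    'ErrorMargin', 4);       % ignore 4 least significant bits
```

A larger ErrorMargin loosens the comparison; set it no higher than the maximum acceptable number of differing bits for your application.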
