Why are sparse matrix operations slow in Simulink R2023a?

I am developing a simulation model in Simulink R2023a. Some parts of the system require matrix logic that would be complex to implement directly in Simulink, so I am using a MATLAB Function block to embed a MATLAB implementation in my Simulink model.
This MATLAB Function block makes heavy use of very large sparse matrices, and I use MATLAB's sparse matrix operations with the intention of reducing memory usage and speeding up computation. This works when running the algorithm in MATLAB, but within the MATLAB Function block the sparse matrix operations are about 5x slower than their MATLAB counterparts.
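For reference, a comparison along these lines shows the MATLAB-side speedup I am referring to (the matrix size and density here are illustrative placeholders, not the values from my actual model):

```matlab
% Illustrative timing of sparse vs. dense matrix-vector products in MATLAB.
% Size and density are placeholders, not the real model's values.
n = 5000;
A = sprand(n, n, 1e-3);        % ~0.1% nonzeros
x = rand(n, 1);

tSparse = timeit(@() A * x);          % sparse matvec
tFull   = timeit(@() full(A) * x);    % dense equivalent of the same product
fprintf('sparse: %.6f s, full: %.6f s\n', tSparse, tFull);
```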
Why is the MATLAB Function block so much slower? And what can be done to improve this?

Accepted Answer

In Simulink, MATLAB Function blocks are executed via code generation - i.e. the MATLAB code is converted to C/C++ code, which is then executed. In most cases this gives a substantial speed-up by removing the overhead of the MATLAB interpreter. However, sparse matrix operations in MATLAB are executed using highly optimized math libraries, which code generation cannot use; it therefore falls back on non-sparse operations. This causes the slowdown you describe.
However, MATLAB's sparse operations can be used if we make sure the MATLAB code runs in "interpreted" mode - in other words, without code generation. This can be achieved by putting the MATLAB code in a MATLAB System block and setting its execution mode to "Interpreted Execution", as described here:
https://uk.mathworks.com/help/releases/R2023a/simulink/ug/what-is-matlab-system-block.html
This adds a slight overhead from communication with the MATLAB interpreter, but the sparse operations themselves will run as fast as in MATLAB. For sufficiently large sparse matrices, the communication cost is negligible.
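A MATLAB System block is backed by a System object class. As a rough sketch of what this could look like (the class name, matrix construction, and solve step are placeholders for your own algorithm):

```matlab
% Hypothetical System object for a MATLAB System block. When the block's
% simulation mode is set to "Interpreted Execution", stepImpl runs in the
% MATLAB interpreter, so the sparse operations use MATLAB's optimized
% sparse libraries.
classdef SparseSolver < matlab.System
    properties (Access = private)
        A   % large sparse system matrix, built once in setupImpl
    end
    methods (Access = protected)
        function setupImpl(obj, u)
            n = numel(u);
            obj.A = speye(n) + sprand(n, n, 1e-4);  % placeholder matrix
        end
        function y = stepImpl(obj, u)
            y = obj.A \ u;   % sparse solve, executed by the interpreter
        end
    end
end
```

Save the class on the MATLAB path and point the MATLAB System block at it, with "Simulate using" set to "Interpreted execution".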

6 Comments

Correction: The MATLAB Function block does perform "sparse" operations for sparse matrices. However, due to the lack of optimized sparse libraries in the code generation context, sparse operations in MATLAB Function blocks tend to be substantially slower. (Note, work is underway to improve this situation.)
An alternative to the MATLAB System Block workaround above is to use coder.extrinsic calls in the MATLAB Function block.
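To illustrate the coder.extrinsic approach, a MATLAB Function block body could look something like this (the function name mySparseKernel is a placeholder for your own sparse routine):

```matlab
% Hypothetical MATLAB Function block body. coder.extrinsic makes the
% named function run in the MATLAB interpreter during simulation, so its
% sparse operations use MATLAB's optimized sparse libraries.
function y = fcn(u)
    coder.extrinsic('mySparseKernel');   % placeholder function name
    y = zeros(size(u));   % predefine y so the extrinsic result has a known type/size
    y = mySparseKernel(u);               % executed extrinsically
end
```

Note that extrinsic calls work only during simulation; they are not available in standalone generated code deployed without MATLAB.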
Is it the case that the inputs to and outputs from a Matlab Function block have to be full, and the only sparse support is for variables that are local to the Matlab Function block?
Can you provide a bit more insight into the libraries? The generated code doesn't link to the same sparse libraries as used by Matlab? If not, do the libraries used by the generated code just convert to full, do the operations in full, and convert back to sparse? Or are they really operating on sparse, but just not as well as the base Matlab libraries?
Hi Walter,
That linked doc page doesn't really answer the first question in my comment, assuming that's the reason it's cited. "sparse" is not a type, it's an attribute, i.e., a full double and a sparse double are both of the same numeric type.
I don't believe that a Simulink signal can be sparse based on my experience, though I couldn't find anything in the doc specifically on point. The closest I could find is Signal Types, which does not state that sparse is not supported for numeric signal types.
I do see at Code Generation for Sparse Matrices that code generation does not support sparse for Simulink signals, but that's not the same as Simulink not supporting sparse for Simulink signals in general.
Unfortunately, that doc page doesn't alert the user that performance with sparse in generated code might (will?) not be as good as in base Matlab as stated by @Fred Smith.
Also, I'm curious if that same reduction in sparse performance applies to code generation in general, e.g., from Matlab Coder, or only when code is generated from Simulink.

As Walter says, Simulink does not support sparse signals. Therefore, sparse matrices can only be used within the block.
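In practice that means the block's ports carry full matrices and any sparsity is confined to block-local variables, along these lines (a minimal sketch, with placeholder computation):

```matlab
% Sketch of a MATLAB Function block whose ports are full matrices,
% with sparsity used only for variables local to the block.
function y = fcn(u)          % u and y are full (Simulink signals)
    S = sparse(u);           % convert to sparse inside the block
    z = S * S.';             % sparse computation stays block-local
    y = full(z);             % convert back before the output port
end
```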

Code generation from Matlab Coder and from the Matlab Function block is the same for sparse.

A goal for code generation is to support arbitrary targets, which is why these blocks do not use MATLAB's shipping optimized sparse libraries. Rather, they produce C code that implements the sparse algorithms.

We have been working to improve the performance of these sparse implementations.

For full matrix math the blocks do use MATLAB's optimized BLAS and LAPACK, but it takes work to support both paths. For sparse, there has not yet been enough demand to optimize this case.

Hope this clarifies the situation. Can you say more about your use case?

If sparse Simulink signals are not supported, the doc should say so. Maybe it does, but I can't find it.
If generated code from a Matlab Function handles sparse operations differently than base Matlab, the doc should say so here.
Do Simulink blocks, such as Product, Matrix Multiply, link to the same libraries as base Matlab when running in normal mode? Quite some time ago Tech Support told me that was, in fact, not the case. Maybe that situation has changed in the intervening years.
Is there any difference in how sparse in a Matlab Function block is handled for simulation as opposed to generating code?
I understand the desire to support arbitrary targets. Why doesn't sparse in a Matlab Function work the same way as fft? As I understand that situation from Speed Up Fast Fourier Transforms in Code Generated from a MATLAB Function Block, a call to fft in a Matlab Function block will use the same fftw library as base Matlab for simulation, and the arbitrary vs. specific target issue is controllable by the user ("If you generate C/C++ code for this model..."). It seems like sparse could be handled the same way.
I did not know that code generation for a Matlab Function block is different for simulation as compared to "generate C/C++ code for this model."
