Why does every program run slower in newer versions of MATLAB?
Every program runs slower and slower in newer versions of MATLAB. I have measured this with external stopwatches as well, and the frame rate drops in vision processing, which is obvious even from direct observation.
bench (same MATLAB, different computers) produces the same results, but the actual program runs slower as measured with external stopwatches.
bench (different MATLAB versions, same PC) produces results that match the external stopwatch measurements.
The newer uifigure-based figures and widgets are also too slow.
I have switched to an older MATLAB for faster execution.
Why is it happening?
3 Comments
Adam Danz
on 3 Jan 2022
I can't answer your question but it would be helpful to know which old and new versions you're comparing.
Walter Roberson
on 6 Jan 2022
R2021b should have faster App Designer than earlier releases; they worked hard to make performance improvements.
Answers (1)
Rik
on 6 Jan 2022
Some things get slower, some things get faster. It really matters what exact functions you're calling.
If I write a function where time might be an important factor, I tend to write my own performance tester. Feel free to have a look at one of my functions: ComputeNonCryptHash. If you look at the performance section (in the HTML documentation on the 'examples' tab), you can compare the relative speeds.
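A minimal sketch of such a home-grown performance tester, built on MATLAB's built-in timeit. The test data and the loop-based candidate below are illustrative, not Rik's actual harness:

```matlab
% Compare the execution time of two implementations of the same operation.
% timeit calls the function handle repeatedly and returns a robust estimate.
n = 1e6;
x = rand(n, 1);                      % illustrative test data

tLoop = timeit(@() sumWithLoop(x));  % naive candidate (defined below)
tVec  = timeit(@() sum(x));          % built-in, vectorized

fprintf('loop: %.4g s, vectorized: %.4g s, ratio: %.1fx\n', ...
    tLoop, tVec, tLoop / tVec);

function s = sumWithLoop(x)
% Naive loop-based sum, included only as a comparison point.
s = 0;
for k = 1:numel(x)
    s = s + x(k);
end
end
```

Running this against two MATLAB releases on the same machine gives you exactly the per-function comparison that a generic benchmark like bench cannot.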
8 Comments
Walter Roberson
on 17 Mar 2023
Could you give some examples of companies working in comparable fields that publish official benchmarks?
Rik
on 17 Mar 2023
There isn't really much point in publishing benchmarks for all functions, because then you would have the tedious job of putting everything together. And even then, the performance changes based on how often the function has been called with a comparable syntax (or the exact same data). The only logical thing is to let you write your own benchmark.
Matlab is such a multi-purpose tool that I doubt publishing benchmarks with enough detail for each function would be feasible, while anything less would be meaningless.
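The point about performance depending on call history is easy to see for yourself. A minimal illustration (the timed function is arbitrary; the first call pays one-time compilation and loading costs that later calls do not):

```matlab
% Time the same anonymous function twice in a fresh session.
% The first call includes JIT-compilation and function-loading overhead;
% subsequent calls with comparable inputs are typically faster.
f = @() besselj(1, rand(1000, 1));

tic; f(); tFirst = toc;
tic; f(); tWarm  = toc;

fprintf('first call: %.4g s, warmed up: %.4g s\n', tFirst, tWarm);
```

Any published single-number benchmark would hide this distinction, which is one reason a benchmark you write for your own workload is more informative.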
Rik
on 17 Mar 2023
I think you underestimate how hard it is to write a good benchmark function. What kind of data are you going to use? What sort of procedural generation are you going to use? How many edge cases are you going to include?
I don't doubt they have internal tools to assess performance, but I can understand a hesitancy to share such things. They also depend on hardware, not just code. Most things in Matlab run on a single thread, but some things can run in parallel, in which case the speed up will depend on your core count. There are just a huge amount of variables.
It is not that I think it is tedious to run the benchmarks, because I don't. They developed a testing framework. Do you really think there would be any human interaction when running the test suite? I just code for myself, and even I have created tools that avoid manual interaction.
Just a final note: what bench does is not guaranteed to stay the same between versions. MathWorks changes it every now and then to reflect changes in expected hardware and workloads (like how Pac-Man is not used to benchmark the latest generation of GPUs).
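For reference, MATLAB ships a performance-testing framework of its own (matlab.perftest), which runs without any human interaction. A sketch, where the class name and the measured operation are illustrative:

```matlab
% Class-based performance test using MATLAB's performance-testing framework.
% Only the code between startMeasuring and stopMeasuring is timed.
classdef SortPerfTest < matlab.perftest.TestCase
    methods (Test)
        function testSortLargeVector(testCase)
            x = rand(1e6, 1);          % setup, excluded from measurement
            testCase.startMeasuring();
            sort(x);                   % the operation under test
            testCase.stopMeasuring();
        end
    end
end
```

Running `results = runperf('SortPerfTest')` collects repeated measurements automatically, so the whole suite can run unattended.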
Michelle Hirsch
on 17 Mar 2023
I can confirm that we do extensive performance testing throughout our product family. We run unit-level performance tests and application-level tests.
It's not that hard to write a performance test - the hard part is making sense out of it, so I agree with Rik and others that it's not that useful to publish all performance tests. I'm the head of product for MATLAB and I find most unit level tests impenetrable. They are essential for locking down performance, but aren't helpful for figuring out the impact of changes on the code that I care about. Application-level tests can be much more interesting, but aren't easily generalizable to other applications.
We have been increasingly publishing performance release notes, but these focus on just the good news - things that we've made faster.
@Khalid Mahmood, I encourage you to reach out to MathWorks support with the specific issues you are encountering with slowdowns from release to release. They may be able to suggest a workaround and they may pass along helpful feedback to development to address the issues in future releases.
You've also identified many functions you've made appreciably faster. This could prove to be helpful input to development, too. Though I'll caution that I often see a tradeoff between generality and performance. Once I make limiting assumptions that are valid for my code, I can bake those assumptions into the code I call to get better performance. But the changes I make might not be appropriate for other users of the same functions - they might slow down different use cases, stop supporting other use cases altogether, reduce numerical stability, etc.
Steven Lord
on 18 Mar 2023
I agree 100% with what Michelle wrote. I've written more than a few unit-level performance tests over the nearly 19 years I've been in the MathWorks Quality Engineering department. Based on my experience, publishing performance data would not be as simple as writing a function or class, publishing a one- or two-page report tabulating the data, and calling it a day.
As Rik mentioned ("There are just a huge amount of variables."), even a simple function can require a large collection of unit-level performance test cases that sweep over several dimensions, including but not limited to:
- problem sizes (small, medium, and large, so we can evaluate how a function's performance scales)
- data types (if there are different code paths used for double precision, single precision, integer data, logical data, etc.)
- sparsity (sparse matrices are stored differently in memory than full matrices and require or can benefit from processing by algorithms that can take advantage of the storage format)
- and sometimes properties of the data itself. See the Algorithms section of the documentation page for the mldivide function aka the \ operator for how it handles matrices with different shapes and properties.
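A sweep like the one above can be expressed as a parameterized performance test. This is only a sketch for a single operation (matrix multiplication here; the class name and parameter grid are illustrative), and it already multiplies into size × type cases:

```matlab
% Parameterized performance test: the framework runs testMultiply once for
% every combination of n and type, so even one operation yields a grid of
% measurements.
classdef MtimesPerfTest < matlab.perftest.TestCase
    properties (TestParameter)
        n    = {100, 1000, 4000};        % problem sizes (small/medium/large)
        type = {'double', 'single'};     % data types with separate code paths
    end
    methods (Test)
        function testMultiply(testCase, n, type)
            A = cast(rand(n), type);     % setup, excluded from measurement
            B = cast(rand(n), type);
            testCase.startMeasuring();
            C = A * B; %#ok<NASGU>       % the operation under test
            testCase.stopMeasuring();
        end
    end
end
```

Run with `runperf('MtimesPerfTest')`; add sparsity or matrix-shape parameters and the case count grows multiplicatively, which is exactly why publishing the full grid for hundreds of functions would bury readers in data.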
And that's just for one function. Looking at the function list for MATLAB in the documentation the Mathematics category alone has 558 functions. [That's the one with which I'm most familiar.] Granted, not all of those 558 functions would require the same amount of performance testing. At the simplest extreme, something like the pi function doesn't need to have a performance test. :) But you'd get positively buried in data if we reported on decent sized subsets of the 558 Mathematics functions and the 534 Data Import and Analysis functions and the 539 Language Fundamentals functions and the ...
And that's not even touching upon the many toolboxes we have.