What should go in a next-generation MATLAB X?

Andrew Janke on 11 Sep 2021
Latest activity Reply by Stephen23 on 3 Apr 2025 at 8:23

Let's say MathWorks decides to create a MATLAB X release, which takes a big one-time breaking change that abandons back-compatibility and creates a more modern MATLAB language, ditching the unfortunate stuff that's around for historical reasons. What would you like to see in it?
I'm thinking stuff like syntax and semantics tweaks, changes to function behavior and interfaces in the standard library and Toolboxes, and so on.
(The "X" is for major version 10, like in "OS X". Matlab is still on version 9.x even though we use "R20xxa" release names now.)
What should you post where?
Wishlist threads (#1 #2 #3 #4 #5): bugs and feature requests for Matlab Answers
Frustration threads (#1 #2): frustrations about usage and capabilities of Matlab itself
Missing feature threads (#1 #2): features that you wish Matlab would have had
Next Gen threads (#1): features that would break compatibility with previous versions, but would be nice to have
@anyone posting a new thread when the last one gets too large (about 50 answers seems a reasonable limit per thread): please update this list in all previous threads. (If you don't have editing privileges, just post a comment asking someone to do the edit.)
Yevgeniy Gorbachev
Yevgeniy Gorbachev on 26 Mar 2025 at 22:48
A native <xarray> -- in which an array's dimensions (row/col/page/...) can be named, and even more conveniently, assigned coordinate variables. Summarizing functions, arithmetic, interpolation, etc. can then act based on coordinate names and values. I find myself dealing with high-dimensional data more and more. I made my own <xarray> implementation, but I am sure MathWorks could make one that is much more performant and convenient...
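Something along these lines, purely as a hypothetical sketch -- none of these names or functions exist in MATLAB today, they are invented for illustration:
temp = xarray(rand(10, 20, 4), ...
    "dims",   ["time", "lat", "sensor"], ...
    "coords", {datetime(2025,1,1) + hours(0:9), linspace(-90, 90, 20), 1:4});
m = mean(temp, "time");       % reduce over a dimension by name, not by number
s = temp.sel("sensor", 2);    % select along a dimension by coordinate value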
Austin Fite
Austin Fite on 26 Mar 2025 at 21:41
In a professional setting I've found MATLAB to be the most effective tool for engineering visualization problems. We use it for rapidly building tools that integrate our telemetry with MATLAB's flexible plotting capabilities, and for creating GUIs that allow us to peel back the onion of some complex datasets. Some examples are overlaying sensor telemetry on recorded images, or analyzing Kalman filter performance at scale. These tools are highly interactive by design (e.g. clicking on objects, custom click-and-drag, callbacks, keybinds, etc.).
The ability to create visualization tools like this paired with the extensive math and specialized toolbox capabilities of MATLAB is a technology differentiator that I don't think MathWorks leans into enough. I say this because renderer performance is often the bottleneck to our tools, and performance does not seem to be a primary focus of the MathWorks development team (I do see and appreciate performance updates in the release notes, but I wouldn't call it the North Star). On our team we have to use a lot of low-level tricks to make things feel reasonably performant (e.g. hgtransforms, NaN-breaks to minimize number of objects plotted, minimizing cla() calls, etc), and even then it's not what I would consider good. Some examples of issues that come up fairly often:
  • renderer performance gets significantly worse as a function of the figure/axes size on the monitor
  • text() objects scale terribly and cause the axes to become very slow
  • modern axes objects use "linger" mechanics that bog down performance (https://undocumentedmatlab.com/articles/improving-graphics-interactivity)
  • patch and surface objects can become quite slow when interacting with them, particularly with a maximized figure
  • uifigure performance is so bad (and worse on Linux vs. Windows) that we do not use it for anything except for the occasional geoglobe() plot
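As an aside, the NaN-break trick mentioned above looks roughly like this (a minimal sketch with made-up segment data):
x1 = rand(1000,1); y1 = rand(1000,1);   % segment start points (illustrative)
x2 = rand(1000,1); y2 = rand(1000,1);   % segment end points
X = [x1, x2, nan(1000,1)]';             % 3x1000: start, end, break
Y = [y1, y2, nan(1000,1)]';
plot(X(:), Y(:));                       % one line object drawing 1000 segments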
In my pie-in-the-sky MATLAB dream world, all figures would perform near to the Unreal Engine 5 renderer, where I can draw basically unlimited shapes, surfaces, geometry, lighting etc and it's all dispatched to some GPU-based renderer that always feels snappy and interactive. I know that's not realistic, but it's the general direction that I'd like to see MathWorks steering the product. My point is to please, invest heavily in your visualization infrastructure! It is one of MATLAB's key technology differentiators as Python gobbles up market share left and right. I get frustrated because updates to MATLAB often come at the cost of performance. I would happily leave all the gloss and polish on the table if it meant I could visualize more complex datasets or run my code faster. For instance the updates I have been most happy with in the past few years have been the "page" functions, such as "pagemtimes". These are functions I use all the time to process data faster and/or at larger scale.
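For instance, the pattern is roughly this (sizes are just illustrative) -- one pagemtimes call instead of a loop over P small matrix multiplies:
P = 1e5;
A = rand(3, 3, P);       % a stack of P small matrices
x = rand(3, 1, P);       % a matching stack of vectors
y = pagemtimes(A, x);    % 3x1xP: page k of y is A(:,:,k) * x(:,:,k)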
Mike Croucher
Mike Croucher on 27 Mar 2025 at 9:24
Thanks for your feedback. I've made sure the relevant teams are aware of your comments.
Glad you like pagemtimes so much. It's one of my favourites too. I wrote about the whole suite of paged functions last year: Paged Matrix Functions in MATLAB (2024 edition) » The MATLAB Blog - MATLAB & Simulink
David Young
David Young on 13 Mar 2024
The first change I would make would be to scrap the special treatment of Nx1 and 1xN matrices. These are given special status (as "column vectors" and "row vectors"), which must, I suppose, be helpful sometimes, but in practice it's confusing (that is, it confuses me) and makes general code much more complex than it should be.
For example, if you write c = a(b) where the values of all the variables are numeric arrays, the rule is that c will be the same shape as b except when a or b is a column vector and the other is a row vector. An exception to a general rule is, as a general rule, a bad thing. One that affects as fundamental an operation as indexing an array is a very bad thing.
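To make the exception concrete (the numbers are just an illustration):
a = [10 20 30];    % 1x3 row vector
b = [1; 2; 3];     % 3x1 column vector
c = a(b)           % c is 1x3, the shape of a, not the 3x1 shape of b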
Another exception: size truncates trailing 1s except in the case of column vectors, and ndims returns 2 for column vectors. General code therefore has to handle this case specially. For an example of code that could be simpler without these complexities see exindex.
It makes for messy code in other ways: my arguments blocks are peppered with (1,1) to indicate scalars, when (1) would be easier to read and should be sufficient.
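For example, something like this (a made-up function, just to show the annotations):
function y = scaleSignal(x, gain)
    arguments
        x (:,1) double
        gain (1,1) double = 1    % (1,1) is how you have to say "scalar"
    end
    y = gain * x;
end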
It's not as if row vector and column vectors are always treated the same as each other. Matrix multiplication distinguishes between them of course, as does the loop construct for. Making them a special category, when they're actually just different shapes of arrays, simply adds complexity.
Can anyone make a case for keeping this peculiarity?
Stephen23
Stephen23 on 26 Mar 2025 at 23:53 (Edited on 3 Apr 2025 at 7:48)
"Can anyone make a case for keeping this peculiarity?"
A consistent mathematical language with arrays and matrices as its primary data type necessarily has row and column vectors (not as special data types!), otherwise you end up with the kind of astonishing mess that is named Julia:
Someone wrote on this thread regarding Julia's challenges with vector transpose: "There has been a lot of thought put into this, and it’s a very complicated issue".
Of course Julia programmers made it into a complicated issue: once a language has been forced by programmers to have one-dimensional vectors, then performing basic mathematical operations on them becomes "complicated" by their own admission! There is no way to avoid this: what is the transpose of a one-dimensional vector?
"Making them a special category, when they're actually just different shapes of arrays, simply adds complexity."
In MATLAB (almost) everything is a matrix/array:
As such, row/column vectors do have some special syntaxes/use-cases, as you mentioned.
David Young
David Young on 27 Mar 2025 at 12:43
Stephen23 and Austin Fite - I'm afraid you have both interpreted my suggestion as almost the opposite of what I said. I said that I wanted 1xN and Nx1 matrices to be handled like other matrices. I am precisely not suggesting they be made special categories - they are currently special categories in certain circumstances, and that's what is confusing.
Stephen - yes, everything in MATLAB is a matrix, except that 1xN and Nx1 matrices are sometimes handled specially (I gave examples). What I proposed was that the exceptions should be removed so that all matrices are treated consistently. Row vectors and column vectors are indeed different to each other, and the thing I'd change is that MATLAB currently sometimes tries to treat them as if they were the same. Please have another look at my post.
Austin - my proposal would not break implicit expansion, which behaves consistently at present, as you observe, and which would not be affected by what I suggest. Nor would it affect the way 1x1xP arrays are treated, which is also already consistent.
If you still wish to disagree, please look carefully at the current behaviour of ndims and of c = a(b) when a is 1xN and b is Nx1 (or vice versa), and explain how they are consistent with general rules for arrays, which you are both in favour of, as indeed am I.
Stephen23
Stephen23 on 3 Apr 2025 at 8:23
Sorry, I seem to have misunderstood your original comment.
You are correct, there are some special use-cases or syntaxes that behave differently with row/column vectors, e.g. the indexing that you mentioned. This topic has been discussed before, e.g.:
I suspect (much like some others on that thread) that the reason is very historical and very human: for example, the output of some basic commands such as COLON simply looks nicer displayed as a row vector. It is worth remembering that MATLAB has very old roots, back to times when programming tastes were quite different.
To be consistent MATLAB should also define a vector as any array with at most one non-scalar dimension, etc., and get rid of any special definitions of matrix. It certainly would be interesting to work with such a language.
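For what it is worth, that proposed definition is easy enough to state as code (a sketch):
isGeneralVector = @(x) nnz(size(x) ~= 1) <= 1;    % at most one non-scalar dimension
isGeneralVector(ones(1,1,5))                      % true here, whereas isvector(ones(1,1,5)) is false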
J. Alex Lee
J. Alex Lee on 27 Mar 2025 at 13:11
I kind of get what David is saying: sometimes a 1D thing is convenient for "programming" purposes and has some intuitive appeal, but following the rules can be confusing coming from the outside. My prime example is iterating:
for i = (1:4)
    i
end
i = 1
i = 2
i = 3
i = 4
vs
for i = (1:4)'
    i
end
i = 4×1
     1
     2
     3
     4
I can't think of specific examples off the top of my head right now, but I do recall running into confusion in the past where some functions return things as columns and others return things as rows. When the context for these outputs is not matrix math but rather just a list, it can lead to transposing or not transposing depending on what you need.
Austin Fite
Austin Fite on 26 Mar 2025 at 20:39
This would break a lot of convenient features introduced by implicit expansion. For instance, applying element-wise operators when dimensions are the same size becomes ambiguous: consider A .* x where A is NxN and x is 1xN or Nx1 -- the orientation of the x vector matters and means different things here. The same applies for every other element-wise operator.
It also impacts arrays that are intentionally shaped to apply math in higher dimensions, i.e. 2 or more singleton dimensions preceding non-singleton dimensions. For example, a 1x1xP array used to apply a scale factor separately to each "page" of an MxNxP array.
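Concretely, a small sketch of both cases:
A = magic(3);
r = [1 10 100];       % 1x3: implicit expansion scales the columns of A
c = [1; 10; 100];     % 3x1: implicit expansion scales the rows of A
colScaled = A .* r;
rowScaled = A .* c;
B = rand(4, 5, 3);                    % MxNxP data
s = reshape([2 10 0.5], 1, 1, []);    % 1x1xP scale factors
C = B .* s;                           % page k of C is B(:,:,k) * s(k)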
I much prefer the existing scheme where all arrays operate under consistent rules and for there not to be a special case for vectors.
David Young
David Young on 27 Mar 2025 at 12:44
Austin - please see my reply to Stephen23 above.
Austin Fite
Austin Fite on 27 Mar 2025 at 13:38
Hey David, I don't disagree with the spirit of wanting consistent behavior. I think the fact that this was posted under requests for "next gen MATLAB" ideas implies a desire for more extensive changes to vector handling than your specific examples implied.
Jim Svensson
Jim Svensson on 19 Sep 2023
Some simple things would be nice:
  1. counter += 1; salary *= 2 % operator assignment, or whatever it is called
  2. y = (x < 0) ? 3 : 2*x; % ternary operator
Walter Roberson
Walter Roberson on 19 Sep 2023
y = (x < 0) ? 3 : 2*x
Would that be:
y = zeros(size(x), 'like', x); %option 1
y(x<0) = 3;
y(~(x<0)) = 2*x(~(x<0));
or would it be
y = zeros(size(x), 'like', 3); %option 2
y(x<0) = 3;
y(~(x<0)) = 2*x(~(x<0));
or would it be
if any(x < 0, 'all') %option 3
    y = 3;
else
    y = 2*x;
end
Or would it be
if all(x < 0, 'all') %option 4
    y = 3;
else
    y = 2*x;
end
or should it be
if isempty(x)
    y = x;
else
    if x(1) < 0 %option 5
        y = zeros(size(x), 'like', 3);
    else
        y = zeros(size(x), 'like', x);
    end
    y(x<0) = 3;
    y(~(x<0)) = 2*x(~(x<0));
end
Option 1 has a result that is always class(x); option 2 has a result that is class double (because 3 is double). Option 3 and Option 4 force a scalar test and have a class that depends upon which way the test gets answered. Option 5 has a result that is the same class as the first result.
Suppose you had
y = (x < 0) ? ones(1,10,'uint8') : x
and suppose that x is a non-scalar that does not happen to be 1 x 10. Does the statement make sense? It potentially does if, as is consistent with if and while, you interpret it as a test over "all" of x. But that would only be consistent if you treat the ?: operator as syntax that can only occur as a syntactical shortcut for if/elseif with an assignment always needed. If you treat ?: as an operator, then for consistency you need the ?: test to be vectorized, more like
(x<0).*3 + (~(x<0)) .* x
(except that expression doesn't work for cases where x might contain infinity, since infinity times 0 is NaN rather than 0)
Jim Svensson
Jim Svensson on 19 Sep 2023
Yes this is what I mean.
Simon
Simon on 6 Sep 2023
Insert 'parfor' option into splitapply( ), grouptransform( ) or create separate parallel versions of those two functions.
Right now the group-based functions run through the groups with a for-loop, which is very slow for data with a large number of groups. When the same data set was run through a parfor-loop instead, it was 5 to 10 times faster.
Functional programming that hides the looping details brings the coding process closer to human cognition, and parfor is a really powerful beast. The combination of these two workhorses would let Matlab take a decisive lead ahead of the sluggish reptile.
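The kind of parfor workaround I mean looks roughly like this (a sketch; the table and variable names are just placeholders):
g = findgroups(T.Category);              % T is an existing table, Category the grouping variable
nGroups = max(g);
results = cell(nGroups, 1);
parfor k = 1:nGroups
    rows = (g == k);
    results{k} = mean(T.Value(rows));    % stand-in for the real per-group computation
end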
Clay Swackhamer
Clay Swackhamer on 5 Sep 2023
My wish list:
  • A real, beautiful dark theme
  • Improving the appearance of figures. Reduce padding around subplots, set default axis and tick mark color to black, adjust default linewidth and font sizes to be a bit larger. In general, try to make figures made quickly with default settings look better (see the sketch after this list).
  • Multi-start options for all solvers in the optimization/curve fit toolbox.
  • Consistent arguments for plotting functions. I think some still use different capitalization schemes (like "LineWidth" vs "linewidth").
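For the figure-appearance item, the sketch I mean -- roughly the defaults I override manually today, using the standard root default-override properties:
set(groot, 'DefaultLineLineWidth', 1.5);
set(groot, 'DefaultAxesFontSize', 12);
set(groot, 'DefaultAxesXColor', 'k', 'DefaultAxesYColor', 'k');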
Clay Swackhamer
Clay Swackhamer on 6 Sep 2023
This is exactly what I was thinking of. I use both methods to change plot attributes. Maybe I should pick one method and stick with it...
Walter Roberson
Walter Roberson on 5 Sep 2023
When you use name/value pairs for the plotting functions, then the comparisons done are case-insensitive. The same is true when you use set() calls.
When you use dot-syntax like
h.LineWidth = 2.5;
then the comparisons done are case-sensitive.
At the moment I do not know whether the matching is case-sensitive when you use the name=value calling syntax.
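For example, all of these set the same property; only the dot syntax requires the exact documented case:
h = plot(1:10, 'linewidth', 2.5);    % name/value pair: matched case-insensitively
set(h, 'linewidth', 3);              % set(): also case-insensitive
h.LineWidth = 2;                     % dot syntax: must be spelled LineWidth exactly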
Andrew Janke
Andrew Janke on 4 Sep 2023
Yeah, @Rik and/or @Paul, go for creating a new "Matlab X Part 2" question; my browsers are also having trouble dealing with how big this question has gotten. I don't see a way I can lock this question; dunno if that's a moderator-only action or I just don't have enough rep.
Rik
Rik on 4 Sep 2023
The current thread is fairly close to my arbitrary suggested limit of 50 answers. If you think it makes more sense to start a new thread, go ahead. I'm happy to start a new one, but you can also do it and add it to the list of threads (don't forget to edit the other threads as well).
In an attempt to discourage new answers (while waiting for the ability to soft-lock threads), I have started editing the older questions by putting '[DISCONTINUED]' at the start of the question.
Paul
Paul on 4 Sep 2023
This wonderful thread is becoming unwieldy and slow to respond to editing on both laptop and desktop for me. If others are having the same problem, perhaps this Question should be locked, at least for new Answers (is that possible?), and a new Question opened for new Answers?
Simon
Simon on 1 Sep 2023
My wish list, not about code improvement but about official tutorials.
  • a tutorial on using splitapply to take advantage of parallel computation.
  • a tutorial on assignment and indexing involving comma-separated lists and cell arrays. It should not only show what works, but also explain what syntax would go wrong, and why it goes wrong.
For example, x = ["a", "b"] is a 1x2 string array. But then x(:) becomes a column vector, then x{:} is a comma-seprated list; then [x{:}] is a character vector 'ab'. Such 'delicate' usage is the biggest bottleneck for my coding process. @Stephen23 has written a tutorial of comma-separated list. I hope Mathworks staff can take from there to expand it, covering the use cases of table. For example, if T is a table. T(1,:) is a single-row table. But then T{1,:} sometimes works if variables' data type can be lumped together; sometimes fails if variables have mixed data types. But then when it works, say, all table variables are 'string'. Why then T{1,:} is a string array, intead of a comma-separated list? Two similar syntaxes, x{:} and T{1,:}, have two different semantic meaning. That really causes workflow jam in my coding.
Andrew Janke
Andrew Janke on 5 Sep 2023
@Simon - I do cast some of my table variables to categorical, and have also noticed things go slower than I expected with them. (Kinda the whole point of categoricals is that they're small and fast compared to strings, right?) I have no idea what would cause categoricals in table variables to go slow.
Simon
Simon on 5 Sep 2023
@Andrew Janke Do you cast your table variables to categorical? In my case, if a task is to process strings, it is many times slower when the variables are cast as categorical. I don't know why. What do you think might have caused that?