MATLAB Answers


Can I use MATLAB with an NVIDIA GPU on macOS 10.14 Mojave?




1 Answer

Answer by MathWorks Support Team on 30 Jan 2019
 Accepted Answer

MATLAB requires that an NVIDIA-supplied graphics driver be installed on your Mac in order to take full advantage of an NVIDIA GPU. NVIDIA has not released an Apple-approved graphics driver for macOS Mojave. For more information, please see this official statement from NVIDIA on NVIDIA's developer forums.
The impact on MATLAB is as follows:
You can use MATLAB with an NVIDIA GPU on macOS Mojave; however, graphics performance is degraded compared to running MATLAB on previous releases of macOS.
Computational acceleration
NVIDIA-specific functionality such as CUDA is not available, which means GPU arrays (gpuArray), provided by Parallel Computing Toolbox and used by many products, will not work.
The following products have features that rely on CUDA functionality, and these features will be impacted by the lack of an NVIDIA-supplied graphics driver:
  • Parallel Computing Toolbox
  • GPU Coder
  • Image Processing Toolbox
  • Deep Learning Toolbox
  • Statistics and Machine Learning Toolbox
  • Computer Vision System Toolbox
  • Signal Processing Toolbox
  • Communications Toolbox
  • Phased Array System Toolbox
  • Text Analytics Toolbox
  • Reinforcement Learning Toolbox
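As a quick way to check your own setup, the following sketch (not part of the original answer; it assumes Parallel Computing Toolbox is installed) tests whether MATLAB can see a CUDA device. On Mojave without an NVIDIA-supplied driver, no device will be reported:

```matlab
% Minimal GPU availability check (Parallel Computing Toolbox required).
% On macOS Mojave with no NVIDIA driver, gpuDeviceCount reports 0 and
% calling gpuDevice raises an error.
if gpuDeviceCount > 0
    d = gpuDevice;                       % query the selected CUDA device
    fprintf('CUDA device found: %s\n', d.Name);
    A = gpuArray(rand(1000));            % create an array on the GPU ...
    B = A * A;                           % ... and compute on the GPU
    result = gather(B);                  % copy the result back to the CPU
else
    warning('No CUDA-capable GPU detected; gpuArray features are unavailable.');
end
```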


The "translator" is Apple's existing OpenGL framework, which Apple has said it will stop supporting soon and which it will certainly not improve. I would expect that within two, at most three, macOS releases, OpenGL will largely stop functioning on macOS.
Apple considers such a translator to be too much work for them, so I would not expect MathWorks to be able to handle it.
Thanks Walter.
I guess either Apple and Nvidia will work out their differences, or Apple and AMD will develop Metal into a real alternative to CUDA.
In the meantime, does anyone know if it's possible to use something like PlaidML for these hardware/software combinations (Apple/Metal, PlaidML/MATLAB)?
Metal is a graphics protocol, not a GPU compute interface.
In theory, Apple could certify a set of GPU drivers for CUDA distinct from the graphics drivers. I do not know what either Nvidia or Apple is thinking at this point.
Based on past history, I can speculate, without any inside knowledge at all (so I could be wrong, wildly so):
Apple seems willing to have Nvidia say "fine, we won't bother porting to Apple then!" It has been six years since Apple put an Nvidia GPU into anything other than the Mac Pro, so except perhaps on the Mac Pro side, Apple's direct revenue doesn't depend much on Nvidia.
Apple probably has more to lose from the game industry's dependence on OpenGL, but several major, high-profile games companies are working on Metal ports and (I gather) getting better performance than with DirectX, so Apple can expect to keep some of the games market, at least the high-performance end.
One reason Apple can afford to tell (or let) Nvidia take a hike is that Apple has AMD to rely on: the old trick of playing one company off against the other.
But really, Apple hates being dependent on any one company, because that gives the company too much leverage. Apple's solution is to go in-house and build its own graphics hardware and GPU. Indeed, it has been working on that for years; I find 2015 articles about it, and Apple has already put its own GPU into some of its phones.
Apple has also been working on replacing x64 with an in-house CPU, with a possible Mac out next year. If I recall correctly, definitions for the new CPU have already been found inside the OS currently in beta, Catalina (which, by the way, ends 32-bit support).
If I were MathWorks, I would probably think hard about holding off on putting effort into Metal until more is known about the new CPUs, because if the new CPUs are not machine-code compatible with x64, it is not obvious that MathWorks will want to bother: it would be a big effort for a platform estimated by some parties at roughly 15% of their market.
Oh yes: if Apple goes in-house for the GPU (already known to be well underway), there is no certainty that it will be AMD- or Nvidia-compatible, and it more likely will not be, supporting OpenCL at most. That is a reason it would be risky for MathWorks to spend much effort on AMD GPU support for Apple systems.
I can talk about these things because all I know is public knowledge: I have not discussed this with MathWorks, Nvidia, Apple, or AMD.
