MATLAB Answers

Deep learning with a GPU that supports fp16

Asked by Sukhwa Hong on 28 Aug 2019
Latest activity Commented on by Krishna Bindumadhavan on 14 Sep 2019
Hi.
NVIDIA released the RTX 2XXX series, which supports fp16 to accelerate the training process.
Does MATLAB support this?
Thank you.

  4 Comments

It is supported for deep learning code generation, but not for general code generation.
There is support for half precision in MATLAB via the half-precision object, available in the Fixed-Point Designer toolbox: https://www.mathworks.com/help/fixedpoint/ref/half.html.
General code generation support for the half-precision data type via MATLAB Coder and GPU Coder is under active development. This functionality is expected in an upcoming release.
As mentioned below, there is currently no support for using half precision to train a deep learning network in MATLAB. This is expected in a future release.
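For reference, a minimal sketch of the half object (this assumes Fixed-Point Designer is installed; the variable names are just for illustration):

a = half(pi);            % convert a double scalar to half precision
b = half(single(2.5));   % conversion from single works as well
c = a .* b;              % arithmetic stays in half precision
class(c)                 % returns 'half'
d = double(c);           % cast back to double when full precision is needed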


1 Answer

Answer by Joss Knight on 29 Aug 2019 (Accepted Answer)

You can take advantage of FP16 when generating code for prediction on a deep neural network. Follow the pattern of the Deep Learning Prediction with NVIDIA TensorRT example but set the DataType property of the DeepLearningConfig to 'fp16'. This will use the Tensor cores on a Volta or Turing card such as the RTX series.
There is no way yet to use half precision or Tensor cores for training a deep neural network in MATLAB. This is expected in an upcoming release.
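As a rough sketch of the configuration (the entry-point function mynet_predict and the input size are placeholders; follow the TensorRT prediction example for the complete workflow):

cfg = coder.gpuConfig('mex');
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');
cfg.DeepLearningConfig.DataType = 'fp16';   % run inference in half precision, using Tensor cores on Volta/Turing GPUs
codegen -config cfg mynet_predict -args {ones(224,224,3,'single')} -report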

  0 Comments
