MATLAB Answers

How do I export a Neural Network from MATLAB?

290 views (last 30 days)
I have a neural network that I trained using MATLAB. I want to export the network so I can use it with other frameworks, for example Caffe. How do I do that?

Accepted Answer

MathWorks Support Team on 29 Aug 2019
Edited: MathWorks Support Team on 29 Aug 2019
The recently released Neural Network Toolbox Converter for ONNX Model Format now allows one to export a trained Neural Network Toolbox™ deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks, such as TensorFlow®, that support ONNX model import.
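For instance, exporting is a single call to exportONNXNetwork; in the sketch below, the variable name net and the file name myNetwork.onnx are just placeholders for your own trained network and output file:

% Export a trained network to the ONNX model format.
% Requires the Neural Network Toolbox Converter for ONNX Model Format support package.
exportONNXNetwork(net, 'myNetwork.onnx')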
Alternatively, you could export via the MATLAB Compiler SDK.
Using the MATLAB Compiler SDK, you can save the trained network as a MAT-file and write a MATLAB function that loads the network from the file, performs the desired computation, and returns the network's output.
You can then compile your MATLAB function into a shared library to be used in your C/C++, .NET, Java, or Python project.
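As a minimal sketch, such a function could look like the following; the function name, the MAT-file name trainedNet.mat, and the variable name net are placeholders, not a fixed API:

function y = predictWithNet(x)
% Load the trained network once and cache it between calls.
persistent net
if isempty(net)
    s = load('trainedNet.mat');  % MAT-file created earlier with save('trainedNet.mat', 'net')
    net = s.net;
end
y = predict(net, x);  % perform the desired computation
end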
You can find more information about MATLAB Compiler SDK in the following link:
Furthermore, the objects that MATLAB uses to represent neural networks are transparent, so you can access all the information that describes your trained network.
For example, training a convolutional neural network gives you an object of type SeriesNetwork. You can then inspect the weights and biases of the trained network:
convnet.Layers(2).Weights  % weights of the second layer (typically the first convolutional layer)
convnet.Layers(2).Bias     % biases of the same layer
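If you want an overview of every learnable parameter, one possible sketch is to loop over the layers; convnet is assumed to hold your trained SeriesNetwork:

% List the size of the weights in every layer that has them.
for i = 1:numel(convnet.Layers)
    layer = convnet.Layers(i);
    if isprop(layer, 'Weights') && ~isempty(layer.Weights)
        fprintf('Layer %d (%s): weights of size %s\n', i, layer.Name, mat2str(size(layer.Weights)));
    end
end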
Then, using for example Caffe's MATLAB interface, you should be able to save a convolutional neural network as a Caffe model. The code for the MATLAB interface is available at the following link and includes a classification demo that shows you how to use the interface.
Please note that the above code is not developed or supported by MathWorks Technical Support. If you have any questions about how to use the code, please contact the project's developers.
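For illustration only, a rough sketch of copying MATLAB weights into a Caffe model through the matcaffe interface might look like this; the prototxt file deploy.prototxt, the layer name conv1, and the output file name are assumptions, and the blob shapes may need further adjustment for your network:

% Rough sketch: push MATLAB weights into a Caffe net and save it.
caffe.set_mode_cpu();
net = caffe.Net('deploy.prototxt', 'test');  % prototxt assumed to match the MATLAB architecture
% matcaffe stores blobs width-first, so permute MATLAB's HxWxCxN weights.
W = permute(convnet.Layers(2).Weights, [2 1 3 4]);
net.params('conv1', 1).set_data(single(W));
net.params('conv1', 2).set_data(single(convnet.Layers(2).Bias(:)));
net.save('myModel.caffemodel');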


More Answers (2)

Maria Duarte Rosa on 25 Jun 2018
The recently released Neural Network Toolbox Converter for ONNX Model Format now allows one to export a trained Neural Network Toolbox™ deep learning network to the ONNX™ (Open Neural Network Exchange) model format. The ONNX model can then be imported into other deep learning frameworks, such as TensorFlow®, that support ONNX model import.

  1 Comment

michael scheinfeild on 6 Aug 2018
I still have had no success importing it into C++ from ONNX; there are many compilation issues.



michael scheinfeild on 14 Apr 2019
After testing ONNX, I found that the output of the convolutions is not the same as in MATLAB.

  3 Comments

Guillaume Vanoost on 5 May 2020
Did you use TensorRT to import it into C++?
Shweta Singh on 5 May 2020
Hi Michael,
For any further questions or clarification, feel free to contact MathWorks Technical Support by using the following link:
Thanks,
Shweta Singh
Vasil Ppov on 20 Jul 2020
Yes, the output size is not the same! I exported an ONNX YOLO model from MATLAB to Python successfully. The expected output tensor should have size 14x14x12; however, the size in Python is 2x6x196 (2x6 = 12 and 14*14 = 196). Could you tell me why, and how to fix it?

