File Exchange


Deep Learning Toolbox Converter for ONNX Model Format

Import and export ONNX™ models within MATLAB for interoperability with other deep learning frameworks

48 Downloads

Updated 19 Apr 2019

Import and export ONNX™ (Open Neural Network Exchange) models within MATLAB for interoperability with other deep learning frameworks. ONNX enables models to be trained in one framework and transferred to another for inference.

Opening the onnxconverter.mlpkginstall file from your operating system or from within MATLAB will initiate the installation process for the release you have.
This mlpkginstall file is functional for R2018a and beyond.

Usage example:
%% Export to ONNX model format
net = squeezenet; % pretrained model to be exported
filename = 'squeezenet.onnx';
exportONNXNetwork(net, filename);

%% Import the network that was exported
net2 = importONNXNetwork('squeezenet.onnx', 'OutputLayerType', 'classification');

% Compare the predictions of the two networks on a random input image
img = rand(net.Layers(1).InputSize);
y = predict(net, img);
y2 = predict(net2, img);

max(abs(y - y2))

To import an ONNX network in MATLAB, please refer to:
https://www.mathworks.com/help/deeplearning/ref/importonnxnetwork.html

To export an ONNX network from MATLAB, please refer to:
https://www.mathworks.com/help/nnet/ref/exportonnxnetwork.html

Comments and Ratings (17)

Hi Jihang, thanks for sharing this information, unfortunately it didn't resolve the problem in my case.

Jihang Wang

Hi everyone, I found the reason why it doesn't work, with the help of the MathWorks technical support team, and I just want to share my experience here. Basically, there was a function on my path shadowing one of the built-in MATLAB functions. I reset my MATLAB path using the code below:
>> restoredefaultpath
>> rehash toolboxcache
>> savepath % note: this command will overwrite my current path preferences.

After that, I downloaded and reinstalled the converter app from this page and reran the export code. Problem solved :) Hope this helps.
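Before resetting the whole path as described above, a less drastic way to check for a shadowing function is `which -all` on the function name involved. A minimal sketch; `exportONNXNetwork` and the folder path here are just examples:

```matlab
% List every file on the path that defines this name; more than one
% entry (or a user folder listed before the toolbox) indicates shadowing.
which -all exportONNXNetwork

% If a user file shadows the toolbox function, remove its folder:
% rmpath('C:\myStuff\conflictingFolder')  % hypothetical folder
```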

Hi Andreas, I just used a custom CNN and checked it with WinMLRunner, I didn't try any pretrained models though.

Hi Gabriel
Could you tell me which CNN you used?
As mentioned before, I tried the basic googlenet and I couldn't use it with Microsoft ML.
It would be very helpful if I could use the ONNX file exchange.
Thanks in advance

Hi Ting, thanks a lot for the Opset update. However, now I obtain the same error as Andreas for LSTM networks: "First input does not have rank 2". If I have more than one LSTM layer in the network, the error message changes to: "First input tensor must have rank 3". CNNs seem to work, though.

Ting Su

Hi Andreas and Jihang, can you reach our technical support and send the model to us?

Hi Ting, I ran into the same issue with C#. I can export the network in different versions, but if I try to load the model into Windows.ML I get a "ShapeInferenceError": the first input does not have rank 2. With opset v6 it is possible to load the file, but it can't be used. I tested googlenet and compared the ONNX models with a program called "Netron". The difference I found was that the first layer "Sub" changed from [3x244x244] to [1x3x244x244], but I'm not sure if this is the problem. A second thing is that with ONNX v6 Visual Studio can generate a model class automatically, but not with v7 or higher; it seems the file is not recognized as an ONNX model. Can you give any advice on how to use MATLAB-trained models in C#?

Jihang Wang

Hi Ting, I have the same issue when loading the ONNX model in C#. I tried to save the model to different Opset versions but none of them works. Please advise.

Ting Su

Hi Gabriel,
We recently added support for ONNX opsets 7, 8 and 9. You can specify which opset to use via the optional input argument 'OpsetVersion' during export. You should be able to download the update if you have MATLAB R2018b.
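Based on Ting's description, exporting with a specific opset would look like this (a sketch; the network and filename are placeholders, and 'OpsetVersion' requires the R2018b version of the converter or later):

```matlab
net = squeezenet;                        % any supported pretrained network
filename = 'squeezenet_opset9.onnx';
exportONNXNetwork(net, filename, 'OpsetVersion', 9);  % 6, 7, 8, or 9
```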

Ting Su

Hi Kenneth,
We saw a similar issue and the fix will be released soon. It would be great if you could send us your MATLAB model so we can test it.

It would be great if the export could be updated to opset version 7 or 8 to allow use with Windows ML.

exportONNXNetwork does not work properly with CNTK and Python. The conversion produces a ValueError: "Gemm: Invalid shape, input A and B are expected to be rank=2 matrices."

Hui Yin Lee

Hi, is the code or toolbox available for exporting a Faster R-CNN model? I get an error saying the model is not a DAGNetwork. Hoping to get some feedback or help here.

Do you guys know when support for the Constant operator will be added?

Error using importONNXNetwork (line 39)
Node 'node_20': Constant operator is not supported yet.
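Until the Constant operator is supported, one possible workaround (a sketch, not tested against this particular model; 'model.onnx' is a placeholder filename) is importONNXLayers, which imports the network as a layer graph and turns unsupported operators into placeholder layers you can inspect and replace manually:

```matlab
% Import as a layer graph; unsupported ONNX nodes become placeholder layers
lgraph = importONNXLayers('model.onnx', 'OutputLayerType', 'classification');

% Locate the placeholders (e.g. the unsupported Constant node) so they
% can be replaced with equivalent MATLAB layers via replaceLayer
placeholders = findPlaceholderLayers(lgraph)
```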

umit kacar

This code worked for me :) It is very good. Thank you.

Ting Su

Hi Trinh,
We would like to hear more details on the problem of importONNXNetwork(). Have you installed an old version of this converter before?

Trinh Pham

The function importONNXNetwork() doesn't work when I use the example above!

MATLAB Release Compatibility
Created with R2018a
Compatible with R2018a to R2019a
Platform Compatibility
Windows macOS Linux
