Shallow neural network: is there a way to extract it?
I built a shallow neural network in MATLAB.
I want to use it outside of MATLAB, but is there a way to extract it for use in other code?
As far as I can tell, only .m files can be generated, and the trained model cannot be imported into other environments.
Answers (2)
Walter Roberson on 21 Oct 2022 (edited 5 Jan 2023)
Yes, certainly. A shallow neural network is pretty much just a struct. You can extract the fields and export them in whatever binary or text format is suitable for your purposes. You can find a description of the properties on the network reference page.
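For illustration, a minimal sketch of what that extraction might look like for a one-hidden-layer feedforwardnet (the variable names are my own, and net is assumed to be the trained network object):
% Sketch: pull the raw parameters out of a trained shallow network "net"
% (assumed to be a one-hidden-layer feedforwardnet) and save them in
% portable formats. Variable names here are illustrative.
W1 = net.IW{1,1};   % input-to-hidden weight matrix
b1 = net.b{1};      % hidden-layer bias vector
W2 = net.LW{2,1};   % hidden-to-output weight matrix
b2 = net.b{2};      % output-layer bias vector
save('netParams.mat','W1','b1','W2','b2');   % MATLAB binary format
writematrix(W1,'W1.csv');                    % or plain text, one file per array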
I would suggest that you are not really interested in the general theoretical question of whether it is possible to get data out of a shallow neural network. I would suggest that what you are really asking is a duplicate of https://www.mathworks.com/matlabcentral/answers/1812035-convert-matlab-neural-network-to-onnx?s_tid=srchtitle in which you want to know whether there is a way to convert a specific kind of shallow neural network into the ONNX network format. As was indicated to you there, MathWorks does not provide such a function.
I have never looked at the structure of ONNX networks, so I do not know how practical it is to convert a shallow network to ONNX format. https://github.com/onnx/
0 Comments
Karol P. on 5 Jan 2023
Shallow neural networks are easy to port manually. I would recommend running this command on your network:
genFunction(net)
This will generate a function that simulates your shallow network. In practice it will be just a function file with a series of simple mathematical operations and, possibly, explicit loops. Porting it to any other language is straightforward; I successfully did it for a Python 3.x implementation. For example, with a generic MATLAB data set, you can train a network:
[x,t] = bodyfat_dataset;            % built-in example data set
bodyfatNet = feedforwardnet(10);    % one hidden layer with 10 neurons
bodyfatNet = train(bodyfatNet,x,t); % train with default settings
y = bodyfatNet(x);                  % network predictions
Then running genFunction(bodyfatNet,'bodyfatFcn'); will create a file that perfectly represents your ANN. As you can see, it contains only trivial mathematical operations, available in any scientific programming language:
function [Y,Xf,Af] = bodyfatFcn(X,~,~)
%BODYFATFCN neural network simulation function.
%
% Auto-generated by MATLAB, 05-Jan-2023 20:03:49.
%
% [Y] = bodyfatFcn(X,~,~) takes these arguments:
%
% X = 1xTS cell, 1 inputs over TS timesteps
% Each X{1,ts} = 13xQ matrix, input #1 at timestep ts.
%
% and returns:
% Y = 1xTS cell of 1 outputs over TS timesteps.
% Each Y{1,ts} = 1xQ matrix, output #1 at timestep ts.
%
% where Q is number of samples (or series) and TS is the number of timesteps.
%#ok<*RPMT0>
% ===== NEURAL NETWORK CONSTANTS =====
% Input 1
x1_step1.xoffset = [22;118.5;29.5;31.1;79.3;69.4;85;47.2;33;19.1;24.8;21;15.8];
x1_step1.gain = [0.0338983050847458;0.00817494379726139;0.0414507772020725;0.099502487562189;0.0351493848857645;0.0254129606099111;0.0318979266347687;0.0498753117206983;0.124223602484472;0.135135135135135;0.099009900990099;0.143884892086331;0.357142857142857];
x1_step1.ymin = -1;
% Layer 1
b1 = [-4.1688788164517145418;3.9421924254343023719;-0.57836024709106670372;-2.7267031902382474762;0.34123076571202975993;0.86642967283179794791;3.9612324512272656385;3.6322017725241400044;1.0269173063776930732;1.8766051649753006103];
IW1_1 = [-0.013051683830370475192 -0.40864434799490134687 -3.4093327802380102298 2.76724396774887893 -0.093027162630096249529 3.4978642845103053993 0.32983632675776008991 -0.39671040165174920045 -4.162922388295510423 -3.5739684834914360323 -3.4650779694327868974 0.57215487201826187302 0.051545947864942348593;0.8724325338010686659 -4.2206981579094371426 -7.5786952452679008374 -1.1672027680878336309 -1.6476103335786520532 2.8658925750719603798 -0.69730513383429981733 2.9872147037139087367 0.58713929657199170897 3.7184709557831121529 1.7095110287866268628 1.5222350228149608142 1.7167610314621337686;-5.7212715144591257399 -0.52020098314708684839 3.0395733948529395363 0.25306369843645259987 -1.5381181686895804006 -0.12680744032575441693 -0.19003639098014835085 2.2904747696095846266 2.4031499084859055948 0.333434207787930037 1.7975873339642221005 0.43439227612087949471 0.20715853211547052837;2.6504884847548821902 -1.074167437323735097 3.4652021896933487 2.2511893237371078946 -1.8985953939745767727 -2.6851969581156236444 1.8881869624905358584 1.1376311780867540691 -1.3706323507792048666 3.7309552164924513207 6.4030845092061570156 2.364375947677445744 -0.80619261360355443102;-1.9725281695067233834 1.5186659167929952297 -4.1391949111005859052 0.68616648606016827916 -5.9851968227681107138 -2.2433321982141607442 -1.230298259356137569 2.0386085722660549635 0.15633394664609567837 3.462087035982897909 -0.71324170176982781832 -3.806486691435979175 0.14512960530992444208;0.05922882609162639922 -0.74200096501974288632 -1.3078955205228659509 -0.7596626232890700825 0.72608870289072013904 1.2784178430216996958 -0.27709188898229464293 -0.16291462029477538076 0.99369761691639124646 -1.1448616639504127779 0.73218054313546709899 -0.52616293066492758612 -0.21387000219698323877;-1.7184897327703325676 -0.90654155391768087568 1.7373781867653084188 3.6373866032727408815 -2.1378927944345345047 -1.3251670568909215131 1.9696851768284975304 4.45270737058025734 2.840672731294883846 0.325760172910157908 -0.76522827718488595217 -1.4066696338607995731 -0.95431038499267273334;0.046991643251568987472 3.5925164413972696664 -6.4800651801687045861 -2.0420590257136108647 -1.9753967810127166516 -0.93434223792897597161 -2.0023418045476191196 2.1214370668647233309 3.0113476039367146342 -2.7444487476015351213 2.0127420211373934222 -1.0251841129479244419 -1.5184378010576033979;0.19060476453690519683 1.7513918369692982324 0.66275021465404226895 1.4430458277024251768 -2.5263440195316477777 -1.1738479832578336826 -0.44710643227143137546 0.22552135854961619099 -3.7668192606835386727 -2.9319047551023489362 -0.32803915815444395498 -1.7159308933576893352 -3.4566153590778312399;0.97096293624496943231 1.5524899156780680443 -0.024232109175742033713 0.88923409532768604713 5.1223167619529412775 -1.6893845712674291359 1.634784055755800658 -3.8156233192361943551 0.97035322711856064615 -4.0445245599638663947 0.706886878460264656 -0.41308353311699397281 0.55660073486280270405];
% Layer 2
b2 = 0.6105657332883046573;
LW2_1 = [0.86045384182207018675 0.25917492316268486707 -0.0013754261582584470514 -0.0023612870729325351193 -0.25454110944757596391 0.8504138920809453106 0.28849076973762566301 -0.28754348691450903885 -0.010545675420425126151 -0.4509272900288166519];
% Output 1
y1_step1.ymin = -1;
y1_step1.gain = 0.0421052631578947;
y1_step1.xoffset = 0;
% ===== SIMULATION ========
% Format Input Arguments
isCellX = iscell(X);
if ~isCellX
    X = {X};
end
% Dimensions
TS = size(X,2); % timesteps
if ~isempty(X)
    Q = size(X{1},2); % samples/series
else
    Q = 0;
end
% Allocate Outputs
Y = cell(1,TS);
% Time loop
for ts=1:TS
    % Input 1
    Xp1 = mapminmax_apply(X{1,ts},x1_step1);
    % Layer 1
    a1 = tansig_apply(repmat(b1,1,Q) + IW1_1*Xp1);
    % Layer 2
    a2 = repmat(b2,1,Q) + LW2_1*a1;
    % Output 1
    Y{1,ts} = mapminmax_reverse(a2,y1_step1);
end
% Final Delay States
Xf = cell(1,0);
Af = cell(2,0);
% Format Output Arguments
if ~isCellX
    Y = cell2mat(Y);
end
end
% ===== MODULE FUNCTIONS ========
% Map Minimum and Maximum Input Processing Function
function y = mapminmax_apply(x,settings)
    y = bsxfun(@minus,x,settings.xoffset);
    y = bsxfun(@times,y,settings.gain);
    y = bsxfun(@plus,y,settings.ymin);
end
% Sigmoid Symmetric Transfer Function
function a = tansig_apply(n,~)
    a = 2 ./ (1 + exp(-2*n)) - 1;
end
% Map Minimum and Maximum Output Reverse-Processing Function
function x = mapminmax_reverse(y,settings)
    x = bsxfun(@minus,y,settings.ymin);
    x = bsxfun(@rdivide,x,settings.gain);
    x = bsxfun(@plus,x,settings.xoffset);
end
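As a quick sanity check (a sketch, assuming the training snippet above has been run and the file was generated with genFunction(bodyfatNet,'bodyfatFcn')), the generated function should reproduce the network object's predictions:
yNet = bodyfatNet(x);    % predictions from the network object
yFcn = bodyfatFcn(x);    % predictions from the generated standalone file
max(abs(yNet - yFcn))    % difference should be near machine precision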
0 Comments