Curve fitting and convergence to estimate two coefficients

Hello
Question Objective:
I am looking to employ curve fitting and convergence to estimate two coefficients in my model equation. I have estimated one coefficient with lsqcurvefit and fminsearch, but I am wondering if there is any solver in Matlab that can estimate two coefficients using the approach (details below) that I prefer.
Model equation (see the code below for reference):
  • The model equation is constructed using the terms L1, L2, L3, L4, L5, S and At_Af
  • Final form of the model:
At_Af(t+1) = Af*(1 - L1*S(t+1))
Coefficients to be estimated:
  • Two coefficients - d and Af
Approach that I want to employ:
  • Consider the baseline data (curve) array for fitting
  • Evaluate the best fit by stepping 'n' in the equation from 0 upward (for however many iterations are needed) to obtain the best fit while trying to determine the two coefficients d and Af
  • d = 1*10^-10 and Af = 0.100826 can be used as the initial guess for the regression.
Data set - attachment:
The initial data set (baseline curve data) = At_Af (the LHS of the model equation) is attached to this query in a spreadsheet.
The code of the model equation is below:
L=0.00075; % height of the tissue
% The diffusion equation has been broken down for ease of transformation
% into code
gama = 0.000167;
L2 = zeros(14,1);
L3 = zeros(100,1);
L4 = zeros(100,1);
L5 = zeros(100,1);
S = zeros(73,1);
At_Af = zeros(73,1);
t1 = 0:1:3000;
d = 1*10^-10;
L1 = ((8*gama)/((pi*(1-exp(-2*gama*L)))));
format longE
t_min = t1./60;
for t = t1(:).'
    for n=0:1:50
        L2(n+1) = exp((((2*n + 1)^2)*-d*pi*pi*t)/(4*L*L));
        L3(n+1) = (((-1)^n)*2*gama)+(((2*n+1)*pi)*exp(-2*gama*L))/(2*L);
        L4(n+1) = ((2*n)+1)*((4*gama*gama)+((((2*n)+1)*pi)/(2*L))^2);
        L5(n+1) = ((L2(n+1)*L3(n+1))/L4(n+1));
    end
    S(t+1) = sum(L5);
    At_Af(t+1) = Af*(1 - L1*S(t+1)); % Af is the coefficient to be estimated (not defined here)
end
Any help would be greatly appreciated. Thanks in advance!

 Accepted Answer

I was bothered by the poor performance of fmincon() on this problem. Even when we tried starting at multiple locations, it did not progress reliably to the true minimum. I figured out why: the values of the parameter vector, x=[d,Af], differ by 10 orders of magnitude (d~=1e-10, Af~=1). This large difference in the scale of the parameters is a problem for fmincon(). fmincon() works much better if we scale the elements of the vector x to have the same order of magnitude. Therefore I adapted the code so that we specify d*1e10 in the main program. d*1e10 has magnitude ~1, like Af. This is the value passed to fmincon(). We divide this number by 1e10 inside myModel(), to get it back to its "correct" value. With this adjustment, we don't need the "PolyStart" version of the program. You can use fitData.m, which makes only one initial guess. You will see that for a wide range of initial guesses, it finds the same best fit result - which is good. And the best fit it finds is a lot better than the "best fit" we found before: the rms error is 0.00000.
I changed the allowable range for Af to [0.5,1.5], compared to the range [0,1] which you specified. I did this because myModelTest.m shows that values of Af in the range [0.5 to 1.5] are reasonable.
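To make the scaling concrete, here is a minimal sketch of the idea (the attached fitData.m and myModel.m may differ in detail; the bounds shown are illustrative):
% Fit the scaled parameter d*1e10 instead of d itself, so both
% elements of x are of order 1.
x0 = [0.5, 0.5];                         % initial guess: [d*1e10, Af]
lb = [1e-3, 0.5];                        % illustrative lower bounds
ub = [1e3,  1.5];                        % illustrative upper bounds
[x, sse] = fmincon(@sumsqerr, x0, [], [], [], [], lb, ub);

% Inside myModel(x), undo the scaling before using d:
%   d  = x(1)/1e10;                      % recover the physical value of d
%   Af = x(2);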
The modified versions of fitData.m, fitDataPolyStart.m, and myModel.m, are attached.
Console output and graphical output below, when the initial guess is d0*1e10=.5, Af0=.5:
>> fitData
Best fit at d=1.000e-10, Af=1.0000. R.m.s.error=0.00000.
fitDataPolyStart produces identical results, after trying 30 different starting guesses.

14 Comments

Thanks Will !!
But I am really sorry that I am stuck troubleshooting the incompatible size issue, unable to determine the root cause. I followed your steps and still cannot determine the root cause.
It seems very straightforward, but I am missing the obvious. I have been spending a while trying to diagnose which assignment is causing the compatibility issues, despite changing the array to the length of the data set instead of hardcoding it.
Below is the console response
"
Arrays have incompatible sizes for this operation.
Error in sumsqerr (line 18)
sse=sum((AtAfExp-AtAfSim).^2); %sum squared error
Error in fmincon (line 567)
initVals.f = feval(funfcn{3},X,varargin{:});
Error in fitData (line 27)
[x,sse]=fmincon(@sumsqerr,x0,[],[],[],[],lb,ub,[],options);
Caused by:
Failure in initial objective function evaluation. FMINCON
cannot continue.
Related documentation
"
Attaching the new data set under the same file name
There was a line in myModel.m which was
for i=0:3000
I have changed it to
for i=0:N-1
I saved the new file (with 241 points) as AtAfExpt2.txt, and I changed the file name to AtAfExpt2.txt in fitData.m. With these changes, fitData.m runs without error.
But the fit is not very good:
>> fitData
Best fit at d=7.155e-11, Af=0.5227. R.m.s.error=0.01078.
The graph of experiment versus simulation above, and our understanding of d and Af, which we gained from running myModelTest.m, make it clear that we need a higher value of d and a smaller value of Af to improve the fit. This reminds me that I changed the bounds for Af to [.5,1.5]. I will therefore change the bounds for Af to [0,1.5] in fitData.m. With this change, fitData.m finds a much better fit - see below. The best fit value of d is higher now, and the value of Af is smaller. Those are the changes we predicted.
>> fitData
Best fit at d=5.047e-09, Af=0.0847. R.m.s.error=0.00176.
The updated versions of fitData.m and myModel.m are attached.
This is great!
I have been able to plug in the values I want and the results make sense, including the ones you just got above. This is just amazing and I cannot thank you enough, Will. I am going to play around with the data as I am seeing positive signs in the output. I am also able to digest your coding a lot more easily because you have been very clear in providing guidance. Huge thanks and very much appreciate your time and effort, Sir.
The model worked great for my analysis of trying to determine the probable values of the coefficients.
I tried tweaking the code to determine L and d instead of Af and d (i.e., I wanted to estimate L instead of Af in the model equation). It seemed like a very minor update to the code, which I made. I updated the experimental data accordingly as well and tried fitting to determine the estimate.
Unfortunately, the length estimate always turns out to be the lower bound value I provide. I have attached the updated code. Any thoughts?
Thanks!!!
@Anand Rathnam, You posted a data file and 3 scripts with the same names as files we exchanged previously. Are all four files altered from the earlier files with the same names? If so, how? It takes me extra time to figure this out.
I have renamed the files you sent me. I have named the latest data file AtAfExpt3.txt on my computer, since we already had AtAfExpt2.txt. I renamed your latest main program fitData2.m on my computer, to avoid confusion with the earlier version, which works. Renaming function files (for example by adding a number to the filename) can cause challenges, because if you give the file a new name, you must rename the function itself in line 1, and you must change the call to the function in any script that calls it. Therefore I have not renamed the function files myModel.m and sumsqerr.m, but I have saved the previous versions, which worked, with a filename reflecting the last date edited.
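For example (a hypothetical rename, just to illustrate): if myModel.m were saved as myModel2.m, two matching edits would be required:
% myModel2.m  (renamed copy of myModel.m)
function AtAf = myModel2(x)   % line 1 must match the new file name
AtAf = x(1) + x(2);           % body unchanged in a real rename; stub here
end

% In sumsqerr.m, the call must be updated to match:
%   AtAfSim = myModel2(x);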
@Anand Rathnam, I am looking at your new main program, which I have renamed fitData2.m.
You decreased the minimum value for d*1e10, the first fitted parameter, from 0.001 to 0.0001. Why?
The range you specified for the second parameter* to be fitted is now 0 to 0.005, with an initial guess of 0. I said in my earlier comments that fmincon() works better if the parameters to be fitted are of the same order of magnitude. Your new range does not satisfy this suggestion, since its max value is .005, whereas the max value for d*1e10 is 1000, and the best fit value on earlier runs is 1.
*The range and the initial guess for the second variable to be fitted have been altered, but none of the comments have been altered. The comments in fitData2.m still refer to the second parameter as Af. Do you actually mean for the second fitted variable to be "L" now, rather than Af? If so, then fix the comments in the code, all the way through.
The comments in myModel.m also refer to Af, but I think you mean L. If that is true, then update the comments for myModel.m also.
You modified a line and added a line to myModel.m to show the progress of the minimization. The printed lines refer to d and Af. Do you mean d and L?
The new program runs to completion without errors, and I can confirm that the fitted value of the second parameter is 0, which equals the initial guess and is at the low end of the range. I can also confirm that the fit doesn't look too good. Do you know that a fitted value of 0 for L is wrong? Maybe that is in fact the best possible value in the range.
My sincere apologies for uploading the files without the complete updates from Af to L.
The attachment to this comment contains the updated .m files. Since I cannot change the .m file names, as they refer to the functions, please consider the attachment to this comment as the updated code and save it locally accordingly.
  • I have marked my update with comments $$$updated$$$ for identification.
  • I updated the data file as AtAfExpt3.
  • I got your point about consistency in the order of magnitude, yet the predicted length is always the lower bound value.
  • I think a summary of what I am attempting to do will help here:
  1. I am using the same model equation and attempting to predict the length (L) and diffusion coefficient (d) with relevant data, instead of Af as attempted earlier.
  2. Basically, I updated the data set with a suitable Af (normalized the AtAfExp array with the experimentally known Af).
  3. As you can see, the L value of 0.0023 was used previously for the prediction of Af. (I have now removed it / marked it as a comment, since I am trying to predict it, and you can notice that.)
  4. Since I have revised the input data set with an Af close to what we estimated previously, I expect to get an L value near what I formerly assumed.
Thanks a lot for looking into this!
I understand that:
  1. You have set Af to unity. Since Af acted as a scaling factor, setting it to unity means that it no longer appears in the model equations.
  2. You want to fit d and L instead of d and Af.
  3. You have made good changes to the code in order to fit d and L.
  4. The code runs to completion without error.
  5. The best-fit value of L is 0.001, which is at the low extreme of the allowable range.
You said in your Oct 3 comment that, in the earlier version of the code, L=0.0023. That is not correct, at least not for the version you posted on Sept 22, which is the version I used as my base for fitting. In the Sept 22 version of myModel.m, L=0.00075. The allowed range for L, in the Oct 3 version of fitData.m, is 0.001<=L<=5000. L=0.001 is the closest allowed value to the original value.
In the Oct 3 code which you posted, gama=5678. In the Sept 22 code, gama=0.000167. This change in gama by a factor of more than 1e7 means that the best-fit values are likely to change.
By the way, your initial value for L in fitData.m (Oct 3 version) is L=0. This is outside the allowed range. I highly recommend that the initial value be in the interior of the allowed range.
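A simple way to guarantee this (a sketch; the bounds shown are placeholders, not the values in fitData.m):
lb = [1e-5, 1e-5];           % placeholder lower bounds for the two parameters
ub = [1e3,  1e2];            % placeholder upper bounds
x0 = lb + 0.5*(ub - lb);     % start at the midpoint, strictly inside the range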
The data in AtAfExpt3.txt, which you posted on Oct 3, has a jump from t=0 to t=1,2,3 (row 1 to rows 2,3,4...) Is the value at t=0 (row 1) correct?
When I run your code with the new experimental data file, AtAfExpt3.txt, I get the following console output and plot:
Best fit at d=1.313e-08, L=0.0010, R.m.s.error=0.06173.
I get a better fit (i.e. lower RMSE) in Excel, by setting d=1e-10, and adjusting L to make the error small. When L=1.03e-4, RMSE=0.029, which is a lot better than the fit above. This is with gama=5678, since that is the value of gama in myModel.m, which you posted on Oct 3. See Excel plot below.
If I use the Solver function in Excel, I can do somewhat better. See the Excel plot below, in which the RMSE is 0.0243. But the values of d and L are outside the range you like. I think the unexpected best-fit values for d and L are due to gama=5678.
Now back to Matlab: Let's set gama=0.000167 in myModel.m. That is the value gama had on Sept 22. Then run fitData.m, with 1e-5<=L<=100, and initial guess L0=0.01. fitData.m finds the following solution:
Best fit at d=4.250e-10, L=2.286e-04, R.m.s.error=0.02444.
Very interesting to see how Excel outperforms the Matlab solver. Did you attempt fitting it manually in Excel?
I think the simulations you ran show the dependency between L and d. I wonder whether, when one attempts to estimate two dependent variables, we will always end up with an extreme value for one of them (in this case the lower bound L value always gets picked), since it has to satisfy the sum-of-squares minimization.
I am going to try fitting for L alone, keeping d constant (at the value you estimated, since it has low RMSE), and see if I can determine the best L value.
Sorry for the late response, I got bogged down by other stuff. Your help in resolving this and getting me to understand the fitting process is very much appreciated and cannot be thanked enough!
The data in AtAfExpt3.txt, which you posted on Oct 3, has a jump from t=0 to t=1,2,3 (row 1 to rows 2,3,4...). Is the value at t=0 (row 1) correct? Yes, that's correct.
Regarding other changes:
  • I am sorry about mentioning the incorrect L; you are right - my previous L value was different.
  • As far as the gamma value goes, I discovered that the updated value to be used is 5678. I didn't realize I had a different value in the version I shared with you. Sorry, I forgot to mention that as an update.
You're welcome, Anand. You said "Very interesting to see how Excel outperforms the Matlab solver. Did you attempt fitting it manually in Excel?"
I would not say Excel outperforms the Matlab solver, because they are very different, and comparing them with various metrics may give different results.
I did try doing the fit manually, before remembering that Excel has an optimization capability.


More Answers (5)

fmincon() is a multidimensional minimization routine. By multidimensional I mean it adjusts multiple unknown parameters until a specified function (such as sum of squared errors) is minimized. fmincon() is complicated, but the help page for it is good, and has a bunch of examples. You will want to write a function that implements your diffusion model. You will pass to it the vector x=[d,Af]. It will return the vector AtAf.
function AtAf = myModel(x)
%MYMODEL Function to simulate diffusion
%Input: x=[d,Af];
%Output: AtAf=vector with same length as the experimental data vector
N=99; %length of experimental data vector; replace 99 with correct value
d=x(1);
Af=x(2);
AtAf=d*ones(N,1)+Af; %replace this line with code for the diffusion model
end
Save the code above as file myModel.m.
Write a function to compute the error between the model and the experimental data:
function sse=sumsqerr(x)
%SUMSQERR Function to compute sum squared error between model and experiment
%Input: x=[d,Af];
%Output: sse=scalar
AtAfExp=load('MyExptData.txt'); %read experimental data from text file
AtAfSim=myModel(x);
sse=sum((AtAfExp-AtAfSim).^2); %sum squared error
end
Save code above as file sumsqerr.m.
Write the main program. It calls fmincon(). You pass to fmincon() the initial guess x0=[d0;Af0], and the name of the function to be minimized (sumsqerr), and the allowed bounds for d and Af.
%fitData.m
x0=[1E-10,0.100826]; %initial guess
lb=[0,0]; %lower bound for [d,Af], replace with appropriate values
ub=[1,1]; %upper bound for [d,Af]; replace with appropriate values
x=fmincon(sumsqerr,x0,[],[],[],[],lb,ub);
disp(x) %display value of x=[d,Af] that minimizes the error
Save code above as fitData.m. Run fitData.m.
That gives you the general idea. Obviously you need to make adjustments in the code above.

2 Comments

Thanks Will.
I just followed your instructions, but I am not sure where I am going wrong. I have been trying to troubleshoot but am unable to find the error, which is probably something obvious.
Main function
%fitData.m
clc
clear;
x0=[1E-10,0.100826]; %initial guess
lb=[1E-15,0]; %lower bound for [d,Af], replace with appropriate values
ub=[1E-5,1]; %upper bound for [d,Af]; replace with appropriate values
x=fmincon(sumsqerr,x0,[],[],[],[],lb,ub);
% x=fmincon(sumsqerr,x0,[],[],[],[]);
disp(x) %display value of x=[d,Af] that minimizes the error
Model fn
function AtAf = myModel(x)
%MYMODEL Function to simulate diffusion
%Input: x=[d,Af];
%Output: AtAf=vector with same length as the experimental data vector
N=50; %length of experimental data vector
d=x(1);
Af=x(2);
% Diffusion Model
L=0.00075; % height of the tissue
% The diffusion equation has been broken down for ease of transformation
% into code
gama = 0.000167;
L2 = zeros(14,1);
L3 = zeros(100,1);
L4 = zeros(100,1);
L5 = zeros(100,1);
S = zeros(73,1);
At_Af = zeros(73,1);
t1 = 0:1:3000;
% d= 1*10^-10;
L1 = ((8*gama)/((pi*(1-exp(-2*gama*L)))));
format longE
t_min = t1./60;
for t = t1(:).'
    for n=0:1:50
        L2(n+1) = exp((((2*n + 1)^2)*-d*pi*pi*t)/(4*L*L));
        L3(n+1) = (((-1)^n)*2*gama)+(((2*n+1)*pi)*exp(-2*gama*L))/(2*L);
        L4(n+1) = ((2*n)+1)*((4*gama*gama)+((((2*n)+1)*pi)/(2*L))^2);
        L5(n+1) = ((L2(n+1)*L3(n+1))/L4(n+1));
    end
    S(t+1) = sum(L5);
    At_Af(t+1) = Af*(1 - L1*S(t+1));
end
end
SSE function
function sse=sumsqerr(x)
%SUMSQERR Function to compute sum squared error between model and experiment
%Input: x=[d,Af];
%Output: sse=scalar
AtAfExp= [0.003973252
0.015048578
0.02127694
0.026058798
0.030090108
0.033641763
0.036852705
0.039805471
0.042553838
0.045135162
0.047576638
0.049898799
0.052117595
0.054245713
0.056293437
0.058269243
0.060180216
0.062032346
0.063830758
0.06557987
0.067283527
0.068945098
0.070567558
0.072153544
0.073705411
0.07522527
0.076715024
0.078176393
0.079610942
0.081020095
0.082405154
0.083767315
0.085107677
0.086427255
0.087726986
0.08900774
0.090270324
0.091515491
0.092743941
0.093956332
0.095153276
0.09633535
0.097503094
0.098657017
0.099797598
0.10092529
0.102040521
0.103143694
0.104235192
0.105315378
0.106384597
0.107443176
0.108491427
0.109529647
0.110558117
0.111577107
0.112586875
0.113587667
0.114579718
0.115563253
0.116538487
0.117505628
0.118464874
0.119416414
0.120360432
0.121297104
0.122226597
0.123149075
0.124064694
0.124973605
0.125875953
0.126771879
0.127661517
0.128544998
0.129422449
0.13029399
0.131159741
0.132019814
0.13287432
0.133723366
0.134567055
0.135405487
0.136238759
0.137066966
0.137890198
0.138708545
0.139522092
0.140330923
0.141135118
0.141934757
0.142729916
0.143520669
0.14430709
0.145089248
0.145867212
0.146641048
0.147410823
0.148176599
0.148938437
0.149696399
0.150450541
0.151200923
0.151947599
0.152690623
0.153430049
0.154165929
0.154898313
0.15562725
0.156352789
0.157074977
0.157793859
0.158509481
0.159221887
0.159931119
0.160637221
0.161340232
0.162040192
0.162737143
0.163431121
0.164122165
0.164810311
0.165495596
0.166178055
0.166857723
0.167534633
0.168208819
0.168880314
0.16954915
0.170215357
0.170878968
0.171540011
0.172198516
0.172854513
0.173508029
0.174159094
0.174807733
0.175453975
0.176097845
0.176739369
0.177378573
0.178015482
0.17865012
0.179282512
0.179912681
0.180540651
0.181166443
0.181790082
0.182411588
0.183030984
0.183648291
0.18426353
0.184876722
0.185487886
0.186097043
0.186704213
0.187309415
0.187912667
0.188513989
0.189113399
0.189710916
0.190306556
0.190900337
0.191492278
0.192082394
0.192670703
0.193257221
0.193841964
0.194424949
0.195006191
0.195585705
0.196163508
0.196739613
0.197314037
0.197886793
0.198457896
0.19902736
0.1995952
0.200161429
0.20072606
0.201289108
0.201850585
0.202410504
0.202968879
0.203525722
0.204081046
0.204634862
0.205187184
0.205738023
0.206287391
0.2068353
0.207381762
0.207926787
0.208470388
0.209012574
0.209553358
0.21009275
0.21063076
0.2111674
0.21170268
0.212236609
0.212769199
0.213300458
0.213830398
0.214359028
0.214886357
0.215412396
0.215937153
0.216460637
0.216982859
0.217503827
0.21802355
0.218542038
0.219059298
0.219575339
0.220090171
0.220603801
0.221116238
0.22162749
0.222137565
0.222646472
0.223154219
0.223660812
0.224166261
0.224670573
0.225173755
0.225675815
0.226176761
0.2266766
0.227175339
0.227672985
0.228169547
0.228665029
0.229159441
0.229652788
0.230145077
0.230636316
0.231126511
0.231615668
0.232103794
0.232590896
0.23307698
0.233562052
0.234046119
0.234529187
0.235011262
0.23549235
0.235972457
0.23645159
0.236929753
0.237406954
0.237883197
0.238358489
0.238832834
0.23930624
0.239778711
0.240250252
0.24072087
0.24119057
0.241659357
0.242127236
0.242594213
0.243060292
0.24352548
0.243989781
0.2444532
0.244915742
0.245377412
0.245838215
0.246298156
0.246757239
0.24721547
0.247672854
0.248129394
0.248585096
0.249039964
0.249494002
0.249947216
0.25039961
0.250851187
0.251301954
0.251751913
0.252201069
0.252649427
0.25309699
0.253543764
0.253989752
0.254434957
0.254879386
0.25532304
0.255765925
0.256208045
0.256649403
0.257090003
0.257529849
0.257968946
0.258407296
0.258844904
0.259281773
0.259717908
0.260153311
0.260587987
0.261021939
0.261455171
0.261887686
0.262319488
0.26275058
0.263180966
0.26361065
0.264039634
0.264467923
0.264895518
0.265322425
0.265748646
0.266174185
0.266599044
0.267023228
0.267446738
0.267869579
0.268291754
0.268713265
0.269134116
0.26955431
0.26997385
0.270392739
0.27081098
0.271228576
0.271645531
0.272061846
0.272477525
0.272892571
0.273306987
0.273720775
0.274133939
0.274546481
0.274958403
0.27536971
0.275780403
0.276190486
0.276599961
0.27700883
0.277417097
0.277824763
0.278231833
0.278638308
0.27904419
0.279449483
0.27985419
0.280258311
0.280661851
0.281064812
0.281467195
0.281869004
0.282270241
0.282670909
0.283071009
0.283470545
0.283869519
0.284267932
0.284665788
0.285063089
0.285459836
0.285856033
0.286251682
0.286646784
0.287041343
0.28743536
0.287828837
0.288221778
0.288614183
0.289006056
0.289397398
0.289788211
0.290178499
0.290568261
0.290957502
0.291346223
0.291734426
0.292122113
0.292509286
0.292895947
0.293282098
0.293667742
0.29405288
0.294437514
0.294821646
0.295205279
0.295588413
0.295971052
0.296353196
0.296734849
0.297116011
0.297496685
0.297876872
0.298256574
0.298635794
0.299014533
0.299392793
0.299770575
0.300147882
0.300524715
0.300901076
0.301276967
0.30165239
0.302027346
0.302401837
0.302775864
0.303149431
0.303522537
0.303895185
0.304267377
0.304639115
0.305010399
0.305381231
0.305751614
0.306121549
0.306491037
0.30686008
0.30722868
0.307596838
0.307964556
0.308331836
0.308698678
0.309065085
0.309431058
0.309796599
0.310161708
0.310526389
0.310890642
0.311254468
0.311617869
0.311980848
0.312343404
0.31270554
0.313067257
0.313428556
0.31378944
0.314149908
0.314509964
0.314869608
0.315228841
0.315587666
0.315946082
0.316304093
0.316661699
0.317018901
0.317375702
0.317732101
0.318088101
0.318443703
0.318798909
0.319153719
0.319508135
0.319862158
0.320215789
0.320569031
0.320921883
0.321274348
0.321626427
0.32197812
0.32232943
0.322680357
0.323030903
0.323381069
0.323730856
0.324080265
0.324429298
0.324777955
0.325126239
0.325474149
0.325821689
0.326168857
0.326515657
0.326862088
0.327208153
0.327553852
0.327899186
0.328244157
0.328588765
0.328933012
0.3292769
0.329620428
0.329963598
0.330306412
0.330648871
0.330990974
0.331332725
0.331674123
0.33201517
0.332355867
0.332696215
0.333036214
0.333375867
0.333715174
0.334054136
0.334392755
0.33473103
0.335068964
0.335406558
0.335743811
0.336080726
0.336417304
0.336753544
0.33708945
0.33742502
0.337760257
0.338095161
0.338429734
0.338763976
0.339097889
0.339431472
0.339764728
0.340097657
0.34043026
0.340762539
0.341094493
0.341426125
0.341757434
0.342088422
0.34241909
0.342749439
0.343079469
0.343409182
0.343738578
0.344067659
0.344396425
0.344724877
0.345053016
0.345380843
0.345708359
0.346035564
0.346362461
0.346689048
0.347015328
0.347341301
0.347666968
0.34799233
0.348317387
0.348642141
0.348966593
0.349290743
0.349614592
0.34993814
0.35026139
0.350584341
0.350906994
0.351229351
0.351551412
0.351873177
0.352194648
0.352515825
0.35283671
0.353157303
0.353477604
0.353797615
0.354117336
0.354436768
0.354755913
0.35507477
0.35539334
0.355711625
0.356029624
0.356347339
0.356664771
0.35698192
0.357298787
0.357615372
0.357931677
0.358247702
0.358563448
0.358878916
0.359194106
0.359509019
0.359823656
0.360138017
0.360452103
0.360765916
0.361079455
0.361392722
0.361705716
0.362018439
0.362330892
0.362643075
0.362954988
0.363266633
0.363578011
0.363889121
0.364199964
0.364510542
0.364820855
0.365130903
0.365440688
0.365750209
0.366059468
0.366368465
0.366677201
0.366985676
0.367293891
0.367601847
0.367909545
0.368216984
0.368524167
0.368831092
0.369137762
0.369444176
0.369750335
0.37005624
0.370361892
0.37066729
0.370972437
0.371277331
0.371581974
0.371886367
0.37219051
0.372494404
0.372798049
0.373101445
0.373404595
0.373707497
0.374010153
0.374312563
0.374614728
0.374916649
0.375218325
0.375519758
0.375820948
0.376121896
0.376422602
0.376723067
0.377023291
0.377323276
0.37762302
0.377922526
0.378221794
0.378520823
0.378819616
0.379118172
0.379416491
0.379714575
0.380012424
0.380310038
0.380607418
0.380904565
0.381201479
0.38149816
0.38179461
0.382090828
0.382386815
0.382682572
0.382978099
0.383273397
0.383568466
0.383863307
0.38415792
0.384452306
0.384746465
0.385040398
0.385334105
0.385627587
0.385920844
0.386213877
0.386506686
0.386799272
0.387091635
0.387383776
0.387675696
0.387967393
0.38825887
0.388550127
0.388841164
0.389131982
0.38942258
0.38971296
0.390003123
0.390293067
0.390582795
0.390872306
0.391161601
0.391450681
0.391739545
0.392028195
0.39231663
0.392604851
0.39289286
0.393180655
0.393468238
0.393755608
0.394042768
0.394329716
0.394616453
0.394902981
0.395189298
0.395475406
0.395761305
0.396046996
0.396332479
0.396617754
0.396902822
0.397187683
0.397472338
0.397756787
0.39804103
0.398325068
0.398608902
0.398892531
0.399175957
0.399459179
0.399742198
0.400025015
0.400307629
0.400590042
0.400872253
0.401154264
0.401436073
0.401717683
0.401999093
0.402280304
0.402561315
0.402842128
0.403122743
0.40340316
0.40368338
0.403963403
0.404243229
0.404522859
0.404802293
0.405081532
0.405360575
0.405639424
0.405918079
0.406196539
0.406474806
0.40675288
0.407030762
0.40730845
0.407585947
0.407863252
0.408140366
0.408417288
0.40869402
0.408970562
0.409246914
0.409523077
0.40979905
0.410074835
0.410350431
0.410625839
0.41090106
0.411176093
0.411450939
0.411725598
0.412000071
0.412274358
0.41254846
0.412822376
0.413096107
0.413369654
0.413643016
0.413916195
0.41418919
0.414462002
0.414734631
0.415007077
0.415279342
0.415551424
0.415823325
0.416095044
0.416366583
0.416637941
0.416909119
0.417180117
0.417450935
0.417721574
0.417992035
0.418262316
0.418532419
0.418802345
0.419072092
0.419341663
0.419611056
0.419880273
0.420149313
0.420418177
0.420686866
0.420955378
0.421223716
0.421491879
0.421759868
0.422027682
0.422295322
0.422562789
0.422830082
0.423097202
0.42336415
0.423630925
0.423897528
0.424163959
0.424430219
0.424696307
0.424962225
0.425227971
0.425493548
0.425758954
0.426024191
0.426289258
0.426554156
0.426818885
0.427083445
0.427347838
0.427612062
0.427876118
0.428140007
0.428403728
0.428667283
0.428930671
0.429193892
0.429456948
0.429719837
0.429982562
0.43024512
0.430507514
0.430769743
0.431031808
0.431293708
0.431555445
0.431817018
0.432078427
0.432339674
0.432600757
0.432861678
0.433122437
0.433383034
0.433643468
0.433903742
0.434163854
0.434423805
0.434683595
0.434943225
0.435202694
0.435462004
0.435721153
0.435980144
0.436238975
0.436497647
0.43675616
0.437014515
0.437272712
0.43753075
0.437788631
0.438046355
0.438303921
0.43856133
0.438818583
0.439075679
0.439332619
0.439589402
0.43984603
0.440102503
0.44035882
0.440614982
0.440870989
0.441126842
0.44138254
0.441638085
0.441893475
0.442148712
0.442403795
0.442658726
0.442913503
0.443168127
0.4434226
0.443676919
0.443931087
0.444185103
0.444438968
0.444692681
0.444946243
0.445199654
0.445452915
0.445706025
0.445958985
0.446211794
0.446464455
0.446716965
0.446969326
0.447221538
0.447473602
0.447725516
0.447977282
0.4482289
0.44848037
0.448731692
0.448982866
0.449233893
0.449484773
0.449735506
0.449986092
0.450236532
0.450486825
0.450736972
0.450986974
0.451236829
0.451486539
0.451736104
0.451985524
0.452234799
0.452483929
0.452732915
0.452981756
0.453230454
0.453479007
0.453727417
0.453975684
0.454223807
0.454471787
0.454719624
0.454967319
0.455214871
0.455462281
0.455709549
0.455956675
0.456203659
0.456450502
0.456697203
0.456943764
0.457190183
0.457436462
0.4576826
0.457928598
0.458174456
0.458420174
0.458665752
0.45891119
0.45915649
0.45940165
0.45964667
0.459891553
0.460136296
0.460380901
0.460625368
0.460869697
0.461113887
0.46135794
0.461601856
0.461845634
0.462089275
0.462332779
0.462576146
0.462819377
0.463062471
0.463305429
0.46354825
0.463790936
0.464033486
0.4642759
0.464518179
0.464760323
0.465002332
0.465244206
0.465485945
0.46572755
0.46596902
0.466210356
0.466451558
0.466692627
0.466933561
0.467174362
0.46741503
0.467655565
0.467895966
0.468136235
0.468376371
0.468616375
0.468856246
0.469095985
0.469335592
0.469575067
0.469814411
0.470053623
0.470292703
0.470531653
0.470770471
0.471009159
0.471247716
0.471486142
0.471724438
0.471962604
0.47220064
0.472438546
0.472676322
0.472913968
0.473151485
0.473388873
0.473626131
0.473863261
0.474100262
0.474337134
0.474573878
0.474810493
0.47504698
0.47528334
0.475519571
0.475755674
0.47599165
0.476227499
0.47646322
0.476698814
0.476934281
0.477169621
0.477404835
0.477639922
0.477874882
0.478109717
0.478344425
0.478579008
0.478813464
0.479047795
0.479282
0.479516081
0.479750035
0.479983865
0.48021757
0.48045115
0.480684605
0.480917936
0.481151143
0.481384225
0.481617183
0.481850018
0.482082728
0.482315315
0.482547778
0.482780118
0.483012335
0.483244428
0.483476399
0.483708247
0.483939972
0.484171575
0.484403055
0.484634413
0.484865649
0.485096762
0.485327754
0.485558624
0.485789373
0.48602
0.486250506
0.48648089
0.486711154
0.486941296
0.487171318
0.487401219
0.487631
0.48786066
0.4880902
0.488319619
0.488548919
0.488778099
0.489007159
0.489236099
0.48946492
0.489693621
0.489922204
0.490150667
0.490379011
0.490607236
0.490835343
0.49106333
0.4912912
0.491518951
0.491746584
0.491974098
0.492201495
0.492428774
0.492655935
0.492882978
0.493109904
0.493336713
0.493563404
0.493789978
0.494016435
0.494242776
0.494468999
0.494695106
0.494921096
0.49514697
0.495372728
0.495598369
0.495823894
0.496049304
0.496274598
0.496499775
0.496724838
0.496949785
0.497174616
0.497399332
0.497623934
0.49784842
0.498072791
0.498297048
0.498521189
0.498745217
0.49896913
0.499192928
0.499416613
0.499640183
0.499863639
0.500086982
0.50031021
0.500533326
0.500756327
0.500979215
0.50120199
0.501424652
0.5016472
0.501869636
0.502091959
0.502314169
0.502536266
0.502758251
0.502980124
0.503201884
0.503423532
0.503645068
0.503866492
0.504087804
0.504309004
0.504530092
0.504751069
0.504971935
0.505192689
0.505413332
0.505633864
0.505854285
0.506074595
0.506294794
0.506514882
0.50673486
0.506954727
0.507174484
0.507394131
0.507613667
0.507833094
0.50805241
0.508271616
0.508490713
0.5087097
0.508928577
0.509147345
0.509366004
0.509584553
0.509802993
0.510021324
0.510239546
0.51045766
0.510675664
0.51089356
0.511111347
0.511329026
0.511546596
0.511764058
0.511981412
0.512198658
0.512415796
0.512632826
0.512849748
0.513066562
0.513283269
0.513499868
0.51371636
0.513932745
0.514149022
0.514365193
0.514581256
0.514797212
0.515013062
0.515228805
0.515444441
0.515659971
0.515875394
0.516090711
0.516305921
0.516521026
0.516736024
0.516950916
0.517165703
0.517380383
0.517594958
0.517809428
0.518023791
0.51823805
0.518452203
0.518666251
0.518880193
0.519094031
0.519307763
0.519521391
0.519734914
0.519948332
0.520161646
0.520374855
0.520587959
0.520800959
0.521013855
0.521226647
0.521439335
0.521651919
0.521864399
0.522076775
0.522289047
0.522501215
0.52271328
0.522925242
0.5231371
0.523348855
0.523560507
0.523772055
0.523983501
0.524194843
0.524406083
0.52461722
0.524828254
0.525039186
0.525250015
0.525460741
0.525671366
0.525881888
0.526092307
0.526302625
0.526512841
0.526722954
0.526932966
0.527142876
0.527352685
0.527562391
0.527771997
0.5279815
0.528190903
0.528400204
0.528609404
0.528818503
0.5290275
0.529236397
0.529445193
0.529653888
0.529862483
0.530070976
0.53027937
0.530487662
0.530695855
0.530903947
0.531111939
0.53131983
0.531527622
0.531735314
0.531942905
0.532150397
0.532357789
0.532565082
0.532772274
0.532979368
0.533186361
0.533393256
0.533600051
0.533806747
0.534013344
0.534219841
0.53442624
0.53463254
0.534838741
0.535044843
0.535250847
0.535456752
0.535662558
0.535868266
0.536073876
0.536279387
0.5364848
0.536690115
0.536895332
0.537100451
0.537305472
0.537510395
0.53771522
0.537919947
0.538124577
0.53832911
0.538533545
0.538737882
0.538942122
0.539146265
0.539350311
0.539554259
0.539758111
0.539961865
0.540165523
0.540369084
0.540572548
0.540775915
0.540979186
0.54118236
0.541385438
0.541588419
0.541791305
0.541994093
0.542196786
0.542399382
0.542601883
0.542804287
0.543006596
0.543208809
0.543410926
0.543612947
0.543814872
0.544016703
0.544218437
0.544420076
0.54462162
0.544823069
0.545024422
0.54522568
0.545426844
0.545627912
0.545828885
0.546029763
0.546230547
0.546431236
0.54663183
0.54683233
0.547032735
0.547233045
0.547433261
0.547633383
0.547833411
0.548033344
0.548233184
0.548432929
0.54863258
0.548832138
0.549031601
0.549230971
0.549430247
0.549629429
0.549828518
0.550027513
0.550226415
0.550425223
0.550623938
0.55082256
0.551021089
0.551219524
0.551417866
0.551616116
0.551814272
0.552012336
0.552210306
0.552408184
0.552605969
0.552803662
0.553001262
0.55319877
0.553396185
0.553593508
0.553790738
0.553987877
0.554184923
0.554381877
0.554578739
0.554775508
0.554972186
0.555168773
0.555365267
0.55556167
0.55575798
0.5559542
0.556150328
0.556346364
0.556542309
0.556738162
0.556933924
0.557129595
0.557325175
0.557520664
0.557716061
0.557911368
0.558106583
0.558301708
0.558496742
0.558691685
0.558886538
0.559081299
0.559275971
0.559470551
0.559665042
0.559859442
0.560053751
0.56024797
0.560442099
0.560636138
0.560830087
0.561023946
0.561217715
0.561411394
0.561604983
0.561798482
0.561991891
0.562185211
0.562378441
0.562571582
0.562764633
0.562957594
0.563150467
0.563343249
0.563535943
0.563728547
0.563921063
0.564113489
0.564305826
0.564498074
0.564690233
0.564882303
0.565074285
0.565266177
0.565457981
0.565649697
0.565841323
0.566032861
0.566224311
0.566415672
0.566606945
0.56679813
0.566989226
0.567180234
0.567371154
0.567561986
0.56775273
0.567943386
0.568133954
0.568324434
0.568514827
0.568705131
0.568895348
0.569085477
0.569275519
0.569465473
0.56965534
0.569845119
0.570034811
0.570224416
0.570413933
0.570603363
0.570792706
0.570981962
0.571171131
0.571360212
0.571549207
0.571738115
0.571926937
0.572115671
0.572304319
0.57249288
0.572681354
0.572869742
0.573058043
0.573246258
0.573434387
0.573622429
0.573810385
0.573998255
0.574186038
0.574373735
0.574561347
0.574748872
0.574936311
0.575123664
0.575310932
0.575498114
0.575685209
0.57587222
0.576059144
0.576245983
0.576432736
0.576619404
0.576805986
0.576992483
0.577178895
0.577365221
0.577551462
0.577737618
0.577923689
0.578109675
0.578295575
0.578481391
0.578667122
0.578852767
0.579038328
0.579223804
0.579409196
0.579594503
0.579779725
0.579964862
0.580149915
0.580334884
0.580519768
0.580704567
0.580889283
0.581073914
0.58125846
0.581442923
0.581627301
0.581811596
0.581995806
0.582179932
0.582363975
0.582547933
0.582731808
0.582915599
0.583099306
0.58328293
0.583466469
0.583649926
0.583833298
0.584016587
0.584199793
0.584382916
0.584565955
0.58474891
0.584931783
0.585114572
0.585297278
0.585479901
0.585662441
0.585844898
0.586027272
0.586209563
0.586391771
0.586573896
0.586755939
0.586937898
0.587119776
0.58730157
0.587483282
0.587664911
0.587846458
0.588027923
0.588209305
0.588390605
0.588571822
0.588752957
0.58893401
0.589114981
0.58929587
0.589476676
0.589657401
0.589838044
0.590018604
0.590199083
0.59037948
0.590559796
0.590740029
0.590920181
0.591100251
0.59128024
0.591460147
0.591639972
0.591819716
0.591999379
0.59217896
0.59235846
0.592537879
0.592717216
0.592896473
0.593075648
0.593254742
0.593433755
0.593612687
0.593791538
0.593970308
0.594148997
0.594327605
0.594506133
0.59468458
0.594862946
0.595041232
0.595219437
0.595397561
0.595575605
0.595753568
0.595931452
0.596109254
0.596286977
0.596464619
0.59664218
0.596819662
0.596997063
0.597174385
0.597351626
0.597528787
0.597705869
0.59788287
0.598059791
0.598236633
0.598413395
0.598590077
0.598766679
0.598943202
0.599119645
0.599296009
0.599472293
0.599648497
0.599824622
0.600000668
0.600176634
0.600352521
0.600528329
0.600704057
0.600879706
0.601055276
0.601230767
0.601406179
0.601581512
0.601756766
0.601931941
0.602107037
0.602282055
0.602456993
0.602631853
0.602806634
0.602981336
0.60315596
0.603330505
0.603504971
0.603679359
0.603853669
0.6040279
0.604202053
0.604376127
0.604550123
0.604724041
0.604897881
0.605071642
0.605245326
0.605418931
0.605592458
0.605765907
0.605939279
0.606112572
0.606285787
0.606458925
0.606631985
0.606804967
0.606977871
0.607150698
0.607323447
0.607496119
0.607668713
0.607841229
0.608013668
0.60818603
0.608358314
0.608530521
0.60870265
0.608874702
0.609046678
0.609218575
0.609390396
0.60956214
0.609733806
0.609905396
0.610076909
0.610248344
0.610419703
0.610590985
0.61076219
0.610933318
0.61110437
0.611275345
0.611446243
0.611617065
0.61178781
0.611958479
0.612129071
0.612299586
0.612470025
0.612640388
0.612810675
0.612980885
0.613151019
0.613321077
0.613491058
0.613660964
0.613830793
0.614000546
0.614170224
0.614339825
0.61450935
0.6146788
0.614848174
0.615017471
0.615186693
0.61535584
0.61552491
0.615693905
0.615862825
0.616031668
0.616200437
0.616369129
0.616537747
0.616706288
0.616874755
0.617043146
0.617211462
0.617379702
0.617547868
0.617715958
0.617883973
0.618051912
0.618219777
0.618387567
0.618555282
0.618722921
0.618890486
0.619057976
0.619225391
0.619392732
0.619559997
0.619727188
0.619894304
0.620061345
0.620228312
0.620395205
0.620562022
0.620728765
0.620895434
0.621062029
0.621228549
0.621394994
0.621561365
0.621727662
0.621893885
0.622060034
0.622226108
0.622392108
0.622558035
0.622723887
0.622889665
0.623055369
0.623220999
0.623386556
0.623552038
0.623717447
0.623882782
0.624048043
0.62421323
0.624378344
0.624543384
0.62470835
0.624873243
0.625038063
0.625202809
0.625367481
0.62553208
0.625696606
0.625861058
0.626025437
0.626189743
0.626353975
0.626518135
0.626682221
0.626846234
0.627010174
0.627174041
0.627337834
0.627501555
0.627665203
0.627828778
0.627992281
0.62815571
0.628319067
0.62848235
0.628645562
0.6288087
0.628971766
0.629134759
0.62929768
0.629460528
0.629623303
0.629786006
0.629948637
0.630111195
0.630273681
0.630436095
0.630598436
0.630760705
0.630922902
0.631085026
0.631247079
0.631409059
0.631570968
0.631732804
0.631894568
0.632056261
0.632217881
0.632379429
0.632540906
0.632702311
0.632863644
0.633024905
0.633186095
0.633347213
0.633508259
0.633669233
0.633830136
0.633990968
0.634151728
0.634312417
0.634473034
0.634633579
0.634794054
0.634954457
0.635114788
0.635275049
0.635435238
0.635595356
0.635755403
0.635915379
0.636075283
0.636235117
0.63639488
0.636554571
0.636714192
0.636873742
0.637033221
0.637192629
0.637351966
0.637511232
0.637670428
0.637829553
0.637988608
0.638147591
0.638306504
0.638465347
0.638624119
0.638782821
0.638941452
0.639100012
0.639258503
0.639416922
0.639575272
0.639733551
0.63989176
0.640049899
0.640207968
0.640365966
0.640523894
0.640681753
0.640839541
0.640997259
0.641154907
0.641312486
0.641469994
0.641627432
0.641784801
0.6419421
0.642099329
0.642256488
0.642413577
0.642570597
0.642727548
0.642884428
0.643041239
0.643197981
0.643354653
0.643511255
0.643667788
0.643824252
0.643980646
0.644136971
0.644293226
0.644449413
0.64460553
0.644761578
0.644917556
0.645073466
0.645229306
0.645385078
0.64554078
0.645696413
0.645851978
0.646007473
0.646162899
0.646318257
0.646473546
0.646628765
0.646783917
0.646938999
0.647094013
0.647248958
0.647403834
0.647558642
0.647713381
0.647868051
0.648022653
0.648177187
0.648331652
0.648486049
0.648640377
0.648794637
0.648948829
0.649102952
0.649257007
0.649410994
0.649564913
0.649718763
0.649872546
0.65002626
0.650179906
0.650333484
0.650486995
0.650640437
0.650793811
0.650947118
0.651100356
0.651253527
0.65140663
0.651559665
0.651712632
0.651865532
0.652018364
0.652171128
0.652323825
0.652476454
0.652629016
0.65278151
0.652933937
0.653086296
0.653238588
0.653390812
0.653542969
0.653695059
0.653847081
0.653999037
0.654150924
0.654302745
0.654454499
0.654606185
0.654757805
0.654909357
0.655060842
0.65521226
0.655363612
0.655514896
0.655666113
0.655817264
0.655968348
0.656119365
0.656270315
0.656421198
0.656572015
0.656722765
0.656873448
0.657024064
0.657174615
0.657325098
0.657475515
0.657625865
0.657776149
0.657926367
0.658076518
0.658226603
0.658376621
0.658526573
0.658676459
0.658826279
0.658976032
0.659125719
0.65927534
0.659424895
0.659574384
0.659723807
0.659873163
0.660022454
0.660171679
0.660320838
0.66046993
0.660618957
0.660767919
0.660916814
0.661065643
0.661214407
0.661363105
0.661511737
0.661660304
0.661808805
0.661957241
0.66210561
0.662253915
0.662402154
0.662550327
0.662698435
0.662846477
0.662994454
0.663142366
0.663290212
0.663437994
0.663585709
0.66373336
0.663880945
0.664028466
0.664175921
0.664323311
0.664470635
0.664617895
0.66476509
0.66491222
0.665059285
0.665206285
0.66535322
0.66550009
0.665646895
0.665793635
0.665940311
0.666086922
0.666233468
0.66637995
0.666526367
0.666672719
0.666819006
0.666965229
0.667111388
0.667257482
0.667403511
0.667549476
0.667695377
0.667841213
0.667986985
0.668132693
0.668278336
0.668423915
0.66856943
0.66871488
0.668860266
0.669005589
0.669150847
0.669296041
0.66944117
0.669586236
0.669731238
0.669876176
0.67002105
0.67016586
0.670310606
0.670455288
0.670599907
0.670744461
0.670888952
0.671033379
0.671177743
0.671322042
0.671466278
0.671610451
0.67175456
0.671898605
0.672042587
0.672186505
0.67233036
0.672474151
0.672617879
0.672761543
0.672905145
0.673048682
0.673192157
0.673335568
0.673478916
0.673622201
0.673765423
0.673908581
0.674051676
0.674194708
0.674337678
0.674480584
0.674623427
0.674766207
0.674908924
0.675051578
0.67519417
0.675336698
0.675479164
0.675621567
0.675763907
0.675906184
0.676048399
0.676190551
0.67633264
0.676474666
0.67661663
0.676758532
0.676900371
0.677042147
0.677183861
0.677325512
0.677467101
0.677608628
0.677750092
0.677891494
0.678032833
0.67817411
0.678315325
0.678456478
0.678597569
0.678738597
0.678879563
0.679020467
0.679161309
0.679302089
0.679442807
0.679583463
0.679724057
0.679864589
0.680005059
0.680145467
0.680285813
0.680426097
0.68056632
0.680706481
0.68084658
0.680986617
0.681126593
0.681266507
0.681406359
0.68154615
0.681685879
0.681825547
0.681965153
0.682104697
0.68224418
0.682383602
0.682522962
0.682662261
0.682801498
0.682940675
0.683079789
0.683218843
0.683357835
0.683496766
0.683635636
0.683774445
0.683913193
0.684051879
0.684190505
0.684329069
0.684467572
0.684606015
0.684744396
0.684882717
0.685020976
0.685159175
0.685297313
0.685435389
0.685573406
0.685711361
0.685849256
0.68598709
0.686124863
0.686262575
0.686400227
0.686537819
0.686675349
0.686812819
0.686950229
0.687087578
0.687224867
0.687362095
0.687499263
0.68763637
0.687773417
0.687910404
0.68804733
0.688184197
0.688321002
0.688457748
0.688594434
0.688731059
0.688867624
0.689004129
0.689140574
0.689276959
0.689413284
0.689549548
0.689685753
0.689821898
0.689957983
0.690094008
0.690229973
0.690365879
0.690501724
0.69063751
0.690773236
0.690908902
0.691044509
0.691180056
0.691315543
0.69145097
0.691586338
0.691721647
0.691856896
0.691992085
0.692127215
0.692262285
0.692397296
0.692532248
0.69266714
0.692801972
0.692936746
0.69307146
0.693206115
0.69334071
0.693475247
0.693609724
0.693744142
0.693878501
0.694012801
0.694147041
0.694281223
0.694415345
0.694549409
0.694683413
0.694817359
0.694951246
0.695085073
0.695218842
0.695352552
0.695486203
0.695619796
0.695753329
0.695886804
0.69602022
0.696153578
0.696286877
0.696420117
0.696553298
0.696686421
0.696819486
0.696952492
0.697085439
0.697218328
0.697351158
0.69748393
0.697616644
0.697749299
0.697881896
0.698014434
0.698146915
0.698279337
0.6984117
0.698544006
0.698676253
0.698808442
0.698940573
0.699072646
0.699204661
0.699336618
0.699468516
0.699600357
0.69973214
0.699863865
0.699995531
0.70012714
0.700258691
0.700390185
0.70052162
0.700652997
0.700784317
0.700915579
0.701046784
0.70117793
0.701309019
0.70144005
0.701571024
0.70170194
0.701832799
0.7019636
0.702094343
0.702225029
0.702355657
0.702486229
0.702616742
0.702747198
0.702877597
0.703007939
0.703138223
0.70326845
0.70339862
0.703528732
0.703658788
0.703788786
0.703918727
0.70404861
0.704178437
0.704308207
0.704437919
0.704567575
0.704697173
0.704826715
0.704956199
0.705085627
0.705214998
0.705344312
0.705473569
0.705602769
0.705731912
0.705860999
0.705990029
0.706119002
0.706247919
0.706376779
0.706505582
0.706634328
0.706763018
0.706891652
0.707020228
0.707148749
0.707277213
0.70740562
0.707533971
0.707662265
0.707790503
0.707918685
0.708046811
0.70817488
0.708302892
0.708430849
0.708558749
0.708686593
0.708814381
0.708942113
0.709069788
0.709197408
0.709324971
0.709452479
0.70957993
0.709707325
0.709834664
0.709961948
0.710089175
0.710216346
0.710343462
0.710470522
0.710597525
0.710724473
0.710851366
0.710978202
0.711104983
0.711231708
0.711358377
0.711484991
0.711611549
0.711738051
0.711864498
0.711990889
0.712117225
0.712243505
0.71236973
0.712495899
0.712622013
0.712748071
0.712874074
0.713000022
0.713125914
0.713251751
0.713377532
0.713503259
0.71362893
0.713754546
0.713880107
0.714005612
0.714131062
0.714256458
0.714381798
0.714507083
0.714632313
0.714757488
0.714882608
0.715007673
0.715132683
0.715257638
0.715382539
0.715507384
0.715632175
0.71575691
0.715881591
0.716006217
0.716130789
0.716255305
0.716379767
0.716504175
0.716628527
0.716752825
0.716877068
0.717001257
0.717125391
0.717249471
0.717373496
0.717497467
0.717621383
0.717745245
0.717869052
0.717992805
0.718116504
0.718240148
0.718363738
0.718487274
0.718610755
0.718734182
0.718857555
0.718980874
0.719104139
0.719227349
0.719350505
0.719473608
0.719596656
0.71971965
0.71984259
0.719965476
0.720088308
0.720211086
0.72033381
0.72045648
0.720579097
0.720701659
0.720824168
0.720946623
0.721069024
0.721191371
0.721313665
0.721435905
0.721558091
0.721680223
0.721802302
0.721924328
0.722046299
0.722168217
0.722290082
0.722411893
0.72253365
0.722655354
0.722777005
0.722898602
0.723020146
0.723141636
0.723263073
0.723384457
0.723505787
0.723627064
0.723748288
0.723869458
0.723990576
0.72411164
0.724232651
0.724353609
0.724474513
0.724595365
0.724716163
0.724836909
0.724957601
0.725078241
0.725198827
0.725319361
0.725439841
0.725560269
0.725680644
0.725800966
0.725921235
0.726041451
0.726161615
0.726281725
0.726401783
0.726521788
0.726641741
0.726761641
0.726881488
0.727001282
0.727121024
0.727240714
0.727360351
0.727479935
0.727599467
0.727718946
0.727838373
0.727957747
0.728077069
0.728196339
0.728315556
0.728434721
0.728553833
0.728672893
0.728791901
0.728910857
0.72902976
0.729148612
0.729267411
0.729386158
0.729504852
0.729623495
0.729742085
0.729860624
0.72997911
0.730097545
0.730215927
0.730334257
0.730452536
0.730570762
0.730688937
0.73080706
0.73092513
0.731043149
0.731161117
0.731279032
0.731396896
0.731514707
0.731632468
0.731750176
0.731867833
0.731985438
0.732102991
0.732220493
0.732337943
0.732455342
0.732572689
0.732689985
0.732807229
0.732924422
0.733041563
0.733158653
0.733275691
0.733392678
0.733509614
0.733626498
0.733743331
0.733860113
0.733976843
0.734093522
0.73421015
0.734326727
0.734443252
0.734559727
0.73467615
0.734792522
0.734908843
0.735025113
0.735141332
0.7352575
0.735373617
0.735489683
0.735605698
0.735721662
0.735837575
0.735953437
0.736069249
0.736185009
0.736300719
0.736416378
0.736531986
0.736647544
0.73676305
0.736878506
0.736993912
0.737109266
0.73722457
0.737339824
0.737455026
0.737570179
0.73768528
0.737800332
0.737915332
0.738030282
0.738145182
0.738260031
0.73837483
0.738489579
0.738604277
0.738718925
0.738833522
0.738948069
0.739062566
0.739177012
0.739291409
0.739405755
0.739520051
0.739634297
0.739748492
0.739862638
0.739976733
0.740090778
0.740204774
0.740318719
0.740432614
0.740546459
0.740660254
0.740774
0.740887695
0.74100134
0.741114936
0.741228482
0.741341978
0.741455424
0.74156882
0.741682166
0.741795463
0.74190871
0.742021907
0.742135055
0.742248153
0.742361201
0.7424742
0.742587149
0.742700048
0.742812898
0.742925699
0.74303845
0.743151151
0.743263803
0.743376406
0.743488959
0.743601462
0.743713917
0.743826322
0.743938677
0.744050984
0.744163241
0.744275448
0.744387607
0.744499716
0.744611776
0.744723787
0.744835749
0.744947662
0.745059525
0.745171339
0.745283105
0.745394821
0.745506488
0.745618106
0.745729676
0.745841196
0.745952667
0.74606409
0.746175463
0.746286788
0.746398063
0.74650929
0.746620469
0.746731598
0.746842678
0.74695371
0.747064693
0.747175628
0.747286513
0.74739735
0.747508139
0.747618878
0.747729569
0.747840212
0.747950806
0.748061352
0.748171849
0.748282297
0.748392697
0.748503049
0.748613352
0.748723606
0.748833813
0.748943971
0.74905408
0.749164142
0.749274155
0.749384119
0.749494036
0.749603904
0.749713724
0.749823496
0.749933219
0.750042895
0.750152522
0.750262102
0.750371633
0.750481116
0.750590551
0.750699938
0.750809277
0.750918568
0.751027811
0.751137006
0.751246153
0.751355253
0.751464304
0.751573308
0.751682264
0.751791172
0.751900032
0.752008844
0.752117609
0.752226326
0.752334995
0.752443616
0.75255219
0.752660716
0.752769195
0.752877626
0.752986009
0.753094345
0.753202634
0.753310874
0.753419068
0.753527214
0.753635312
0.753743363
0.753851366
0.753959322
0.754067231
0.754175093
0.754282907
0.754390673
0.754498393
0.754606065
0.75471369
0.754821268
0.754928798
0.755036282
0.755143718
0.755251107
0.755358449
0.755465744
0.755572991
0.755680192
0.755787346
0.755894452
0.756001512
0.756108524
0.75621549
0.756322409
0.756429281
0.756536106
0.756642884
0.756749615
0.756856299
0.756962937
0.757069528
0.757176072
0.757282569
0.75738902
0.757495423
0.757601781
0.757708091
0.757814355
0.757920572
0.758026743
0.758132867
0.758238944
0.758344975
0.758450959
0.758556897
0.758662789
0.758768634
0.758874432
0.758980184
0.75908589
0.759191549
0.759297162
0.759402729
0.759508249
0.759613723
0.75971915
0.759824532
0.759929867
0.760035156
0.760140399
0.760245595
0.760350746
0.76045585
0.760560908
0.760665921
0.760770887
0.760875807
0.760980681
0.761085508
0.76119029
0.761295026
0.761399716
0.76150436
0.761608959
0.761713511
0.761818017
0.761922478
0.762026893
0.762131261
0.762235585
0.762339862
0.762444093
0.762548279
0.762652419
0.762756514
0.762860563
0.762964566
0.763068523
0.763172435
0.763276302
0.763380122
0.763483898
0.763587627
0.763691311
0.76379495
0.763898543
0.764002091
0.764105593
0.76420905
0.764312462
0.764415828
0.764519149
0.764622424
0.764725654
0.764828839
0.764931979
0.765035073
0.765138122
0.765241126
0.765344085
0.765446998
0.765549867
0.76565269
0.765755468
0.765858201
0.765960889
0.766063532
0.76616613
0.766268683
0.766371191
0.766473653
0.766576071
0.766678444
0.766780773
0.766883056
0.766985294
0.767087488
0.767189636
0.76729174
0.767393799
0.767495814
0.767597783
0.767699708
0.767801588
0.767903423
0.768005214
0.76810696
0.768208661
0.768310318
0.76841193
0.768513498
0.768615021
0.7687165
0.768817934
0.768919323
0.769020668
0.769121969
0.769223225
0.769324437
0.769425604
0.769526727
0.769627805
0.76972884
0.769829829
0.769930775
0.770031676
0.770132533
0.770233346
0.770334115
0.770434839
0.770535519
0.770636155
0.770736747
0.770837295
0.770937798
0.771038258
0.771138673
0.771239045
0.771339372
0.771439655
0.771539895
0.77164009
0.771740242
0.771840349
0.771940413
0.772040432
0.772140408
0.77224034
0.772340228
0.772440072
0.772539873
0.77263963
0.772739343
0.772839012
0.772938637
0.773038219
0.773137757
0.773237251
0.773336702
0.773436109
0.773535473
0.773634793
0.773734069
0.773833302
0.773932491
0.774031637
0.774130739
0.774229798
0.774328814
0.774427785
0.774526714
0.774625599
0.774724441
0.774823239
0.774921994
0.775020706
0.775119374
0.775217999
0.775316581
0.77541512
0.775513615
0.775612067
0.775710476
0.775808842
0.775907165
0.776005444
0.776103681
0.776201874
0.776300024
0.776398132
0.776496196
0.776594217
0.776692195
0.77679013
0.776888023
0.776985872
0.777083678
0.777181442
0.777279162
0.77737684
0.777474475
0.777572067
0.777669616
0.777767122
0.777864586
0.777962007
0.778059385
0.77815672
0.778254013
0.778351263
0.77844847
0.778545635
0.778642757
0.778739837
0.778836874
0.778933868
0.77903082
0.779127729
0.779224596
0.77932142
0.779418202
0.779514942
0.779611638
0.779708293
0.779804905
0.779901475
0.779998002
0.780094487
0.78019093
0.78028733
0.780383689
0.780480004
0.780576278
0.780672509
0.780768698
0.780864845
0.78096095
0.781057013
0.781153033
0.781249012
0.781344948
0.781440842
0.781536694
0.781632504
0.781728272
0.781823999
0.781919683
0.782015325
0.782110925
0.782206483
0.782301999
0.782397474
0.782492906
0.782588297
]; %read experimental data
AtAfSim=myModel(x);
sse=sum((AtAfExp-AtAfSim).^2); %sum squared error
end


Your code was a very good start. There were two fatal errors. One is that when you call fmincon(), you must pass a handle to the function, so you write fmincon(@sumsqerr,x0,[],[],[],[],lb,ub). I forgot to include the "@" in my example. The other problem is that the function myModel() did not assign a value to AtAf. It assigned a value to At_Af. So I fixed that. I also simplified your code, and I replaced the for n=0:50 loop with vectorized statements to assign values to L2, L3, L4, L5. I checked that my vectorized versions were equivalent to the for loop values before deleting the for loop code. Another change I made is that I read in the value for AtAfExp from a text file (see attached text file) in the main program, rather than defining it on every call to sumsqerr(). I declare AtAfExp to be global, in the main program and in sumsqerr(), so that its value will be known inside sumsqerr(). This approach may be faster, and more importantly it allows me to plot the experimental and best fit values of AtAf once the "best fit" is found.
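For reference, a vectorized replacement for the n-loop could look like the following sketch, sitting inside the loop over t (the attached myModel.m may differ in detail; d, t, L, gama, and S are as defined in the original code):
% inside the loop over t:
n = (0:50).';                                   % column vector of series indices
L2 = exp(-((2*n+1).^2)*d*pi^2*t/(4*L^2));       % elementwise over n
L3 = ((-1).^n)*2*gama + ((2*n+1)*pi)*exp(-2*gama*L)/(2*L);
L4 = (2*n+1).*(4*gama^2 + ((2*n+1)*pi/(2*L)).^2);
L5 = (L2.*L3)./L4;
S(t+1) = sum(L5);                               % same sum as the loop produced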
Now let's consider the results. The plot of the experimental data and the "best fit" is below. It does not look good. I also show y=sqrt(t)/68, which is a much better fit than the "best fit" simulation.
I think the "best fit" is not a good fit because fmincon() has gotten stuck in a local minimum that is not a global minimum. This is not uncommon in multidimensional fitting with fmincon() and similar routines.
What can we do about it? We can manually try different values for the initial guess, x0. Or we can write code that automatically tries different starting points and finds the best fit from each starting point, and then picks the overall best fit. This is the approach I always take when doing multidimensional minimization.
Here is a new main program fitDataPolyStart.m, that tries multiple starting points, to reduce the chance of finding a solution that is not a global minimum. Use this instead of fitData.m. The other scripts and functions which I posted have not been changed.
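fitDataPolyStart.m is attached rather than shown, so here is only a sketch of the multi-start idea it implements (variable names are illustrative; the grids match the first attempt described below):
% Try fmincon from a grid of starting points; keep the best result.
d0  = [1e-15,1e-13,1e-11,1e-9,1e-7,1e-5];   % initial guesses for d
Af0 = [0,.2,.4,.6,.8,1];                    % initial guesses for Af
lb = [1e-15,0];  ub = [1e-5,1];             % allowed ranges for [d,Af]
bestSse = Inf;
for i = 1:numel(d0)
    for j = 1:numel(Af0)
        x0 = [d0(i), Af0(j)];
        [x, sse] = fmincon(@sumsqerr, x0, [], [], [], [], lb, ub);
        if sse < bestSse                    % keep the best fit so far
            bestSse = sse;  bestX = x;
        end
    end
end
fprintf('Best fit at d=%.3e, Af=%.4f\n', bestX(1), bestX(2));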
Recall that with fitData, we obtained the "best fit"
d=4.1e-10, Af=0.560, RMSE=0.1093.
My first attempt with the PolyStart code was to try initial guesses that were spread equally across the allowed ranges which you specified. You specified 1e-15<=d<=1e-5, and 0<=Af<=1. Therefore I specified
d0=[1e-15,1e-13,1e-11,1e-9,1e-7,1e-5]; %initial guesses for d
Af0=[0,.2,.4,.6,.8,1]; %initial guesses for Af
This makes 6x6=36 initial guesses. The best of all 36 fits was
d=3.5e-10, Af=0.599, RMSE=0.0878.
That is better but still not very good. Therefore I refined my grid of starting points as follows:
d0=[1e-11,2e-11,5e-11,1e-10,2e-10,5e-10,1e-9]; %initial guesses for d
Af0=[.6,.7,.8,.9,1]; %initial guesses for Af
This makes 7x5=35 initial guesses. The best of all 35 fits was
d=1.8e-10, Af=0.775, RMSE=0.0264.
This is better. Refine the initial guesses again. I think we need a smaller d value and a bigger Af value to improve the fit.
d0=[1.5e-11,2e-11,3e-11,5e-11,7e-11,1e-10,1.5e-10,2e-10]; %initial guesses for d
Af0=[.75,.8,.85,.9,.95,1]; %initial guesses for Af
The best fit of these 48 initial guesses is at
d=1.50e-10, Af=0.850, RMSE=0.0139.
Notice that the RMSE improved by about a factor of 2.
Further guessing produces
d0=[1.0e-10,1.1e-10,1.2e-10]; %initial guesses for d
Af0=[.95,.96,.97]; %initial guesses for Af
which gives a very nice fit:
d=1.10e-10, Af=0.9600, RMSE=0.0026.
I have had good results with fmincon() for various model fitting problems in the past. However, I don't think fmincon() is doing a very good job on this fitting problem. It seems to get stuck, or it stops trying to improve, too soon. fmincon() has many options. Perhaps the performance would improve with different choices for certain options.

2 Comments

Thank you Sir! This is great, and let me try plugging in different values to see how the model responds. I appreciate all the help very much, especially your guided problem solving! I will update you after some evaluation.
Hey Will, I think the fitDataPolyStart.m file is missing. Can you please attach it when you get a chance? Thanks a ton!


I also wrote the attached script: myModelTest.m. This script tests myModel to make sure it gives reasonable output, and to see how changes in d and Af affect the output. It does not do a search or a minimization. It just sets the values for d and Af and computes the corresponding AtAf(t). It does this for five different combinations of d and Af. I like to choose a "central" pair of [d,Af]. Then I do higher and lower d, with the same Af; and higher and lower Af, with the same d. That lets me see what Af does to the fit, and what d does to the fit.
The script also reads the experimental data from a file, and it computes the RMS error of each of the five model outputs, relative to the experimental data.
I learned from this that d is like a rate constant for exponential rise or decay. When d is big, AtAf(t) quickly approaches its asymptotic value. When d is small, AtAf(t) takes a long time to approach its asymptotic value. Af is a scaling factor, and it is the asymptotic value which AtAf(t) approaches as t gets large.
Here is the console output from the script, and the plot that it generates.
>> myModelTest
i=1, x=1.10e-10,0.900, rmsErr=0.0356
i=2, x=1.10e-10,0.960, rmsErr=0.0026
i=3, x=1.10e-10,1.000, rmsErr=0.0242
i=4, x=8.00e-11,0.960, rmsErr=0.0757
i=5, x=1.00e-09,0.960, rmsErr=0.3787
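Since myModelTest.m itself is an attachment, here is a sketch of the kind of loop that could produce output like the above (details are illustrative):
% myModelTest.m (sketch): evaluate the model for five [d,Af] pairs
AtAfExp = load('AtAfExpt.txt');            % experimental data, one value per line
xTest = [1.10e-10, 0.900;                  % central d, lower Af
         1.10e-10, 0.960;                  % central pair
         1.10e-10, 1.000;                  % central d, higher Af
         8.00e-11, 0.960;                  % lower d, central Af
         1.00e-09, 0.960];                 % higher d, central Af
for i = 1:size(xTest,1)
    AtAfSim = myModel(xTest(i,:));         % model output for this [d,Af]
    rmsErr = sqrt(mean((AtAfExp - AtAfSim).^2));
    fprintf('i=%d, x=%.2e,%.3f, rmsErr=%.4f\n', i, xTest(i,1), xTest(i,2), rmsErr);
end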

2 Comments

Hi Will, sorry, I am pretty new to coding and I am parsing through your responses, so bear with me. I am trying to fit a different, or rather a more appropriate, data set to determine d and Af. I am losing track of where the updates have to be made once I update the input array (AtAfExp). I understand your code, and the response of your code to this new data set should make sense for the same initial assumption.
I am getting hit with incompatible size errors in the sum of squares function. I am trying to figure out what needs to be fixed, since I updated the array data (AtAfExp).
If you have a new set of experimental data which you want to fit, then save the numbers in a text file, one number per line, with no header, just like file AtAfExpt.txt. If the new file name is AtAfExpt2.txt, then modify fitData.m. Change
AtAfExp=load('AtAfExpt.txt');
to
AtAfExp=load('AtAfExpt2.txt');
I made one change in fitData.m, fitDataPolyStart.m, and myModel.m: I replaced "3001" with "length(AtAfExp)". That should assure a correct result, even if the new data file has a different number of values in it. That could have been the source of the incompatible size errors. Try the attached files, after you change the file name in fitData.m.
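In other words, the scripts now derive the length from the data; a sketch of the relevant lines (the attached files may differ slightly):
AtAfExp = load('AtAfExpt2.txt');   % one value per line, no header
N = length(AtAfExp);               % use the data length instead of hard-coded 3001
for i = 0:N-1                      % loop limit now follows the data
    % ... model evaluation ...
end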

