Change of boundary condition

Attached is a long code of a mathematical problem in modeling.
If I want to change r(1)=ya(2)-t^alpha with alpha given, can you please help me define t so that it is the same t as in the main while loop?

 Accepted Answer

Change
Phig=gamma*(Vol+2*C*Leak)-t;Phib=GS(y(:,1),y(:,N),tipVal);
to
Phig = gamma*(Vol+2*C*Leak) - t;
Phib = GS(y(:,1), y(:,N), tipVal, t);
Change
function r=GS(ya,yb,tipVal)
r(1)=ya(2)-1;r(2)=yb(1)-tipVal;end
to
function r = GS(ya, yb, tipVal, t)
alpha = 1/3; % or 3/8, or whatever is appropriate
r(1) = ya(2) - t.^alpha;
r(2) = yb(1) - tipVal;
end

8 Comments

The code worked with alpha=3/8 or 1/3. But when I tried alpha=5/2 it showed this error:
Error using chckxy (line 30)
The X vector should have real elements.
Error in spline (line 53)
[x,y,sizey,endslopes] = chckxy(x,y);
Error in P3Dclassical>FS (line 119)
F(1,:) =-gamma*y(2,:)./y(1,:).^3./Ups;tau0=spline(gammah,tauh,gamma*x);
Error in P3Dclassical>getRHS (line 99)
F = FS(x,y,gamma,K,y0,gamma0,dt,OMEGA,PHI,C,tauh,gammah,ipen);
Error in P3Dclassical (line 33)
b=getRHS(xi,y,gamma,K,t,tc,y0,y0h,gamma0,dt,OMEGA,PHI,C,tauh,gammah,ipen);
With alpha=5/2 it showed:
Index exceeds the number of array elements (0).
Error in P3Dclassical (line 40)
if iter==Maxit,[' No convergence: Residual = ',num2str(R(iter))],end
Can you please explain this to me?
t=t + dt;if t >= tM,break;end;R=[];tauh=[tauh t];gammah=[gammah gamma];
That code initializes R to empty, and that is the only place R is assigned.
if iter==Maxit,[' No convergence: Residual = ',num2str(R(iter))],end
That code tries to index into R, but R is empty, so any index exceeds its length.
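A minimal standalone reproduction of that failure mode (not the original code, just an illustration of the mechanism):

```matlab
% R is initialized empty each time step and nothing ever appends to it,
% so any attempt to index it fails.
R = [];
iter = 5;           % pretend this is the final Newton iteration
try
    disp(R(iter));  % indexing an empty array
catch err
    disp(err.message)  % e.g. "Index exceeds the number of array elements (0)." (wording varies by release)
end
```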
Unfortunately I cannot tell from the code which variable holds residuals.
Testing iter==Maxit there is itself a bug. You are inside the loop
for iter = 1:Maxit
and during the iteration with iter = Maxit you might already have reached tolerance. The proper test should probably be
if norm(b, 2) >= tol
    fprintf('No convergence at t = %g\n', t);
else
    % the code for the rest of the while loop body goes here
end
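Putting that together, the convergence bookkeeping could look something like this. This is a sketch, assuming `Maxit`, `tol`, `b`, and `t` mean what they do in the original code, with the residual norm tracked explicitly in `R` so that it is never indexed while empty:

```matlab
R = zeros(1, Maxit);          % residual norm per iteration
for iter = 1:Maxit
    % ... assemble J and b, solve dS = J\b, update the solution ...
    R(iter) = norm(b, 2);
    if R(iter) < tol
        break                 % converged; do not treat iter==Maxit as failure
    end
end
if R(iter) >= tol
    fprintf('No convergence at t = %g, residual = %g\n', t, R(iter));
end
```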
I was frustrated by how badly formatted the code was, so I cleaned it up. The revised code is attached.
For some alpha values, lambda_u (the bound that flags excessively rapid growth) gets exceeded and the code terminates.
But for other alpha values, an error is generated inside spline() complaining that the x values are not real. Those x values come out of GS, but it is not GS's fault: GS is just doing its job, but tipVal is complex-valued. tipVal in turn is complex-valued because Atau is complex-valued. Atau is complex-valued because
Atau = (3*dgamma*gamma)^(1/3);
is complex valued if sign(dgamma) is different from sign(gamma). And that happens because dgamma goes negative in the line
dgamma = (gamma-gamma0)/dt;
gamma0 is a constant from the initialization phase.
gamma has been adjusted by the lines
dS = J\b;
gamma = gamma-dS(2*N+1);
so when the Jacobian system is solved against the right-hand side, gamma is back-projected too far.
As to why the jacobian gets to that state... I do not know. I do not follow the mathematics of what is being calculated.
The version I attached has alpha = 99/100, a value for which you get the problem with the splines (spline() receives complex inputs, because tipVal is complex, because gamma is driven too small, because the Jacobian solve leads you there.)
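You can see the mechanism in isolation: MATLAB's `^` with a fractional exponent returns the principal (complex) root for a negative base, so a sign flip in `dgamma` immediately makes `Atau` complex. If a real root is what the model actually wants, `nthroot` is the usual fix, but whether a real root is appropriate here is a modeling question, not just a code fix:

```matlab
x = -8;
x^(1/3)         % principal cube root: 1.0000 + 1.7321i (complex)
nthroot(x, 3)   % real cube root: -2
```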
I'm glad the original code irritated you :)
Thank you very much for your explanations. You helped me understand more about the code I have already spent months studying.
One more question: when running the revised code for certain cases where alpha fails to converge, instead of showing the text "Stop early due to failure" it still shows the same error as the original code. Why is that?
For example when alpha=2, it still showed:
Error using chckxy (line 30)
The X vector should have real elements.
Error in spline (line 53)
[x,y,sizey,endslopes] = chckxy(x,y);
Error in fracmodel>FS (line 212)
tau0 = spline(gammah,tauh,gamma*x);
Error in fracmodel>getRHS (line 173)
F = FS(x,y,gamma,K,y0,gamma0,dt,OMEGA,PHI,C,tauh,gammah,ipen);
Error in fracmodel (line 62)
b = getRHS(xi,y,gamma,K,t,tc,y0,y0h,gamma0,dt,OMEGA,PHI,C,tauh,gammah,ipen);
This is what the last part of my explanation was about: under some circumstances (I am not sure which ones), gamma back-projects to before gamma0, and that leads to complex values being generated; spline() cannot deal with complex values.
I missed that gamma0 itself is changing. In the while loop, gamma0 is set to gamma (the last gamma created by the for iter loop.)
The implication is that gamma-gamma0 can be negative whenever the solution to the J\b system goes more negative than the start of the iterations. Not quite "goes negative": the previous iterations in this cycle might have increased gamma, leaving some room to decrease it back to where it started for this cycle.
I got it now. Thank you for your clear explanation.
I do not have a complete program for you yet, but I decided I would post the current version of the code for you.
I implemented looping over alpha values.
I also did investigations of some of the failures, and put in smooth controls when some of the problems are detected, as I was able to track down where some of the problems happened, even if I do not yet understand why in all cases. So far the controls seem to have caught all of the cases of (for example) trying to spline over complex values.
My investigations appeared at first to show that there was a sharp cut-off of alpha values: any alpha up to some particular value around 0.4 worked, and every alpha larger than that led to failure. However, when I narrowed the failure point down to the 0.0001 level, I found that some alphas work after other alphas have failed. The range over which this happens is pretty narrow, though, and in my testing so far it looks like there is some alpha value beyond which nothing works.
The current program outputs summary information that can look similar to this:
Jacobian was dubious sometimes, rcond down to 3.73266e-13, but proceeded using \ operator anyhow; at alpha = 0.401732
Alpha = 0.401732 succeeded!
b went complex at t = 0.419431 alpha = 0.401733, iter = 5
No convergence at t = 0.419431
Jacobian was dubious sometimes, rcond down to 3.73266e-13, but proceeded using \ operator anyhow; at alpha = 0.401733
Alpha = 0.401733 stopped early due to failure, reason = "b went complex"
and
lambda_u exceeded at t = 0.0262153 alpha = 1.96, iter = 10
No convergence at t = 0.0262153
Jacobian was near singular, rcond down to 4.19951e-15, substituted pinv for \ operator at times; at alpha = 1.96
Alpha = 1.96 stopped early due to failure, reason = "lambda_u exceeded"
There are several key points about this:
  • there are alpha values that succeed with no warning messages needed
  • there are alpha values that succeed with warning -- the warning tells you that something very questionable is going on with the mathematics and yet something at least looks like success came out of it
  • there are alpha values for which convergence fails because the vector b in J\b becomes complex. When b becomes complex, that "pollutes" the remaining calculations and inevitably leads to an error about taking spline over complex values. There were no cases in which the calculation succeeded after b became complex
  • there are alpha values for which convergence fails because lambda_u exceeds the built-in safety point that apparently has something to do with the equation being considered to rise too rapidly (I do not know what that is about)
  • there are alpha values for which the Jacobian ends up with a reciprocal condition number small enough to be of distinct concern: a single bit of round-off difference can produce a change roughly 3e12 times larger than the value associated with that bit. Those cases happened regularly enough that the entire calculation chain should be reviewed as questionable. Still, I was able to find a boundary such that, when the reciprocal condition number was above it, the calculations sometimes succeeded. For that range of reciprocal condition number I just emit a warning, but still have the code path use the J\b calculation.
  • there are alpha values for which the Jacobian's reciprocal condition number gets small enough that MATLAB warns about likely failure of the J\b operation. I was able to find a boundary below which the overall calculation has never succeeded. Failure was not typically immediate in such cases, but given enough iterations there has always been failure when the reciprocal condition number was smaller. However, when I detect that case I do not terminate iteration; I avoid the singularity warning by switching that one operation from J\b to pinv(J)*b
  • 20 iterations was too small. I changed the limit. There were some calculations going at least as high as 55 iterations, and successes over 30.
  • So far, every alpha 0.401767 or higher has failed. 0.401766 succeeded even though some entries slightly lower failed.
  • In the cases where Jacobian (near-)singularities were detected, the failures split fairly evenly between exceeding lambda_u and b going complex (I thought I had detected a pattern that lambda_u was "nearly always" the failure cause in this situation, but further testing showed they are pretty balanced.)
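The detection logic described in the bullets above amounts to something like the following sketch. `RCOND_WARN` and `RCOND_PINV` are stand-ins for the empirically found boundaries, not values from the attached code:

```matlab
% Guards around the Newton solve. RCOND_WARN and RCOND_PINV are
% hypothetical thresholds; the real ones were found by experiment.
RCOND_WARN = 1e-10;   % below this: warn, but keep using the \ operator
RCOND_PINV = 1e-14;   % below this: substitute pinv to avoid the singularity warning
if ~isreal(b)
    error('b went complex at t = %g, iter = %d', t, iter);  % stop before spline() sees complex values
end
rc = rcond(J);
if rc < RCOND_PINV
    dS = pinv(J) * b;   % J\b would warn about near-singularity here
elseif rc < RCOND_WARN
    warning('Jacobian dubious, rcond = %g; proceeding with \\ anyhow', rc);
    dS = J \ b;
else
    dS = J \ b;
end
```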
As to why b sometimes goes complex: I traced it down to some calculations of the form sqrt(b - spline interpolation involving gamma and tau and different x locations), which produces a complex value if the interpolated value of the spline exceeds b. The calculation involved three different terms similar to that, added together; by splitting up the code my experience was that the first and third term both produced complex values and (so far in my testing) the second term was real-valued.
I have not tracked down the circumstances under which the interpolation gets too large. In at least some circumstances it is triggered when the new gamma value created in the main loop, inserted as the last gammah entry, is lower than the second-last entry. Since the gammah values are effectively the x values of the interpolation, that corresponds to a series of x values that doubles back on itself, leading to a distinct hook in the gammah vs tau plot. My earlier testing seemed to indicate that some doubling back was tolerated by the calculation, in that the spline did not end up projecting a value too large, but that doubling back "too much" was a problem in other cases. The next step would be to examine that more closely to figure out what is safe and what is not.
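A cheap guard for that doubling-back case is to check that the interpolation sites stay strictly increasing before calling spline() (a sketch, with `gammah` and `tauh` as the history vectors from the main loop):

```matlab
% Guard the interpolation sites before interpolating: spline() tolerates
% unsorted sites only if they are distinct, and doubling back is a sign
% the history has become unreliable.
if any(diff(gammah) <= 0)
    warning('gammah is not strictly increasing: the interpolation sites double back');
    % either stop the run here, or trim the non-monotonic tail before interpolating
end
tau0 = spline(gammah, tauh, gamma*x);
```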
I am not comfortable with the robustness of the algorithm. Reciprocal condition numbers do not have exact failure boundaries, but it is generally considered that by the time you get down to 1e-10 your algorithm has probably failed (design error ==> needs more robust code at the very least, but quite possibly the algorithm is Wrong) -- or else your equations have a singularity in that area. An rcond like 3.73266e-13 tells you that errors are being magnified by a factor of a trillion. You should not trust any answers involving such calculations. And yet the algorithm produces those Jacobians often, including for alpha = 0 (equivalent to the original version of the code with no time-dependent modification.) The algorithm needs a good going-over.
I am also not comfortable with the way it handles the alp value, switching between 1/3 and 3/8 as it goes, but passing whichever one it happened to be using last into the plotting routine as if it had always been that value. And I am not comfortable that there is a definite difference between that alp value and the alpha value that you asked to have put in: I worry that they are really the same factor and that the code should not be switching between 1/3 and 3/8 on the fly.
Current code version is attached.


