Setting Break Point Changes Answer

I recently observed that setting a breakpoint inside a function changed a calculation made downstream (and changed the answer). I distilled it to a small example:
function z = breakpointcode
b = single(0.75); % Setting break point here will change answer downstream
c = single(0.866084992885589599609375);
z = b / c - c;
fprintf('%x %15.8e\n',typecast(z,'uint32'),z);
d = single(double(b)/double(c)-double(c));
fprintf('%x %15.8e\n',typecast(d,'uint32'),d);
If you run the code without setting a breakpoint, the z and d results match, so apparently the z calculation is being done in double precision in the background (presumably to "help" you out with precision). But if you set a breakpoint on the indicated line, the z calculation is instead done in single precision, and it no longer matches the double-precision calculation.
I suppose the JIT is trying to "help" me here, but actually it is causing me headaches. I am trying to emulate a single precision calculation on a different machine ... I WANT the entire calculation to happen in single precision. That's why I made the variables single in the first place.
Well, I guess I already knew that I can't trust PC calculations on single variables to actually be done in single precision in the background (I have the same general problem with C/C++ and Fortran compilers, not just MATLAB). But what threw me in this case was that setting a breakpoint could actually change how the calculation was done downstream. You can keep setting and clearing the breakpoint and the answer will keep flipping back and forth between the single- and double-precision results. I spot-checked R2006b and R2013a and they both do the same thing (32-bit WinXP). My suspicion is that there may be two different parsed versions of the function in memory at the same time ... one runs when there are no breakpoints and the other runs when there are. Just a guess ...
NOTE: FYI, although the example has "constants" for b and c, I get the same behavior if b and c are input as arguments. I.e., this behavior is not just because the JIT recognizes b and c as constants and can pre-calculate the answer.
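For anyone who, like the original poster, wants a guaranteed single-precision result regardless of what the JIT does: one workaround sketch (relying only on MATLAB's single()/double() conversions performing correct IEEE 754 rounding) is to do each operation in double and explicitly round every intermediate back to single:

```matlab
% Sketch: emulate strict single precision independent of the JIT by
% performing each operation in double and rounding each intermediate
% result back to single. Each single() conversion is a correct
% round-to-nearest, so the sequence reproduces true single arithmetic.
b = single(0.75);
c = single(0.866084992885589599609375);
q = single(double(b) / double(c));   % single-rounded quotient
z = single(double(q) - double(c));   % single-rounded difference
fprintf('%x %15.8e\n', typecast(z, 'uint32'), z);
```

This forces the rounding step immediately after the division, which is exactly where the single- and double-precision paths in the example diverge.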
James Tursa
James Tursa on 9 Apr 2013
Interesting. I found a Win7 machine in our area and re-ran the test. 32-bit R2013a exhibits the difference, but 64-bit R2013a does not (just like your result).


Accepted Answer

Jan on 9 Jun 2013
Under R2009a/64/Win7 I get:
breakpointcode % No breakpoints
% b8f9e000 -1.19149685e-004
% b8f9ed1c -1.19174103e-004
% ans = -1.1915e-004
And the same result with a breakpoint inside the code. Then I disable the JIT:
feature jit off
% b8f9ed1c -1.19174103e-004
% b8f9ed1c -1.19174103e-004
% ans = -1.1917e-004 % !!! z now matches the double-precision path single(double(b)/double(c)-double(c)) !!!
This suggests it is not the debugger itself but the JIT that causes the differences. Since the JIT may reorder operations when that increases speed, it must be disabled for debugging (and profiling).
Now the same for the 32 bit version:
R2009a/32, no break point (and the same result with JIT=off):
% b8f9ed1c -1.19174103e-004
% b8f9ed1c -1.19174103e-004
% ans = -1.1917e-004
R2009a/32, with break point:
% b8f9e000 -1.19149685e-004
% b8f9ed1c -1.19174103e-004
% ans = -1.1915e-004
Conclusion: under R2009a/32 it is the debugger, not the JIT, that modifies the calculation, while under R2009a/64 it is the other way around. I'm not sure what this means.
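If it helps others reproduce this: given the two bit patterns reported in this thread, a quick sketch (my own helper logic, not an official diagnostic) for classifying which path a given session takes could look like:

```matlab
% Sketch: classify the result of breakpointcode by its bit pattern,
% using the two hex values reported in this thread.
z = breakpointcode;                       % function from the question
h = sprintf('%x', typecast(z, 'uint32'));
switch h
    case 'b8f9e000'
        disp('strict single-precision path');
    case 'b8f9ed1c'
        disp('double-precision intermediate path');
    otherwise
        disp(['unexpected pattern: ' h]);
end
```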


