Confidence interval from the lsqnonlin function while retaining constraints

Is it possible to extract the confidence interval but also maintain constraints using the lsqnonlin function? Please see the code below.
initialguess = [3,25.5013,0.5,-0.8793]; %[n,tau(fs),m,c]
options = optimoptions('lsqnonlin','Algorithm','levenberg-marquardt','Display','iter');
lb = [0, 1, 0.5, -1];
ub = [inf, 2000, 0.5,0];
x = lsqnonlin(res_ds_model_real,initialguess,lb,ub);
  4 Comments
ke an on 1 Dec 2025 at 12:27
@Torsten @Star Strider thanks a lot guys, your information is helpful.

Accepted Answer

Umar on 2 Dec 2025 at 7:34
Edited: Torsten on 2 Dec 2025 at 12:24
@ke an,
I saw your question and thought I'd chime in. @Torsten and @Star Strider are absolutely right that the standard Jacobian approach doesn't work with active constraints, but there's a pretty straightforward workaround using bootstrap. (The demo below does call tinv and prctile, which come from the Statistics and Machine Learning Toolbox; if you don't have it, a sort-based percentile works just as well - see the helper sketched near the end of the code.)
I threw together a quick demo with synthetic data to show what's happening. The basic idea is you resample your residuals, add them back to your fitted values to make new synthetic datasets, then refit each one with the same constraints. Do this a thousand times and you get an empirical distribution of your parameters that automatically respects the bounds.
Looking at the results, you can see why the constraint issue matters. In my test case, the parameter m sits right at its lower bound (0.5) and the bootstrap correctly shows zero uncertainty there - the CI is literally [0.5000, 0.5000]. That's not a bug, that's telling you the data is pushing that parameter against the constraint. Meanwhile the other free parameters like n show nice symmetric intervals around +/-0.07, while tau has much wider uncertainty (roughly 20 to 35) because it's just harder to nail down from noisy data. The parameter c is interesting because it hits the lower bound on one side but has room to vary on the other, so you get an asymmetric interval [-1.0, -0.72] which is exactly what you'd expect near a boundary.
The code is pretty simple - just wrap your lsqnonlin call in a loop, resample residuals each iteration, and collect the parameter estimates. Takes about a second for 1000 bootstrap samples. The key part is just making sure you add the resampled residuals to your fitted values (not your original data) before refitting.
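The core of it, stripped down, looks like this (a minimal sketch using the same names as the demo below; x_opt and residual come from the original constrained fit):
y_fit = y_data - residual;   % fitted values (residual = y_data - model)
nB = 1000;
P = zeros(nB, numel(x_opt)); % one row of parameter estimates per resample
for b = 1:nB
    % resample residuals with replacement and add them to the FIT, not the data
    y_boot = y_fit + residual(randi(numel(residual), size(residual)));
    P(b,:) = lsqnonlin(@(p) y_boot - compute_model(p, x_data), x_opt, lb, ub, options);
end
% column-wise percentiles of P give the confidence intervals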
One thing to watch for: if you consistently see parameters jammed against their bounds across most bootstrap samples, that might mean either your bounds are too tight or the data genuinely wants that value. Worth thinking about whether your constraints make physical sense for the problem. In your case with the stretched exponential, if m keeps hitting 0.5, maybe the model wants something below that or maybe 0.5 is actually the right answer for your system.
Bottom line: bootstrap gives you valid confidence intervals even with constraints active, while the Jacobian method breaks down because it assumes the parameters can move freely in all directions. The bootstrap distributions naturally truncate at the boundaries, which is the correct statistical behavior. It's more computationally intensive, but still entirely practical for most problems.
Hope this helps!
%% Bootstrap Confidence Intervals for Constrained lsqnonlin
% This script provides confidence intervals for constrained optimization.
% It requires the Optimization Toolbox; tinv and prctile additionally come
% from the Statistics and Machine Learning Toolbox (a base-MATLAB
% percentile helper is sketched near the end if you lack it)
%% Generate synthetic data for demonstration
fprintf('Generating synthetic data...\n');
% True parameters for data generation
true_params = [3.0, 25.0, 0.5, -0.8]; % [n, tau, m, c]
% Generate x data (e.g., time points)
rng(42); % For reproducibility
x_data = linspace(0.1, 100, 100)';
% Generate y data with noise
y_true = true_params(1) * exp(-(x_data/true_params(2)).^true_params(3)) + true_params(4);
noise_level = 0.1;
y_data = y_true + noise_level * randn(size(y_true));
% Visualize the data
figure('Position', [100, 100, 800, 400]);
subplot(1, 2, 1);
plot(x_data, y_data, 'ko', 'MarkerSize', 4, 'MarkerFaceColor', [0.7, 0.7, 0.7]);
hold on;
plot(x_data, y_true, 'r-', 'LineWidth', 2);
xlabel('x');
ylabel('y');
title('Synthetic Data');
legend('Noisy data', 'True model', 'Location', 'best');
grid on;
fprintf(' True parameters: n=%.2f, tau=%.2f, m=%.2f, c=%.2f\n', ...
true_params(1), true_params(2), true_params(3), true_params(4));
%% Step 1: Solve the original problem
initialguess = [3, 25.5013, 0.5, -0.8793]; % [n, tau(fs), m, c]
lb = [0, 1, 0.5, -1];
ub = [inf, 2000, 0.5, 0];
% Note: 'levenberg-marquardt' does not handle bound constraints, so use
% 'trust-region-reflective' whenever lb/ub are supplied
options = optimoptions('lsqnonlin', 'Algorithm', 'trust-region-reflective', ...
    'Display', 'off');
% Define objective function (anonymous handle capturing x_data and y_data)
objective = @(params) compute_residuals(params, x_data, y_data);
% Get the full solution with Jacobian
[x_opt, resnorm, residual, exitflag, output, lambda, jacobian] = ...
lsqnonlin(objective, initialguess, lb, ub, options);
fprintf('\nOptimal parameters:\n');
Optimal parameters:
fprintf('n = %.4f (true: %.2f)\n', x_opt(1), true_params(1));
n = 3.0485 (true: 3.00)
fprintf('tau = %.4f (true: %.2f)\n', x_opt(2), true_params(2));
tau = 27.0510 (true: 25.00)
fprintf('m = %.4f (true: %.2f)\n', x_opt(3), true_params(3));
m = 0.5000 (true: 0.50)
fprintf('c = %.4f (true: %.2f)\n', x_opt(4), true_params(4));
c = -0.8577 (true: -0.80)
% Plot the fit
subplot(1, 2, 2);
y_fitted = compute_model(x_opt, x_data);
plot(x_data, y_data, 'ko', 'MarkerSize', 4, 'MarkerFaceColor', [0.7, 0.7, 0.7]);
hold on;
plot(x_data, y_fitted, 'b-', 'LineWidth', 2);
plot(x_data, y_true, 'r--', 'LineWidth', 1.5);
xlabel('x');
ylabel('y');
title('Fitted Model');
legend('Data', 'Fitted model', 'True model', 'Location', 'best');
grid on;
%% Step 2: Check if constraints are active
tolerance = 1e-6;
active_lower = abs(x_opt - lb) < tolerance;
active_upper = abs(x_opt - ub) < tolerance;
constraints_active = any(active_lower) || any(active_upper);
fprintf('\nConstraint Status:\n');
param_names = {'n', 'tau', 'm', 'c'};
for i = 1:length(x_opt)
if active_lower(i)
fprintf('%s is at LOWER bound (%.4f)\n', param_names{i}, lb(i));
elseif active_upper(i)
fprintf('%s is at UPPER bound (%.4f)\n', param_names{i}, ub(i));
else
fprintf('%s is FREE (%.4f)\n', param_names{i}, x_opt(i));
end
end
n is FREE (3.0485)
tau is FREE (27.0510)
m is at LOWER bound (0.5000)
c is FREE (-0.8577)
%% Step 3: Method 1 - Simple Standard Error (only valid if NO constraints active)
if ~constraints_active
fprintf('\n=== METHOD 1: Standard Error Approach ===\n');
fprintf('(Valid only because no constraints are active)\n\n');
n_data = length(residual);
n_params = length(x_opt);
% Estimate variance
sigma2 = resnorm / (n_data - n_params);
% Covariance matrix
JtJ = jacobian' * jacobian;
covar = sigma2 * inv(JtJ);
% Standard errors
se = sqrt(diag(covar));
% 95% confidence intervals (assuming t-distribution)
alpha = 0.05;
df = n_data - n_params;
t_crit = tinv(1 - alpha/2, df);
ci_lower = x_opt - t_crit * se;
ci_upper = x_opt + t_crit * se;
fprintf('Parameter Estimates with 95%% CI:\n');
for i = 1:length(x_opt)
fprintf('%5s: %.4f [%.4f, %.4f] ± %.4f\n', ...
param_names{i}, x_opt(i), ci_lower(i), ci_upper(i), se(i));
end
fprintf('\nWARNING: These CIs may extend beyond constraint boundaries!\n');
else
fprintf('\n=== METHOD 1: Not applicable (constraints are active) ===\n');
end
%% Step 4: Method 2 - Residual Bootstrap
fprintf('\n=== METHOD 2: Residual Bootstrap ===\n');
fprintf('(Works with constraints, more reliable)\n\n');
n_bootstrap = 1000; % Number of bootstrap samples
bootstrap_params = zeros(n_bootstrap, length(x_opt));
% Store fitted values
y_fitted = compute_model(x_opt, x_data);
% Print progress to the command window
fprintf('Running bootstrap (this may take a minute)...\n');
tic;
% Bootstrap options - suppress output
% Bootstrap options - suppress output ('trust-region-reflective' again,
% since 'levenberg-marquardt' cannot take the bounds)
boot_options = optimoptions('lsqnonlin', 'Algorithm', 'trust-region-reflective', ...
    'Display', 'off', 'MaxIterations', 100);
for b = 1:n_bootstrap
% Resample residuals with replacement
boot_indices = randi(length(residual), length(residual), 1);
boot_residuals = residual(boot_indices);
% Create bootstrapped data by adding resampled residuals to fitted values
y_boot = y_fitted + boot_residuals;
% Define bootstrap objective function
boot_objective = @(params) compute_residuals(params, x_data, y_boot);
% Solve with same constraints
try
x_boot = lsqnonlin(boot_objective, x_opt, lb, ub, boot_options);
bootstrap_params(b, :) = x_boot;
catch
% If optimization fails, use the original solution
bootstrap_params(b, :) = x_opt;
end
% Progress indicator
if mod(b, 100) == 0
fprintf(' Completed %d/%d iterations\n', b, n_bootstrap);
end
end
elapsed = toc;
fprintf('Bootstrap completed in %.1f seconds\n\n', elapsed);
Bootstrap completed in 0.9 seconds
%% Step 5: Calculate bootstrap confidence intervals
alpha = 0.05;
fprintf('Bootstrap 95%% Confidence Intervals:\n');
fprintf('%-5s %10s %12s %12s %10s\n', 'Param', 'Estimate', 'CI Lower', 'CI Upper', 'Std Error');
fprintf('%s\n', repmat('-', 1, 60));
for i = 1:length(x_opt)
% Percentile method
ci_lower_boot = prctile(bootstrap_params(:, i), alpha/2 * 100);
ci_upper_boot = prctile(bootstrap_params(:, i), (1 - alpha/2) * 100);
se_boot = std(bootstrap_params(:, i));
fprintf('%-5s %10.4f [%10.4f, %10.4f] ± %.4f\n', ...
param_names{i}, x_opt(i), ci_lower_boot, ci_upper_boot, se_boot);
end
Param   Estimate     CI Lower     CI Upper   Std Error
------------------------------------------------------------
n         3.0485  [   2.9221,     3.1828]   ± 0.0679
tau      27.0510  [  20.4260,    35.1022]   ± 4.0409
m         0.5000  [   0.5000,     0.5000]   ± 0.0000
c        -0.8577  [  -1.0000,    -0.7161]   ± 0.0833
%% Step 6: Visualize bootstrap distributions
figure('Position', [100, 100, 1200, 800]);
for i = 1:length(x_opt)
subplot(2, 2, i);
% Histogram of bootstrap samples
histogram(bootstrap_params(:, i), 50, 'Normalization', 'pdf', ...
'FaceColor', [0.3, 0.5, 0.8], 'EdgeColor', 'none', 'FaceAlpha', 0.7);
hold on;
% Add vertical line at estimate
yl = ylim;
plot([x_opt(i), x_opt(i)], yl, 'r-', 'LineWidth', 2);
% Add confidence interval bounds
ci_low = prctile(bootstrap_params(:, i), alpha/2 * 100);
ci_high = prctile(bootstrap_params(:, i), (1 - alpha/2) * 100);
plot([ci_low, ci_low], yl, 'r--', 'LineWidth', 1.5);
plot([ci_high, ci_high], yl, 'r--', 'LineWidth', 1.5);
% Add constraint bounds if applicable
if isfinite(lb(i))
plot([lb(i), lb(i)], yl, 'k-', 'LineWidth', 2);
text(lb(i), yl(2)*0.9, 'LB', 'HorizontalAlignment', 'center');
end
if isfinite(ub(i))
plot([ub(i), ub(i)], yl, 'k-', 'LineWidth', 2);
text(ub(i), yl(2)*0.9, 'UB', 'HorizontalAlignment', 'center');
end
xlabel(param_names{i});
ylabel('Density');
title(sprintf('%s: %.4f [%.4f, %.4f]', ...
param_names{i}, x_opt(i), ci_low, ci_high));
grid on;
hold off;
end
sgtitle('Bootstrap Parameter Distributions with 95% Confidence Intervals');
%% Helper functions
function residuals = compute_residuals(params, x_data, y_data)
% Compute residuals between data and model predictions
y_pred = compute_model(params, x_data);
residuals = y_data - y_pred;
end
function y_pred = compute_model(params, x_data)
% Generic stretched exponential decay model
% y = n * exp(-(x/tau)^m) + c
n = params(1); % Amplitude/scaling parameter
tau = params(2); % Time constant (characteristic time scale)
m = params(3); % Stretching exponent
c = params(4); % Baseline offset
% Compute model predictions
y_pred = n * exp(-(x_data/tau).^m) + c;
end
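% Optional stand-in: prctile and tinv in this script come from the
% Statistics and Machine Learning Toolbox. If you don't have it, a
% sort-based percentile like this sketch (the name boot_prctile is
% arbitrary) can replace prctile in Steps 5 and 6.
function p = boot_prctile(v, pct)
% Percentile via sorting + linear interpolation, matching prctile's
% convention that the k-th sorted sample sits at 100*(k-0.5)/n percent
v = sort(v(:));               % pooled, sorted samples
n = numel(v);
q = 100*((1:n)' - 0.5)/n;     % percentile positions of the sorted samples
p = interp1(q, v, pct);       % interpolate between adjacent samples
p(pct < q(1)) = v(1);         % clamp below the smallest sample
p(pct > q(end)) = v(end);     % clamp above the largest sample
end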
%% Additional Notes
% (Plain comments rather than fprintf calls: in a script, executable
% statements cannot follow local function definitions.)
%
% IMPORTANT NOTES
% 1. The bootstrap approach respects constraints automatically.
% 2. Bootstrap CIs are valid even when parameters are at boundaries.
% 3. This uses a RESIDUAL BOOTSTRAP - resampling residuals from the fit.
% 4. Increase n_bootstrap to 5000-10000 for publication-quality results.
% 5. If parameters consistently hit boundaries, consider whether your
%    model/constraints are appropriate for the data.
%
% HOW TO ADAPT THIS CODE
% 1. Replace the data generation section with your actual data.
% 2. Modify compute_model() to match your specific model equation.
% 3. Adjust the parameter bounds (lb, ub) as needed.
% 4. Keep the bootstrap resampling logic - it works for any model!
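For example, step 2 of the adaptation notes just means editing compute_model. A hypothetical biexponential version (the parameter layout here is my own illustration, not from the original script) might look like:
function y_pred = compute_model(params, x_data)
% Hypothetical biexponential decay: y = a1*exp(-x/t1) + a2*exp(-x/t2) + c
a1 = params(1); t1 = params(2); % first amplitude and time constant
a2 = params(3); t2 = params(4); % second amplitude and time constant
c  = params(5);                 % baseline offset
y_pred = a1*exp(-x_data/t1) + a2*exp(-x_data/t2) + c;
end
Remember to resize initialguess, lb, ub, and param_names to match the new parameter count.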
  2 Comments
Umar on 2 Dec 2025 at 7:46
Sorry, while I was trying to write complete code over here, it popped up an error, please see attached.

More Answers (1)

Matt J on 1 Dec 2025 at 14:35
Edited: Matt J on 1 Dec 2025 at 20:28
After solving the original problem,
initialguess = [3,25.5013,0.5,-0.8793]; %[n,tau(fs),m,c]
options = optimoptions('lsqnonlin','Algorithm','levenberg-marquardt','Display','iter');
lb = [0, 1, 0.5, -1];
ub = [inf, 2000, 0.5,0];
[x,r2,~,ex,~,~,J] = lsqnonlin(res_ds_model_real,initialguess,lb,ub);
sig = r2/(height(J)-numel(x)); % estimated residual variance (with m fixed by lb = ub, you could argue for height(J)-3 here)
You need to devise a surjective change of variables x = T(y) such that x satisfies the constraints for every y. Note that since lb(3) = ub(3) = 0.5, the parameter m is completely fixed: it carries no uncertainty and must be left out of the analysis (otherwise dTdy has a zero on its diagonal and Jy'*Jy below is singular). For example,
x = [y(1)^2; 1999/2*(sin(y(2))+1)+1; 0.5; (sin(y(4))-1)/2];
Then compute y at the solution and the Jacobian of T,
y = [sqrt(x(1)); asin(2*(x(2)-1)/1999 - 1); 0; asin(2*x(4)+1)];
dTdy = diag([2*y(1); 1999/2*cos(y(2)); 0; cos(y(4))/2]);
and then the standard errors on the free components of y,
free = [1, 2, 4];                  % m (index 3) is fixed by lb = ub
Jy = J*dTdy(:, free);
ySE = sqrt(diag(sig*inv(Jy.'*Jy))) % standard errors on y(free)
The standard errors on y will give you a confidence interval on y. You will need to propagate it through T() to get a confidence interval on x.
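For instance, a minimal sketch of that propagation step (assuming the transform above, a 95% level, and that each interval stays on a branch where T is monotone; tinv needs the Statistics Toolbox):
dof    = height(J) - numel(free);               % residual degrees of freedom
t_crit = tinv(0.975, dof);                      % two-sided 95% critical value
yCI    = [y(free) - t_crit*ySE, y(free) + t_crit*ySE];
Tfree  = @(y) [y(1)^2; 1999/2*(sin(y(2))+1)+1; (sin(y(3))-1)/2]; % free params only
xCI    = sort([Tfree(yCI(:,1)), Tfree(yCI(:,2))], 2) % 95% CI on [n; tau; c]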
