
## Bayesian Linear Regression Workflow

Econometrics Toolbox™ includes a self-contained framework that allows you to implement Bayesian linear regression. In general, you can use this workflow to estimate features of the posterior distributions, and then forecast observations given predictor data.

1. Choose a joint prior distribution for (β,σ²). Then, using `bayeslm`, create a Bayesian linear regression model object that completely specifies your beliefs about the joint prior distribution. The available prior model objects are described below.

**`conjugateblm`**

Joint prior distribution of (β,σ²):

- π(β|σ²) is Gaussian with mean `Mu` and covariance σ²`V`.
- π(σ²) is inverse-gamma with shape `A` and scale `B`.

Create one when all of the following are true:

- You are fairly confident that the parameters have the corresponding joint prior, and that β depends on σ².
- You want to incorporate your prior knowledge of the prior mean and covariance of β and of the shape and scale of σ².
- You want analytical forms for the marginal and conditional posteriors. These assumptions yield normal-inverse-gamma conjugate distributions for both.

**`semiconjugateblm`**

Joint prior distribution of (β,σ²):

- π(β) is Gaussian with mean `Mu` and covariance `V`.
- π(σ²) is inverse-gamma with shape `A` and scale `B`.
- β and σ² are independent.

Create one when all of the following are true:

- You are fairly confident that the parameters have the corresponding joint prior, and that β and σ² are independent.
- You want to incorporate your prior knowledge of the prior mean and covariance of β and of the shape and scale of σ².
- You want analytical forms for the conditional posteriors. These assumptions yield normal-inverse-gamma conjugate conditional distributions.

**`diffuseblm`**

Joint prior distribution of (β,σ²):

$\pi\left(\beta,\sigma^2\right)\propto\frac{1}{\sigma^2}.$

Create one when all of the following are true:

- You want the posterior to be influenced much more by the information in the data than by the prior.
- The joint prior distribution is inversely proportional to σ², that is, the Jeffreys noninformative prior [1].
- You want analytical forms for the marginal and conditional posteriors. These assumptions yield normal-inverse-gamma conjugate distributions for both.

**`customblm`**

Joint prior distribution of (β,σ²): a function handle to a custom function that computes the log of the joint prior density.

Create one when you want to specify the log of the joint prior distribution yourself. This specification allows maximal flexibility.
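As a sketch of this step, the following shows how each prior type above might be created with `bayeslm`. The number of predictors and the custom log-prior function are assumed example values, not part of the original text; check the `bayeslm` reference page for the exact name-value arguments.

```matlab
% Assumed example: a regression with three predictor variables.
p = 3;

% Conjugate normal-inverse-gamma prior (conjugateblm object).
PriorConjugate = bayeslm(p,'ModelType','conjugate');

% Diffuse (Jeffreys) prior (diffuseblm object).
PriorDiffuse = bayeslm(p,'ModelType','diffuse');

% Custom prior (customblm object): supply a handle that returns the log
% of the joint prior density. This particular handle is a hypothetical
% example that reproduces the diffuse prior, log(1/sigma2).
logPriorFcn = @(beta,sigma2) -log(sigma2);
PriorCustom = bayeslm(p,'ModelType','custom','LogPDF',logPriorFcn);
```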

2. Given data, estimate features of the posterior distributions. The functions to use in this step depend on your analysis goals.

**`estimate`**

Use `estimate` when:

- You want a posterior model object for forecasting. Posterior model objects include:
  - Estimates of the mean and covariance matrix of the marginal posterior π(β|y,x), and of the mean and variance of π(σ²|y,x).
  - The marginal posteriors and their parameter values. Analytical solutions are available for `conjugateblm` and `diffuseblm` prior models. For all other prior models, `estimate` must use Monte Carlo sampling.
  - 95% equal-tailed credible intervals. For nonanalytical posteriors, the 95% equal-tailed credible interval runs from the 0.025 quantile to the 0.975 quantile of the retained Monte Carlo sample.
- You want to estimate the mean and covariance of the conditional distribution π(β|σ²,y,x), that is, implement linear regression with σ² held fixed.
- You want to update an existing posterior distribution based on new data.

**`simulate`**

Use `simulate` when:

- You want to approximate the expected value of a function of the parameters with respect to the joint posterior π(β,σ²|y,x). That is, you want to draw many samples of (β,σ²) from their joint posterior, apply a function to each draw, and then average the transformed draws.
- You want to draw from the conditional posterior distributions π(β|σ²,y,x) and π(σ²|β,y,x). This option is convenient for running an MCMC sampler, for example, a Gibbs sampler.
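A hedged sketch of this step, assuming `PriorMdl` is a prior model from `bayeslm` and that `X` (n-by-p) and `y` (n-by-1) hold the predictor data and responses; the draw count, the example function of the parameters, and the columnwise draw layout are assumptions, not from the original text.

```matlab
% Estimate features of the posteriors and obtain a posterior model object.
PosteriorMdl = estimate(PriorMdl,X,y);

% Approximate E[f(beta,sigma2) | y,x] by averaging over joint posterior
% draws; here f is, as an example, the ratio of two coefficients.
% Draws are assumed to be stored columnwise (one column per draw).
[BetaDraws,Sigma2Draws] = simulate(PosteriorMdl,'NumDraws',10000);
ratioEst = mean(BetaDraws(1,:)./BetaDraws(2,:));
```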

3. If you have a custom prior model (`customblm` object), then choose a Markov chain Monte Carlo (MCMC) sampler when you call `estimate` or `simulate`. The supported MCMC samplers are described below. After choosing a sampler, try the default tuning parameter values first.

**Hamiltonian Monte Carlo (HMC)** — specify using `'Sampler',"hmc"`

- Try this sampler first: the HMC sampler tunes itself, and the resulting samples mix well and converge to their stationary distribution more quickly.
- To increase sampling speed, supply the gradient of the log PDF for all or some of the parameters.

**Random walk Metropolis** — specify using `'Sampler',"metropolis"`

- Try this sampler if the sample size is reasonably large and the prior does not dominate the likelihood.
- Supported proposal distributions are the multivariate normal and multivariate t distributions.
- Tuning parameters include the proposal distribution, its scale matrix, and its degrees of freedom.

**Slice** — specify using `'Sampler',"slice"` (default)

- To achieve adequate mixing and convergence, carefully tune the typical sampling-interval width. Appropriate values are application dependent.
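For example, assuming `PriorMdl` is a `customblm` model and `X` and `y` hold the data, the sampler choice might be passed to `estimate` like this (a sketch; the variable names are assumptions):

```matlab
% Select the HMC sampler when estimating a custom prior model.
PosteriorMdl = estimate(PriorMdl,X,y,'Sampler',"hmc");
```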

After estimating a nonanalytical posterior by using an MCMC sampler, inspect the posterior or conditional posterior draws for adequate mixing. For more details, see Posterior Estimation and Simulation Diagnostics.

If the quality of the samples is not satisfactory, then create a sampler options structure by using `sampleroptions`, which allows you to specify the tuning parameter values that are appropriate for the sampler. For example, to specify a random walk Metropolis sampler that uses a multivariate t proposal distribution with 5 degrees of freedom, enter:

```matlab
options = sampleroptions('Sampler',"metropolis",'Distribution',"mvt", ...
    'DegreeOfFreedom',5)
```
After you create the sampler options structure, specify it when you call `estimate` or `simulate` by using the `'Options'` name-value pair argument.
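Putting the two calls together, a minimal sketch (assuming `PriorMdl`, `X`, and `y` exist as in the earlier steps) might look like:

```matlab
% Tune a random walk Metropolis sampler with a multivariate t proposal,
% then pass the options structure to estimate.
options = sampleroptions('Sampler',"metropolis",'Distribution',"mvt", ...
    'DegreeOfFreedom',5);
PosteriorMdl = estimate(PriorMdl,X,y,'Options',options);
```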

4. Forecast responses given new predictor data by using `forecast`. `forecast` constructs forecasts from the posterior predictive distribution. Analytical posterior predictive distributions are available for `conjugateblm` and `diffuseblm` prior models; for all other prior models, `forecast` uses Monte Carlo sampling. As with estimation and simulation, you can choose an MCMC sampler for `customblm` models. If `forecast` uses an MCMC sampler, inspect the posterior or conditional posterior draws for adequate mixing.
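As a sketch of the forecasting step, assuming `PriorMdl`, `X`, and `y` are as before and `XF` (m-by-p) holds the new predictor data (the exact calling form may differ; see the `forecast` reference page):

```matlab
% Forecast m responses at the new predictor values XF, conditioning on
% the observed data X and y.
yForecast = forecast(PriorMdl,XF,X,y);
```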

## References

[1] Marin, J. M., and C. P. Robert. *Bayesian Core: A Practical Approach to Computational Bayesian Statistics*. New York: Springer Science+Business Media, LLC, 2007.