"...one of the most highly regarded and expertly designed C++ library projects in the world."

— Herb Sutter and Andrei Alexandrescu, C++ Coding Standards

The scaled-inverse-chi-squared distribution is the conjugate prior distribution
for the variance (σ^{2}) parameter of a normal distribution with known expectation
(μ). As such it has widespread application in Bayesian statistics:

In Bayesian inference, the strength of belief in particular parameter values is itself described through a probability distribution; the parameters themselves are thus modelled and interpreted as random variables.

In this worked example, we perform such a Bayesian analysis by using the scaled-inverse-chi-squared distribution as prior and posterior distribution for the variance parameter of a normal distribution.

For more general information on Bayesian analysis, see:

- Andrew Gelman, John B. Carlin, Hal E. Stern, Donald B. Rubin, Bayesian Data Analysis, 2003, ISBN 978-1439840955.
- Jim Albert, Bayesian Computation with R, Springer, 2009, ISBN 978-0387922973.

(As the scaled-inverse-chi-squared distribution is another parameterization of the inverse-gamma distribution, this example could equally have used the inverse-gamma distribution.)

Consider precision machines which produce balls for a high-quality ball
bearing. Ideally each ball should have a diameter of precisely 3000 μm
(3 mm). Assume that machines generally produce balls of that size on
average (mean), but individual balls can vary slightly in either direction
following (approximately) a normal distribution. Depending on various
production conditions (e.g. raw material used for balls, workplace temperature
and humidity, maintenance frequency and quality) some machines produce
balls more tightly distributed around the target of 3000 μm, while others
produce balls with a wider distribution. Therefore the variance parameter of
the normal distribution of the ball sizes varies from machine to machine.
An extensive survey by the precision machinery manufacturer, however,
has shown that most machines operate with a variance between 15 and 50,
and near 25 μm^{2} on average.

Using this information, we want to model the variance of the machines. The variance is strictly positive, and we therefore look for a statistical distribution with support on the positive real numbers. Given that the expectation of the normal distribution of the ball sizes is known (3000 μm), for reasons of conjugacy it is customary practice in Bayesian statistics to model the variance as scaled-inverse-chi-squared distributed.

In a first step, we will try to use the survey information to model the general knowledge about the variance parameter of machines measured by the manufacturer. This will provide us with a generic prior distribution that is applicable if nothing more specific is known about a particular machine.

In a second step, we will then combine the prior-distribution information in a Bayesian analysis with data on a specific single machine to derive a posterior distribution for that machine.

Using the survey results, we try to find the parameter set of a scaled-inverse-chi-squared
distribution so that the properties of this distribution match the results.
Using the mathematical properties of the scaled-inverse-chi-squared distribution
as a guideline, we see that both the mean and mode of the scaled-inverse-chi-squared
distribution are approximately given by the scale parameter (s) of the
distribution. As the survey machines operated at a variance of 25 μm^{2} on
average, we hence set the scale parameter (s_{prior}) of our prior distribution
equal to this value. Using some trial-and-error and calls to the global
quantile function, we also find that a value of 20 for the degrees-of-freedom
(ν_{prior}) parameter is adequate so that most of the prior distribution
mass is located between 15 and 50 (see figure below).

We first construct our prior distribution using these values, and then list out a few quantiles:

```cpp
double priorDF = 20.0;
double priorScale = 25.0;

inverse_chi_squared prior(priorDF, priorScale);
// Using an inverse_gamma distribution instead, we could equivalently write
// inverse_gamma prior(priorDF / 2.0, priorScale * priorDF / 2.0);

cout << "Prior distribution:" << endl << endl;
cout << "  2.5% quantile: " << quantile(prior, 0.025) << endl;
cout << "  50% quantile: " << quantile(prior, 0.5) << endl;
cout << "  97.5% quantile: " << quantile(prior, 0.975) << endl << endl;
```

This produces this output:

```
Prior distribution:

  2.5% quantile: 14.6
  50% quantile: 25.9
  97.5% quantile: 52.1
```

Based on this distribution, we can now calculate the probability of having
a machine working with an unusual work precision (variance) at <= 15
or > 50. For this task, we use calls to the `boost::math::` functions
`cdf` and `complement`, respectively, and find a probability of about
0.031 (3.1%) for each case.

```cpp
cout << "  probability variance <= 15: " << boost::math::cdf(prior, 15.0) << endl;
cout << "  probability variance <= 25: " << boost::math::cdf(prior, 25.0) << endl;
cout << "  probability variance > 50:  " << boost::math::cdf(boost::math::complement(prior, 50.0)) << endl << endl;
```

This produces this output:

```
  probability variance <= 15: 0.031
  probability variance <= 25: 0.458
  probability variance > 50:  0.0318
```

Therefore, only 3.1% of all precision machines produce balls with a variance of 15 or less (particularly precise machines), but also only 3.2% of all machines produce balls with a variance as high as 50 or more (particularly imprecise machines). Moreover, slightly more than one-half (1 - 0.458 = 54.2%) of the machines work at a variance greater than 25.

Notice here the distinction between a Bayesian analysis and a frequentist
analysis: because we model the variance as a random variable itself, we
can calculate, and straightforwardly interpret, probabilities for given
parameter values directly, while such an approach is not possible (and,
in terms of interpretation, strictly forbidden) in the frequentist world.

In the second step, we investigate a single machine, which is suspected to suffer from a major fault as the produced balls show fairly high size variability. Based on the prior distribution of generic machinery performance (derived above) and data on balls produced by the suspect machine, we calculate the posterior distribution for that machine and use its properties for guidance regarding continued machine operation or suspension.

It can be shown that if the prior distribution was chosen to be scaled-inverse-chi-squared distributed, then the posterior distribution is also scaled-inverse-chi-squared distributed (prior and posterior distributions are hence conjugate). For more details regarding conjugacy and the formulas to derive the parameter set of the posterior distribution, see Conjugate prior.

Given the prior distribution parameters and sample data (of size n), the posterior distribution parameters are given by the two expressions:

ν_{posterior} = ν_{prior} + n

which gives the posteriorDF below, and

s_{posterior} = (ν_{prior}s_{prior} + Σ^{n}_{i=1}(x_{i} - μ)^{2}) / (ν_{prior} + n)

which after some rearrangement gives the formula for the posteriorScale below.
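Substituting the example's figures makes the rearrangement concrete: with ν_{prior} = 20, s_{prior} = 25, n = 100, and Σ^{n}_{i=1}(x_{i} - μ)^{2} = (n - 1) × 55 = 5445 (see the sample data below),

s_{posterior} = (20 × 25 + 5445) / (20 + 100) = 5945 / 120 ≈ 49.54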

Machine-specific data consist of 100 balls which were accurately measured
and show the expected mean of 3000 μm and a sample variance of 55 (calculated
for a sample mean defined to be 3000 exactly). From these data, the prior
parameterization, and noting that the term Σ^{n}_{i=1}(x_{i} - μ)^{2} equals the sample
variance multiplied by n - 1, it follows that the posterior distribution
of the variance parameter is a scaled-inverse-chi-squared distribution
with degrees-of-freedom (ν_{posterior}) = 120 and scale (s_{posterior}) = 49.54.

```cpp
int ballsSampleSize = 100;
cout << "balls sample size: " << ballsSampleSize << endl;
double ballsSampleVariance = 55.0;
cout << "balls sample variance: " << ballsSampleVariance << endl;

double posteriorDF = priorDF + ballsSampleSize;
cout << "prior degrees-of-freedom: " << priorDF << endl;
cout << "posterior degrees-of-freedom: " << posteriorDF << endl;

double posteriorScale =
  (priorDF * priorScale + (ballsSampleVariance * (ballsSampleSize - 1))) / posteriorDF;
cout << "prior scale: " << priorScale << endl;
cout << "posterior scale: " << posteriorScale << endl;
```

An interesting feature here is that one only needs to know summary statistics of the sample to parameterize the posterior distribution: the 100 individual ball measurements are irrelevant; knowledge of the sample variance and the number of measurements is sufficient.

That produces this output:

```
balls sample size: 100
balls sample variance: 55
prior degrees-of-freedom: 20
posterior degrees-of-freedom: 120
prior scale: 25
posterior scale: 49.5
```

To compare the generic machinery performance with our suspect machine, we calculate again the same quantiles and probabilities as above, and find a distribution clearly shifted to greater values (see figure).

```cpp
inverse_chi_squared posterior(posteriorDF, posteriorScale);

cout << "Posterior distribution:" << endl << endl;
cout << "  2.5% quantile: " << boost::math::quantile(posterior, 0.025) << endl;
cout << "  50% quantile: " << boost::math::quantile(posterior, 0.5) << endl;
cout << "  97.5% quantile: " << boost::math::quantile(posterior, 0.975) << endl << endl;

cout << "  probability variance <= 15: " << boost::math::cdf(posterior, 15.0) << endl;
cout << "  probability variance <= 25: " << boost::math::cdf(posterior, 25.0) << endl;
cout << "  probability variance > 50:  " << boost::math::cdf(boost::math::complement(posterior, 50.0)) << endl;
```

This produces this output:

```
Posterior distribution:

  2.5% quantile: 39.1
  50% quantile: 49.8
  97.5% quantile: 64.9

  probability variance <= 15: 2.97e-031
  probability variance <= 25: 8.85e-010
  probability variance > 50:  0.489
```

Indeed, the probability that the machine works at a low variance (<= 15) is almost zero, and even the probability of working at average or better precision is negligibly small (less than one-millionth of a permille). On the other hand, with a probability of nearly one-half (49%), the machine operates in the extreme high-variance range of > 50 that is characteristic of poorly performing machines.

Based on this information, the machine is taken out of operation and serviced.

In summary, the Bayesian analysis allowed us to make exact probabilistic statements about a parameter of interest, and hence provided results with a straightforward interpretation.

A full sample output is:

```
Inverse_chi_squared_distribution Bayes example:

Prior distribution:

  2.5% quantile: 14.6
  50% quantile: 25.9
  97.5% quantile: 52.1

  probability variance <= 15: 0.031
  probability variance <= 25: 0.458
  probability variance > 50:  0.0318

balls sample size: 100
balls sample variance: 55
prior degrees-of-freedom: 20
posterior degrees-of-freedom: 120
prior scale: 25
posterior scale: 49.5

Posterior distribution:

  2.5% quantile: 39.1
  50% quantile: 49.8
  97.5% quantile: 64.9

  probability variance <= 15: 2.97e-031
  probability variance <= 25: 8.85e-010
  probability variance > 50:  0.489
```

(See also the reference documentation for the Inverse chi squared Distribution.)

See the full C++ source of this example at ../../../example/inverse_chi_squared_bayes_eg.cpp