For a standard normal random variable *Z*, the probability that *Z* exceeds some cutoff *z* is given by

$$ Q(z) = \frac{1}{\sqrt{2\pi}} \int_z^\infty \exp(-t^2/2) \, dt $$

If you wanted to compute this probability numerically, you could obviously evaluate its defining integral numerically. But as is often the case in numerical analysis, the most obvious approach is not the best approach. The range of integration is unbounded and it varies with the argument.
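To make the comparison concrete, here is a minimal sketch of the obvious approach: handing the defining integral, infinite upper limit and all, to an adaptive quadrature routine. (The function name `tail_prob` is mine, not from the original post.)

```python
from numpy import exp, sqrt, pi, inf
from scipy import integrate

def tail_prob(z):
    # Integrate the standard normal density from z to infinity.
    # The range of integration is unbounded and moves with z.
    integrand = lambda t: exp(-t**2 / 2) / sqrt(2 * pi)
    q, _ = integrate.quad(integrand, z, inf)
    return q

print(tail_prob(2))
```

This works, but the adaptive routine has to do extra work to cope with the infinite, argument-dependent range, which is exactly what Craig's representation avoids.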

J. W. Craig [1] came up with a better integral representation, better from the perspective of numerical integration. The integration is always over the same finite interval, with the argument appearing inside the integrand. The integrand is smooth and bounded, well suited to numerical integration.

For positive *z*, Craig’s integral representation is

$$ Q(z) = \frac{1}{\pi} \int_0^{\pi/2} \exp\left( -\frac{z^2}{2 \sin^2 \theta} \right) \, d\theta $$

## Illustration

To show that Craig’s integral is easy to evaluate numerically, we’ll compute it using Gaussian quadrature with only 10 integration points.

```python
from numpy import sin, exp, pi
from scipy import integrate
from scipy.stats import norm

for x in [0.5, 2, 5]:
    q, _ = integrate.fixed_quad(
        lambda t: exp(-x**2 / (2*sin(t)**2))/pi,
        0.0, pi/2, n=10)
    print(q, norm.sf(x))
```

(SciPy uses `sf` (“survival function”) for the CCDF. More on that here.)

The code above produces the following.

```
0.30858301 0.30853754
0.02274966 0.02275013
2.86638437e-07 2.86651572e-07
```

So with 10 integration points, we get four correct figures. And the accuracy seems to be consistent for small, medium, and large values of *x*. (Five standard deviations is pretty far out in the tail of a normal distribution, as evidenced by the small value of the integral.)
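As a quick check, not in the original post, we can sketch how the quadrature error shrinks as the number of integration points grows, using *x* = 2 as a test case and `norm.sf` as the reference value:

```python
from numpy import sin, exp, pi
from scipy import integrate
from scipy.stats import norm

# Absolute error of Gaussian quadrature on Craig's integral at x = 2,
# for a few choices of the number of integration points n.
x = 2.0
exact = norm.sf(x)
errors = {}
for n in [5, 10, 20]:
    q, _ = integrate.fixed_quad(
        lambda t: exp(-x**2 / (2*sin(t)**2))/pi,
        0.0, pi/2, n=n)
    errors[n] = abs(q - exact)
    print(n, errors[n])
```

Because the integrand is smooth and bounded on a fixed finite interval, the error drops rapidly with *n*, which is what makes Gaussian quadrature such a good fit here.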

## Related posts

- Golden integration
- Numeric integral with a singularity
- Orthogonal polynomials and Gaussian quadrature

[1] J. W. Craig, A new, simple and exact result for calculating the probability of error for two-dimensional signal constellations, in IEEE MILCOM’91 Conf. Rec., Boston, MA (1991) pp. 25.5.1–25.5.5.

The post A better integral for the normal distribution first appeared on John D. Cook.