The symmetric generalized normal distribution, also known as the exponential power distribution or the generalized error distribution, is a parametric family of symmetric distributions. It includes all normal and Laplace distributions, and as limiting cases it includes all continuous uniform distributions on bounded intervals of the real line.
This family allows for tails that are either heavier than normal (when β < 2) or lighter than normal (when β > 2). It is a useful way to parametrize a continuum of symmetric, platykurtic densities spanning from the normal (β = 2) to the uniform density (β → ∞), and a continuum of symmetric, leptokurtic densities spanning from the Laplace (β = 1) to the normal density (β = 2). The shape parameter β also controls the peakedness in addition to the tails.
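As a quick numerical illustration (assuming SciPy's scipy.stats.gennorm, which uses this same shape-and-scale parametrization), the Laplace and normal special cases can be checked directly:

```python
# Check the special cases of the symmetric generalized normal family
# using SciPy's gennorm (shape beta, scale alpha).
import numpy as np
from scipy.stats import gennorm, laplace, norm

x = np.linspace(-3, 3, 7)

# beta = 1 is the Laplace distribution with scale alpha = 1
assert np.allclose(gennorm.pdf(x, 1), laplace.pdf(x))

# beta = 2 is the normal distribution with variance alpha^2 / 2,
# i.e. standard deviation 1/sqrt(2) for alpha = 1
assert np.allclose(gennorm.pdf(x, 2), norm.pdf(x, scale=1 / np.sqrt(2)))
```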
Parameter estimation
Parameter estimation via maximum likelihood and the method of moments has been studied.[3] The estimates do not have a closed form and must be obtained numerically. Estimators that do not require numerical calculation have also been proposed.[4]
The generalized normal log-likelihood function has infinitely many continuous derivatives (i.e. it belongs to the class C∞ of smooth functions) only if β is a positive, even integer. Otherwise, the function has ⌊β⌋ continuous derivatives. As a result, the standard results for consistency and asymptotic normality of maximum likelihood estimates of β only apply when β ≥ 2.
Maximum likelihood estimator
It is possible to fit the generalized normal distribution adopting an approximate maximum likelihood method. With μ initially set to the sample first moment m₁, β is estimated by means of a Newton–Raphson iterative procedure. Given a value for β, μ can be estimated by finding the minimum of:

∑_{i=1}^{N} |x_i − μ|^β

Finally α is evaluated as:

α = ( (β/N) ∑_{i=1}^{N} |x_i − μ|^β )^{1/β}

For β ≤ 1, the sample median is a more appropriate estimator of μ. Once μ is estimated, α and β can be estimated as described above.[7]
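A minimal sketch of such a numerical fit, assuming SciPy's scipy.stats.gennorm and its generic maximum-likelihood fitter (the true parameter values below are illustrative choices, not from the original text):

```python
# Numerical maximum-likelihood fit of the generalized normal
# distribution; gennorm.fit jointly optimizes shape, location, scale.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(0)
# Simulate from beta = 1.5, mu = 0, alpha = 2 (illustrative values)
data = gennorm.rvs(1.5, loc=0.0, scale=2.0, size=5000, random_state=rng)

beta_hat, mu_hat, alpha_hat = gennorm.fit(data)
# The estimates should land near the true values (1.5, 0.0, 2.0)
print(beta_hat, mu_hat, alpha_hat)
```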
Applications
The symmetric generalized normal distribution has been used in modeling when the concentration of values around the mean and the tail behavior are of particular interest.
If the tail behavior is the main interest, the Student's t family can be used, which approximates the normal distribution as the degrees of freedom grow to infinity. The t distribution, unlike this generalized normal distribution, obtains heavier-than-normal tails without acquiring a cusp at the origin. The generalized normal distribution finds uses in plasma physics under the name of the Langdon distribution, resulting from inverse bremsstrahlung.[10]
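As an illustrative check (assuming SciPy's t and norm distributions), the Student's t density does approach the standard normal as the degrees of freedom grow:

```python
# Maximum pointwise gap between the Student-t density and the
# standard normal density shrinks as the degrees of freedom grow.
import numpy as np
from scipy.stats import t, norm

x = np.linspace(-4, 4, 9)
gaps = [float(np.max(np.abs(t.pdf(x, df) - norm.pdf(x))))
        for df in (1, 10, 1000)]
print(gaps)  # strictly decreasing sequence of gaps
```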
Properties
Moments
Let X be a zero-mean generalized Gaussian random variable with shape β and scale parameter α. The moments of X exist and are finite for any k greater than −1. For any non-negative integer k, the plain central moments are[2]

E[X^k] = 0 for k odd,
E[X^k] = α^k Γ((k+1)/β) / Γ(1/β) for k even.
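The even central moments can be checked numerically, assuming SciPy's scipy.stats.gennorm (shape β, scale α; the parameter values are illustrative):

```python
# Compare gennorm's numerically computed moments against the
# closed form alpha^k * Gamma((k+1)/beta) / Gamma(1/beta).
import math
from scipy.stats import gennorm

beta, alpha = 1.5, 2.0
dist = gennorm(beta, scale=alpha)  # zero-mean by default (loc = 0)

for k in (2, 4, 6):
    closed = alpha**k * math.gamma((k + 1) / beta) / math.gamma(1 / beta)
    assert math.isclose(dist.moment(k), closed, rel_tol=1e-5)
```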
Connection to Stable Count Distribution
From the viewpoint of the stable count distribution, β can be regarded as Lévy's stability parameter. This distribution can be decomposed into an integral of kernel density where the kernel is either a Laplace distribution or a Gaussian distribution, with the mixing weight supplied by the stable count distribution.
The multivariate generalized normal distribution, i.e. the product of exponential power distributions with the same α and β parameters, is the only probability density that can be written in the form p(x) = g(‖x‖_β) and has independent marginals.[14] The result for the special case of the multivariate normal distribution is originally attributed to Maxwell.[15]
Asymmetric generalized normal distribution
The asymmetric generalized normal distribution is a family of continuous probability distributions in which the shape parameter can be used to introduce asymmetry or skewness.[16][17] When the shape parameter is zero, the normal distribution results. Positive values of the shape parameter yield left-skewed distributions bounded to the right, and negative values of the shape parameter yield right-skewed distributions bounded to the left. Only when the shape parameter is zero is the density function for this distribution positive over the whole real line: in this case the distribution is a normal distribution; otherwise the distributions are shifted and possibly reversed log-normal distributions.
Parameter estimation
Parameters can be estimated via maximum likelihood estimation or the method of moments. The parameter estimates do not have a closed form, so numerical calculations must be used to compute the estimates. Since the sample space (the set of real numbers where the density is non-zero) depends on the true value of the parameter, some standard results about the performance of parameter estimates will not automatically apply when working with this family.
Applications
The asymmetric generalized normal distribution can be used to model values that may be normally distributed, or that may be either right-skewed or left-skewed relative to the normal distribution. The skew normal distribution is another distribution that is useful for modeling deviations from normality in the direction of skewness. Other distributions used to model skewed data include the gamma, log-normal, and Weibull distributions, but these do not include the normal distributions as special cases.
Kullback-Leibler divergence between two PDFs
Kullback–Leibler divergence (KLD) is a measure of the divergence, or dissimilarity, between two probability density functions.[18]
Let g₁ = GGD(μ₁, α₁, β₁) and g₂ = GGD(μ₂, α₂, β₂) be two generalized Gaussian distributions, subject to the constraint μ₁ = μ₂ = 0.[19] Then this divergence is given by:

KLD(g₁ ‖ g₂) = ln( (β₁ α₂ Γ(1/β₂)) / (β₂ α₁ Γ(1/β₁)) ) + (α₁/α₂)^{β₂} Γ((β₂+1)/β₁) / Γ(1/β₁) − 1/β₁
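As a sanity check on the zero-mean closed form, the divergence can be compared against direct numerical integration (assuming SciPy's gennorm and quad; the parameter values are illustrative):

```python
# Compare the closed-form KL divergence between two zero-mean
# generalized Gaussian densities with a direct numerical integral.
import math
import numpy as np
from scipy.integrate import quad
from scipy.stats import gennorm

b1, a1 = 1.5, 1.0   # g1 = GGD(0, a1, b1)
b2, a2 = 2.0, 1.5   # g2 = GGD(0, a2, b2)

p1 = gennorm(b1, scale=a1).pdf
p2 = gennorm(b2, scale=a2).pdf

# Integrate p1 * log(p1/p2) over a range wide enough that the
# neglected tail mass is far below the comparison tolerance.
numeric, _ = quad(lambda x: p1(x) * np.log(p1(x) / p2(x)), -30, 30)

closed = (math.log((b1 * a2 * math.gamma(1 / b2))
                   / (b2 * a1 * math.gamma(1 / b1)))
          + (a1 / a2) ** b2 * math.gamma((b2 + 1) / b1) / math.gamma(1 / b1)
          - 1 / b1)

assert math.isclose(numeric, closed, rel_tol=1e-5)
```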
Other distributions related to the normal
The two generalized normal families described here, like the skew normal family, are parametric families that extend the normal distribution by adding a shape parameter. For example, the log-normal, folded normal, and inverse normal distributions are defined as transformations of a normally distributed value; but unlike the generalized normal and skew-normal families, these do not include the normal distributions as special cases.
By the central limit theorem, every distribution with finite variance is closely related to the normal distribution in the limit. The Student-t distribution, the Irwin–Hall distribution and the Bates distribution also extend the normal distribution, and include the normal distribution in the limit. So there is no strong reason to prefer the "generalized" normal distribution of type 1 over, e.g., a combination of Student-t and a normalized extended Irwin–Hall: that combination would include, for example, the triangular distribution (which cannot be modeled by the generalized Gaussian type 1).
A symmetric distribution which can model both tail behavior (long and short) and center behavior (flat, triangular or Gaussian) completely independently could be derived, e.g., by using X = IH/chi.
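A small simulation sketch of the Irwin–Hall limiting behavior mentioned above (the sample size and the choice n = 12 are illustrative; the sum of 12 standard uniforms, centered, is the classic unit-variance approximation to the normal):

```python
# Centered Irwin-Hall variable: sum of n uniforms minus n/2.
# For n = 12 this has mean 0 and variance n/12 = 1, and is
# approximately standard normal.
import numpy as np

rng = np.random.default_rng(0)
n = 12
z = rng.uniform(size=(100_000, n)).sum(axis=1) - n / 2

print(z.mean(), z.var())  # close to 0 and 1
```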