
In statistics, a **univariate distribution** represents the probability distribution of a single random variable, while a multivariate distribution represents the probability distribution of a random vector consisting of multiple random variables.

For example, the probability mass function (PMF) for the binomial distribution yields a univariate distribution because only one random variable appears in the formula:

P(X = x) = (n choose x) · p^x · (1 − p)^(n − x)

Where *n* is the number of trials, *p* is the probability of success on each trial, and *x* is the value of the random variable (the number of successes).
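The binomial PMF can be sketched in a few lines of Python, using only the standard library (`math.comb` supplies the binomial coefficient):

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) for a Binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Probability of exactly 3 successes in 10 trials with p = 0.5:
# comb(10, 3) * 0.5**10 = 120 / 1024 = 0.1171875
prob = binomial_pmf(3, n=10, p=0.5)
```

Summing `binomial_pmf(x, n, p)` over all *x* from 0 to *n* gives 1, as any valid PMF must.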

On the other hand, the probability density function (PDF) of the multivariate gamma distribution involves several variables, and is written in terms of Γ_p(α), the multivariate gamma function, an extension of the gamma function to multiple variables.

## Types of univariate distribution

Discrete univariate distributions differ from continuous univariate distributions in that a discrete distribution takes only a finite or countably infinite set of values, while a continuous distribution can take any value within a particular range or interval.

- In a discrete distribution, a *probability mass function (PMF)* represents the likelihood of each value. The PMF gives the probability of each possible value of the random variable, and these probabilities must sum to 1.
- In a continuous distribution, a *probability density function (PDF)* describes how likelihood is spread across values. The PDF value at a point is a density rather than a probability; the area under the PDF curve between two points equals the probability that the random variable takes a value in that range.
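This distinction can be checked numerically. A minimal sketch, using the standard normal PDF and only the Python standard library: the density is evaluated pointwise, but a probability only emerges as an area under the curve.

```python
from math import exp, pi, sqrt

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Density of a Normal(mu, sigma) random variable at x."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# The PDF value itself is not a probability; the probability that X lies
# in [-1, 1] is the area under the curve, approximated here by a Riemann sum.
step = 1e-4
area = sum(normal_pdf(-1 + i * step) * step for i in range(int(2 / step)))
# area is close to 0.6827, the familiar "68% within one standard deviation"
```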

Several univariate distributions exist, and some are more prevalent than others. These probability distributions are closely connected through transformations, many of which are invertible. The transformations include taking distributions of order statistics, forming mixtures of random variables, and truncating random variables.
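One such connection can be illustrated with a short seeded simulation: the sum of *k* independent exponential random variables follows an Erlang (gamma) distribution. A sketch using only the Python standard library, with illustrative parameters k = 3 and rate 1:

```python
import random
from statistics import mean, variance

random.seed(42)  # fixed seed for reproducibility

# Transformation example: if X1, ..., Xk are independent Exponential(lam)
# variables, their sum follows an Erlang(k, lam) distribution.
k, lam = 3, 1.0
samples = [sum(random.expovariate(lam) for _ in range(k))
           for _ in range(50_000)]

# Erlang(k, lam) has mean k/lam and variance k/lam**2, both 3.0 here;
# the empirical moments should land close to those values.
```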

**Common discrete univariate distributions:**

- Bernoulli distribution: describes a random variable with only two possible outcomes, success or failure.
- Beta binomial distribution: describes the number of successes in a sequence of Bernoulli trials whose success probability is itself drawn from a beta distribution.
- Binomial distribution: gives the number of successes in a sequence of independent yes/no experiments.
- Discrete uniform distribution: a probability distribution where all outcomes are equally likely.
- Gamma Poisson distribution: a Poisson distribution whose rate parameter is itself gamma-distributed.
- Hypergeometric distribution: gives the probability of *k* successes in *n* draws, without replacement, from a finite population of *N* objects containing a known number of successes.
- Logarithmic distribution: describes the number of occurrences of a rare event in a population.
- Poisson distribution: describes the number of times an event occurs in a fixed interval.
- Polya distribution: gives the number of successes in a sequence of independent experiments where the probability of success changes after each experiment.
- Power series distribution: has a density that can be described by a power series.
- Zeta / Zipf distribution: a distribution in which the probability of an item (for example, a word in a text) is inversely proportional to its rank.
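As a worked example of one entry above, the Poisson PMF is simple enough to compute directly. A sketch with an illustrative rate of 4 events per interval (the rate is an assumption for the example, not from the source):

```python
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    """P(X = x) when events occur independently at average rate lam."""
    return lam**x * exp(-lam) / factorial(x)

# Illustrative rate: 4 events per fixed interval; probability of exactly 2
# occurrences is 4**2 * e**-4 / 2! = 8 * e**-4, roughly 0.1465.
p_two = poisson_pmf(2, lam=4.0)
```

As with the binomial PMF, summing `poisson_pmf(x, lam)` over all non-negative integers converges to 1.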

**Continuous univariate distributions:**

- Benford distribution: a probability distribution that describes the frequency of occurrence of the leading digits in a set of numbers.
- Beta distribution: models the probability of a random variable falling between 0 and 1.
- Erlang distribution: models the waiting time between events in a Poisson process.
- Extreme value distribution: models the occurrence of extreme values, such as the largest or smallest values in a dataset.
- Gamma (generalized) distribution: models waiting times, such as the time until a given number of events occur in a Poisson process.
- Gamma normal distribution: a combination of a gamma distribution and a normal distribution.
- Gompertz distribution: models the time it takes for a random variable to reach a certain value, with an exponentially increasing failure rate.
- Hyperexponential distribution: a mixture of exponential distributions.
- Laplace distribution: a sharp-peaked distribution, symmetric around its mean, with equal probability on either side.
- Log logistic (Fisk) distribution: used to model data that has a sigmoidal shape.
- Log normal distribution: describes the distribution of random variables whose logarithms are normally distributed.
- Logistic distribution: a continuous probability distribution that has a sigmoidal shape.
- Lomax distribution: a right-skew distribution with a heavy tail.
- Muth distribution: a continuous probability distribution with a bathtub-shaped hazard rate function.
- Noncentral beta distribution: a noncentral generalization of the (central) beta distribution.
- Normal distribution: also known as a Gaussian distribution or bell curve.
- Pareto distribution: a right-skew distribution with a long tail, often used to model data that has a disproportionate number of extreme values.
- Rayleigh distribution: a right-skew distribution that models the magnitude of a vector whose orthogonal components are independently normally distributed.
- Triangular distribution: often used to model data that is known to lie within a certain range, with a most likely value.
- Wald distribution: also known as the inverse Gaussian distribution; models the first passage time of Brownian motion with positive drift.
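The log normal entry above describes a relationship that is easy to verify with a seeded simulation: exponentiating normal draws produces log-normal samples whose logarithms recover the original normal parameters. A sketch using only the standard library (`mu` and `sigma` are arbitrary illustrative values):

```python
import math
import random
from statistics import mean, stdev

random.seed(7)  # fixed seed for reproducibility

# If log(Y) is Normal(mu, sigma), then Y is log-normally distributed.
mu, sigma = 0.5, 0.25  # arbitrary illustrative parameters
ys = [math.exp(random.gauss(mu, sigma)) for _ in range(20_000)]

# Taking logs of the log-normal sample should recover a sample that
# looks Normal(mu, sigma): mean near 0.5, standard deviation near 0.25.
logs = [math.log(y) for y in ys]
```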
