Discrete Joint Distribution


Joint and marginal distributions of a pair of discrete random variables X, Y having nonzero mutual information I(X; Y) [1]

A discrete joint distribution describes the probability of two or more discrete random variables taking particular values simultaneously.

A joint distribution can describe variables that are independent or dependent. Two random variables are independent when the outcome of one does not influence the other: for example, the probability of getting a head when flipping a coin is not affected by the probability of rolling a 5 on a die. The variables can also be dependent, such as when drawing two cards from a deck without replacement, where the distribution of the second card depends on which card was drawn first.
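Independence can be checked directly from a joint PMF: the joint probability of every pair must equal the product of the marginals. A minimal Python sketch for the coin-and-die case (the dictionary names are illustrative, not a standard API):

```python
from fractions import Fraction

# Joint PMF of an independent fair coin flip (H/T) and fair die roll (1-6):
# every pair has probability (1/2) * (1/6) = 1/12.
coin = {"H": Fraction(1, 2), "T": Fraction(1, 2)}
die = {k: Fraction(1, 6) for k in range(1, 7)}
joint = {(c, d): coin[c] * die[d] for c in coin for d in die}

# Marginals are recovered by summing the joint PMF over the other variable.
p_coin = {c: sum(p for (ci, _), p in joint.items() if ci == c) for c in coin}
p_die = {d: sum(p for (_, di), p in joint.items() if di == d) for d in die}

# Independence holds exactly when the joint factors into the marginals.
independent = all(joint[c, d] == p_coin[c] * p_die[d] for c in coin for d in die)
print(independent)  # True
```

Using exact fractions rather than floats avoids rounding issues when comparing probabilities.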

Discrete joint distribution example

The joint PMF of two discrete random variables X and Y is defined as P(X = x and Y = y), where x and y are specific values of X and Y, respectively. It gives the probability of X and Y taking specific values simultaneously. As an example, suppose we have two fair dice, a red die and a blue die, and we want to find their joint PMF and use it to compute the probability of each possible sum.

Let X be the random variable for the value of the red die and Y be the random variable for the value of the blue die. Since each die has six sides with values 1 through 6, the possible values of X and Y are {1, 2, 3, 4, 5, 6}. To find the joint PMF, we need the probability of each pair of values (x, y). We can represent the joint PMF as a table, where the rows correspond to the values of X and the columns to the values of Y; each entry is P(X = x, Y = y), which equals 1/36 for every pair because the dice are fair and independent.

This table shows one way to write the joint probability mass function of X and Y; it enumerates the probability of every pair of values:

        Y=1   Y=2   Y=3   Y=4   Y=5   Y=6
X=1    1/36  1/36  1/36  1/36  1/36  1/36
X=2    1/36  1/36  1/36  1/36  1/36  1/36
X=3    1/36  1/36  1/36  1/36  1/36  1/36
X=4    1/36  1/36  1/36  1/36  1/36  1/36
X=5    1/36  1/36  1/36  1/36  1/36  1/36
X=6    1/36  1/36  1/36  1/36  1/36  1/36

For example, to find the probability of rolling a sum of 7, we add the probabilities of the outcomes (1,6), (2,5), (3,4), (4,3), (5,2), and (6,1), each of which is 1/36, giving 6/36 = 1/6. Similarly, we can find the probability of any other sum by adding the probabilities of the corresponding outcomes.
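The sum-of-dice calculation above can be sketched in Python: build the joint PMF over all 36 ordered pairs, then sum it over the pairs that produce each total (variable names here are illustrative):

```python
from fractions import Fraction
from collections import defaultdict

# Joint PMF of two fair dice: each ordered pair (x, y) has probability 1/36.
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# PMF of the sum S = X + Y: add the joint probabilities of all pairs
# whose values add up to each total.
sum_pmf = defaultdict(Fraction)
for (x, y), p in joint.items():
    sum_pmf[x + y] += p

print(sum_pmf[7])  # 1/6, from the six pairs (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)
```

The same loop recovers the full distribution of the sum, e.g. `sum_pmf[2]` is 1/36 and `sum_pmf[12]` is 1/36.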


Discrete joint distributions are important in fields such as statistics, data analysis, and machine learning. They are used to analyze relationships between variables and can be used to identify correlations and dependencies. Examples of applications of a discrete joint distribution include analyzing survey data to understand the relationships between different variables or analyzing data from experiments to understand how different factors affect the outcome.

For example, a discrete joint probability distribution can be used to estimate the probability that a patient has a specific disease given a certain test result, or to examine the relationship between the number of goods sold on a given day and the number of warranty claims.
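The disease-testing application can be sketched as follows: given a joint table over disease status and test result, the conditional probability P(infected | positive) is the joint probability divided by the marginal. The numbers below are hypothetical, chosen only for illustration:

```python
from fractions import Fraction

# Hypothetical joint distribution over (disease status, test result).
# These probabilities are illustrative, not from real data; they sum to 1.
joint = {
    ("infected", "positive"): Fraction(9, 1000),
    ("infected", "negative"): Fraction(1, 1000),
    ("healthy", "positive"): Fraction(99, 2000),    # false positives
    ("healthy", "negative"): Fraction(1881, 2000),
}

# Marginal P(positive): sum the joint PMF over disease status.
p_positive = sum(p for (_, result), p in joint.items() if result == "positive")

# Conditional probability: P(infected | positive) = P(infected, positive) / P(positive).
p_infected_given_positive = joint["infected", "positive"] / p_positive
print(p_infected_given_positive)  # 2/13, roughly 0.154
```

Even with an accurate test, the conditional probability is modest here because the disease is rare in this hypothetical population, which is exactly the kind of dependency a joint table makes explicit.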


[1] CaitlinJo, CC BY 3.0 https://creativecommons.org/licenses/by/3.0, via Wikimedia Commons
