Suppose you are given the joint probability distribution of two random variables X and Y. The marginal probability distribution is then the distribution of one of the variables alone, without regard to the other. The other variable has effectively been marginalized, or pushed to the margins, which is the reason for the name "marginal distribution".

**Marginal Probability Distribution For Discrete Random Variables**:

Let p(x,y) denote the joint probability mass function of two discrete random variables X and Y. The marginal probability mass functions are found by summing the joint pmf over the other variable:

p_X(x) = ∑ p(x,y), where the sum is taken over all possible values of Y.

p_Y(y) = ∑ p(x,y), where the sum is taken over all possible values of X.

**Example**: Suppose we are given the following joint probability distribution. Find the marginal probabilities of X and Y.

**Solution**: Summing across each row of the table (over all values of Y) gives the marginal distribution of X. Similarly, summing down each column (over all values of X) gives the marginal distribution of Y.
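As a sketch, the row and column summing can be done in a few lines of Python. The joint table below is invented for illustration (the original example's table is not reproduced here):

```python
# Hypothetical joint pmf: rows are values of X, columns are values of Y.
# These numbers are illustrative, not the table from the original example.
joint = [
    [0.10, 0.20, 0.10],  # X = x1
    [0.15, 0.25, 0.20],  # X = x2
]

# Marginal of X: sum across each row (over all values of Y).
p_x = [sum(row) for row in joint]

# Marginal of Y: sum down each column (over all values of X).
p_y = [sum(col) for col in zip(*joint)]

print(p_x)  # marginal distribution of X
print(p_y)  # marginal distribution of Y
```

Note that each marginal, like the joint distribution itself, sums to 1.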

**Marginal Probability Distribution For Continuous Random Variables**:

Let f(x,y) be the joint probability density function of the continuous random variables X and Y.

1. Then the marginal probability density function of X is given by

f_X(x) = ∫ f(x,y) dy, where the integral is taken over the range of values taken by Y.

2. Similarly, the marginal probability density function of Y is given by

f_Y(y) = ∫ f(x,y) dx, where the integral is taken over the range of values taken by X.
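The integration can be approximated numerically. Below is a minimal sketch using a midpoint Riemann sum for a hypothetical joint density f(x,y) = x + y on the unit square, for which the marginal works out analytically to f_X(x) = x + 1/2:

```python
# Hypothetical joint density on [0, 1] x [0, 1]; chosen only for illustration.
# Analytically: f_X(x) = ∫₀¹ (x + y) dy = x + 1/2.
def joint_density(x, y):
    return x + y

def marginal_x(x, n=10_000):
    """Approximate f_X(x) = ∫ f(x, y) dy with a midpoint Riemann sum over [0, 1]."""
    dy = 1.0 / n
    return sum(joint_density(x, (k + 0.5) * dy) for k in range(n)) * dy

print(marginal_x(0.3))  # ≈ 0.8, matching 0.3 + 1/2
```

In practice a library routine (e.g. `scipy.integrate.quad`) would replace the hand-rolled sum; the point here is only that marginalizing a density means integrating out the other variable.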

**Conditional vs Marginal Probability Distribution**:

In a marginal distribution we are interested in one variable alone; in a conditional distribution, we are interested in the probability distribution of one variable given that the value (or range of values) of the other variable is fixed.

For example, we may be interested in the conditional distribution of the heights of people whose weight lies between 55 kg and 65 kg. In the marginal distribution, by contrast, we would be interested in height alone, irrespective of the individuals' weights.
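The contrast can be made concrete with a small discrete sketch. The joint probabilities below are invented for illustration: marginalizing sums over the other variable, while conditioning fixes it and renormalizes:

```python
# Illustrative joint pmf over (height, weight) categories; values are made up.
joint = {
    ("short", "light"): 0.30, ("short", "heavy"): 0.10,
    ("tall",  "light"): 0.20, ("tall",  "heavy"): 0.40,
}

# Marginal of height: sum over all weight values.
p_height = {}
for (h, w), p in joint.items():
    p_height[h] = p_height.get(h, 0.0) + p

# Conditional distribution of height given weight = "heavy":
# keep only the "heavy" entries and renormalize by P(weight = "heavy").
p_heavy = sum(p for (h, w), p in joint.items() if w == "heavy")
p_height_given_heavy = {h: p / p_heavy for (h, w), p in joint.items() if w == "heavy"}

print(p_height)              # marginal: ignores weight entirely
print(p_height_given_heavy)  # conditional: weight is fixed, then renormalized
```

Here the two distributions differ: marginally "tall" has probability 0.6, but conditioned on a heavy weight it rises to 0.8, illustrating that fixing the other variable changes the distribution.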