# Probability Mass Function


The probability mass function (p.m.f) of a discrete random variable is a function that gives the probability that the random variable takes a particular value.

### What is a Probability Mass Function?

Let X be a discrete random variable with probability mass function p(x). Then the probability that the random variable takes a value x is given as,

P(X=x) = p(x)

This means that the probability that X takes the value x can be found by substituting the value of x into the p.m.f.

The probability mass function has two properties:

1. It always takes non-negative values.
2. The sum of all the probabilities always adds up to 1.
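The two properties above can be checked programmatically. The sketch below is illustrative; the helper name `is_valid_pmf` is not a standard library function, and the p.m.f is represented as a dict mapping each value to its probability.

```python
def is_valid_pmf(pmf, tol=1e-9):
    # Property 1: every probability is non-negative
    if any(p < 0 for p in pmf.values()):
        return False
    # Property 2: the probabilities sum to 1 (up to floating-point tolerance)
    return abs(sum(pmf.values()) - 1) < tol

fair_die = {x: 1/6 for x in range(1, 7)}
print(is_valid_pmf(fair_die))            # True
print(is_valid_pmf({0: 0.5, 1: 0.6}))    # False: sums to 1.1
```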

### Probability Mass Function Examples:

1. Suppose that we toss two fair coins. The possible outcomes are {HH, HT, TH, TT}. Let X denote the number of heads obtained. Here X is our random variable. The probability mass function of X looks like,

| x | 0 | 1 | 2 |
|---|---|---|---|
| P(X=x) | 1/4 | 1/2 | 1/4 |

The probability that X=0, that is, that we get 0 heads, is one out of four because TT is the only favorable outcome. By similar reasoning, we obtain the rest of the p.m.f shown above.

We can see that the total probabilities add up to 1. This is one of the properties of a probability mass function.
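This p.m.f can also be derived by enumerating the sample space directly, as a quick sanity check (a minimal sketch using the standard library):

```python
from itertools import product

# Enumerate the sample space {HH, HT, TH, TT} for two fair coin tosses
outcomes = list(product("HT", repeat=2))       # 4 equally likely outcomes

# Build the p.m.f of X = number of heads by counting favorable outcomes
pmf = {}
for outcome in outcomes:
    heads = outcome.count("H")
    pmf[heads] = pmf.get(heads, 0) + 1 / len(outcomes)

print(pmf)                  # {2: 0.25, 1: 0.5, 0: 0.25}
print(sum(pmf.values()))    # 1.0
```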

2. Suppose that a die is thrown. Let X denote the number on the uppermost face of the die. The probability mass function looks like,

| x | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| P(X=x) | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |

Once again we can see that the total probabilities add up to 1.

### Probability Density Function vs Probability Mass Function:

While the probability mass function gives us the probabilities for a discrete random variable, the probability density function gives us the probabilities for a continuous variable.

For a discrete random variable, we can obtain the probability that X falls in a range of values by summing the probability mass function over that range. For a continuous random variable, on the other hand, we obtain the probability that X falls in a range of values by integrating the probability density function over that range.
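For the discrete case, the "sum over a range" is a plain sum of p.m.f values. A small sketch using the fair-die p.m.f from the example above, computing P(2 ≤ X ≤ 4):

```python
# Fair six-sided die: P(X = x) = 1/6 for x in 1..6
pmf = {x: 1/6 for x in range(1, 7)}

# P(2 <= X <= 4) is the sum of the p.m.f over the values 2, 3 and 4
prob = sum(pmf[x] for x in range(2, 5))
print(prob)   # approximately 0.5 (3 values, each with probability 1/6)
```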

### Binomial Probability Mass Function:

Suppose that a trial with only two outcomes – success and failure – is repeated ‘n’ times. Let ‘p’ denote the probability of success in a single trial, and let X denote the total number of successes in the ‘n’ trials. We can calculate the probabilities of X using the binomial probability mass function given as,

P(X=x) = {n \choose x} p^x(1-p)^{n-x}

Example: Suppose that a coin is tossed 10 times. Calculate the probability of getting 8 heads out of 10 tosses.

Solution: Here n=10, p = probability of getting heads = 0.5, and x=8.

So, P(X=8) = {10\choose 8} (0.5)^8(1-0.5)^{2} = 0.0439
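The same calculation can be reproduced with a few lines of Python; `math.comb` gives the binomial coefficient. The function name `binom_pmf` is illustrative:

```python
from math import comb

def binom_pmf(x, n, p):
    # Binomial p.m.f: C(n, x) * p^x * (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Worked example: n = 10 tosses, p = 0.5, x = 8 heads
print(round(binom_pmf(8, 10, 0.5), 4))   # 0.0439
```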

### Poisson Probability Mass Function:

If we want to calculate the probabilities of occurrence of rare events such as accidents, we can use the Poisson distribution. Let \lambda denote the mean of the random variable X. The probability mass function of the Poisson distribution is given as,

P(X=x) = e^{-\lambda} \frac{\lambda^x}{x!}

Example: Suppose that an average of 2 accidents occur per year on a particular road. Calculate the probability of 3 accidents occurring in a particular year using the Poisson distribution.

Solution: Here \lambda=2

So, P(X=3) = e^{-2} \frac{2^3}{3!} = 0.1804
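This, too, is straightforward to verify with the standard library; the function name `poisson_pmf` is illustrative:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # Poisson p.m.f: e^(-lambda) * lambda^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Worked example: lambda = 2 accidents per year, x = 3 accidents
print(round(poisson_pmf(3, 2), 4))   # 0.1804
```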

### Joint Probability Mass Function:

Let X and Y be two discrete random variables. The joint probability mass function of X and Y gives us the probabilities that X and Y take some values simultaneously.

Example: Suppose that two dice are thrown. There are 36 possible outcomes. Let X and Y denote the numbers of the dice.

Since each of the 36 outcomes has a one-in-thirty-six chance of occurring, the joint probability mass function is P(X=x, Y=y) = 1/36 for every pair (x, y).
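A short sketch builds this joint p.m.f by enumeration and checks that the probabilities sum to 1; it also shows how summing out Y recovers the marginal p.m.f of X:

```python
from itertools import product

# Joint p.m.f of (X, Y) for two fair dice: every pair is equally likely
joint_pmf = {(x, y): 1/36 for x, y in product(range(1, 7), repeat=2)}

print(len(joint_pmf))                            # 36 pairs
print(abs(sum(joint_pmf.values()) - 1) < 1e-9)   # True: probabilities sum to 1

# Summing the joint p.m.f over y gives the marginal p.m.f of X: P(X=x) = 1/6
marginal_x = {x: sum(joint_pmf[(x, y)] for y in range(1, 7)) for x in range(1, 7)}
```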
