The squared value of the correlation coefficient r is called the coefficient of determination, denoted r^{2}. It always has a value between 0 and 1. By squaring the correlation coefficient we retain information about the strength of the relationship, but we lose information about its direction.

**What does the coefficient of determination tell us?**

The coefficient of determination tells us the proportion (or percentage) of the total variability of the dependent variable y that is accounted for, or explained, by the independent variable x. If the value is 1, the values of y are completely explained by x: there is a perfect linear association between x and y.

A value closer to 0 indicates that only a small proportion of the variation in y is explained by x. A value closer to 1, on the other hand, indicates that x is a good predictor of y.
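As a quick illustration, here is a minimal sketch (using NumPy and made-up data) that computes r and r^{2} for a nearly linear data set; the specific arrays are hypothetical:

```python
import numpy as np

# Hypothetical data: y grows roughly linearly with x.
x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

r = np.corrcoef(x, y)[0, 1]   # correlation coefficient
r_squared = r ** 2            # coefficient of determination

# r_squared is close to 1: x explains almost all of the variation in y.
print(r_squared)
```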

**Can Coefficient of Determination be Negative? Why or Why Not?**

The coefficient of determination is defined as the square of the correlation coefficient r. Since the square of a real number can never be negative, the coefficient of determination can never be negative. Note that it can be equal to 0, but it cannot take any value lower than that.

On the other hand, the coefficient of correlation r can be either positive or negative depending on the relationship between the two variables. If an increase in the value of one variable leads to an increase in the other, then the correlation is positive. If an increase in the value of one variable leads to a decrease in the other, then the correlation is negative.

**If the correlation between two variables is 0.496, what is the coefficient of determination?**

As explained earlier, we square the correlation coefficient to obtain the coefficient of determination.

Coefficient of Determination = 0.496^{2} ≈ 0.246
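This is simple arithmetic, and a one-line check in Python confirms the rounding:

```python
r = 0.496
print(round(r ** 2, 3))  # 0.246
```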

**If the coefficient of determination is a positive value, then what can we say about the regression equation?**

If the coefficient of determination is a positive value, the regression equation can have either a positive or a negative slope. In short, we cannot infer the direction of the relationship from the coefficient of determination alone. If the variables are positively correlated, the regression equation will have a positive slope; if they are negatively correlated, it will have a negative slope.

The reason we cannot draw any conclusion about the slope of the line in this case is *that the coefficient of determination is always non-negative*, so its sign carries no extra information about the regression line.

**What is the coefficient of determination in linear regression?**

In linear regression, the coefficient of determination is the ratio of SSR (Regression Sum of Squares) to SST (Total Sum of Squares): r^{2} = SSR/SST. Since SST = SSR + SSE, the SSE (Error Sum of Squares) can be calculated as SST − SSR.
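The decomposition can be sketched with a small least-squares fit; the data below are hypothetical, and `np.polyfit` is used to fit the line:

```python
import numpy as np

# Hypothetical data and an ordinary least-squares line fit to it.
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([2.0, 4.1, 5.9, 8.3, 9.7])

slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # regression sum of squares
sse = np.sum((y - y_hat) ** 2)         # error sum of squares

r_squared = ssr / sst
# SST = SSR + SSE (up to floating-point rounding), and
# r_squared equals the squared correlation coefficient.
print(r_squared)
```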

**In a regression analysis if SSE = 200 and SSR = 300, then the coefficient of determination is equal to?**

Since SSE (Error Sum of Squares) is equal to 200 and SSR (Regression Sum of Squares) is equal to 300, we can find SST (Total Sum of Squares) by adding them.

SST = SSR + SSE = 300 + 200 = 500.

Coefficient of Determination = SSR/SST = 300/500 = 0.6

**In a regression analysis if SST = 4500 and SSE = 1575, then the coefficient of determination is equal to?**

Since SSE (Error Sum of Squares) is equal to 1575 and SST (Total Sum of Squares) is equal to 4500, we can find SSR (Regression Sum of Squares) by subtracting SSE from SST.

SSR = SST – SSE = 4500 – 1575 = 2925.

Coefficient of Determination = SSR/SST = 2925/4500 = 0.65
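Both worked examples above can be verified with a few lines of Python arithmetic:

```python
# Example 1: SSE = 200, SSR = 300
sst = 300 + 200          # SST = SSR + SSE = 500
r2_first = 300 / sst     # SSR / SST = 0.6

# Example 2: SST = 4500, SSE = 1575
ssr = 4500 - 1575        # SSR = SST - SSE = 2925
r2_second = ssr / 4500   # SSR / SST = 0.65

print(r2_first, r2_second)
```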

**What does a coefficient of determination equal to zero indicate?**

If the coefficient of determination is equal to zero, then the correlation coefficient of the two variables is also zero. This means that there is no linear relationship between the two variables.

It is important to note that even though there may be no linear relationship, the two variables can have a non-linear dependence on each other. For example, it is possible that one variable is equal to the square of the other variable.
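For instance, if y = x^{2} and the x values are symmetric about zero, the correlation coefficient (and hence the coefficient of determination) comes out as zero even though y is completely determined by x. A small sketch with made-up values:

```python
import numpy as np

# x is symmetric about zero; y depends perfectly on x, but not linearly.
x = np.array([-3, -2, -1, 0, 1, 2, 3], dtype=float)
y = x ** 2

r = np.corrcoef(x, y)[0, 1]
# r is (numerically) zero: no linear relationship, despite perfect dependence.
print(r)
```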