Slope and Y-intercept of the Regression Line | 회귀선의 기울기와 y절편
In simple linear regression, the regression (least-squares) line can be expressed by the following equation:
y = ax + b
where
- y = The variable that you want to predict (예측하고 싶은 값) | Dependent variable (종속 변수)
- x = The variable that you are using to predict (예측에 사용하는 값) | Independent variable (독립 변수)
- a = Slope (기울기)
- b = y-intercept (y 절편)
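For instance, with made-up values a = 2 and b = 1, an observation at x = 3 would be predicted as y = 2 * 3 + 1 = 7.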
Now, let's look at how to find the slope (a) and the y-intercept (b) in y = ax + b.
Recall,
r (correlation coefficient) = the average of the product of (x in standard units) and (y in standard units)
or, equivalently, in Python with the datascience library:
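Below is a minimal sketch of that computation, assuming a datascience Table named t and column labels x_label and y_label (these names are illustrative, not fixed by the text above):

```python
import numpy as np

def standard_units(arr):
    # Convert an array of numbers to standard units:
    # (value - mean) / standard deviation
    return (arr - np.mean(arr)) / np.std(arr)

def correlation(t, x_label, y_label):
    # r = average of the product of x in standard units
    #     and y in standard units
    x_su = standard_units(t.column(x_label))
    y_su = standard_units(t.column(y_label))
    return np.mean(x_su * y_su)
```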

Then, the slope of the best fit line is:
Slope (a) = r * (standard deviation of y) / (standard deviation of x)
That is,
a = r * SD_y / SD_x (*)
where
- r = correlation coefficient
- SD_y = standard deviation of y
- SD_x = standard deviation of x
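As a sketch in Python, the slope formula can reuse the correlation helper above (t, x_label, and y_label are still assumed names):

```python
def slope(t, x_label, y_label):
    # a = r * SD_y / SD_x
    r = correlation(t, x_label, y_label)
    return r * np.std(t.column(y_label)) / np.std(t.column(x_label))
```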
And, the y-intercept of the regression line is:
y-intercept (b) = (average of y) - (slope)(average of x)
That is,
b = y_average - a * x_average
where
- y_average = average of y
- x_average = average of x
- a = slope = r * SD_y / SD_x (*)
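Again as a sketch, building on the slope helper above:

```python
def intercept(t, x_label, y_label):
    # b = (average of y) - slope * (average of x)
    a = slope(t, x_label, y_label)
    return np.mean(t.column(y_label)) - a * np.mean(t.column(x_label))
```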
In summary, the slope and y-intercept of the regression line are:
- slope (a) = r * SD_y / SD_x
- y-intercept (b) = y_average - a * x_average
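To see the helpers end to end, here is a quick check on a small made-up table (the column names and values are illustrative only):

```python
from datascience import Table

t = Table().with_columns(
    'x', np.array([1, 2, 3, 4, 5]),
    'y', np.array([2, 4, 5, 4, 6]),
)

a = slope(t, 'x', 'y')
b = intercept(t, 'x', 'y')

# Predict y for a new x with the fitted line y = a*x + b
x_new = 6
print(a, b, a * x_new + b)
```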
