
Statistics - Residual Sum of Squares



In statistics, the residual sum of squares (RSS), also known as the sum of squared residuals (SSR) or the sum of squared errors of prediction (SSE), is the sum of the squares of residuals (deviations of predicted from actual empirical values of data).

The Residual Sum of Squares (RSS) is defined by the following formula:

Formula

${RSS = \sum_{i=1}^{n}(\epsilon_i)^2 = \sum_{i=1}^{n}(y_i - (\alpha + \beta x_i))^2}$

Where −

    ${X, Y}$ = the sets of observed values.

    ${\alpha, \beta}$ = constants (the intercept and slope coefficients).

    ${n}$ = the number of values in each set.
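
As a quick illustration of the formula, here is a minimal Python sketch that computes RSS directly from the definition above. The function name residual_sum_of_squares and its parameters are hypothetical, chosen for this example rather than taken from any library.

    def residual_sum_of_squares(x, y, alpha, beta):
        # RSS = sum over i of (y_i - (alpha + beta * x_i))^2
        return sum((yi - (alpha + beta * xi)) ** 2 for xi, yi in zip(x, y))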

Example

Problem Statement:

Consider two population groups, where X = 1, 2, 3, 4 and Y = 4, 5, 6, 7, with constant values ${\alpha}$ = 1 and ${\beta}$ = 2. Find the Residual Sum of Squares (RSS) for the two population groups.

Solution:

Given,

${X = 1,2,3,4 \\[7pt] Y = 4,5,6,7 \\[7pt] \alpha = 1 \\[7pt] \beta = 2}$

Substituting the given values into the Residual Sum of Squares formula:

${RSS = \sum_{i=1}^{n}(\epsilon_i)^2 = \sum_{i=1}^{n}(y_i - (\alpha + \beta x_i))^2 \\[7pt] = (4-(1+(2 \times 1)))^2 + (5-(1+(2 \times 2)))^2 + (6-(1+(2 \times 3)))^2 + (7-(1+(2 \times 4)))^2 \\[7pt] = (1)^2 + (0)^2 + (-1)^2 + (-2)^2 \\[7pt] = 6}$
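
To verify the worked example, the same numbers can be plugged into the hypothetical helper sketched earlier; it reproduces the residuals 1, 0, -1, -2 and the total of 6.

    X = [1, 2, 3, 4]
    Y = [4, 5, 6, 7]

    # Residuals y_i - (alpha + beta * x_i) are 1, 0, -1, -2
    rss = residual_sum_of_squares(X, Y, alpha=1, beta=2)
    print(rss)  # 6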