
Therefore, the F test statistic is F_o = s_M²/s_m² = 0.36/0.25 = 1.44.

The P-value is P(F > F_o) = P(F > 1.44) = UTPF(ν_N, ν_D, F_o) = UTPF(20,30,1.44) = 0.1788…

Since 0.1788… > 0.05, i.e., P-value > α, we cannot reject the null hypothesis H_o: σ_1² = σ_2².
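
Outside the calculator, the same computation can be cross-checked with a short sketch; Python with SciPy is assumed here and is not part of the HP 50g environment. The upper-tail probability UTPF(ν_N, ν_D, F_o) corresponds to the survival function of the F distribution:

# Cross-check of the F test above: UTPF(nu_N, nu_D, F) is the upper-tail
# probability of the F distribution, i.e. scipy.stats.f.sf(F, nu_N, nu_D).
from scipy.stats import f

s_M2, s_m2 = 0.36, 0.25            # larger and smaller sample variances
nu_N, nu_D = 20, 30                # numerator and denominator degrees of freedom
F_o = s_M2 / s_m2                  # observed F statistic, 1.44

p_value = f.sf(F_o, nu_N, nu_D)    # upper-tail probability, approx. 0.1788
print("cannot reject H_o" if p_value > 0.05 else "reject H_o")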

Additional notes on linear regression

In this section we elaborate on the ideas of linear regression presented earlier in
the chapter and describe a procedure for hypothesis testing of the regression
parameters.

The method of least squares

Let x = independent, non-random variable, and Y = dependent, random
variable. The regression curve of Y on x is defined as the relationship between
x and the mean of the corresponding distribution of the Y’s.
Assume that the regression curve of Y on x is linear, i.e., the mean of the
distribution of the Y’s is given by Α + Β⋅x. Y differs from this mean (Α + Β⋅x)
by a value ε, thus Y = Α + Β⋅x + ε, where ε is a random variable.

To visually check whether the data follows a linear trend, draw a scattergram or
scatter plot.
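
For example, a minimal sketch (assuming Python with NumPy and Matplotlib, which are not part of the manual's calculator examples) that simulates data from the model Y = Α + Β⋅x + ε and draws the corresponding scatter plot:

# Simulate data from Y = A + B*x + eps (A = 2.0, B = 0.5 are arbitrary choices)
# and draw a scattergram to check visually for a linear trend.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 30)              # independent, non-random variable
eps = rng.normal(0.0, 1.0, size=x.size)     # random error term
y = 2.0 + 0.5 * x + eps                     # dependent, random variable

plt.scatter(x, y)
plt.xlabel("x")
plt.ylabel("y")
plt.title("Scattergram: checking for a linear trend")
plt.show()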

Suppose that we have n paired observations (x_i, y_i); we predict y by means of
ŷ = a + b⋅x, where a and b are constants.

Define the prediction error as e_i = y_i − ŷ_i = y_i − (a + b⋅x_i).

The method of least squares requires us to choose a, b so as to minimize the
sum of squared errors (SSE)

SSE = Σ e_i² = Σ [y_i − (a + b⋅x_i)]²,

where the sums run over i = 1, 2, …, n. The conditions for a minimum are

∂(SSE)/∂a = 0   and   ∂(SSE)/∂b = 0.
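
As an illustration only, a minimal Python/NumPy sketch (NumPy is an assumption here, not part of the calculator environment) that solves these conditions in their familiar closed form, slope b = Sxy/Sxx and intercept a = ȳ − b⋅x̄, and evaluates the residuals and SSE defined above:

# Least-squares fit of y = a + b*x via the closed-form solution of the
# minimization conditions above, plus the residuals e_i and the SSE.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # illustrative data only
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

xbar, ybar = x.mean(), y.mean()
b = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)   # slope
a = ybar - b * xbar                                             # intercept

e = y - (a + b * x)      # prediction errors e_i = y_i - (a + b*x_i)
SSE = np.sum(e ** 2)     # the quantity minimized by the choice of a and b

print(f"a = {a:.4f}  b = {b:.4f}  SSE = {SSE:.4f}")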
