Lab 8: Model fitting

Not graded, just practice

Author

Katie Schuler

Published

October 24, 2024

1 Model fitting

  1. True or false, gradient descent and ordinary least squares are both iterative optimization algorithms.
  2. What cost function have we been using to perform our gradient descent?
  3. True or false, when performing gradient descent on the model given by the equation \(y = w_0 + w_1x_1 + w_2x_2\), we might arrive at a local minimum and miss the global one.
  4. Which of the following would work to estimate the free parameters of a nonlinear model?
  5. True or false, in gradient descent, we search through all possible parameters in the parameter space.
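To make these questions concrete, here is a minimal sketch of gradient descent minimizing a sum-of-squared-error cost for a simple linear model. The data, learning rate, and step count are all hypothetical choices for illustration, not part of the lab:

```r
# Minimal gradient descent sketch for y = w0 + w1*x, minimizing SSE.
# The simulated data and learning rate below are illustrative assumptions.
set.seed(1)
x <- 1:20
y <- 3 + 0.5 * x + rnorm(20)

w <- c(0, 0)        # initialize parameters (w0, w1) at zero
alpha <- 0.0001     # learning rate (step size)

for (step in 1:20000) {
  residuals <- y - (w[1] + w[2] * x)
  # Gradient of SSE = sum((y - (w0 + w1*x))^2) with respect to (w0, w1)
  grad <- c(-2 * sum(residuals), -2 * sum(residuals * x))
  w <- w - alpha * grad   # step downhill along the gradient
}

w  # approaches the least-squares estimates of (w0, w1)
```

Note that each pass updates the parameters by a small step in the downhill direction of the cost; it never enumerates the whole parameter space.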

2 Model fitting in R

Questions 6-9 refer to the code and output below, which perform gradient descent with optimg:

optimg(data = data, par = c(0,0), fn=SSE, method = "STGD")
$par
[1] 3.37930046 0.06683237

$value
[1] 959.4293

$counts
[1] 6

$convergence
[1] 0
  1. How many steps did the gradient descent algorithm take?

  2. What was the sum of squared error of the optimal parameters?

  3. What coefficients does the algorithm converge on?

  4. What parameters were used to initialize the algorithm?
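For reference, the call above requires a cost function to minimize. A sketch of how that SSE function might be defined is below; the definition and the contents of `data` are assumptions, since only the `optimg()` call and its output appear above:

```r
# Sketch of the setup behind the optimg() call shown above.
# The SSE definition and the data frame are illustrative assumptions.
library(optimg)

SSE <- function(data, par) {
  # par[1] is the intercept (w0), par[2] is the slope (w1)
  predicted <- par[1] + par[2] * data$x
  sum((data$y - predicted)^2)
}

optimg(data = data, par = c(0, 0), fn = SSE, method = "STGD")
```

In the output, `$par` holds the converged coefficients, `$value` the SSE at those coefficients, `$counts` the number of steps, and `$convergence` equal to 0 indicates success.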

Questions 10-12 refer to the output below from lm():


Call:
lm(formula = y ~ x, data = data)

Coefficients:
(Intercept)            x  
    3.37822      0.06688  
  1. Use R notation to write the model specification.
Answer:
y ~ x  # this works (implicit intercept)

y ~ 1 + x # this also works (explicit intercept)
  2. Given the model is specified by the equation \(y = w_0 + w_1x_1\), what are the parameter estimates for \(w_0\) and \(w_1\)?

  3. True or false, for this model, optimg() with gradient descent would converge on the same parameter estimates?
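To check your answers, you can fit the same model with lm() and read off the coefficients. The simulated data below is hypothetical (the lab's actual data frame is not shown), chosen so the estimates land near the output above:

```r
# Sketch: fitting y = w0 + w1*x with lm() on simulated data.
# The data frame here is a hypothetical stand-in for the lab's data.
set.seed(1)
data <- data.frame(x = 1:50)
data$y <- 3.4 + 0.07 * data$x + rnorm(50)

fit <- lm(y ~ x, data = data)
coef(fit)  # "(Intercept)" estimates w0; the "x" coefficient estimates w1
```

Because this model is linear, its SSE surface has a single (global) minimum, which is why gradient descent and ordinary least squares agree here up to small numerical differences.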