Linear Regression vs Gradient Descent Programming Worksheet

1. Linear regression vs. Gradient descent. The files DataLoanTrain.csv (training data) and DataLoanTest.csv (testing data) contain the history of approved and denied mortgage loans for one thousand and five hundred different clients, respectively. The bank makes its decision based on the annual income of the applicant (first column) and the number of years that the applicant has been working at the same company (second column); based on these data, the loan is approved or denied (third column, +1 → approved, -1 → denied). Note that the annual income data is divided by 100k.

(a) Find the w coefficients related to the decision boundary using the linear regression concept (pseudo-inverse) and DataLoanTrain.csv. Plot the training points (DataLoanTrain.csv), the testing points (DataLoanTest.csv), and the decision boundary given by w.
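A minimal sketch for part (a), assuming NumPy, pandas, and Matplotlib, and assuming the CSV files have no header row with columns in the order income, years, label (the file layout is an assumption):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Load the training data: annual income (/100k), years at company, label (+1/-1).
train = pd.read_csv("DataLoanTrain.csv", header=None).values
X = np.c_[np.ones(len(train)), train[:, :2]]   # prepend the bias feature x0 = 1
y = train[:, 2]

# Linear regression via the pseudo-inverse: w = pinv(X) y
w = np.linalg.pinv(X) @ y

# Decision boundary: w0 + w1*x1 + w2*x2 = 0  =>  x2 = -(w0 + w1*x1) / w2
x1 = np.linspace(train[:, 0].min(), train[:, 0].max(), 100)
x2 = -(w[0] + w[1] * x1) / w[2]

plt.scatter(train[y == +1, 0], train[y == +1, 1], marker="+", label="approved")
plt.scatter(train[y == -1, 0], train[y == -1, 1], marker="o", label="denied")
plt.plot(x1, x2, "k-", label="decision boundary")
plt.xlabel("annual income / 100k")
plt.ylabel("years at company")
plt.legend()
plt.show()
```

The same scatter-plus-line plot can be repeated with the testing points to cover the second half of the item.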

(b) Use the w coefficients to calculate the error Ein related to the training data (DataLoanTrain.csv), and Eout related to the testing data (DataLoanTest.csv).
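A sketch for part (b), continuing from the code under (a) (it reuses X, y, and w). Here Ein and Eout are taken as the fraction of misclassified points; if the course slides define them as a mean squared error instead, swap in that formula:

```python
def classification_error(X, y, w):
    """Fraction of points where sign(x . w) disagrees with the label y."""
    return np.mean(np.sign(X @ w) != y)

# Load the testing data with the same layout assumed for the training file.
test = pd.read_csv("DataLoanTest.csv", header=None).values
X_test = np.c_[np.ones(len(test)), test[:, :2]]
y_test = test[:, 2]

E_in = classification_error(X, y, w)
E_out = classification_error(X_test, y_test, w)
print(f"E_in = {E_in:.3f}   E_out = {E_out:.3f}")
```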

(c) Find the w coefficients related to the decision boundary using the gradient descent algorithm and DataLoanTrain.csv. Plot the training points (DataLoanTrain.csv), the testing points (DataLoanTest.csv), and the decision boundary given by w. For that, do the following (a code sketch follows this list):

- Calculate the autocorrelation matrix R = (1/N) X^T X (slide 32).
- Calculate the cross-correlation p = (1/N) X^T y (slide 32).
- Implement the iterative algorithm w(n+1) = w(n) - μ (R w(n) - p) (slide 35). Start w at zero, use 10000 iterations, and choose a μ value that guarantees convergence (you might try values around 0.01).
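A sketch of the gradient descent loop, continuing from the code under (a) (it reuses X and y, and compares against the pseudo-inverse w):

```python
N = len(X)
R = (X.T @ X) / N      # autocorrelation matrix R = (1/N) X^T X
p = (X.T @ y) / N      # cross-correlation p = (1/N) X^T y

mu = 0.01              # step size; must be small enough for convergence
n_iters = 10000

w_gd = np.zeros(X.shape[1])
for n in range(n_iters):
    w_gd = w_gd - mu * (R @ w_gd - p)

print("w (gradient descent):", w_gd)
print("w (pseudo-inverse):  ", w)
```

The decision boundary can then be plotted exactly as in part (a), using w_gd in place of w.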

(d) Plot a curve that shows how Ein evolves with the number of iterations.

(e) Plot a curve that shows how Eout evolves with the number of iterations.
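A combined sketch for parts (d) and (e), recording both errors inside the loop (it reuses R, p, mu, n_iters, classification_error, and the train/test arrays from the earlier sketches):

```python
E_in_hist, E_out_hist = [], []
w_gd = np.zeros(X.shape[1])
for n in range(n_iters):
    w_gd = w_gd - mu * (R @ w_gd - p)
    E_in_hist.append(classification_error(X, y, w_gd))
    E_out_hist.append(classification_error(X_test, y_test, w_gd))

plt.plot(E_in_hist, label="E_in (training)")
plt.plot(E_out_hist, label="E_out (testing)")
plt.xlabel("iteration")
plt.ylabel("classification error")
plt.legend()
plt.show()
```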

(f) Compare the final training and testing errors obtained with gradient descent against the training and testing errors obtained with linear regression. Compare the values of w obtained using linear regression and gradient descent. Discuss.

What happens with the gradient descent algorithm if μ is increased to 10? Explain.
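For the μ = 10 discussion, a quick check that may help (an illustrative sketch, not a required deliverable): for the update w(n+1) = w(n) - μ (R w(n) - p), the iteration converges only when 0 < μ < 2/λ_max, where λ_max is the largest eigenvalue of R.

```python
# Stability bound on the step size: the iteration matrix is (I - mu * R),
# so every |1 - mu * lambda_i| must be below 1, i.e. mu < 2 / lambda_max.
lambda_max = np.linalg.eigvalsh(R).max()
print("largest eigenvalue of R:", lambda_max)
print("stability bound on mu:  ", 2 / lambda_max)
# If mu = 10 exceeds this bound, w grows without bound (the algorithm diverges).
```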


A Jupyter notebook template is attached. Please fill in the blank spaces (#Your code goes here#).
