Hello, I am doing some homework for my computational methods class and need some help. We are covering linear regression and least-squares fits. The problem asks me to derive by hand the least-squares fit for the model y = a1*x + e, i.e. to determine the slope of the best-fit straight line with a zero intercept (minimizing the sum of squared residuals gives a1 = sum(x*y)/sum(x^2)). I did this by hand and got a slope of 0.61436, but I was wondering how I would go about checking in MATLAB whether this is correct?
The x and y data are as follows:
x: 2 4 6 7 10 11 14 17 20
y: 4 5 6 5 8 8 6 9 12
I understand basic plotting, but I am unsure how to fit the line to the data in MATLAB. Thanks!
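For reference, here is a quick numerical check of the hand derivation, sketched in Python/NumPy (the logic translates directly to MATLAB, where the backslash operator `a1 = x(:)\y(:)` returns the same zero-intercept least-squares slope):

```python
import numpy as np

x = np.array([2, 4, 6, 7, 10, 11, 14, 17, 20], dtype=float)
y = np.array([4, 5, 6, 5, 8, 8, 6, 9, 12], dtype=float)

# Zero-intercept least squares: minimizing sum((y - a1*x)^2) over a1
# gives the closed-form solution a1 = sum(x*y) / sum(x^2).
a1 = np.sum(x * y) / np.sum(x * x)
print(a1)  # 744/1211, approximately 0.6144

# Cross-check with a general least-squares solver on the
# single-column design matrix [x] (no intercept column).
a1_lstsq, *_ = np.linalg.lstsq(x[:, None], y, rcond=None)
print(a1_lstsq[0])
```

Both computations agree with the hand-derived value of roughly 0.6144, so the by-hand result looks right to the stated precision.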