And that's week two done and dusted.
This week we moved from one variable in our data to many via the fancily titled multivariate linear regression. This entails the additional step of feature scaling, which brings the variables into a similar range so that no one feature swamps the others in the regression algorithms (without it they get slower rather than failing outright, I believe).
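For the curious, here's roughly what feature scaling (mean normalisation, in this case) boils down to. I've sketched it in Python/NumPy rather than Octave, and the numbers are entirely made up:

```python
import numpy as np

# Made-up design matrix: each row is a property, columns are
# size (sq ft) and number of bedrooms -- wildly different scales.
X = np.array([[2100.0, 3.0],
              [1600.0, 2.0],
              [2400.0, 4.0],
              [1400.0, 2.0]])

# Mean normalisation: subtract each column's mean and divide by its
# standard deviation, so every feature lands in a similar range.
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_scaled = (X - mu) / sigma

print(X_scaled.mean(axis=0))  # roughly [0, 0]
print(X_scaled.std(axis=0))   # roughly [1, 1]
```

After scaling, the size column no longer dwarfs the bedroom column, which is what keeps gradient descent from taking tiny, lopsided steps.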
Edit: What this enables is fitting a straight line through a set of data so that other values can be predicted, e.g. determining property price from size. One variable is a weak indicator, of course, so the multivariate approach improves on this: determine property price from size, number of bedrooms, and so on. This can only get more complex (with, say, location), thus creating lots of local minima, so I guess we'll get to that at some point.
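Once the parameters are learned, prediction is just a dot product. A tiny sketch (Python/NumPy again; the theta values and the property are hypothetical, not anything from the homework):

```python
import numpy as np

# Hypothetical learned parameters:
# [intercept, price per sq ft, price per bedroom]
theta = np.array([50000.0, 120.0, 10000.0])

# One property: a leading 1 for the intercept term,
# then size (sq ft) and number of bedrooms.
x = np.array([1.0, 1600.0, 3.0])

# Multivariate linear regression prediction: h(x) = theta . x
price = theta @ x
print(price)  # 50000 + 120*1600 + 10000*3 = 272000.0
```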
To fit the regression models, we've been using the gradient descent method (a numerical method for finding local minima). This week the normal equation was introduced, although we've not done anything with it yet. It looks much easier to use (or at least less bother), but at O(n^3) it can be expensive on large data sets. (Edit: This was used in the optional work and is indeed w-a-y simpler to deal with. \o/ powerful machines.)
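To show what "w-a-y simpler" means in practice, here are both approaches side by side on a toy data set (Python/NumPy sketch; the data, learning rate, and iteration count are all made up). The normal equation is one line; gradient descent needs a loop and a sensibly chosen step size:

```python
import numpy as np

# Toy data: y = 2 + 3x exactly, with a column of ones for the intercept.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([5.0, 8.0, 11.0, 14.0])

# Normal equation: theta = (X'X)^-1 X'y. One shot, no iterating,
# but solving the system is O(n^3) in the number of features.
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent: step against the gradient of the squared-error
# cost until it settles. alpha and the iteration count are guesses.
theta_gd = np.zeros(2)
alpha = 0.1
m = len(y)
for _ in range(5000):
    theta_gd -= alpha * (X.T @ (X @ theta_gd - y)) / m

print(theta_ne)  # roughly [2, 3]
print(theta_gd)  # roughly [2, 3]
```

Both land on the same answer here; the trade-off only bites when the feature count gets large enough that the matrix solve becomes the bottleneck.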
We also got an introduction to Octave and its matrix & vector handling. The basic syntax and usage are straightforward, but the matrix and vector handling is new to me, which makes problem solving tricky atm.
This week was a lot more intense than last week, mostly due to having to learn a new programming environment with unfamiliar data types in a few hours, then being tested on it in a fairly wonky test/runtime environment. It's been an interesting and challenging week.
For your delectation, here's the homework output: