Anybody else doing the Machine Learning course with Andrew Ng (Stanford) this time around?
I've been meaning to do this for the last year or so, as it has a great reputation and I've seen lots of positive comments from former students. But it always seemed to clash with other courses I also wanted to do e.g. this time I'd already signed up for Mining Massive Datasets, another Stanford course which is taught by the guys who wrote the book.
Unfortunately, I don't think my maths can cope with MMD - it looks like a great course but quite demanding. So I bailed out of MMD, which means now I'm free to do ML after all.
Machine Learning is already in Week 2, but I worked through the material for Week 1 yesterday and managed to complete the compulsory assignment questions OK. I'm hoping to catch up on Week 2 in the next few days. So far, it's really interesting, good fun and sufficiently challenging for my ageing brain.
So far it's going pretty well and I'm enjoying the course. Admittedly, I'm struggling a bit to keep up with the work, due to other commitments, but the workload itself is roughly as I expected, i.e. around 8 hours a week in my case (they predict 5-7 hours, but I'm always slower!). The course materials are very good, and the choice of topics is well-structured and interesting. The exercises are also well-structured and pitched at a reasonable level. You have to code in Octave, which is new to me, but tutorials are provided and it seems like a good choice for this kind of work. The forums are active and - importantly - actively monitored by the teaching assistants, who are really excellent on this course, e.g. one of the TAs (Tom Mosher) provides lots of extra material around each week's programming exercises, which is very helpful.
We're about halfway through the course right now, so I'll come back with more info when I've finished. But based on what I've seen so far, I'd definitely recommend this course.
Well, I just finished my last exercise for this course so I thought I'd come back and update my comments.
This was a really good course, with a good mix of materials covering a range of ML concepts, algorithms and real-world topics. Each week's material consists of a set of video lectures presenting a pretty thorough exploration of that week's topics at an introductory level, plus a set of "review questions", i.e. an online quiz on the week's topics. These are graded immediately, and you can take the quiz as many times as you like, although obviously the point is to try to get the questions right first time if you can.
There is also a practical programming exercise each week (except in the final week), which typically involves implementing various parts of the algorithms or techniques covered by the week's videos. You are normally provided with a set of Octave scripts and data to support this exercise, so you only have to complete specific parts of the process. The instructions for each exercise usually guide you towards a traditional iterative (loop-based) implementation, but in many cases it is easier and more efficient to use vector-based arithmetic in Octave, and one of the teaching assistants (Tom Mosher) provided a set of tutorials to help with this approach for the first few weeks. The programming exercises are fairly challenging, because you need to have a reasonable understanding of what you are trying to achieve, but you can often complete the implementation in just a few lines of code, especially if you use the vector-based approach.
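To give a flavour of the loop-vs-vectorized contrast described above: the course exercises are in Octave, but here's the same idea sketched in Python/NumPy purely for illustration (the data and parameter values are made up, not from any course exercise). Both versions compute the standard linear-regression cost J(theta); the vectorized one replaces the explicit loop with a single matrix product.

```python
import numpy as np

# Made-up toy data: m=3 examples, first column is the bias term.
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([5.0, 7.0, 9.0])
theta = np.array([0.0, 2.0])
m = len(y)

# Loop-based version, the kind the exercise instructions tend to suggest:
total = 0.0
for i in range(m):
    h = theta @ X[i]              # prediction for example i
    total += (h - y[i]) ** 2
J_loop = total / (2 * m)

# Vectorized version: one matrix product, no explicit loop.
errors = X @ theta - y
J_vec = (errors @ errors) / (2 * m)

print(J_loop, J_vec)  # both print 0.5
```

The vectorized form is not just shorter; Octave (like NumPy) executes it in optimized native code rather than an interpreted loop, which is why the TA tutorials push you in that direction.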
Each programming assignment consists of several steps, and each step can be submitted for grading in turn via the Octave command-line interface. You can re-submit as many times as you like, up to the course deadline. Each week's exercise has a due date (usually two weeks ahead), and there is a "hard deadline" about two weeks after the final week of the course. If you submit your exercises after the relevant due date (but before the hard deadline), your submission score is penalized by 20%. This grading policy is more generous than some Coursera courses, where you can only re-submit work a couple of times (if at all) and you lose 50% or more if you submit your work late.
There is also a set of Octave tutorials to complete during the first week or two, which provide a foundation for the rest of the course.
The maths content is fairly straightforward, even for a liberal arts graduate like me, and there is a quick introductory session on linear algebra to help you get to grips with the use of vectors/matrices in Octave. The main problem I found was simply reading the various algorithms, as the notation involves a lot of superscripts/subscripts to keep track of the various elements being processed. It's often much easier to read the Octave code that uses vectors/matrices instead. But conceptually it's all pretty straightforward and very well explained, so you should be able to understand what's going on at each stage.
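As an example of how the heavily-subscripted notation collapses once you switch to vectors: the per-element gradient descent update, theta_j := theta_j - alpha * (1/m) * sum over i of (h(x^(i)) - y^(i)) * x_j^(i), becomes a single line. A rough NumPy sketch (again, not Octave, and the toy data is invented for illustration):

```python
import numpy as np

# Made-up data lying exactly on the line y = 1 + x (bias column first).
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])
theta = np.zeros(2)
alpha, m = 0.1, len(y)

# The whole double-subscripted update rule is this one vectorized line:
for _ in range(2000):
    theta -= (alpha / m) * (X.T @ (X @ theta - y))

print(np.round(theta, 3))  # converges towards [1. 1.]
```

Reading `X.T @ (X @ theta - y)` is, at least for me, much easier than tracking the i and j indices in the summation notation, which is exactly the point made above.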
Most of the algorithms were for supervised learning, but we also looked at clustering for unsupervised learning. There was plenty of good advice on how to organise your machine learning "pipeline", how to decide the size of your training/cross-validation/test data-sets and validate/evaluate your results etc, and other useful techniques e.g. ceiling analysis to help decide where to invest effort in improving your machine learning process. Various application areas were looked at e.g. recommender systems, OCR, image processing etc, which helped to relate the maths to real-world practice.
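The training/cross-validation/test split mentioned above is simple enough to sketch in a few lines. This is a hypothetical illustration (the 60/20/20 ratio is a common rule of thumb, and the "data" here is just a stand-in array):

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.arange(100)             # stand-in for 100 labelled examples
shuffled = rng.permutation(data)  # shuffle before splitting

train = shuffled[:60]             # 60% for fitting parameters
cv    = shuffled[60:80]           # 20% for model selection (picking features, lambda, etc.)
test  = shuffled[80:]             # 20% held back for the final, unbiased evaluation

print(len(train), len(cv), len(test))  # 60 20 20
```

The key discipline the course teaches is that the test set is touched exactly once, at the end; all tuning decisions are made against the cross-validation set.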
I had very little knowledge of ML before I started the course, so I learned a lot, at least in general terms. I would probably have to work hard to implement a given algorithm from scratch, but I have a much better understanding of how to select a suitable algorithm from an ML library (such as Apache Spark's MLlib or Python's scikit-learn) for a given application, and about the many other issues to consider when designing an ML pipeline.
I would highly recommend this course to anybody who's curious about the world of Machine Learning and its applications in the age of Big Data.
I did register for this one this time around, but due to other commitments, I was not able to do it. One of my friends who did the course also gave it a positive review. I will definitely do it next time.