As part of my Machine Learning journey, I have signed up for Kaggle and started their courses. I have completed the Python course and have moved on to Intro to Machine Learning.
Yet again, I encountered a well-put-together course. It takes you through an example of basic data janitorial work, then through building a basic decision tree. Past that, it covers setting up features and teaches the basics of under-fitting and over-fitting. It then moves from a decision tree to a random forest model. Finally, in its most valuable bit of education, it walks you through the basics of submitting a Kaggle competition entry.
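The workflow described above can be sketched in a few lines of scikit-learn, which is the library the course uses. This is a minimal, hedged sketch on synthetic data (the course itself works with a housing dataset): compare a decision tree's validation error across tree sizes to see under- and over-fitting, then swap in a random forest.

```python
# Minimal sketch of the course's workflow, using scikit-learn on
# synthetic data. The real course uses a housing dataset instead.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Stand-in for the course's data: feature matrix X, target y.
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# Hold out a validation set so under-/over-fitting shows up as validation error.
train_X, val_X, train_y, val_y = train_test_split(X, y, random_state=1)

# Vary tree size via max_leaf_nodes: too small under-fits, too large over-fits.
for max_leaf_nodes in (5, 50, 500):
    tree = DecisionTreeRegressor(max_leaf_nodes=max_leaf_nodes, random_state=1)
    tree.fit(train_X, train_y)
    tree_mae = mean_absolute_error(val_y, tree.predict(val_X))
    print(f"max_leaf_nodes={max_leaf_nodes}: validation MAE {tree_mae:.1f}")

# A random forest averages many trees and typically improves on a single tree.
forest = RandomForestRegressor(random_state=1)
forest.fit(train_X, train_y)
forest_mae = mean_absolute_error(val_y, forest.predict(val_X))
print(f"random forest: validation MAE {forest_mae:.1f}")
```

For the submission step, the usual pattern is to write predictions to a CSV with an ID column and upload it, e.g. `pd.DataFrame({"Id": test_ids, "Target": preds}).to_csv("submission.csv", index=False)`, with column names matching the competition's sample submission.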
The material is explained well, with examples in small doses, so you can learn and adapt at a comfortable pace.
The Bad – Kaggle Intro to Machine Learning
Overall, the worst part was that the exercises would not let me simply copy and paste code from the example. I know, that is actually a good thing, but many of the challenges took some tweaking, and I wanted to blast through them quickly since there was not much new for me here. If you are using the course as a review, the exercises could be a bit annoying. You also end up with no feeling of conclusion, since the Kaggle competition never ends and you never find out how you actually did.
Small nitpicks, but those are the downsides I experienced.
So far I have been really impressed by the quality of their micro-courses. I'm not yet ready to say you should go through the Kaggle courses before starting a book or a university-style course, but they really do cover the basics well.
With all that said, if you have already read a beginner's guide to deep learning or machine learning, this will not add much to that education. For true beginners, though, I can see this course being an invaluable introduction to machine learning. So take that into consideration when deciding whether to take Kaggle's Intro to Machine Learning.