Some Quick Impressions on the MIT DL4SDC Class by Lex Fridman

Many of you might know about the MIT DL4SDC class by Lex Fridman. Recently I listened through the five videos and decided to write a "quick impression" post. I usually write these "impression posts" when I have only gone through part of a class's material. So here you go:

* 6.S094, compared to Stanford cs231n or cs224d, is a much shorter class: it takes fewer than 6 hours to watch through all the materials.

* ~40-50% of the class was spent on basic material such as backprop or Q-learning. Mostly because the class is short, the treatment of these topics feels incomplete. For example, you might want to follow David Silver's class to understand RL systematically and where Q-learning fits in, and Karpathy's lectures in cs231n to learn the basics of backprop, then finish Hinton's or Socher's classes to completely grok it. But again, this is a short class; you really can't expect too much.

Actually, I like Fridman's stance on these standard algorithms: he asked the audience tough questions about whether the human brain ever behaves like backprop or RL.
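For readers who haven't seen Q-learning before, the tabular update rule that classes like this cover can be sketched in a few lines. This is a minimal illustration under my own toy setup (the states, actions, reward, and hyperparameters are all made up), not code from the course:

```python
# Tabular Q-learning update:
#   Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
alpha, gamma = 0.5, 0.9            # learning rate and discount factor (arbitrary)
states, actions = range(3), range(2)
Q = {(s, a): 0.0 for s in states for a in actions}

def update(s, a, r, s_next):
    """Apply one Q-learning update for the transition (s, a, r, s_next)."""
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

# One observed transition: in state 0, took action 1, got reward 1.0, landed in state 2.
update(0, 1, 1.0, 2)
```

Deep Q-learning, as discussed in the class, replaces the table `Q` with a neural network, but the update target has the same shape.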

* The rest of the class is mostly on SDC topics: planning with RL and steering with an end-to-end CNN. The real gem (Lecture 5) is Fridman's own research on driver state. If you don't have much time, I think that's the lecture you want to sit through.

* Now, my experience doesn't include the two very interesting homeworks, DeepTraffic and DeepTesla. I have heard great stories about both from students, but unfortunately I never got to play with them myself.

That's what I have. Hope the review is useful to you. πŸ™‚
