A nice summary of deep learning techniques from Ilya Sutskever.
Another great post by Pete Warden.
I recently came across a blog post by Pete Warden that described how to get the Caffe deep learning framework up and running on the new NVIDIA TK1 development kit. It looked like the toolkits had come a long way in the year since I last looked at this, so I decided to order a kit and try it out.
The development kit arrived on Friday, and following Pete’s instructions, I was able to get the demo applications up and running fairly quickly.
Three years ago I signed up for the original MOOCs: Introduction to Artificial Intelligence, taught by Sebastian Thrun and Peter Norvig, and Machine Learning, taught by Andrew Ng. One year later, I took the Neural Networks course, taught by Geoff Hinton on Coursera, and was introduced to the concept of Deep Learning.
The following summer, I signed up for the Artificial Intelligence program at the Stanford Center for Professional Development and started exploring AI and Deep Learning in greater depth. I took the Introduction to AI class in Summer 2013 and built a simple network for MNIST digit recognition. That fall, I took the real Machine Learning class at Stanford (again taught by Andrew Ng). For that class, I did a project that reimplemented the MNIST example application using convolutional networks with Theano. I tried it out on the Kaggle competition and posted a result that ranked in the top 10. I then applied the same convolutional network to the CIFAR-10 dataset for image recognition, which worked OK but not great.
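The core operation behind that convolutional-network project — sliding a small filter over the image and taking a dot product at each position — can be sketched in a few lines of plain NumPy. This is an illustrative sketch only, not the original Theano code; the `conv2d_valid` helper and the toy edge filter are my own inventions for demonstration:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """2-D convolution in 'valid' mode (no padding): slide the kernel
    over the image and compute a dot product at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            out[y, x] = np.sum(image[y:y+kh, x:x+kw] * kernel)
    return out

# A toy 5x5 patch with a vertical edge between columns 1 and 2,
# and a 3x3 vertical-edge filter.
patch = np.array([
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1],
], dtype=float)
edge = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

fmap = conv2d_valid(patch, edge)
print(fmap)  # strong (negative) response where the filter straddles the edge
```

In a real convolutional layer the filter weights are learned rather than hand-written, and frameworks like Theano run many such filters at once on the GPU, but the sliding-window arithmetic is the same.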
Last spring, I also took the Computer Vision course at Stanford (taught by Silvio Savarese), but didn’t get a chance to explore Deep Learning any further (although some concepts were covered in class in a presentation by Marc’Aurelio Ranzato). I’d been planning to take the Natural Language Processing class (taught by Chris Manning) this fall, but it’s not being offered. So now that it’s near the end of 2014, it’s time to start spending more time exploring deep learning. Stay tuned …