COMP 640: Graduate Seminar in Machine Learning
- Instructor: Anshumali Shrivastava (anshumali AT rice.edu)
- Class Timings: Monday 3pm-4:30pm
- Location: AEL A121
- Office Hours: Monday 4:30pm - 5:30pm, Duncan Hall 3118
This research seminar discusses recent advances and trends in machine learning. Each week we will present and discuss 1-2 recent, related technical papers. The aim is to understand the fundamental ideas, tricks, and concepts involved, both to apply them in practice and to stimulate research. The emphasis will be on reading recent research papers closely and getting the most out of them. Whenever necessary, some concepts will be introduced for clarification and to make connections.
This year, the theme is "Deep-Learning in Practice".
Grading and Logistics
Class participation (5-minute quiz), one paper presentation, and one paper summarization are required for 1 credit. In addition, students can undertake a semester-long research project for 3 credits. There will be a quiz on the readings in the first 5 minutes of class, so it is important to read the listed papers (as much as you can) before coming to class.
A rigorous course in machine learning is a prerequisite; we will be discussing advanced ML papers every week.
Presentations and Scribe Logistics
Each student should sign up for 1 class to present (2 students per class) and 1 class to scribe the discussions (2 students per class). You cannot scribe the same class that you present. You should give a dry run of your presentation to the instructor in office hours (or at some other scheduled time) a week before the class. Several rounds of revision may be needed before a presentation is ready for the class, so make sure to schedule early. The scribe notes should be submitted no later than one week after the presentation.
Please sign up for scribe and presentation assignments at the Google Spreadsheet
- 08/26 : Introduction, Logistics and Model Compression slides
- K-Class Classification in Log(K) Memory pdf
- Know More About Count-Min Sketch pdf
- 09/02 : Labor Day
- 09/09 : Model Compression via Knowledge Distillation (Dark Knowledge)
- Do Deep Nets Really Need to be Deep?
- Distilling the Knowledge in a Neural Network
- 09/16 : The Simple Hashing Trick
- Hash Kernels (Feature Hashing) pdf
- Compressing Neural Networks with the Hashing Trick pdf
- 09/23 : Network Pruning
- Optimal Brain Damage pdf
- Rethinking the Value of Network Pruning
- 09/30 : Quantization in Neural Network
- Regularization of neural networks using dropconnect pdf
- BinaryConnect: Training deep neural networks with binary weights during propagations pdf
- Compressing deep convolutional networks using vector quantization pdf
- 10/07 : Sparsity for Efficient Training of Deep Networks
- Scalable and Sustainable Deep Learning via Randomized Hashing pdf
- (Optional) SLIDE : In Defense of Smart Algorithms over Hardware Acceleration for Large-Scale Deep Learning Systems pdf
- Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
- 10/14 : MIDTERM BREAK
- 10/21 : Recurrent Networks are Slow, but Attention is All You Need
- Attention Is All You Need pdf
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- 10/28 : Information Retrieval using Deep Learning
- A Survey on Learning to Hash
- From Neural Re-Ranking to Neural Ranking: Learning a Sparse Representation for Inverted Indexing
- 11/04 : On Regularization and Generalization
- Lecture Notes from Maria-Florina Balcan at CMU
- Understanding deep learning requires rethinking generalization
- 11/11 : Generative Adversarial Networks (GANs)
- Generative Adversarial Networks (GANs)
- Generative OpenMax for Multi-Class Open Set Classification
- 11/18 : Preventing Adversarial Attacks
- Explaining and Harnessing Adversarial Examples
- Are Adversarial Examples Inevitable?
- 11/25 : Privacy and Fairness
- 12/02 : Project Presentations
Students with Disability
If you have a documented disability that may affect academic performance, you should: 1) make sure this documentation is on file with Disability Support Services (Allen Center, Room 111 / email@example.com / x5841) to determine the accommodations you need; and 2) meet with me to discuss your accommodation needs.