Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (3.6 / 5):
Workload: 15 hours/week
Pros:
1. A good amount of material is covered in the course. Anyone without prior machine learning experience should walk away from this class feeling they've learned a great deal.
2. The course is nearly exclusively a theory course. The instructors do a great job explaining the foundation of each topic and provide students with adequate knowledge to read and understand research papers in academia.
3. Dr. Liu provides access to his in-progress textbook, which is actually quite useful and also covers material from Dr. Klivans's section.
Cons:
1. There is very little interaction with the professors on Piazza, and the TAs aren't much better with communication. I would rate communication as a whole a 2/10.
2. Only 4 of the 6 homework assignments have a programming section. All of the programming sections were very simple, although they did do a good job tying the lecture material together. The most interesting facet of any of the programming assignments was coded by the professor rather than left for students to work out their own solution.
3. Peer grading doesn't seem very realistic given the depth of material covered in this course and how sparse the grading rubrics are. There are plenty of examples of graders either not understanding the material at all or simply not caring about their grading responsibilities.
Detailed Review:
Obligatory - I took this class along with Probability while working full-time.
There were a lot of negatives in this course, but overall I still found the class quite enjoyable. To be fair, though, I think the only reason I found the class so enjoyable was the amount of application-style self-learning I did to supplement what was covered in class. You will most likely have a very different experience if you expect this class to provide a fully self-contained, practical education in machine learning.
The course is divided into two sections: Dr. Klivans teaches the first half and Dr. Liu teaches the second half. Dr. Klivans has a very dry lecture style and makes many assumptions about the mathematical capability of his students. If you have a strong foundation in linear algebra and formal proofs, this won't be an issue for you, but students requiring a gentle introduction may struggle with the first half of the semester. I personally found Dr. Klivans's lectures great and very informative, but I know many students felt very differently. The textbook Dr. Klivans uses is nearly useless unless you already have an overview-level knowledge of the subjects or are capable of self-learning from very dense machine learning theory.
Dr. Liu makes fewer assumptions about the mathematical capabilities of his students and goes more in-depth with his explanations. He has a much slower teaching style than Dr. Klivans. The textbook he's working on is quite useful, and it's very much appreciated that he's sharing this work in progress with his students. You will most likely feel that Dr. Liu and his textbook do a much better job than the Probability instructors/textbook of explaining overlapping topics like MLE.
The difficulty and time requirements for this course start out very high and monotonically decrease as the class progresses. The first homework took me many hours to complete, while the final homework took maybe 15 minutes. I easily spent 15-20 hours per week in the first few weeks of the course, but ended the course with maybe 5 hours of required effort each week. Since this class is theory based, you will realistically need to work on side projects and practice applying your newly gained theoretical knowledge to gain a holistic understanding of the topics discussed. While you could certainly get by without putting in this extra effort, this self-learning is essential to gaining real-world insight into how the topics discussed in the course are used.
The programming assignments are woefully inadequate for this type of course. Each assignment is largely a "fill in the blank" or "complete the function" style Jupyter notebook. It's nice having a means of visualizing the topics learned in the previous week, but I wish there were a more rigorous means of applying our machine learning knowledge. Side note: the most interesting part of any of the programming assignments was pre-coded by the professor!
Instructor interaction was minimal and borderline hostile on some occasions. Many questions go unanswered, and a good deal of the questions that are answered either raise additional questions or don't really provide any insight on how the answer was derived in the first place.
This course contains two exams, and both are proctored by a service known as Panopto. There was a great deal of push-back between some students and the teaching staff regarding the use of this proctoring service. If you have any concerns with the service, please contact the instructors calmly and privately when the class begins.
This class is difficult, but solid effort will easily put you within the grade needed for an A in this course.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (3.6 / 5): ★★★★☆
Difficulty (0.7 / 5):
Workload: 2 hours/week
Pros:
1. Dr. Wilke is highly knowledgeable in R (specifically in ggplot2).
2. Course load is manageable.
3. Dr. Wilke is very responsive on Piazza.
Cons:
1. A lot of the course covers content that could be googled, but the course provides a structured context and timeline in which to learn these things.
2. Peer grading lacks the depth of feedback one might see from a professor in undergrad.
3. Data is often too clean to show the difficulties of data science in the real world.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (3.6 / 5): ★★★★☆
Difficulty (3.6 / 5):
Workload: 25 hours/week
Pros:
1. Textbook is fantastic, very readable. If you just want to get through the midterm, just watch the lectures. If you want to go more in depth and learn, read the textbook.
2. Lectures are overall not too bad, worth watching at 1.5x. The lecture content hasn't changed since 2019 afaik, so all the previous reviews mentioning lectures apply.
3. After grinding out the projects, I got way better at C/systems programming.
Cons:
1. The grouping mechanism needs work. We had unauthorized people try to join at the last minute before a project was due, without Canvas issuing any notification whatsoever that someone had joined our group. In general, people can join your group at will. A TA said Vijay will be looking into this.
2. Like the previous 2024 review mentions, the class switched from xv6 to Pintos and the projects became significantly harder.
3. Some say it's harder than Parallel Systems now (I haven't taken PS, so no comment).
Detailed Review: The last two projects took 30-40 hours/week, easy. A few TAs carried the whole semester while the others were AFK. Lecture material on xv6 hasn't been removed, and only one Pintos lecture (an overview/intro guide) has been released so far, so our cohort got scant Pintos guidance other than the assignment descriptions and EdDiscussion responses. Everyone was watching YouTube videos from other universities for project-specific Pintos guidance.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (2.9 / 5): ★★★☆☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (4.3 / 5):
Workload: 25 hours/week
Taking this course dropped my GPA from 4.0 to 3.9 (I could not dedicate the required 20-30 hours per week, due to work and my thesis).
However, I feel like it was worth doing this course just to learn the material. In a sense, Reinforcement Learning is the heart of AI. It is undoubtedly one of the most difficult problems in modern AI, and it really changes your approach to thinking about ML problems and "learnability" in general. As an ML researcher at my day job, RL was one of my must-do courses.
Pre-reqs:
You really must be very good at probability, otherwise this course is going to be unmanageable. Reinforcement learning is almost entirely probability and statistics; there's no linear algebra, and only very basic calculus.
Probability notation (particularly, conditional expectation and Adam's Law) is used from the second chapter. You're expected to be well versed with it. If you are not, I really would suggest revisiting this course later on.
Blitzstein (probabilitybook.net) is a great Intro Probability book; you can get up to speed by reading chapters 2, 3, 4, 7, 9 and 10.
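Since the review leans on it, here is the Adam's Law identity in question (the law of total expectation), together with its usual companion as named in Blitzstein:

```latex
% Adam's Law (law of total expectation):
\mathbb{E}[Y] = \mathbb{E}\big[\,\mathbb{E}[Y \mid X]\,\big]

% Eve's Law (law of total variance):
\operatorname{Var}(Y) = \mathbb{E}\big[\operatorname{Var}(Y \mid X)\big]
                      + \operatorname{Var}\big(\mathbb{E}[Y \mid X]\big)
```

These matter in RL because value functions are conditional expectations of return, and derivations such as the Bellman expectation equation repeatedly condition and un-condition on the state.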
Pros:
- The deadlines are stated upfront. You know in advance which chapters you'll have to read and when the homeworks must be done. This is great for planning your life.
- You read a great textbook end-to-end. People hate on Sutton's book because the math is not lightweight, but it's a well-written book that introduces a lot of complex concepts in an intuitive way. You do need to read each chapter multiple times, but that's because the concepts are a bit difficult; the writing is clear.
If you don't like the book, "Mathematical Foundations of Reinforcement Learning" is an alternative (it's even more mathy but a bit more straightforward).
- I actually really liked the fact that we had to read each chapter and summarize it; I feel it helped me grasp the concepts better. It's not a lot of work (use a speech-to-text app like MacWhisper to dictate your summaries, and it becomes much easier).
- The homework grading is quite lenient; you get 500 attempts for any mathematical question.
- You get hands-on experience implementing a lot of the deep RL algorithms using OpenAI's gym env. This is valuable IMO because you come to realise how tricky it is.
- TAs were fine and helped clarify some of the more complex questions.
Cons:
- The workload was quite high (20-30 hours/wk, i.e. 2 or 3 full working days). Each week you have to submit one Reading Response (3-4 hours to read the chapter and summarize all sections) and one homework (6-7 calculation problems, each taking 30 mins). Every 3 weeks you have to submit a programming assignment, which can easily take 10+ hours (the code is tricky); this is on top of the RR and HW workload for that week. So every third week becomes super hectic.
- The lectures are not very useful; I did not watch them beyond week 1. Profs were not active on Ed.
- You get 60% credit for all the homeworks if you complete them by the end of the semester. This is good in theory, but it's quite easy to fall into the trap of imagining you'll do all the homeworks later. This makes a big difference to your final grade (I got an 85% for B+, while an A was 90%...I calculated and just missed it because of the late HWs which I could not do due to my day job).
- The theory Multiple-choice questions can be hard, and you only get a few attempts.
- The final exam (30% of the grade) only tests your speed of calculation. It's a completely open-book exam (you can use Jupyter etc.), but it's very hard to complete on time. Theoretically, if you do all the homeworks and a few of the extra exercises from the book (and particularly if you code them all up beforehand), then you can potentially score very high on the exam. In the absence of that, it's an uphill battle to complete the calculations in time. I only got 66% on the exam, despite coding up many of the HWs beforehand.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (3.2 / 5): ★★★☆☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (3.2 / 5):
Workload: 12.5 hours/week
Pros:
1. Good breadth of topics, from algorithms to statistical frameworks
2. Homework has both theoretical and programming aspects
3. Generous grade curve
Cons:
1. Peer grading on assignments
2. First 2-3 weeks are very fast-paced and can be stressful
3. Exam 1 is intentionally difficult
Detailed Review:
This is a good survey course of machine learning techniques with topics that cover both algorithms and statistics.
The homeworks are useful as they have both theoretical and programming aspects. The first 2-3 weeks, however, move very quickly and the time commitment on homeworks is much higher than the rest of the course.
Homework marks are almost always high, but the downside is that they are peer graded. I experienced at least 3 instances of students deliberately trying to lower my grade; fortunately, the TAs can correct this.
Exam 1 (algorithms) is quite a bit harder than Exam 2, and marks reflect that (class average of 59% compared to ~78%). The final grade curve was quite generous.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (3.6 / 5):
Workload: 10 hours/week
Pros:
1. Pintos was pretty good and interesting.
2. The book was pretty good.
3. The exams were on the easier side
Cons:
1. Horrible TA support
2. They ask you to pick teams. The teammates I chose didn't lift a finger; I had to do all the Pintos assignments myself and submit them. I got 99% and, guess what, they also got 90+% because the projects were weighted at 70%. That's frustrating.
3. Instructor's videos were heavily outdated. He never engaged with students.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (3.6 / 5): ★★★★☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (2.9 / 5):
Workload: 15 hours/week
Pros:
1. I knew absolutely nothing about machine learning before this, so it was my first course. It gave me a theoretical motivation for machine learning, which seems far more like statistics and applied math than computer science. It was not as hard as I thought it would be.
2. Two separate professors so you get exposure to two different teaching styles.
3. No final project, only two exams, so not as stressful at the end. And the exams cover completely separate material!
4. The very last lecture directly connects to the Deep Learning class
5. Textbook for second half was very helpful
6. Curve was very generous honestly, an 85 was the cutoff for a solid A, A- was even lower. But most of the class was in the 70s/80s, so the cutoffs were actually quite close to each other.
7. You can literally print all lectures and homeworks (and textbook for the second exam) to use in the exams
8. TAs are super lenient and very nice
9. You don't have to but it's a great opportunity to learn LaTeX for math typesetting
Cons:
1. Very little practical experience. The first half did have us use scikit-learn, but honestly most of the work was really just Python.
2. Still not fully sure how all of this directly relates to the practical application of machine learning nowadays. You should study online resources to understand that.
3. Textbook for the first half was somewhat helpful for one or two questions on the homeworks, but other than that, don't worry about it, especially for the first exam.
4. Exams are actually somewhat difficult. The material is related to what is covered in the lectures and homeworks, so that's not the problem. The bigger issue, especially for the first exam, is that you really don't have enough time to complete it. You are only supposed to have 2 hours, and I finished all 6 questions in exactly 2 hours, but they give you an extra 30 minutes to account for uploading your video recording and taking pictures of and submitting your work. Make sure you really *know* the material. Just understanding the homeworks, lectures, and especially the practice exam at face value is not enough. Think about how the homework and practice exam questions could be asked in reverse, or with tangentially related questions. You really need to understand each major concept like PCA, SVD, K-means, etc., which might entail watching some separate videos on them. You also for sure do not have enough time to be searching through all of your printed-out reference material for each exam question. It takes a while to write out answers for the exam questions, honestly. To expand on the point of really knowing each topic, one example would be K-means. For K-means and other algorithms covered, they'll give you the pseudocode and explanations in words about how they work, with maybe one easy question on the homework covering a simple case or an easier edge case. But do you really understand it well enough to actually run the algorithm given the inputs? That's what they test you on in the exam. Apply this line of thinking to other topics, in addition to what was already mentioned, and you'll be very well prepared to get a legit A without a curve.
5. *****The overall difficulty of this class is HEAVILY dependent on your comfort with mathematics, period. I have an engineering background, so I took a lot of math classes, have always genuinely loved math, and took more math classes beyond what was required for my minor. If this describes you, or you even double majored in math, this class honestly will be fine for you and pretty straightforward. If you have not taken many math classes, like the bare minimum required for most computer science and software engineering curriculums, this class will definitely be hard. The difficulty rating of a 5, I think, reflects the average but is definitely bimodal: 4 or even less for the former background, definitely 6 or higher for the latter. It's not that you necessarily need to go take a full linear algebra class beforehand; it's more that you need to be comfortable with the honestly annoying mathematical notation being used, and that just comes with time and repetition. If you think you fall into the latter group, then to be safe, I would not take another class alongside this one. Also, it's unfortunate, but success on the exams is definitely influenced by comfort with math, purely because of the time constraints. Navigating matrix notation and operations, log and exponential operations, basic probability rules, recognizing groups of variables that are constants in complex mathematical equations, etc., are all required to get the correct answer. That sounds pretty basic, but it can be tough under time constraints. I realize this paragraph has gone on very long, but I just wanted to address all of the discussion I saw on the forums when the semester started, and what I read online about the mathematical background required. I don't think you need to know math concepts x, y, and z to be ready for this class; imo it's more that learning x, y, and z indirectly gives you comfort with the topics I mentioned above.
Nevertheless, don't let this deter you from the class; the curve really was pretty good, especially since most employers only require a B- for reimbursement anyways.
6. You have to peer grade other homeworks to get credit for your own homework, which is extremely annoying. Please set reminders so you don't forget to do this.
Detailed Review:
TAs were extremely lenient with grading. As long as you demonstrate understanding, they'll give you a lot of credit which was really nice. Combined with the generous curve, it's awesome. Course is divided into two halves; both professors are fine, but overall, I can say I understand parts of Machine Learning better, and can kinda see how the ideas could be used practically, but there's still a big gap in my understanding from the theory presented here and something useful in the real world. Homeworks are due roughly every 3 weeks and it's mostly theory, with some pretty easy programming since they have jupyter notebook templates you just fill in. Lectures are directly related to the homework, and even more so to the exam so in that sense they are very useful. But again, there is a gap in practical applications.
I took this class together with Deep Learning and it was honestly fine. They're on the complete opposite spectrums for theory and practice so it worked out okay, but if you want to motivate and understand the theory behind Deep Learning better, then for sure take this first.
As emphasized in the con about mathematical background above, if you like math and are comfortable with complicated notation, you can take this class with another class. Otherwise, do not take another class with this one.
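To make the K-means point above concrete, here is a minimal sketch of the algorithm (my own illustration, not course code). The exam-style skill the review describes is being able to trace these two alternating steps by hand on given inputs:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Minimal K-means: alternate nearest-centroid assignment and mean update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # init: k distinct data points
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid (squared distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        # Update step: each centroid moves to the mean of its assigned points.
        new_centroids = [
            tuple(sum(coord) / len(c) for coord in zip(*c)) if c else centroids[j]
            for j, c in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged: assignments can no longer change
            break
        centroids = new_centroids
    return centroids, clusters
```

On two well-separated blobs of points this converges in a few iterations; tracing one assignment step and one update step on paper, given specific inputs, is exactly the kind of exercise the review recommends practicing before the exam.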
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (3.6 / 5): ★★★★☆
Difficulty (4.3 / 5):
Workload: 25 hours/week
Pros:
1. Well designed course structure and materials
2. Good textbooks for self learners
3. Interesting and important materials in algorithms and discrete structures.
Cons:
1. The first programming assignment created an uneven workload: it requires a significant amount of implementation and gives no partial credit for a partial implementation (credit is based on test successes)
2. Dafny (used for the last part) is not that well understood or well documented
3. MC format problems were very tricky. Some quiz questions (MC) were ambiguous and even incorrectly graded and had to be regraded
Detailed Review:
This is a decent introduction to program verification and teaches a number of important concepts in computer science that are of great theoretical and practical importance.
The course comprises three modules. SAT solving is widely used for solving NP-complete problems and is of great theoretical importance due to its generality. SMT solving is used for theorem proving. Both form the building blocks of automated program verification, which is used in practice for mission-critical software (safety systems, blockchain, etc., to name a few). The focus of the course is the theory behind these techniques, not their applications.
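As a toy illustration of the generality mentioned above (my own sketch, not course material): checking a CNF formula by brute force takes 2^n assignments, which is exactly the search that solver algorithms like DPLL/CDCL try to prune.

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Tiny CNF-SAT checker. A clause is a list of nonzero ints:
    +i means variable x_i, -i means NOT x_i (DIMACS-style literals).
    Tries all 2**n_vars assignments; real solvers prune this search."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        # A CNF formula is satisfied when every clause has at least one true literal.
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment  # satisfying assignment found
    return None  # unsatisfiable
```

For a handful of variables this is instant; at 50 variables it is hopeless, and closing that gap on structured industrial instances is what makes modern SAT solvers interesting.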
The assignments, especially the first one, were extremely difficult; even those who did competitive coding spent more than 80 hours. Quizzes used logical problems and involved pure thinking, uncorrupted by practical concerns such as measurement error or implementation compromises. For these reasons, when you do well on the quizzes and assignments, you can feel validated in your skills, but it requires a great deal of time.
The prerequisite is discrete math at the undergraduate level, with particular focus on logic, set theory, and state machines. An understanding of computational complexity and graph theory will give you an edge. The course also touches on the simplex method, lattice theory, and first-order logic, so familiarity there is advantageous. An advanced understanding of Java data structures is needed for the first two assignments.
I think some of the criticisms of the course are unjustified: when quizzes are 50% of the overall score, why would anyone expect a cakewalk? The feedback from those who approached it logically (instead of intuitively) was overwhelmingly positive.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (2.1 / 5): ★★☆☆☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (2.9 / 5):
Workload: 7 hours/week
Pros:
1. Good review of topics you should've picked up in an undergrad DS course
2. Projects will help you develop programming skills
Cons:
1. Test cases are a pain and unclear
2. Quiz questions require a ton more thought than what is covered in lecture
3. Very surface-level covering of algorithms
Detailed Review:
Okay, the quizzes can be hell, but you'll get a lot out of this class, especially if you have a weak programming/DSA background. For someone like me with a good DSA background, this class was largely an exercise in programming rather than learning. This class is by no means sufficient to go into programming interviews with; I'd recommend doing some supplementary reading each week to add similar data structures/algorithms to your arsenal.
This feels like an undergrad class; nothing here is really graduate level. The professor said in a video somewhere that this class was added because DS students generally had a weaker programming background. That's great and all, but this should be a graduate class. In my opinion, they really should've kept the algorithms class that DS students were allowed to take in the past for the stronger students and also offered this one for students who need some help.
Curve is pretty lenient in my opinion. A raw ~85% is enough for an A. The first three assignments are pretty easy; #4 and #5 are difficult and require a lot of debugging, while #6 just requires some thought on strategy. The quizzes are hard right from the start and stay hard right up to the end.
TAs when I took this were great, especially Joseph. Prof was absent pretty much the whole class as is to be expected. Was there to essentially give a copy-paste introduction at the beginning and to roast cheaters at the end.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (3.6 / 5): ★★★★☆
Lecture Rating (3.6 / 5): ★★★★☆
Difficulty (4.3 / 5):
Workload: 15 hours/week
Pros:
1. Great content and lectures
2. Very useful in CS and engineering.
3. Builds mathematical maturity to some extent
Cons:
1. Poor format of exams
2. Not enough practice problems to guide your preparation
3. Some real analysis is needed to make sense of everything, but the course is not upfront about it
Detailed Review:
This course is a good introduction to convex optimization.
This course is significantly more accessible than EE364 (Boyd), and both professors deserve credit for that. Those who say "just take Boyd's YouTube lectures instead" are either math geniuses or have no clue what they are talking about. The professors avoid real analysis as much as possible and use simple concepts. Both lecturers are good; one is not a native speaker and sometimes struggles a bit, but otherwise he is also very good.
However, while taking the course, you do not get enough practice solving the kind of numerical problems that show up on the MC exams. If you do the exams fully honestly, I believe it is hard to get an A or A-. There was only a nominal curve.