Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 4 hours/week
Pros:
1. No tests or quizzes; 12 homeworks are the only grade
2. Not super time consuming
Cons:
1. Peer evaluation on most homework assignments
2. Undocumented provided R code
3. Frequent derivation problems on homeworks
Detailed Review:
This is a math course. There aren't really any lecture topics about practical applications of regression modeling; if I got an interview question about regression, this course would not really have prepared me for it. That being said, the course structure is pretty simple: just 12 homework assignments, each with seven days to complete.
Piazza was not super helpful this time around. The TA was good, but the professor was pretty rude to people on Piazza, giving non-answers and accusing them of not watching the lectures. There is no textbook, so if you are stuck you basically have to ask on Piazza or Google your questions.
The provided R code is undocumented and usually raises more questions than it answers. I always started clean when working through the coding problems.
You can get partial credit for showing your work and being on the right track on the peer-evaluated homework, but some peers are more generous than others, and your grade will depend on who you get.
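To illustrate what "starting clean" looked like for me, here is a minimal sketch of fitting simple linear regression from first principles. This is my own hypothetical illustration, not the course's provided code, and it is in Python rather than R:

```python
def ols_fit(x, y):
    """Ordinary least squares for simple linear regression:
    slope = Sxy / Sxx, intercept = ybar - slope * xbar."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    return intercept, slope
```

Rebuilding basics like this from the lecture derivations was usually faster than reverse-engineering the undocumented code.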
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (2.9 / 5): ★★★☆☆
Lecture Rating (2.1 / 5): ★★☆☆☆
Difficulty (1.4 / 5):
Workload: 4 hours/week
Pros:
1. Extremely low time commitment (if you value that)
2. Ok intro to PyTorch for those with no experience
Cons:
1. Undergraduate level depth, and that’s being generous
2. Course redesign seemingly not complete before start of summer 2024 semester
Detailed Review:
Please note that this was my final course in the program, which may skew my expectations.
The course was redesigned for Summer 2024. As of this review, http://www.philkr.net/dl_class/ does not reflect the redesign. There were only 8 modules for Summer 2024, most of which seem like scaled-down versions of the first 8 modules at the link above, with some extremely introductory content (~1.5h worth) on transformers added. There are now only 4 assignments and no final project. There are ~30 quizzes that you can easily get full credit on if you halfway pay attention to the lecture modules. I have not seen any confirmation that future iterations of this course will keep the same amount of graded content as Summer 2024.
I spent a total of ~40h on this course, roughly one quarter of the mean time I spent per course in the program. The lectures were cursory introductory surveys of deep learning topics, and the assignments were effectively piecemeal introductions to PyTorch. About a third of the lectures showed baseline PyTorch model architectures for anything needed in the homework, so if you have a background with PyTorch, the assignments are not challenging. The most common complaint about the previous iteration of this course was divergence between local and remote grader performance; luckily, this was fixed.
Rumors that the course was being redesigned were circulating in Spring 2024. However, it wasn’t advertised until mid-July 2024, in the middle of the summer semester, that "more advanced topics are covered in pt 2 of the class", i.e. deeper dives on advanced content from the original course (pre-Summer 2024) found at http://www.philkr.net/dl_class/material. "Pt 2 of the class" here means a future, separate course that had not previously been advertised. This was extremely frustrating: this course previously had a reputation for being robust and valuable, and I had looked forward to closing out my time in the program with a similar experience. Poorly communicated redesigns will harm the program, especially when a course is revised down to introductory content compared to robust prior offerings. By comparison, NLP was redesigned in ~2023 to cover updated transformer, LLM, and overall network design content, and that redesign was wonderfully executed. Until more information about DL pt 2 is available, I’d encourage students to take NLP instead of DL pt 1 for a more valuable experience.
Finally: the assignment release dates were pushed back throughout the semester, seemingly because the redesign was not complete. The TAs did an admirable job of keeping students informed on the course Ed discussion board, but the delays were still frustrating. However, given the overall low time commitment for the course, this wasn’t ultimately a big deal. I imagine future iterations will not have this issue, given the assignment content generated during Summer 2024.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (2.9 / 5): ★★★☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (1.4 / 5):
Workload: 15 hours/week
Pros:
Cons:
Detailed Review:
This class was a significant disappointment and a troubling reflection on higher education, leading me to question its true value. The material was disorganized and lacked a logical structure; what should have been a guided learning process instead felt like fumbling without direction.
The assignments and exams were an embarrassment, fit for a time when people still believed in witch trials. Most students already had access to these old, recycled assignments. Instead of rectifying this glaring issue, the staff simply threw a horde of TAs at it, tasked with hunting down plagiarism like some medieval inquisition. The irony of doing this while suppressing the use of online resources, especially for programming, is a clear signal that the staff is woefully out of touch with real-life machine learning experience.
The exam process added further frustration to an already troubled experience. Students were required to record themselves, refrain from using any digital materials, and potentially print hundreds of pages of notes. This process was not only inconvenient but also showed a lack of environmental consideration.
In conclusion, this class is a significant disappointment, offering little in the way of coherent, current, and conscientious learning. Unfortunately, with no alternatives available, those considering this course should approach it with caution and be prepared for its shortcomings.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (0.7 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 5 hours/week
Pros:
1. Pretty low time commitment
2. No finals or exams
Cons:
1. The HW is usually peer graded, and poorly worded
2. The professor is not active on Piazza
3. The content is too vague and never really explains anything well, so I don't feel like I actually learned anything
Detailed Review:
The professor put as little effort as possible into this class. All but two of the HWs are peer-graded, and the questions are extremely vague, which makes it hard to know what you're supposed to put in your answers. The office hours devolved pretty quickly into "which slide has the answer to this question," and I don't feel like I actually learned anything. Disappointing for a foundational class.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (0.7 / 5): ★☆☆☆☆
Lecture Rating (0.7 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 6 hours/week
Pros:
1. TAs were pretty responsive.
2. Light workload.
Cons:
1. Programming assignments were pedagogically worthless.
2. Lectures were nearly unwatchable.
3. The textbook is not a good standalone learning source.
4. Final exam tested computation speed/accuracy instead of understanding.
Detailed Review:
The topic of Reinforcement Learning could make for a fun, project-based course. Sadly, this class misses the mark.
If you want to learn how to apply RL to real problems, this class is almost worthless. Far too much time is spent on fundamentals, tabular RL, and hand calculations. Modern RL techniques like PPO, SAC, TRPO, DDPG, and A2C are never even mentioned.
READING RESPONSES
The vast majority of the course consists of reading the textbook, which is Sutton & Barto's Reinforcement Learning. You have to write "reading responses" for each chapter. You can say anything you want as long as what you write shows that you've actually done the reading. The book itself is considered by many to be amazing, but I don't agree. It's verbose and covers a lot of detail that gets in the way of understanding the important parts. In general, the field of RL suffers from poor naming conventions and cumbersome, gratuitous notation, and this book does nothing to ameliorate those problems. A good textbook would rework RL into a more pedagogically effective format.
LECTURES
Ideally, class lectures would make up for textbook inadequacies, but in this class the video lectures are more like footnotes to various book details. The lectures don’t actually teach the material or present it in an easily digestible way. I don't know if it's the style of presentation, the content, or the professors' demeanor, but the lectures are grueling to watch. I know many other students felt the same. I watched less than 25% of the videos.
PROGRAMMING ASSIGNMENTS
There were four of these. I learned almost nothing from them because for the most part they just involved implementing algorithms verbatim from the book. All you need to do is put your head down and transfer the pseudocode into Python. Some people seemed to struggle a bit with these, but for anyone who is proficient in Python/PyTorch and can write efficient code, they are trivial. I think I spent under two hours on each of them.
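To give a sense of what "transferring the pseudocode into Python" means, here is a minimal sketch of tabular Q-learning in the spirit of the book's pseudocode. This is my own illustration, not the actual assignment code; the environment interface (a `step(state, action)` callable) is an assumption for the example:

```python
import random

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.99, epsilon=0.1):
    """Tabular Q-learning, transcribed almost line-for-line from the
    standard pseudocode. `step(s, a)` returns (next_state, reward, done)."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # epsilon-greedy action selection
            if random.random() < epsilon:
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: Q[s][x])
            s2, r, done = step(s, a)
            # TD update toward the greedy bootstrap target
            target = r + (0.0 if done else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])
            s = s2
    return Q
```

The assignments amounted to variations on this loop; once the pseudocode is in front of you, there is very little left to figure out.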
EDX QUESTIONS
Each book chapter has a set of EdX questions associated with it. The majority are very simple, but occasionally a difficult one is thrown in. Most allow multiple (but not unlimited) attempts, so you can try a few times until you get it right. Quite a few of the problems are poorly phrased, which is annoying. Also many of them focus on minutiae from the book, as opposed to general principles.
TEACHING ASSISTANTS ("Learning Facilitators")
For the most part, they were fine. They were pretty responsive on Ed. I didn't attend any of their office hours.
PROFESSORS
Who? They were non-existent. I think one of them has left UT, so I can't blame him for not being around. I never saw a single post from the other one. No office hours either. There were a couple of "AMA" sessions with the professors, as if they were celebrities. UT should really rethink having classes where the professors don’t participate.
FINAL EXAM
There's one three-hour exam. It's worth 30% of your grade and is fully open book. A bunch of the questions are T/F or multiple-select; those are fine. But there are also a lot of calculation questions that require numeric answers, and most of these are not trivial. They are long chains of arithmetic that require you to scroll back and forth, referencing diagrams and numbers on other parts of the page with unnecessarily intricate details. If I had known what it was going to be like, I would have pre-written code to use. If the exam were untimed, it might be fine, but under the pressure of the clock, it winds up being a really bad experience.
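For example, the kind of helper I wish I had pre-written (a hypothetical sketch, not an actual exam question) is a few lines that perform the Bellman backups so you don't chain the arithmetic by hand:

```python
def bellman_backup(V, transitions, gamma):
    """One sweep of value iteration:
    V(s) <- max_a sum_{s'} p(s', r | s, a) * (r + gamma * V(s')).

    `transitions[s]` maps each action to a list of
    (probability, next_state, reward) tuples.
    """
    return [
        max(sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
            for outcomes in actions.values())
        for actions in transitions
    ]
```

Repeating the sweep until the values stop changing gives the optimal state values, which is exactly the calculation the exam makes you grind out by hand under the clock.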
I got an A in the class, so this review is not just sour grapes. UT needs to do better.
Difficult, Time Consuming, and Unintuitive (Half-Semester Review, to Be Updated)
Spring 2023 · CS 388G · Algorithms
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (5 / 5):
Workload: 30 hours/week
Pros:
1. Exposure to technical algorithmic analysis
2. Practice using difficult mathematical proving techniques
3. Some of the TAs are very helpful
Cons:
1. Unintuitive lectures that are disjoint from problem sets
2. Mathematically rigorous theoretical homework
3. Highly theoretical
Detailed Review:
I'm currently halfway through the semester and will update this review accordingly with a breakdown of grades and perspective on the final exams.
This is by far one of the most difficult courses I've taken in this program. As mentioned in the Cons, the lectures are essentially uncorrelated with the problem sets: while there may be some theoretical overlap, they won't help you complete the homework at all. I should mention that the bi-weekly problem sets are almost entirely proof-based. If you're not a fantastic mathematician (think Google research intern), you're probably going to have a very rough time on them. Office hours are required but thankfully recorded. For reference: I completed the entire calculus series in undergrad and have dabbled in linear algebra.
There are some reviews here that insinuate the course has been "fixed". I would argue the contrary. Another review argued that this class "makes you a better programmer despite not having any coding exercises". I would argue against that as well. I have industry experience in software engineering, and while some of the theoretical concepts may have applications in research and other corners of the SDE industry, this class does very, very little to make you a better programmer.
If I had known then what I know now, I would most definitely have chosen to take another course. Good luck.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (0.7 / 5): ★☆☆☆☆
Lecture Rating (0.7 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 10 hours/week
Pros:
1. The course material is well organized.
2. There are only homework assignments, no tests.
3. It is definitely possible to get an A with sufficient effort.
Cons:
1. I learned almost nothing of value.
2. The lecture videos are unbelievably dry and boring.
3. I don't feel confident applying the material to a context outside of this class.
Detailed Review:
This was one of the worst courses I have ever taken as a student. Although the course was well organized and it was possible to receive an A in the course with sufficient effort, I learned almost nothing of value. The lecture material focuses far too much on derivations and not enough on the background information that provides context for the material and motivation for why the students should learn the material in the first place. I do not feel confident applying the information I learned in a context outside of the narrow confines of this course.
Dr. Walker is clearly very knowledgeable on the subject matter, but he does not do a good job of communicating information in a way that is engaging or motivating for students. His teaching style is very dry, and he mainly just repeats what is on the lecture slides without providing much useful additional information. In fact, I stopped watching the lecture videos after the first couple of weeks because I realized my time was better spent reading the lecture slides directly. I wish he had used the lecture videos as an opportunity to provide context for the material on the slides and examples of how it could be applied in a practical setting.
The TAs for this course were incredibly helpful. Their responses on Piazza clarified a lot of misunderstandings and were essential for understanding the lecture material and completing the homework assignments.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 6 hours/week
Pros:
1. Focuses on Chapters 1-17 of Sutton and Barto, which is the standard textbook for RL courses
2. Plenty of online resources that cover the textbook and associated materials
3. TAs were knowledgeable and helpful
Cons:
1. Unengaging and no practical real-world projects to tackle, just pseudocode
2. Not much coverage of Deep RL
3. Independent of this class: the switch from Piazza to Ed this semester made the class even less engaging. There was almost no discussion on the board, and it seems students sent the TAs a lot of private posts instead (which the TAs complained about). Moreover, the Ed board was difficult to sort and manage; it's just a disorganized haystack of threads. A terrible decision for the program to leave Piazza for Ed, imo.
4. The EdX grader would constantly crash on timeouts, and it was up to the student to find the most efficient code to get it to pass. Having a correct implementation of the pseudocode wasn't enough; it also had to run quickly. This was annoying and, along with the final exam, was the major difficulty of the class.
Detailed Review:
Instead of the class lectures, I watched an insightful, well-produced, animated Udemy class that covers Sutton and Barto. Others sought out Silver's lectures on YouTube, etc. There are plenty of resources out there that cover the textbook's chapters far better than the class lectures do. That said, some of the questions on the final were inspired more by the problems discussed in lecture than by the HW. It was like a penalty for anyone who didn't watch the lectures, lol! I breezed through the practice final (which resembled the HWs) but lost a lot of points on the final due to the lecture-style problems. Watch out for that.
Coming away from this class, I'd say it's just an appetizer for RL. It covers the classic textbook on the subject, which is good, but it is left to the student to find real problems to work on or a Deep RL course to actually learn RL.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (0.7 / 5): ★☆☆☆☆
Lecture Rating (0.7 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 6 hours/week
Pros:
1. Pretty easy course to pass
2. Unlimited attempts on homework
Cons:
1. Busywork book report-type assignments
2. Very limited lecture content, pretty much all textbook reading
3. Inactive/abandoned edstem forums
Detailed Review:
This class is pretty much a guided walk through the textbook. As someone who generally doesn't like reading textbooks, I was dreading it. The video lectures run less than 30 minutes per chapter and usually go over a single example. The whole class felt rushed, probably because it was filmed in early 2020. The lack of lecture content was really disappointing; I supplemented the class with David Silver's lectures, which really helped. You're basically paying $1k to read a textbook and fill out multiple-choice answers.
## Assignments
The weekly homework and reading "responses" are easy. You get unlimited attempts on most of the homework, and the grade-school-level reading responses are easy to knock out. The programming assignments also had unlimited attempts but required quite a bit of debugging each time for me. We had several weeks to do them, so I would recommend starting them early. The last two programming assignments required PyTorch, so I would recommend taking NLP and DL before this class, as there's basically zero introduction to PyTorch here.
## Final
I thought the final was fair. The time limit is a bit tight and tests how fast you can put your answers down; I used the entire time and just barely finished. I'd definitely recommend preparing for it. However, if you do all of the homeworks/PAs/reading responses on time, you should have at minimum a 70 in the course heading into the final, so it isn't super high stakes; it just determines your letter grade.
## Support
I generally didn't need a ton of support in this class (I never attended office hours). However, the Ed forums would go deserted for several days at a time with no TA/LF activity. I've never before experienced this level of inattention from course staff: several posts had zero replies or answers, and the programming assignment threads in particular would rarely, if ever, receive replies.
Overall Rating (1.4 / 5): ★☆☆☆☆
Professor Rating (1.4 / 5): ★☆☆☆☆
Lecture Rating (1.4 / 5): ★☆☆☆☆
Difficulty (2.1 / 5):
Workload: 2 hours/week
Pros:
1. Second half of the course is very simple
2. Fair exams
Cons:
1. Overcomplicates basic topics
2. Doesn't cover several important topics
3. HW is a headache
Detailed Review:
My undergraduate class was much better than this course. It's a disappointment in every way. The first half of the course was just convoluted; every topic could have been explained in a better, simpler way than it was presented. It felt like he was reading from the textbook (which was also a terrible choice; try Wackerly). Many important topics weren't covered or were barely touched on, such as the hypergeometric and negative binomial distributions, hierarchical models, mixture models, order statistics, the WLLN, the pdf and cdf methods of transformations, etc. Some appeared only on the HW, in the style of "here's how this is defined, go learn it on your own and come back and answer this question."
The second half of the course was better, but a bit more theory there would've been good. StatKey (the simulation software used) was generally fine, but some questions suffered from StatKey's rounding, to the point that if you calculate the same test statistic by hand, you'll be off.
I ended up not watching many lectures after some point and just worked off my undergrad notes, which were more than sufficient to get close to 100% in the course. Absolutely disappointing; I wouldn't call this graduate level by any means.