Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (3.6 / 5): ★★★★☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (4.3 / 5):
Workload: 16 hours/week
Pros:
1. Professor Mary Parker is very helpful, engaging, and responsive.
2. The students are knowledgeable, and the class community was collaborative and added significant value. Students actively engaged in clarifying content and correcting errors, particularly on Ed discussions.
3. The statistics component is much more application-based, and the class as a whole enhanced my understanding of statistical concepts.
Cons:
1. The probability content leaned heavily on theory, often making it challenging to grasp. The lecture notes and videos are derived directly from the textbook and therefore add little additional value.
2. External resources are needed to supplement learning and understanding.
3. Homework is very theory-based and requires a lot of time to complete.
DSC381 was a challenging yet rewarding course, deepening my understanding of both theoretical and practical aspects of probability and inference. While there were gaps in the execution of the probability section, Mary Parker’s support and the collaborative class environment were invaluable to my learning.
The required textbooks, available online for free, were a mixed experience. The probability textbook was highly theoretical and lacked practical applications, making it difficult to follow. A more balanced textbook or supplementary materials focusing on real-world examples would improve the learning experience.
Mary Parker’s lectures were clear, well-paced, and engaging, providing solid explanations of the material. However, Peter Muller’s lectures were dry and at times unclear, which hindered engagement. More examples and practical applications in these lectures would have made the content easier to understand.
The lecture notes closely mirrored the textbook but did not provide much additional insight. They lacked sufficient context or application-based explanations, which would have been helpful for students struggling with the theory. More detailed, application-oriented notes would have been beneficial.
The workload was substantial but justified by the depth of the material. Study groups were essential for managing the workload and filling in the gaps left by unclear examples in the lecture notes.
Professor Mary Parker's support was excellent: she was always available during office hours and went above and beyond to ensure students understood the material. I highly recommend attending her office hours. There were no office hours for Professor Muller.
Some of the TAs were helpful but not exceptional. While they provided basic support, they were not always effective at addressing common questions. I found attending Professor Parker’s office hours to be more useful.
Grading was delayed, which made it difficult to assess progress in a timely manner. While answer keys for homework and quizzes were provided, detailed feedback on specific mistakes was lacking. Timelier grading and more thorough feedback would help students identify knowledge gaps and improve performance.
Overall, the course content was strong, particularly in the statistics section. However, the probability portion was more theoretical and could have benefitted from more practical applications to align with the applied nature of the statistics content.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (2.1 / 5):
Workload: 13 hours/week
Pros:
1. Felt like I obtained a good understanding of the theory behind RL
2. Overall class is pretty easy, perfect to take during the summer
3. Professors were nice and had an AMA at the very end of the class
Cons:
1. Writing summaries for each chapter was extremely annoying
2. You're graded on your final answer, and it's very easy to mess up the computation purely due to arithmetic errors
3. Yes, I'm saying it again: writing summaries for each chapter was super annoying
Detailed Review:
For one last time: having to write summaries for each chapter made me feel like I was in high school or even middle school again. They should just make the problem sets after each module bigger and have us do those instead. I will admit, though, it actually was interesting to read the book at times, especially when they compare all the techniques. They really only do that occasionally, though; sometimes it gets confusing to follow the logical ordering they're going in, and you have to synthesize the comparisons yourself.
Since RL calculations depend on other RL calculations, if you make one dumb mistake early on, then the rest of your calculations are wrong. There's not really a way around this, but be very, very careful and always double-check your work. You do get multiple tries on the problem sets, which is really nice, but you of course do not get those on the final exam.
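To make that cascading dependence concrete, here is a minimal value-iteration sketch on a made-up 2-state MDP (all numbers are invented for illustration, not from the course): each sweep's values feed into the next sweep's Bellman backups, so an arithmetic slip in an early sweep contaminates everything downstream.

```python
# Value iteration on a tiny made-up 2-state MDP, showing how each
# sweep's values feed into the next sweep's Bellman backups.
GAMMA = 0.9

# transitions[state][action] = list of (probability, next_state, reward)
transitions = {
    0: {"stay": [(1.0, 0, 1.0)], "move": [(1.0, 1, 0.0)]},
    1: {"stay": [(1.0, 1, 2.0)], "move": [(1.0, 0, 0.0)]},
}

def value_iteration(sweeps=50):
    V = {0: 0.0, 1: 0.0}
    for _ in range(sweeps):
        # Every backup below reads the previous sweep's V, so an
        # error in one sweep propagates into all later sweeps.
        V = {
            s: max(
                sum(p * (r + GAMMA * V[s2]) for p, s2, r in outcomes)
                for outcomes in actions.values()
            )
            for s, actions in transitions.items()
        }
    return V

print(value_iteration())  # state 1's reward of 2.0 dominates, so V[1] > V[0]
```

At convergence V[1] settles at 2.0 / (1 - 0.9) = 20 and V[0] at 0.9 * 20 = 18; miscomputing either value in an early sweep would shift every later backup, which is exactly why double-checking each step matters.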
Like I said, my memory is hazy, but I think there was only a single exam, the final. It was pretty fair with a couple tricky questions, but be careful: you have access to pretty much everything, including the lectures and textbook (no LLMs though), but given the number of questions, you definitely do not have enough time to search for all the answers. Funnily enough, being forced to write the summaries helps you a lot here, and so does reviewing the lectures where the professor walks through specific numeric examples (ESPECIALLY the grid ones).
Overall, the class was decently easy. I took it in the summer, so with the shorter timeframe there was always something due every week, namely a chapter summary. Every 3-4 weeks or so there was a programming assignment. They could take a bit to wrap your mind around what to actually do, but they gave you test cases you needed to pass, so it's straightforward at the end of the day.
Finally, I took this class after taking ML and DL. I don't really think the order matters, but I do like that I took it after those (and before NLP). You don't directly need ML for this, but that class exposes you to ML in general, while this class is focused on RL directly. As for DL, there is an assignment or two where you need to make a neural network, and if you have taken DL, it's very easy. Moreover, there was a chapter that touched on the concepts of DL and helped solidify those concepts for me.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (2.9 / 5): ★★★☆☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (3.6 / 5):
Workload: 12 hours/week
Pros:
1. Wide range of topics covered.
2. Challenging homework assignments during first half of course.
3. I enjoyed lectures from both professors despite their different styles. I thought Klivans's lectures were engaging and packed a punch in terms of content covered. Liu's lectures were not as engaging, but he covered example problems in detail that we would later see in homeworks or exams.
Cons:
1. Programming assignments towards end of class were too simple.
2. It seemed like it took a while for an instructor/TA to respond to questions on Piazza.
3. Piazza getting hijacked over use of Panopto. Concerns about Panopto should have been contained in one thread so as not to distract from learning.
4. Textbook was difficult to understand.
Detailed Review:
Fun class for me. Especially the first half. While I had to do quite a bit of learning outside of lectures to catch up, I don't fault the instructor since this is a graduate-level course where we're expected to come in with a math/stats background. I was also unfamiliar with ML terminology, so I had to rewatch some lectures after getting more familiar with it.
Workload was high at the start mostly because I was reviewing math/stats concepts I had forgotten since undergrad while also spending time learning how to set up/learn Python and Latex. I would say I started off spending about 15-20 hours per week in this course which dwindled down to 5 hours per week at the end of the course.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (5 / 5): ★★★★★
Lecture Rating (5 / 5): ★★★★★
Difficulty (3.6 / 5):
Workload: 9 hours/week
Pros:
1. A lot of different techniques covered
2. Both the lecturers taught their content very well
3. Helpful TAs
Cons:
1. Some of the assignments in the first half of the course were hard
2. Difficult midterm exam
3. First half of the course was very fast-paced
Detailed Review:
One of the prerequisites of this course is Optimization, but you don't need it. Apart from that, you will require knowledge of linear algebra, probability, and derivatives. I hadn't taken Optimization before this course, so the first half of this course was very fast-paced for me. Both professors taught their subject material comprehensively, even providing basic reviews of mathematical concepts wherever needed. There were 14 assignments in total, 2 of which had nothing to submit. The assignments made up 70% of the total grade and were peer-graded. There was a mid-term and a final exam, each counting for 15% of the total grade.
The first half of the course was very math-intensive and required a lot of effort on my part to complete the assignments. Some of the assignments involved softmax regression, which wasn't taught during this course, so I had to spend a lot of time researching it and trying to implement it. Assignments were due almost every week, which made this course very fast-paced. The final two assignments for the first half of the course and the mid-term exam came in quick succession, which was very exhausting. The mid-term exam was very hard, even though it was multiple choice.
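For anyone hitting the same gap: softmax regression is just multinomial logistic regression, and a minimal version is not much code. Here is a hypothetical pure-Python sketch on made-up 1-D toy data (this is my own illustration, not the course's actual assignment):

```python
import math
import random

def softmax(zs):
    m = max(zs)                             # subtract max for stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def train(xs, ys, n_classes, lr=0.5, steps=2000):
    """Batch gradient descent on mean cross-entropy.
    Each class gets a (weight, bias) pair for the scalar feature."""
    W = [[0.0, 0.0] for _ in range(n_classes)]
    n = len(xs)
    for _ in range(steps):
        grad = [[0.0, 0.0] for _ in range(n_classes)]
        for x, y in zip(xs, ys):
            p = softmax([w * x + b for w, b in W])
            for k in range(n_classes):
                err = p[k] - (1.0 if k == y else 0.0)  # dL/dlogit_k
                grad[k][0] += err * x
                grad[k][1] += err
        for k in range(n_classes):
            W[k][0] -= lr * grad[k][0] / n
            W[k][1] -= lr * grad[k][1] / n
    return W

def predict(W, x):
    p = softmax([w * x + b for w, b in W])
    return p.index(max(p))

# Made-up separable data: class 0 clusters near 0, class 1 near 3.
random.seed(0)
xs = [random.gauss(0, 0.5) for _ in range(20)] + [random.gauss(3, 0.5) for _ in range(20)]
ys = [0] * 20 + [1] * 20
W = train(xs, ys, 2)
accuracy = sum(predict(W, x) == y for x, y in zip(xs, ys)) / len(xs)
print(accuracy)
```

The assignment versions would of course be vectorized and multi-dimensional, but the gradient (predicted probabilities minus one-hot labels) is the whole trick.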
The second half of the course was a lot lighter than the first half. The professor would explain the possible uses of the online learning algorithms and the assignments were easy to complete. The courseload dropped significantly during the second half.
The course workload for optimization was around 12 hours/week and around 4 to 5 hours/week for online learning.
It was a fairly decent class, with a high workload at the start that eased after the mid-term. The TAs were responsive on Piazza, especially Alex Stoken. I would recommend this class, but you need to manage your time properly, at least for the first half. I am glad that I took this course instead of taking Optimization for an entire semester, because I am not very interested in that kind of material.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (2.9 / 5):
Workload: 11 hours/week
Pros:
1. UT/program/professors actively updating content for this course given changing NLP landscape
2. Manageable workload; front-loaded and predictable
3. Lectures clear and concise
Cons:
1. Lack of in-depth coverage of derivations/theory etc. in lectures / testable and project content
Detailed Review:
Good course, very doable even without having taken other ML courses / any experience in pytorch.
The amount you get from this course will be proportional to the effort put in, specifically if you refer to the research papers referenced by the professor's website. Referring to just the lectures and putting in minimal effort to the projects can still get you an A, but won't truly prepare you for further NLP endeavors without significantly more initiative from the student.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (1.4 / 5):
Workload: 10 hours/week
Pros:
1. First iteration of course is not difficult
2. I learned a bit from the lectures and assignments
3. Quizzes are 20% of your grade, and they're basically a participation grade.
4. Well structured (after kinks are worked out) and a great introduction to DL.
Cons:
1. More difficult material being moved to second iteration of course
2. Assignments (especially 1 & 2) were trivial for those with experience in DL.
Detailed Review:
In Summer 2024 this course was revamped. Going into it, I expected it to be more work than it was, based on previous iterations. The course has now been split into two iterations, so this course is easier now. Depending on your background, it will either be very easy or a manageable learning-curve class that I would recommend to people just entering the program, especially those with limited pytorch background. I have taken NLP, ALA, and now this, and had this course been the same a year ago as it is today, I would have taken it as my first class. A TA mentioned roughly halfway through our semester that some advanced topics have been removed and a second iteration of the course is coming in the future. I think that will be a great (and more difficult) class.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (4.3 / 5):
Workload: 25 hours/week
Pros:
1. You get exposure to a variety of programming languages: C++, CUDA, Go, and Rust
2. You actually get to use said languages and learn a lot about debugging memory, concurrency, and multithreading issues
3. The professors seem like really smart people; hence, the quality of the lectures is really high. Honestly too high because a ton of it went over my head. I've saved them for reference to hopefully go through them one day once I understand Computer Architecture a lot more
4. TAs were AMAZING. Can't stress it enough. Only had 2 TAs managing the entire class and they did an awesome job considering the sheer workload.
5. Never bought the textbook and never needed it, so please don't buy it
Cons:
1. Although the professors are quite intelligent and the lectures are filled with detailed insights, the lectures are rarely useful for the homeworks. For HW1, some of them were good to answer the conceptual questions, but you could mostly bank on that knowledge for the rest of the class. Just look at HW descriptions, and then look at the lecture titles to figure out which ones you need to watch. You can skip the rest. And because the homeworks already took so long, I gained a lot more practical experience from this class, but not as much theoretical or hard knowledge about parallelization as I would have liked.
2. Stemming from the above, the homeworks are difficult, even quite hard at times. I probably should have not chosen this as my first class in the program, and since I am doing this while working full-time, life was pretty rough near homework deadlines. The lectures aren't super useful either so you gotta figure out most of the homework on your own.
3. You need to create graphs and charts and answer conceptual questions in addition to programming. It does give you useful insight into how much overhead matters, but always budget a few hours to gather the data, graph it, and then prepare a typed-up document with everything. Don't skimp either; they really want to see graphs especially. I got lazy on the last HW and didn't make graphs, just charts, and that lost me enough points to push me down to a 93.4, so A- :(
4. *****PLEASE get some exposure to coding in C++, ESPECIALLY understanding when to and how to delete memory AND using debugging tools like gdb. I'm so fortunate my work experience has been fully in C++ so I had a massive leg up, but even then the homeworks took really long. I literally cannot imagine how someone who has never coded in C++ and only done web development work would be able to finish these homeworks in a reasonable amount of time. The prior knowledge IMMENSELY helped with coding, compiling, and debugging weird errors. Otherwise, get ready to spend even more time.
Detailed Review:
HW was due every 3 weeks. I told myself I would start on them early, and sorta did towards the end, but let's be honest, I waited till a few days before the due date to do most of the work. As such, it's difficult to give an average workload per week. The last homework was the easiest; the CUDA one was the hardest. A breakdown and tips for each:
HW1: Google and watch some videos on prefix sum. It was really annoying to figure out the general parallel case when you don't have a power of 2 for your input or threads. To save you a MASSIVE headache: the referenced paper has an algorithm written in pseudocode, and that's exactly what you need. I was confused, thinking it wouldn't work without recursion when you have a very large input - not going to say more, just try some handwritten examples and you'll understand how it works and that it's what you need. For the barrier implementation and conceptual questions, the lectures are pretty useful; just find the right ones.
HW2: By far the hardest homework, especially because of that really annoying Thrust implementation. I straight up ran out of time on that part. And you have to code everything from scratch here. Start very early, and get each piece working. The official CUDA tutorials from Nvidia are fantastic. The shared memory implementation is actually very easy. Save the Thrust implementation for the very end, after you have everything else done, including the writeup. I was really annoyed I couldn't figure it out in time, so I worked on it after the semester ended and figured some stuff out. It makes you think in a different way, which is why it's harder. You can't naturally store multidimensional arrays in CUDA as you're used to - you have to assume they're laid out as 1D. There's some Stack Overflow answer out there which finally made it click: you can use constant iterators in Thrust along with modulus, counting, and other operators to represent index arrays like 1, 2, 3, ..., k, 1, 2, 3, ..., k, ... and 1, 1, ..., 1, 2, 2, ... Once you've done all of the work and figured out this insight, it should make sense from there. Ask the TAs about the tools you need to answer some of the conceptual questions.
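The flattened-layout idea is easier to see outside CUDA. Here is a plain-Python illustration (made-up numbers, my own sketch) of how a counting sequence plus modulus and division recovers row and column indices from a 1D buffer, which is conceptually what combining Thrust's counting iterators with those operators does:

```python
# A 3x4 matrix stored flat in row-major order, as CUDA/Thrust code
# typically assumes multidimensional data is laid out.
rows, cols = 3, 4
flat = list(range(rows * cols))                  # [0, 1, ..., 11]

# A counting sequence (0, 1, 2, ...) combined with modulus and
# integer division yields per-element column and row ids - the
# repeating index arrays described in the review.
col_of = [i % cols for i in range(rows * cols)]  # 0,1,2,3,0,1,2,3,...
row_of = [i // cols for i in range(rows * cols)] # 0,0,0,0,1,1,1,1,...

# Example use: reduce_by_key-style row sums, keyed by the row ids.
row_sums = [0] * rows
for i, v in enumerate(flat):
    row_sums[row_of[i]] += v
print(row_sums)  # → [6, 22, 38]
```

In Thrust the same index streams would be built lazily from fancy iterators rather than materialized as Python lists, but the arithmetic is identical.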
HW3: The Go tutorial from Google is really good. The hardest part is making a concurrent buffer; there are some good YouTube videos on how to do this in C++. Follow the logic and port it to Go.
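The concurrent-buffer logic those videos walk through is the classic bounded buffer. A minimal Python sketch of the pattern (my own illustration; in Go the same structure maps onto a mutex plus sync.Cond, or can often be replaced outright by a buffered channel):

```python
import threading
from collections import deque

class BoundedBuffer:
    """Classic condition-variable bounded buffer: producers block when
    full, consumers block when empty. Both conditions share one lock."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = deque()
        lock = threading.Lock()
        self.not_full = threading.Condition(lock)
        self.not_empty = threading.Condition(lock)

    def put(self, item):
        with self.not_full:
            while len(self.items) == self.capacity:
                self.not_full.wait()        # block until space frees up
            self.items.append(item)
            self.not_empty.notify()

    def get(self):
        with self.not_empty:
            while not self.items:
                self.not_empty.wait()       # block until an item arrives
            item = self.items.popleft()
            self.not_full.notify()
            return item

buf = BoundedBuffer(2)
results = []
consumer = threading.Thread(
    target=lambda: results.extend(buf.get() for _ in range(5)))
consumer.start()
for i in range(5):
    buf.put(i)                              # blocks whenever buffer is full
consumer.join()
print(results)  # → [0, 1, 2, 3, 4]
```

The while-loop re-check around each wait() is the part people most often get wrong when porting between languages.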
HW4: The second hardest homework, but mainly because I lost a lot of time trying to learn Rust. Definitely use the official tutorial, but you don't really need to go through too much of it. It's a little daunting, but I would figure out the language as you do the homework. The important part to understand is Rust's ownership. I read through that section a few times and it was still unclear, but it finally clicked when programming the homework, especially with the compiler errors. Oh yeah, figure out tuples for the send and receive channels. The IpcOneShotServer documentation was pretty sparse; it gets you part of what you need. To save you a MASSIVE headache here: one side made a working channel, and the other side has this and made a new working channel. Well, you need to let the first side know about this new channel. How the heck do you do that? Just send the new channel over the working channel!
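That bootstrap handshake is easier to see in miniature. Here is a hypothetical Python sketch with thread-safe queues standing in for Rust's ipc channels (names invented for illustration): the one pre-shared channel is used exactly once, to send a brand-new channel object across.

```python
import queue
import threading

bootstrap = queue.Queue()   # the one channel both sides already share
received = []

def server():
    # Server side: create a fresh private channel, then send the
    # channel object itself across the shared bootstrap channel.
    private = queue.Queue()
    bootstrap.put(private)
    received.append(private.get())          # communicate over it from now on

t = threading.Thread(target=server)
t.start()
private = bootstrap.get()                   # client learns the new channel...
private.put("hello over the new channel")   # ...and uses it
t.join()
print(received)
```

Rust's ipc-channel crate adds ownership and serialization concerns on top, but the "send the channel over the channel" move is the same idea.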
HW5: You're pretty used to the course at this point, so it's straightforward. I have an engineering background, which definitely made it easier, but just remember some basics from your mechanics class. Also, I never needed the 3D visualizations, so I wouldn't waste time on them. I think in one of the last lectures the professor shows part of his MPI code; that was pretty helpful. There are no explicit instructions for the conceptual questions since they assume you know the drill by now. By this point I was kinda burnt out from cramming to finish these homeworks last minute, so I figured I could just do charts instead of graphs this time. Please don't do that; you'll get docked a lot of points. Just make some nice visual graphs.
If you are working full time, I would not take this class with another class.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (2.9 / 5):
Workload: 10 hours/week
Pros:
1. Prof. Constantine's explanations are excellent
2. Really helps with understanding how ML actually works under the hood
3. Homework assignments really helped to solidify concepts
Cons:
1. Prof. Sanghavi's lectures were dry and sometimes hard to follow
2. Some topics covered once in lecture and never seen again
3. Pacing was a little off - algorithms felt rushed, LPs dragged on
Detailed Review:
I thought this was a solidly above average course, but it isn't for everyone. This is a math-heavy theory course. If you would rather focus on applications, this probably isn't for you. I think the first half of ML is a good comparison - if you found that interesting, you'll probably like this. If you didn't like ML, you'll hate Optimization.
It helps to have a good grasp on linear algebra (particularly eigenvalues and eigenvectors) going into the class. There is some opportunity to pick it up if you don't, but a lot of the math revolves around that, especially after the first exam.
Lectures are ok. This is a math class, after all; it's hard to make them super interesting. The general rhythm is Prof. Sanghavi giving a formal overview with proofs and then Prof. Constantine diving deeper into key concepts with worked examples. Prof. Sanghavi's lectures are decent if a little dry, and he sometimes moves pretty quickly. There are also a lot of errors in his typed slides (the downloadable PDFs - the handwritten slides in the videos are mostly fine).
Prof. Constantine doesn't cover as much breadth, but he spends a lot of time explaining concepts in detail. His lectures were the ones I re-watched to do homework problems and study for exams. My only complaint is that Prof. Constantine disappears in the last couple of weeks - I honestly had a less solid grasp on the material without him.
I thought the textbook was useful, but it's not an easy read. The value is that it gives slightly different explanations and examples for a lot of topics, and there were definitely a couple of exam questions I got right because I had read the relevant section of the book. You could get through the course without it, and I definitely didn't read the whole thing, but it helped in several cases when I didn't quite get an idea presented in lecture. And it's free online anyway.
The exams in this class are pretty challenging and make up 70% of the grade combined, but there is a significant curve at the end of the semester. A lot of the questions are fairly tricky and will test your ability to think through multiple details of a problem. Homeworks are a mix of straightforward problems and some slightly tricky, nuanced applications. The good news is you'll get full credit as long as you make a reasonable attempt at a problem, even if you don't get it right. Peer grading is a little annoying, but not any worse than ML. Plus almost everyone gets the full 5% for grading quality, so it's a nice boost to your grade at least.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (4.3 / 5): ★★★★☆
Difficulty (2.1 / 5):
Workload: 15 hours/week
Pros:
1. Good breadth of information related to Android programming
2. The projects ramp up in difficulty in a nice way; while later projects can be a bit time-consuming, their actual difficulty never gets too bad
3. If you already have a good amount of CS or SWE experience, the concepts in this class aren't too difficult to wrap your mind around
4. The professor's pretty cool; his lectures were fun to watch, and he explains a lot while keeping it lighthearted
Cons:
1. As mentioned earlier, while the actual DIFFICULTY of the concepts isn't bad, the time commitment for the course is pretty substantial. I have nearly 3.5 years of SWE experience, so many of the things we implemented weren't necessarily difficult to do, but navigating around Android Studio and Kotlin, understanding the project requirements, and then actually coding it out took a good chunk of time for me near the end of the semester
Detailed Review:
Overall, if you have prior SWE experience coming into this class, you can expect it to be easier than most other classes you've probably taken. Keep in mind, however, the time commitment is not as low as you may have expected. There's a good amount of time to be put into the class, though the actual material itself is not challenging per se.
Overall Rating (3.6 / 5): ★★★★☆
Professor Rating (4.3 / 5): ★★★★☆
Lecture Rating (2.9 / 5): ★★★☆☆
Difficulty (2.1 / 5):
Workload: 8 hours/week
Pros:
1. Both professors were extremely responsive and helpful on Piazza. Went out of their way to make extra materials available to help with certain topics. They genuinely care about helping their students to learn the material and succeed. Very lenient in regards to dropping hw/quiz grades, allowing extra time for HWs due to the Texas Freeze, etc.
2. The workload is pretty light in general, depending on your prob/stats background. Homeworks took some time and gave problems that required more digging, while the quiz/exam questions were more straightforward.
3. No video proctoring required; these instructors trust their students to follow the honor code.
4. A lot of the material is review of undergrad prob/stats that is a prerequisite for this degree, so most students will find this to be an easy course to get started in the program.
5. Getting to do statistics via simulation was the most interesting part of the course for me, as I had only ever been taught the theoretical methods as an undergrad. On many problems, it was easy to write some code to simulate it, and the instructors were perfectly fine with us using that method.
Cons:
1. The number of errors in the homework problems and lecture videos was really excessive. I actually started waiting longer to begin the homeworks each week so that other students would find all the errors before I wasted my time on them. Some of this will probably get cleaned up in future semesters. But I feel like if you make errors in a lecture video, you should at least take the time to edit or re-record that part of the video. It made the class feel pretty amateurish, not like a course in a top graduate program.
2. Multiple choice is a terrible grading method for a graduate course. I understand there are limitations due to the number of people taking these classes, so they all use multiple choice and/or peer grading, but it would be a huge improvement if UT would cough up a little money to hire some graders.
3. StatKey is a neat tool that is really easy to use, but I felt like a lot of the stats part of the course was "learn to use StatKey", and it would've been better if we had some projects where we coded simulations ourselves in more widely used languages like Python or R. At the end of the class, we can do ANOVA simulations, etc. by just clicking a couple buttons in StatKey but I think most students would have no idea how to do them without StatKey.
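For what it's worth, the kind of randomization StatKey performs for ANOVA is short to code by hand. Here is a hypothetical Python sketch of a permutation test on made-up group data, using a simple max-minus-min of group means as the statistic instead of an F-statistic; the shuffling logic is the same either way.

```python
import random

# Permutation-test analogue of one-way ANOVA: shuffle group labels
# and count how often the between-group spread is at least as large
# as the observed spread. All data here is made up for illustration.
random.seed(1)
groups = {
    "A": [5.1, 4.8, 5.5, 5.0],
    "B": [5.2, 5.0, 4.9, 5.3],
    "C": [6.4, 6.1, 6.6, 6.0],   # visibly higher than A and B
}

def spread_of_means(samples):
    means = [sum(g) / len(g) for g in samples]
    return max(means) - min(means)   # simple between-group statistic

sizes = [len(g) for g in groups.values()]
pooled = [x for g in groups.values() for x in g]
observed = spread_of_means(groups.values())

count = 0
n_perms = 10_000
for _ in range(n_perms):
    random.shuffle(pooled)           # break any real group effect
    regrouped, start = [], 0
    for n in sizes:
        regrouped.append(pooled[start:start + n])
        start += n
    if spread_of_means(regrouped) >= observed:
        count += 1

p_value = count / n_perms
print(f"observed spread: {observed:.3f}, permutation p-value: {p_value:.4f}")
```

Because group C sits well above A and B, the observed spread is rarely matched under shuffling and the p-value comes out small, mirroring what a couple of button clicks in StatKey would report.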