ECTS: 8
Semester hours: 6
Exam mode: Written
Language: En
Links: CIT, Courses
Exams: 2022WS Endterm
2021WS Endterm
2020WS Endterm, Retake
2019WS Endterm
2018WS Endterm, Retake
2017WS Endterm
2014WS Retake
2013WS Endterm, Retake

Reviews

Hello, is there somebody here who has passed the lecture and feels confident enough to teach me ML in an understandable way? I would also pay you!

  • From experience I can tell you the exams are totally different from the actual lectures. If you really want to learn and not just pass, try to code the topics. The tests are very practical, and almost no mathematics (or actually none, that I remember) was present when I took it.
  • Have a look at the contents of the lecture, find a blog with some code, and try it yourself. That is, in my opinion, the best way to learn it (a minimal made-up example of this approach follows below).
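
    To make the coding advice above concrete, here is a minimal sketch of one typical lecture topic (least-squares linear regression) coded from scratch in numpy. The data and all numbers are made up purely for illustration:

      import numpy as np

      # synthetic data: y = 2x + 1 plus a little noise
      rng = np.random.default_rng(0)
      X = rng.uniform(-1.0, 1.0, size=(50, 1))
      y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.standard_normal(50)

      # add a bias column and solve the least-squares problem
      A = np.hstack([X, np.ones((50, 1))])
      w, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(w)  # close to [2, 1]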

Has anyone here taken the Machine Learning exam and can share how it was?

  • Really easy.
  • Now for advice: don’t try to solve these questions by googling. These are very easy questions; try the homework questions after this, because the retake should be harder.

Well, that ML retake exam was one astonishingly large pile of bullshit.

  • Someone took notes from the first exam and uploaded them to unistuff.org. It seemed pretty doable to me.

    I did the retake and that one was rather “surprising”. Two theoretical questions (which were actually 1:1 from the mock exam: the probability one with the coins and the derivation of the ELBO for variational inference; the latter is sketched after this comment). Other than that it asked for much more practical understanding than the exercises would have suggested to me. In one about k-NN, you had to transform the feature space in a way that reduces misclassifications. There was one where you had to apply k-means by hand on a small grid and show the result. Some understanding question about why you can’t reconstruct the input exactly from the output of an autoencoder. You had to design a neural network that produces a certain function. I think one about building decision trees…

    After the exam I felt like I would have been better off really studying the lecture slides in detail instead of investing so much time into the exercises and homework.
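
    Since the ELBO derivation apparently recurs (it was 1:1 from the mock exam), here is the standard short version, assuming a latent-variable model p(x, z) and a variational distribution q(z) (the notation is mine, not necessarily the lecture’s), written in LaTeX:

      \log p(x) = \log \int p(x,z)\,dz
                = \log \mathbb{E}_{q(z)}\!\left[\frac{p(x,z)}{q(z)}\right]
                \ge \mathbb{E}_{q(z)}\big[\log p(x,z) - \log q(z)\big]
                =: \mathrm{ELBO}(q) \qquad \text{(by Jensen's inequality)}

    The gap in the inequality is exactly \mathrm{KL}(q(z)\,\|\,p(z \mid x)), which is why maximizing the ELBO over q both tightens the lower bound on \log p(x) and pulls q towards the true posterior.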

Did someone take the ML retake exam last semester and write down the exam questions?

  • It was much tougher than the main exam. Questions were like: doing a backward pass on a network. Finding the number of decisions required to build an optimal decision tree if your data has M classes and N features. Will doing a gradient descent update reduce the loss? (Answer: no.) Will doing a gradient descent update always reduce the loss if the step size is small enough? (Answer: no; what if you are at the global optimum?) A quick numerical check of the gradient descent answers is sketched at the end of this list.
  • Doing 1-NN on a set of points labelled as circles and crosses. The circles were above y=3 and the crosses below y=3.

    They also asked how you can preprocess this data so that classification using 1-NN improves (see the 1-NN sketch at the end of this list).

    Three variational numericals: two were from the textbook/slides, but one was, I think, way too complex for anyone to do in exam time. It was like an exercise question from the books.

    There was one stupid question where a few points were laid on the lines y=0 and y=2, between x=0 and x=10. They asked how k-means would go if you chose 2 cluster centers, one left of the x=5 line and one right of it (a quick simulation of this setup is sketched at the end of this list). This is not exact recall, but I think you might find this question in the ML books, Christopher (Bishop) or Murphy.

    One question on the convexity of functions.

    You had f(x) = a(x) * b(x) + c(d(x)). For a, b, c, and d, five candidate functions were given, and you had to tell which function should be assigned to each of a, b, c, and d. Here you had to compute the second derivative of each candidate and find out where it should be used. I remember two of the functions being e^x and e^(-x); both have a positive second derivative, and they initially graded this part incorrectly for all students. Later, they did a regrading. (A small symbolic check of the second derivatives is sketched below.)
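
    For the gradient-descent questions in the first bullet above, a quick numerical check makes both answers concrete. A minimal made-up one-dimensional sketch in Python:

      # loss f(x) = x^2 with gradient f'(x) = 2x
      f = lambda x: x ** 2
      grad = lambda x: 2 * x

      # too large a step size: the update overshoots and the loss goes UP
      x = 1.0
      step = 1.5
      x_new = x - step * grad(x)
      print(f(x), f(x_new))  # 1.0 -> 4.0

      # at the global optimum the gradient is zero, so no step size,
      # however small, can reduce the loss any further
      x = 0.0
      print(x - 0.001 * grad(x), f(x))  # x stays 0.0, loss stays 0.0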
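
    The 1-NN question also lends itself to a quick sketch. The coordinates below are made up, but they show the point of the preprocessing: if one feature (here x) is irrelevant noise, it can dominate the Euclidean distance, and dropping or down-weighting it improves 1-NN:

      import numpy as np

      def one_nn(train_X, train_y, query):
          # label of the training point closest to the query
          d = ((train_X - query) ** 2).sum(axis=1)
          return train_y[d.argmin()]

      # circles above y=3, crosses below, scattered in x
      X = np.array([[0.0, 4.0], [1.0, 5.0], [8.0, 1.0], [9.0, 2.0]])
      y = np.array(["circle", "circle", "cross", "cross"])

      q = np.array([9.0, 4.0])           # above y=3, so it should be a circle
      print(one_nn(X, y, q))             # 'cross': the x distance dominates
      print(one_nn(X[:, 1:], y, q[1:]))  # 'circle': keeping only y fixes it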
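
    For the k-means initialization question (and the earlier reviewer’s “apply k-means by hand on a small grid”), a plain Lloyd’s-algorithm sketch reproduces what the exam was after. The data mirrors the description above; the implementation is a generic textbook one, not necessarily the lecture’s:

      import numpy as np

      def kmeans(points, centers, iters=10):
          # Lloyd's algorithm; assumes no cluster ever becomes empty
          for _ in range(iters):
              d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
              assign = d.argmin(axis=1)
              centers = np.array([points[assign == k].mean(axis=0)
                                  for k in range(len(centers))])
          return centers, assign

      # points on the lines y=0 and y=2, x = 0..10
      xs = np.arange(11.0)
      pts = np.vstack([np.c_[xs, np.zeros(11)], np.c_[xs, 2 * np.ones(11)]])

      # one initial center left of x=5, one right of it: k-means converges
      # to a left/right split rather than the two horizontal rows
      centers, assign = kmeans(pts, np.array([[2.0, 1.0], [8.0, 1.0]]))
      print(centers)  # ~[[2.5, 1.0], [8.0, 1.0]]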
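
    And for the convexity question, the second-derivative check described above is easy to reproduce symbolically. The only two candidates I can take from the comment are e^x and e^(-x); the other functions here are made up for illustration:

      import sympy as sp

      x = sp.symbols('x', real=True)
      candidates = [sp.exp(x), sp.exp(-x), x**2,
                    sp.log(1 + sp.exp(x)), sp.sin(x)]

      for g in candidates:
          print(g, sp.diff(g, x, 2))
      # exp(x) and exp(-x) have second derivatives exp(x) and exp(-x),
      # both strictly positive, so BOTH are convex -- the fact the
      # graders initially missed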