Semester | fall semester 2024 |
Course frequency | Irregular |
Lecturers | Aurelien Lucchi (aurelien.lucchi@unibas.ch, Assessor) |
Content | The class focuses on the theoretical concepts behind deep learning. We will discuss the following topics: general introduction (linear networks, activations, etc.), approximation theory, complexity theory, network architectures, optimization, optimization landscape of neural networks, neural tangent kernel, regularization, generalization bounds, and adversarial examples. |
Learning objectives | The main goal is to understand the theoretical foundations of deep learning. This includes the following important concepts: - Universal approximation: can a neural network approximate any arbitrary function? - Optimization: how do we optimize the parameters of a neural network? What theoretical guarantees do we have about finding a good solution? - Generalization: under what conditions does the solution of a neural network generalize to unseen data? - Adversarial robustness: how robust is a neural network to adversarial attacks? |
Comments | Exercise sessions will start in the second week of the semester. There will be a lecture instead on Monday, September 16. |
Admission requirements | - Machine learning (classification, regression, kernels, etc.) - Linear algebra - Calculus and basic concepts in topology - Probability theory (random variables, expectation, density, etc.) - Some non-mandatory exercises will require coding in Python (reasonable coding skills in another programming language should be sufficient to learn Python). Note that a significant part of the class focuses on understanding theoretical aspects; we will thus cover proofs that require a good knowledge of the mathematical concepts listed above. |
Language of instruction | English |
Use of digital media | No specific media used |
Interval | Weekday | Time | Room |
---|---|---|---|
weekly | Monday | 10.15-12.00 | Bernoullistrasse 30/32, Hörsaal 103 |
weekly | Thursday | 16.15-18.00 | Spiegelgasse 5, Seminarraum 05.001 |
Modules | Doctorate Computer Science: Recommendations (PhD subject: Computer Science); Module: Applications of Distributed Systems (Master's Studies: Computer Science); Module: Applications of Machine Intelligence (Master's Studies: Computer Science); Module: Concepts of Machine Intelligence (Master's Studies: Computer Science); Module: Electives in Data Science (Master's Studies: Data Science); Module: Machine Learning Foundations (Master's Studies: Data Science); Module: Methods of Machine Intelligence (Master's Studies: Computer Science) |
Assessment format | continuous assessment |
Assessment details | Continuous assessment with the following split: 15% continuous assessment (short exercises and Q&As given in class), 20% homework, 30% project (write-up and presentation), 35% written exam. A 50% score on the homework sets is required to participate in the final exam. A score of 3 out of 6 is required at the exam to pass the class. Expected exam date: Thursday, February 1, 2024, 10-12 a.m. |
Assessment registration/deregistration | Registration: course registration; deregistration: cancel course registration |
Repeat examination | no repeat examination |
Scale | 1-6, in increments of 0.5 |
Repeated registration | as often as necessary |
Responsible faculty | Faculty of Science, studiendekanat-philnat@unibas.ch |
Offered by | Fachbereich Informatik |