Applied Machine Learning 2026

"Despite the connotations of machine learning and artificial intelligence as a mysterious and radical departure from traditional approaches, we stress that machine learning has a mathematical formulation that is closely tied to statistics, the calculus of variations, approximation theory, and optimal control theory."
[Introduction to Machine Learning, Particle Data Group (pdg.lbl.org) 2025]

Troels C. Petersen, Lecturer (Associate Prof.), High Energy Physics, phone 26 28 37 39, Mac user, Course responsible, petersen@nbi.dk, Office: NBB-3-I-034
Daniel Murnane, Teacher (DDSA Fellow), High Energy Physics, phone 93 83 89 58, Windows & Linux expert, Continuity responsible, daniel.murnane@nbi.ku.dk
Johann "Janni" Nikolaides, Teaching assistant (Ph.D.), Neutrino Physics, phone 31 50 23 59, Mac & Linux expert, Initial Project responsible, johann.nikolaides@nbi.ku.dk
Gabriela Oliveira, Teaching assistant (Ph.D.), Quantum Information, phone 50 34 90 29, Windows expert, Final Project responsible, maria.oliveira@nbi.ku.dk
Clement Cherblanc, Teaching assistant (Ph.D.), Weather/Climate (DMI), phone 20 26 76 48, Linux expert, Data responsible, clc@dmi.dk
Yousra Farhani, Teaching assistant (Ph.D.), Quantum Computing, phone 71 38 03 61, Mac & Windows expert, Quantum ML responsible, yousra.farhani@nbi.ku.dk
Marc Jacquart, Teaching assistant (Ph.D.), Neutrino Physics, phone 93 95 58 67, Mac & Linux expert, GitHub responsible, marc.jacquart@nbi.ku.dk


What, when, where, prerequisites, books, curriculum and evaluation:
Content: Graduate course on Machine Learning and application/project in science (7.5 ECTS).
Level: Intended for students at graduate level (4th-5th year) and new Ph.D. students.
Prerequisites: Programming experience (essential, preferably in Python) and Math (calculus and linear algebra).
When: Mondays 13-14 / 14-17 and Wednesdays 9-10 / 10-12 & 13-14 / 14-17 for lectures/exercises (Week Schedule Group C).
Where (lectures): Mondays and Wednesdays: Lille UP1 at DIKU,
Where (exercises): Mondays and Wednesdays: HCO A104-107, see KU Room Schedule plan.
Format: Shorter lectures followed by computer exercises and discussion with emphasis on application and projects.
Text book: References to (the excellent!) Applied Machine Learning by David Forsyth.
Suppl. literature: We (i.e. you) will make extensive use of online ML resources, collected for this course.
Programming: Primarily Python 3.12 with a few packages on top, though this is an individual choice.
Code repository: All code we provide can be found in the AppliedML2026 GitHub repository.
Communication: All announcements will be given through Absalon. To reach me, Email is preferable.
Initial Project: Initial project (a la Kaggle competition) to be submitted Sunday the 17th of May at 22:00.
Final Project: Final project (Exam) presentations on Wednesday the 10th (all day) and Thursday the 11th of June (for as long as needed).
Evaluation: Initial project (40%), and final project (60%), evaluated by lecturers following the Danish 7-step scale.


"People often say that data is the new oil, and it's not. The rare asset is what TO DO with all this data, what's actionable - this is the power of AI."
[Matt Wilson, at Google]


Before course start:
An introduction to the course can be obtained from this ML subject overview and the related film introducing the course subjects (23 min, 1.48 GB).
Specific course information can be found here: ML2026_CourseInformation.pdf
To help us get to know you, and optimise the course accordingly, please fill in the course questionnaire.
To test your "Python & Packages" setup, you can try to run ML_MethodsDemos.ipynb (which is also meant to whet your appetite).
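If you want a quick command-line sanity check before opening the notebook, something along these lines works; note that the package list below is illustrative, not the official course requirements:

```python
# Quick sanity check of a Python/ML environment.
# NOTE: the package names below are illustrative examples, not the
# course's official requirements -- adjust to match the notebook.
import importlib
import sys

def check_packages(names):
    """Return {name: version string or None} for each package name."""
    out = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            out[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            out[name] = None
    return out

if __name__ == "__main__":
    results = check_packages(["numpy", "scipy", "matplotlib", "pandas", "sklearn"])
    print(f"Python {sys.version.split()[0]}")
    for name, version in results.items():
        status = f"OK (version {version})" if version else "MISSING -- try 'pip install'"
        print(f"  {name:12s} {status}")
```

Any package reported as MISSING can typically be installed with pip or conda before the first exercise session.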



Course outline:
Below is the preliminary course outline, subject to possible changes throughout the course.

Week 1 (Introduction to course and Machine Learning concepts. Tree and Neural Network learning):
Apr 20: 13:15-17:00: Intro to course, outline, groups, and discussion of data and goals. Overview of Machine Learning and techniques (TP).
     Exercise: Inspecting data and making a "human" decision tree and a linear (Fisher) discriminant. Also, setup of Python, GitHub, Slack, etc.
Apr 22: 9:15-12:00: Introduction to ML concepts, loss functions, training, cross validation, and tree-based algorithms (TP).
     Exercise: Setting up. Classification on reference data sets with Boosted Decision Tree based methods.
Apr 22: 13:15-17:00: Introduction to NeuralNet-based algorithms (TP).
     Exercise: Classification (and regression) on reference data sets with Neural Net based methods.

Week 2 (Initial project kickoff, Hyper Parameter optimisation, Feature Importance, Introduction to unsupervised learning and clustering):
Apr 27: 13:15-17:00: Hyperparameters, Overtraining, and Early stopping (TP).
     Exercise: Hyperparameter optimisation of simple tree and NN algorithms.
Apr 29: 9:15-12:00: Initial project kickoff. Feature Importance calculated using permutations and Shapley values (TP).
     Exercise: Determine feature ranking for reference data sets, and cross check these with actual models.
Apr 29: 13:15-17:00: Introduction to Unsupervised Learning: Clustering and Nearest Neighbor algorithms (TP).
     Exercise: Try to apply the k-NN (and other) algorithms to reference data sets.
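As a warm-up for the k-NN exercise, the idea can be sketched from scratch before reaching for a library implementation; the toy data below is invented for illustration and is not one of the course's reference data sets:

```python
# Minimal k-nearest-neighbours classifier from scratch, to illustrate
# the idea before using a library implementation on real data.
# The toy points and labels below are invented for illustration only.
from collections import Counter
import math

def knn_predict(train, labels, point, k=3):
    """Classify `point` by majority vote among its k nearest training points."""
    # Sort training points by Euclidean distance to the query point.
    dists = sorted((math.dist(x, point), y) for x, y in zip(train, labels))
    # Majority vote among the k closest.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Tiny toy data: two well-separated clusters in 2D.
train = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (3.0, 3.0), (3.1, 2.9), (2.9, 3.2)]
labels = ["A", "A", "A", "B", "B", "B"]

print(knn_predict(train, labels, (0.1, 0.1)))  # → A
print(knn_predict(train, labels, (3.0, 3.1)))  # → B
```

The choice of k is the main knob: small k gives a flexible but noisy decision boundary, large k a smoother one, which ties directly into the hyperparameter discussion from earlier in the week.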

Week 3 (Convolutional Neural Networks (CNNs), Graph Neural Networks (GNNs), and Dimensionality reduction):
May 4: 13:15-17:00: Convolutional Neural Networks (CNNs) and image analysis (DM).
     Exercise: Recognize images (MNIST dataset and insolubles from Greenland ice cores) with a CNN.
May 6: 9:15-12:00: Graph Neural Networks (GNNs) and geometric learning (DM). IceCube example (TP).
     Exercise: Work on classic GNN example data (TBD).
May 6: 13:15-17:00: Dimensionality reduction with introduction to t-SNE and UMAP algorithms (TP).
     Exercise: Apply dimensionality reduction on wine and astronomical data, and work on initial project.

Week 4 (AutoEncoders and anomaly detection, Time-Series and Transformers, and Final Project kickoff):
May 11: 13:15-17:00: (Variational) AutoEncoders and anomaly detection (TP). Preparing groups and subjects for Final Project.
     Exercise: AutoEncoding the MNIST dataset and possibly detecting anomalies in the data sample.
May 13: 9:15-12:00: Time-series, Transformers, and Natural Language Processing (NLP) (Inar Timiryasov).
     Exercise: Predict future flight traffic and do NLP on IMDB movie reviews.
May 13: 13:15-17:00: Final projects kickoff. Discussion of projects and how to work on them (TP).
     Exercise: Getting, plotting and planning on final project data and discussion of project goals.

Initial project should be submitted Sunday (17th of May) by 22:00 on Absalon!

Week 5 (GPU acceleration, Summary of Curriculum, and Foundation Models):
May 18: 13:15-17:00: GPU accelerated data analysis - Rapids (Mads Ruben Kristensen, formerly NBI, now NVIDIA).
     Exercise: Work on final project.
May 20: 9:15-12:00: Example of CNN at work on beer, AutoEncoders at work on food, and environmental techniques in ML (Carl Johnsen).
     Exercise: Work on final project.
May 20: 13:15-17:00: Fast data loaders and Foundation Models (Inar Timiryasov + TP).
     Exercise: Work on final project.

Week 6 (Diffusion and Reinforcement Learning, Domain shifts, and example exam presentation):
May 25: 13:15-17:00: No teaching (Whit Monday).
     Exercise: Work (possibly at home) on final project, or not.
May 27: 9:15-12:00: Diffusion Models, Reinforcement Learning, and Discussion of industry cases (TP).
     Exercise: Work on final project.
May 27: 13:15-17:00: Domain shifts and Hybrid/Adversarial training (TP). Exam format and example exam presentation (TBC).
     Exercise: Work on final project.

Week 7 (Results and Feedback on initial project, Ethics in ML, new ML developments, and Scientific ML):
Jun 1: 13:15-17:00: Results and Feedback on initial project. Ethics in Machine Learning usage (TP).
     Exercise: Work on final project.
Jun 3: 9:15-12:00: New developments in Machine Learning (optimal transport, diffusion models, generative production chains) (Malte Algren + DM).
     Exercise: Work on final project.
Jun 3: 13:15-15:00: Scientific Machine Learning - modelling data with differential equations and ML (Christian Michelsen, TICRA).
     Exercise: Work on final project.

Week 8 (Bonus self study and... Exam):
Jun 8: 13:15-17:00: No teaching. Potentially, work on final project.
     Bonus (self) study (by video): Infrastructure, Networks, Scaling, and Speed (Brian Vinter).
Jun 10: 9:00-12:00: Presentations of final projects (TP, DM, JN, NP, AA, and potentially others!).
Jun 10: 13:00-17:00: Presentations of final projects (continued).
Jun 11: 9:00-12:00: Presentations of final projects (TP, DM, JN, NP, AA, and potentially others!).
Jun 11: 13:15-17:00: Presentations of final projects (if needed!) (continued).




Presentations from previous years


Course comments/praise (very biased selection!):
"Best day of my life!" (Presumably at the University, ed.)
[Christian M. Clausen, on the day of final project presentations, 2019]

"Student 1: Damn..."
"Student 2: I was just thinking what a shame you didn't get to see a whole classroom worth of 'damn' faces! But the feeling is there."

[Reaction in Zoom chat, after having explained the capabilities of Reinforcement Learning exemplified by AlphaZero, 2020]
[Fortunately, I got to see the reaction the year before!]

"Troels is the perfect shepherd guiding relatively inexperienced statisticians to machine learning in an approachable and fun way."
[Anon, course evaluation, 2021]

"This course (and Applied Statistics) were among the most useful and insightful courses I have taken in my academic life."
[Petroula Karakosta, 2022]

"I applaud the delivery with hands-on tutorial sessions, supported by overview lectures. The assessments excellently supported the learning with the initial project helping us get over the initial bump, and the group project showing us how to apply ML to our own interests. 5/5 stars!"
[Alice Patig, Ph.D. student at DTU, 2023]

"I have really enjoyed working on the final project, as it becomes super clear how important data preparation is. I also find that we discuss possibilities of using almost every ML method we've covered to tackle different issues in preparing, handling, and evaluating data."
[Anon, course evaluation, 2024]

"Thank you for an altogether fantastic course."
[Laurits Moeberg, first mail after course exam 2025]

"This Applied Machine Learning course ended up being one of the most valuable learning experiences I’ve had. I learned a lot without ever feeling overwhelmed or under pressure. The mix of clear theoretical explanations and practical projects helped me build real confidence in applying machine learning techniques. I genuinely feel like the 'queen of regression' now."
[Anon (female?), course evaluation, 2025]



Last updated 16th of March 2026 by Troels Petersen.