Advanced Methods in Applied Statistics 2017
Lecturer: D. Jason Koskinen
Email: koskinen (at) nbi.ku.dk
Basic Information
- Block 3 - Timetable A of the 2017 academic calendar
- Tues 08:00 - 12:00 and Thurs 08:00 - 12:00 & 13:00 - 17:00
- Actual
- 08:00 - 08:30 student study time for both Tues. and Thurs.
- 08:30 - 09:00 Q&A or discussion with Jason in Aud. M
- 09:00 lecture on new material (not 09:05 or 09:15)
- On Thursday there will very often be new material starting at
13:00
- On Thursday it is very unlikely that any new material, lectures,
or review will happen after 16:00.
- Auditorium M at the Blegdamsvej campus
- Odd-numbered classes are a single 4-hour block, while even-numbered
classes consist of two 4-hour blocks
- Classes will be composed of ~20-30% lecture and demonstrations,
followed by exercises
- While assignments, projects, and exercises can be done in the
programming language of the student's choice, the examples and
demonstrations will be mainly in Python and/or scientific packages
thereof, e.g. SciPy, PyROOT, etc.
- Required text or textbooks: None
- 2016 Advanced Methods in Applied Statistics webpage
- It is recommended, but not required, to have at least reviewed the
little sibling of this course, i.e. "Applied Statistics - From data to
results", which can be found here
Evaluation
- Oral
presentation and 1-2 page summary (10%)
- ~10 minute summary presentation. Plan on ~7 slides if you are doing
a PowerPoint-type presentation.
- Can work alone or in groups of up to 3.
- A 1-2 page summary including any and all group members' names
- Presentation does NOT have to be given by all group members
- Talk or email with Jason if you have questions about the
appropriateness of your article
- Be sure to put down which article you are using here
to avoid duplication
- Example presentation on the Finite Monte Carlo article
- Other example articles (and no, you cannot use any of these articles
for your report/presentation):
- Frequency Difference Gating: A Multivariate Method for Identifying
Subsets That Differ Between Samples (article)
- Probability binning comparison: a metric for quantitating
multivariate distribution differences (article)
- Firefly Monte Carlo: Exact MCMC with Subsets of Data (article)
- This is just a small sample. Find something related to your area
of physics.
- Include people's names and the article here by March 3
- The 1-2 page summary as a .pdf file is due
via email. Submission date is March 8 by 16:00 CET
- Presentations will be selected at random and begin during class time
on March 9. At the discretion of Jason, and if needed, some
presentations will be postponed to a later date.
- If you have any questions or concerns email Jason
- Project
(30%)
- Similar to the oral presentation, this project focuses on using a
method or statistical treatment related to your field of physics
research that you or your group select. Unlike the oral presentation,
the project includes not just understanding and explaining the method,
but also using it on an appropriate data set of your own choosing.
- Can be done alone or in groups of up to 3 people
- The only hand-in is a 4-6 page written report. You can submit the
code as well if you would like.
- Due: Monday March 27 at
16:00 CET
- Final
exam (40%)
- Must work on your own!
- Take home exam
- 28 hours between start and submission
- Begins at 13:00 CET on Thursday March 30, 2017
- The exam must be submitted by 17:00 CET on Friday March 31,
2017
- The exam will be similar to problem set 2
- A handful of more intensive questions as opposed to numerous short
questions
- While the exam will contain problems from any portion of the
course material, the focus will be more on topics in the latter
portion of the course
- Here are two extra practice problems
similar to what will be on the 2017 exam
- Here is a link to the 2016
exam
- Note that the 2017 exam will focus more on latter topics in the
course than the 2016 version
- Also, the 2017 exam will be comparatively more difficult than the
2016 version
- EXAM LINK IS HERE
- Extra credit (+2% to final course grade average on a 1-100% scale)
- 2017 NCAA Men's Basketball Bracket submission due by tip-off of
initial 1st round game on March 16
- This is NOT a requirement, nor is it an obligation for the course
- Extra Credit Outline
- Due: Thursday March 16 by 17:00 CET
Course Syllabus
The outline is a rough sketch of the
course material, and is 100% likely to change throughout the course. Even
so, we are very likely to cover
the following topics which may require additional software support:
- Multivariate analysis (MVA) techniques including Boosted Decision
Trees (BDTs)
- The MultiNest Bayesian inference tool
- Basis splines
- Markov Chain Monte Carlo
- Likelihood minimization techniques
Class notes will be posted here:
Class 0 - Pre-Course, attendance is not required
- Optional time to make sure your laptop is set up
- February 2, 2017
- 10:00-12:00 at Blegdamsvej (in Aud. B)
- Lecture 0
Class 1 - Start
- Course Information
- Chi-square
- Code chi-square
- Data for exercise 1 (FranksNumbers.txt)
- Review of 'basic' statistics
- Lecture 1
- Be knowledgeable about the Central Limit Theorem
- Start reading paper about how well Gaussian statistics compares to a
wide selection of scientific measurements
- "Not Normal: the uncertainties of scientific measurements" link at arXiv
or DOI
- If there's time, there may be discussion on Thurs. about the paper
- First problem set is assigned
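Since the FranksNumbers.txt data isn't reproduced here, a minimal chi-square goodness-of-fit sketch in Python with made-up counts (the data and binning are placeholders, not the exercise values):

```python
import numpy as np
from scipy import stats

# Toy example (NOT the FranksNumbers.txt data): observed counts in 5 bins
observed = np.array([18, 22, 25, 19, 16])
expected = np.full(5, observed.sum() / 5)  # uniform expectation

# Pearson chi-square statistic: sum((O - E)^2 / E)
chi2 = np.sum((observed - expected) ** 2 / expected)

# p-value from the chi-square distribution with (bins - 1) degrees of freedom
ndof = len(observed) - 1
p_value = stats.chi2.sf(chi2, ndof)

print(f"chi2 = {chi2:.3f}, ndof = {ndof}, p = {p_value:.3f}")
```

The same statistic-plus-`sf` pattern carries over directly once the real data file is loaded.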
Class 2 - Monte Carlo Simulation & Least Squares regression
- Lecture 2
- Monte Carlo (starting at 09:00)
- From the "Not Normal: the uncertainties of scientific measurements" paper:
- For the ambitious, create a 'toy Monte Carlo' of the sample and pair
distributions for the nuclear physics data in Sec. 2.A. For simplicity
assume that all the 'quantities' are Gaussian distributed
- Write functions where you can produce multiple Gaussian
distributions to sample from and generate a sample of "12380
measurements, 1437 quantities, 66677 pairs".
- Produce the z-distribution (using eq. 4) plot for just your toy
Monte Carlo and see if it matches a Gaussian, exponential, Student's t
distribution, etc.
- Least Squares lecture (starting at 13:00)
- Discussion of the "Not Normal: the uncertainties of scientific
measurements" paper
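The toy Monte Carlo exercise above might be sketched as below. The sample sizes are placeholders (not the "12380 measurements, 1437 quantities, 66677 pairs" of the paper), and the pairwise pull z = (x_i - x_j)/sqrt(sigma_i^2 + sigma_j^2) is assumed here as a stand-in for eq. 4:

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder scale, NOT the paper's sample sizes
n_quantities = 200   # number of distinct quantities
n_meas = 10          # measurements per quantity

z_values = []
for _ in range(n_quantities):
    true_val = rng.normal(0.0, 10.0)          # true value of this quantity
    sigmas = rng.uniform(0.5, 2.0, n_meas)    # reported uncertainties
    x = rng.normal(true_val, sigmas)          # Gaussian-distributed measurements
    # all pairs (i, j) with i < j: z = (x_i - x_j) / sqrt(sigma_i^2 + sigma_j^2)
    for i in range(n_meas):
        for j in range(i + 1, n_meas):
            z_values.append((x[i] - x[j]) / np.hypot(sigmas[i], sigmas[j]))

z_values = np.asarray(z_values)
# If the toy inputs really are Gaussian, z should be standard normal
print(f"{len(z_values)} pairs: mean = {z_values.mean():.3f}, std = {z_values.std():.3f}")
```

Swapping the Gaussian sampling for a heavier-tailed distribution shows how the z-distribution picks up non-Gaussian tails.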
Class 3 - Introduction to Likelihoods and Numerical Minimizers
- Lecture 3
- Maximum likelihood method
- Gradient descent and minimizers
- Example code from Niccolo
(TA) and some from Jason
(course lecturer)
- Remember that the first assignment
is due on Wednesday
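A minimal sketch of the maximum likelihood method with a numerical minimizer, assuming a Gaussian model on toy data (not the in-class example code):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=2.0, size=5000)  # toy Gaussian sample

# Negative log-likelihood for a Gaussian with parameters (mu, sigma),
# dropping constant terms
def nll(params, x):
    mu, sigma = params
    if sigma <= 0:
        return np.inf  # keep the minimizer out of the unphysical region
    return 0.5 * np.sum(((x - mu) / sigma) ** 2) + len(x) * np.log(sigma)

# Numerical minimization (Nelder-Mead is gradient-free and robust here)
result = minimize(nll, x0=[0.0, 1.0], args=(data,), method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```

Note that minimizing the *negative* log-likelihood is the standard trick for using a minimizer to find the likelihood maximum; syntax and options (start values, method) matter, just as in class.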
Class 4 - Finish Introduction to Likelihoods and Minimizers, then
Intro. to Bayesian Statistics
Class 5 - Background Subtraction and sPlots
Class 6 - Markov Chain(s)
- Be sure to have an external package for Markov Chain Monte Carlo
(MCMC), e.g. emcee, PyMC
- Just like minimizers, syntax and options matter
- Be somewhat familiar with your chosen MCMC package
- Lecture 6 Markov Chain Monte
Carlo (MCMC)
- Some example python code for the exercises (caveat emptor)
- Using PyMC, which
wasn't the greatest package (at least last year), but it got the job
done
- Using emcee,
the solution is graciously provided by Niccolo Maffezzoli (TA)
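Before reaching for emcee or PyMC, the mechanics of MCMC fit in a few lines; here is a hand-rolled random-walk Metropolis sketch on a toy target (a sketch of the algorithm, not a substitute for the exercise packages):

```python
import numpy as np

rng = np.random.default_rng(7)

# Target: unnormalized log-density of a standard normal
def log_target(x):
    return -0.5 * x ** 2

# Random-walk Metropolis: propose x' = x + N(0, step), accept with
# probability min(1, p(x') / p(x))
def metropolis(n_steps, step=1.0, x0=0.0):
    chain = np.empty(n_steps)
    x, logp = x0, log_target(x0)
    for i in range(n_steps):
        x_new = x + rng.normal(0.0, step)
        logp_new = log_target(x_new)
        if np.log(rng.uniform()) < logp_new - logp:
            x, logp = x_new, logp_new  # accept the proposal
        chain[i] = x                    # otherwise keep the old point
    return chain

chain = metropolis(50_000)
burned = chain[5_000:]  # discard burn-in
print(f"mean = {burned.mean():.3f}, std = {burned.std():.3f}")
```

The packages add ensembles, tuning, and diagnostics on top of exactly this accept/reject core.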
Class 7 - Parameter Estimation and Confidence Intervals
- Lecture 7 Confidence
intervals
- Numerical minimizers for best-fit values
- Data
file for one of the exercises
- Oral presentation and 1-2 page article reports will be due/covered
March 8&9 (look here)
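A minimal likelihood-scan confidence interval sketch on toy data (Gaussian mean with known sigma = 1, chosen so the Delta lnL = 0.5 interval can be checked against the analytic 1/sqrt(N) answer):

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(5.0, 1.0, 400)  # toy sample, known sigma = 1

# Log-likelihood of the mean for Gaussian data with known sigma = 1
def lnL(mu):
    return -0.5 * np.sum((data - mu) ** 2)

mu_hat = data.mean()  # analytic best fit for this model

# Scan: the 68.3% interval is where lnL drops by 0.5 from its maximum
mus = np.linspace(mu_hat - 0.5, mu_hat + 0.5, 2001)
delta = lnL(mu_hat) - np.array([lnL(m) for m in mus])
inside = mus[delta <= 0.5]
lo, hi = inside.min(), inside.max()
print(f"mu_hat = {mu_hat:.3f}, 68% interval = [{lo:.3f}, {hi:.3f}]")

# For this Gaussian case the half-width should match 1/sqrt(N)
print(f"expected half-width = {1 / np.sqrt(len(data)):.3f}, got {(hi - lo) / 2:.3f}")
```

For non-Gaussian likelihoods the scan stays the same but the interval can become asymmetric, which is where the numerical approach earns its keep.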
Class 8 - Hypothesis Testing
- Lecture 8
- Likelihood ratio
- Data files for one of the exercises. Just use the first column in each
file. The second column is unimportant.
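A minimal likelihood-ratio sketch on toy data (not the exercise files): testing H0: mu = 0 for a Gaussian sample and converting the statistic to a p-value via Wilks' theorem:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
data = rng.normal(0.2, 1.0, 500)  # toy data with a small true offset

# Gaussian log-likelihood with known sigma = 1
def lnL(mu):
    return -0.5 * np.sum((data - mu) ** 2)

# Likelihood ratio test of H0: mu = 0 against a free mu
# (sample mean is the maximum likelihood estimate here)
lambda_lr = 2 * (lnL(data.mean()) - lnL(0.0))

# Wilks' theorem: under H0 the statistic follows chi2 with 1 dof
p_value = stats.chi2.sf(lambda_lr, df=1)
print(f"-2 ln(LR) = {lambda_lr:.2f}, p = {p_value:.4f}")
```

Keep in mind Wilks' theorem assumes nested hypotheses and parameters away from boundaries; otherwise the chi2 conversion needs toy experiments instead.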
Class 9 - Interpolation and Splines
- Lecture 9
- Splines
- Data files for one of the exercises.
- Interesting article about use of splines and penalty terms
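A minimal spline-interpolation sketch with SciPy (toy data, not the exercise files):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Toy data: samples of a smooth function on a coarse grid
x = np.linspace(0, 2 * np.pi, 9)
y = np.sin(x)

spline = CubicSpline(x, y)  # piecewise-cubic interpolant through the points

# Evaluate on a fine grid and compare to the true function
x_fine = np.linspace(0, 2 * np.pi, 200)
max_err = np.max(np.abs(spline(x_fine) - np.sin(x_fine)))
print(f"max interpolation error = {max_err:.4f}")
```

For noisy data a smoothing spline with a penalty term (as in the article above) is the better tool than an exact interpolant like this.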
Class 10 - Presentations and Multivariate Analysis techniques
- In the morning we will have the oral presentations from the articles
chosen
- Boosted Decision Trees
- Lecture 10
- Exercise 1 TMVA python script (will appear next week)
- Exercise 2 TMVA python script (will appear next week)
- Data
- Exercise 2 (16 variable file)
- The first column is the index, hence there are 17 'variables', but
the index variable is only for bookkeeping and has no impact on
whether an event is signal or background.
- Every even row is the 'signal' and every odd row is the
'background'. Thus, there are two rows for each index in the first
column: the first is the signal and the second is the background.
[Format is odd, but I got it from a colleague].
- Here are the solution data sets
separated into two files (benign
and malignant) for the last
exercise of the lecture. Here is also the
(Python) code that I used to establish the efficiency for all
the submissions from all the students
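TMVA drives the exercises, but the boosting idea itself fits in plain NumPy. Here is a hand-rolled AdaBoost-with-decision-stumps sketch on toy 1D data (not the 16-variable exercise file, and AdaBoost rather than the specific BDT options TMVA offers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1D two-class data: 'signal' shifted right of 'background'
n = 400
x = np.concatenate([rng.normal(-1, 1, n), rng.normal(1, 1, n)]).reshape(-1, 1)
y = np.concatenate([-np.ones(n), np.ones(n)])  # labels in {-1, +1}

def best_stump(x, y, w):
    """Find the weighted-error-minimizing cut: sign * (x > threshold)."""
    best = (np.inf, 0.0, 1)
    for thr in np.unique(x[:, 0]):
        for sign in (1, -1):
            pred = sign * np.where(x[:, 0] > thr, 1, -1)
            err = np.sum(w[pred != y])
            if err < best[0]:
                best = (err, thr, sign)
    return best

# AdaBoost: reweight misclassified events, combine stumps with weights alpha
w = np.full(2 * n, 1 / (2 * n))
stumps = []
for _ in range(20):
    err, thr, sign = best_stump(x, y, w)
    alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
    pred = sign * np.where(x[:, 0] > thr, 1, -1)
    w *= np.exp(-alpha * y * pred)  # boost the weight of wrong events
    w /= w.sum()
    stumps.append((alpha, thr, sign))

# Final classifier: sign of the weighted vote of all stumps
def predict(x):
    score = sum(a * s * np.where(x[:, 0] > t, 1, -1) for a, t, s in stumps)
    return np.sign(score)

accuracy = np.mean(predict(x) == y)
print(f"training accuracy = {accuracy:.3f}")
```

The weight update is the whole trick: each round the next stump is forced to concentrate on the events the previous stumps got wrong.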
Class 11 - Data Driven Density Estimation (non-parametric)
- Kernel Density estimation
- Lecture 11
- Extra credit is now available (see here)
- Problem set #2 is now assigned (see here)
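A minimal kernel density estimation sketch with SciPy (toy bimodal data; the bandwidth comes from the default Scott's rule):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
# Toy bimodal sample: no simple parametric form, a natural KDE use case
data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(1, 1.0, 700)])

kde = gaussian_kde(data)  # Gaussian kernels, Scott's rule bandwidth
grid = np.linspace(-5, 5, 500)
density = kde(grid)

# The estimate should integrate to ~1 over a wide enough range
integral = density.sum() * (grid[1] - grid[0])
peak_x = grid[np.argmax(density)]
print(f"integral ≈ {integral:.3f}, density peak near x = {peak_x:.2f}")
```

The bandwidth choice is the whole game in KDE; `gaussian_kde` accepts a `bw_method` argument for trying alternatives to the default rule.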
Class 12 - Confidence Intervals, Failures, and Feldman-Cousins
- Guest lecture by Dr. Morten Medici
- Under/over coverage in hypothesis tests
- Flip-flopping confidence intervals and corrections via ranking and use
of Feldman-Cousins unified approach
- Paper about
unified approach by G. Feldman and R. Cousins
- Lecture 12
- Yes, this topic may appear on the exam :-)
- I have posted the solution data sets
to the webpage for the BDT classification exercise (see links for
Lecture 11 above)
- The due date for the project is now March 27, 2017
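Coverage is naturally checked with toy experiments; a minimal sketch for the simple Gaussian case (nominal 68.3% interval, with none of the boundary or flip-flopping complications from the lecture):

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy coverage test: for a Gaussian measurement with sigma = 1, the
# interval x +/- 1 should contain the true value in ~68.3% of repeated
# pseudo-experiments
mu_true = 4.0
n_experiments = 20_000
x = rng.normal(mu_true, 1.0, n_experiments)
covered = np.abs(x - mu_true) < 1.0
print(f"coverage = {covered.mean():.3f} (nominal 0.683)")
```

Re-running the same check with a flip-flopping interval choice, or with a physical boundary on mu, is exactly how the under/over coverage discussed in class is diagnosed.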
Class 13 - Nested Sampling, Bayesian Inference, and MultiNest
- Lecture 13
- External packages for conducting nested sampling, e.g. MultiNest, are
necessary; there are several Python options
- Super awesome articles that are surprisingly easy to read
Class 14 - Signal and Data Processing (Wavelets)
- Guest lecture by Dr. James Monk
- To prepare for the class make sure that a wavelet package is available
- For example in Python - "pip install PyWavelets"
- Matlab - http://se.mathworks.com/products/wavelet/
- Lecture 14
- Some coding scripts
Class 15 - Non-Parametric Tests Lecture snippet and Course Review
Extra Projects of a more difficult nature, for those who want an
additional challenge.