# Slides 2020

- Week 1-a: course organisation, what NLP is about, an overview of problems, first contact with text data
- Week 1-b: recap of probability theory, maximum likelihood estimation
- Week 2-a and 2-b:
- Week 3-a and 3-b:
  - HMMs: parts of speech, hidden Markov models (HMMs) for language modelling and POS tagging, the Viterbi algorithm
  - Textbook: Ch. 5 (2nd edition) or Ch. 8 (3rd edition) and Appendix A (for the forward algorithm)
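The Viterbi algorithm covered in the HMM lectures can be sketched in a few lines. Below is a minimal log-space implementation run on a toy two-state tagger; the states, words, and all probabilities are made up for illustration and are not from the course materials:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    # best[s] = (log-prob of the best path ending in state s, that path)
    best = {s: (math.log(start_p[s] * emit_p[s][obs[0]]), [s]) for s in states}
    for word in obs[1:]:
        nxt = {}
        for s in states:
            # Best predecessor state for s at this time step
            prev = max(states, key=lambda p: best[p][0] + math.log(trans_p[p][s]))
            score = best[prev][0] + math.log(trans_p[prev][s] * emit_p[s][word])
            nxt[s] = (score, best[prev][1] + [s])
        best = nxt
    return max(best.values(), key=lambda t: t[0])[1]

# Toy HMM with invented probabilities (N = noun, V = verb)
states = ["N", "V"]
start = {"N": 0.7, "V": 0.3}
trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit = {"N": {"time": 0.6, "flies": 0.4}, "V": {"time": 0.1, "flies": 0.9}}

print(viterbi(["time", "flies"], states, start, trans, emit))  # ['N', 'V']
```

Working in log space avoids the numerical underflow that products of many small probabilities would cause on longer sentences.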

- Week 4: no lectures
- Week 5
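Week 1-b's maximum likelihood estimation has a closed form for categorical models such as a unigram language model: the MLE is just the relative frequency of each outcome. A minimal sketch (the function name is mine, not from the course):

```python
from collections import Counter

def mle_unigram(tokens):
    """MLE for a categorical (unigram) model: relative frequencies of tokens."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(mle_unigram("a b a c".split()))  # {'a': 0.5, 'b': 0.25, 'c': 0.25}
```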

# Slides 2019

- Week 5
- Week 6-7
  - Feature-rich models: logistic CPDs (conditional probability distributions)
  - Applications of feature-rich models to NLP: text classification and richer components in generative models
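A standard instance of a feature-rich model for text classification is binary logistic regression over bag-of-words counts. The sketch below, with made-up toy data and a hand-rolled gradient-ascent loop (not the course's code), shows the idea:

```python
import math
from collections import Counter

def train_logreg(docs, labels, epochs=200, lr=0.5):
    """Binary logistic regression on bag-of-words counts (toy sketch)."""
    vocab = {w for d in docs for w in d.split()}
    w = {v: 0.0 for v in vocab}
    b = 0.0
    for _ in range(epochs):
        for doc, y in zip(docs, labels):
            feats = Counter(doc.split())
            z = b + sum(w[t] * c for t, c in feats.items())
            p = 1.0 / (1.0 + math.exp(-z))   # P(y = 1 | doc)
            g = y - p                        # gradient of the log-likelihood
            b += lr * g
            for t, c in feats.items():
                w[t] += lr * g * c
    return w, b

def predict(w, b, doc):
    z = b + sum(w.get(t, 0.0) * c for t, c in Counter(doc.split()).items())
    return 1 if z > 0 else 0

# Invented toy data: label 1 = positive, 0 = negative
docs = ["good great", "great good fun", "bad awful", "awful boring"]
labels = [1, 1, 0, 0]
w, b = train_logreg(docs, labels)
print(predict(w, b, "good fun"))      # 1
print(predict(w, b, "awful boring"))  # 0
```

The same logistic form can serve as a CPD inside a larger generative model, which is the "richer components" angle mentioned above.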

- Week 8: final exam