Interactive session, Thursday March 3
- Quiz
- Jupyter notebook: interactive_lin_reg_with_pizzas (updated March 8)
- Recording
Weekly lecture:
Slides
Video Recordings:
- Linear Regression and Classification
- The Logistic Function and its Derivative
- The Logistic Regression Classifier
- Cross-Entropy Loss
- Training the Logistic Regression Classifier
- Variants of Gradient Descent
- Multi-Class Classification: one-vs.-rest
- Multinomial Logistic Regression
Readings:
Paolo Perrotta, Programming Machine Learning (in O'Reilly library)
- Ch. 5 "A discerning Machine"
- Ch. 7 "The Final Challenge
Marsland does not cover all of the material considered this week, in particular logistic regression, multinomial logistic regression, and loss functions. Perrotta's Ch. 5 has a useful discussion of logistic regression and loss functions, while Ch. 7 presents one-vs.-rest multi-class classification. Ch. 6, "Getting Real", introduces the MNIST dataset used in Ch. 7; we will also make use of MNIST later in the semester.
Observe, however, that Marsland, Sec. 4.6, "Deriving Back-Propagation", presents some of the same material as this week's lecture, but in the context of multi-layer neural networks (e.g., Secs. 4.6.2, 4.6.3, 4.6.5, and 4.6.6).
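To tie the week's topics together, here is a minimal sketch of binary logistic regression trained by batch gradient descent on the cross-entropy loss. It assumes only NumPy and uses a hypothetical toy dataset; it is illustrative only and is not Perrotta's or Marsland's code.

```python
import numpy as np

def sigmoid(z):
    # Logistic function: sigma(z) = 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y_hat, y):
    # Average cross-entropy loss; small epsilon avoids log(0)
    eps = 1e-12
    return -np.mean(y * np.log(y_hat + eps) + (1 - y) * np.log(1 - y_hat + eps))

def train(X, y, lr=0.1, iterations=1000):
    # X: (n_samples, n_features) with a leading column of ones for the bias term
    w = np.zeros(X.shape[1])
    for i in range(iterations):
        y_hat = sigmoid(X @ w)
        gradient = X.T @ (y_hat - y) / len(y)  # gradient of the cross-entropy loss
        w -= lr * gradient
        if i % 100 == 0:
            print(f"iter {i:4d}  loss {cross_entropy(y_hat, y):.4f}")
    return w

if __name__ == "__main__":
    # Toy synthetic data (hypothetical): one feature, label is 1 when x > 1
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 2, size=200)
    y = (x > 1).astype(float)
    X = np.column_stack([np.ones_like(x), x])  # add bias column
    w = train(X, y)
    print("learned weights:", w)
```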
Weekly exercises