Neural networks: theory and applications (Spring 2015)

Spring 2015 schedule: MW 1:30-2:50 in Jadwin A08

Organization of synaptic connectivity as the basis of neural computation and learning. Simple perceptrons and the delta rule. Multi-layer perceptrons and convolutional networks. Backpropagation learning. Hebb rule and unsupervised learning. Dynamical theories of recurrent networks: amplifiers, attractors, and hybrid computation. Associative memory networks. Neural Darwinism and reinforcement learning. Models of perception, motor control, memory, and cortical development.
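Since the course recommends MATLAB or Python (see Prerequisites), here is a minimal Python sketch of the delta rule for a simple perceptron, the first topic above. The AND task, learning rate, and all other particulars are illustrative choices, not course materials:

```python
import numpy as np

# Toy example: train a single sigmoid unit with the delta rule
# to learn the logical AND function (linearly separable).
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0, 0, 0, 1], dtype=float)                      # targets (AND)

w = rng.normal(scale=0.1, size=2)  # weights
b = 0.0                            # bias
eta = 0.5                          # learning rate (illustrative)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

for epoch in range(2000):
    out = sigmoid(X @ w + b)        # forward pass
    err = y - out                   # error signal (target minus output)
    delta = err * out * (1 - out)   # delta rule: error times activation slope
    w += eta * X.T @ delta          # batch weight update
    b += eta * delta.sum()

print(np.round(sigmoid(X @ w + b)))
```

After training, the rounded outputs match the AND targets. The same loop with a linear output unit reduces to the classic Widrow-Hoff (LMS) update.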

Prerequisites

  • familiarity with linear algebra, multivariate calculus, and probability theory
  • knowledge of a programming language (MATLAB or Python recommended)
  • undergraduates by permission of the instructor

Class requirements

  • problem sets
  • midterm exam
  • final exam

AIs (Assistants in Instruction)

  • Andrea Giovannucci    agiovann at princeton.edu    Office hours: Mon 3:15-5:15 PM
  • Mark Ioffe            mioffe at princeton.edu      Office hours: Mon 3:15-5:15 PM

Lecture Schedule

  • Feb. 25. Hierarchical perceptrons for vision.
    • Assignment 4. Backpropagation for large datasets (ps4). Download HW4StarterCode (Due Fri March 6)
  • Mar. 2. Backprop-through-time for recurrent networks.
  • Mar. 4. Hebb rule and competitive learning.
  • Mar. 9. Oja’s rule and principal components analysis.
  • Mar. 11. Midterm Exam.
  • Spring recess (no class).
  • Mar. 23. Linear networks. Amplification and attenuation.
  • Mar. 25. Invertebrate and vertebrate retinas.
  • Mar. 30. Hybrid analog-digital computation. Permitted and forbidden sets.
  • Apr. 1. Constraint satisfaction. Stereopsis.
  • Apr. 6. More on convolutional networks.
  • Apr. 8. Deep reinforcement learning.
  • Apr. 13. LSTM and handwriting generation.
  • Apr. 15. Language modeling.
  • Apr. 20. Translation.
  • Apr. 22. Caption generation.
  • Apr. 27. Question answering.
  • Apr. 29. Symbolic reasoning.
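As an illustration of the Mar. 9 topic (Oja's rule and principal components analysis), a minimal Python sketch is given below. The data distribution, learning rate, and sample count are all hypothetical choices for demonstration:

```python
import numpy as np

# Oja's rule: a Hebbian update with a weight-decay term that drives a
# single linear unit's weight vector toward the first principal component
# of zero-mean input data.
rng = np.random.default_rng(1)

# Correlated 2-D Gaussian data; the dominant variance lies along [1, 1].
C = np.array([[3.0, 2.0], [2.0, 3.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w = rng.normal(size=2)  # random initial weights
eta = 0.01              # learning rate (illustrative)

for x in X:
    v = w @ x
    w += eta * v * (x - v * w)  # Hebbian term v*x minus decay v^2 * w

# Compare with the top eigenvector of the sample covariance (true PC1).
evals, evecs = np.linalg.eigh(np.cov(X.T))
pc1 = evecs[:, -1]
print(abs(w @ pc1))  # close to 1: w aligns with PC1 (up to sign)
```

The decay term keeps the weight vector approximately unit length, which is what distinguishes Oja's rule from the plain Hebb rule, whose weights grow without bound.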