 
Abstract/Syllabus:

Seung, Sebastian. 9.641J Introduction to Neural Networks, Spring 2005. (Massachusetts Institute of Technology: MIT OpenCourseWare), http://ocw.mit.edu (accessed 8 Jul 2010). License: Creative Commons BY-NC-SA.
Neurons forming a network in dissociated cell culture. (Image courtesy of Seung Laboratory, MIT Department of Brain and Cognitive Sciences.)
Course Highlights
This course features a selection of downloadable lecture notes and problem sets in the assignments section.
Course Description
This course explores the organization of synaptic connectivity as the basis of neural computation and learning. Perceptrons and dynamical theories of recurrent networks including amplifiers, attractors, and hybrid computation are covered. Additional topics include backpropagation and Hebbian learning, as well as models of perception, motor control, memory, and neural development.
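As a generic illustration of the perceptron learning rule covered in the course (a minimal sketch, not course code; the function name, toy data, and NumPy implementation are assumptions for illustration):

```python
import numpy as np

def train_perceptron(X, y, epochs=20):
    """Classic perceptron rule: on each misclassified example (x_i, y_i),
    update w += y_i * x_i and b += y_i.
    X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
    Illustrative sketch only, not code from the course materials."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:  # misclassified (or zero margin)
                w += yi * xi
                b += yi
    return w, b

# Linearly separable toy data: an AND-like problem with {-1, +1} labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # recovers the labels y once converged
```

On linearly separable data such as this, the perceptron convergence theorem guarantees the loop stops making updates after finitely many mistakes.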
Technical Requirements
Special software is required to use some of the files in this course: MATLAB® is needed to run the .m script files and to open the .mat data files.
Syllabus
Course Philosophy
The subject will focus on basic mathematical concepts for understanding nonlinearity and feedback in neural networks, with examples drawn from both neurobiology and computer science. Most of the subject is devoted to recurrent networks, because recurrent feedback loops dominate the synaptic connectivity of the brain. There will be some discussion of statistical pattern recognition, but less than in the past, because this perspective is now covered in Machine Learning and Neural Networks. Instead, the connections to dynamical systems theory will be emphasized.
Modern research in theoretical neuroscience can be divided into three categories: cellular biophysics, network dynamics, and statistical analysis of neurobiological data. This subject is about the dynamics of networks, but excludes the biophysics of single neurons, which will be taught in 9.29J, Introduction to Computational Neuroscience.
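As a generic illustration of the recurrent network dynamics this subject focuses on (a sketch under stated assumptions, not course code), a rate-model network dx/dt = -x + f(Wx + b) with threshold-linear units can be simulated by Euler integration; with mutual inhibition, feedback produces winner-take-all behavior:

```python
import numpy as np

def simulate(W, b, x0, dt=0.01, steps=2000):
    """Euler-integrate the rate dynamics dx/dt = -x + f(Wx + b),
    where f is rectification (threshold-linear units).
    Illustrative sketch only; parameters are assumptions."""
    f = lambda u: np.maximum(u, 0.0)  # rectification nonlinearity
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + f(W @ x + b))
    return x

# Two units with mutual inhibition; unit 0 gets slightly stronger input,
# so the feedback loop suppresses unit 1 entirely (winner-take-all)
W = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
b = np.array([1.0, 0.9])
x = simulate(W, b, x0=[0.0, 0.0])
print(x)  # approximately [1, 0]: unit 0 wins, unit 1 is silenced
```

The instability of the both-active state (the inhibitory difference mode grows) and the stability of the single-winner states are exactly the kind of nonlinear feedback phenomena the dynamical-systems perspective addresses.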
Prerequisites

Permission of the instructor

Familiarity with linear algebra, multivariate calculus, and probability theory

Knowledge of a programming language (MATLAB® recommended)
Course Requirements

Problem sets

Midterm exam

Final exam
Textbook
The following text is recommended:
Hertz, John, Anders Krogh, and Richard G. Palmer. Introduction to the Theory of Neural Computation. Redwood City, CA: Addison-Wesley Pub. Co., 1991. ISBN: 0201515601.
Calendar
Calendar schedule.
Lec # | Topics | Key dates
1 | From Spikes to Rates |
2 | Perceptrons: Simple and Multilayer |
3 | Perceptrons as Models of Vision |
4 | Linear Networks | Problem set 1 due
5 | Retina |
6 | Lateral Inhibition and Feature Selectivity | Problem set 2 due
7 | Objectives and Optimization | Problem set 3 due
8 | Hybrid Analog-Digital Computation; Ring Network |
9 | Constraint Satisfaction; Stereopsis | Problem set 4 due
10 | Bidirectional Perception |
11 | Signal Reconstruction | Problem set 5 due
12 | Hamiltonian Dynamics |
| Midterm |
13 | Antisymmetric Networks |
14 | Excitatory-Inhibitory Networks; Learning |
15 | Associative Memory |
16 | Models of Delay Activity; Integrators | Problem set 6 due one day after Lec 16
17 | Multistability; Clustering |
18 | VQ; PCA | Problem set 7 due
19 | More PCA; Delta Rule | Problem set 8 due
20 | Conditioning; Backpropagation |
21 | More Backpropagation | Problem set 9 due
22 | Stochastic Gradient Descent |
23 | Reinforcement Learning | Problem set 10 due
24 | More Reinforcement Learning |
25 | Final Review |
| Final Exam |



Readings
Course readings.
Readings were assigned for the following lectures; the remaining lectures had no assigned readings.

Lec 1: From Spikes to Rates
- Koch, Christof. Biophysics of Computation: Information Processing in Single Neurons. New York, NY: Oxford University Press, 2004, section 14.2, pp. 335-341. ISBN: 0195181999.
- Ermentrout, Bard. "Reduction of Conductance-Based Models with Slow Synapses to Neural Nets." Neural Computation 6, no. 4 (July 1994): 679-695.

Lec 3: Perceptrons as Models of Vision
- Marr, David. Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. New York, NY: W.H. Freeman & Company, 1983, section 2.2, pp. 54-79. ISBN: 0716715678.
- Hubel, David H. Eye, Brain, and Vision. New York, NY: W.H. Freeman & Company, 1987, chapter 3, pp. 39-46. ISBN: 0716750201.
- LeNet Web site

Lec 5: Retina
- Adelson, E. H. "Lightness Perception and Lightness Illusions." The New Cognitive Neurosciences. Edited by Michael S. Gazzaniga. 2nd ed. Cambridge, MA: MIT Press, 1999, pp. 339-351. ISBN: 0262071959.
- Hartline, H. K., and F. Ratliff. "Inhibitory Interaction in the Retina of Limulus." Physiology of Photoreceptor Organs. Edited by Michelangelo G. F. Fuortes. New York, NY: Springer-Verlag, 1972, pp. 382-447. ISBN: 0387057439.

Lec 6: Lateral Inhibition and Feature Selectivity
- Press, William H., Saul A. Teukolsky, William T. Vetterling, and Brian P. Flannery. Numerical Recipes in C: The Art of Scientific Computing. New York, NY: Cambridge University Press, 1992, chapters 12 and 13. ISBN: 0521431085.
- Strang, Gilbert. Introduction to Applied Mathematics. Wellesley, MA: Wellesley-Cambridge Press, 1986, section 4.2, pp. 290-309. ISBN: 0961408804.

Lec 8: Hybrid Analog-Digital Computation; Ring Network
- Hahnloser, R. H., R. Sarpeshkar, M. A. Mahowald, R. J. Douglas, and H. S. Seung. "Digital selection and analog amplification coexist in a cortex-inspired silicon circuit." Nature 405, no. 6789 (June 22, 2000): 947-951.
- Hahnloser, Richard H., H. Sebastian Seung, and Jean-Jacques Slotine. "Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks." Neural Computation 15, no. 3 (March 2003): 621-638.







