Study Guide

Linköping Institute of Technology

Valid for year: 2002
TBMI26 Classification, Learning and Neural Nets, 6 ECTS credits.
(Swedish course title: Neuronnät och lärande system)

For:   D   I   Ii   IT   Y  

  Area of Education: Technology

Subject area: Electrotechnology

  Advancement level (A-D): D

Aim:
The course introduces methods and computing structures for learning and self-organization. It describes how such methods can be used to find meaningful relations in multidimensional signals whose complexity makes traditional model-based methods unsuitable or impossible to use; in practice, such signals are the rule rather than the exception. Examples of application areas are function approximation, pattern recognition, content-addressable memories, prediction, optimization, process control and classification. Many methods, though not all, have been developed with inspiration from the function of the brain and a general aspiration to build computing structures with features such as adaptation, the ability to learn, fault tolerance, the ability to generalize and extrapolate, distributed knowledge representation and massive parallelism. Examples of areas where learning-based techniques have proven competitive are industrial process optimization (paper/pulp, steel, ore), economic market prediction, text and speech recognition, document search, and image and image-sequence analysis.

Prerequisites: (valid for students admitted to programmes within which the course is offered)
Required: Algebra, Calculus, Mathematical statistics. Recommended: Signal theory.

Note: Admission requirements for non-programme students usually also include admission requirements for the programme and threshold requirements for progression within the programme, or corresponding.

Organisation:
Lectures, exercises, laboratory exercises (including obligatory homework)

Course contents:
Classification: pattern recognition, discriminant functions, statistical methods, clustering.
Content-addressable memories: state spaces, Hopfield memories, auto- and hetero-associative memories.
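As a sketch of the content-addressable memory idea above, the following toy example (hypothetical pattern and helper names, not course material) stores one ±1 pattern with the Hebbian outer-product rule and recalls it from a corrupted probe:

```python
def hopfield_store(patterns):
    """Hebbian outer-product weights for +/-1 patterns; zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def hopfield_recall(w, probe, steps=5):
    """Synchronous sign-threshold updates until a fixed point (or step limit)."""
    s = list(probe)
    for _ in range(steps):
        new = [1 if sum(w[i][j] * s[j] for j in range(len(s))) >= 0 else -1
               for i in range(len(s))]
        if new == s:
            break
        s = new
    return s

pattern = [1, -1, 1, 1, -1, -1, 1, -1]
W = hopfield_store([pattern])
noisy = [-1] + pattern[1:]          # flip the first bit of the stored pattern
recalled = hopfield_recall(W, noisy)
```

With a single stored pattern, one update step is enough to correct the flipped bit; storing several patterns works only while they are sufficiently uncorrelated.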
Supervised learning: the perceptron, the multi-layer perceptron, stochastic gradient search, the error back-propagation algorithm.
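The perceptron listed above can be sketched in a few lines. This is a minimal illustration on hypothetical toy data (logical AND, which is linearly separable), not the course's lab material:

```python
def perceptron_train(samples, labels, epochs=20, lr=1.0):
    """Train a single perceptron with a hard threshold activation.

    samples: list of input vectors; labels: +1/-1 targets.
    Returns the weight vector (last entry is the bias weight).
    """
    n = len(samples[0])
    w = [0.0] * (n + 1)                      # weights plus bias
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            xb = list(x) + [1.0]             # append constant bias input
            y = 1 if sum(wi * xi for wi, xi in zip(w, xb)) > 0 else -1
            if y != t:                       # update only on misclassification
                w = [wi + lr * t * xi for wi, xi in zip(w, xb)]
    return w

def perceptron_predict(w, x):
    xb = list(x) + [1.0]
    return 1 if sum(wi * xi for wi, xi in zip(w, xb)) > 0 else -1

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [-1, -1, -1, 1]                          # logical AND as +/-1 labels
w = perceptron_train(X, T)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector in finitely many updates.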
Unsupervised learning: principal component analysis (PCA), canonical correlation analysis (CCA), independent component analysis (ICA), winner-take-all algorithms, topology-preserving methods, self-organizing maps (SOM).
Reinforcement learning: Markov models, reward/punishment methods, temporal difference (TD) methods, Q-learning.
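Tabular Q-learning, the last item above, can be sketched on a toy problem. The environment here (a four-state chain with a rewarded goal state) and all parameter values are illustrative assumptions, not taken from the course:

```python
import random

def q_learning(n_states=4, episodes=300, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a toy chain MDP.

    Actions: 0 = step left, 1 = step right. Reaching the rightmost
    state is terminal and yields reward 1; all other steps yield 0.
    """
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    goal = n_states - 1
    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else min(goal, s + 1)
            r = 1.0 if s2 == goal else 0.0
            target = r if s2 == goal else r + gamma * max(q[s2])
            q[s][a] += alpha * (target - q[s][a])   # TD update
            s = s2
    return q

Q = q_learning()
```

After training, the greedy policy steps right from every non-terminal state, and the right-action values approach gamma raised to the remaining distance to the goal (1, 0.9, 0.81).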
Genetic methods: genetic algorithms, the two-armed bandit, genes and schemata, genetic programming.
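A genetic algorithm in its simplest form can be sketched on the classic one-max problem (maximize the number of 1-bits in a binary string). The problem, population size, and operator choices below are illustrative assumptions, not course material:

```python
import random

def one_max_ga(length=16, pop_size=30, generations=80, p_mut=0.05, seed=1):
    """Toy genetic algorithm: tournament selection, one-point crossover,
    bit-flip mutation, and elitism, maximizing the count of 1-bits."""
    rng = random.Random(seed)
    pop = [[rng.randrange(2) for _ in range(length)] for _ in range(pop_size)]

    def fitness(ind):
        return sum(ind)

    for _ in range(generations):
        elite = max(pop, key=fitness)             # elitism: keep the best
        new_pop = [elite[:]]
        while len(new_pop) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)   # tournament of 3
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if rng.random() < p_mut else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = one_max_ga()
```

Elitism makes the best fitness non-decreasing across generations; selection pressure and crossover then drive the population toward the all-ones optimum.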


Course literature:
S. Haykin, Neural Networks, second edition, Prentice Hall, 1999. Compendium: exercise collection (Exempelsamling), supplementary material (Kompletterande material), and lab instructions (Lab-PM).

Examination:
Written examination, 2.5 p
Laboratory work, 1.5 p



Course language is Swedish.
Department offering the course: IMT.
Director of Studies: Göran Salerud
Examiner: Magnus Borga



Contact: TFK , val@tfk.liu.se
Last updated: 01/23/2003