Course, academic year 2023/2024
Theoretical Issues in Neural Networks - Efficiency - NAIL027
Title (Czech): Teoretické otázky neuronových sítí - efektivita
Guaranteed by: Department of Software Engineering (32-KSI)
Faculty: Faculty of Mathematics and Physics
Valid: from 2007
Semester: winter
E-Credits: 6
Hours per week, examination: winter s.: 2/2, C+Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: cancelled
Language: Czech
Teaching methods: full-time
Guarantor: doc. RNDr. Jiří Šíma, CSc.
Class: Informatics, Master's - elective
Classification: Informatics > Theoretical Computer Science
Co-requisite: NAIL002
Annotation -
Last update: T_KSI (07.05.2002)
The course provides a review of the computational theory of neural network models: taxonomy, descriptive and computational power, learning complexity. The seminar concentrates on recent results.
Literature - Czech
Last update: T_KSI (05.05.2004)

M. Anthony, P.L. Bartlett: Neural Network Learning: Theoretical Foundations. Cambridge, UK: Cambridge University Press, 1999.

V.P. Roychowdhury, K.-Y. Siu, A. Orlitsky (eds.): Theoretical Advances in Neural Computation and Learning. Boston: Kluwer Academic Publishers, 1994.

K.-Y. Siu, V.P. Roychowdhury, T. Kailath: Discrete Neural Computation: A Theoretical Foundation. Englewood Cliffs, NJ: Prentice Hall, 1995.

J. Šíma, R. Neruda: Teoretické otázky neuronových sítí. Praha: MATFYZPRESS, 1996.

Syllabus -
Last update: T_KSI (07.05.2002)

1. Perceptron: integer representation, weight size, linear separability problem (a learning sketch follows this syllabus).

2. Feedforward networks: implementation of arithmetic and logical functions, universal threshold circuit, $TC^0$-hierarchy and its separation for small depths, total wire length, analog and probabilistic circuits (a depth-2 threshold-circuit example follows this syllabus).

3. Recurrent networks: neural language acceptors and Kolmogorov complexity of weights, infinite families of networks, probabilistic models.

4. Hopfield networks: convergence time, stable states, energy minimization, computational power, continuous time (an energy-descent sketch follows this syllabus).

5. Alternative models: RBF networks, Kohonen networks, spiking neurons.

6. Learning complexity: loading problem, sample complexity and VC-dimension, PAC model (a worked VC-dimension bound follows this syllabus).
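
The four sketches below are editorial illustrations keyed to syllabus topics 1, 2, 4, and 6. They are not part of the official course text; every dataset, name, and parameter in them is an assumption chosen for brevity.

For topic 1, a minimal Python sketch of perceptron learning: a threshold unit trained on the linearly separable logical AND. Integer inputs with unit-step updates keep the weights and threshold integral, echoing the "integer representation" item.

```python
# Perceptron learning sketch (illustrative; the AND dataset is an assumption).
# The unit fires iff the weighted input sum reaches the threshold theta.

def perceptron_train(samples, n, epochs=100):
    """Learn integer weights w and threshold theta from (x, label) pairs."""
    w, theta = [0] * n, 0
    for _ in range(epochs):
        mistakes = 0
        for x, label in samples:
            fired = sum(wi * xi for wi, xi in zip(w, x)) >= theta
            if fired != label:
                step = 1 if label else -1   # move the hyperplane toward/away from x
                w = [wi + step * xi for wi, xi in zip(w, x)]
                theta -= step
                mistakes += 1
        if mistakes == 0:                   # a full clean pass: the sample is separated
            break
    return w, theta

data = [((0, 0), False), ((0, 1), False), ((1, 0), False), ((1, 1), True)]
print(perceptron_train(data, n=2))          # here ([2, 1], 3): fires only on (1, 1)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the mistake loop terminates; on a non-separable sample such as XOR it would exhaust its epochs instead, which is one face of the linear separability problem in topic 1.

For topic 2, XOR itself shows why depth matters: no single threshold gate computes it, but a depth-2 threshold circuit does.

```python
# Depth-2 threshold circuit for XOR (illustrative).  The output gate fires
# iff the "at least one input" gate fires and the "both inputs" gate does not.

def gate(weights, theta, inputs):
    """A threshold gate: fires iff the weighted input sum reaches theta."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)

def xor_circuit(x1, x2):
    at_least_one = gate([1, 1], 1, [x1, x2])        # first layer
    both = gate([1, 1], 2, [x1, x2])                # first layer
    return gate([1, -1], 1, [at_least_one, both])   # second layer

print([xor_circuit(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

For topic 4, a sketch of Hopfield energy descent under the usual assumptions (symmetric weights, zero diagonal, bipolar states, asynchronous updates); the single stored pattern and the Hebbian weights are illustrative.

```python
# Hopfield energy-descent sketch (illustrative).  With symmetric weights and
# a zero diagonal, no asynchronous flip ever increases
#     E(s) = -1/2 * sum_{i,j} w[i][j] * s[i] * s[j],
# so the dynamics settle into a stable state.

def energy(w, s):
    n = len(s)
    return -0.5 * sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(n))

def run_to_stable(w, s):
    """Update neurons one at a time until none wants to flip."""
    n = len(s)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            new = 1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
    return s

pattern = [1, -1, 1, -1]                          # toy stored pattern (assumption)
n = len(pattern)
w = [[pattern[i] * pattern[j] if i != j else 0    # Hebbian rule, zero diagonal
      for j in range(n)] for i in range(n)]

state = [-1, -1, 1, -1]                           # stored pattern with bit 0 flipped
e_before = energy(w, state)
stable = run_to_stable(w, state)
print(e_before, "->", energy(w, stable), stable == pattern)  # 0.0 -> -6.0 True
```

For topic 6, a standard worked statement (not taken from the course text) connecting the VC-dimension of a single perceptron to its PAC sample complexity:

```latex
% Halfspaces H = { x -> sign(w . x + b) : w in R^n, b in R } have
% VC-dimension n + 1: the origin together with the n standard basis vectors
% is shattered, while Radon's theorem rules out shattering any n + 2 points.
% In the realizable PAC model this gives the sample-complexity bound
\[
  m(\varepsilon, \delta)
    = O\!\left( \frac{1}{\varepsilon}
        \left( (n+1) \log\frac{1}{\varepsilon} + \log\frac{1}{\delta} \right) \right),
\]
% so the number of examples needed to learn a perceptron to accuracy
% \varepsilon with confidence 1 - \delta grows linearly with the dimension n.
```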
