Evolution of Learnable Systems
Thesis title in Czech: Evoluce učitelných systémů
Thesis title in English: Evolution of Learnable Systems
Keywords (Czech): evoluce, neuronové sítě, strojové učení
Keywords (English): evolution, neural networks, machine learning
Academic year of announcement: 2015/2016
Thesis type: dissertation
Thesis language:
Department: Department of Software and Computer Science Education (32-KSVI)
Supervisor: RNDr. František Mráz, CSc.
Author: hidden - assigned and confirmed by the Study Department
Date of registration: 26.10.2016
Date of assignment: 26.10.2016
Date of confirmation by the Study Department: 26.10.2016
Guidelines
Recurrent neural networks (RNNs) are a powerful model of time-dependent data and are expected to play a key role in the future of machine learning. However, with the current state of knowledge, recurrent networks are difficult to train by standard gradient-based techniques [1]. To overcome this difficulty, various alternative network designs and training algorithms have been proposed, such as Long Short-Term Memory (LSTM, [2]), the Gated Recurrent Unit (GRU, [3]), and Echo State Networks (ESNs, [4]).
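For illustration only (this sketch is not part of the official assignment), the following minimal NumPy example shows why Echo State Networks sidestep the gradient problems described in [1]: the recurrent reservoir weights stay fixed after random initialization, and only a linear readout is trained, here in closed form by ridge regression. The reservoir size, the scaling constants, and the toy sine-prediction task are illustrative assumptions, not taken from the topic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 200  # reservoir size (illustrative choice)

# Fixed random input and reservoir weights; the reservoir is rescaled so its
# spectral radius is below 1, a common heuristic for the echo state property.
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with the scalar input sequence u; return states."""
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(2000)
u = np.sin(0.1 * t)
states = run_reservoir(u[:-1])
X, y = states[100:], u[1:][100:]        # discard an initial washout period

# Only the linear readout is trained (closed-form ridge regression);
# no gradients ever flow through the recurrent weights.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
print("training MSE:", np.mean((X @ W_out - y) ** 2))
```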

In his master's thesis, Maximizing Computational Power by Neuroevolution, Filip Matzner demonstrated that the performance of Echo State Networks can be further improved by neuroevolutionary methods. Well-known examples of such methods are NeuroEvolution of Augmenting Topologies (NEAT, [5]) and Hypercube-based NEAT (HyperNEAT, [6]).
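The sketch below is a deliberately simplified illustration of such a combination: instead of NEAT or HyperNEAT, which also evolve the network topology, it runs a plain (1+λ) evolution strategy over just two ESN hyperparameters (spectral radius and input scaling), evaluated on the same toy sine-prediction task as above. All constants are illustrative assumptions, not results from the cited thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1500)
u = np.sin(0.1 * t)

def esn_mse(spectral_radius, input_scale, n_res=100):
    """Fitness: build an ESN with the given hyperparameters, train its
    readout by ridge regression, return the one-step prediction MSE."""
    W_in = rng.uniform(-input_scale, input_scale, n_res)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    x, states = np.zeros(n_res), []
    for u_t in u[:-1]:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    X, y = np.array(states)[100:], u[1:][100:]   # drop the washout period
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    return np.mean((X @ W_out - y) ** 2)

# (1+lambda) evolution strategy: mutate the parent genome, keep the best
# offspring whenever it beats the current fitness.
parent = np.array([0.5, 0.5])        # genome: [spectral_radius, input_scale]
best = esn_mse(*parent)
for gen in range(20):
    offspring = [np.clip(parent + rng.normal(0.0, 0.1, size=2), 0.05, 1.5)
                 for _ in range(4)]
    scores = [esn_mse(*o) for o in offspring]
    i = int(np.argmin(scores))
    if scores[i] < best:
        parent, best = offspring[i], scores[i]
    print(f"gen {gen:2d}  mse {best:.3e}  genome {parent.round(3)}")
```

A full neuroevolutionary approach in the spirit of NEAT or HyperNEAT would evolve the connectivity and topology of the network itself rather than a fixed-length vector of scalar hyperparameters; the loop above only conveys the evaluate-mutate-select structure shared by both.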

The goal of this thesis is to explore the promising combination of evolution and trainable recurrent networks to a greater extent. A strong focus will be placed on producing large networks capable of solving complex tasks, e.g., in image processing.
References
[1] Razvan Pascanu, Tomas Mikolov, and Yoshua Bengio. On the difficulty of training recurrent neural networks. ICML (3), 28:1310–1318, 2013.
[2] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997.
[3] Kyunghyun Cho, Bart van Merrienboer, Caglar Gulcehre, Dzmitry Bahdanau, Fethi Bougares, Holger Schwenk, and Yoshua Bengio. Learning phrase representations using RNN encoder–decoder for statistical machine translation. arXiv preprint arXiv:1406.1078, 2014.
[4] Herbert Jaeger. The echo state approach to analysing and training recurrent neural networks - with an erratum note. German National Research Center for Information Technology GMD Technical Report, 148:34, 2001.
[5] Kenneth O. Stanley and Risto Miikkulainen. Evolving neural networks through augmenting topologies. Evolutionary Computation, 10(2):99–127, 2002.
[6] Kenneth O. Stanley, David B. D'Ambrosio, and Jason Gauci. A hypercube-based encoding for evolving large-scale neural networks. Artificial Life, 15(2):185–212, 2009.