A comparison of architectural alternatives for recurrent networks

Abstract
This paper describes a class of recurrent neural networks related to Elman networks. The networks used herein differ from standard Elman networks in that they may have more than one state vector. Such networks have an explicit representation of the hidden unit activations from several steps back. In principle, a single-state-vector network is capable of learning any sequential task that a multi-state-vector network can learn. This paper describes experiments which show that, in practice, and for the learning task used, a multi-state-vector network can learn a task faster and better than a single-state-vector network. The task used involved learning the graphotactic structure of a sample of about 400 English words.
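To make the architecture concrete, here is a minimal NumPy sketch (not the paper's implementation) of a forward pass through an Elman-style network whose context holds the hidden activations from the previous k steps. All sizes and names (n_in, n_hidden, k) are illustrative assumptions; with k = 1 this reduces to a standard Elman network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the paper.
n_in, n_hidden, n_out, k = 26, 20, 26, 3  # k = number of state vectors

W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_ctx = rng.normal(scale=0.1, size=(n_hidden, k * n_hidden))  # one weight block per state vector
W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
b_h = np.zeros(n_hidden)
b_o = np.zeros(n_out)

def forward(sequence):
    """Run the network over a sequence of input vectors (e.g. one-hot letters)."""
    # Explicit state vectors: hidden activations from steps t-1 .. t-k.
    states = [np.zeros(n_hidden) for _ in range(k)]
    outputs = []
    for x in sequence:
        context = np.concatenate(states)          # multi-step context fed back in
        h = np.tanh(W_in @ x + W_ctx @ context + b_h)
        outputs.append(W_out @ h + b_o)
        states = [h] + states[:-1]                # shift: newest state in, oldest out
    return outputs
```

The design point the sketch illustrates is that the multi-step history is stored explicitly as k separate state vectors, rather than having to be encoded implicitly in a single recurrent state.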
Author(s)
Wilson, William Hulme
Publication Year
1993
Resource Type
Conference Paper