AIT Asian Institute of Technology

A comparison of full and partial recurrent neural networks

Author: Rattaphon Sakchaicharoenkul
Call Number: AIT Thesis no. CS-02-08
Subject(s): Neural networks (Computer science)
Algorithms

Note: A thesis submitted in partial fulfillment of the requirements for the degree of Master of Engineering, School of Advanced Technologies
Publisher: Asian Institute of Technology
Series Statement: Thesis; no. CS-02-08
Abstract: Recurrent Neural Networks (RNNs) are networks in which self-loops and backward weight connections between neurons are allowed. As a result of these characteristics, recurrent networks can exhibit temporal behaviors that are not possible in Feed-Forward Neural Networks (FNNs): their behavior in the limit may reach a steady state (fixed point), an oscillation (limit cycle), or an aperiodic instability (chaos). Since RNNs have been increasingly applied to many dynamic system applications, there have been extensive efforts to develop a variety of architectures and training algorithms aimed at enhancing dynamic system characteristics. This work compares selected architectures of full and partial RNNs and also compares selected training algorithms, namely the Error Back Propagation and Exponentially Weighted Least Squares algorithm, the Accelerating Convergence Using Approximated Gradient algorithm, and the Error Self-Recurrent Back Propagation with Recursive Least Squares algorithm. Empirical comparison is also made with respect to time and accuracy using three data sets: daily streamflow (rainfall-runoff) data, quarterly data on exports and gross domestic product (GDP) of Thailand, and daily data on stock prices in the Thai market. A proposed algorithm was obtained by applying the estimation of pre-image signals at the hidden layer to the fastest of the training algorithms considered. The algorithm was devised to speed up the convergence of networks and to enhance network performance relative to the comparative algorithms. It was found that the proposed algorithm performed much better than the selected training algorithms when applied to Partially Recurrent Neural Networks (PRNNs).
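To illustrate the distinction the abstract draws, the following is a minimal sketch (not taken from the thesis) of a partially recurrent, Elman-style network: hidden activations are fed back as a "context" at the next time step, giving the network the temporal memory a pure feed-forward network lacks. All layer sizes and weight values here are hypothetical.

```python
# Hypothetical Elman-style partially recurrent network (illustrative only).
import math

def elman_step(x, context, W_in, W_ctx, W_out):
    """One forward step: the hidden state depends on the current input
    and the previous hidden state (the context), which is the backward
    connection that enables temporal behavior."""
    hidden = []
    for j in range(len(W_in)):
        s = sum(W_in[j][i] * x[i] for i in range(len(x)))
        s += sum(W_ctx[j][k] * context[k] for k in range(len(context)))
        hidden.append(math.tanh(s))
    output = [sum(W_out[o][j] * hidden[j] for j in range(len(hidden)))
              for o in range(len(W_out))]
    return output, hidden  # hidden becomes the next step's context

# Tiny hypothetical network: 1 input, 2 hidden units, 1 output.
W_in = [[0.5], [-0.3]]
W_ctx = [[0.1, 0.2], [0.0, 0.4]]
W_out = [[1.0, -1.0]]

context = [0.0, 0.0]
for x_t in [0.0, 1.0, 1.0]:
    y, context = elman_step([x_t], context, W_in, W_ctx, W_out)
```

Because the context carries state between steps, feeding the same input twice in a row produces different outputs, which is exactly the temporal sensitivity the abstract contrasts with FNNs.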
Year: 2002
Corresponding Series Added Entry: Asian Institute of Technology. Thesis; no. CS-02-08
Type: Thesis
School: School of Advanced Technologies (SAT)
Department: Department of Information and Communications Technologies (DICT)
Academic Program/FoS: Computer Science (CS)
Chairperson(s): Huynh Ngoc Phien
Examination Committee(s): Sadananda, Ramakoti
Degree: Thesis (M.Eng.) - Asian Institute of Technology, 2002

