Abstract

Recurrent Neural Networks (RNNs) are a class of neural networks in which self-loops and backward connections between nodes are allowed. One consequence of this is that recurrent networks can produce dynamic behaviors that are impossible for strictly feed-forward networks, such as limit cycles and chaos. To gain further insight into RNNs, this study focused on the network performance of two fast on-line algorithms, namely Error Back Propagation and Exponentially Weighted Least Squares (EBP-EWLS), and Accelerating Convergence using Approximated Gradient (ACAG).
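To make the recurrent structure concrete, the following is a minimal sketch of one time step of a recurrent network in Python, assuming an Elman-style formulation with a self-recurrent hidden state; the exact architecture examined in this study may differ.

    import numpy as np

    def recurrent_step(x, h_prev, W_in, W_rec, W_out, b_h, b_y):
        # The hidden state feeds back through W_rec; this backward
        # connection is what distinguishes an RNN from a feed-forward net.
        h = np.tanh(W_in @ x + W_rec @ h_prev + b_h)
        y = W_out @ h + b_y
        return y, h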
To evaluate the performance of the two aforementioned algorithms on the forecasting problem, three types of data were considered: daily stock prices, quarterly exports and gross domestic product, and daily streamflow.
In terms of the efficiency index, root mean squared error, mean absolute deviation, and maximum relative error, both algorithms perform very satisfactorily, with EBP-EWLS being slightly better. However, EBP-EWLS requires more computation time than ACAG.
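For reference, the four error measures can be computed as in the sketch below; the efficiency index is assumed here to be the Nash-Sutcliffe definition commonly used in streamflow forecasting, which may differ from the exact definition adopted in the study.

    import numpy as np

    def forecast_metrics(y_true, y_pred):
        err = y_true - y_pred
        rmse = np.sqrt(np.mean(err ** 2))            # root mean squared error
        mad = np.mean(np.abs(err))                   # mean absolute deviation
        mre = np.max(np.abs(err) / np.abs(y_true))   # maximum relative error
        # Efficiency index (assumed Nash-Sutcliffe form):
        ei = 1.0 - np.sum(err ** 2) / np.sum((y_true - np.mean(y_true)) ** 2)
        return {"EI": ei, "RMSE": rmse, "MAD": mad, "MRE": mre}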
Based upon the results obtained, a new algorithm was developed by combining three different methods: Error Back Propagation, Error Self-Recurrent, and Recursive Least Squares. When applied to the above data sets, the new algorithm proved considerably faster than EBP-EWLS and, at the same time, eliminated the very sensitive parameter of the ACAG algorithm. Moreover, the new algorithm performs much better than ACAG and reaches almost the same performance as EBP-EWLS.
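Of the three ingredients named above, the recursive least squares update is standard and can be sketched as follows for a linear output layer; how it is combined with back propagation and the error self-recurrent method is not specified in this abstract, so the snippet is illustrative only, and the forgetting factor lam is an assumed parameter.

    import numpy as np

    def rls_update(w, P, h, target, lam=0.99):
        # Standard recursive least squares step for linear weights w,
        # with regressor h (e.g., hidden activations) and inverse
        # correlation matrix estimate P.
        k = P @ h / (lam + h @ P @ h)         # gain vector
        e = target - w @ h                    # a priori prediction error
        w = w + k * e                         # weight correction
        P = (P - np.outer(k, h @ P)) / lam    # inverse correlation update
        return w, P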
Finally, a simple analysis of the complexity of the RNNs was also carried out. It was found that as the number of hidden nodes increases, both the storage and the computation time increase. For the same number of added nodes, adding them to the hidden layer raises the storage and computation time much more than adding them to the input layer.
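As a rough illustration of why hidden-node growth dominates, the count below assumes a fully recurrent hidden layer in which every hidden node connects to every hidden node, including itself; the exact connectivity used in the study may differ.

    def weight_count(n_in, n_hidden, n_out):
        # Approximate weight count for a network with a fully
        # recurrent hidden layer.
        return (n_in * n_hidden        # input-to-hidden weights
                + n_hidden * n_hidden  # recurrent hidden-to-hidden weights
                + n_hidden * n_out)    # hidden-to-output weights

    base = weight_count(5, 10, 1)
    print(weight_count(7, 10, 1) - base)  # +2 input nodes  -> 20 extra weights
    print(weight_count(5, 12, 1) - base)  # +2 hidden nodes -> 56 extra weights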