Predicting Electric Power Energy Using a Recurrent Neural Network Forecasting Model
DOI: https://doi.org/10.21928/juhd.v4n2y2018.pp53-60

Abstract
Electricity is one of the most important energy sources in the world and has played a major role in the development of several sectors. In this study, two electricity variables were used: the first was the demand for electric power, and the second was the consumption, or energy load, in Sulaimani city. The main goal of the study was to construct an analytic recurrent neural network (RNN) model for both variables. This model has a strong ability to detect complex patterns in time-series data, which makes it well suited to the data under consideration. It is also more sensitive and reliable than other artificial neural networks (ANNs), so it can handle more complex data that may be chaotic, seismic, etc. It can also handle the nonlinear behaviour that is common in time series, and it treats such data differently from other models. This research determined and defined the best RNN model for electricity demand and consumption, applied at two levels. The first level assesses the complexity of the suggested model (1-5-10-1) using the error criteria (MSE: mean square error; AIC: Akaike information criterion; and R²: coefficient of determination). The second level uses the suggested model to forecast the demand for electric power energy and the value of each unit. A further result of this study is the identification of a suitable algorithm for such complex data: the Levenberg-Marquardt algorithm was found to be the most reliable and to require the least time to give accurate results in this study.
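To make the elements named in the abstract concrete, two illustrative sketches follow. They are not the paper's own code, and all function and variable names are chosen here for illustration only.

A minimal sketch of one reading of the 1-5-10-1 topology (one input, recurrent hidden layers of 5 and 10 units, one output), written in Keras and assuming a univariate time series. Keras does not provide a Levenberg-Marquardt trainer, so Adam appears below purely as a stand-in optimizer, not as the study's training algorithm:

```python
import tensorflow as tf

# Hypothetical 1-5-10-1 recurrent model: 1 input feature, SimpleRNN layers
# of 5 and 10 units, and a single linear output for the forecast value.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 1)),                  # (timesteps, 1 feature)
    tf.keras.layers.SimpleRNN(5, return_sequences=True),
    tf.keras.layers.SimpleRNN(10),
    tf.keras.layers.Dense(1),
])
# Adam is a stand-in here; the study itself trains with Levenberg-Marquardt.
model.compile(optimizer="adam", loss="mse")
model.summary()
```

For reference, the Levenberg-Marquardt step for a weight vector w, residual vector e, Jacobian J of the residuals, and damping factor mu takes the standard form w_{k+1} = w_k - (J^T J + mu I)^{-1} J^T e, which interpolates between Gauss-Newton (small mu) and gradient descent (large mu).

The error criteria named above can be computed as in the sketch below; the AIC is written in its common least-squares form, n ln(SSE/n) + 2k, which may differ by a constant from the exact form used in the study:

```python
import numpy as np

def forecast_criteria(actual, predicted, n_params):
    """MSE, AIC (least-squares form), and R^2 for a set of forecasts."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    n = actual.size

    sse = np.sum((actual - predicted) ** 2)        # sum of squared errors
    mse = sse / n                                  # mean square error
    aic = n * np.log(sse / n) + 2 * n_params       # Akaike information criterion
    sst = np.sum((actual - actual.mean()) ** 2)    # total sum of squares
    r2 = 1.0 - sse / sst                           # coefficient of determination
    return {"MSE": mse, "AIC": aic, "R2": r2}

# Illustrative values only, not data from the study.
observed = [310.0, 295.5, 320.1, 334.7, 301.2]
forecast = [305.2, 298.0, 318.9, 330.1, 306.4]
print(forecast_criteria(observed, forecast, n_params=5))
```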