
Two Machine Learning Approaches for Short-Term Wind Speed Time Series Prediction

5. CONCLUSION

Effective management of smart grids includes several elements of distributed intelligent control on the supplier and the consumer sides. On the supplier side, a successful integration of renewable power sources and handling of the associated uncertainties are pivotal for the reliability of the power network.

For wind power in particular, a crucial element is to have accurate and stable predictions of wind speed, while concurrently quantifying the uncertainty associated with these predictions.

In this paper, we have proposed and compared two machine learning approaches for estimating prediction intervals (PIs): a multi-objective genetic algorithm-trained neural network (MOGA NN) and an extreme learning machine (ELM) combined with the nearest-neighbors approach.

The algorithms have been applied to a case study of short-term wind speed prediction, using a real dataset of hourly wind speed measurements.
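
To illustrate the general idea behind the second approach, the sketch below trains a basic ELM point predictor and derives a prediction interval from the empirical residuals of the k nearest training neighbors. The function names, the number of hidden nodes and the neighborhood size are illustrative assumptions; the exact construction used in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, y, n_hidden=50):
    """Train a basic ELM: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (kept fixed)
    b = rng.normal(size=n_hidden)                 # random biases (kept fixed)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

def knn_prediction_interval(model, X_train, y_train, x_new, k=20, alpha=0.1):
    """PI from the residual distribution of the k nearest training points."""
    residuals = y_train - predict_elm(model, X_train)
    dist = np.linalg.norm(X_train - x_new, axis=1)
    local_res = residuals[np.argsort(dist)[:k]]            # residuals of nearest neighbors
    lo, hi = np.quantile(local_res, [alpha / 2, 1 - alpha / 2])
    y_hat = predict_elm(model, x_new[None, :])[0]
    return y_hat + lo, y_hat + hi                          # (1 - alpha) prediction interval
```

The sketch only conveys the local, residual-based construction of the intervals around an ELM point forecast; it is not the exact configuration evaluated in the case study.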

Contrary to classical time-series prediction methods, both proposed approaches generate prediction intervals for the target of interest. Knowledge of the PIs allows decision makers and operational planners to quantify the level of uncertainty associated with the forecasts and to consider a multiplicity of solutions/scenarios for the best and worst conditions.

Both algorithms show good accuracy and generalization ability in the conducted case study.


The results do not show significant differences in terms of the quality of the predicted PIs. We can conclude that both methods yield a reliable estimation of the PIs, with high coverage and a relatively small interval size.
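
Coverage and interval size are commonly quantified by the PI coverage probability (PICP) and a normalized measure of the mean PI width. The sketch below shows one standard way to compute these two quantities; the exact normalization used in the paper may differ.

```python
import numpy as np

def pi_coverage_probability(y_true, lower, upper):
    """Fraction of observations falling inside their prediction intervals (PICP)."""
    inside = (y_true >= lower) & (y_true <= upper)
    return inside.mean()

def normalized_mean_pi_width(y_true, lower, upper):
    """Mean interval width normalized by the range of the observed targets (NMPIW)."""
    return (upper - lower).mean() / (y_true.max() - y_true.min())
```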

The approaches to estimate the PIs are based on very different concepts and can be selected depending on the specific requirements of the user, including quality of the results, generalization ability, computational efficiency, flexibility and ease of use.

Generally, if an algorithm is specifically trained to optimize two objectives, it is expected to be superior to an algorithm trained to optimize a single error criterion. However, this could not be observed in this research. The presented results indicate that the generalization ability of the ELM on the training dataset is representative of the performance on new data patterns, and multi-objective optimization is therefore not required in this case.
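
For context, a bi-objective training scheme such as the MOGA NN compares candidate interval models by Pareto dominance on the two PI objectives (coverage to be maximized, width to be minimized) rather than by a single scalar loss. A minimal dominance check of that kind could look as follows; it is given purely for illustration and is not the specific selection routine used in the paper.

```python
def dominates(a, b):
    """True if candidate a Pareto-dominates candidate b.

    Each candidate is a tuple (picp, nmpiw): coverage (picp) is maximized,
    normalized interval width (nmpiw) is minimized.
    """
    no_worse = a[0] >= b[0] and a[1] <= b[1]
    strictly_better = a[0] > b[0] or a[1] < b[1]
    return no_worse and strictly_better
```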

Both applied algorithms are data-driven and depend highly on the representativeness of the training dataset. Therefore, the quality of PIs can decrease on datasets with large variability and uncertainty in the data.

A possible direction of future research is to implement online learning algorithms that are able to adjust their parameters as novel patterns emerge, without retraining the whole algorithm.

This would be particularly useful for applications in which the available dataset is too short to cover all possible patterns, or in which the environmental or operational conditions change.
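
As an example of such an online scheme, an online-sequential variant of the ELM updates its output weights recursively with each new chunk of data instead of retraining from scratch. The sketch below follows the standard recursive least-squares update; the variable names and dimensions are illustrative assumptions.

```python
import numpy as np

def os_elm_update(beta, P, H_new, y_new):
    """Recursive least-squares update of ELM output weights for a new data chunk.

    beta  : current output weights, shape (n_hidden, 1)
    P     : current inverse correlation matrix, shape (n_hidden, n_hidden)
    H_new : hidden-layer activations of the new samples, shape (n_new, n_hidden)
    y_new : new targets, shape (n_new, 1)
    """
    n_new = H_new.shape[0]
    K = np.linalg.inv(np.eye(n_new) + H_new @ P @ H_new.T)
    P = P - P @ H_new.T @ K @ H_new @ P                   # update inverse correlation matrix
    beta = beta + P @ H_new.T @ (y_new - H_new @ beta)    # correct weights with new residuals
    return beta, P
```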


PAPER IV

An Interval-Valued Neural Network Approach for Prediction Uncertainty