
5.2 Regulation price difference

5.2.5 Regulation price difference forecast for 2013

Table 5 and Figures 49-57 present analogous results for the year 2013.

Table 5. Estimates for the four seasons in 2013

Season   Estimated error   Minimum MSE   Trees   Maximum splits   Learning rate
Winter   45.4290           37.9877       11      32               0.25
Spring   10.7687           83.5897       11      265              0.25
Summer   86.9463           73.715        6       16               0.25
Fall     43.1449           40.0539       23      8                0.25

[Plot: minimum observed and estimated objective vs. number of function evaluations]

Figure 49. Ensemble regression with the least estimated cross-validation loss for Winter 2013.

[Plot: ensemble MSE vs. number of trees for learning rates 0.10, 0.25, 0.50 and 1.00, deep tree and stump, with the minimum MSE marked]

Figure 50. Predictive ensemble for Winter 2013.

[Plot: original and predicted regulation price difference, November-March 2013]

Figure 51. Forecast made for the regulation price difference for Winter 2013.

[Plot: minimum observed and estimated objective vs. number of function evaluations]

Figure 52. Ensemble regression with the least estimated cross-validation loss for Spring 2013.

[Plot: original and predicted regulation price difference, April-May 2013]

Figure 53. Forecast made for the regulation price difference for Spring 2013.

[Plot: minimum observed and estimated objective vs. number of function evaluations]

Figure 54. Ensemble regression with the least estimated cross-validation loss for Summer 2013.

[Plot: original and predicted regulation price difference, June-August 2013]

Figure 55. Forecast made for the regulation price difference for Summer 2013.

[Plot: ensemble MSE vs. number of trees for learning rates 0.10, 0.25, 0.50 and 1.00, deep tree and stump, with the minimum MSE marked]

Figure 56. Predictive ensemble for Fall 2013.

[Plot: original and predicted regulation price difference, September-October 2013]

Figure 57. Forecast made for the regulation price difference for Fall 2013.

The forecast was made before gate closure, so the obtained prediction can be used to make the necessary adjustments to the demand and supply of electricity; bids can also be adjusted. The tabulated estimates show that the larger the number of trees used in the ensemble, the better the performance of the predictions.
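The effect of ensemble size can be illustrated with a small sketch. This uses scikit-learn in Python rather than the thesis's own pipeline, and the data is synthetic; only the general pattern (cross-validated MSE typically shrinking as trees are added) is what it demonstrates.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: four predictors and a noisy target, loosely
# mimicking a seasonal feature set (all names here are illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 4))
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + rng.normal(scale=0.3, size=400)

# Cross-validated MSE for ensembles of increasing size: averaging more
# trees reduces variance, so the error usually drops and then flattens.
mses = {}
for n_trees in (5, 25, 100):
    mses[n_trees] = -cross_val_score(
        RandomForestRegressor(n_estimators=n_trees, random_state=0),
        X, y, scoring="neg_mean_squared_error", cv=5,
    ).mean()
    print(f"{n_trees:4d} trees: CV MSE = {mses[n_trees]:.3f}")
```

On data like this the 100-tree ensemble scores a lower cross-validated MSE than the 5-tree one, matching the tendency noted above.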

For winter and summer it can be seen that the predictions were reasonably accurate, as the original prices were captured in the forecasts. During March the regulation price is low, which could be explained by melting snow filling the rivers and reservoirs of hydro power stations and by a reduction in electricity consumption; regulation is therefore offered at low prices. The forecast performance for spring and fall is less accurate, as most of the spikes are not well captured.

6 Conclusions

As the proportion of wind and other power generation grows in the European power supply, uncertainty arises in the predictability of power production. In most cases there are unforeseen fluctuations between production and consumption, causing the market to be unstable, yet an effective market requires a stable power system. The forecast results for the regulating power market are therefore of great interest to market players with unpredictably varying production and demand. Coupling the prediction of the regulation price difference with the price direction gives useful information about the regulation market, as it enables the TSOs to strike a balance between areas and gives market players an opportunity to manage their risk in the market. Thus, this thesis concentrated on forecasting electricity prices in the regulating power market. The main goal was to obtain forecasts for the regulation price direction and the regulation price difference. Variables with a vital impact on the electricity price, including temperature, wind power production, production difference and electricity prognosis, were considered for the predictive ensemble.

The data was categorized yearly from 2013 to 2017 and then split into four seasons, i.e. winter, spring, summer and fall. The analysis started by dividing each data set into a training set and a test set, which were used in building the forest ensemble. An estimate of the generalization error was obtained for each season of the year using 10-fold cross-validation. We then tuned the hyperparameters and the tree-complexity level in order to obtain a reliable predictive ensemble, which was selected based on the learning rate, the MSE and the number of trees used in the ensemble.
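The workflow described above (train/test split, 10-fold cross-validation, tuning the learning rate, tree count and tree complexity) can be sketched in Python with scikit-learn. This is not the tooling used in the thesis; the data is synthetic and all names are illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for one season's feature matrix (e.g. temperature,
# wind power production, production difference, electricity prognosis)
# and the regulation price difference as the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([2.0, -1.0, 0.5, 1.5]) + rng.normal(scale=0.5, size=500)

# Hold out a test set, then tune the boosted ensemble with 10-fold
# cross-validation on the training portion.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
grid = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid={
        "learning_rate": [0.10, 0.25, 0.50, 1.00],  # rates from Figs. 50/56
        "n_estimators": [10, 50],                    # number of trees
        "max_depth": [1, 3],                         # 1 = stump, 3 = deeper tree
    },
    scoring="neg_mean_squared_error",
    cv=10,
)
grid.fit(X_train, y_train)
cv_mse = -grid.best_score_  # cross-validated estimate of generalization error
test_mse = np.mean((grid.predict(X_test) - y_test) ** 2)
```

The selected configuration is then the one minimizing the cross-validated MSE, analogous to the per-season estimates reported in Table 5.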

From the results obtained, we were able to capture the original price differences with a reasonable level of accuracy, although the forecasts for spring and fall were less reliable, as the spikes were not captured. It can be concluded that the random forest algorithm can produce reliable predictions for the regulation price direction and the price difference if the tuning is deep enough and if more trees are incorporated in the ensemble.
