Designing an Early Warning System: Prediction Accuracy of Currency Crises Using the k-Nearest Neighbour Method

Developing a stable early warning system (EWS)
model capable of giving accurate predictions is a challenging
task. This paper introduces the k-nearest neighbour (k-NN) method,
which has not previously been applied to currency crisis prediction, with the
aim of increasing prediction accuracy. The performance of the proposed
k-NN depends on the choice of distance measure; in
our analysis we consider the Euclidean and Manhattan
distances. For comparison, we employ three other methods:
logistic regression analysis (logit), back-propagation neural
network (NN), and sequential minimal optimization (SMO). The
analysis, using datasets from eight countries with 13 macro-economic
indicators for each country, shows that the proposed k-NN method
with k = 4 and the Manhattan distance performs better than the other
methods.
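
For illustration, the sketch below shows the kind of k-NN classification setup described above, with k = 4 and the Manhattan (L1) distance, implemented in Python with scikit-learn rather than the tooling used in the paper. The feature matrix (13 indicators per observation) and the binary crisis labels here are synthetic placeholders, since the actual country datasets are not reproduced in this abstract.

```python
# Minimal k-NN sketch under assumed, synthetic data:
# X stands in for 13 macro-economic indicators per observation,
# y for binary crisis / no-crisis labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 13))      # placeholder indicator matrix
y = rng.integers(0, 2, size=200)    # placeholder crisis labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# k = 4 neighbours with the Manhattan (L1) distance,
# the best-performing setting reported in the abstract.
knn = KNeighborsClassifier(n_neighbors=4, metric="manhattan")
knn.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, knn.predict(X_test)))
```

Swapping `metric="manhattan"` for `metric="euclidean"` reproduces the other distance considered in the comparison.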




