Forecasting Grape Juice Flavor Using Support Vector Regression

Research on juice flavor forecasting has become increasingly
important in China. Driven by rapid economic growth, many kinds of
juice have been introduced to the Chinese market, and a beverage
company that understands its customers' preferences well can offer
more appealing products. This study therefore introduces the basic
theory and computational procedure of grape juice flavor forecasting
based on support vector regression (SVR). Applying SVR, a
back-propagation neural network (BPN), and linear regression (LR) to
real data shows that SVR delivers more suitable and effective
predictive performance for grape juice flavor.
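
For illustration only, the listing below is a minimal sketch of such a three-model comparison, assuming Python with scikit-learn. The features and flavor ratings are synthetic placeholders rather than the study's sensory data, and the hyperparameters (RBF kernel, C, epsilon, hidden-layer size) are illustrative defaults, not the values used in this work.

import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

# Hypothetical data: rows are juice samples, columns are measured
# attributes (e.g., sugar content, acidity); y is a flavor rating.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([0.5, -0.2, 0.3, 0.1, 0.4]) + rng.normal(scale=0.1, size=100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Standardize features; SVR and BPN are sensitive to input scale.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "SVR": SVR(kernel="rbf", C=1.0, epsilon=0.1),
    "BPN": MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
    "LR": LinearRegression(),
}

# Fit each model and report out-of-sample RMSE as the comparison metric.
for name, model in models.items():
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"{name}: RMSE = {rmse:.4f}")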
