@article{Gu2015,
  title    = "Incremental learning for ν-Support Vector Regression",
  journal  = "Neural Networks",
  year     = "2015",
  issn     = "0893-6080",
  doi      = "10.1016/j.neunet.2015.03.013",
  url      = "http://www.sciencedirect.com/science/article/pii/S0893608015000696",
  author   = "Bin Gu and Victor S. Sheng and Zhijie Wang and Derek Ho and Said Osman and Shuo Li",
  keywords = "Incremental learning, Online learning, ν-Support Vector Regression, Support vector machine",
  abstract = "The ν-Support Vector Regression (ν-SVR) is an effective regression learning algorithm, which has the advantage of using a parameter ν to control the number of support vectors and to adjust the width of the tube automatically. However, compared to ν-Support Vector Classification (ν-SVC) (Schölkopf et al., 2000), ν-SVR introduces an additional linear term into its objective function. Thus, directly applying the accurate on-line ν-SVC algorithm (AONSVM) to ν-SVR will not generate an effective initial solution, which is the main challenge in designing an incremental ν-SVR learning algorithm. To overcome this challenge, we propose a special procedure called initial adjustments in this paper. This procedure adjusts the weights of ν-SVC based on the Karush-Kuhn-Tucker (KKT) conditions to prepare an initial solution for the incremental learning. Combining the initial adjustments with the two steps of AONSVM produces an exact and effective incremental ν-SVR learning algorithm (INSVR). Theoretical analysis has proven the existence of the three key inverse matrices, which are the cornerstones of the three steps of INSVR (including the initial adjustments), respectively. The experiments on benchmark datasets demonstrate that INSVR avoids infeasible updating paths as far as possible, and successfully converges to the optimal solution. The results also show that INSVR is faster than batch ν-SVR algorithms with both cold and warm starts."
}