In this paper, we present a least squares version of support vector machine (SVM) classifiers and function estimation. Due to the equality-type constraints in the formulation, the solution follows from solving a set of linear equations, instead of the quadratic programming required for classical SVMs. The approach is illustrated on a two-spiral benchmark classification problem. The results show that the LS-SVM is an efficient method for solving pattern recognition problems.
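As a rough illustration of this idea (our sketch, not the authors' implementation), the following trains an LS-SVM classifier with an RBF kernel by solving the KKT linear system in the bias b and the support values alpha; the function names and the hyperparameters gamma and sigma are assumed for demonstration.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Pairwise RBF kernel: K(x, z) = exp(-||x - z||^2 / sigma^2)
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / sigma ** 2)

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Equality constraints make training a linear system (no QP):
    #   [ 0      y^T             ] [ b ]   [ 0 ]
    #   [ y   Omega + I / gamma  ] [ a ] = [ 1 ]
    # with Omega_ij = y_i y_j K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_classify(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage on two Gaussian clouds labeled -1 / +1 (illustrative data only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_train(X, y)
print(lssvm_classify(X, y, alpha, b, X))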
In this paper, we present a new class of support vector machines: least squares support vector machines (LS-SVMs). While standard SVM solutions involve solving quadratic or linear programming problems, the least squares version of SVMs corresponds to solving a set of linear equations, due to equality instead of inequality constraints in the problem formulation. In LS-SVMs, the Mercer condition is still applicable, so several types of kernels, such as polynomial, RBF, and MLP kernels, can be used. Here we apply LS-SVMs to time series prediction and compare them with radial basis function neural networks. We consider a Mackey-Glass time series corrupted by Gaussian and uniform noise. The results show that least squares support vector machines are excellent for time series prediction even under high noise.
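To make the prediction setting concrete, here is a self-contained sketch (our illustration, not the paper's experimental setup) that generates a Mackey-Glass series by simple Euler integration, adds Gaussian noise, and fits LS-SVM function estimation via the corresponding linear system; the embedding lags, noise level, train/test split, and the hyperparameters gamma and sigma are assumed values for demonstration.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Pairwise RBF kernel matrix
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / sigma ** 2)

def mackey_glass(n=1200, tau=17, x0=1.2):
    # Euler integration (unit step) of
    # dx/dt = 0.2 x(t - tau) / (1 + x(t - tau)^10) - 0.1 x(t)
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + 0.2 * x[t - tau] / (1.0 + x[t - tau] ** 10) - 0.1 * x[t]
    return x[tau:]

def embed(series, lags=(0, 6, 12, 18), horizon=1):
    # Inputs are lagged samples; the target is the value `horizon` steps ahead
    m = max(lags)
    T = len(series) - m - horizon
    X = np.stack([series[m - l : m - l + T] for l in lags], axis=1)
    return X, series[m + horizon : m + horizon + T]

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    # LS-SVM function estimation: solve the linear system
    #   [ 0    1^T          ] [ b ]   [ 0 ]
    #   [ 1    K + I/gamma  ] [ a ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, support values alpha

# One-step-ahead prediction on a noisy series (assumed noise level 0.05)
series = mackey_glass() + np.random.default_rng(0).normal(0.0, 0.05, 1200)
X, y = embed(series)
Xtr, ytr, Xte, yte = X[:500], y[:500], X[500:700], y[500:700]
b, alpha = lssvm_fit(Xtr, ytr)
pred = rbf_kernel(Xte, Xtr) @ alpha + b
print("test RMSE:", np.sqrt(np.mean((pred - yte) ** 2)))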