Abstract
Mixture modeling is becoming an increasingly important tool in the remote sensing community as researchers attempt to resolve subpixel area information. This paper compares a well-established technique, the linear spectral mixture model (LSMM), with a much newer idea based on data selection, the support vector machine (SVM). It is shown that the constrained least squares LSMM is equivalent to the linear SVM; the proof relies on showing that the LSMM algorithm possesses the "maximum margin" property. This in turn shows that the LSMM algorithm can be derived from the same optimality conditions as the linear SVM, which provides important insights about the role of the bias term and of rank deficiency in the pure pixel matrix within the LSMM algorithm. It also highlights one of the main advantages of the linear SVM algorithm: it performs automatic "pure pixel" selection from a much larger database. In addition, extensions to the basic SVM algorithm allow the technique to be applied to data sets that exhibit spectral confusion (overlapping sets of pure pixels) and to data sets with nonlinear mixture regions. Several illustrative examples, based on an area-labeled Landsat dataset, demonstrate the potential of this approach.
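The constrained least squares LSMM referred to in the abstract estimates subpixel abundances that are nonnegative and sum to one. A minimal sketch of that unmixing step, assuming a small endmember ("pure pixel") matrix with made-up spectra for illustration, could look like this (the function name `unmix` and all numbers are hypothetical, not from the paper):

```python
# Sketch of a fully constrained linear spectral mixture model (LSMM):
# given endmember spectra E (bands x endmembers) and a mixed pixel
# spectrum y, find abundances f >= 0 with sum(f) = 1 minimizing
# the least-squares residual ||E f - y||^2.
import numpy as np
from scipy.optimize import minimize

def unmix(E, y):
    """Solve min ||E f - y||^2 subject to f >= 0 and sum(f) = 1."""
    m = E.shape[1]
    res = minimize(
        lambda f: np.sum((E @ f - y) ** 2),
        x0=np.full(m, 1.0 / m),                 # start at uniform abundances
        bounds=[(0.0, None)] * m,               # nonnegativity constraint
        constraints=[{"type": "eq",             # sum-to-one constraint
                      "fun": lambda f: np.sum(f) - 1.0}],
        method="SLSQP",
    )
    return res.x

# Two hypothetical pure-pixel spectra (columns) over four bands.
E = np.array([[0.1, 0.8],
              [0.2, 0.7],
              [0.3, 0.6],
              [0.4, 0.5]])
y = 0.3 * E[:, 0] + 0.7 * E[:, 1]  # pixel mixed with 30/70 abundances
f = unmix(E, y)                    # recovers approximately [0.3, 0.7]
```

Because the endmember columns are linearly independent, this quadratic program has a unique minimizer, so the recovered abundances match the true mixing fractions. The paper's point is that this same solution can be obtained from the optimality conditions of a linear SVM.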
Publication Info
- Year: 2000
- Type: article
- Volume: 38
- Issue: 5
- Pages: 2346-2360
- Citations: 206
- Access: Closed
Identifiers
- DOI: 10.1109/36.868891