Abstract

We describe a widely applicable method of grouping, or clustering, image features (such as points, lines, corners, flow vectors and the like). It takes as input a "proximity matrix" H: a square, symmetric matrix of dimension N (where N is the number of features). The element (i, j) of H is an initial estimate of the "proximity" between the ith and jth features. As output it delivers another square symmetric matrix S whose (i, j)th element is near to, or much less than, unity according as features i and j are to be assigned to the same or different clusters. To find S we first determine the eigenvalues and eigenvectors of H and re-express the features as linear combinations of a limited number of these eigenvectors, namely those with the largest eigenvalues. The cosines between the resulting vectors are the elements of S. We demonstrate the application of the method to a range of examples and briefly discuss various theoretical and computational issues.
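The procedure described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the function name `relocalise` and the parameter `k` (the number of leading eigenvectors retained) are chosen here for clarity.

```python
import numpy as np

def relocalise(H, k):
    """Sketch of the abstract's method: project each feature onto the
    k leading eigenvectors of the proximity matrix H, then return the
    matrix S of cosines between the projected feature vectors."""
    w, V = np.linalg.eigh(H)            # eigh: eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]       # indices of the k largest eigenvalues
    F = V[:, idx]                       # row i re-expresses feature i
    norms = np.linalg.norm(F, axis=1, keepdims=True)
    norms[norms == 0] = 1.0             # guard against zero-length rows
    U = F / norms                       # unit vectors
    return U @ U.T                      # S[i, j] = cosine between rows i and j
```

On an idealised block-diagonal proximity matrix (two disjoint clusters), S[i, j] comes out as 1 for features in the same cluster and 0 across clusters, matching the behaviour the abstract describes.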

Keywords

Eigenvalues and eigenvectors, cluster analysis, symmetric matrix, square matrix, modal matrix, diagonalizable matrix, direction cosine, dimension (graph theory), algorithm, mathematical analysis, geometry, combinatorics, mathematics, statistics, computer science

Publication Info

Year: 1990
Type: article
Pages: 20.1-20.6
Citations: 91
Access: Closed



Cite This

G. Scott, H. C. Longuet-Higgins (1990). Feature grouping by 'relocalisation' of eigenvectors of the proximity matrix. Proceedings of the British Machine Vision Conference, 20.1-20.6. https://doi.org/10.5244/c.4.20

Identifiers

DOI
10.5244/c.4.20