Abstract

This paper presents spacetime forests defined over complementary spatial and temporal features for recognition of naturally occurring dynamic scenes. The approach improves on the previous state-of-the-art in both classification and execution rates. A particular improvement is increased robustness to camera motion, which has posed difficulty for previous approaches. There are three key novelties in the approach. First, a novel spacetime descriptor is employed that exploits the complementary nature of spatial and temporal information, as inspired by previous research on the role of orientation features in scene classification. Second, a forest-based classifier is used to learn a multi-class representation of the feature distributions. Third, the video is processed in temporal slices with scale matched preferentially to scene dynamics over camera motion. Slicing allows for temporal alignment to be handled as latent information in the classifier and for efficient, incremental processing. The integrated approach is evaluated empirically on two publicly available datasets to document its outstanding performance.
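The abstract outlines a three-stage pipeline: complementary spatial and temporal features computed per temporal slice, a forest classifier learned over those descriptors, and slice-level predictions aggregated per video. The sketch below illustrates that flow only in broad strokes and under stated assumptions: the descriptor (plain gradient-orientation and temporal-derivative histograms), the slice length, the majority vote, and the use of scikit-learn's RandomForestClassifier as a stand-in for the paper's spacetime forest are all simplifications introduced here, not the authors' actual oriented-filtering or forest construction.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def slice_descriptor(clip, n_bins=8):
        """Complementary spatial/temporal descriptor for one temporal slice.

        clip: (T, H, W) grayscale frames. Spatial part: histogram of spatial
        gradient orientations; temporal part: histogram of temporal derivative
        magnitudes. Both are illustrative stand-ins for the paper's oriented
        filter energies.
        """
        clip = clip.astype(np.float64)
        gt, gy, gx = np.gradient(clip)   # temporal, vertical, horizontal derivatives
        # Spatial channel: orientation histogram weighted by gradient magnitude.
        theta = np.arctan2(gy, gx)
        mag = np.hypot(gx, gy)
        spat, _ = np.histogram(theta, bins=n_bins, range=(-np.pi, np.pi), weights=mag)
        # Temporal channel: distribution of absolute temporal change.
        temp, _ = np.histogram(np.abs(gt), bins=n_bins,
                               range=(0.0, np.abs(gt).max() + 1e-8))
        feat = np.concatenate([spat, temp])
        return feat / (np.linalg.norm(feat) + 1e-8)   # L2-normalise

    def video_to_slices(video, slice_len=16):
        """Cut a (T, H, W) video into non-overlapping temporal slices."""
        return [video[i:i + slice_len]
                for i in range(0, len(video) - slice_len + 1, slice_len)]

    def fit_forest(videos, labels, slice_len=16, n_trees=100):
        """Train a random forest on slice-level descriptors; each slice
        inherits its video's label."""
        X, y = [], []
        for video, label in zip(videos, labels):
            for s in video_to_slices(video, slice_len):
                X.append(slice_descriptor(s))
                y.append(label)
        forest = RandomForestClassifier(n_estimators=n_trees, random_state=0)
        forest.fit(np.asarray(X), np.asarray(y))
        return forest

    def predict_video(forest, video, slice_len=16):
        """Classify each slice, then take a majority vote for the video label."""
        X = np.asarray([slice_descriptor(s) for s in video_to_slices(video, slice_len)])
        votes = forest.predict(X)
        vals, counts = np.unique(votes, return_counts=True)
        return vals[np.argmax(counts)]

Classifying per slice and voting afterwards mirrors the idea that temporal alignment can be left latent: each slice is scored independently, so no explicit alignment of the video to a canonical start time is required before classification.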

Keywords

Computer science, Artificial intelligence, Robustness, Classifier, Slicing, Pattern recognition, Spacetime, Computer vision, Feature extraction, Computer graphics

Related Publications

Recognizing indoor scenes

We propose a scheme for indoor place identification based on the recognition of global scene views. Scene views are encoded using a holistic representation that provides low-res...

2009 · 2009 IEEE Conference on Computer Visi... · 1464 citations

Publication Info

Year: 2013
Type: article
Pages: 56.1-56.11
Citations: 36
Access: Closed

Citation Metrics

36 (OpenAlex)

Cite This

Christoph Feichtenhofer, Axel Pinz, Richard P. Wildes (2013). Spacetime Forests with Complementary Features for Dynamic Scene Recognition. In Proceedings of the British Machine Vision Conference (BMVC), 56.1-56.11. https://doi.org/10.5244/c.27.56

Identifiers

DOI: 10.5244/c.27.56