Omni-directional motion: pedestrian shape classification using neural networks and active contour models
Author
Tabb, Ken
George, S.
Davey, N.
Adams, R.G.
Abstract
This paper describes a hybrid vision system which, following initial user interaction, can detect and track objects in the visual field and classify them as human or non-human. The system incorporates an active contour model for detecting and tracking objects, a method for translating the contours into scale-, location- and resolution-independent vectors, and an error-backpropagation feedforward neural network for shape classification of these vectors. The network generates a confidence value for a given shape, indicating how ‘human’ or how ‘non-human’ it considers the shape to be. This confidence value changes as the object moves, providing a motion signature for the object. Previous work accommodated lateral pedestrian movement across the visual field; this paper describes a system which accommodates all angles of pedestrian movement on the ground plane.
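The following is a minimal Python sketch, not the authors' implementation, of the two stages outlined above: normalising a closed contour into a scale-, location- and resolution-independent vector, and scoring that vector with a small error-backpropagation feedforward network whose output serves as a ‘human’ confidence value. All names, the vector encoding (fixed-length resampling along arc length) and the network sizes are illustrative assumptions.

    import numpy as np

    def contour_to_vector(points, n_samples=32):
        """Resample a closed contour to a fixed-length, normalised vector.

        points: (N, 2) array of (x, y) contour coordinates.
        The result is invariant to translation (centroid removed), scale
        (unit RMS radius) and the original sampling resolution (uniform
        resampling along arc length). This is an assumed encoding, not the
        paper's exact scheme.
        """
        pts = np.asarray(points, dtype=float)
        pts = pts - pts.mean(axis=0)                      # location independence
        scale = np.sqrt((pts ** 2).sum(axis=1).mean())
        pts = pts / (scale + 1e-12)                       # scale independence

        # Resample uniformly along cumulative arc length (resolution independence).
        closed = np.vstack([pts, pts[:1]])
        seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
        arc = np.concatenate([[0.0], np.cumsum(seg)])
        targets = np.linspace(0.0, arc[-1], n_samples, endpoint=False)
        x = np.interp(targets, arc, closed[:, 0])
        y = np.interp(targets, arc, closed[:, 1])
        return np.concatenate([x, y])

    class ShapeClassifier:
        """One-hidden-layer feedforward net; the output is a confidence in [0, 1]."""

        def __init__(self, n_in, n_hidden=16, lr=0.1, seed=0):
            rng = np.random.default_rng(seed)
            self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
            self.b1 = np.zeros(n_hidden)
            self.W2 = rng.normal(0, 0.1, (n_hidden, 1))
            self.b2 = np.zeros(1)
            self.lr = lr

        @staticmethod
        def _sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def forward(self, x):
            h = self._sigmoid(x @ self.W1 + self.b1)
            return h, self._sigmoid(h @ self.W2 + self.b2)

        def train_step(self, x, target):
            """One error-backpropagation update for a (vector, label) pair."""
            h, out = self.forward(x)
            err = out - target                       # squared-error gradient w.r.t. output
            d_out = err * out * (1 - out)            # back through output sigmoid
            d_hid = (d_out @ self.W2.T) * h * (1 - h)
            self.W2 -= self.lr * np.outer(h, d_out)
            self.b2 -= self.lr * d_out
            self.W1 -= self.lr * np.outer(x, d_hid)
            self.b1 -= self.lr * d_hid
            return float(out)

    # Example use: score each tracked contour in a sequence; the series of
    # confidence values is one form of motion signature for the object.
    # clf = ShapeClassifier(n_in=64)
    # signature = [clf.forward(contour_to_vector(c))[1][0] for c in contour_sequence]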