TARGET TRACKING IN VIDEO ACQUIRED
USING MOVING FLIR SENSORS
Summary:
Use of multiple cameras provides extended monitoring capabilities.
In particular, mobile cameras increase the flexibility of tracking
objects in surveillance scenarios. However, in a multi-camera scenario,
different cameras have different physical properties and different
views of the objects, which makes tracking a challenging task. In this
paper, we address these problems and propose a novel approach to
perform tracking across multiple moving cameras. The proposed method
relaxes the constraints imposed by many other approaches: it does not
assume calibrated cameras or planar scenes. Our method is based on the
multi-view geometry between cameras with overlapping fields of
view. However, the well-known epipolar geometry of static scenes with
stationary cameras, captured by the fundamental matrix, is not
suitable for our task. We therefore extend the standard epipolar geometry
to the geometry of dynamic scenes in which the cameras are moving. In this
new setting, the fundamental matrix becomes a matrix function. Tracking
is then achieved by using the properties of this fundamental
matrix function, without direct computation of the camera geometry.
In a set of experiments, the proposed tracking method shows
promising performance.
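
To illustrate how an epipolar constraint can relate a target track across two
views without explicit calibration, the sketch below estimates a fundamental
matrix for one frame pair from matched background points and scores candidate
detections in the second camera by their distance to the epipolar line of the
target in the first camera. This is only a minimal sketch under assumptions:
the function names, the use of OpenCV's RANSAC estimator, and the pixel
distance threshold are illustrative choices, not the paper's formulation of
the fundamental matrix function.

import numpy as np
import cv2

def estimate_fundamental(pts_a, pts_b):
    # Estimate F for one frame pair from matched background points
    # (pts_a, pts_b are Nx2 float arrays of corresponding pixel coordinates).
    F, inliers = cv2.findFundamentalMat(pts_a, pts_b, cv2.FM_RANSAC, 1.0, 0.99)
    return F

def epipolar_distance(F, x_a, x_b):
    # Distance of point x_b (camera B) from the epipolar line F * x_a
    # induced by point x_a (camera A).
    xa = np.array([x_a[0], x_a[1], 1.0])
    xb = np.array([x_b[0], x_b[1], 1.0])
    line = F @ xa
    return abs(xb @ line) / np.hypot(line[0], line[1])

def match_target(F, target_a, candidates_b, max_dist=3.0):
    # Pick the detection in camera B most consistent with the epipolar line
    # of the tracked target in camera A; reject if all candidates are too far.
    dists = [epipolar_distance(F, target_a, c_b) for c_b in candidates_b]
    best = int(np.argmin(dists))
    return best if dists[best] < max_dist else None

Because both cameras move, such a score would have to be recomputed with a
freshly estimated F at every frame; that time dependence is what the
fundamental matrix function captures in the proposed approach.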
Supporting Agency:
Related Publications:
- A. Yilmaz, K. Shafique and M. Shah, "Target Tracking in Airborne Forward Looking Infrared Imagery," Image and Vision Computing Journal (IVC), Vol. 21, No. 7, 2003, pp. 623-635.
- A. Yilmaz, K. Shafique, T. Olson, N. Lobo and M. Shah, "Target Tracking in FLIR Imagery Using Mean-Shift and Global Motion Compensation," Proceedings of the IEEE Workshop on Computer Vision Beyond the Visible Spectrum (CVBVS), Hawaii, 2001.