We suggest a novel algorithm that tracks given shapes in real time from a low-quality video stream. The algorithm is based on a careful selection of a small subset of pixels that suffices to obtain an approximation of the observed shape. The shape can then be extracted quickly from this small subset. We implemented the algorithm in a system for mutual localization of a group of low-cost toy quadcopters. Each quadcopter carries only a single 8-gram RGB camera and stabilizes itself via real-time tracking of the other quadcopters at 30 frames per second.
Existing algorithms for real-time shape fitting rely on more expensive hardware or external cameras, or perform significantly worse. We provide full open-source code for our algorithm, experimental results, benchmarks, and a video that demonstrates our system. We then discuss generalizations to other shapes and extensions to further robotics applications.
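To illustrate the subsample-then-fit idea described above, here is a minimal sketch: fit a circle to a small random subset of edge pixels instead of the full pixel set. This is only an assumption-laden toy example, not the paper's method — the actual algorithm uses a careful (not uniformly random) subset selection, and the algebraic (Kåsa) circle fit below is a stand-in for whatever shape model the system tracks.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit: solve x^2 + y^2 = a*x + b*y + c."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = a / 2.0, b / 2.0          # circle center
    r = np.sqrt(c + cx**2 + cy**2)     # circle radius
    return cx, cy, r

rng = np.random.default_rng(0)
# Synthetic "observed shape": noisy pixels on a circle of radius 40 centered at (100, 80).
theta = rng.uniform(0.0, 2.0 * np.pi, 5000)
pixels = np.column_stack([100 + 40 * np.cos(theta), 80 + 40 * np.sin(theta)])
pixels += rng.normal(0.0, 0.5, pixels.shape)

# Uniform random subsample standing in for the paper's careful subset selection:
# fitting on 64 pixels instead of 5000 is what makes the per-frame cost small.
subset = pixels[rng.choice(len(pixels), size=64, replace=False)]
cx, cy, r = fit_circle(subset)
```

Even with this naive subsampling, the recovered center and radius land close to the ground truth, which conveys why a well-chosen small subset can suffice for real-time tracking.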
The code can be found here: Code