Title: Go With the Flow: An Optical Flow Approach to Object Tracking
Abstract: Tracking is the task of identifying feature points in space and through time. Optical flow, the task of identifying pixel displacements through time, is a dense analogue of the sparse task of tracking. Deep network approaches have produced state-of-the-art algorithms for both tasks, and motivated by the similarities between them, we explore applying several design insights from the optical flow network RAFT to the tracking network CenterTrack to create a combined tracking network. Many tracking approaches separate detection from tracking, first running an object detector and then associating the detections through time. In contrast, CenterTrack is a simultaneous detection and tracking algorithm that predicts a spatial displacement between frames for each object detection, and then uses a simple greedy algorithm to match detections by displacement. RAFT is a recurrent network that learns to optimize an optical flow prediction, with its update operator serving as the optimization step of an iterative algorithm, inspired by the traditional optimization formulation of optical flow. In our combined approach, RAFT refines the predictions made by CenterTrack. Our best performing combined architecture, which we call CenterRAFT, achieves 69.7% and 87.2% MOTA on MOT and KITTI, respectively, compared with 67.5% and 86.8% for baseline CenterTrack, where the models are evaluated on a validation set from a half-training, half-validation split of each benchmark. The improved performance demonstrates the potential of translating optical flow techniques to the task of tracking, but it must be weighed against the increased complexity and cost of our combined architecture.
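The greedy displacement-based association that the abstract attributes to CenterTrack can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the function name, distance threshold, and array layout are assumptions, and real implementations typically visit detections in order of confidence.

```python
import numpy as np

def greedy_match(prev_centers, curr_centers, offsets, max_dist=50.0):
    """Greedily associate current detections with previous-frame
    detections by displacement-compensated center distance.

    prev_centers: (M, 2) centers of detections in the previous frame.
    curr_centers: (N, 2) centers of detections in the current frame.
    offsets:      (N, 2) predicted per-detection displacements, i.e.
                  where each current detection moved from.
    Returns a dict mapping current-detection index -> matched
    previous-detection index.
    """
    # Project each current detection back to the previous frame
    # using its predicted displacement.
    projected = curr_centers - offsets
    matches, used = {}, set()
    for i, p in enumerate(projected):
        if len(prev_centers) == 0:
            break
        # Distance from the projected center to every previous center.
        dists = np.linalg.norm(prev_centers - p, axis=1)
        j = int(np.argmin(dists))
        # Greedy rule: take the nearest unmatched previous detection
        # if it lies within the distance threshold.
        if dists[j] < max_dist and j not in used:
            matches[i] = j
            used.add(j)
    return matches
```

For example, two detections that move by (+5, +5) and (+2, -2) between frames project back onto their previous positions and are matched one-to-one, while a detection whose nearest projected neighbor exceeds `max_dist` is left unmatched (starting a new track).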
Type of Material: Princeton University Senior Theses
Appears in Collections: Mathematics, 1934-2021
Files in This Item:
CHEN-KOERT-THESIS.pdf (4.85 MB, Adobe PDF) Request a copy
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.