OpticalFlowForHMI/

From ISLAB/CAISR
Title OpticalFlowForHMI/
Summary Implement a high-speed, low-latency motion estimation technique
Keywords
TimeFrame
References
Prerequisites
Author Joan Salvatella Serra
Supervisor Stefan Karlsson
Level Master
Status Closed


Project description

This project deals with dense optical flow: the motion estimated over an entire image plane from video observations. Typically, each pixel position in each frame of the video is associated with a 2D motion vector describing the translation of a point in the previous frame into the current one. This is distinctly different from sparse flow, where keypoints are first detected (based on their stability and suitability for tracking) and their positions then tracked. For Human Machine Interfacing (HMI), both kinds of flow are valuable, but the dense flow approach is often overlooked. The main reason is that sparse field methods can be estimated with higher accuracy (because only keypoints are used) and speed (because the keypoints are few).

Dense flow field methods (most often based on variational methods in iterative schemes) can achieve high accuracy and stability at the cost of a) high computational cost and b) high lag times (the former can be addressed by adding more computational resources, such as GPUs, while the latter is more difficult to address and is most often overlooked in reported results). For HMI applications, such as gesture recognition or ego-motion estimation of a handheld device, the following aspects are often more important:

1. stability (no sudden explosions in the estimates, and smoothness over time),
2. consistency (similar actions yield similar estimates), and
3. low lag time (the delay between receiving a new frame and having a useful motion vector).

(Accuracy is noticeably absent from this list.) It is possible to achieve high-speed dense field estimation using strictly local algorithms that require no iterative procedures. To obtain stability and consistency, such methods need some form of regularization (which is usually where variational approaches come in). However, one can instead use a strictly local regularization and avoid iterative procedures altogether. The traditional Lucas-Kanade approach to optical flow, for example, can be regularized with a simple Tikhonov parameter. This, however, introduces a new error term into the minimization and does not answer the question of how best to set the parameter given the data. Another approach is to estimate local optical flow by separating the cases of a) distributed image structure, b) no image structure (constant gray values), and c) linearly symmetric image structure.
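To make the Tikhonov-regularized Lucas-Kanade idea concrete, a single-pixel solve can be sketched as below. This is a minimal, self-contained illustration on a synthetic image pair; the window size, the regularization value, and the Gaussian-blob test image are assumptions for the example, not choices made by the project:

```cpp
#include <cmath>

// Synthetic smooth test image: a Gaussian blob centered at (20, 20).
static double img(double x, double y) {
    double dx = x - 20.0, dy = y - 20.0;
    return std::exp(-(dx * dx + dy * dy) / 50.0);
}

// One Tikhonov-regularized Lucas-Kanade solve over a 7x7 window centered
// at (px, py): minimize sum (Ix*u + Iy*v + It)^2 + lambda*(u^2 + v^2).
// Frame 2 is frame 1 translated by the true motion (tu, tv).
void lucasKanadeTikhonov(double px, double py, double tu, double tv,
                         double lambda, double& u, double& v) {
    double Axx = 0, Axy = 0, Ayy = 0, bx = 0, by = 0;
    for (int j = -3; j <= 3; ++j)
        for (int i = -3; i <= 3; ++i) {
            double x = px + i, y = py + j;
            // Central differences for the spatial gradients of frame 1.
            double Ix = (img(x + 1, y) - img(x - 1, y)) / 2.0;
            double Iy = (img(x, y + 1) - img(x, y - 1)) / 2.0;
            // Temporal derivative: frame 2 minus frame 1.
            double It = img(x - tu, y - tv) - img(x, y);
            Axx += Ix * Ix; Axy += Ix * Iy; Ayy += Iy * Iy;
            bx  += Ix * It; by  += Iy * It;
        }
    // Tikhonov: add lambda to the diagonal, so the 2x2 system is always
    // invertible, at the cost of a bias toward zero motion.
    double a = Axx + lambda, d = Ayy + lambda, det = a * d - Axy * Axy;
    u = (-bx * d + by * Axy) / det;
    v = (-by * a + bx * Axy) / det;
}
```

Note that the regularized system is solvable even where the unregularized structure tensor is singular, which is exactly the trade-off discussed above: guaranteed stability, but an extra error term and a free parameter.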

A problem that variational methods usually aim to address is the aperture problem: the motion of linear structures cannot be fully estimated. A local method can instead estimate two non-overlapping motion fields (henceforth called line motions and point motions). Line motion has one degree of freedom less than full point motion, yet the spatially varying line motion field is stable and consistent, and when it is added to the point motion field, the resulting field can be viewed as a locally regularized Lucas-Kanade estimate. However, adding the fields together in this way discards information: the separate nature of the fields is itself informative, useful for higher-level tasks in HMI (such as gesture recognition) as well as for future global regularization (such as phrasing a variational method in terms of the separate flow fields).
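The three-way case split can be sketched with the eigenvalues of the windowed 2D structure tensor: both large indicates distributed structure (full point flow), one large and one near zero indicates linear symmetry (only normal "line" flow, i.e. the aperture problem), and both near zero indicates no structure. The thresholds, the sinusoidal test grating, and the rank-1 pseudo-inverse solve below are illustrative assumptions for the sketch, not the project's actual PL flow algorithm (the degenerate eigenvector case is also omitted for brevity):

```cpp
#include <cmath>

struct Flow { int type; double u, v; };  // type: 0 = none, 1 = line, 2 = point

// Synthetic sinusoidal grating with unit normal (c, s): a purely linear
// structure, so only motion along (c, s) is observable.
static double grating(double x, double y, double c, double s) {
    return std::sin(0.3 * (x * c + y * s));
}

// Classify the local structure over a 7x7 window and return full point
// flow, normal-direction line flow, or nothing. Frame 2 is frame 1
// translated by the true motion (tu, tv).
Flow pointLineFlow(double px, double py, double c, double s,
                   double tu, double tv) {
    double Axx = 0, Axy = 0, Ayy = 0, bx = 0, by = 0;
    for (int j = -3; j <= 3; ++j)
        for (int i = -3; i <= 3; ++i) {
            double x = px + i, y = py + j;
            double Ix = (grating(x + 1, y, c, s) - grating(x - 1, y, c, s)) / 2;
            double Iy = (grating(x, y + 1, c, s) - grating(x, y - 1, c, s)) / 2;
            double It = grating(x - tu, y - tv, c, s) - grating(x, y, c, s);
            Axx += Ix * Ix; Axy += Ix * Iy; Ayy += Iy * Iy;
            bx += Ix * It; by += Iy * It;
        }
    double m = (Axx + Ayy) / 2;
    double r = std::sqrt((Axx - Ayy) * (Axx - Ayy) / 4 + Axy * Axy);
    double l1 = m + r, l2 = m - r;           // eigenvalues, l1 >= l2 >= 0
    if (l1 < 1e-6) return {0, 0, 0};         // b) no structure: constant window
    if (l2 > 0.05 * l1) {                    // a) distributed: full LK solve
        double det = Axx * Ayy - Axy * Axy;
        return {2, (-bx * Ayy + by * Axy) / det, (-by * Axx + bx * Axy) / det};
    }
    // c) linear symmetry: rank-1 pseudo-inverse along the dominant
    // eigenvector, i.e. normal ("line") flow only.
    double ex = l1 - Ayy, ey = Axy;
    double n = std::sqrt(ex * ex + ey * ey);
    ex /= n; ey /= n;
    double coeff = -(bx * ex + by * ey) / l1;
    return {1, coeff * ex, coeff * ey};
}
```

On the grating, the recovered line flow is the projection of the true motion onto the grating normal, which is precisely the one-degree-of-freedom-lower motion described above.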

One of the benefits of the recently developed algorithm for point and line flow estimation is that it is very fast. The main goal of this project is to implement it on a handheld portable device (an Android smartphone) and, as a proof of concept, use it as an optical mouse for a connected PC.

Project headlines

1. Implement local PL optical flow on a handheld device.
2. Make use of OpenFrameworks' built-in functions.
3. Extend the OpenFrameworks image acquisition classes to provide optical flow when requested by the developer.
4. Provide a useful interface and release it as open source for the community.
5. Implement an optical mouse, for use with a PC, as a demo of the newly derived classes.
6. Report and paper writing.
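For headline 5, the flow-to-cursor mapping can be sketched independently of the flow algorithm itself. The gain, deadzone, and smoothing constants below are illustrative assumptions, and how the cursor event actually reaches the PC (Bluetooth, USB, network) is left open:

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Maps a dense flow field to cursor deltas: average the field, suppress
// sub-threshold jitter with a deadzone, smooth over time, then apply gain.
class FlowMouse {
public:
    FlowMouse(double gain, double deadzone, double alpha)
        : gain_(gain), deadzone_(deadzone), alpha_(alpha) {}

    // flow: per-pixel (u, v) pairs for one frame. Returns a cursor delta.
    std::pair<int, int> update(const std::vector<std::pair<double, double>>& flow) {
        double mu = 0, mv = 0;
        for (const auto& f : flow) { mu += f.first; mv += f.second; }
        if (!flow.empty()) { mu /= flow.size(); mv /= flow.size(); }
        // Deadzone: ignore tiny average motion (sensor noise, hand tremor).
        if (std::hypot(mu, mv) < deadzone_) { mu = 0; mv = 0; }
        // Exponential smoothing keeps the cursor stable between frames.
        su_ = alpha_ * mu + (1 - alpha_) * su_;
        sv_ = alpha_ * mv + (1 - alpha_) * sv_;
        return { static_cast<int>(std::lround(gain_ * su_)),
                 static_cast<int>(std::lround(gain_ * sv_)) };
    }

private:
    double gain_, deadzone_, alpha_;
    double su_ = 0, sv_ = 0;  // smoothed flow state across frames
};
```

The deadzone and smoothing directly serve the stability and consistency criteria listed earlier: they trade a little responsiveness for a cursor that does not jitter when the device is held still.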

Skills in C/C++ and image analysis/computer vision are required. Prior development experience on handheld devices is desirable.