Research Abstract
We present a device for human-computer interaction (HCI). The device computes optical flow from four cameras arranged in a "cross" formation and estimates egomotion from the resulting flow fields. A simple classifier trained on these egomotion estimates then performs gesture recognition in real time.
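As a rough illustration of the flow-to-egomotion step, the sketch below estimates the dominant translation between two frames of one camera. It uses phase correlation as a stand-in for the dense optical flow and egomotion estimation in the abstract (the actual method, camera geometry, and classifier are not specified here); the function name and the synthetic frames are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def frame_shift(a, b):
    """Estimate the dominant integer translation between two frames via
    phase correlation -- a crude stand-in for optical flow + egomotion."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    r = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    h, w = a.shape
    # Unwrap shifts larger than half the frame to signed values.
    return (dx - w if dx > w // 2 else dx), (dy - h if dy > h // 2 else dy)

# Synthetic textured frame standing in for one camera's previous image.
prev = rng.random((64, 64))
# Next frame: the scene moved 3 px right and 1 px down.
curr = np.roll(prev, (1, 3), axis=(0, 1))

dx, dy = frame_shift(prev, curr)
print(dx, dy)  # → 3 1
```

With four cameras in the cross layout, the per-camera motion estimates would be concatenated into one feature vector per time step, which is the kind of input a simple gesture classifier could be trained on.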