For my senior capstone project, three other students and I created a Gesture Recognition Keyboard for iOS, an alternate mode of text input. The concept was that users could define a set of gestures by waving the phone around in 3D space (something like a cross between Morse code and sign language). This gives differently abled individuals another way to type and increases the accessibility of the iPhone.
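To give a sense of the raw input, a gesture on iOS is essentially a stream of motion-sensor readings. Below is a minimal sketch of a capture loop using Apple's Core Motion framework; the buffer name, sampling rate, and helper functions are illustrative, not taken from our implementation.

```swift
import CoreMotion

// Hypothetical capture state (names and the 60 Hz rate are illustrative).
let motionManager = CMMotionManager()
var samples: [[Double]] = []

func startCapturingGesture() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // sample at ~60 Hz
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // userAcceleration is the device's acceleration with gravity
        // already removed, expressed in g's.
        let a = motion.userAcceleration
        samples.append([a.x, a.y, a.z])
    }
}

func stopCapturingGesture() -> [[Double]] {
    motionManager.stopDeviceMotionUpdates()
    defer { samples.removeAll() }
    return samples  // one [x, y, z] triple per sampled frame
}
```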
We had a partially working prototype by the end of the term, but due to technical constraints of the libraries and systems we were using, the model we built did not generalize. We ultimately concluded that heavy data cleaning and normalization would be required before model processing in order to get useful results.
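As an illustration of the kind of preprocessing we had in mind (a sketch, not code from our repository), the function below resamples a variable-length recording to a fixed number of frames and then z-scores each axis, so that differences in duration, amplitude, and constant offset matter less than the shape of the gesture itself.

```swift
// Hypothetical preprocessing pass: fix the length, then normalize each axis.
func preprocess(_ gesture: [[Double]], toLength n: Int = 50) -> [[Double]] {
    guard gesture.count > 1, n > 1 else { return gesture }

    // Linearly resample so every gesture has exactly n frames.
    var frames: [[Double]] = (0..<n).map { i in
        let t = Double(i) * Double(gesture.count - 1) / Double(n - 1)
        let lo = Int(t), hi = min(lo + 1, gesture.count - 1)
        let frac = t - Double(lo)
        return (0..<3).map { gesture[lo][$0] * (1 - frac) + gesture[hi][$0] * frac }
    }

    // Z-score each axis: subtract the mean, divide by the standard deviation.
    for axis in 0..<3 {
        let values = frames.map { $0[axis] }
        let mean = values.reduce(0, +) / Double(n)
        let variance = values.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(n)
        let std = max(variance.squareRoot(), 1e-9)  // avoid division by zero
        for i in 0..<n { frames[i][axis] = (frames[i][axis] - mean) / std }
    }
    return frames
}
```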
For those interested, a link to the GitHub repository for this project is below.