Hand gesture recognition from virtual hand data
In gesture research, one main challenge is to recognize when the user starts a specific gesture and when it ends. Distinguishing a deliberate gesture from an unintentional movement is a further challenge. Nevertheless, with the proliferation of Augmented Reality (AR), Virtual Reality (VR), and the Internet of Things (IoT), gestures are becoming prevalent due to their ease of use and adaptability.

A gestural command can be classified as either a posture or a gesture. A posture is a static configuration of one or both hands, while a gesture is a movement of the hand. This project aims to recognize postures and gestures using data gathered through virtual hands. The virtual hand data is collected from real users' hand movements and normalized to a single reference hand scale (that is, a scale different from each individual user's hand, obtained through a normalisation process). The aim of this project is to recognize these postures and gestures when they are performed again in AR/VR space. The project has two parts: first, to recognize hand postures; second, to recognize gestures. Knowledge of applied Machine Learning is an advantage.
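As an illustration of the kind of processing involved, the sketch below normalizes a set of hand joint positions to a common scale and matches a posture against stored templates with a nearest-neighbour comparison. This is a minimal assumption-laden example, not the project's actual pipeline: the joint layout (joint 0 as the wrist), the normalisation scheme (translate wrist to origin, divide by the largest joint distance), and the nearest-neighbour matcher are all illustrative choices.

```python
import numpy as np

def normalize_hand(joints):
    """Translate the wrist to the origin and scale to unit hand size.

    `joints` is an (N, 3) array of joint positions. Joint 0 is assumed
    to be the wrist -- an illustrative convention, not one specified by
    the project brief.
    """
    joints = np.asarray(joints, dtype=float)
    centered = joints - joints[0]                    # wrist at origin
    scale = np.linalg.norm(centered, axis=1).max()   # largest joint distance
    return centered / scale if scale > 0 else centered

def classify_posture(sample, templates):
    """Return the label of the nearest template posture.

    `templates` maps posture labels to (N, 3) joint arrays. Both the
    sample and each template are normalized before comparison, so the
    match is invariant to hand position and overall hand size.
    """
    sample = normalize_hand(sample)
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = np.linalg.norm(sample - normalize_hand(template))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```

Because normalisation removes translation and uniform scale, a posture performed by a larger hand at a different position still matches the same template; recognizing gestures (the second part of the project) would additionally require comparing sequences of such frames over time.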

Updated:  1 June 2019/Responsible Officer:  Dean, CECS/Page Contact:  CECS Marketing