Kinect will, at some point, be able to understand sign language, according to a new Microsoft patent.
The filing details Kinect's ability to interpret ASL (American Sign Language), and notes that the system will also be able to lip-read and even detect toe movements.
As the patent explains: "Where the user is unable to speak, he may be prevented from joining in the voice chat. Even though he would be able to type input, this may be a laborious and slow process to someone fluent in ASL. Under the present system, he could make ASL gestures to convey his thoughts, which would then be transmitted to the other users for auditory display. The user's input could be converted to voice locally, or by each remote computer."
It goes on: "In this situation, for example, when the user kills another user's character, that victorious, though speechless, user would be able to tell the other user that he had been 'PWNED'. In another embodiment, a user may be able to speak or make the facial motions corresponding to speaking words. The system may then parse those facial motions to determine the user's intended words and process them according to the context under which they were inputted to the system."
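The pipeline the patent describes — recognise a gesture, map it to words, then hand the text off for "auditory display" locally or on each remote machine — could be sketched roughly like this. The gesture names, vocabulary table, and function names below are all invented for illustration; the patent doesn't specify an API.

```python
# Hypothetical sketch of the patent's gesture-to-speech idea.
# GESTURE_VOCAB and all names here are assumptions, not Kinect code.

GESTURE_VOCAB = {
    "gesture_hello": "hello",
    "gesture_good_game": "good game",
    "gesture_pwned": "PWNED",
}

def gestures_to_text(gesture_ids):
    """Map a sequence of recognised ASL gesture IDs to words."""
    return " ".join(GESTURE_VOCAB.get(g, "[unknown]") for g in gesture_ids)

def transmit_for_auditory_display(text):
    # Per the patent, the text could be converted to voice locally
    # or by each remote computer; here we just build the payload.
    return {"type": "speech", "text": text}

message = gestures_to_text(["gesture_hello", "gesture_pwned"])
packet = transmit_for_auditory_display(message)
```

The interesting design point in the filing is that the speech synthesis can happen on either end of the connection, so only text needs to cross the wire.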
We wonder what the ASL gesture is for 'PWNED'.
It also details the Kinect's ability to track your skeletal structure. "[Within the skeletal mapping system] a variety of joints and bones are identified: each hand, each forearm, each elbow, each bicep, each shoulder, each hip, each thigh, each knee, each foreleg, each foot, the head, the torso, the top and bottom of the spine, and the waist. Where more points are tracked, additional features may be identified, such as the bones and joints of the fingers or toes, or individual features of the face, such as the nose and eyes."
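For the curious, the joint list the patent quotes works out to 25 tracked points — five along the centre line plus ten mirrored left/right pairs. A minimal sketch of that skeletal map as a data structure (the structure and names are our assumption, not Kinect's actual API) might look like:

```python
# Hypothetical skeletal map from the patent's quoted joint list.
# Five centre-line points plus ten joints per side = 25 total.

SKELETAL_JOINTS = [
    "head", "torso", "spine_top", "spine_bottom", "waist",
] + [
    f"{side}_{part}"
    for side in ("left", "right")
    for part in ("hand", "forearm", "elbow", "bicep", "shoulder",
                 "hip", "thigh", "knee", "foreleg", "foot")
]

def make_skeleton_frame(positions):
    """Build one tracked frame: joint name -> (x, y, z) or None.

    Joints absent from `positions` (e.g. occluded ones) are None,
    which is roughly what happens when you lie down and confuse it.
    """
    return {joint: positions.get(joint) for joint in SKELETAL_JOINTS}

frame = make_skeleton_frame({"head": (0.0, 1.7, 2.0)})
```

Tracking more points, as the patent says, is what would unlock fingers, toes, and facial features on top of this base set.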
Lie down though and it all goes tits up.