This is getting silly now. A group of graduate engineering students has adapted Kinect to keep a watchful eye over surgeons using robotic equipment.
"For robotics-assisted surgeries, the surgeon has no sense of touch right now," said Howard Chizeck, professor of electrical engineering at the University of Washington.
"What we're doing is using that sense of touch to give information to the surgeon, like 'You don't want to go here.'"
At the moment, when surgeons use remote-controlled robotic tools inside the body (sparing the patient their big, fat hands and arms), they inevitably lose any direct sense of touch over what they're doing.
Electrical engineering graduate student Fredrik Ryden has tackled this by writing code that uses the Kinect to map and react to environments in three dimensions (not dissimilar to how you might use it at home, we assume) and send spatial information about that environment back to the user.
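For the curious, the core of "mapping an environment in three dimensions" with a depth camera is back-projecting each depth pixel through a pinhole camera model. Here's a minimal sketch; the intrinsics (`fx`, `fy`, `cx`, `cy`) are illustrative Kinect-ish values, not the team's actual calibration, and the function name is ours:

```python
def depth_to_points(depth, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Back-project a depth image (metres, as a row-major list of rows)
    into (x, y, z) points using the pinhole camera model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    These defaults are rough, illustrative Kinect values."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # the sensor reports 0 where it has no reading
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

Run that over every frame and you have a live point cloud of the surgical field to check the tools against.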
"We could define basically a force field around, say, a liver," said Chizeck. "If the surgeon got too close, he would run into that force field and it would protect the object he didn't want to cut."
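Chizeck's "force field" is what the haptics literature calls a forbidden-region virtual fixture: when the tool tip penetrates a protected zone, the controller pushes back through the surgeon's hand controls. A toy sketch of the idea, with a spherical zone and made-up stiffness (the real system uses the Kinect point cloud, not a hand-placed sphere):

```python
import math

def repulsive_force(tool, center, radius, stiffness=200.0):
    """Spring-like force (newtons) pushing the tool tip out of a spherical
    forbidden region around an organ. tool/center are (x, y, z) in metres.
    Returns the zero vector outside the region."""
    dx = [t - c for t, c in zip(tool, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist >= radius or dist == 0.0:
        return (0.0, 0.0, 0.0)
    depth = radius - dist             # penetration depth into the region
    scale = stiffness * depth / dist  # Hooke's law along the outward normal
    return tuple(scale * d for d in dx)
```

The closer the tool gets to the organ, the harder the push back, so the surgeon feels a wall rather than hitting one.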
"It's really good for demonstration because it's so low-cost, and because it's really accessible," Ryden said. "You already have drivers, and you can just go in there and grab the data. It's really easy to do fast prototyping because Microsoft's already built everything."
According to Chizeck, a similar system without Kinect would usually cost around $50,000, and the team suggests it could become even more useful.
"Suppose there's an earthquake somewhere," Chizeck said. "First responders could get victims to a van with a satellite dish on top and the tools inside, and a surgeon somewhere else could perform the surgery."
Both incredible and noble, but this Kinect robot still wins on cool points.