This Computer Vision Central article reports that Organic Motion's markerless motion capture technology will be used in a performance featuring dancers in Canada, Japan, and the United States. From the article: "Dancers in North America and Asia will dance together using computer vision software developed by Organic Motion."
This Eurogamer article reports that concern over Kinect's ability to detect players sitting on sofas has been resolved after Microsoft updated the software behind the tech. From the article: "Kinect's much-discussed difficulty detecting sitting and lying down players was caused by it setting the base node used to create skeletal models at the bottom of the spine.
When players were sat down with their knees raised in front of their pelvis, Kinect encountered problems.
In July Microsoft insisted that Kinect could recognise players who were sitting down, despite a recent developer comment to the contrary.
This evening Eurogamer can reveal how Microsoft has made this possible."
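The fix described above amounts to choosing a better base node when the usual one cannot be seen. As a rough illustration only, assuming hypothetical joint names and a fallback rule that are not Microsoft's actual implementation, a tracker rooted at the spine base can fall back up the spine when a seated player's raised knees occlude the pelvis:

```python
# Illustrative sketch only: the joint names and the fallback order are
# assumptions based on the article, not the real Kinect software.

def choose_skeleton_root(visible_joints):
    """Pick the base node for the skeletal model.

    Rooting the skeleton at the spine base fails when a seated
    player's raised knees hide the pelvis from the sensor, so fall
    back to a joint higher up the spine that stays visible.
    """
    for candidate in ("spine_base", "spine_mid", "shoulder_center"):
        if candidate in visible_joints:
            return candidate
    return None  # no usable root: player not tracked

# Standing player: pelvis visible, so the spine base anchors the skeleton.
print(choose_skeleton_root({"spine_base", "spine_mid", "head"}))  # spine_base

# Seated with knees raised: pelvis occluded, fall back up the spine.
print(choose_skeleton_root({"spine_mid", "shoulder_center", "head"}))  # spine_mid
```

The point of the sketch is simply that a fixed root at the bottom of the spine makes the whole skeleton fail together, whereas a movable root degrades gracefully.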
This NYTimes article looks at the use of motion tracking systems for improving athletic performance. From the article: "In the endless quest for athletic advantage, a handful of major league baseball teams are engaged in an elaborate, largely clandestine race to master an advanced imaging technology that some baseball officials think could influence the way athletes of all ages train, perform and recover from injuries.
The technology, which has also drawn strong interest from some professional and college football teams, is an unlikely hybrid. It combines the technology that captures the human gestures at the core of three-dimensional animations like “Avatar” with advanced sensors, biomechanics and orthopedic research on the most powerful and least damaging ways to hurl a ball, swing a bat or simply run like the wind."
This I Programmer article looks at the LightSpace project: a user interface that uses a whole room. Here's a video about the project. From the article: "It's not so long since we were all entranced by the original GUI with its icons and key innovation - the mouse. Recently we moved on to gestural interfaces and touch is the key input device. Now Microsoft Research are showing off a system that gives us some idea where this might all be going. LightSpace is a user interface that uses a whole room."
This Physorg article talks about a low-cost gestural interface which uses a pair of brightly colored lycra gloves. From the article: "Ever since Steven Spielberg’s 2002 sci-fi movie Minority Report, in which a black-clad Tom Cruise stands in front of a transparent screen manipulating a host of video images simply by waving his hands, the idea of gesture-based computer interfaces has captured the imagination of technophiles. Academic and industry labs have developed a host of prototype gesture interfaces, ranging from room-sized systems with multiple cameras to detectors built into laptops’ screens. But MIT researchers have developed a system that could make gestural interfaces much more practical."
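The glove-based approach is cheap because saturated, unusual colors are easy to pick out of a camera frame. A minimal segmentation sketch follows; the RGB thresholds and the single glove color are illustrative assumptions, not the MIT system's actual pipeline:

```python
# Minimal color-segmentation sketch. The thresholds below are made-up
# values for illustration, not the MIT researchers' real parameters.

def is_glove_pixel(r, g, b):
    """Treat strongly saturated green as 'glove'. Bright lycra gloves
    use colors that rarely occur on skin or typical backgrounds,
    which is what keeps a simple test like this workable."""
    return g > 150 and g > 2 * r and g > 2 * b

def glove_mask(image):
    """image: list of rows of (r, g, b) tuples -> boolean mask
    marking which pixels belong to the glove."""
    return [[is_glove_pixel(r, g, b) for (r, g, b) in row] for row in image]

frame = [
    [(20, 200, 30), (120, 110, 100)],  # glove pixel, gray background
    [(200, 60, 50), (10, 180, 20)],    # skin-toned pixel, glove pixel
]
print(glove_mask(frame))  # [[True, False], [False, True]]
```

A real system would segment many distinct colored patches on the glove and match their layout against a database of hand poses, but the per-pixel color test above is the step that makes the hardware cost so low.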
From the Physorg.com website: "Sony Ericsson today unveils the world's first ever motion activated headphones that sense the user. The clever MH907 headphones mean users simply plug in two earphones to start listening to music and pause by removing one earbud. To start listening again simply plug it back in. Do exactly the same to answer and end calls - simple as 1,2,3."
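The interaction described above reduces to a small state machine driven by earbud insert/remove events. The class below is a toy sketch; the event names and the exact pause/resume rule are assumptions drawn from the product description, not Sony Ericsson's firmware:

```python
# Toy state machine for earbud-sensing playback control. Assumptions:
# wearing both earbuds starts/resumes playback, removing either one
# pauses it, per the product description in the article.

class MotionHeadphones:
    def __init__(self):
        self.playing = False
        self.in_ear = 0  # number of earbuds currently worn (0-2)

    def insert_earbud(self):
        self.in_ear = min(2, self.in_ear + 1)
        if self.in_ear == 2:
            self.playing = True   # both in: start or resume the music

    def remove_earbud(self):
        self.in_ear = max(0, self.in_ear - 1)
        self.playing = False      # removing one earbud pauses

h = MotionHeadphones()
h.insert_earbud()
h.insert_earbud()
print(h.playing)  # True
h.remove_earbud()
print(h.playing)  # False
```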
From this ScienceDaily article: "A system that can recognise human gestures could provide a new way for people with physical disabilities to interact with computers. A related system for the able bodied could also be used to make virtual worlds more realistic.
Manolya Kavakli of the Virtual and Interactive Simulations of Reality Research Group, at Macquarie University, Sydney, Australia, explains how standard input devices - keyboard and computer mouse - do not closely mimic natural hand motions such as drawing and sketching. Moreover, these devices have been developed neither for ergonomic use nor for people with disabilities."
This Science Centric article reports that scientists at the National Institute of Standards and Technology (NIST) have developed software that improves the accuracy of tracking devices used in immersive environments by at least 700 percent. From the article: "The software can be used by scientists in other immersive environments with slight modifications for their individual laboratories. This advance is a step forward in transforming immersive technology that has traditionally been a qualitative tool into a scientific instrument with which precision measurements can be made."
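A 700 percent improvement is easiest to read as an eightfold gain: the improved accuracy is the original plus seven times the original, which means tracking error shrinks to one eighth of what it was. The millimetre figures below are made up purely to show the arithmetic; the article does not give the actual error values:

```python
# Worked arithmetic for an "N percent improvement" claim.
# The 8 mm baseline error is an invented example, not NIST's data.

def improved_error(baseline_error, improvement_pct):
    """A +700% accuracy improvement means 8x the accuracy,
    i.e. the error drops by a factor of 8."""
    factor = 1 + improvement_pct / 100   # 700% -> factor of 8
    return baseline_error / factor

print(improved_error(8.0, 700))  # 1.0  (an 8 mm error would drop to 1 mm)
```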
This Engadget post reports that Mgestyk Technologies is planning to sell its gesture-based control system on the cheap. From the article: "As we've seen, it's not exactly all that difficult for someone with the necessary skills to whip up their own gesture-based control system, but the folks at upstart Mgestyk Technologies seem to think they've got something a bit more notable on their hands, and they're actually planning on selling it to the general public."