This press release reports that Gyration has released its UltraSense technology, which combines multiple motion sensors with a core control chip containing a sophisticated set of patented algorithms, power-management features and integration capabilities. From the press release: "It offers 'five degrees of freedom' and breakthrough sensing accuracy for consumer control devices for PC, PC-TV integration, gaming and set-top box applications."
Take a look at the Prakash system webpage. Their lighting-aware motion capture system, which uses photosensing markers and multiplexed illumination, is well documented (videos, images, a presentation and a poster are available). From their paper abstract: "In this paper, we present a high speed optical motion capture method which can measure three dimensional motion, orientation, and incident illumination at tagged points in a scene. We use tracking tags that can be imperceptibly embedded in attire or other objects and can work in natural lighting conditions. Our system can support an unlimited number of tags in a scene, and each tag has a unique id thus eliminating marker reacquisition issues. Our tags also provide incident illumination data which can be used when inserting synthetic elements in order to match the lighting of the scene at the time of capturing. This makes the technique ideal for on-set motion capture or the real-time broadcasting of virtual sets."
This ScienceDaily article reports that a revolutionary unobtrusive sensor, which collects and immediately transmits data about posture, stride length, step frequency, acceleration and response to shock waves travelling through the body, could boost an athlete's sporting success in the future. From the article: "Cufflink-sized and clipped behind the wearer's ear, the sensor is unique in two key respects. First, it does not hinder performance, yet can gather unprecedentedly wide-ranging and useful data about posture, stride length, step frequency, acceleration, response to shock waves travelling through the body etc."
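One of the quantities the article mentions, step frequency, can be derived from an accelerometer stream in a very simple way. The article does not describe the device's actual signal processing; the sketch below is an invented illustration that counts upward threshold crossings in the acceleration magnitude (the threshold and sample data are assumptions):

```python
# Illustrative sketch only (not the device's firmware): estimate step
# frequency by counting upward crossings of a threshold in the vertical
# acceleration magnitude, expressed in g.

def step_frequency(samples, sample_rate_hz, threshold=1.2):
    """Count upward threshold crossings and convert to steps per second."""
    steps = 0
    for prev, cur in zip(samples, samples[1:]):
        # An upward crossing (below threshold -> at/above threshold) is one step.
        if prev < threshold <= cur:
            steps += 1
    # Window duration is len(samples) / sample_rate_hz seconds.
    return steps * sample_rate_hz / len(samples)
```

A real pedometer would add filtering and debouncing, but the threshold-crossing idea is the core of most simple step counters.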
"Correction of Location and Orientation Errors in Electromagnetic Motion Tracking"
John G. Hagedorn, Steven G. Satterfield, John T. Kelso, Whitney Austin, Judith E. Terrill, Adele P. Peskin
Presence: Teleoperators & Virtual Environments, August 2007, Vol. 16, No. 4, Pages 352-366.
We describe a method for calibrating an electromagnetic motion tracking device. Algorithms for correcting both location and orientation data are presented. In particular, we use a method for interpolating rotation corrections that has not previously been used in this context. This method, unlike previous methods, is rooted in the geometry of the space of rotations. This interpolation method is used in conjunction with Delaunay tetrahedralization to enable correction based on scattered data samples. We present measurements that support the assumption that neither location nor orientation errors are dependent on sensor orientation. We give results showing large improvements in both location and orientation errors. The methods are shown to impose a minimal computational burden.
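The abstract's scattered-data correction scheme attaches measured correction vectors to calibration sample points and interpolates them inside the tetrahedra of a Delaunay tetrahedralization. As a minimal sketch of one ingredient, here is the per-tetrahedron location step: barycentric blending of the correction vectors at a tetrahedron's four vertices. This is not the authors' code, and it omits the tetrahedralization itself and the paper's rotation-space interpolation of orientation corrections:

```python
# Sketch: barycentric interpolation of location-correction vectors inside one
# tetrahedron, the building block of scattered-data correction over a
# Delaunay tetrahedralization.

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def barycentric(p, a, b, c, d):
    """Barycentric coordinates of point p in tetrahedron (a, b, c, d)."""
    # Solve M * (l1, l2, l3) = p - a, where M's columns are the edge vectors.
    cols = [[b[i] - a[i] for i in range(3)],
            [c[i] - a[i] for i in range(3)],
            [d[i] - a[i] for i in range(3)]]
    rhs = [p[i] - a[i] for i in range(3)]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]  # rows of M
    dm = det3(m)
    ls = []
    for j in range(3):  # Cramer's rule: replace column j with the RHS.
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = rhs[i]
        ls.append(det3(mj) / dm)
    return [1.0 - sum(ls)] + ls

def corrected_location(p, verts, corrections):
    """Blend the four vertices' measured correction vectors and apply to p."""
    w = barycentric(p, *verts)
    return [p[i] + sum(w[k] * corrections[k][i] for k in range(4))
            for i in range(3)]
```

In the full method one would first locate which tetrahedron of the calibration mesh contains the query point; orientation corrections need the rotation-space interpolation the paper introduces, since barycentric blending of rotation matrices is not valid.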
This Marketwire press release reports that InterSense's wireless IC3 inertial sensors will be integrated into Raydon's Virtual Warrior Interactive trainers to enhance training techniques and improve users' mission readiness. From the press release: "BEDFORD, MA--(Marketwire - September 5, 2007) - InterSense, Inc., a market leader in precision motion tracking technology, today announced multiple orders for its wireless InertiaCube3 (IC3) sensors for integration into Raydon's Virtual Warrior Interactive (VWI) trainers."
This Israel21c article takes a look at XTR (Extreme Reality): a south Tel Aviv startup that has developed a technology that allows a user's three-dimensional body movements to be translated onto the computer in real time. From the article: "Dor Givon ducks and weaves, and the animated figure on his computer mimics the movements, narrowly missing a right hook, and then landing a good solid punch on his opponent's face. There's a smack, as the blow makes impact and Givon turns, adjusts the small web camera attached to his laptop, and says: "It's as simple as that. The camera tracks my every move."
This CVG article reports that Sony claims the PS3's ability to track the position of an object in 3D space using the new Eye Toy offers more potential for motion-sensing gaming than the Wii Remote. From the article: "Eye Toy card battle game, Eye of Judgement, boasts some pretty impressive optical recognition technology that not only identifies a card (from a selection of hundreds) and summons the right monster, but it can track the movement of that card - including its distance from the camera and tilt orientation - and keep that monster positioned correctly on it."
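Sony has not published its tracking algorithm, but the distance part of such a system can be illustrated with the standard pinhole-camera relation: for an object of known physical size, apparent size in pixels shrinks in proportion to distance. The numbers below are invented for illustration:

```python
# Illustrative sketch (not Sony's method): estimate the distance of a
# known-size object, such as a playing card, from its apparent width in the
# image under the pinhole-camera model Z = f * W / w.

def distance_from_apparent_width(focal_px, real_width_m, apparent_width_px):
    """focal_px: focal length in pixels; real_width_m: object width in meters;
    apparent_width_px: measured width in the image. Returns distance in meters."""
    return focal_px * real_width_m / apparent_width_px
```

For example, a 6.3 cm-wide card that spans 126 pixels under an assumed 800-pixel focal length would be estimated at 0.4 m from the camera; tilt estimation additionally requires comparing the card's projected shape to its known rectangular geometry.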
This Virtual reality and head-mounted displays blog post takes a look at the motion capture demonstrations at SIGGRAPH 2007. From the post: "There were many motion capture demonstrations at the recent Siggraph show. Applications for motion capture include animated films, special-effects sequences, video games (e.g. capturing the throwing motion of a quarterback), academic studies in motion and more.
Most demonstrations involved models wearing body suits and moving in a large area with a uniform background. Real-time systems captured the movement and displayed it on computer screens."
This NewScientist Technology Blog article takes a look at the work of the Japanese experimental performance artist, Suguru Goto, who built a suit embedded with sensors to control a percussion band consisting of 5 drum-playing robots. From the article: "First, take a percussion band consisting of 5 drum-playing robots (see the movie here, .mov 20.1MB).
Then build a suit embedded with sensors that monitor your every move--from the twist of a knee to the bend of an elbow (see movie here, .mov 29.7MB).
Then connect the output of your bodysuit to the input for your robot band."
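The "connect the output to the input" step above amounts to mapping sensor readings to drum triggers. The post gives no details of Goto's suit-to-robot protocol, so the sensor names, thresholds and drum assignments below are entirely invented, just to illustrate the shape of such a mapping:

```python
# Toy sketch of a bodysuit-to-drum-robot mapping. All names and values are
# hypothetical; the real system's protocol is not described in the post.

THRESHOLDS = {"left_elbow": 0.6, "right_elbow": 0.6, "left_knee": 0.5}
DRUM_FOR_SENSOR = {"left_elbow": "snare", "right_elbow": "hi-hat", "left_knee": "kick"}

def triggers(readings):
    """Given one frame of normalized bend-sensor readings (0.0-1.0),
    return the drums whose sensors exceed their trigger thresholds."""
    return [DRUM_FOR_SENSOR[s] for s, v in readings.items()
            if s in THRESHOLDS and v >= THRESHOLDS[s]]
```

A real setup would stream frames continuously and add per-sensor debouncing so one sustained bend does not fire a drum on every frame.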
This ZDNet Emerging Technology Trends blog post takes a look at EyePoint, a system developed at Stanford that pairs eye-tracking technology with keyboard hot keys to reduce our dependency on the mouse while browsing the Web. From the post: "In a recent article, Computerworld reports that Stanford University computer scientists have developed a new way to interact with our computers. The EyePoint system uses both eye-tracking technology and keyboard hot keys to reduce our dependency on the mouse while surfing on Internet."