Motion Capture Using Motion Eye Tracking


Why should MMI be your choice of school for learning about motion capture? Here is one reason. Recently, UW–Madison approached our Game Art & Animation staff to test their new eye-tracking glasses. Of course we took on the project, because this is exactly the kind of first-hand experience we like to pass on to our students. Our animation and motion capture teachers and students have fun with this kind of stuff.

The Motion Eye Tracking Test

Mike Gleicher approached me via email asking if MMI would be willing to help research motion capture using their eye-tracking equipment. I met Mike and Tomislav (Tom) Pejsa, a computer science doctoral student, in July 2015. We talked about what they wanted to do and whether I could help: they wanted to use their SMI Eye Tracking Glasses with full-body motion capture. The overall goal was to see whether the eye-tracking glasses could accurately track eye and head movement while the body is in motion.

In September 2015, Tom came to MMI again, we suited up one of our female students, and we started the project. Tom used the SMI Eye Tracking Glasses along with Smart Recorder: the glasses connect via a USB cable to a smartphone attached to the actor's body with Velcro. This let him monitor what the glasses were seeing in real time, while I tracked the actor's body movement, the glasses, and ground points in real time. I placed three motion capture markers on the glasses, plus three markers on points on the ground, so I would know where the actor was looking. The motion data was then applied to a 3D character, and the glasses and ground data were applied to the character's head and the location points in a 3D scene. This shows how accurately the glasses track the actor's head motion. I had never done this type of motion capture before, and it was a great experience.
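For readers curious about the geometry behind the setup above, here is a rough sketch: three rigid markers on the glasses are enough to recover a head position and a facing direction, and intersecting that direction with the ground plane estimates where the actor is looking. This is an illustrative Python sketch under assumed marker names and positions, not the actual research pipeline or marker layout.

```python
# Illustrative only: marker placement, names, and the flat-ground assumption
# are hypothetical, not the lab's actual configuration.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(v):
    m = sum(x * x for x in v) ** 0.5
    return tuple(x / m for x in v)

def head_frame(left, right, front):
    """Recover a head position and forward direction from three
    glasses markers: one at each temple, one over the nose bridge."""
    origin = tuple(sum(c) / 3 for c in zip(left, right, front))
    x_axis = norm(sub(right, left))        # across the glasses
    rough_fwd = norm(sub(front, origin))   # roughly where the head points
    up = norm(cross(x_axis, rough_fwd))    # perpendicular to both
    forward = norm(cross(up, x_axis))      # re-orthogonalized facing axis
    return origin, forward

def gaze_ground_point(origin, forward, ground_z=0.0):
    """Intersect the facing ray with a flat ground plane at z=ground_z.
    Returns None when the actor is looking level or upward."""
    if forward[2] >= 0:
        return None
    t = (ground_z - origin[2]) / forward[2]
    return tuple(o + t * f for o, f in zip(origin, forward))

# Example: head at ~1.7 m, tilted slightly downward and facing +y.
origin, forward = head_frame((-0.05, 0.0, 1.70),
                             (0.05, 0.0, 1.70),
                             (0.0, 0.10, 1.65))
spot = gaze_ground_point(origin, forward)
```

In the real session the ground markers would define the plane instead of a hard-coded `z = 0`, and the recovered transform would drive the 3D character's head rather than a bare tuple, but the ray-plane intersection is the same idea.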

The Motion Capture Results

Tom was not impressed with the results: for some reason the eye-tracking glasses did not like the student's eye color, so the data was inaccurate and unusable. Stay tuned for next time, when we redo the whole session using what we learned from this attempt. Learn more about our Animation and Motion Capture program.