Manipulating the Virtual World

Using concepts from Iron Man, a team of ECE capstone students has developed a Gesture Operated Computer Aided Design tool (goCAD) that lets users manipulate on-screen objects with hand movements.


Source: News @ Northeastern

“If Tony Stark can do it in a cave with scraps,” said Kyle Dumont, then, well, he and the other members of his capstone team could, too. No matter that the main character of Iron Man is just that, a character: the five electrical and computer engineering seniors decided six months ago they were going to create something just as fantastic, if not as fantastical, as the action hero’s robotic arm.

They wanted to build something that doctors, schoolteachers, or virtually anyone could use to manipulate objects in a virtual space, the same way Iron Man can, using nothing but simple hand gestures. What they created, the Gesture Operated Computer Aided Design tool, or goCAD, grants your average Joe access to technically challenging design software with the simple swipe of a hand through the air.

The goCAD team earned third place in the electrical and computer engineering capstone competition this spring. One judge noted that it had taken him three years to master a particular piece of design software, and that he would have been quite grateful for a tool like theirs.

But goCAD is just one example of how the team’s real innovation could be used, said Greg Andrus, the group leader. Using the Microsoft Kinect, the team’s tool can detect a user’s hand in space and then follow it as the user performs any of a series of predefined gestures. The approach could drive design software, but it could also allow a brain surgeon to easily rotate a three-dimensional MRI image, or a calculus teacher to explain derivatives more intuitively than she can on a whiteboard in the classroom.
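The last step of that idea, turning a recognized gesture into an application command, amounts to a lookup. The sketch below is purely illustrative: the gesture and command names are hypothetical, since the article does not describe goCAD’s actual command set.

```python
# Hypothetical mapping from recognized gestures to application commands.
# goCAD's real command set is not described in the article.
GESTURE_COMMANDS = {
    "swipe_left":  "rotate_left",
    "swipe_right": "rotate_right",
    "pinch":       "zoom_out",
    "spread":      "zoom_in",
}

def dispatch(gesture):
    """Look up the application command for a recognized gesture,
    ignoring anything the system does not understand."""
    return GESTURE_COMMANDS.get(gesture, "ignore")
```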

“You start off with what the camera sees, which is everything,” said Samantha Kerkhoff, who worked on the program’s low-level coding. “Then you determine where the fingertips are, then you determine if they’re moving in a specific pattern, which is a gesture.”
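Kerkhoff’s narrowing-down step could be sketched roughly as follows. The Kinect supplies per-pixel depth in millimeters; everything else here, including the depth band and the topmost-pixel-per-column heuristic, is an illustrative assumption, not the team’s actual detector.

```python
def find_fingertip_candidates(depth_frame, near_mm=500, far_mm=900):
    """Return (x, y) fingertip candidates from a depth image.

    depth_frame is a list of rows, each a list of per-pixel depths in
    millimeters. Pixels inside (near_mm, far_mm) are treated as the hand
    (assumed to be the object nearest the camera), and the topmost hand
    pixel in each column is taken as a fingertip candidate -- a crude
    stand-in for a real detector.
    """
    tips = {}
    for y, row in enumerate(depth_frame):
        for x, depth in enumerate(row):
            # Rows are scanned top to bottom, so the first hand pixel
            # seen in a column is the topmost (smallest y) one.
            if near_mm < depth < far_mm and x not in tips:
                tips[x] = y
    return sorted(tips.items())
```

In practice this per-column heuristic would be preceded by noise filtering and followed by clustering, but it captures the idea of whittling “everything the camera sees” down to a handful of candidate points.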

Dumont built a library of algorithms that help the Kinect identify fingertips in space, and AJ Mills then developed a method for predicting where those fingertips will be from one image frame to the next. Another team member, Sam Herec, figured out how to turn the positions of those fingertips into actual, recognizable gestures, which Kerkhoff then translated into code. Finally, Melissa Milner wrote the code that translates Kerkhoff’s information into something the pre-existing design software can actually use.
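The middle two stages of that pipeline, predicting a fingertip’s next position and recognizing a gesture from its track, might look like the minimal sketch below. The constant-velocity predictor and the swipe thresholds are assumptions for illustration; the article does not describe the team’s actual methods.

```python
def predict_next(positions):
    """Constant-velocity guess at the next fingertip position, given a
    track of (x, y) points -- a simple stand-in for a frame-to-frame
    predictor like the one Mills developed."""
    if len(positions) < 2:
        return positions[-1]
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    # Extrapolate: repeat the displacement of the last frame.
    return (2 * x1 - x0, 2 * y1 - y0)

def classify_gesture(track, min_travel=100):
    """Label a fingertip track as a horizontal swipe when it travels far
    enough along x while staying roughly level; otherwise 'none'.
    The thresholds are illustrative, not goCAD's."""
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    if abs(dx) >= min_travel and abs(dx) > 2 * abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "none"
```

A real system would match tracks against a whole library of gesture templates rather than one hard-coded rule, but the division of labor is the same: track points first, interpret motion second.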

“Every engineer wants to become a Stark,” Herec said. “But he’s a pretty bad engineer because he’s fictional. He works on his own and doesn’t let anyone else in.”

“But in real life,” Andrus chimed in, “you get good projects when everyone works together.”

While they won’t be using goCAD to hurl half-ton enemy robots to their doom, the students hope that open-sourcing their software libraries will enable developers around the globe to use them to make our interactions with the cyberworld a little more intuitive.
