Computer-animated characters can mimic human behavior extremely well, but there's one trick that digital denizens haven't quite mastered: getting dressed.

Researchers Alexander Clegg, Jie Tan, Greg Turk and Karen Liu from the Georgia Institute of Technology have developed a tool that allows animators to create realistic motion for virtual humans getting dressed. The new algorithm enables virtual characters to intelligently manipulate simulated cloth, completing the task of dressing in different styles for various types of garments and fabrics.

Computer-animated films tend to omit dressing scenes due to the difficulty of manipulating simulated cloth. But with the help of this new tool, animators will be able to create scenes similar to live-action movies with iconic clothing sequences, such as the "jacket on, jacket off" drill in 2010's "The Karate Kid" or Spider-Man pulling his mask over his head for the first time. The researchers' long-term goal is to use this technology to enable robots to help disabled or elderly adults with self-care.

Cover image: Alexander Clegg