Feb 14, 2016 | By Tess
Video and computer game graphics have come a long way since the early days of Pac-Man and Pong, with lifelike human avatars, realistic environments, and stunning computer-generated camera movements. Now, thanks to the work done by a team of researchers at the University of Southern California, video games are about to get a lot more real, and a lot more personal.
The team of researchers, led by Ari Shapiro, the head of a Character Animation and Simulation research group at USC, has developed an inexpensive process to 3D scan a person’s face and body and turn them into a personalized digital avatar, and is making the technology accessible to anyone.
The process of turning a user into their own personalized avatar, if done correctly, takes just four minutes, and is made easy thanks to the free digital toolkit developed by the USC researchers. The toolkit includes three components: 3D scanning software; automatic rigging software to convert 3D models into game- or simulation-ready characters; and simulation software, called SmartBody, which allows you to animate and control the 3D character.
First, using any sort of sensing or 3D scanning technology, such as Microsoft Kinect or Intel RealSense, users simply 3D scan their body and face and upload the scans using the software in the toolkit. From there, automatic rigging software converts the scan data into a character that resembles the user. Finally, SmartBody, a character animation platform developed by Shapiro and his team, brings the character to life with movements and gestures.
The movements provided by the SmartBody software include locomotion, steering, object manipulation, lip syncing, and nonverbal behaviors such as nodding and hand gestures, to name just a few. These aspects of the avatar are crucial, and of the utmost importance to the USC developers. More complex facial expressions are on the way.
As Shapiro explains, “We’re looking at all aspects of what make people, people. We’re looking at how we can model a person so that it’s convincingly a person, so that it captures the essence of them. We’re trying to create a digital person that thinks, acts, and behaves like a regular person.”
Until now, such technologies have been very expensive and consequently inaccessible to most people. Now, with their new technology, the team at USC is changing this reality as they plan to make their software free to the public so anyone can use it and even potentially improve on it. “We're giving everyone the ability to scan and animate themselves for free. We’re trying to foster innovation,” says Shapiro on the matter. “While tools to create games and capture 3D exist, the toolchain to bring the entire process together typically requires expert artistic intervention and a complicated set of processes.”
Now, however, all that is changing, as Shapiro and his team, which includes fellow research leads Evan Suma and Andrew Feng, have streamlined and simplified the process so that just about anyone can make their own 3D avatar in around four minutes.
“The community can now develop interesting applications with it. The applications could extend beyond games and into social media, communication, training and more,” Shapiro said.
The SmartBody software is written in C++ and is compatible with many game and simulation engines, including Unity, Unreal, Ogre, Panda3D, and GameBryo. The software, which can be downloaded here, works with Windows, Linux, OS X, and iOS and Android devices.
Shapiro says having a personalized character may affect how people experience the game for themselves. Shapiro and his colleagues have been researching whether people make different decisions when the player character looks exactly like them. If something bad happens to the player character, does it make them more invested in the game?
The personalized avatar technology could be hugely exciting within the realm of video and computer games, though it also has other potential uses and benefits. While some people may not revel in the idea of seeing themselves shot in a violent computer game, the lifelike, personalized digital characters could potentially be used in military or police training simulations. Virtual avatars could also have a big impact on social media interactions and communications, which for the most part remain faceless. Shapiro notes that the Oculus Rift has an app, Oculus Social, where players can interact with each other in a virtual space. How would people's behavior change if they could insert their own likeness into a virtual room? “I can see a revolution in social interaction using your own 3D avatar as a means of communication,” says Shapiro, who believes that face-to-face and physical encounters can themselves communicate a lot.
Though the notion of an embodied interface may seem futuristic (imagine your digital avatar conversing with a physician’s avatar on the screen in front of you), it may soon become a reality. Until then, however, I’d like to see my own 3D-scanned avatar thrown into Fallout’s nuclear wasteland.