Jun 12, 2017 | By Benedict

Researchers from Imperial College London and London’s Royal Free Hospital are 3D scanning the faces of 6,000 volunteers in an effort to create templates for reconstructive surgery. They say the collected data could help plastic surgeons create more realistic-looking faces for patients.

Of the countless applications of 3D scanning and 3D modeling technology, there are few more powerful and affecting than reconstructive surgery. Last year, a cancer survivor from Brazil became the first patient to receive a face implant created using Autodesk’s 123D Catch, a smartphone photogrammetry app, and more advanced scanning technology can produce even more impressive results. 3D printed facial implants are also becoming increasingly common.

Recent developments in a massive London-based 3D scanning project could make facial reconstruction surgery even better—by collecting a huge 3D library of face shapes, types, and expressions that surgeons can use to make realistic artificial faces for patients.

Researchers from Imperial College London and London’s Royal Free Hospital are 3D scanning the faces of 6,000 volunteers for the project, taking in faces of different ages, ethnicities, and shapes to create an extensive database of faces. These digital 3D models could eventually be used by plastic surgeons as they attempt to create natural-looking faces for patients who have undergone facial trauma or who have serious facial deformities.

The extensive 3D scanning project actually began back in 2012, when researchers working with Great Ormond Street Hospital encouraged 12,000 volunteers to have their faces scanned at London’s Science Museum. When the scanning was complete, the researchers were able to develop 3D models with a neutral expression across a range of ages and ethnicities.

Now, the researchers are looking to widen the scope of the project by gathering 3D scans of faces showing different expressions. They say this will help surgeons create faces that can show a range of emotions.

“What we are aiming for is to develop bespoke 3D face models that act as a roadmap for facial reconstruction procedures,” said Dr Allan Ponniah, co-lead from the Royal Free Hospital. “We are still a few years away from using this procedure in surgery, but it shows real promise.”

“The applications could be life changing,” Ponniah continued. “For instance, if we want to generate the face of a five-year-old Chinese girl, our computer program will create a model that looks realistic and gives us dimensions we can use to rebuild a face. That would be really useful for a child with a specific facial deformity. You could input data and generate a face with the closest resemblance to the patient, within the normal range.”

A computer program developed by the researchers is able to process huge amounts of 3D scanned face data, mapping “facial landmarks” such as eye sockets, noses, and foreheads, as well as more subtle features such as skin texture and the corners of the lips. These so-called landmarks can be assigned coordinates, allowing for quick comparisons between faces and the generation of “average” faces based on age, ethnicity, etc.
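The landmark-coordinate idea can be sketched in a few lines of code. This is a hypothetical illustration, not the researchers’ actual program: each face is reduced to a list of (x, y, z) landmark points (real systems map hundreds of landmarks; four are shown here for brevity), which turns face-to-face comparison and group averaging into simple arithmetic.

```python
import math

# Hypothetical example landmarks: eye sockets, nose tip, chin.
# Real pipelines would map hundreds of points per face.
face_a = [(0.0, 1.2, 0.3), (1.0, 1.2, 0.3), (0.5, 0.6, 0.6), (0.5, 0.0, 0.2)]
face_b = [(0.1, 1.1, 0.3), (1.1, 1.1, 0.3), (0.5, 0.5, 0.7), (0.5, -0.1, 0.2)]

def landmark_distance(f1, f2):
    """Mean Euclidean distance between corresponding landmarks of two faces."""
    return sum(math.dist(p, q) for p, q in zip(f1, f2)) / len(f1)

def average_face(faces):
    """Per-landmark mean across a group of faces,
    e.g. everyone in one age/ethnicity bracket."""
    n = len(faces)
    return [tuple(sum(face[i][d] for face in faces) / n for d in range(3))
            for i in range(len(faces[0]))]
```

Under this toy representation, `landmark_distance` gives a crude similarity score between two scanned faces, and `average_face` yields the kind of demographic “average” face the article describes.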

Volunteers are now having their faces scanned at 3D scanning booths in the Science Museum, and are being asked to pull faces once they enter the booth. Specifically, they are being asked to express disgust, anger, fear, sadness, surprise, and pain when prompted. They are then asked to pout, flare their nostrils, and puff out their cheeks—nothing new for the selfie generation, then!

“It is a real privilege to be working with Dr Ponniah on this project, which has the potential to revolutionize facial reconstruction procedures,” said Dr Stefanos Zafeiriou from Imperial College London’s Department of Computing. “The beauty of our approach is that we can map hundreds of landmarks on the face and also features such as bone structure and muscles under the skin to create much more realistic faces.”

“Ultimately, we hope in some instances that patients can bring in old video recordings of themselves and we can morph this information into a 3D face model so that it more closely resembles what they would’ve looked like at their age before needing reconstructive surgery,” Zafeiriou added. “What we hope to offer patients is a new type of approach that enables them to get a face that is as natural-looking as possible.”

The researchers say that their work could also have exciting uses besides facial surgery. For example, the collected data could be used to create facial recognition technology with much more advanced functionality than current systems: a facial recognition application built on this data could recognize whether a user has aged physically since their identification photo was taken.

Another application for the research could be helping children with autism, some of whom find it difficult to read facial expressions. The researchers have proposed a smartphone app which scans a child’s face, digitally morphing it into showing different expressions—anger, happiness, and so forth. The child could then play a game with their realistic avatar that explains (and visually demonstrates) what the different emotions are.

The technology could even help natural historians. The researchers believe that scientists could 3D scan the skulls of ancient humans found in museums, before using the gathered 3D facial data to “reconstruct” the faces of ancient humans. This could ultimately lead to a more detailed timeline of human evolution.

The researchers are still scanning visitors to the Science Museum in the hope of building up the biggest possible facial library.


