Research summary

  • 2016 to 2022

This project seeks to improve the experience of trying out products virtually using augmented reality (AR).

Virtual try-on for faces is popular in gaming and cosmetics, but it can also be applied in medical contexts such as facial reconstruction surgery and medicinal skin products.

We are developing a new technique that fills in missing parts of a facial image through real-time inpainting, creating a more realistic appearance. Our solution uses machine intelligence to incorporate facial wrinkle information and runs on mobile platforms with faster processing times than existing applications.
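As an illustrative sketch only, the snippet below shows the general idea of mask-based image inpainting: a small convolutional encoder-decoder predicts the missing pixels and the known pixels are kept from the input. The architecture, layer sizes, and mask convention are assumptions for this example and do not reflect the project's actual model.

```python
# Minimal sketch of masked-image inpainting (illustrative only; not the
# project's architecture). Mask convention assumed: 1 = known, 0 = missing.
import torch
import torch.nn as nn


class TinyInpainter(nn.Module):
    """Small convolutional encoder-decoder that predicts missing pixels."""

    def __init__(self):
        super().__init__()
        # Input: RGB image with holes zeroed out, plus the 1-channel mask,
        # concatenated to 4 channels.
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # predicted RGB in [0, 1]
        )

    def forward(self, image: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        x = torch.cat([image * mask, mask], dim=1)
        pred = self.decoder(self.encoder(x))
        # Keep known pixels from the input; fill only the missing regions.
        return image * mask + pred * (1.0 - mask)


if __name__ == "__main__":
    model = TinyInpainter()
    face = torch.rand(1, 3, 128, 128)   # placeholder face crop
    mask = torch.ones(1, 1, 128, 128)
    mask[:, :, 40:80, 50:90] = 0.0      # simulate an occluded region
    completed = model(face, mask)
    print(completed.shape)              # torch.Size([1, 3, 128, 128])
```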

Working alongside face-tracking and AR specialists Image Metrics, we are testing our method in real-world applications.

This research will investigate deep learning architectures for various inpainting tasks.

Project source code

View the source code for our inpainting framework.

Research outputs

Selected academic papers

Funding