With a new and improved Apple operating system for iPhones (iOS 11, due out on September 19, 2017) comes new and improved capabilities for augmented reality (AR) companies ... or at least for ModiFace.
ModiFace, a company specializing in AR technology for the beauty industry, will launch a live 3D video-based hair color simulation (see video below) using "a set of collaborative neural network to detect hair in each video frame and to adjust the coloration of hair in a realistic way."
According to the company, the machine learning advances within iOS 11, including Core ML, will give users its fastest and smoothest video transformation experience yet.
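ModiFace has not published its method, but the idea it describes (detect hair in each frame, then recolor it realistically) can be sketched generically: a segmentation network produces a soft per-pixel hair mask, and the target color is blended in while preserving each pixel's original luminance so highlights and shading survive. The function below is a minimal illustration of that blending step only, with a hypothetical `recolor_hair` name; the mask itself would come from a neural network not shown here.

```python
import numpy as np

def recolor_hair(frame, hair_mask, target_rgb, strength=0.8):
    """Blend a target color into pixels flagged as hair.

    frame:      H x W x 3 float array, RGB in [0, 1]
    hair_mask:  H x W float array in [0, 1] (e.g. soft output of a
                segmentation network; 1.0 = definitely hair)
    target_rgb: length-3 target color in [0, 1]
    strength:   overall intensity of the recoloring effect
    """
    frame = np.asarray(frame, dtype=np.float64)
    # Per-pixel luminance (Rec. 601 weights) keeps shading detail,
    # so the recolored hair still shows its original highlights.
    luma = frame @ np.array([0.299, 0.587, 0.114])
    tinted = luma[..., None] * np.asarray(target_rgb, dtype=np.float64)
    # Soft mask times strength acts as a per-pixel alpha for the blend.
    alpha = (np.asarray(hair_mask, dtype=np.float64) * strength)[..., None]
    return (1.0 - alpha) * frame + alpha * tinted
```

Applied per video frame, this kind of masked blend leaves non-hair pixels untouched wherever the mask is zero, which is why segmentation quality dominates how realistic the result looks.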
The technology is expected to be used both on mobile devices and in stores by the end of 2017.
ModiFace to Use Apple's ARKit
The company has also announced its use of Apple's ARKit to deliver a personalized, at-home shopping experience.
Customers will be able to virtually try on beauty products through the ModiFace app, instantly seeing products, reviews, and simulations applied to their own photo at a virtual beauty counter. For a better idea, watch the demo below.
ModiFace CEO Parham Aarabi explained, “We have been working on deep learning architectures for a long time now, and recently advances in both the neural network architectures, basic hardware level optimizations, as well as the availability of significant training data, have made photo-realistic video hair tracking and coloration possible.”