Epic’s MetaHuman Creator Tool Now Imports Scans of Real People
Epic Games has released a major update to its MetaHuman character creation tool, which can now import scans of real people for use in real-time applications.
The update offers a sneak peek at a future in which users can easily import realistic digital scans of themselves into virtual reality and the broader metaverse.
The MetaHuman tool makes life easier for developers, allowing them to create a huge variety of high-quality 3D character models for use in real-time applications. It works like an advanced version of the “character customizer” tools widely found in modern games, but with far greater fidelity and control.
When the tool first launched, developers could only start from a selection of preset faces and then modify the character’s look to their liking from there.
Users have been experimenting with the tool to try to recreate their own likeness or that of famous celebrities.
Until now, the MetaHuman character creation tool has offered a much faster workflow than building a comparable model manually from the ground up. However, it still couldn’t faithfully capture the likeness of a specific person.
The latest update should go a long way toward improving that. It introduces a new ‘Mesh to MetaHuman’ feature that lets developers import face scans of real people (or even 3D sculpts created in other software); the system then automatically generates a MetaHuman face based on the imported scan, complete with full rigging for animation.
However, the feature is still limited in some respects. For instance, it will not automatically generate hair, skin textures, or other finer details. At this point, Mesh to MetaHuman is primarily designed to match the overall topology of the head and rig it for realistic animation. Developers still have to fill in details such as skin textures and do extra work on the hair, facial hair, and eyes of the person whose likeness they want to recreate.
The MetaHuman character creator tool is currently in Early Access and is aimed at Unreal Engine developers. It hasn’t reached the point where users can simply snap photos of their heads to create realistic digital avatars of themselves, but it is definitely heading in that direction.
The ultimate aim is for each of us to have a realistic, convincing avatar that can be used across the broader metaverse.
However, the technology will have to overcome certain challenges. It is not enough to generate an avatar that merely looks like someone; the avatar also needs to move like the user for it to be a convincing rendition of their likeness.
Each person exhibits unique mannerisms and facial expressions that are easily recognized by those close to them. Even when a realistic avatar is rigged with a person’s face and imitates their expressions, it never quite moves like the real person. That hardly matters to strangers, but people who already know you have a baseline of expressions to draw from, and the smallest deviation from someone’s usual facial expressions can give the impression that they are distracted, drunk, or tired.
Meta is also trying to address this specific challenge with its Codec Avatars system (a separate effort from Epic Games’ MetaHumans). Codec Avatars aim to animate realistic models of a user’s face in real time, with believable animations that are unique to that person.
As this field continues to evolve, we are likely to see a future system that combines the strengths of Codec Avatars and MetaHumans, making lifelike digital avatars easy to create while animating them in a way that is believable and unique to the person they represent.