Meta Research: Photorealistic Clothes For Codec Avatars
Meta shared new research toward realistic simulation of clothes for avatars.
This clothing solution is built on top of Codec Avatars – Meta’s long-term project to develop photorealistic avatars driven in real time by VR hardware sensors – and comes from the same team. Instead of traditional rendering, the idea of Codec Avatars is to use a series of neural networks to learn the appearance of a given person, then continuously encode their current state from sensor input, and finally decode that state into geometry and output textures. Since it was originally presented, the Codec Avatars team has shown off several evolutions of the system, such as more realistic eyes, a version requiring only eye tracking and microphone input, a 2.0 version that approaches complete realism, and generation using only an iPhone with Face ID.
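To make the encode/decode idea concrete, here is a minimal, purely illustrative sketch of that per-frame pipeline. All names and dimensions are hypothetical stand-ins, and the linear "networks" here use random weights; the real system uses learned neural networks trained per person from capture data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- not taken from Meta's research.
SENSOR_DIM = 64    # flattened headset sensor features per frame
LATENT_DIM = 16    # compact per-frame avatar state (the "code")
GEOM_DIM = 128     # decoded geometry values (e.g. vertex offsets)
TEX_DIM = 256      # decoded texture values

# Stand-in weights; in the real system these are learned networks.
W_enc = rng.standard_normal((LATENT_DIM, SENSOR_DIM)) * 0.1
W_geom = rng.standard_normal((GEOM_DIM, LATENT_DIM)) * 0.1
W_tex = rng.standard_normal((TEX_DIM, LATENT_DIM)) * 0.1


def encode(sensor_frame: np.ndarray) -> np.ndarray:
    """Compress one frame of sensor input into a compact latent state."""
    return np.tanh(W_enc @ sensor_frame)


def decode(latent: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Expand the latent state into geometry and texture for rendering."""
    return W_geom @ latent, W_tex @ latent


# Per-frame loop: encode the current sensor state, decode to render inputs.
frame = rng.standard_normal(SENSOR_DIM)
code = encode(frame)
geometry, texture = decode(code)
print(code.shape, geometry.shape, texture.shape)  # (16,) (128,) (256,)
```

The key property the sketch illustrates is that only the small latent code has to be produced (and, in a networked setting, transmitted) each frame, while the heavier decoding to geometry and textures happens at render time.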