
Meet Mike: Epic Avatars

Mike Seymour
Fxguide / The University of Sydney, Australia
mike.seymour@sydney.edu.au

Chris Evans
Epic Games, USA
chris.evans@epicgames.com

Kim Libreri
Epic Games, USA
kim.libreri@epicgames.com

ABSTRACT
Meet Mike uses the latest techniques in advanced motion capture to drive complex facial rigs, enabling detailed interaction in VR. This allows participants to meet and talk in VR, experiencing new levels of photorealistic interaction. The installation uses new advances in real-time rigs, skin shaders, facial capture, deep learning and real-time rendering in UE4.

CCS CONCEPTS
• Human-centered computing → Human computer interaction (HCI); Virtual reality;

KEYWORDS
Virtual Reality, Facial simulation, Interactivity, Avatars, FACS, Digital Humans, Affective computing

Figure 1: Digital Mike meets and interviews guests in VR.

ACM Reference format:
Mike Seymour, Chris Evans, and Kim Libreri. 2017. Meet Mike: Epic Avatars. In Proceedings of SIGGRAPH 17 VR Village, Los Angeles, CA, USA, July 30 - August 03, 2017, 2 pages.
https://doi.org/10.1145/3089269.3089276

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
SIGGRAPH 17 VR Village, July 30 - August 03, 2017, Los Angeles, CA, USA
© 2017 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-5013-6/17/07.
https://doi.org/10.1145/3089269.3089276

1 INTRODUCTION
In the last decade, there has been intense research activity studying VR experiences. In the last few years this has extended to digital humans as either Avatars or Cognitive Agents. Due to technical constraints, research has been limited in its exploration of realistic face-to-face expressive emotional communication. Algorithmic advances, combined with extremely high-resolution scanning, will allow, for the first time, participants at the SIGGRAPH VR Village to meet in a virtual world with highly interactive and detailed life-like Avatars.

The VR installation allows attendees to explore an unprecedented level of complex emotional interactivity with a virtual human. The project builds on the work of the team that won Best of SIGGRAPH LIVE 2016 [Seymour 2016] and of the Digital Human League. Meeting Digital Mike in VR allows users to experience the subtlest performances and renderings of a real-time character yet seen, with eye contact and complex nuanced emotional feedback loops. A group of people will be able to simultaneously meet in a VR space and experience a conversation with the advanced real-time digital avatar.

The installation explores the acceptance of, and affinity people have with, Avatars when they are witnessed in an immersive, interactive and affective computing loop.

The project highlights the latest advances in VR stereo rendering, skin shaders and complex FACS-based rig models inside the Unreal Engine. The system runs on a network of VIVE VR stations and provides a look at unparalleled engagement and virtual presence, while illustrating multiple levels of facial tracking and rendering.

Digital Mike will interview leading industry experts in VR every day of the SIGGRAPH conference. Both Digital Mike and his guests will meet in virtual reality with interactive real-time avatars to discuss a range of cutting-edge topics. Audience members can join the discussion in VR or observe from the real world. This project represents a combined effort of seven teams and several universities from five countries. It demonstrates both current leading technology and advances in interactive real-time photo-real facial animation.

The project is the combined effort of Epic Games (UE4), 3Lateral, Cubic Motion and Loom.ai, together with the teamwork of the Chaos Group Labs Digital Human League. Special thanks for the major technical contributions provided by USC ICT [Debevec et al. 2000] and Disney Research Zurich [Bérard et al. 2016].

The UE4 engine includes high-quality lighting features, new skin shaders and stereo rendering, which together produce crisp, detailed VR images. This is coupled with extremely complex and detailed facial and eye rigs, based on extensive input scanning from the USA and Europe. Together they provide one of the most realistic interactive real-time facial experiences yet seen.

The interaction is made more interesting as the guest in VR will have a custom avatar made, which is rendered in real time with their facial expressions interpreted while they are wearing the

VR head gear. This is done with new sensors and cameras on the VR rig, powered by AI deep learning algorithms.

The two approaches allow for an interactive exploration of digital humans and new forms of Avatars. This new VR installation demonstrates complex digital human rigs and real-time rendering, including:

• FACS real-time facial motion capture and solving [Ekman and Rosenberg 1997]
• Models built with Light Stage scanning at USC-ICT
• Advanced real-time VR rendering
• Advanced eye scanning and reconstruction
• New eye contact interaction / VR simulation
• Interaction of multiple avatars
• Character interaction in VR at suitably high frame rates
• Shared VR environments
• Lip sync and unscripted conversational dialogue
• Facial modeling and AI-assisted expression analysis.

REFERENCES
P. Bérard, D. Bradley, M. Gross, and T. Beeler. 2016. Lightweight Eye Capture Using a Parametric Model. ACM Transactions on Graphics 33, 6 (2016), 112.
P. Debevec, T. Hawkins, C. Tchou, H. Duiker, M. Sagar, and W. Sarokin. 2000. Acquiring the Reflectance Field of a Human Face. In Proceedings of ACM SIGGRAPH 2000. 145–156.
Paul Ekman and Erika L. Rosenberg. 1997. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System. Oxford University Press.
M. Seymour. 2016. EPIC Win: Previs to Final in Five Minutes. fxguide. https://www.fxguide.com/featured/epic-win-previs-to-final-in-five-minutes/. (2016).

2 VR INSTALLATION DESIGN
The core installation features a host driving a high-end real-time rig and a guest wearing a special second VR rig. Both avatars talk and interact in VR via UE4. Both guest and host have real-time facial rigs, but use different technological approaches.
Surrounding this are approximately five VR stations which provide a VIVE VR audience. Each audience participant will see a brief introduction and then join the event.
The host moderator uses a Technoprops stereo head rig and a
bank of surrounding monitors. From this data input station, the
facial animation is determined.
The primary avatar is driven in VR by a facial tracking system provided by Cubic Motion and an advanced solver from 3Lateral. Epic Games' UE4 provides a high-frame-rate VR environment and a real-time avatar that can be seen from each of the individual stations.
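As a rough illustration of how a tracker-to-rig pipeline of this kind can work, a solver typically fits tracked landmark displacements to FACS action-unit (AU) weights against a per-AU displacement basis, then maps those weights onto named rig blendshapes. The sketch below is a minimal, hypothetical stand-in under those assumptions; the array shapes, solve method, and name mapping are illustrative, not the actual Cubic Motion or 3Lateral interfaces.

```python
import numpy as np

def solve_au_weights(landmarks, neutral, au_basis):
    """Solve for w in [0, 1] so that neutral + au_basis @ w ~ landmarks.

    landmarks, neutral : (n,) flattened tracked point coordinates
    au_basis           : (n, n_aus) displacement of each AU at full weight
    """
    delta = landmarks - neutral
    # Plain least squares, then clamp to [0, 1]; a production solver would
    # use a constrained fit with temporal regularization instead.
    w, *_ = np.linalg.lstsq(au_basis, delta, rcond=None)
    return np.clip(w, 0.0, 1.0)

def apply_to_blendshapes(au_weights, au_to_shape):
    """Map solved AU weights to named rig blendshape values."""
    return {shape: float(au_weights[i]) for shape, i in au_to_shape.items()}
```

With a small synthetic basis the solver recovers the driving weights exactly; the real system solves many more AUs from dense landmark curves at VR frame rates.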
Additionally, the guest in a VIVE head display is present in VR via a Loom.ai avatar, which is driven with deep learning using a custom mouth-facing camera and special eye tracking, also from Cubic Motion.
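The guest-side update described above can be pictured as a simple per-frame loop: predict expression parameters from the headset cameras, smooth them, and hand them to the renderer. The sketch below is hypothetical; the predictor class is a placeholder for the proprietary learned model, and exponential smoothing is shown only because raw per-frame predictions are typically jittery.

```python
class ExpressionPredictor:
    """Placeholder for a trained network mapping a mouth-camera frame plus
    an eye-tracking sample to a fixed-length expression parameter vector."""

    def __init__(self, n_params: int):
        self.n_params = n_params

    def predict(self, mouth_frame, gaze_sample):
        # A real model would run inference on the camera frame here.
        return [0.0] * self.n_params

def smooth(prev, new, alpha=0.3):
    """Exponentially smooth parameters to suppress per-frame jitter."""
    return [alpha * n + (1.0 - alpha) * p for p, n in zip(prev, new)]

def update_guest_avatar(predictor, mouth_frame, gaze_sample, prev_params):
    """One frame of the loop: predict, smooth, and return the parameters
    to send to the renderer (e.g. as rig/blendshape values)."""
    raw = predictor.predict(mouth_frame, gaze_sample)
    return smooth(prev_params, raw)
```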

3 ACKNOWLEDGEMENTS
This project is possible due to the team at Epic Games: Kim Libreri: Creative Exec Supervisor; Andrew Harris: Creative Supervisor; Chris Evans: Technical Lead; John Jack: Senior Producer; Richard Ugarte: Associate Producer; Ellen Liew: Event Manager; Jeff Farris: Senior Engine Programmer; Shane Caudle: Principal Artist; Sean Gribbin: Assistant Producer; and the rest of the team at Epic Games.
Vladimir Mastilovic, the founder of 3Lateral. Gareth Edwards, Steve Caulkin and Steve Dorning at Cubic Motion. Kiran Bhat at Loom.ai.
Christopher Nichols at the Chaos Labs Digital Human League / Wikihuman project. Paul Debevec and Jay Busch at Google / USC ICT Light Stage. Glenn Derry: Fox Studios / Technoprops.
The team at Sydney University: Kai Riemer, Judy Kay and Mike Seymour. Alan Dennis, Kelley School of Business, Indiana University, and Lingyao (Ivy) Yuan, Iowa State University.
