• Jack Chen

Mixed Reality Music Video

Updated: Mar 1

For our final project, our team decided to fully explore the potential of combining Unreal Engine's world-building and Composure compositing features with live-action filmmaking by making a music video that follows a character's journey between the real and virtual worlds.


The plot of the music video mirrors our production process, exploring the transition and relationship between real life and virtual reality. The main character, Charlin (played by our team member of the same name), is walking on the street when a virtual version of himself jumps out from a portal and invites him into a seemingly utopian virtual world full of shining temptations. However, Charlin soon discovers that the virtual world is not as glamorous as it seems, and the virtual version of himself closes the portal, trapping Charlin inside forever.


The art direction of our video is inspired by vaporwave, cyberpunk, and synthwave aesthetics, along with various contemporary music videos that use virtual production to place live-action footage of the artist within virtual worlds.


A rough 3D layout of the scene was first created inside Unreal Engine to help plan the environment and the camera angles for the virtual shots.

Once the plot and art direction were decided, a shot list was created using rough storyboards shot on phones to map out the camera angles. Meanwhile, the 3D virtual environment and its shot angles were being built and planned in Unreal Engine.

A virtual camera on a virtual dolly was used to render the purely CG shots of the virtual environment.
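Under the hood, a virtual dolly move amounts to interpolating the camera transform between keyframed positions along the rail. A minimal sketch of that idea in plain Python (the keyframe values below are made up for illustration, not taken from our actual scene):

```python
# Linear interpolation of a virtual camera along a dolly rail.
# Keyframes are (time_seconds, (x, y, z)) pairs in Unreal's
# centimetre units; the values below are illustrative only.

def lerp(a, b, t):
    """Blend between scalars a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def camera_position(keys, time):
    """Return the interpolated camera position at `time`.

    `keys` is a list of (t, (x, y, z)) sorted by t; times before
    the first or after the last key are clamped.
    """
    if time <= keys[0][0]:
        return keys[0][1]
    if time >= keys[-1][0]:
        return keys[-1][1]
    for (t0, p0), (t1, p1) in zip(keys, keys[1:]):
        if t0 <= time <= t1:
            f = (time - t0) / (t1 - t0)
            return tuple(lerp(a, b, f) for a, b in zip(p0, p1))

# A 4-second push from the origin to 4 m down the rail:
keys = [(0.0, (0.0, 0.0, 150.0)), (4.0, (400.0, 0.0, 150.0))]
print(camera_position(keys, 2.0))  # midpoint of the dolly move
```

In practice Unreal's sequencer handles this (with smoother easing curves than straight linear blending), but the clamping-and-blending logic is the same.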



The main live-action segments were shot in front of a greenscreen, and practical lighting was used on set to simulate light from the virtual environment.


In order to animate the virtual version of Charlin, we recorded motion capture data to be used with our custom avatar in Unreal Engine.



Once the live-action footage was shot, the greenscreen footage was keyed and then camera-tracked in After Effects. The camera-tracking data was then imported into Unreal Engine to drive the virtual camera in the virtual environment, so that the keyed footage could be composited on top with the movement of the environment matching that of the live-action footage.
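Moving a camera track between After Effects and Unreal involves a coordinate-space conversion: AE works in X-right / Y-down / Z-forward pixel units, while Unreal uses X-forward / Y-right / Z-up centimetres. The sketch below shows one plausible axis mapping in plain Python; the scale factor and the exact mapping are assumptions that would have to be matched to the real scene scale and solver output, not the exact numbers from our project.

```python
# Convert an After Effects 3D camera position into Unreal's
# coordinate space. AE: X-right, Y-down, Z-forward, in pixels.
# Unreal: X-forward, Y-right, Z-up, in centimetres.
# `pixels_per_cm` is an assumed scene-scale factor.

def ae_to_unreal(pos, pixels_per_cm=10.0):
    """Map an AE (x, y, z) position to an Unreal (x, y, z) position."""
    x, y, z = pos
    s = 1.0 / pixels_per_cm  # pixels -> centimetres
    return (z * s,    # AE depth becomes Unreal forward (X)
            x * s,    # AE right stays right (Y)
            -y * s)   # AE down flips to Unreal up (Z)

# One keyframe of a hypothetical exported camera track:
print(ae_to_unreal((960.0, 540.0, 2000.0)))
```

Rotation keyframes need an equivalent axis remapping; getting both consistent is what makes the CG environment's motion lock to the live-action plate.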




The Composure tool in Unreal Engine was used to preview the composite of the live-action footage with the environment. Composure was also used to separate the foreground and background layers of the CG environment.



Finally, the foreground and background layers of the virtual environment were rendered as separate image sequences to be composited back together in After Effects.
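Stacking the rendered layers back together (CG foreground over keyed footage over CG background) comes down to the standard "over" operator that compositors apply per pixel. A self-contained sketch of that operator in Python, with illustrative colour values rather than real render data:

```python
# The Porter-Duff "over" operator used when stacking a keyed or
# rendered foreground layer over a background layer.
# Colours are straight-alpha (r, g, b, a) tuples in [0, 1];
# the sample values below are illustrative only.

def over(fg, bg):
    """Composite straight-alpha `fg` over `bg`."""
    fr, fgreen, fb, fa = fg
    br, bgreen, bb, ba = bg
    out_a = fa + ba * (1.0 - fa)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    def blend(f, b):
        return (f * fa + b * ba * (1.0 - fa)) / out_a
    return (blend(fr, br), blend(fgreen, bgreen), blend(fb, bb), out_a)

# Half-transparent red foreground over an opaque blue background:
print(over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0)))
```

After Effects applies this automatically for each layer in the stack; rendering the CG foreground and background as separate sequences is what lets the keyed actor sit between them.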


In the end, the composited footage was edited together in Premiere.





