• Jack Chen

Thesis update Feb 19

Updated: Mar 17

Since the last blog post, I've further refined the "storyboard" for the VR experience, describing each segment in order. Here is a brief outline of each segment; more details are in the script document, a work in progress that I update whenever I come up with new ideas.


Part 1 - Human perspective

In the beginning section of the experience, players are "born" into darkness in an area which will gradually be populated with other human forms. Eventually, structures and other shapes will come together to create a linear interaction representing the development and conflict of human civilizations.


Part 2 - Small to big

After experiencing the human perspective, the viewer shifts to the perspective of an electron orbiting a proton. From there, the scale and time will change multiple times until the scale reaches that of the entire universe. Time will also shift depending on each scale: real electron movements are too fast to be observed, so time in that section will pass very slowly, whereas the movement of galaxies in real life is too slow to be observed within a human lifetime, so time at that scale will pass extremely fast, allowing players to see galaxies move around the universe.


Part 3 - All of existence

Inspired by the stargate sequence from the film "2001: A Space Odyssey", this sequence represents experiencing all of time, space, and existence together. It will be triggered by the player entering a black-hole-like portal near the end of the previous section, transporting them into a new dimension full of vibrant shaders and shapes representing the full spectrum of existence itself.


Part 4 - Epilogue

After their trip through various times, scales, and dimensions, players return to the human perspective from the beginning of the experience. Having just experienced all of existence, the human perspective feels like a return home, yet also slightly different.


I've also done more experiments with shaders and effects this week, finding several elements I can use in the project, including a particle system resembling a galaxy with a black hole at its center:


Some post-processing effects that will be useful for the "stargate" segment of the experience:

Beginning to learn basic AI programming in Unreal Engine for the human perspective segment, where other humanoid figures will move around the environment:

A very rough layout for some segments of the experience:




I've also experimented and figured out a way to make Niagara particles that interact with the player. In this case, the particles always spin around the player (the cube):


In this case, the particles are attracted to the player's left hand:


This is done by first setting a user parameter in the Niagara system and using it to drive elements of the system (in my case, the vortex force and the point attraction force position):




Then I get the player position in the level blueprint and use it to update the user variable set in Niagara.


I've also tried creating a portal effect following this tutorial:

https://www.youtube.com/watch?v=F28NKqG7ce8&list=PLraLBwsJKuGhw6H9SN-89l2DxXT6_RR_0&index=9&ab_channel=FusedVR

The result works in first-person mode but not yet in VR; I'll be looking for a workaround so I can use a similar effect when transitioning between dimensions/segments in the experience. Blueprints below for future reference.

Level BP:



Portal BP:




I've also been communicating with the composer/sound artist I am collaborating with to make the soundscapes for the experience. We are hoping to use a wide range of instruments and sounds to portray the wide range of time, scale, and environments present in the experience, rather than sticking with any particular genre of music or sound.


Lastly, I had the idea of integrating a brain-machine interface (EEG data and devices) into the project in the future, but after some research, I found that there is no direct, established way to connect EEG data to Unreal Engine. Plugins exist that help with the process, such as this one: https://www.unrealengine.com/marketplace/en-US/product/braincomputerinterface-ue4plugin?sessionInvalidated=true

but it still requires other software to interpret the data from an EEG device. It would definitely be great to eventually incorporate some form of EEG-based feedback driven by the player's brain activity, especially given the theme of the experience, but for now I plan to focus on crafting a refined VR experience before adding things such as physical computing components or EEG.
