March 17th update - EEG & OSC to UE, Erosion, Earth & Atom Shaders
I've added many more things to my project since the last update. To document my work in progress, I made this WIP trailer for the project (working title: Pale Blue Dot):
Now onto the list of things I've added -
Attempting to integrate EEG (Brain Computer Interface) signal into my VR experience by passing EEG data from OpenBCI to Unreal Engine using OSC:
This is the most recent development in my thesis. The idea is to detect signals from the player's brain and use them to affect elements within the VR environment. This fits my project because the themes explored by the experience revolve around attempting to experience existence through different levels of time, scale, and consciousness, so establishing a connection between the user's brain and the VR environment will add a meaningful layer to the project. After consulting faculty Dan O'Sullivan, I was referred to this documentation by Peter Ziyuan Lin, which outlines how to use the OpenBCI equipment available at the ER to send out OSC data. With Peter's help, I gave it a try and was able to gather EEG data from my head and send it out as OSC.
When I tried to receive the OSC data sent out from OpenBCI in Unreal through the Unreal OSC plugin, Unreal got stuck every time I ran the project, even though the data seemed to be sending.
Following a tutorial by faculty Matt Romein, I tried sending OSC with MAX instead, and Unreal seems to work without much lag. I suspect the lag might be caused by OpenBCI sending the signals too fast. I will be working to fix this issue in order to get the signals to affect elements within the VR environment. Since I was able to get MAX to receive the OSC signal from OpenBCI, maybe I can solve this problem by organizing the data and reducing its frequency in MAX before relaying the signal to Unreal.
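To illustrate the idea of taming the signal rate before it reaches Unreal, here is a minimal Python sketch (the project itself does this in MAX, not Python) of averaging raw EEG samples into windows so far fewer OSC messages need to be sent. The window size and the stand-in data are assumptions for illustration:

```python
from collections import deque

class EEGDownsampler:
    """Buffer incoming EEG samples and emit one averaged value per
    window, so the receiver (e.g. Unreal via OSC) is not flooded
    by the raw OpenBCI sample rate."""

    def __init__(self, window_size=25):
        self.window_size = window_size   # raw samples per averaged output
        self.buffer = deque()

    def push(self, sample):
        """Add one raw sample; return the window average when the
        window fills, otherwise None."""
        self.buffer.append(sample)
        if len(self.buffer) < self.window_size:
            return None
        avg = sum(self.buffer) / len(self.buffer)
        self.buffer.clear()
        return avg

# Example: 250 raw samples collapse into 10 relay-friendly values.
ds = EEGDownsampler(window_size=25)
raw = [float(i % 50) for i in range(250)]   # stand-in for a band-power stream
out = [v for v in (ds.push(s) for s in raw) if v is not None]
print(len(out))   # 10 averaged values instead of 250 raw ones
```

The same windowed-average pattern maps directly onto MAX objects that collect and average a list before sending it on.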
Another note about EEG: with present technology (especially consumer equipment), it is almost impossible to get consistent EEG data, and each user will have a different signal range. So for my thesis, the connection between EEG and the virtual environment will be more of a conceptual addition and tech exploration than a practical way to control the player pawn and navigate the virtual environment.
Basic mechanic for level switching:
With the help of classmate Mingxi, I created a basic Blueprint for switching levels by holding down the trigger button on the VR touch controller. An emissive colored object is also added to the VR pawn, which changes color as the trigger is held. In the final iteration, similar feedback will be present as the player holds a button to trigger the level switch.
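The hold-to-switch logic above can be sketched outside of Blueprints. This is a hypothetical plain-Python version (the hold duration and color values are assumptions, not the project's actual settings) showing the two pieces: accumulating hold time per frame, and lerping the emissive color as feedback:

```python
class HoldToSwitch:
    """Track how long the trigger has been held and report progress
    toward a level switch -- a sketch of the Blueprint logic: hold
    the trigger, watch the emissive object change color, switch
    levels when the hold time is reached."""

    def __init__(self, hold_seconds=2.0):
        self.hold_seconds = hold_seconds
        self.elapsed = 0.0

    def tick(self, delta_time, trigger_down):
        """Call once per frame. Returns (progress in 0..1, switch_now)."""
        if trigger_down:
            self.elapsed = min(self.elapsed + delta_time, self.hold_seconds)
        else:
            self.elapsed = 0.0               # releasing resets the hold
        progress = self.elapsed / self.hold_seconds
        return progress, progress >= 1.0

def lerp_color(a, b, t):
    """Blend two RGB colors by t -- drives the emissive feedback."""
    return tuple(ca + (cb - ca) * t for ca, cb in zip(a, b))

# Simulate holding the trigger for 2 s at 10 fps.
hold = HoldToSwitch(hold_seconds=2.0)
for _ in range(20):
    progress, switch = hold.tick(0.1, trigger_down=True)
print(switch)   # True: the level switch fires
```

In Blueprint terms, `tick` corresponds to Event Tick gated by the trigger's pressed state, and `lerp_color` to a Lerp node feeding the emissive material parameter.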
Earth & Cloud Shader:
Using this tutorial as a starting point, I created a planet-sized Earth in Unreal, complete with a dynamic landscape that changes shape during runtime (more on the landscape shader below) and a stylized look based on the real Earth's surface, with dynamic clouds floating above.
Each of the original Earth texture layers (normal, cloud, base color, etc.) was modified in Photoshop to achieve a stylized look.
Below is the graph for the cloud shader:
Below are screenshots of the full Earth surface shader, which combines the shader from the Making Planets in Unreal tutorial with the erosion shader shown further down this post.
Earth shader - world position offset:
Earth Shader part 2:
Earth Shader part 3:
Earth Shader full view:
I've also started building the level representing atoms/protons/electrons and created a nice moving shader for the basic elements (watch the trailer video above to see it move):
I've also experimented with decreasing the size of everything in the level to imitate a zooming-out effect while switching between levels representing different scales. This is done by parenting everything to an empty actor and then shrinking the actor's scale in the level BP. Here is a rough pass of the mechanic in action (I also added a random violent shake to the spheres at runtime to simulate atoms vibrating):
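The scale-down and shake can be sketched as two small per-tick functions. This is a hypothetical Python illustration of the math, not the actual level Blueprint; the shrink rate and shake amplitude are made-up values:

```python
import random

def zoom_out_step(scale, delta_time, shrink_rate=0.5):
    """Shrink the parent actor's uniform scale a little each tick;
    repeated over many frames this decays exponentially, which
    reads as the camera 'zooming out' of the scene."""
    return scale * (1.0 - shrink_rate * delta_time)

def atom_shake(position, amplitude=5.0):
    """Add a random violent offset to a sphere's location each tick
    to make the 'atoms' vibrate."""
    return tuple(p + random.uniform(-amplitude, amplitude) for p in position)

# Shrink from full size over ~3 s at 60 fps.
scale = 1.0
for _ in range(180):
    scale = zoom_out_step(scale, 1.0 / 60.0)
print(round(scale, 3))   # ~0.222 of the original size
```

Because everything is parented to one empty actor, a single scale value drives the whole scene, while the shake is applied per sphere on top of it.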
One of the most challenging problems I was able to solve is simulating the effects of erosion by changing the shape of the landscape mesh at runtime. Thankfully, I found this video through a recommendation by classmate Kevin Peter He and emailed the uploader to find out how the author achieved this effect. The "World Position Offset" input seems to be the key to achieving it. Below is the basic shader given to me by the author of the video:
I then modified this shader so that the landscape not only changes height but also moves around, in order to simulate real-life landscapes undergoing large amounts of erosion and movement over millions of years. This was done through a combination of panning the UVs of different noise textures plugged into the shader.
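The panning-UV idea can be sketched on the CPU. This is a hypothetical Python analogue of the material graph, not the actual shader: a cheap sine-based function stands in for the noise texture, and the panning speeds and amplitudes are made-up constants:

```python
import math

def noise2d(u, v):
    """Cheap periodic pseudo-noise standing in for a noise texture sample."""
    return 0.5 + 0.5 * math.sin(6.2831 * u) * math.cos(6.2831 * v)

def world_position_offset(u, v, t, height=200.0, drift=50.0):
    """Sketch of the panning-UV technique: sample noise at UVs that
    scroll over time, then use one sample to raise/lower the landscape
    (Z) and another to push it sideways (X, Y), so the terrain both
    changes height and appears to migrate, like slow erosion."""
    h = noise2d(u + 0.05 * t, v)        # panned UVs -> height change
    d = noise2d(u, v + 0.03 * t)        # second pan -> lateral drift
    return (drift * (d - 0.5), drift * (0.5 - d), height * (h - 0.5))
```

In the material, the equivalent is a Panner node offsetting each noise texture's UVs by time, with the samples scaled and combined into the World Position Offset pin.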
This is the current result. When I have more time, I would like to modify it further so it looks more dynamic and visually pleasing, like the Earth shader: