A STATEMENT OF INTENT FOR BRINGING RAPTURE VR TO LIFE





23/02/2021



Exploring the Concept and Application of Hand Tracking Gestures and Rhythmic Spatial Audio Beats in a Virtual Reality Psychedelic-Rhythmic-Magic Game







Over the past few years, the growth and innovation in the XR field of technology has been nothing short of astonishing. Companies have not only been refining existing VR technologies but have also placed new innovation at the forefront of their XR endeavors. New companies keep entering the XR field on a regular basis, and for now XR looks set to become the new standard for how we interact with technology.


Out of these new developments, the one that caught my eye the most was how companies such as Microsoft, Oculus and Ultraleap are approaching the concept of full XR Hand Tracking.


The idea is simple: rather than holding a typical VR/XR controller, a person can see and use their real hands to interact with virtual objects. When I learned about this and tested it out myself, ideas started forming about the applications and possibilities it opens up when combined with 3D Spatial Audio technologies.


1. A New Type of Perceptual and Reactive VR Experience


The experience I am trying to create revolves around three separate parts that work together: Spatial 3D Audio beats, VR Hand Tracking with gesture detection, and visual color cues and effects.


My goal is to make a VR Rhythm game where the player uses different hand gestures to cast various spells. With different gestures the player would be able to attack, shield from attacks and slow down time. They would have to survive incoming attacks and strike back to defeat the Boss in front of them. The three gameplay parts mentioned above will work as follows:


–VR Hand Tracking Gestures – The player would use their actual hand movements, hand positions, velocity and orientation to cast specific magic spells. The idea is a fully contact-free gesture detection system that accurately triggers in-game events such as magic spells and menu interactions.


–Visual Color Cues and Effects – By applying theories of human perception and curiosity, I hope to achieve an effect that taps into the human visual system. I believe that through targeted in-game visuals, color theory and live visual reactivity I can elicit specific emotions, states and reactions, and influence the player's perception overall.


–Spatial 3D Audio – Drawing on accounts of ancient civilizations and native tribes historically using specific drum beats to elicit altered states of consciousness, I will research themes such as shamanic journeys through drum beats and drum-induced altered states, and aim to apply them using the precision of modern 3D Spatial Audio technologies in a VR environment.


The game would be structured as levels that are separate from each other rather than a cohesive, continual progression. Each level would contain its own Boss and its own visual and audio theme.


-A game session would start with the player putting on the VR headset and selecting the level they would like to play from the level select screen. This would be entirely gesture based, as I do not plan to use the hands for touching buttons or any physical object collisions.

-After selecting a level, the player would be placed into the game world and a timer signifying the beginning of the Song/Battle would start.

-When the timer ends, the Song plays and a visually gigantic Boss appears in front of the player.

-The player then has to avoid the Boss's attacks while following the path that appears in front of them as they traverse the level, and attack back when not shielding or moving, timing their gestures to the beat (see the sketch after this list).

-The player proceeds like this until the song ends or they manage to defeat the Boss. A score is shown at the end of each level, but a high rating is only possible if the Boss is defeated by the end of the song.

-After the final score is displayed the player can return to the Level select screen.
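As referenced in the list above, the following is a minimal C# sketch of how a gesture's timing could be judged against the song's beat inside Unity. It is a sketch under stated assumptions rather than a finished system: the BPM value, the 0.12-second judgment window and the BeatClock/IsOnBeat names are illustrative placeholders of my own, to be tuned and renamed during development.

using UnityEngine;

// Hypothetical beat clock for judging whether a gesture lands on the rhythm.
public class BeatClock : MonoBehaviour
{
    public AudioSource music;     // the level's song
    public float bpm = 120f;      // assumed tempo of the track
    public float window = 0.12f;  // +/- seconds around each beat that count as a hit

    double songStartDsp;          // audio-clock time at which playback begins

    void Start()
    {
        // Schedule playback one second ahead on the audio clock so the
        // start time is sample-accurate rather than frame-dependent.
        songStartDsp = AudioSettings.dspTime + 1.0;
        music.PlayScheduled(songStartDsp);
    }

    // Called by the gesture system whenever a spell gesture is detected.
    public bool IsOnBeat()
    {
        double secondsPerBeat = 60.0 / bpm;
        double songTime = AudioSettings.dspTime - songStartDsp;
        if (songTime < 0) return false;  // the song has not started yet

        // Distance in seconds from the nearest beat, in either direction.
        double offset = songTime % secondsPerBeat;
        double distance = System.Math.Min(offset, secondsPerBeat - offset);
        return distance <= window;
    }
}

Scoring logic could then reward spells for which IsOnBeat() returns true; reading Unity's audio clock rather than Time.time keeps the rhythm judgment independent of frame rate.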


The game's narrative centers on a non-corporeal being that goes on adventures through various visually psychedelic environments. Each level would have its own theme, song and visual style. The player would embody this being through the VR headset and proceed to beat the game's Bosses, each representing a certain aspect of human suffering. The being's goal is to reach enlightenment and transcend reality; by embodying it through dance, hopefully the player can do the same.


2. Main Motivations and Inspirations


Some of the inspirations for this game idea are Beat Saber (Beat Games 2018), Thumper (Drool 2019) and AudioShield (Dylan Fitterer 2016).

Although these games are good examples of VR Rhythm games, they all use either a basic swiping system for reacting to the music and game objects or a standard game controller. This has proven to be a very effective way to react accurately to the rhythm, but I have found no examples of Hand Tracking being used in a context such as this. I hope to be the first to showcase a highly accurate and interactive hand gesture-based system for precise reactions.

The main motivation behind making an audio-based game is that I believe it to be an effective way to help people by influencing the mind. I have researched deeply into the use of specific targeted audio beats by native civilizations and cultures to induce altered states of consciousness and hypnotic states, and I am very interested in the idea of using sound and natural human movement to convey a specific experience to the player.


For the visual side I have chosen to focus on various psychedelic artists and how they depict their visionary experiences through their paintings and creations. I will approach this as a Fine Art student would, delving deep into the artists' personal stories and the ways they convey their ideas. My aim is to reflect this in the game's art, assets and style.


All these elements have inspired me in their own ways over the past few months, so I hope the game achieves my goal of putting the player in a different headspace, and perhaps these experiences can serve as a tool for transformation.


3. Hand Tracking, Spatial Audio and Reactive Gesture Systems


–Hand Tracking and Gestures– Various companies have taken their own approaches to this concept. Some of the most prominent players in the field are Microsoft with the HoloLens 2 (Microsoft 2019) AR headset, Oculus with the Oculus Quest (Oculus 2019), and Ultraleap with its Hand Tracking modules for integration by XR headset OEMs. The first two offer robust software with developer-facing APIs and SDKs ready for integration, while Ultraleap is more focused on providing solutions to hardware manufacturers.

Both Microsoft and Oculus have showcased major innovations in their Hand Tracking approaches, and their SDKs for Unity (Unity Technologies 2005) provide a solid foundation for any Hand Tracking project.

The Oculus Integration SDK (Oculus 2015) for Unity allows a developer to leverage the full hand tracking capability of Oculus Quest/2 devices and exposes all the finger and joint data a developer might use to build logic. Microsoft has provided a robust foundation and examples of possible XR interactions using Hand Tracking in its MRTK SDK (Microsoft 2016) for Unity. MRTK serves as a good reference for XR UI design and for ideas about interacting with virtual objects.
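As a minimal sketch of how this exposed joint data could drive a contact-free gesture system, the following Unity C# snippet compares live OVRSkeleton bone positions against hand poses captured beforehand in a calibration step. The SavedGesture structure, the Recognize method and the 0.04-metre threshold are my own illustrative assumptions, not part of the Oculus SDK itself.

using System.Collections.Generic;
using UnityEngine;

// A saved hand pose: joint positions stored relative to the hand root.
[System.Serializable]
public struct SavedGesture
{
    public string name;                  // e.g. "AttackSpell"
    public List<Vector3> bonePositions;  // one local-space position per bone
}

public class GestureRecognizer : MonoBehaviour
{
    public OVRSkeleton skeleton;         // hand skeleton from the Oculus Integration SDK
    public List<SavedGesture> gestures;  // poses captured during calibration
    public float threshold = 0.04f;      // max per-joint deviation in metres

    // Returns the name of the closest saved pose, or null if nothing matches.
    public string Recognize()
    {
        if (skeleton == null || !skeleton.IsDataValid) return null;

        string best = null;
        float bestSum = Mathf.Infinity;

        foreach (var gesture in gestures)
        {
            if (gesture.bonePositions.Count != skeleton.Bones.Count) continue;

            float sum = 0f;
            bool discarded = false;
            for (int i = 0; i < skeleton.Bones.Count; i++)
            {
                // Express each joint in the hand's local space so the pose
                // is independent of where the hand is held in the room.
                Vector3 current = skeleton.transform.InverseTransformPoint(
                    skeleton.Bones[i].Transform.position);
                float distance = Vector3.Distance(current, gesture.bonePositions[i]);
                if (distance > threshold) { discarded = true; break; }
                sum += distance;
            }

            if (!discarded && sum < bestSum)
            {
                bestSum = sum;
                best = gesture.name;
            }
        }
        return best;
    }
}

Because the comparison runs in the hand's local space, the same pose is recognized anywhere in the play area; velocity and orientation checks, as described in section 1, could be layered on top of this.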


Some prominent games that have used Quest Hand Tracking are Hand Physics Lab (Holonautic 2020), Waltz of the Wizard (Aldyn Dynamics 2019) and Elixr (Magnopus 2020). Hand Physics Lab is a perfect showcase of the possibilities of physics-based object interaction using Hand Tracking. The interaction feels very natural because it works the way it would in real life: a player can grab an object such as a crowbar or squeeze an object made from a squishy material. In my opinion this approach to object interaction feels natural because it caters to normal human expectations. Although Hand Physics Lab is pushing the boundaries of Hand Tracking forward, the technologies on display there are not what I am looking for in my project.

Waltz of the Wizard and Elixr contain some examples of gesture-based hand interaction. You mostly use your hands to interact with objects in the scene by grabbing or touching them, but there are moments that are entirely gesture based, without any object-hand collisions. Although both games display an intriguing magic system that utilizes gestures, their aim was intuitive control and smooth visual feedback from the objects being manipulated. My aim with this project is to create a fast-reacting magic system, and rather than having only certain parts use gestures, I will make them a key feature. Waltz of the Wizard and Elixr are nevertheless a great showcase of Hand Tracking and have taught me a lot.


Alongside these Hand Tracking game experiences, multiple research studies have been published on the subject, the most prominent of which is “Gesture interaction in virtual reality” (Li et al. 2019). This paper attempts to establish clear, standard terminology and a shared understanding of what a “gesture” is in an XR context.


-Spatial Audio- Over the past few years there have been huge strides in the realm of spatial 3D game audio. On the side of regular flat-screen games there has been the introduction of Dolby Atmos (Dolby Laboratories 2012) and the Sony Tempest Engine 3D Audio (Sony 2020) in the PS5. In the XR space there are companies such as Oculus, Google, DearVR, Valve and Superpowered.

All these companies offer their own custom 3D audio spatializer SDKs for game engines, which work in both VR and flat-screen games.

After extensive research and comparison between all these options, my verdict was that each has specific quirks and is best used in specific scenarios. They have varying degrees of quality in terms of vertical, horizontal and surrounding audio placement. For my project I have chosen the Oculus Spatializer (Oculus 2015), as it provides the most detailed documentation and the most accurate vertical sound placement. Although the Oculus Spatializer showcased the most accurate sound positioning, the near-field effects of Google's Resonance Audio (Google 2018) have their advantages.

Further testing is needed during development to choose the best spatialization technology.
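Whichever plugin wins out, the per-source setup in Unity looks roughly the same, since the chosen spatializer is selected globally in the project's audio settings and each AudioSource then opts in. The sketch below is a minimal illustration; the SpatialBeatEmitter name, the distance values and the beatClip field are placeholders of my own rather than anything prescribed by the SDKs.

using UnityEngine;

// Minimal sketch: route one stem of the track through the selected
// spatializer plugin so its position in the scene drives what the player hears.
[RequireComponent(typeof(AudioSource))]
public class SpatialBeatEmitter : MonoBehaviour
{
    public AudioClip beatClip;  // one stem of the song, placed in the 3D scene

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = beatClip;
        source.loop = true;
        source.spatialize = true;   // hand this source to the spatializer plugin
        source.spatialBlend = 1f;   // fully 3D: position drives panning and attenuation
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 1f;    // full volume inside this radius
        source.maxDistance = 20f;   // fades out towards this radius
        source.Play();
    }
}

Keeping each stem on its own emitter like this is what would let the drum beats be placed and moved around the player independently of the rest of the mix.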


4. Methodology of Project Execution and Applied Research


Audio- I will work closely with Otto, a BA Music student at Falmouth University studying production, to create one or two songs for my project. These pieces will use the theory around audio-beat-induced altered states of consciousness. I will receive all stems and separate sounds from each song, including the finished project file, so that I can properly apply Audio Spatialization to the music.


Visual- I will use Microsoft Maquette (Microsoft 2019) to create the assets for this project myself. I do not have much experience with asset creation for games, so I have chosen a VR modeling tool for assistance. After comparing multiple solutions, such as Gravity Sketch (Gravity Sketch Limited 2017), Tilt Brush (Google 2016), Blocks by Google (Google 2017) and Microsoft Maquette, I chose the last one, since it provides advanced modeling tools close to those in classic modeling software such as Blender (Blender Foundation 1994) and Autodesk Maya (Autodesk 1998). Maquette also offers a Unity material importer, which will be very helpful in getting the right texture and feel for the assets.


Interactive- I will be utilizing the Oculus Quest (Oculus 2019) as my main target device and Hand Tracking as my method of interaction. There are multiple ways to approach hand gestures in VR in terms of scripting and mechanics, and I will create a system that hopefully gives the player accurate gesture-based control. The few tutorials, examples and forum posts on the topic do not cover it to an advanced level, and the refinement and error correction in the gesture handling they demonstrate works poorly.
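As one example of the kind of refinement and error correction I have in mind, the sketch below debounces the raw per-frame output of a recognizer such as the one in section 3: a pose must be held for a short time before it triggers a spell, which suppresses single-frame tracking glitches. The 0.15-second hold time is an assumption to be tuned through playtesting.

// Hypothetical debouncing filter; plain C#, no engine dependencies.
public class GestureDebouncer
{
    readonly float holdTime;  // seconds a pose must persist to count
    string candidate;         // pose currently being evaluated
    float candidateSince;     // time at which the candidate first appeared
    bool confirmed;           // whether the current candidate already fired

    public GestureDebouncer(float holdTime = 0.15f)
    {
        this.holdTime = holdTime;
    }

    // Feed the raw per-frame recognizer output plus a clock (e.g. Time.time).
    // Returns a gesture name exactly once, after it has been stable long enough.
    public string Filter(string rawGesture, float now)
    {
        if (rawGesture != candidate)
        {
            candidate = rawGesture;  // new candidate: restart the hold timer
            candidateSince = now;
            confirmed = false;
            return null;
        }
        if (!confirmed && candidate != null && now - candidateSince >= holdTime)
        {
            confirmed = true;        // fire once per stable pose
            return candidate;
        }
        return null;
    }
}

In a MonoBehaviour's Update loop this would be called as debouncer.Filter(recognizer.Recognize(), Time.time), and only a non-null result would cast a spell.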


Perceptual- The main purpose of this game is to capture the player's sense of presence. If all systems work correctly, the player will hopefully experience a dissociation from their physical body. By making the player react, dance, cast magic spells and progress through the level, I hope to achieve a full melding of the player's senses with the game world. While designing, scripting and testing this game I intend to approach it player-first, basing every decision and every interaction between the separate mechanics on the sole purpose of achieving this effect.


5. Conclusion


As the exciting new world of XR gets built, more and more innovations keep appearing and becoming our new reality. Hand Tracking is undoubtedly one of the most exciting new propositions in the field, and as a developer I am very excited to try to push what is possible forward.


I believe that VR has the potential to change lives, and I hope to push the limits of human perception in a VR space further with the concept of intuitive, natural hand gesture-based interaction systems, immersive spatial sound and visuals.


By building on top of existing foundations, often with little documentation and few examples due to the young nature of the XR field, I hope to use Spatial Sound, Reactive Visuals and smart game mechanics to build a project that achieves the desired effect.


Although this is a seemingly difficult concept to bring to reality, through the research and planning I have done I believe I have built a stable timeline and plan for executing this idea within the timeframe before the deadline.


References


BEAT GAMES. 2018. Beat Saber. Oculus Quest, PlayStation 4, Microsoft Windows: Beat Games.


DROOL. 2019. Thumper. Oculus Quest: Drool.


DYLAN FITTERER. 2016. AudioShield. Microsoft Windows: Dylan Fitterer.


MICROSOFT. 2019. Microsoft HoloLens 2. Enterprise Market: Microsoft.


OCULUS. 2019. Oculus Quest. Consumer Market: Oculus.


OCULUS. 2015. Oculus Integration SDK. Unity Asset Store: Oculus.


MICROSOFT. 2016. Mixed Reality ToolKit (MRTK). GitHub: Microsoft.


UNITY TECHNOLOGIES. 2005. Unity. X86: Unity Technologies.


HOLONAUTIC. 2020. Hand Physics Lab. SideQuest: Holonautic.


ALDYN DYNAMICS. 2019. Waltz of the Wizard: Extended Edition. Oculus Store: Aldyn Dynamics.


MAGNOPUS. 2020. Elixr. Oculus Store: Oculus.


LI, Yang, Jin HUANG, Feng TIAN, Hong-An WANG and Guo-Zhong DAI. 2019. “Gesture interaction in virtual reality”. ScienceDirect: Beijing Key Laboratory of Human-Computer Interaction.


DOLBY LABORATORIES. 2012. Dolby Atmos. Worldwide: Dolby.


SONY. 2020. Tempest Engine 3D Audio. PlayStation 5: Sony.


OCULUS. 2015. Oculus Audio Spatializer. Unity: Oculus.


GOOGLE. 2018. Resonance Audio. Online: Google.


MICROSOFT. 2019. Microsoft Maquette. Microsoft Windows, Oculus Store, Steam: Microsoft.


GRAVITY SKETCH LIMITED. 2017. Gravity Sketch. Steam, Microsoft Windows: Gravity Sketch Limited.


GOOGLE. 2017. Blocks by Google. Steam: Google.


GOOGLE. 2016. Tilt Brush. Steam, Oculus Store: Google.


BLENDER FOUNDATION. 1994. Blender. All common Operating Systems: Blender Foundation.


AUTODESK. 1998. Autodesk Maya. All common Operating Systems: Autodesk.