
Space VR

Developed as an individual second-year university project, this VR escape room demonstrates my ability to use VR technology to create an immersive experience incorporating complex puzzles.

I utilized Unity and C# to create this game.

Made for: ARU 2nd Year – Trimester 1

Duration: 4 Months

Engine: Unity

Language: C#

Position: Individual project

Brief (Excerpts):

  • 1st person

  • small game with at least 3 complex mechanics/puzzles

  • 2x2m space limitation (this will be measured in real space by defining a 2 by 2 m play area for the VR headset)

  • 5 min gameplay time (this must be shown in game - when the time runs out, the player loses the playthrough)

  • Win and Lose conditions 

  • use of provided Polygon assets

Note: as stated in the brief, some of the 3D assets in this project were external and not created by me.

Used assets can be found here:

POLYGON Sci-Fi Space - by Synty: https://assetstore.unity.com/packages/3d/environments/sci-fi/polygon-sci-fi-space-low-poly-3d-art-by-synty-138857

POLYGON Construction - by Synty: https://assetstore.unity.com/packages/3d/environments/industrial/polygon-construction-low-poly-3d-art-by-synty-168036

VR rig setup

Creating games in VR is much more challenging than designing for classic controller or keyboard-and-mouse controls. The complexity comes from the player controlling more than one entity in the game rather than a single character. The most common example is the traditional two hands plus headset camera, which means the player has three input devices in total, each controlling something within the game. Fortunately, creating the base rig turned out to be fairly simple in Unity, as the engine has built-in rigs for XR interactions.

I still needed a script to detect inputs from the hand controllers, so I wrote this code, which is attached to the two device objects. It also has a secondary function: it animates the hand models connected to each object.

Click Image to view code


This gave me the base for input on any XR device: the script checks which devices are currently available, then moves the hands and animates them according to button presses. This was a vital function for the game, as the puzzles and mechanics all revolve around VR interactions.
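In outline, a controller script of this kind could look like the sketch below. This is an illustrative reconstruction, not the exact code shown in the image: the class name, the "Trigger" and "Grip" animator parameters, and the public fields are my own placeholders.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: read one XR controller and drive the hand animation.
public class HandController : MonoBehaviour
{
    public XRNode controllerNode = XRNode.RightHand; // which controller this object represents
    public Animator handAnimator;                    // animator on the attached hand model

    private InputDevice device;

    void Update()
    {
        // Controllers can connect late, so re-query until a valid device is found.
        if (!device.isValid)
            device = InputDevices.GetDeviceAtXRNode(controllerNode);

        // Read trigger and grip as 0..1 values and feed them to the hand animation.
        if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
            handAnimator.SetFloat("Trigger", trigger);
        if (device.TryGetFeatureValue(CommonUsages.grip, out float grip))
            handAnimator.SetFloat("Grip", grip);
    }
}
```

One instance of this script goes on each hand object, with the node set to the left or right hand respectively.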

Puzzles

AI Maze

When deciding what this puzzle should be, I wanted to explore areas I had not tried yet. One of these was AI, specifically learning how to get an AI to traverse its surroundings (pathfinding). Using Unity's NavMesh, I created a maze puzzle in which the player guides a blue ship, opening and closing doors to make sure the enemy AI does not touch it. The player must reach all the checkpoints to succeed, doing everything with their hands by grabbing and interacting with the maze.


Click Image to view code

Getting the AI to move towards the player was handled with Unity's NavMesh pathfinding.
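A minimal sketch of how a NavMesh-driven chase can work: the enemy holds a NavMeshAgent and is constantly given the ship's position as its destination. The class and field names here are illustrative assumptions, not the project's actual code.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Illustrative sketch: drive the enemy toward the player's ship with a NavMeshAgent.
public class EnemyChase : MonoBehaviour
{
    public Transform target; // the blue ship

    private NavMeshAgent agent;

    void Start()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    void Update()
    {
        // Repath every frame; the agent routes around whatever is currently
        // blocking the NavMesh, such as doors the player has closed.
        agent.SetDestination(target.position);
    }
}
```

Doors can block or unblock paths at runtime by carving the NavMesh (for example with a NavMeshObstacle), so closing a door genuinely reroutes the enemy.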

I also wrote some scripts to make the checkpoints flash colours and blink depending on their status. This gives a visual indicator that the player has reached a checkpoint, and shows which checkpoint they need to go to next.


Click Image to view code


The function runs every frame and updates the blink duration, a timer that switches the light intensity on and off as the relevant checkpoints are reached.
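The timer logic can be sketched roughly like this. It is an assumption of how such a blink could be written, with placeholder names and values rather than the code from the image.

```csharp
using UnityEngine;

// Illustrative sketch: a checkpoint light that blinks on a simple countdown timer.
public class CheckpointLight : MonoBehaviour
{
    public Light statusLight;
    public float blinkDuration = 0.5f; // seconds per on/off phase
    public bool reached;               // set true when the player hits this checkpoint

    private float timer;

    void Update()
    {
        if (!reached)
        {
            statusLight.intensity = 0f;
            return;
        }

        // Count down; each time the timer expires, toggle the light and restart it.
        timer -= Time.deltaTime;
        if (timer <= 0f)
        {
            statusLight.intensity = statusLight.intensity > 0f ? 0f : 2f;
            timer = blinkDuration;
        }
    }
}
```

Swapping the light's colour instead of (or as well as) its intensity gives the flashing-colour variant for the "next checkpoint" state.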

Magnet and UV

In this puzzle, my goal was to explore how a UV mechanic could work in VR. The original idea was to have UV goggles the player could wear. Due to the time restrictions, I instead made a small UV handheld monitor, which worked just as well. After making the UV monitor, the puzzle seemed a little too easy to complete, so I designed a small magnet puzzle and combined the two for more of a challenge.


The UV effect was created by giving the monitor its own camera and defining a layer that only that camera renders. Any object placed on that layer is hidden from the player's main view, which gives the monitor 'UV vision': the player can only see those objects through the monitor.
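The layer trick comes down to two culling masks. This sketch assumes a layer named "UVOnly" and my own class and field names; it is one way to set it up, not the project's exact code.

```csharp
using UnityEngine;

// Illustrative sketch of the 'UV vision' setup: the monitor's camera renders
// a dedicated layer that the player's main (headset) camera culls out.
public class UVMonitorSetup : MonoBehaviour
{
    public Camera mainCamera;    // the headset camera
    public Camera monitorCamera; // camera rendering to the handheld monitor's screen

    void Start()
    {
        int uvLayer = LayerMask.NameToLayer("UVOnly"); // assumed layer name

        // Hide the UV layer from the main camera...
        mainCamera.cullingMask &= ~(1 << uvLayer);
        // ...and make sure the monitor camera renders it.
        monitorCamera.cullingMask |= 1 << uvLayer;
    }
}
```

Any hidden clue then only needs to be assigned to the "UVOnly" layer to appear exclusively on the monitor's screen.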

Code for the magnet:

Click Image to view code

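As a rough idea of how a magnet pull can be implemented in Unity physics: query for nearby rigidbodies and accelerate them toward the magnet head. This is a hedged sketch with assumed names and tuning values, not the code from the image.

```csharp
using UnityEngine;

// Illustrative sketch: rigidbodies within range are drawn toward the magnet.
public class Magnet : MonoBehaviour
{
    public float strength = 10f; // pull acceleration
    public float range = 1.5f;   // metres

    void FixedUpdate()
    {
        // Find nearby colliders and pull any rigidbody toward the magnet.
        foreach (Collider hit in Physics.OverlapSphere(transform.position, range))
        {
            Rigidbody body = hit.attachedRigidbody;
            if (body == null) continue;

            Vector3 pull = (transform.position - body.position).normalized;
            body.AddForce(pull * strength, ForceMode.Acceleration);
        }
    }
}
```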

Planet puzzle

For this puzzle, my objective was to immerse the player by giving them control over the environment. When the puzzle is unlocked, the player is able to move the planets outside the play zone. It is the final step in completing the game: the player wins when the planets are aligned correctly.


This was possibly the most complex puzzle of them all, as it had many parts: line rendering, raycasting, rotation around a point, and quite a few C# features such as inheritance.
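The rotation-around-a-point and inheritance pieces can be sketched together: a base class owns the orbit maths, and each planet inherits it. Class names, fields, and the button-driven direction value are my own illustrative assumptions.

```csharp
using UnityEngine;

// Illustrative sketch: shared orbit behaviour in a base class, inherited per planet.
public abstract class OrbitingBody : MonoBehaviour
{
    public Transform pivot;       // the point the body orbits around
    public float orbitSpeed = 10f; // degrees per second

    protected void Orbit(float direction)
    {
        // Rotate this transform around the pivot on the world up axis.
        transform.RotateAround(pivot.position, Vector3.up,
                               direction * orbitSpeed * Time.deltaTime);
    }
}

public class Planet : OrbitingBody
{
    public float direction; // set by the control panel buttons: -1, 0, or +1

    void Update()
    {
        Orbit(direction);
    }
}
```

The win check then reduces to comparing each planet's angle around its pivot against a target alignment.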


Click Image to view code

This puzzle also affects the environment outside the play area.

Mechanics

After finishing these puzzles, I also wanted to experiment with realistic objects that could be used for things like simulations. VR simulations are used in many industries, such as training: they recreate real-world situations so trainees can learn without worrying about consequences, preparing them for the real thing.

I decided to make a couple of these simulations, as I found the idea of applying VR to real-world training problems intriguing.

Drill

Click Image to view code

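In outline, a drill like this can map the trigger pull of the holding controller to the bit's spin and a matching haptic rumble. This is a speculative sketch with assumed names and values, not the project's actual implementation.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: the held controller's trigger spins the drill bit
// and sends a proportional haptic impulse.
public class Drill : MonoBehaviour
{
    public Transform drillBit;
    public float maxSpinSpeed = 2000f;           // degrees per second at full trigger
    public XRNode holdingHand = XRNode.RightHand; // which controller is holding the drill

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(holdingHand);
        if (!device.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
            return;

        // Spin proportionally to the trigger pull, and rumble to match.
        drillBit.Rotate(Vector3.forward, trigger * maxSpinSpeed * Time.deltaTime);
        if (trigger > 0.01f)
            device.SendHapticImpulse(0, trigger * 0.5f, Time.deltaTime);
    }
}
```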

Keypad

Click Image to view code

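The keypad's logic boils down to appending a digit per button press and checking the entry against a code. The code value, method names, and log messages below are placeholders of my own, sketching one plausible shape for it.

```csharp
using UnityEngine;

// Illustrative sketch: buttons append digits; the entry is checked
// once it reaches the code's length, then cleared.
public class Keypad : MonoBehaviour
{
    public string correctCode = "1234"; // assumed code
    private string entry = "";

    // Called by each physical button's press event, passing its digit.
    public void PressDigit(string digit)
    {
        entry += digit;
        if (entry.Length < correctCode.Length) return;

        if (entry == correctCode)
            Debug.Log("Keypad unlocked");
        else
            Debug.Log("Wrong code");

        entry = "";
    }
}
```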

Overall, I am very happy with how this project turned out. I managed to do most of the exploration I had hoped to achieve while developing this VR project, and I learned new game features such as AI. There are a few things I would have executed differently in some of the puzzles to implement interactions better. I would like to return to this project to polish the game, perhaps trying things like post-processing on the UV monitor. I would also change up the planet puzzle, letting the player grab holographic planets to move them around, rather than just pressing buttons on a panel.
