The immersion that the Oculus Rift can provide is pretty amazing; playing the same game in Virtual Reality (VR) can lead to a completely different experience. While testing some basic games I had made and some demos with the Rift, I noticed that I had to slide the hardware up a bit to get my hands onto the proper keys on the keyboard, and I had to do it a few times mid-game to reorient my fingers whenever they slid off the keys. It was such an annoyance that when it happened more than once, I just stopped playing.
I also knew that there was a big push in the VR industry to find a standard peripheral for VR, and I understood why. The quest to find a proper motion controller for my thesis game had begun!
Before I decided exactly what you would be able to do in the game, I wanted to test the motion controllers and peripherals that Academy of Art University had in its Game Lab, of which there were quite a few. I ended up testing these peripherals during my Research and Development class, which worked out well because I could get help from my professor and gameplay tests from the other students in the class. I had already set up the Rift to work well within Unity, so I could focus on getting a motion controller to work smoothly and fluidly.
I began with the Kinect for Windows. Before I could play around with it, I first had to integrate the Kinect with Unity. I downloaded the appropriate plugins and APIs and, after some time, got Unity and the Kinect working together. But when I played around with it, I noticed a few key issues that made me look for another solution:
The bones of my avatar’s legs or arms would sometimes snap to random orientations, even when I stayed in full view at all times
There wasn’t much range when moving toward or away from the camera, since I had to be in full view AND my limbs needed to be clearly visible for the camera to recognize them as my legs and arms
My avatar’s arms jumbled up whenever I placed my hands within my chest/head silhouette
Next I tried the Leap Motion. I really liked how accurately it perceived my hands and fingers, and I enjoyed some of the demos. I set it up to work within Unity and began my barrage of tests. Again, it wasn’t satisfactory, and here’s why:
The range was ridiculously small. The Leap Motion was mounted on the front of the Rift, pointed slightly down. My idea was to have it track my hands at around chest level to simulate actions taken during the game, but its range could not reach that far, and when it did pick something up, the tracking was not accurate
While it tracked all three axes fairly well, the Leap Motion is better suited to capturing finger movements for more delicate tasks
From these two peripherals, I learned that I wanted something that tracked position in all three axes but would still feel somewhat familiar to people. Looking online, I saw two gaming peripherals being worked on, the STEM System and the Razer Hydra. I really wanted to try them out, but the school did not have either, and I did not want to spend a lot of money or wait for a preorder to ship. So I turned to the Wii Remote and the PlayStation Move motion controllers.
The Wii Remote (Wiimote) was pretty darn accurate once everything was integrated into Unity; it captured the controller’s orientation well. However, to capture its position in all three axes, I had to use an infrared sensor bar. There was no spare Wii sensor bar, and I wasn’t sure I should spend out-of-pocket money to buy one and deal with the hacks needed to make it work on the PC. So I decided to test the PlayStation Move motion controllers first.
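To illustrate why the sensor bar matters for position: the Wiimote’s built-in IR camera sees the bar’s two LED clusters, and the distance to the bar can be estimated from how far apart those two dots appear in the image, using a simple pinhole-camera model. A minimal sketch of that idea (the bar width and focal length here are illustrative assumptions, not measured values):

```python
# Hypothetical sketch of how an IR sensor bar enables distance estimation.
# Assumptions: the bar's two LED clusters are ~0.20 m apart, and the
# Wiimote's IR camera is modeled as a pinhole camera with an assumed
# focal length in pixels (both values are illustrative).

BAR_WIDTH_M = 0.20   # assumed spacing between the bar's LED clusters
FOCAL_PX = 1280.0    # assumed focal length of the IR camera, in pixels

def estimate_distance(px_separation: float) -> float:
    """Estimate camera-to-bar distance in meters from the pixel gap
    between the two IR dots, via similar triangles:
    distance = real_width * focal_length / pixel_width."""
    if px_separation <= 0:
        raise ValueError("both IR dots must be visible and distinct")
    return BAR_WIDTH_M * FOCAL_PX / px_separation

# If the two dots appear 128 px apart, the bar is about 2 m away:
print(estimate_distance(128.0))  # -> 2.0
```

The horizontal and vertical axes come almost for free from where the dot pair sits in the image; it is the depth axis that needs this triangulation, which is why the Wiimote alone (orientation from its accelerometer) could not give me full 3-axis position.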
This time, I was able to secure an entire PS3 and an Eye camera along with the motion controllers. I was a little happy because I now had pretty much all the necessary hardware, but a little sad because it was a lot more hardware. A Google search turned up hacks to get the PlayStation Move controllers working directly on the PC, but they were super finicky and misbehaved half the time. In light of this, I decided to use all the hardware to at least test things out. I had to buy the Move.me software for the PS3 just to do that, but I had seen videos of the software in action and was amazed by what I saw, so I bought it in good faith.
Once it was integrated into Unity and the testing began, I was surprised at how smooth it all was! The controller’s orientation tracked cleanly as I rotated it, its position in virtual space was pretty much spot on, and it even worked quite well outside the camera’s field of view. I was immediately sold on using the PS3’s motion controller. I connected the Rift, tried out some basic movement in a demo, and realized how much more fun it was to swing my arms around and rotate my hands than it was to use the standard keyboard-and-mouse combo. But that got me thinking: what if I were to use a gamepad, like Microsoft’s Xbox controller, instead of these motion controllers? After some pondering and more testing, I felt that motion controllers were the way to go. They added more to the immersion than a standard controller could.
Now that I had my motion controllers and a working Unity file with both Oculus and PlayStation Move integration, I could begin my project by figuring out a movement system that would feel smooth and intuitive. One step at a time.