The IEEE Virtual Reality Conference 2012 was a swinging success in OC this past week, and I had an absolute blast – all while learning about the cutting edge in virtual, augmented, and mixed reality, and meeting a ton of incredibly talented people. I was graciously awarded passes to the conference by Mark Bolas of ICT’s MxR Lab in order to help showcase their latest off-the-shelf virtual reality system, FOV2GO. Pictures below:
If you have an iPhone or an Android, then you can experience visually immersive VR pretty much anywhere. This is the basic premise of FOV2GO, and it was a runaway hit at IEEE, winning the Best Demo Award, and hundreds of kits were given out for conference-goers to test out for themselves. The hardware consists of an inexpensive head-mounted display made of foamcore, two lenses, and a smartphone. Here’s a quick video of putting it together:
For showcasing FOV2GO, I created a mobile version of the Shayd VR project that is going up in May at the IMD Graduate Thesis Showcase. While the full installation of Shayd encompasses an entire motion capture stage and wide-FOV head-mounted display, Shayd Mobile is much simpler, utilizing the FOV2GO stereoscopic Unity package developed by Perry Hoberman. Here’s a video of Shayd Mobile in action:
The software package, compatible with Unity 3.5 with an iOS / Android license, includes numerous scripts for incorporating stereoscopy into any Unity project. You can very easily implement side-by-side stereoscopy, or anaglyph stereoscopy for use with the classic red-cyan glasses. Several other scripts utilize the iPhone’s built-in gyroscope for tracking the user’s first-person viewpoint, along with camera movement scripts and plenty more. Check out the full documentation on the MxR website if you want to dive right in – it’s comprehensive and user-friendly, even for the newest Unity developer. So check it out!
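To give a sense of what a side-by-side stereoscopy script is doing under the hood, here's a minimal conceptual sketch in Python (this is my own illustration, not the actual FOV2GO package code, and `eye_positions`, the IPD value, and the yaw convention are all assumptions): the scene is simply rendered twice, from two virtual cameras offset along the head's local right axis by half the interpupillary distance.

```python
# Conceptual sketch of side-by-side stereo camera placement
# (hypothetical helper, NOT the FOV2GO/MxR package's API).
import math

IPD = 0.064  # typical adult interpupillary distance in meters (assumed)

def eye_positions(head_pos, yaw_radians, ipd=IPD):
    """Return (left_eye, right_eye) positions as (x, y, z) tuples.

    Eyes are offset along the head's local right axis; for a yaw-only
    orientation in a y-up frame, that axis is (cos(yaw), 0, -sin(yaw)).
    One camera renders to the left half of the screen, one to the right.
    """
    rx, rz = math.cos(yaw_radians), -math.sin(yaw_radians)
    hx, hy, hz = head_pos
    half = ipd / 2.0
    left = (hx - rx * half, hy, hz - rz * half)
    right = (hx + rx * half, hy, hz + rz * half)
    return left, right

# Facing straight ahead (yaw = 0), the eyes straddle the head along x:
left, right = eye_positions((0.0, 1.7, 0.0), 0.0)
```

In the real package the same idea plays out with two Unity cameras whose viewports each cover half the phone screen; the phone's gyroscope then drives the yaw (and pitch/roll) of the head each frame.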
As for the rest of the conference, there were a billion things going on! Above left are all the head-mounted displays we showcased at the FOV2GO booth, including versions for iPhone and Android, some with larger lenses for a higher field of view, a few Hasbro My3D devices, and the PR3 Head-Mounted Display. The picture to the right of that is The Elder Scrolls V: Skyrim being played solely with the Microsoft Kinect. Developed by Evan Suma, the experience was quite robust, with swiping motions for your sword, pushing motions to throw fireballs, and even head-butting to send out a wave of energy. It’s pretty goddamn sweet. On the top right, DVE Telepresence Labs were cool enough to take Nathan Burba (of the infamous Burba Blog), Palmer Luckey (of the still more infamous ModRetro), and myself over to their labs to check out their multi-million dollar systems for use by the US Military and select billionaires. Of course they took us in a limo, on the off chance that a wealthy potential client might be in attendance.
There was an abundance of HMDs and motion capture setups, which were so much fun to toy with. The head-mount developed by Motion Analysis, along with their reflective-marker motion capture system, had a horizontal field of view approaching 100 degrees (pictured above left). I got the chance to talk with Duane Hartley, President of Motion Analysis, and he told me how their system is capable of operating in outdoor natural environments. Compared to other passive-marker systems, such as Vicon, Motion Analysis has proven itself useful for outdoor use, for instance on the recent film Rise of the Planet of the Apes. However, the holy grail of Motion Analysis’ system is the software, and most people purchase it for that reason. The cheapest setup would be four cameras plus the software, around 25K. The cameras are worth about 5K each individually, comparable to the cost of a PhaseSpace setup. Speaking of PhaseSpace, the photo on the top right is their latest augmented reality HMD, being tested by Palmer.
Here are a few last pictures from IEEE – from left is an HMD developed by ART, a haptic device, and a close-up of the Motion Analysis motion capture cameras. Besides an HMD, Motion Analysis also had a trackable submachine gun for use in virtual soldier training and entertainment (pictured below). Of course the gun was fake, but reflective markers were placed all around the object, so the gun’s movement could be mirrored by the player’s virtual gun within the game. They also used CryENGINE 3, which made the experience look absolutely spectacular. With such a cutting-edge game engine, it’s no wonder that virtual combat training is so realistic and useful.