I participated in the Boston VR jam this past weekend, had an awesome time making some new VR stuff, and had my mind completely melted by the Valve hardware and demos. We’ve had a few posts here about it already, but some more pics, thoughts and details here:
Pics of the Valve HMDs: http://imgur.com/a/jSdgb#0
TL;DR: The most interesting thing I heard all weekend was: after I tried the demos, I was asking about The Room (Valve’s Holodeck setup with AR trackers on the walls), and asked if it would be viable to do edge detection in an arbitrary space to get inside-out (cameras on the HMD looking out) positional tracking, and the guy from Valve said “no, that’s not fast enough; we have a different solution already.” My speculation has run dry – what do you guys think it could be that doesn’t involve trackers on the wall, edge detection, or external cameras? And I’m assuming it’s not STEM or a similar sensor, since they’re not as accurate as cameras, and Valve & Oculus are looking for the least-cumbersome experience.
Thoughts on the Valve hardware & demos: We ran our jam project on the hardware a number of times, and also each got to run through Valve’s official demos. The demos were the same ones they run in “The Room”: a 3D grid of cubes showing webpages, a tiny office of the 2D Portal people, the room full of pipes, a room with three of the playable robot characters from Portal 2 (one that’s your size, one tiny, one huge), one where you examine a complex animated robot, and one where particles are constantly created a couple of feet in front of your face (thousands of serious DX11 compute particles with complex motion). All were very impressive; everyone I talked to agreed that the office full of Portal people was the most interesting: you really felt like a giant, and being able to bend down and hang out among them was very cool.
Talking with the Valve guys about that Portal office experience, they mentioned offhandedly that they have a Dota 2 VR experience where you see the entire game arena sitting on a table in front of you and can bend down to inspect any piece of the action. I really, really want to see that. They also mentioned a life-size Dota 2 VR experience where you’re hanging out in a lane watching the heroes fight; they said it was very scary.
Needless to say, the experience in the HMD is amazing: low persistence, perfect tracking (within the camera’s view, of course), very high frame rate. I don’t get sim sickness with the DK1 as it is, but nonetheless felt much more comfortable in the Valve units. However, I did consistently have major disorientation after leaving the HMD: I felt a little fuzzy and distant, and once felt like I was going to fall over. I felt something similar the very first time I came out of the DK1 and never since, but it happened every time I left the Valve units (4 or 5 times).
The HMDs are dual vertical S4 screens, running a total of 2160x1280, with white IR-reflective dots on the shell. As you can see in the pictures, the bottom ~half of each screen sticks out below the faceplate, so clearly you’re only seeing about half of it.
This is not a change of plans from The Room (as I saw someone speculate in another thread) – the Valve guys said that they each have one of these units on their desks for convenient VR testing, and then they load it up in the Room for primetime.
We used a Unity plugin from Valve which is interoperable with the DK1, DK2 (apparently), and the Valve hardware. The biggest end-user difference my team noticed between this plugin and the Oculus Unity plugin is that the Valve plugin creates the stereo camera setup at runtime, so attaching GameObjects to the camera, using its forward vector, etc. is slightly less trivial.
No DK2s or Oculus folks to be had – word on the grapevine is that they decided they couldn’t afford the time off, as they’re busting ass getting the DK2 to the rest of us.
There’s a Boston VR meetup tonight where people will be seeing the Valve hardware and jam games; we’ll report back with any updates.