Oculus Rift viewer released for all grids

CtrlAltStudio released a viewer yesterday that can be used to access Second Life and OpenSim grids with an Oculus Rift.

This is a preview viewer that can be used to walk and look around inside the virtual world, but it is missing user interface functionality. It works in both standard mode and 3D Oculus Rift mode, so users can switch back and forth as needed.

“It feels pretty good,” said CtrlAltStudio’s David Rowe, an independent virtual worlds developer. “People were grinning ear-to-ear when they tried it out today and yesterday.”

He has visited both Second Life and OpenSim with the viewer, which is based on the Firestorm code base.

However, there are no plans at this stage to develop a full user interface, he told Hypergrid Business.

“It depends on Linden Lab,” he said. “They’ve got it in the pipeline, but I don’t know when it comes out. It might have a good virtual worlds user interface, with the mouse and keyboard sorted out.”

If Rowe were to build his own interface, it would become obsolete as soon as Linden Lab rolls out its own, he said.

In July, Linden Lab CEO Rod Humble told All Things Digital that this could be in “late summer.”

Second Life’s Ahern welcome area as seen with the Oculus Rift. (Image courtesy David Rowe.)

The Oculus Rift requires a different approach to user interfaces than what you have in a standard viewer.

For example, anything at the edges of the screen will be outside the field of view. In addition, static text looks like it hangs in the air in front of the user, blocking the view of what’s behind it, and is also hard to focus on. Even when located at a comfortable focusing distance in front of the avatar, it may be difficult to read.

In addition, because the users are wearing a headset, they can’t see their keyboard. Asking them to use arrow keys to move around is simple enough, but if they need to type, or enter keyboard shortcuts, it can be hard to do blind.

Finally, some interface designs have already been found to increase nausea in users, such as static screens or cut scenes, which don’t move when users move their heads. Similarly, going instantaneously from standing to walking speed to running speed — without a period of acceleration in between — can cause problems for users.

Jimmy Fallon puts on Oculus Rift headset, says: “What is going on! Wow! Holy mackerel! Whoa!”

Touching objects is another issue. The CtrlAltStudio viewer uses the arrow keys to move, and since the Oculus Rift first-person “Riftlook” view is based on mouselook, moving the mouse changes the viewing direction.

“And you can’t actually see a mouse cursor to move or click on anything with at present,” he added.

Switching to a third-person view causes the viewpoint to hang in the air above the avatar’s head, creating an out-of-body experience. That isn’t stopping early testers, however, said Rowe.

“People seem to like using the third-person view,” he said. “It doesn’t work very well, so I’ll have a look at that, stop it from crashing so much.”

Over the course of this coming week he plans to iron out a few other issues that cause problems with teleporting and with windowed mode — users are currently recommended to use full-screen mode.

There is also a noticeable “screendoor” issue, he said. He has an early model of the Oculus Rift, not the most recent high-definition one, so the graphics aren’t as good as they will be when the consumer version finally comes out.

Rowe said he began working on Oculus Rift support four weeks ago.

“I knew there were a bunch of people impatient to try it out, so why not?” he said. “And it’s something I wanted to try out for my own sake. It’s all very well thinking and talking about what it would be like, but it’s nothing like trying it out for real.”

Oculus VR hasn’t yet announced a launch date for the consumer version of the headset, but developer kits are already available for $300, with an estimated 20,000 kits already shipped.

Maria Korolov

Maria Korolov is editor and publisher of Hypergrid Business. She has been a journalist for more than twenty years and has worked for the Chicago Tribune, Reuters, and Computerworld, and has reported from over a dozen countries, including Russia and China. Follow her on Twitter @MariaKorolov.

22 Responses

  1. Now I’ll have to fork over the $300 for the Dev Kit. I was planning to wait until the third-gen consumer version came out.

    Drat. 🙂

    • Sarge Misfit says:

      Me too! Good thing I got a couple of years worth of back tax refunds to collect. 😀

      However, isn’t the Rift still being developed? I’ll likely wait until next year, if that’s the case. And it’s likely to be a bit smaller and lighter by then, too.

      And a bit of good news with this info, Maria. I had thought that it would take some work on the OpenSim server software, as well as the viewers, to make the Rift work properly, but this shows that it is actually do-able on the viewer side.

      • I just checked eBay’s listings to see if anyone was getting rid of their Dev Kit at a discount, and for some reason a bunch of people are selling them at $400, $500, $600 and up.

        Am I missing something? Why would you pay more on eBay for a kit from some stranger, when you can buy it from the official site for $300?

        But yes, it is totally still being developed. That’s why I was planning to wait. But waiting is soooooo hard… Especially when you can go INSIDE your virtual world. Arrgh.

        • Gaga says:

          I have seen this before on eBay, people speculate that something new and different will sell for more. And, believe it or not, there are people who are eager to buy. They are either stupid or lazy with more money than sense.

        • Minethere says:

          why? ’cause some ppl will do anything to make a buck, and, as the sayin’ goes, wrongly attributed to P. T. Barnum, “there is a sucker born everyday”-)))

  2. David just sent me a link to a YouTube video of Oculus Rift being used in Second Life: http://www.youtube.com/watch?v=__TZHIe3yCU&feature=youtu.be

    And here’s another one:


    In the second one, when the avatar walks, you can see the shoulders when the camera swings down and to the side — very cool! Must make you feel like you really have a virtual body.

  3. Paul says:

    I am really interested in using 3D tech with virtual worlds; I think it will greatly enhance their immersion. I am currently trying out the Emotiv EPOC EEG headset to control an avatar in virtual worlds (and having quite good success).

    For people who have a disability that makes normal mobility difficult, or even just limits their use of computers, a combination of the Oculus Rift and the Emotiv headsets could help people in these situations gain a sense of freedom.

    These technologies, although cool gadgets in their own right, could offer so much for so many people with disabilities.

    • KeithSelmes says:

      Mobility problems and manual dexterity problems are to me the most compelling reasons for developing 3D technology and this type of hardware. It’s probably good as entertainment, but it has some very real practical uses too.

      • Paul says:

        As someone with a disability myself (fortunately not one that prevents me from accessing and using virtual worlds too much), these new systems are looking really useful.

        Also, I have found that virtual worlds have helped me deal with the pain that I experience from my disability; it provides a distraction from the pain and allows me to escape it for a while.

    It is because of these experiences that I want to encourage and enable people to access VW for therapeutic means, and these kinds of technology help people do so.

  4. Minethere says:

    I did note this comment, “In addition, because the users are wearing a headset, they can’t see their keyboard. Asking them to use arrow keys to move around is simple enough, but if they need to type, or enter keyboard shortcuts, it can be hard to do blind.”

    We used to have typing classes in public schooling where ppl could learn to type without looking at the keyboard, they called it “typing by touch”…is that a lost art????

    Does anybody do 10-key by touch anymore either??? Have I gone the way of dinosaurs???

    • KeithSelmes says:

      The touch typists I know are all fairly mature ladies now. They can be a problem, as they don’t ask for a new keyboard when the letters wear off, so no-one else can use their computer. With kit like the Oculus, I have wondered if you can drink your coffee and eat your snacks while wearing it. And write notes on your real paper pad etc. So it must be superb for some uses, but it may be premature to get too excited about general use.

      • Minethere says:

        ah, i just noticed my M and N keys are worn out…gosh…and, mature????

        Ok, now I have to wonder why my M and N keys are the only ones worn out, and why did you have to point this out!!

        [goes back to position by feeling the J and F key…]

        • Savino van Meirhaeghe says:

          Simple: F and J are the neutral (home) keys, whether you have an AZERTY or QWERTY keyboard. I know both QWERTY and AZERTY, and I can write this text looking only at the screen; I never turn my face to the keyboard. I looked at your picture and your name, Minethere, then went back to my text. As you can see, I can write text even at night, no problem for me…

        • KeithSelmes says:

          I’ve a feeling if I try to explain “mature” I’ll only dig myself in deeper.

      • Paul says:

        A solution could be to use Augmented Reality when someone wants to type.

        A camera (or two) set on the headset could record the real world outside the headset, then when someone wants to interact with the real world (type, grab snacks, or just move around) it could overlay, or replace the virtual world with the real world.

        Another solution would be to project a virtual keyboard into the virtual world and then use a Leap Motion type device to detect where the user’s fingers and hands are, matching them to the virtual keyboard in world. They would effectively be typing on the virtual keyboard with their real hands.

        Lastly, you could use voice recognition to allow a user to speak the words they want.

        Each system has its limitations and its pros and cons. However, the least likely to be used will be the ones that put the most strain on the user.

  5. Joe Builder says:

    Tunnel vision, no thank you. Mouselook viewing, no thank you. Binocular viewing, no thank you. Too expensive, no thank you. As people will see, in a virtual world this won’t cut the cheese. No thank you.

    • Can you possibly do anything else other than complain and protest about everything that isn’t SL? If you don’t like Kitely, fine…you’re in the minority. Every time you post it’s just constant draining bs complaining. Most all of the time, you’re completely wrong.

      Bet you’re a riot at parties (that is, if anyone even invites you).

  6. James Corbett says:

    I’m a touch typist too but I’m always surprised by how many people, who work with computers every day, have neglected to learn the skill. So I think what’s needed here is for the viewer to borrow a trick from Oliver Kreylos’ amazing Oculus Rift and VRUI toolkit to project the real keyboard into the virtual world (using a Microsoft Kinect). And finally the typing animation in Second Life will actually make sense 🙂 http://www.youtube.com/watch?v=2MN3FHrQUa4&feature=youtu.be&t=2m59s

    • I’m a touch typist too, but generally need to glance down to find control and function keys — I don’t use them frequently, and they’re in a different place on every keyboard, besides.

      I have a feeling that many people actually are touch typists but don’t know it. If they were forced to spend a few hours typing without being able to see the keyboard, just the screen and the text they were copying, they might soon find out that their fingers know the keyboard pretty well. Especially if they’ve been typing for decades.

      Though I can’t speak for the one-finger typists. You might really have to see the keyboard to do that!

      I think that voice recognition is quickly getting to the point where we can just dictate — my iPhone is pretty good at it already, and I routinely use it to send text messages. It makes mistakes, but then, I make mistakes while typing, too. And the technology just keeps getting better all the time.

      All they need to do is get rid of the control functions in the user interface, and we’ll be ready for full immersion. I’d love to build with voice commands, with waving my hands around, and with an in-world tablet for everything else.

      • Sebastian says:

        I’m a one-finger typist. It’s a bad habit, but changing it is like trying to learn touch typing from the beginning.

  7. Heavy Hitter says:

    I tried this thing oh yes it is popcorn