Oculus Rift Support

Software: Away3D 4.x

Avatar
mrpinc, Sr. Member
Posted: 26 September 2012 07:57 PM   Total Posts: 119

Wondering if this http://oculusvr.com/ is something that the Away3D team will be able to support, or is it something that Adobe will need to implement in Stage3D first?

   

laurent, Newbie
Posted: 28 September 2012 03:15 PM   Total Posts: 29   [ # 1 ]

I don’t think Away3D will be able to do anything to support it on its own. I suspect their SDK will require low-level access to OpenGL or DirectX.

This is the same limitation that we have today for stereo devices like NVIDIA 3D Vision.

 

   

Avatar
loth, Sr. Member
Posted: 03 October 2012 08:32 AM   Total Posts: 236   [ # 2 ]

Hah, my eye is not a camera.
I think my demo on three flat screens looks better. :)

   

Jonathan Hart, Newbie
Posted: 21 January 2013 08:01 AM   Total Posts: 16   [ # 3 ]

I don’t know if it will be possible, but Stage3D does support rendering to multiple contexts at the same time. I think the Oculus only requires the two eye views to be rendered side by side into a single buffer. I won’t know until my kit comes in the mail…
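To sketch the side-by-side idea (a quick Python illustration of the viewport math, not actual Stage3D code; the 1280x800 panel size is my assumption about the dev kit):

```python
# Illustrative sketch: splitting one output buffer into two
# side-by-side viewports, one per eye. The 1280x800 resolution
# is an assumed dev-kit panel size (640x800 per eye).

def split_viewports(width, height):
    """Return (left, right) viewport rects as (x, y, w, h)."""
    half = width // 2
    left = (0, 0, half, height)
    right = (half, 0, half, height)
    return left, right

left, right = split_viewports(1280, 800)
print(left)   # (0, 0, 640, 800)
print(right)  # (640, 0, 640, 800)
```

Each eye’s scene would then be rendered into its own rect of the same back buffer.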

I definitely hope so :) I don’t want to switch to Unity.

http://www.adobe.com/devnet/flashplayer/articles/how-stage3d-works.html

   

Bibek S, Newbie
Posted: 23 January 2013 03:34 AM   Total Posts: 3   [ # 4 ]

Hi all,

I just started a thread (including code) about SBS-rendering using Away3D-4.1Alpha.  Basically, I created SBSStereoRenderMethod.as and TopBottomStereoRenderMethod.as, for use in Away3D-4.1’s new stereoscopic framework.  Details at: http://away3d.com/forum/viewthread/3868/

I don’t know if that will work for the Oculus Rift, but I’ll be interested to hear about it from anyone who has one to test with.

I /suspect/ that the image will display, but be distorted.  I believe Oculus’s SDK exists partly to distort the images as necessary for the Rift’s lenses.  (The human eye only sees about a 40-degree field of view clearly; beyond that it’s basically peripheral vision.  The Oculus Rift presents a 90-degree FOV, which fills much of your visual field and should make for /very/ immersive 3D, but your program needs to account for the fact that people are seeing things peripherally rather than directly.)
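To illustrate the kind of correction I mean (a hedged Python sketch of a generic radial “barrel” warp, the sort of thing a per-eye fragment shader would compute; the k1/k2 coefficients are made-up placeholders, not Oculus values):

```python
# Hedged sketch of a radial barrel-distortion correction. A real
# warp shader would run this per fragment; the coefficients k1/k2
# here are illustrative placeholders, not real lens parameters.

def warp(x, y, k1=0.22, k2=0.24):
    """Map an undistorted texture coordinate (centered at 0,0)
    to its distorted sample position."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The center of the image is untouched...
print(warp(0.0, 0.0))  # (0.0, 0.0)
# ...while points farther from the center are pushed outward,
# counteracting the pincushion effect of the lenses.
```

The point is just that the displacement grows with distance from the lens center, which is why a plain side-by-side render looks wrong through the optics.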

Anyway, if you do work with the Rift, this should simplify things.  I hope it helps.  Let me know how it goes.

- Bibek

   

Jonathan Hart, Newbie
Posted: 26 January 2013 01:06 AM   Total Posts: 16   [ # 5 ]

Here’s some more information:

http://en.wikipedia.org/wiki/Oculus_Rift

The video output will be DVI/HDMI, so as long as we match the screen resolution and split the field of view into two viewports correctly, we’ll be able to send output to the Oculus no problem.

The tricky part is the tracking data that the Oculus Rift sends us. It arrives over USB, which means we need a way to read the USB data and pipe it into our AIR or Flash application.

This recipe describes doing so by way of a socket server proxy:
http://cookbooks.adobe.com/post_Accessing_the_USB_port_-16555.html

I don’t like that idea, though. If the application is AIR-only, we can use NativeProcess to route the data without it going through the network stack. Anyone want to help me with that one? :)

   

Bibek S, Newbie
Posted: 26 January 2013 03:30 AM   Total Posts: 3   [ # 6 ]

Thanks for the info.  It looks like the Wikipedia page has been updated since I last saw it.

Anyway: splitting the FOV will be interesting, because the two eyes’ views don’t completely overlap, while most stereoscopic-3D software is designed for a complete overlap.
I may explore this with my SBSStereoRenderMethod.  I have figured out how to address the gap between sides, and even though I don’t completely understand /why/ the fix works yet, the same principle should also address the overlap issue you mention.  In theory.  (That said, it’s a slightly lower priority for me, as I don’t have a Rift yet.  But I may just make the parameters configurable.)
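To make the overlap point concrete (a hedged Python sketch; both the formula and the numbers are illustrative assumptions, not real headset specs): each eye’s projection center should sit on its lens axis, which is generally not the center of that eye’s half of the screen.

```python
# Hedged sketch of why the split is not a simple 50/50 overlap:
# each eye's projection center sits at its lens axis, offset from
# the center of that eye's half-viewport. Measurements below are
# illustrative placeholders, not real headset values.

def projection_center_offset(screen_width_m, lens_separation_m):
    """Horizontal offset of the lens axis from the center of one
    eye's half-viewport, in normalized [-1, 1] viewport units."""
    half_screen = screen_width_m / 2.0
    # distance from the half-viewport center to the lens axis
    offset_m = half_screen / 2.0 - lens_separation_m / 2.0
    return 4.0 * offset_m / screen_width_m  # normalize to [-1, 1]

# Only if the lenses were exactly half a screen apart would the
# offset vanish and a plain SBS split line up:
print(projection_center_offset(0.15, 0.075))  # 0.0
```

When the offset is nonzero, each eye needs an asymmetric (off-center) projection, which is exactly the kind of parameter I’d make configurable.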

As for the tracking data: NativeProcess doesn’t work on mobile.  If your code is intended to be desktop-only, I suppose that’s fine.  The Wikipedia page for the Rift indicates that it targets desktop and Android devices, though, so you’d be limiting yourself.

An alternative to either NativeProcess or the network stack would be to create a Native Extension (ANE) that talks to USB.  Take a look at: http://code.google.com/p/air-hid-usb/ .  The AIR version is for Windows and OS X, but the “hidapi” library that it wraps is available for Windows, OS X, and Linux (http://www.signal11.us/oss/hidapi/).  Android runs on top of a basically-Linux kernel, and iOS is related to OS X and UNIX, so it should be possible to extend air-hid-usb to support both Android and iOS.  Then you’d have /direct/ access to the tracking info via AIR. :)

(edit: removed angle numbers about overlap, since I may have had the numbers wrong.)

   

Jonathan Hart, Newbie
Posted: 26 January 2013 08:46 PM   Total Posts: 16   [ # 7 ]

I had thought of ANEs but didn’t mention them, because I figure there aren’t too many phones out there with both the USB and DVI/HDMI ports needed to interface with the Rift. But yes, that would solve the mobile problem :)

The extendedDesktop profile DOES support ANEs, according to this link:
http://help.adobe.com/en_US/air/build/WS597e5dadb9cc1e0253f7d2fc1311b491071-8000.html

So that would indeed make ANE a more versatile solution. Good looking out!

   

Jonathan Hart, Newbie
Posted: 26 January 2013 08:48 PM   Total Posts: 16   [ # 8 ]

I’d be happy to help work on fragment shaders for warping the peripheral edges. I have some experience with that.

   

mth4, Newbie
Posted: 14 February 2013 09:15 AM   Total Posts: 15   [ # 9 ]

Any updates guys?

   

Jonathan Hart, Newbie
Posted: 26 March 2013 12:11 AM   Total Posts: 16   [ # 10 ]

None yet, not until I can get my hands on the dev kit!

   

Jonathan Hart, Newbie
Posted: 29 March 2013 09:26 PM   Total Posts: 16   [ # 11 ]

The developer portal launched today, and the SDK is available for Unity and Windows.

Unfortunately, I don’t have a Windows machine, so I will have to hold out for OS X support.

developer.oculusvr.com

   

Avatar
mrpinc, Sr. Member
Posted: 01 April 2013 06:10 PM   Total Posts: 119   [ # 12 ]

I’ve started a thread on the official forums looking for anyone interested in helping bring support to AIR / Away3D. I think it’s a bit beyond my skill set, but I am willing to help any way I can.

https://developer.oculusvr.com/forums/viewtopic.php?f=39&t=155

   

Jonathan Hart, Newbie
Posted: 28 May 2013 03:13 AM   Total Posts: 16   [ # 13 ]

Good news, people. I got my kit on Friday and have also started a GitHub project for the Native Extension that will connect the tracking system to AIR desktop apps.

http://github.com/jonathanhart/oculus-ane

The project is coming along nicely and I’ll probably have a functional version completed later tonight.

The next step will be getting the rendering in Away3D to correspond correctly to the Rift’s specs.

   

Jonathan Hart, Newbie
Posted: 28 May 2013 03:47 AM   Total Posts: 16   [ # 14 ]

The first stable version is now up on the GitHub repo. It includes a demo Flash Builder project that demonstrates how to read the camera quaternion.

PS: Mac OS X only for now!

   

Avatar
mrpinc, Sr. Member
Posted: 28 May 2013 06:25 PM   Total Posts: 119   [ # 15 ]

Wow very impressive - now I just need my Rift to try it out.

   
   

Away3D Forum
