|
Fragilem17, Newbie
Posted: 30 May 2013 01:11 PM Total Posts: 15
[ # 16 ]
Hi Jonathan,
I found this thread on fisheye warping:
http://away3d.com/forum/viewthread/3198/
but I don't think it's the path we want to go down.
-
You talk about using a shader to do the warping (I have no experience with shaders and barely understand what they do). How would that work? Do you use a normal camera with the FOV set to 120 and then use a shader to "push" the final pixels from the camera to a more suitable location on the screen to account for the curving? Will everything still be pixel sharp then?
I don't know if my question makes sense.
To illustrate a bit better, take a look at this image:
http://strlen.com/gfxengine/fisheyequake/compare.html
On the left you have the 120 FOV from a normal camera; notice how the middle (interesting) part of the screen is far away. The shader would then do its operation on those pixels to stretch them into the correct location for the Rift, but stuff will scale and stretch then, won't it?
Or am I completely wrong in my understanding of shaders, and is it really something that tells the camera to interpret the scene using some defined curvature, so that each pixel comes out of the camera in the correct location?
I was reading up on this stuff yesterday until I fell asleep. I feel like such a noob diving into this, but I so want to make it work!
I suggest maybe reading this post (link below) from the Oculus forums. It starts getting interesting somewhere in the middle, where GeekMaster talks about "No post-render warp blurring effects there" and "The best approach is to pre-warp with a fisheye lens on your in-game camera":
https://developer.oculusvr.com/forums/viewtopic.php?f=26&t=1232&p=13956&hilit=habit#p13956
|
beers, Member
Posted: 30 May 2013 02:49 PM Total Posts: 53
[ # 17 ]
I’d love to know how the away3d fisheye shader (or projection matrix?) works as well
beers
|
Jonathan Hart, Newbie
Posted: 31 May 2013 12:50 AM Total Posts: 16
[ # 18 ]
Phew, several different things going on here!
The link about baking the lens into the raytracing is not applicable for us, because that is a form of software rendering we don't use in Stage3D.
As for warping the vertices instead of the pixels, it's interesting that the effect could also be accomplished in the projection matrix or a vertex shader. That would definitely perform better than doing a per-pixel fragment shader on two separate render textures (one for each eye).
However, I do know that the fragment shader would be more accurate, because projection matrices act on the vertices, not the texture coordinates.
The lens would distort the screen and you'd have shapes that appear correct, but the textures on them would look odd, because the rendering does the warping in euclidean space to correct for lenses that do their warping in screen space.
The fragment shader acts in screen space, which makes it the most appropriate way. But if we can't get the 60 fps needed to make the Oculus look good in Flash/AIR, the vertex approach would be a good plan B.
You should be able to download the SDK from the Oculus site. I highly recommend reading their lengthy exploration of this concept in the documentation; the correct way to warp the screen is covered in extreme detail, and I'm hoping to provide AGAL that faithfully respects their guidelines.
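For anyone curious what that looks like in practice, here is a rough, untested sketch of the per-pixel warp in AGAL, following the distortion formula from the SDK overview (the fragment constant layout fc0..fc2, the scale factors and the register choices are my own assumptions, not anything official):
// per-eye fragment shader for a full-screen quad
// v0  = interpolated UV, fs0 = the eye's render texture
// fc0 = [lensCenterX, lensCenterY, scaleInX, scaleInY]
// fc1 = [scaleOutX, scaleOutY, 0, 0]
// fc2 = [K0, K1, K2, K3]  (the SDK's distortion coefficients)
var warpShader:String =
    "sub ft0.xy, v0.xy, fc0.xy \n" +  // theta = uv - lensCenter
    "mul ft0.xy, ft0.xy, fc0.zw \n" + // theta *= scaleIn
    "mul ft1.x, ft0.x, ft0.x \n" +
    "mul ft1.y, ft0.y, ft0.y \n" +
    "add ft1.x, ft1.x, ft1.y \n" +    // rSq = theta.x^2 + theta.y^2
    "mul ft1.y, ft1.x, ft1.x \n" +    // rSq^2
    "mul ft1.z, ft1.y, ft1.x \n" +    // rSq^3
    "mul ft2.x, fc2.y, ft1.x \n" +    // K1 * rSq
    "mul ft2.y, fc2.z, ft1.y \n" +    // K2 * rSq^2
    "mul ft2.z, fc2.w, ft1.z \n" +    // K3 * rSq^3
    "add ft2.x, ft2.x, ft2.y \n" +
    "add ft2.x, ft2.x, ft2.z \n" +
    "add ft2.x, ft2.x, fc2.x \n" +    // warp = K0 + K1*rSq + K2*rSq^2 + K3*rSq^3
    "mul ft0.xy, ft0.xy, ft2.xx \n" + // theta *= warp
    "mul ft0.xy, ft0.xy, fc1.xy \n" + // theta *= scaleOut
    "add ft0.xy, ft0.xy, fc0.xy \n" + // uv' = lensCenter + theta
    "tex ft3, ft0, fs0 <2d, linear, clamp> \n" + // sample the eye texture at the warped UV
    "mov oc, ft3";
Pixels that warp outside the texture just get clamped here; the SDK talks about choosing the scale so the warped image still fills each eye's half of the screen.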
|
Jonathan Hart, Newbie
Posted: 31 May 2013 07:13 AM Total Posts: 16
[ # 19 ]
I just checked the Away3D rendering integration into the repository.
It's functional: if you instantiate the native extension and then use the supplied classes for Away3D as your Camera, View and the View's RenderMethod, you will be able to use both rendering and tracking.
Some caveats:
1) Not optimized
2) No lens correction yet
3) No peripheral occlusion yet
To connect the Oculus via the ANE, add it in the Project Properties (remember to also check it off in the ActionScript Build Packaging section!)
Connect to the Oculus in AS3:
var oculus:OculusANE = new OculusANE();
Tell Away3D to render in Oculus mode:
_scene = new Scene3D();
_camera = new OculusCamera3D();
_camera.z = 0;
_view = new OculusStereoView3D();
_view.scene = _scene;
_view.camera = _camera;
_view.stereoRenderMethod = new OculusStereoRenderMethod();
addChild(_view);
In your main render loop, have the following code adjust the camera:
private function enterFrame(event:Event):void {
    // head orientation from the tracker, as a quaternion (x, y, z, w)
    var quatVec:Vector.<Number> = _oculus.getCameraQuaternion();
    var quat:Quaternion = new Quaternion(-quatVec[0], -quatVec[1], quatVec[2], quatVec[3]);
    // convert it to a rotation matrix and apply it to the camera
    _camera.transform = quat.toMatrix3D(_core.transform);
    _view.render();
}
I’ll clean it up, promise. Just wanted people to have it as soon as possible to play with..
|
mrpinc, Sr. Member
Posted: 31 May 2013 07:25 PM Total Posts: 119
[ # 20 ]
Terrific stuff Jonathan, hopefully I can get my Rift soon so I can help you out with this.
|
Fragilem17, Newbie
Posted: 31 May 2013 07:38 PM Total Posts: 15
[ # 21 ]
Jonathan Hart - 31 May 2013 12:50 AM
…because projection matrices act on the vertices, not the texture coordinates. The lens would distort the screen and you'd have shapes that appear correct, but the textures on them would look odd, because the rendering does the warping in euclidean space to correct for lenses that do their warping in screen space.
Damn, indeed, I didn't think of that! The vertices would be in the correct place, but straight lines would be drawn between them instead of curved ones. That would give weird effects.
Okay, so just normal rendering and then pixel pushing until it fits. Thanks for the explanation.
Great first release, testing right now!
Hopefully it's not much longer until I get my Rift (Belgium) and I can help with the Windows ANE.
|
Fragilem17, Newbie
Posted: 31 May 2013 10:19 PM Total Posts: 15
[ # 22 ]
Did some testing and got stuff to render, so yay!
A question: shouldn't stereoOffset = 0 result in a box being displayed at 1/4 and 3/4 of the screen? It now shows two half boxes at 2/4.
Both boxes should then also look exactly the same, I think.
|
Jonathan Hart, Newbie
Posted: 31 May 2013 10:35 PM Total Posts: 16
[ # 23 ]
Stereo offset is the difference between the left/right eye cameras. It's adjustable, and I checked in a value that looked pretty good to me. Definitely up for more calibration and tweaking, as only far-off skyspheres were in focus.
I think that the cameras are inverted too (right is left and left is right). I was trying to flip them last night but passed out from exhaustion.
|
Fragilem17, Newbie
Posted: 31 May 2013 10:42 PM Total Posts: 15
[ # 24 ]
Aaah, okay, got it. It's like IPD: the physical separation between your eyes is used to move both boxes to where your eyes would be looking in the Rift, and the change of perspective is then directly tied to that IPD value to make it physically correct. Got it now.
On my big laptop screen I just could not parallel-view the result, and I tried playing with the separation but could not figure out how it worked before; it makes sense now.
Going to bed. Damn time difference; on the other hand, who knows what new stuff I will wake up to.
|
Fragilem17, Newbie
Posted: 31 May 2013 10:45 PM Total Posts: 15
[ # 25 ]
The cameras are definitely inverted right now; cross-eye viewing works like a charm on the laptop screen (I've got a headache now, though).
|
Jonathan Hart, Newbie
Posted: 01 June 2013 01:41 AM Total Posts: 16
[ # 26 ]
Yeah, last night was rough for me. When I finally took the headset off I felt dizzy and disoriented.
|
Fragilem17, Newbie
Posted: 01 June 2013 10:26 PM Total Posts: 15
[ # 27 ]
After some tweets from Rob Bateman:
"you shouldn't need to use the stereo view, that's only if you want to use those red & green glasses. use two views side by side"
"yes, to achieve the stereoscopic render you only need to render two views side by side with the same scene and different cameras"
and also:
"make sure you use a shared stage3dproxy otherwise you will double the amount of GPU memory used"
I've committed a fork on GitHub with a FlashDevelop project in the demos folder that just uses two views and two cameras, and it does the job (I haven't figured out the shared stage3dproxy yet, though): https://github.com/Fragilem17/oculus-ane
Is there any disadvantage to doing it this way instead of the OculusStereoRenderMethod you posted, Jonathan? I'm guessing performance has something to do with it.
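In case it's useful, this is roughly how I understand the shared stage3dproxy setup would look with two views. It's an untested sketch based on the Away3D 4.x shared Stage3D examples; Stage3DManager, Stage3DProxy, Stage3DEvent and View3D are the usual Away3D classes, the rest of the names are just mine:
// in your setup code: grab a Stage3D proxy that both views will share
var stage3DManager:Stage3DManager = Stage3DManager.getInstance(stage);
var stage3DProxy:Stage3DProxy = stage3DManager.getFreeStage3DProxy();
stage3DProxy.addEventListener(Stage3DEvent.CONTEXT3D_CREATED, onContextCreated);

private function onContextCreated(event:Stage3DEvent):void {
    _leftView = new View3D(_scene, _leftCamera);
    _leftView.stage3DProxy = stage3DProxy;
    _leftView.shareContext = true;           // the view no longer owns the context
    _leftView.width = stage.stageWidth / 2;
    _leftView.height = stage.stageHeight;

    _rightView = new View3D(_scene, _rightCamera);
    _rightView.stage3DProxy = stage3DProxy;
    _rightView.shareContext = true;
    _rightView.width = stage.stageWidth / 2;
    _rightView.height = stage.stageHeight;
    _rightView.x = stage.stageWidth / 2;     // right half of the screen

    addChild(_leftView);
    addChild(_rightView);
    addEventListener(Event.ENTER_FRAME, enterFrame);
}

private function enterFrame(event:Event):void {
    // with shareContext the proxy does the clear/present, the views only draw
    stage3DProxy.clear();
    _leftView.render();
    _rightView.render();
    stage3DProxy.present();
}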
|
Jonathan Hart, Newbie
Posted: 02 June 2013 07:39 PM Total Posts: 16
[ # 28 ]
Thanks for all the additions!! I will take a look and approve the pull request very soon. I have been taking a break this weekend but will be back to work this evening.
|
Fragilem17, Newbie
Posted: 03 June 2013 03:38 PM Total Posts: 15
[ # 29 ]
Having re-read the Oculus_SDK_Overview.pdf section about doing the barrel distortion, I now understand we need to finish the ANE first, with the HMDInfo and StereoConfig classes exposed to AS3.
In order to know the correct FOV, we need to get the screen size, IPD, screen distance and so on from the device (so it keeps working when newer devices are released).
Maybe we can start with "fake" classes that mimic the SDK but aren't fully implemented yet, just returning hard-coded averages that match the developer Rift.
That way we can focus on the Away3D views/cameras and the AGAL code to do the barrel distortion, and later get the real values from the Rift.
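A minimal stub could just mirror the fields the SDK overview lists, filled in with the dev-kit defaults (I'm quoting the numbers from memory, so double-check them against the docs before trusting them):
// fake HMDInfo: field names follow the SDK overview, values are the published
// dev-kit defaults; to be replaced by real calls through the ANE later
public class FakeHMDInfo {
    public var hResolution:int = 1280;
    public var vResolution:int = 800;
    public var hScreenSize:Number = 0.14976;          // metres
    public var vScreenSize:Number = 0.0936;
    public var eyeToScreenDistance:Number = 0.041;
    public var lensSeparationDistance:Number = 0.0635;
    public var interpupillaryDistance:Number = 0.064;
    // barrel distortion coefficients K0..K3
    public var distortionK:Vector.<Number> = Vector.<Number>([1.0, 0.22, 0.24, 0.0]);
}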
What do you think?
|