Projection Mapping

Software: Away3D 4.x

seanClusta, Jr. Member
Posted: 24 February 2014 06:02 PM   Total Posts: 38

Has anyone used Away3D for projection mapping?

I’m assuming it would be possible to build a tool where a quad or grid could be warped in real time (dragging the quad’s corners to fit its target) with video material applied to it, making a simple quad-warping tool for projection mapping.
Has this been done before? And if not, could anyone advise on the performance bottlenecks of warping one or more video materials?
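
Something like this minimal sketch is what I have in mind, assuming Away3D 4.0’s SubGeometry API (plain xyz triples per vertex, re-uploaded with updateVertexData; 4.1’s CompactSubGeometry interleaves its buffers, so the stride would differ there). The "movie.flv" path and the moveCorner helper are just illustrative:

    import away3d.containers.View3D;
    import away3d.core.base.SubGeometry;
    import away3d.entities.Mesh;
    import away3d.materials.TextureMaterial;
    import away3d.primitives.PlaneGeometry;
    import away3d.textures.VideoTexture;

    var view:View3D = new View3D();
    addChild(view);

    // 1x1 segments: a plain quad with just four vertices, standing upright (yUp = false).
    var quad:Mesh = new Mesh(
        new PlaneGeometry(512, 512, 1, 1, false),
        new TextureMaterial(new VideoTexture("movie.flv", 512, 512, true, true)));
    view.scene.addChild(quad);

    // Dragging a corner: rewrite one vertex’s x/y and re-upload the buffer.
    function moveCorner(index:int, x:Number, y:Number):void {
        var sub:SubGeometry = SubGeometry(quad.geometry.subGeometries[0]);
        var verts:Vector.<Number> = sub.vertexData;
        verts[index * 3]     = x;
        verts[index * 3 + 1] = y;
        sub.updateVertexData(verts);
    }

With only one segment per side the plane has exactly four vertices, so each dragged handle maps straight to one vertex index.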

Thanks

mrpinc, Sr. Member
Posted: 24 February 2014 06:13 PM   Total Posts: 119   [ # 1 ]

I’ve taken video material and applied it to non-square objects, and it seems to perform just fine. I haven’t manipulated the mesh in real time, but I don’t think you’ll see any performance issues, not on desktop anyway.

Fabrice Closier, Administrator
Posted: 24 February 2014 06:34 PM   Total Posts: 1265   [ # 2 ]

It’s all about UVs, so theoretically, yes, you could. I’m not familiar with the hardware/software of such projectors, but as the projected image is superimposed on a real 3D volume, you would probably need a mesh perfectly aligned with the real-world one. Then, aside from the handling of compositing, you would need a variation of the UV projection class to map from the camera’s point of view.
As long as you use the API to generate/update the UVs or the geometry at runtime, it’s perfectly doable.
I would worry most about the framerate if you were to use HD (or any high-res video) and Away3D to render it. Stage3D and video are not the best of friends…
As said, I’m not familiar with the hardware.
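
A hedged sketch of that UV projection idea, recomputing a mesh’s UVs from a projector-aligned camera. It assumes Camera3D.project() returns normalised device coordinates in [-1, 1] and the 4.0-style SubGeometry layout (xyz triples, uv pairs); projectUVs and projectorCam are illustrative names:

    import away3d.cameras.Camera3D;
    import away3d.core.base.SubGeometry;
    import away3d.entities.Mesh;
    import flash.geom.Vector3D;

    function projectUVs(mesh:Mesh, projectorCam:Camera3D):void {
        var sub:SubGeometry = SubGeometry(mesh.geometry.subGeometries[0]);
        var verts:Vector.<Number> = sub.vertexData;
        var uvs:Vector.<Number> = sub.UVData;
        var local:Vector3D = new Vector3D();
        for (var i:int = 0; i < verts.length / 3; ++i) {
            local.x = verts[i * 3];
            local.y = verts[i * 3 + 1];
            local.z = verts[i * 3 + 2];
            // Local -> world -> projector NDC -> [0, 1] texture space.
            var ndc:Vector3D = projectorCam.project(
                mesh.sceneTransform.transformVector(local));
            uvs[i * 2]     = ndc.x * 0.5 + 0.5;
            uvs[i * 2 + 1] = 0.5 - ndc.y * 0.5; // flip depends on texture orientation
        }
        sub.updateUVData(uvs);
    }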

You could also check out Processing (processing.org); that API has been used in countless 3D mapping projects.

seanClusta, Jr. Member
Posted: 25 February 2014 10:42 AM   Total Posts: 38   [ # 3 ]

Fabrice - getting the projection mapped onto physical objects is as simple as projecting the output from the stage and dragging the vertices so they align with the physical points they’re being mapped to (the corners of a wall, for instance), so I think it will work fine. Here’s a tool written with OpenFrameworks that does it, if anyone’s interested in seeing how it’s done: http://www.youtube.com/watch?v=SRyNj_HufUU . It’s the performance of mixing Stage3D and multiple videos that I’m worried about, but I think I’ll give it a go and see how it performs. I’ll post back any findings, with a sketch of the dragging below.
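
The dragging itself is plain display-list input handling. A sketch, assuming a moveCorner(index, x, y) helper like the one sketched earlier, and glossing over the mapping from stage pixels to mesh coordinates (taken as 1:1 here):

    import flash.events.MouseEvent;
    import flash.geom.Point;

    var corners:Vector.<Point> = Vector.<Point>([
        new Point(100, 100), new Point(700, 100),
        new Point(100, 500), new Point(700, 500)]);
    var dragging:int = -1;

    stage.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
        // Pick up the corner nearest the pointer, within a 20 px radius.
        for (var i:int = 0; i < 4; ++i)
            if (Point.distance(corners[i], new Point(e.stageX, e.stageY)) < 20)
                dragging = i;
    });
    stage.addEventListener(MouseEvent.MOUSE_MOVE, function(e:MouseEvent):void {
        if (dragging < 0) return;
        corners[dragging].setTo(e.stageX, e.stageY);
        moveCorner(dragging, e.stageX, e.stageY); // helper from the earlier sketch
    });
    stage.addEventListener(MouseEvent.MOUSE_UP, function(e:MouseEvent):void {
        dragging = -1;
    });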

Cheers

Fabrice Closier, Administrator
Posted: 25 February 2014 12:04 PM   Total Posts: 1265   [ # 4 ]

It should in theory work, yes. What I’m saying is that moving vertices to mimic the real 3D object and updating the UVs isn’t the problem; it’s the compositing of the videos and the playback frame rate. Projecting a single stream with no realtime “mix” would probably be OK, assuming you run it on a decent machine.
In that video, for instance, you can see multiple sources. Unless the videos were packed with some kind of atlas mapping, or the sources were low-res for the smaller objects, I don’t think Stage3D could handle two or more high-res streams at an acceptable fps.
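
A sketch of that atlas idea, assuming BitmapTexture and its invalidateContent() re-upload as in Away3D 4.x, with videos[] holding Video objects already attached to NetStreams elsewhere:

    import away3d.textures.BitmapTexture;
    import flash.display.BitmapData;
    import flash.events.Event;
    import flash.geom.Matrix;
    import flash.media.Video;

    var videos:Vector.<Video> = new Vector.<Video>(); // assumed populated elsewhere
    var atlas:BitmapData = new BitmapData(1024, 1024, false, 0);
    var atlasTexture:BitmapTexture = new BitmapTexture(atlas);

    addEventListener(Event.ENTER_FRAME, function(e:Event):void {
        for (var i:int = 0; i < videos.length; ++i) {
            // Blit each 512x512 source into its own quadrant of the atlas.
            var m:Matrix = new Matrix();
            m.translate((i % 2) * 512, int(i / 2) * 512);
            atlas.draw(videos[i], m);
        }
        atlasTexture.invalidateContent(); // one GPU upload for all streams
    });

Each mesh’s UVs would then address its own quadrant, so the whole scene binds a single texture; the per-frame draw() calls are still CPU work, but only one upload hits Stage3D per frame.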

Of course, we’ll know for sure once someone tries it ;)
Let us know!

theMightyAtom, Sr. Member
Posted: 25 February 2014 12:20 PM   Total Posts: 669   [ # 5 ]

I’ve done a test using Away3D and an Android-powered projector.
It worked quite well :O)
http://www.youtube.com/watch?v=-b-YBMUoXco
In this case we were testing interactive surfaces in 3D, that is, configurable products.

If you’re doing pure video, you can either use After Effects, like this cool vid:
http://www.youtube.com/watch?v=i7X8ZnmLfM0&list=LLdfmUD5q097W2hrLEzQGvuA&feature=mh_lolz

Or there is also a tool called VVVV, used for many installations of this type.
http://vvvv.org/

Good Luck!
