|
seanClusta, Jr. Member
Posted: 23 November 2014 03:37 PM Total Posts: 38
Has anyone implemented this new feature on Away3D yet?
It’s a beta feature of the new AIR SDK (15.3 onwards, I think?). It’s light on documentation, but there’s a bit on the API here: http://blogs.adobe.com/flashplayer/2014/10/beta-feature-videotexture-and-stage3d.html
I’m able to create the VideoTexture and attach a video to it. The video does play, as I can hear the audio, but I’m unsure exactly how to display it (it’s part of the Context3D rendering context).
At the moment I’m trying to draw into a BitmapData object using
stage3D.context3D.drawToBitmapData(bmp.bitmapData),
then using a BitmapTexture with that BitmapData as the input, and finally applying the BitmapTexture instance as the texture for a TextureMaterial (which is applied to a mesh).
That’s not displaying on the (PlaneGeometry) mesh I’ve added it to, though. Any ideas what I’m doing wrong here?
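For reference, the approach in code looks roughly like this (a sketch only; stage3D, view, and the sizes stand in for my actual setup):
    var bmd:BitmapData = new BitmapData(1024, 512, false, 0x000000);
    stage3D.context3D.drawToBitmapData(bmd); // copy the current back buffer into the BitmapData
    var bitmapTexture:BitmapTexture = new BitmapTexture(bmd);
    var material:TextureMaterial = new TextureMaterial(bitmapTexture);
    var plane:Mesh = new Mesh(new PlaneGeometry(700, 700), material);
    view.scene.addChild(plane);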
|
bouze, Newbie
Posted: 20 May 2015 07:06 PM Total Posts: 3
[ # 1 ]
I’ve tried to adapt the native VideoTexture feature to Away3D 4.1.6.
*Requires AIR SDK 17 or higher.
package away3d.textures
{
    import away3d.materials.utils.IVideoPlayer;
    import away3d.materials.utils.SimpleVideoPlayer;
    import away3d.textures.Texture2DBase;

    import flash.display3D.Context3D;
    import flash.display3D.textures.TextureBase;
    import flash.display3D.textures.VideoTexture;
    import flash.events.Event;
    import flash.events.VideoTextureEvent;
    import flash.media.VideoStatus;

    public class NativeVideoTexture extends Texture2DBase
    {
        private var texture:flash.display3D.textures.VideoTexture;
        private var _autoPlay:Boolean;
        private var _player:IVideoPlayer;

        public function NativeVideoTexture(source:String, loop:Boolean = true, autoPlay:Boolean = false, player:IVideoPlayer = null)
        {
            try {
                _player = player || new SimpleVideoPlayer();
                _player.loop = loop;
                _player.source = source;
                _autoPlay = autoPlay;
            } catch (e:Error) {
                trace(e, e.getStackTrace());
            }
        }

        // Nothing to upload: the VideoTexture is fed directly from the NetStream.
        override protected function uploadContent(texture:TextureBase):void
        {
        }

        override protected function createTexture(context:Context3D):TextureBase
        {
            try {
                trace("Context3D.supportsVideoTexture", Context3D.supportsVideoTexture);
                if (!Context3D.supportsVideoTexture)
                    throw new Error("flash.display3D.textures.VideoTexture not supported");

                texture = context.createVideoTexture();
                texture.attachNetStream(_player.ns);
                texture.addEventListener(Event.TEXTURE_READY, onTextureReady);
                texture.addEventListener(VideoTextureEvent.RENDER_STATE, onRenderState);
                if (_autoPlay)
                    _player.play();
            } catch (e:Error) {
                trace(e, e.getStackTrace());
            }
            return texture;
        }

        private function onTextureReady(e:Event):void
        {
            dispatchEvent(e);
        }

        private function onRenderState(e:VideoTextureEvent):void
        {
            if (e.status == VideoStatus.SOFTWARE)
                trace("Indicates software video decoding works.");
            if (e.status == VideoStatus.ACCELERATED)
                trace("Indicates hardware-accelerated (GPU) video decoding works.");
            if (e.status == VideoStatus.UNAVAILABLE)
                trace("Indicates Video decoder is not available.");
        }

        override public function dispose():void
        {
            if (_player)
                _player.dispose();
            if (texture)
                texture.dispose();
            super.dispose();
        }

        public function get player():IVideoPlayer
        {
            return _player;
        }
    }
}
|
theMightyAtom, Sr. Member
Posted: 21 May 2015 11:03 AM Total Posts: 669
[ # 2 ]
Very cool Bouze!
Have you posted any examples using the texture? What’s the performance like? I’ve noticed it’s arrived for Android in the latest beta, so now it’s REALLY interesting ;)
|
bouze, Newbie
Posted: 21 May 2015 02:35 PM Total Posts: 3
[ # 3 ]
Thanks theMightyAtom!
I played a 3K video on the native video texture.
The frame rate went up from 10fps to 60fps.
Sorry, I have no example.
|
seanClusta, Jr. Member
Posted: 21 May 2015 03:30 PM Total Posts: 38
[ # 4 ]
Awesome, thanks for this bouze!
It does indeed work nicely. I’m wrapping a video onto a sphere; the video is 2880x1440 and it’s playing back at 60fps.
I’ve posted a skeleton example of this, though anyone using it will have to add the Away3D libs, add bouze’s NativeVideoTexture, and add a video.
Two things to note:
- The texture.attachNetStream(_player.ns); line will throw an error, as ns isn’t a publicly available property. To fix it, I added a getter method to the SimpleVideoPlayer class.
- I had to set mipmap to false (the default is true) on the TextureMaterial that uses the NativeVideoTexture. As far as I know the new VideoTexture feature doesn’t allow mipmaps.
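Putting it together, the material setup looks roughly like this (a sketch only; the video path, sphere size, and view instance are placeholders from my own setup):
    var videoTexture:NativeVideoTexture = new NativeVideoTexture("video/pano_2880x1440.mp4", true, true); // loop + autoPlay
    var material:TextureMaterial = new TextureMaterial(videoTexture, true, false, false); // mipmap must be false
    var sphere:Mesh = new Mesh(new SphereGeometry(500, 64, 48), material);
    view.scene.addChild(sphere);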
Great work, thanks!
|
bouze, Newbie
Posted: 22 May 2015 07:21 AM Total Posts: 3
[ # 5 ]
Hi seanClusta,
Thank you for the correction!
|
theMightyAtom, Sr. Member
Posted: 09 June 2015 12:22 PM Total Posts: 669
[ # 6 ]
Anyone experiencing width/height restrictions?
Media Encoder won’t output an flv/f4v over 1920 wide.
As 4K mp4 it almost plays on my PC, but it’s very choppy, and it won’t play at all on Android (flv plays no probs :) )
How can we retain the resolution of the original video? Any ideas?
I have it playing in Cardboard, and there is no extra strain for the video being in “stereo”.
Cheers!
[Update: after throwing an error every time, Media Encoder is now outputting 4K (3840 x 1920) to f4v, though it plays even worse than mp4.]
|
seanClusta, Jr. Member
Posted: 09 June 2015 02:42 PM Total Posts: 38
[ # 7 ]
Hi Pete,
I can play back a 2880x1440 mp4 VideoTexture fine on desktop, but it’s struggling on Android (the quality/bit-rate is high, but the frame rate is 10-15 and it’s choppy).
I’ve played back the same video on a Gear VR and it plays at 30fps on a Note 4 (mono 360, through the built-in 360 video viewer app).
I’m not sure why AIR would be struggling with this. Maybe it’s because it’s new on Android and still in beta? Maybe it’s the way the video data/texture buffer is transferred to Stage3D? I’m not sure exactly how this works, to be honest.
As for encoding settings, I think it’s advised to stay away from flv/f4v containers now and use mp4 with h.264 anyway. Maybe you could achieve better quality/performance by playing with the profile and level in Media Encoder?
I know that h.265 has been designed for a better quality/bandwidth ratio (the same visual quality at roughly half the bit-rate) and is supported by the hardware on the Note 4. I would imagine it will become the de facto codec for VR video playback as more and more devices support hardware decoding for it, but for now we might have to struggle on with lower resolutions.
I would be keen to know if you find any optimal settings for quality/resolution and performance though, I’ll post any of my findings here too.
|
theMightyAtom, Sr. Member
Posted: 09 June 2015 03:15 PM Total Posts: 669
[ # 8 ]
Cool, I’ll do the same. I wonder if it gets altered somewhere to a power-of-2 size? Maybe encoding to 2048 x 2048 would make a difference? I don’t feel I’m getting the same quality out as goes in, though it may be an anti-aliasing difference. Quality is a big issue in VR apps, as you’re so close to the screen.
Could use a video sliced into pieces, only showing the piece in focus, though that would certainly create other issues…
Cheers!
Update: I have tested with a 2048 x 2048 f4v, but decoding is still too slow. Actually, the performance is just as bad as the 4K video.
|
Beek, Member
Posted: 25 June 2015 01:58 AM Total Posts: 67
[ # 9 ]
Hi Sean and theMightyAtom, how are you guys getting on with video textures? I’m about to embark upon 360 video in a VR viewer with Away3D, and I’ve read your posts with interest.
Have you managed to get good consistent quality yet?
|
theMightyAtom, Sr. Member
Posted: 25 June 2015 11:30 AM Total Posts: 669
[ # 10 ]
If you keep to HD it’s fine; above that it’s a question of encoding and actual device power. For example, my Oppo 7 seems to cope with more than my desktop, though it may be the debugger that slows it down.
Definitely worth a look, and remember it’s in beta, so maybe it will improve before launch, and maybe Adobe will produce some guidelines for usage.
I’ve used this material for webcam textures too, with a few mods.
Good luck!
|
Beek, Member
Posted: 25 June 2015 09:56 PM Total Posts: 67
[ # 11 ]
Great thanks! I’ve kind of promised it to a client now so hopefully it works!
When you say HD, what kind of resolution and image size are you talking about?
Do you think there might be a performance improvement if the video is sliced into 6 tiles and displayed on a cube? The performance advantage I immediately thought of was removing/pausing video on any tiles not in view, but I assume adding it back in is going to be enough of a hit to make the movement suffer.
Currently we use a skybox as a kind of panorama for VR. I was considering loading the videoTexture onto the skybox in individual cube faces, or alternatively the whole sphere as Sean has done.
|
Yaas, Newbie
Posted: 05 December 2015 08:45 AM Total Posts: 1
[ # 12 ]
Hi,
I used the example by seanClusta with an h.264 mp4, but the video is not shown and I only hear the sound of the clip. I added a trace(netStream.info) in the createTexture method, but everything it returns is 0. Can anybody help, please?
|
Beek, Member
Posted: 15 December 2015 09:17 PM Total Posts: 67
[ # 13 ]
Hi @SeanClusta
Can you elaborate on
“the texture.attachNetStream(_player.ns); line will throw an error as ns isn’t a publicly available property. To fix I added a getter method to the SimpleVideoPlayer class.”
I’m struggling to work out where this goes!
EDIT - Ignore, found SimpleVideoPlayer
|
theMightyAtom, Sr. Member
Posted: 19 August 2016 01:16 PM Total Posts: 669
[ # 14 ]
I’ve adapted it as a GPU-accelerated camera feed.
Could be useful for AR apps, etc.
package away3d.textures
{
    import away3d.materials.utils.IVideoPlayer;
    import away3d.textures.Texture2DBase;

    import flash.display3D.Context3D;
    import flash.display3D.textures.TextureBase;
    import flash.display3D.textures.VideoTexture;
    import flash.events.Event;
    import flash.events.VideoTextureEvent;
    import flash.media.Camera;
    import flash.media.VideoStatus;

    public class NativeCameraTexture extends Texture2DBase
    {
        private var texture:flash.display3D.textures.VideoTexture;
        private var camera:Camera;

        public function NativeCameraTexture(cameraName:String = null)
        {
            if (cameraName == null) {
                camera = Camera.getCamera(); // select the default camera for the device
            } else {
                camera = Camera.getCamera(cameraName); // choose a specific camera
            }
        }

        // Nothing to upload: the VideoTexture is fed directly from the camera.
        override protected function uploadContent(texture:TextureBase):void
        {
        }

        override protected function createTexture(context:Context3D):TextureBase
        {
            try {
                trace("Context3D.supportsVideoTexture", Context3D.supportsVideoTexture);
                if (!Context3D.supportsVideoTexture)
                    throw new Error("flash.display3D.textures.VideoTexture not supported");

                texture = context.createVideoTexture();
                texture.attachCamera(camera);
                texture.addEventListener(Event.TEXTURE_READY, onTextureReady);
                texture.addEventListener(VideoTextureEvent.RENDER_STATE, onRenderState);
            } catch (e:Error) {
                trace(e, e.getStackTrace());
            }
            return texture;
        }

        private function onTextureReady(e:Event):void
        {
            dispatchEvent(e);
            trace(texture.videoHeight);
        }

        private function onRenderState(e:VideoTextureEvent):void
        {
            if (e.status == VideoStatus.SOFTWARE)
                trace("Indicates software video decoding works.");
            if (e.status == VideoStatus.ACCELERATED)
                trace("Indicates hardware-accelerated (GPU) video decoding works.");
            if (e.status == VideoStatus.UNAVAILABLE)
                trace("Indicates Video decoder is not available.");
        }

        override public function dispose():void
        {
            if (texture)
                texture.dispose();
            super.dispose();
        }

        public function get player():IVideoPlayer
        {
            return null;
        }
    }
}
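A minimal usage sketch (the plane size and the existing View3D instance, view, are placeholders):
    var cameraTexture:NativeCameraTexture = new NativeCameraTexture(); // default device camera
    var material:TextureMaterial = new TextureMaterial(cameraTexture, true, false, false); // mipmap off, as with the video texture
    var plane:Mesh = new Mesh(new PlaneGeometry(640, 480), material);
    view.scene.addChild(plane);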
|
mindthegap, Newbie
Posted: 02 October 2016 01:21 PM Total Posts: 6
[ # 15 ]
Hi,
I added this to the SimpleVideoPlayer, but I’m still getting the error that _player.ns is not public. Why? Where is my mistake?
public function get ns():NetStream
{
    return _ns;
}
|