Mike Samuel, Newbie
Posted: 22 July 2012 03:06 PM Total Posts: 30
Hello,
I’ve been reading many posts here regarding shadows, and most of them can be summarized as: use baked shadows for any static object, and Away3D shadows for moving objects only.
So in Blender I add a second UV channel to my material, bake a lightmap into a separate PNG file, set them to overlay mode, yada yada, et voilà... Blender shows the model as expected: material plus light.
To be clear, I have two PNG files: the material PNG itself, and a lightmap PNG.
And yes, I need the separation as I need to switch materials later on.
I export to 3DS, load the model in Away3D and nada: only the lightmap itself shows up as the material.
As far as I can tell, none of the examples on GitHub use multiple UV channels, nor can I find a working, complete example anywhere on Google & co.
Can anyone provide a working example on how to use multiple UV channels?
Can I use 3DS to export/import my model? (I found out OBJ can’t handle multiple UV channels; I am not sure about 3DS.)
Am I supposed to do this myself in code, i.e. clone the mesh and its geometry into a sub-mesh and add the lightmap as a top layer myself?
Any working examples would be highly appreciated.
Not much code to post; right now it’s just a basic 3DS loader.
regards
Mike
Matse, Sr. Member
Posted: 22 July 2012 04:50 PM Total Posts: 149
[ # 1 ]
Hi Mike,
That’s a good summary. There is no working example that I know of, and I’m not at that point yet so I can’t provide code, but the main problem here is that none of the export formats will handle two UV sets.
So you need to export two versions of the mesh: one with the regular texture UVs, and another with the lightmap UVs.
Load both in Away3D (later on we could probably make a tool to just extract and store the relevant info from the lightmap model… but we’re not quite there yet). Access the lightmap UV data, and add it to the texture mesh’s second UV layer.
Not sure if you can just copy the data directly or if it’ll need some processing: I used Shockwave3D before and every export produced different results, so some code was needed just to “match” the two meshes… quite a pain, but it worked.
Also not sure where to put the lightmap texture once you’ve got the secondary UVs in, but that’s a start ^^
Mike Samuel, Newbie
Posted: 22 July 2012 05:45 PM Total Posts: 30
[ # 2 ]
Hi Matse,
Thanks for your fast reply. I was already suspecting 3DS does not support multiple UV channels, but now I know.
Ok… I get the part about creating two models and loading them.
Let’s say I have an example model, a single-mesh plane called MeshA.
I would end up with something like:
var loaderLight = ....;
var loaderMaterial = ....;
and in OnCompleted… we store them in individual vars:
var meshLight:Mesh = loaderLight.MeshA;
var meshMaterial:Mesh = loaderMaterial.MeshA;
Now I know I can do something like this to get the UV data:
var lightUVData = meshLight.geometry.subGeometries[0].UVData;
but then what? Just do something like this?
meshMaterial.geometry.subGeometries[0].secondaryUVData = lightUVData;
Now, I just typed this as kinda pseudo-code; it’s about the idea, not about working code. A couple of questions:
Where and how do I point to the lightmap.png file?
I would guess the secondary channel needs to know the lightmap in order to do its voodoo stuff to the material. Since the material already links to the material.png, there must be a place to point to a secondary material??? (I’m really no 3D pro, so just stone me if I ask too stupid questions.)
I have no idea what you mean by “processing the data first”; I do not even understand what the UV data is. I get the basic idea of UV mapping, but that’s it.
Thanks
Richard Olsson, Administrator
Posted: 23 July 2012 08:22 AM Total Posts: 1192
[ # 3 ]
What you are describing (albeit in pseudo-code) should work fine. Retrieve the UVData vector from the sub-geometries of one of your geometries, and upload it to the other geometry using its sub-geometries’ updateSecondaryUVData() method.
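Off the top of my head (untested, and assuming both exports keep the same sub-geometry and vertex order, otherwise the UVs won’t line up), that copy could look roughly like this:

import away3d.entities.Mesh;

// meshMaterial = mesh loaded from the export with the texture UVs
// meshLight    = mesh loaded from the export with the lightmap UVs
function copyLightmapUVs(meshMaterial:Mesh, meshLight:Mesh):void
{
    for (var i:int = 0; i < meshMaterial.geometry.subGeometries.length; i++)
    {
        // read the lightmap UVs and upload them as the secondary UV set of the textured mesh
        var lightUVs:Vector.<Number> = meshLight.geometry.subGeometries[i].UVData.concat();
        meshMaterial.geometry.subGeometries[i].updateSecondaryUVData(lightUVs);
    }
}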
The light-mapping features in Away3D are part of the shading framework, which is based around a concept of “shading methods” that can be daisy-chained and linked together into a shader. For light-mapping you have two methods you can use. One is the LightMapDiffuseMethod which goes in the “diffuse slot” on the material. The other is the LightMapMethod which goes into the daisy-chainable appendix that each shader has. Depending on which one you want to use, you’ll either assign it to the diffuseMethod property of your material, or pass it into the addMethod() function.
You probably want your light-map to be overlaid on top of your diffuse texture. In that case what you probably want is the regular LightMapMethod shading method.
If that is indeed the case, create an instance of LightMapMethod, passing in your BitmapTexture instance for the light-map, the blend mode (either ADD or MULTIPLY) and a boolean indicating whether you want this light-map method to use the secondary UV set, which in this case should be true.
Finally, just use myMaterial.addMethod(myLightMapMethod) to add the light-map method to the shader generation code.
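Roughly, in code (untested; the import paths and TextureMaterial are from memory, so double-check against your revision, and materialBitmapData/lightmapBitmapData stand in for the BitmapData of your two PNGs):

import away3d.materials.TextureMaterial;
import away3d.materials.methods.LightMapMethod;
import away3d.textures.BitmapTexture;

// regular material using the diffuse texture (material.png)
var material:TextureMaterial = new TextureMaterial(new BitmapTexture(materialBitmapData));

// light-map method sampling the baked lightmap (lightmap.png);
// the last argument tells it to use the secondary UV set uploaded above
var lightMapMethod:LightMapMethod = new LightMapMethod(
    new BitmapTexture(lightmapBitmapData), LightMapMethod.MULTIPLY, true);

// append the method to the material's shader chain and assign the material
// (mesh = the textured mesh from the copy step above)
material.addMethod(lightMapMethod);
mesh.material = material;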
Mike Samuel, Newbie
Posted: 23 July 2012 09:52 PM Total Posts: 30
[ # 4 ]
Ok Richard, you do realize you just put a small book’s worth of knowledge and multiple days of Googling into your five paragraphs here.
Man, this is the most helpful, knowledgeable response I’ve ever gotten.
This might be simple stuff for all you 3D gurus, but for mere mortals like “moi”, this is simply awesome. It would be really, really nice if the documentation included such explanations. Wow.
Ok, I’ll pull myself together again. Thanks Richard, you just made my day.
Since I am using the exact same model in Blender to create the material UVs and the light UVs, it turns out I don’t even need to load the light model. The UV mapping is identical, so I can use LightMapMethod() with false as the last param. Nice. I just need to include the PNG files.
I have not yet had a chance to try what “LightMapDiffuseMethod” does, but I think the Icebear code in the examples uses that, so I should be able to figure it out.
You also mentioned the chaining. So I went ahead, copied the lightmap PNG, drew some lines onto it in black, just a simple horizontal pattern. And one more, using some colored lines. Added them both to the material.
As expected, they show up with the shadow; even the colors show up. So I am thinking: should I use this method to create patterns?
Usually, I create sub-meshes, copy the geometries, assign the pattern materials, yada yada.
Is there a downside to adding multiple LightMapMethods to create patterns?
It seems easier than the sub-mesh copy approach.
Again, lots of thanks for the last reply, really helpful stuff.
Regards
Richard Olsson, Administrator
Posted: 23 July 2012 09:57 PM Total Posts: 1192
[ # 5 ]
Do you really even need a light map at all? Based on the fact that you are not using more than one UV set, it sounds like you could just bake the light into the diffuse texture instead, thereby removing the need for light-mapping altogether.
You should definitely try to use as few shading features as possible. Shorter shaders will render faster on the GPU, and more importantly, fewer texture dependencies will result in less GPU upload overhead, state switching and memory usage.
We will try to add more examples for the next release, where we will have a better example workflow in place for our documentation.
Mike Samuel, Newbie
Posted: 23 July 2012 10:26 PM Total Posts: 30
[ # 6 ]
Well, I’m trying to use a lightmap in order to avoid using the shadow methods/lights in Away3D directly.
I don’t have a screenshot handy here, but imagine a simple house with a triangular roof.
The model uses a single texture, UV mapped to each wall.
Now in Flash I need to change the wall color/pattern, hence the main texture. So I have x different main texture PNG files, one for each color/pattern.
The way I see it, if I were to bake the shadow onto the main texture, I would need to:
a) have a specific PNG file for each wall,
b) then have 4 PNG texture files for each main texture,
c) handle all of this in code (assign frontTex to the front mesh, etc.).
This way, I have 2 shadow maps to assign (only the front and one side), and a single material I just assign to each wall mesh.
It’s actually more complicated because there are windows, doors, etc., but you get the idea. The main texture changes interactively, while the shadows should and can stay as they are.
But I am open to suggestions on making this easier…
The model is too complex to use Flash to do the shadow/light. It slows down too much, especially if you don’t have the newest hardware.
Richard Olsson, Administrator
Posted: 24 July 2012 08:45 AM Total Posts: 1192
[ # 7 ]
Ok. Your situation seems like a valid case to use light-mapping then! You should still try to keep it to just one LightMapMethod per material though. If you have several light-maps that you for some reason want to apply on top of each other, consider merging them into a single BitmapData at runtime and using that as a texture. Again, the goal here is to keep the number of textures and the length of the shader down.
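Something along these lines with the plain Flash BitmapData API (assuming shadowLightmap and patternLightmap are BitmapData objects of the same size):

import flash.display.BitmapData;
import flash.display.BlendMode;

// start from a copy of the baked shadow map...
var merged:BitmapData = shadowLightmap.clone();
// ...and multiply the pattern map on top, so dark areas of either map stay dark
merged.draw(patternLightmap, null, null, BlendMode.MULTIPLY);
// then wrap "merged" in a single BitmapTexture and pass it to one LightMapMethod, as in post #3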
Matse, Sr. Member
Posted: 24 July 2012 01:09 PM Total Posts: 149
[ # 8 ]
“Ok Richard, you do realize you just put a small book’s worth of knowledge and multiple days of Googling into your five paragraphs here.”
+1. Any chance we can make this thread a sticky or something, until we get better options to share knowledge across users (an Away3D wiki?)?