RenderTexture with mixed 2d and 3d nodes #2308
-
It's a little hard to test without an example to run, but have you considered implementing your own versions of RenderTexture's begin()/end() and using them for the "layers" that you described? It would be used roughly as in the sketch below. Just taking a guess here, so I'm not sure if this would work for what you're after, but in a quick test with 2D nodes it does render them correctly to a single RenderTexture.
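A minimal sketch of what I mean (not the exact snippet from this post; create()/beginWithClear()/begin()/end() are the stock RenderTexture calls, while the node names are placeholders):

```cpp
// Layer 2D and 3D content onto one RenderTexture by re-entering begin()/end()
// for each "layer"; only the first pass clears the texture.
auto size = ax::Director::getInstance()->getVisibleSize();
auto rt   = ax::RenderTexture::create(size.width, size.height);

rt->beginWithClear(0.f, 0.f, 0.f, 0.f);   // first layer clears the texture
background2d->visit();                    // 2D layer underneath
rt->end();

rt->begin();                              // later layers keep the previous contents
scene3d->visit();                         // 3D layer in the middle
rt->end();

rt->begin();
hud2d->visit();                           // 2D layer on top
rt->end();

addChild(rt);                             // rt->getSprite() can then take a shader
```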
-
@martinking71 Is this the effect you're after? This is with a sepia shader applied to the RenderTexture sprite. The updated code in my previous post is what was used for this: a 2D layer, then a 3D layer, then a 2D layer on top, all rendered onto the same RenderTexture.
-
Not sure if this helps, but I've attached my rough work-in-progress RenderTextureTest and an updated CMakeLists.txt file. It should just drop into cpp-tests and run as test set 42. Click the menu to cycle through the different render states: normal, normal with flush, to texture, and to texture with flush. I can't seem to get the overridden render function to call its ancestor correctly when rendering to texture; it crashes, and I'm not sure why (see the top of the render function, where the ancestor is only called when not rendering to texture). It's also obvious that the menu no longer renders during the render-to-texture states. Not sure if this is something to do with rendering a scene while already inside a scene render?
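The guard at the top of the render function looks roughly like this (a sketch only: it assumes the test scene overrides ax::Scene::render(Renderer*, const Mat4&, const Mat4*), and _renderingToTexture is a placeholder flag):

```cpp
void RenderTextureTest::render(ax::Renderer* renderer, const ax::Mat4& eyeTransform,
                               const ax::Mat4* eyeProjection)
{
    if (!_renderingToTexture)
    {
        // Calling the ancestor works in the normal states...
        ax::Scene::render(renderer, eyeTransform, eyeProjection);
        return;
    }
    // ...but doing the same while rendering to the texture crashes, so the
    // render-to-texture states skip it and drive the drawing manually instead.
}
```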
-
Here is a cleaner stand-alone project that shows the different visit() processing and the effects of using the extended RenderTexture with the partialBegin()/partialEnd() calls. Just click through the menu options to see each mode.
A few movement effects are added to show some of the issues I've found when rendering to texture. It's a simple new project, so it should be easy to spot any mistakes I've made. All the work is in MixedMode::visit(), where all the visit() calls are made manually. I would guess that the camera is not being set correctly. It looks like it's just using the default camera: you have to set ax::CameraFlag::USER1 on a camera so that the 3D renders, but it seems to ignore the other camera values (FOV, depth, and transform). In the MixedMode constructor there is some commented-out code that should make the camera move but doesn't; a block below it will make the scene move instead, which shows that updates are working, just not on the camera.
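The manual layering in MixedMode::visit() follows roughly this pattern (a sketch under assumptions: partialBegin()/partialEnd() are the extension methods discussed here with signatures assumed, and _rt, _spriteBelow, _scene3d and _spriteAbove are placeholder members):

```cpp
void MixedMode::visit(ax::Renderer* renderer, const ax::Mat4& parentTransform, uint32_t parentFlags)
{
    _rt->partialBegin();                  // keep targeting the texture without re-clearing

    _spriteBelow->visit(renderer, parentTransform, parentFlags);
    renderer->render();                   // flush the queued 2D commands before the 3D pass

    // The 3D camera needs ax::CameraFlag::USER1 and the meshes the matching
    // camera mask, otherwise nothing 3D is drawn.
    _scene3d->visit(renderer, parentTransform, parentFlags);
    renderer->render();                   // flush the 3D layer

    _spriteAbove->visit(renderer, parentTransform, parentFlags);

    _rt->partialEnd();
}
```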
-
The second setting in the demo is the correct one - all the versions should look this way. See the visit() function with the manual draw order: all the Nodes have the default depth settings, so the draw order should match the order of the calls. The flushing should ensure this; the versions without flushes are there to show what happens otherwise.
Thanks again for looking into this.
-
Here's what I see when running my code before and also with your updates. I see the same as you for 1 & 2, with 2 being the effect I expected - manual draw order, but renderer->render() must be called to get the 3D drawn in between the sprites, because the sprites and the 3D all have default depth settings. I'm running on a MacBook Pro M2 if that makes any difference.
-
Hi,
I’m trying to create a torch-like effect and also a magnifying effect using shaders, and I’ve had some success using RenderTexture.
It all works fine until I add 3D nodes to the mix.
I keep my own scene graph and call visit() as needed on Nodes - mainly ax::Sprites and ax::Labels. My scene graph is composed of many 2D nodes with potentially many local 3D scene nodes.
I keep all my local MeshRenderer Nodes in a local Scene with its own Camera, and before calling render on this local 3D Scene I call render() on the Director’s renderer to flush it and draw any previously queued 2D nodes (this keeps the 2D clear of any depth settings). The local 3D nodes are then rendered, and more 2D nodes can be drawn on top of them, with further local 3D scenes as needed, simply layering nodes over each other.
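Without a RenderTexture the layering works like this (a minimal sketch of the pattern just described; the node names are placeholders):

```cpp
auto* renderer = ax::Director::getInstance()->getRenderer();

sprites2d->visit();      // queue the 2D nodes that sit underneath
renderer->render();      // flush the renderer before the local 3D scene

local3dScene->visit();   // local Scene holding the MeshRenderers and its own Camera
overlay2d->visit();      // more 2D nodes drawn on top, and so on for further layers
```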
Is there a way to flush the RenderTexture, or how should I go about extending RenderTexture to support this? The end() call swaps the original target back in, so rendering no longer goes to the texture. Calling render() on the Director’s renderer before rendering these local 3D nodes just produces a black screen, as RenderTexture::end() needs to be called first. And if end() is called before rendering a local 3D scene, only the content up to that point gets rendered into the texture, so the shader effect only works on part of the scene.
I feel like I’m missing an obvious call to clear a depth buffer or something similar instead of the call to end().
And again, just to reiterate: this process works fine if I don’t render to a texture, or if I do but with only 2D Nodes - the manual render, the local 3D scenes with their cameras and collections of MeshRenderers - but then I obviously can’t create the effects I want using shaders, which is the whole point of this exercise.
Thanks for any pointers.