Unreal Engine + EmberGen ideas

Are you storing the raw data (smoke, temperature, flames) and raymarching that, applying a transfer function during the march, or are you raymarching the color field directly and using the alpha channel as opacity?
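
For anyone reading along, here's roughly what I mean by the two options, as a plain-C++ sketch of a raymarch loop (HLSL written out by hand; the sampler stubs and the toy transfer function are just my assumptions for illustration, not anyone's actual shader):

```cpp
#include <algorithm>

struct Vec4 { float r, g, b, a; };

// Placeholder samplers standing in for 3D texture fetches (illustrative only):
// SampleRaw returns {smoke density, temperature, fuel, unused},
// SampleColor returns a pre-baked RGBA color field.
Vec4 SampleRaw(float x, float y, float z) {
    float d = std::max(0.0f, 1.0f - (x * x + y * y + z * z)); // spherical puff of smoke
    return { d, d * 0.8f, 0.0f, 0.0f };
}
Vec4 SampleColor(float x, float y, float z) {
    Vec4 raw = SampleRaw(x, y, z);
    return { raw.g, raw.g * 0.35f, 0.05f, raw.r };            // toy fire ramp baked to RGBA
}

// Front-to-back compositing along one ray; the flag switches between the two approaches.
Vec4 Raymarch(float ox, float oy, float oz,      // ray origin
              float dx, float dy, float dz,      // ray direction (normalized)
              int steps, float stepSize, bool useRawFields)
{
    Vec4 accum = { 0.0f, 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < steps && accum.a < 0.99f; ++i) {
        float px = ox + dx * stepSize * i;
        float py = oy + dy * stepSize * i;
        float pz = oz + dz * stepSize * i;

        Vec4 src;
        if (useRawFields) {
            // (a) store raw fields, apply a transfer function during the march
            Vec4 raw = SampleRaw(px, py, pz);
            float density = raw.r, temperature = raw.g;
            src = { temperature, temperature * 0.35f, 0.05f,  // toy color ramp
                    std::min(density * stepSize, 1.0f) };
        } else {
            // (b) store a baked color field, use its alpha directly as opacity
            src = SampleColor(px, py, pz);
            src.a = std::min(src.a * stepSize, 1.0f);
        }

        // standard front-to-back "over" compositing
        float w = (1.0f - accum.a) * src.a;
        accum.r += src.r * w;
        accum.g += src.g * w;
        accum.b += src.b * w;
        accum.a += w;
    }
    return accum;
}
```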

Assuming SDR, i.e. RGBA8 for storage, that’s 10 GB of raw data for 150 frames of 4K, and if that fits in 13 MB as video, that’s about a 775x compression ratio. Not too shabby!
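
For reference, here's the back-of-envelope arithmetic behind those numbers, assuming "4K" means a 4096×4096 RGBA8 atlas page per frame (that's an assumption on my part):

```cpp
#include <cstdio>

int main() {
    const double bytesPerFrame = 4096.0 * 4096.0 * 4.0;               // RGBA8 atlas page
    const double rawBytes      = bytesPerFrame * 150.0;               // 150 frames
    const double rawGiB        = rawBytes / (1024.0 * 1024.0 * 1024.0);
    const double videoMiB      = 13.0;                                 // reported video size
    const double ratio         = rawBytes / (videoMiB * 1024.0 * 1024.0);
    std::printf("raw: %.1f GiB, compression: ~%.0fx\n", rawGiB, ratio);
    // prints roughly: raw: 9.4 GiB, compression: ~738x
    return 0;
}
```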

I think we’d all absolutely love you if you could make a small demo project with your modified shader and a sample asset and upload it. I’ve been watching that Volumetric Rendering thread since it started, and I think literally everyone wants a working example they can download.

I only use a modified shader with an RGB atlas. Sometimes I use a single channel for the whole effect, for example channel R only, and in the shader I multiply it by the desired color. I get alpha from adding the RGB channels. I don’t use temperature, fuel, or velocity maps, but you can create them in Maya if necessary.
On top of that there is animation of the object over time, for example its appearance and disappearance. The atlas really uses a very small amount of space.
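
If it helps, here's a rough sketch of that single-channel trick, with plain C++ standing in for the material math (the tint constant and function names are illustrative assumptions, not the actual shader):

```cpp
struct Vec4 { float r, g, b, a; };

// Turn a single-channel atlas sample into a colored, alpha-blended result:
// only channel R carries the effect, the shader tints it, and opacity is
// derived by adding the RGB channels together.
Vec4 ShadeFromAtlas(Vec4 atlasSample) {
    const float tintR = 1.0f, tintG = 0.45f, tintB = 0.1f;   // desired effect color (example)
    float intensity = atlasSample.r;                          // whole effect lives in channel R

    Vec4 out;
    out.r = intensity * tintR;
    out.g = intensity * tintG;
    out.b = intensity * tintB;

    // alpha from RGB addition (here G and B are empty, so it reduces to R)
    out.a = atlasSample.r + atlasSample.g + atlasSample.b;
    if (out.a > 1.0f) out.a = 1.0f;
    return out;
}
```
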
“RGBA8 for storage, that’s 10 GB of raw data” - you’re counting every pixel without compression.
In practice, with compression only about 5%-20% of the map is actually used.


An animated star looping sequence. Of course there were many optimization steps.
The next step I’m working on is building a cascading mip map: a few cubic textures (volume textures) for one simulation, similar to how it’s done in a Naiad sim.
Black compresses perfectly!
Video: Volume Texture - from video

I need to think about how to do it right.
If I don’t assemble everything properly, a million questions will arise about what is what and how to connect it. After a while I will try to put everything together and describe how it works.

Happy to go through it and ask questions and tidy it up for you. It doesn’t have to be documentation-ready or anything.

An example project is indeed useful :)

We have our own plans to make a custom raymarcher and volume shaders, but it’ll be nice to see this in general.

Hi. Just wanted to say that I would also highly appreciate the volume texture (and sequence) export feature, as I use this a lot in my work. For now, we render a sequence of volume textures from Houdini and play it back through Blueprints (swapping the texture each frame). Needless to say, this would be a much faster process using EmberGen.
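
For anyone curious, here's a minimal sketch of that swap-the-texture-each-frame playback written as a UE4 C++ actor instead of Blueprints. The "VolumeTex" parameter name, the frame-rate handling, and the member layout are assumptions on my part, not our actual setup:

```cpp
// VolumeSequenceActor.h (condensed sketch)
#include "GameFramework/Actor.h"
#include "Engine/VolumeTexture.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "VolumeSequenceActor.generated.h"

UCLASS()
class AVolumeSequenceActor : public AActor
{
    GENERATED_BODY()
public:
    AVolumeSequenceActor() { PrimaryActorTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere) TArray<UVolumeTexture*> Frames;   // exported volume texture sequence
    UPROPERTY(EditAnywhere) float FrameRate = 30.0f;
    UPROPERTY() UMaterialInstanceDynamic* MaterialInstance = nullptr;  // raymarching material

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (Frames.Num() == 0 || MaterialInstance == nullptr)
        {
            return;
        }

        PlaybackTime += DeltaSeconds;
        const int32 FrameIndex = FMath::FloorToInt(PlaybackTime * FrameRate) % Frames.Num();

        // Swap the current frame's volume texture into the material each tick.
        MaterialInstance->SetTextureParameterValue(TEXT("VolumeTex"), Frames[FrameIndex]);
    }

private:
    float PlaybackTime = 0.0f;
};
```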

Hi. I’ve managed to get the motion vectors ‘working’ in UE4, but yeah, it took some jiggery-pokery. I also had to invert the green channel of the motion vector texture for it to work the way I needed. That could also have been due to the way I set up the material.

Looking forward to the EmberGen Master Material so I can see how you’ve got it all set up for smooth motion vector blending.
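
In case it's useful to anyone else, here's roughly how that motion-vector blend with the flipped green channel looks, as a plain-C++ sketch of the usual flipbook motion-blending math (the sampler stubs and names are illustrative assumptions, not the actual EmberGen material):

```cpp
struct Vec2 { float x, y; };
struct Vec4 { float r, g, b, a; };

// Placeholder fetches standing in for flipbook texture samples (illustrative only).
Vec4 SampleFrame(int /*frame*/, Vec2 /*uv*/)  { return { 1.0f, 1.0f, 1.0f, 1.0f }; }
Vec2 SampleMotion(int /*frame*/, Vec2 /*uv*/) { return { 0.5f, 0.5f }; }  // 0.5 encodes zero motion

Vec4 BlendWithMotionVectors(Vec2 uv, float frameTime, float strength)
{
    int   frameA = (int)frameTime;
    int   frameB = frameA + 1;
    float t      = frameTime - (float)frameA;                 // blend factor between frames

    // Decode motion vectors from [0,1] to [-1,1]; the G channel is flipped here,
    // matching the inversion that was needed in UE4.
    Vec2 mvA = SampleMotion(frameA, uv);
    Vec2 mvB = SampleMotion(frameB, uv);
    Vec2 offA = {  (mvA.x * 2.0f - 1.0f) * strength, -(mvA.y * 2.0f - 1.0f) * strength };
    Vec2 offB = {  (mvB.x * 2.0f - 1.0f) * strength, -(mvB.y * 2.0f - 1.0f) * strength };

    // Warp frame A forward and frame B backward along their motion, then crossfade.
    Vec4 a = SampleFrame(frameA, { uv.x + offA.x * t,          uv.y + offA.y * t });
    Vec4 b = SampleFrame(frameB, { uv.x - offB.x * (1.0f - t), uv.y - offB.y * (1.0f - t) });

    Vec4 outColor;
    outColor.r = a.r + (b.r - a.r) * t;
    outColor.g = a.g + (b.g - a.g) * t;
    outColor.b = a.b + (b.b - a.b) * t;
    outColor.a = a.a + (b.a - a.a) * t;
    return outColor;
}
```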

We’re a new startup specializing in animation within UE4, and we’re very curious about volumetric data being imported into engines such as UE4. How do you think it will be implemented, and is there a potential timeline for when to expect it?

We could potentially have our own 3D texture format, and we’ll most likely write our own volumetric raymarching shader. What we do with our own rendering methods will partly depend on what UE 4.26 provides with volumetric clouds and such. A timeline for this is likely within the next few months.

That’s cool! Yeah, we were looking at the new cloud system in 4.26 as well. Thanks for the update! All very interesting. We’re really trying to find ways to get high-quality smoke and visual effects data into UE4, and this software recently popped up. It looks great! We’ll be keeping our eye on the volumetric stuff for sure!

Nick, volumetric clouds are already available in 4.25. Also, why go with your own format when Unreal has had its own format for a long time now? A lot of people already have working volumetric shaders based on that format. The volumetric clouds feature also uses Unreal’s own volume texture format.

Because we might be able to come up with something that uses less memory and compresses better. We will of course support the format everyone else is using, but that may not be viable for production use in real-time, in games. Even for the Fortnite cinematic, when they used Ryan Brucks’ simulation stuff, it was only for that: cinematics.

Yep, that makes sense. I’m also working on cinematic stuff. I’m not saying you shouldn’t have your own format if it’s better, but existing solutions should be supported too.

Yes, as I alluded to, we will support the current formats out there, as well as our own when we get to it, if we can make something better. We’re the masters of real-time volumetric rendering and I think we can get something great out there. But if UE4 ends up with incredible volumetric rendering, then we’ll probably just pipe off of that instead. Who knows… it’ll be a little while before we look into it.

Any new updates on UE and Niagara usage?

Still a while before we’ll get to it. Focusing on the standalone right this second.