I have downloaded the latest beta of EmberGen and have resumed my testing from Spring 2021.
My questions back then asked whether EmberGen could load Alembic or FBX so we could bring in animated meshes from Maya, with the goal that the output VDB would line up with the animated characters in Maya and render correctly in Redshift.
Today I was able to export my character from Maya into EmberGen via the “import” node using the Alembic file format.
I can see the character animating correctly. The next step would be to attach an emitter to one of the character’s arms or a prop so that I can have the fire/smoke emit with the movement of the character.
Say the character is holding a cigar or a flamethrower. I would want the emitter to move with the geometry so that everything lines up and the fire feels like it is naturally emitting from the animated source.
I assumed I would just import my character and then import a second piece of geometry to act as the emitter inside the barrel. The first snag: EmberGen says only one import geometry is allowed. Is this a beta/demo limitation?
How would I get an emitter to animate on a specific part in my alembic cache?
In Maya the Alembic cache retains the object tree, so I can parent or constrain the emitter to whichever part of the geometry I choose. In EmberGen I do not see how to attach an emitter to any specific part, since the entire import is contained in the import node. I can see how this would be useful for collision in general, but I have no way of attaching anything to a specific part of a character or prop.
How is this type of emitter <-> import geometry supposed to work in Embergen?
We can’t attach emitters to transforms yet. One mesh isn’t a demo limitation; it’s a GPU limitation right this second for everyone. With FBX you can select bones and their subsequent vertices, or vertex paint. I don’t think Alembic imports give you any control over what it’s emitting from in EmberGen at this time.
Ah ok, yes I was using alembic not FBX. I will adjust this so I can attach it to a particular joint/bone.
Thanks for the feedback. Much appreciated.
Ok, I did successfully get a basic cigar fbx geometry with joints into Embergen from Maya.
I did not see a way to expose the joints/bones, but I did emit from the tip geometry and produced a pretty nice VDB output that Redshift rendered nicely.
The only real trouble here is that the vdb volume when loaded into Maya does not line up with the original geometry.
When I export a FumeFX VDB, I usually have to rotate the VDB volume grid 90 degrees on two axes to get everything to line up.
This seems to be true with EmberGen as well, but here the volume is also off in scale and translation.
It looks like I have to scale the sim up 5x once back in Maya to get the proper scene scale. This seems arbitrary, as scale is usually relative: no matter what scale Maya and EmberGen use, the VDB produced should still be relative to the scene scale once back in Maya, unless there is a setting somewhere I need to check.
Rotation also seems to need the two 90-degree adjustments, but this time I am also seeing an arbitrary offset that has no relative value I can see.
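For reference, the two 90-degree rotations and the 5x scale I ended up applying can be composed into one correction matrix and applied in a single step. The axis order (X then Y) and the 5x factor here are assumptions from my own scene, not anything EmberGen documents; a minimal pure-Python sketch of the composition:

```python
import math

def rot_x(deg):
    # Row-major 3x3 rotation matrix about the X axis.
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(deg):
    # Row-major 3x3 rotation matrix about the Y axis.
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def scaled(f, m):
    return [[f * v for v in row] for row in m]

# Assumed correction: rotate 90° on X, then 90° on Y, then scale 5x.
correction = scaled(5.0, matmul(rot_y(90), rot_x(90)))

# Apply it to a sample point in EmberGen space to see where it lands.
point = [1.0, 0.0, 0.0]
fixed = [sum(correction[i][j] * point[j] for j in range(3)) for i in range(3)]
```

Once the single matrix is known, the same rotate/scale values can be typed onto the VDB container's transform in Maya instead of tweaking axes one at a time.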
Any info available on how the VDB output to Maya/Redshift is supposed to work?
Thanks. The vdb output is very smooth and detailed though so this is very promising.
One more question.
I imported the FBX file and set the time control in simulation mode to 24 Hz, which matches the FBX timing in Maya. What I would like is to set the Hz to somewhere in the 12–16 range for the behavior of the smoke and fire speed, without changing the 24 fps timing of the FBX animation. In other words, the geometry animation should still take the 300 frames it takes in Maya, but with the smoke behavior of 12 Hz over that same 300-frame animation. Currently, changing the setting to 12 Hz to get the behavior right also changes the pace of the animating geometry: it now takes only 150 frames to complete the same movement.
Is this possible? That is, to separate sim behavior speed from the imported animation's frame length?
I wouldn’t try to retime with Hz, as it will affect your actual simulation and is typically not what you want. Keep the Hz at 60 and retime by exporting more or fewer frames. See this: EmberGen Tutorial: Creating a Live Action Torch Shot with Animated Meshes and Camera Imports - YouTube
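For what it's worth, the halving described above falls straight out of the frame arithmetic (this is just a sketch of the math, not EmberGen's internals): the number of sim steps covering an animation is its wall-clock duration times the sim rate, so dropping from 24 Hz to 12 Hz halves the steps over the same 12.5 seconds. The 60 Hz stride at the end is only an illustration of the "export more or fewer frames" approach:

```python
def sim_frames(anim_frames, anim_fps, sim_hz):
    # Simulation steps spanning the same wall-clock duration as the animation.
    return anim_frames / anim_fps * sim_hz

# 300 frames at 24 fps is 12.5 s of animation. Stepped at 12 Hz that is
# only 150 sim frames, so the geometry moves through its motion in half
# the frame count - exactly what was observed.
halved = sim_frames(300, 24, 12)     # 150.0

# Keeping the sim at 60 Hz preserves the animation's timing (750 steps
# over the same 12.5 s); retiming then happens at export, e.g. one
# output frame per 60 / 24 = 2.5 sim steps to land back on 24 fps.
full_rate = sim_frames(300, 24, 60)  # 750.0
export_stride = 60 / 24              # 2.5
```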
As for the scaling/offsets etc., everyone is facing this problem. EmberGen doesn’t really deal in any scale right this second, so there is nothing you can do to make it work other than eyeballing it, unfortunately. We are working on a fix to use real-world units, though it’ll take time.