I am having problems matching the frame rate of the rendered simulation from EmberGen and my Blender animation.
Here are the details:
I created a simple camera animation in Blender spanning frames 0–300. The frame rate in Blender is set to 24fps with a frame size of 720×576. I then rendered a viewport animation, which will be used as a camera backplate in EmberGen.
I imported the camera animation into EmberGen and confirmed that the animation length and frame rate match the Blender rendered output. In the camera options I loaded the backplate sequence above, checked the scene preview through the imported camera, and it matches perfectly. In the camera node I also made sure the imported frame rate is 24fps.
In the Export Image node I switched the export mode to Sequence, set the number of frames to match the length of the Blender animation, and matched the final output image size.
I then rendered out a sequence and imported it into After Effects along with the backplate sequence. All footage interpretation and composition settings were set to 24fps.
Here’s where the problems start: the output sequence from EmberGen does not sync up with the Blender sequence. The EmberGen sequence lags considerably. I have tried to marry up the two sequences by sliding the simulation in the timeline and by changing the frame rate of the EmberGen sequence. If there is no fix for this, can I render the simulation AND the backplate together?
Any thoughts? Thank you
EDIT: Solved! EmberGen runs at 60fps internally, so I interpreted the imported EmberGen sequence in After Effects at 60fps, left the Blender sequence at 24fps, and they sync perfectly!
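For anyone hitting the same wall: the lag is plain frame-rate arithmetic. A sequence of N frames lasts N/fps seconds, so interpreting a 60fps export at 24fps stretches every event 2.5× later in time. A minimal Python sketch (the frame numbers here are illustrative, not from my project):

```python
def frame_time(frame: int, fps: float) -> float:
    """Wall-clock time (in seconds) at which a given frame plays."""
    return frame / fps

# The same 300-frame sequence, interpreted at two different rates:
print(frame_time(300, 24.0))  # 12.5 seconds of playback at 24fps
print(frame_time(300, 60.0))  # 5.0 seconds of playback at 60fps

# An event simulated at EmberGen frame 120 (60fps clock) occurs at 2.0 s.
# Misinterpreting that same sequence at 24fps pushes it out to 5.0 s,
# which shows up as the simulation "lagging" behind the backplate:
lag = frame_time(120, 24.0) - frame_time(120, 60.0)
print(lag)  # 3.0 seconds of apparent lag
```

So the fix isn’t retiming the footage at all, just telling After Effects the rate the frames were actually authored at.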
Time for coffee…