Blender Rendering Pipeline on AWS

Posted by Michael Davie on Friday, October 20, 2023

[Animated GIF: steaming tea render]

My son has gotten interested in 3D animation with Blender, but his low-grade laptop is too underpowered to render any reasonably complex animation in a sane amount of time. He finally agreed to try out rendering in the cloud, and this post describes what I ended up building.

I came across a promising workshop from AWS that used AWS Batch to split a rendering job across multiple compute nodes. This seemed to be exactly what we needed (really, they had me at Step Functions), so I worked through the workshop steps and got the solution up and running fairly easily.

After testing out the existing code, I decided to make a few improvements.

Below is a high-level view of the new project:

[Figure: architecture diagram of the rendering pipeline]

The basic flow of the pipeline is as follows (rough sketches of several of these steps appear after the list):

  1. A .blend or .zip file is uploaded to the input bucket in Simple Storage Service (S3).
  2. An EventBridge rule triggers a state machine in Step Functions, which executes the following steps:
    1. A Lambda function extracts the .zip file (if required) and writes the project file(s) to Elastic File System (EFS).
    2. A Lambda function analyzes the .blend file and determines how many frames need to be rendered.
    3. A Batch job is created and executes container-based tasks in Fargate, each of which renders a single frame and writes it to EFS.
    4. A single-task Batch job is created to stitch the frames together into a movie file, which is uploaded to the output bucket in S3.
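
To make step 2 concrete, here is a minimal sketch of how an EventBridge rule could be pointed at the state machine. The actual project wires this up in its infrastructure code; the bucket name, rule name, and ARNs below are placeholders, and the input bucket needs EventBridge notifications enabled for the `Object Created` event to fire.

```python
import json

import boto3

# Hypothetical sketch: trigger the Step Functions state machine whenever an
# object lands in the input bucket. All names and ARNs are placeholders.
events = boto3.client("events")

events.put_rule(
    Name="blender-input-uploaded",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Created"],
        "detail": {"bucket": {"name": ["my-render-input-bucket"]}},
    }),
)

events.put_targets(
    Rule="blender-input-uploaded",
    Targets=[{
        "Id": "start-render-pipeline",
        "Arn": "arn:aws:states:us-east-1:123456789012:stateMachine:RenderPipeline",
        "RoleArn": "arn:aws:iam::123456789012:role/EventBridgeInvokeStepFunctions",
    }],
)
```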
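
For step 2.2, one way a Lambda function can discover the frame range is to run Blender headless against the .blend file with a short `--python-expr`. The sketch below assumes a `blender` binary is available in the function's container image and that the file has already been written to EFS; the project's actual code may do this differently.

```python
import json
import subprocess

# Ask a headless Blender for the scene's start and end frames.
FRAME_RANGE_EXPR = (
    "import bpy, json; "
    "s = bpy.context.scene; "
    "print(json.dumps({'start': s.frame_start, 'end': s.frame_end}))"
)

def get_frame_range(blend_path: str) -> dict:
    """Return {'start': ..., 'end': ...} for the scene in blend_path."""
    result = subprocess.run(
        ["blender", "--background", blend_path, "--python-expr", FRAME_RANGE_EXPR],
        capture_output=True, text=True, check=True,
    )
    # Blender prints a lot of startup noise; grab the last JSON-looking line.
    for line in reversed(result.stdout.splitlines()):
        if line.startswith("{"):
            return json.loads(line)
    raise RuntimeError("could not find frame range in Blender output")
```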
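
For step 2.3, AWS Batch array jobs expose each child task's index through the `AWS_BATCH_JOB_ARRAY_INDEX` environment variable, so every container can map that index onto a frame number and render just that frame. The paths, the `START_FRAME` variable, and the output format below are illustrative assumptions, not the project's actual interface.

```python
import os
import subprocess

def render_one_frame() -> None:
    # Map the Batch array index (0-based) onto the scene's frame numbers.
    index = int(os.environ["AWS_BATCH_JOB_ARRAY_INDEX"])
    start_frame = int(os.environ.get("START_FRAME", "1"))
    frame = start_frame + index

    blend_path = "/mnt/efs/project/scene.blend"    # assumed EFS mount point
    output_pattern = "/mnt/efs/frames/frame_####"  # Blender fills in the frame number

    subprocess.run(
        [
            "blender", "--background", blend_path,
            "--render-output", output_pattern,
            "--render-format", "PNG",
            "--render-frame", str(frame),
        ],
        check=True,
    )

if __name__ == "__main__":
    render_one_frame()
```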
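
And for step 2.4, the stitching task is essentially an ffmpeg invocation over the numbered frames followed by an upload to the output bucket. Again, the bucket name, paths, and encoding settings here are assumptions rather than the project's actual values.

```python
import subprocess

import boto3

def stitch_and_upload(output_bucket: str = "my-render-output-bucket") -> None:
    # Combine the numbered PNG frames on EFS into an MP4, then upload it to S3.
    frames_pattern = "/mnt/efs/frames/frame_%04d.png"  # matches frame_0001.png, ...
    movie_path = "/tmp/render.mp4"

    subprocess.run(
        [
            "ffmpeg", "-y",
            "-framerate", "24",
            "-i", frames_pattern,
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",  # broad player compatibility
            movie_path,
        ],
        check=True,
    )

    boto3.client("s3").upload_file(movie_path, output_bucket, "render.mp4")

if __name__ == "__main__":
    stitch_and_upload()
```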

And here’s the rendered output in GIF form:

[Animated GIF: steaming tea render]

The code for this project is available on GitHub.