The Problem 📝
Animation is a time-consuming practice. Even with advancements in technology lowering the barrier to entry, creating quality animation in 3D continues to be inefficient. I routinely found that I needed a way to produce animated content quickly and efficiently without a large team or budget. So naturally, I began researching alternatives to mainstream industry workflows, and over time developed one of my own, using VR as the starting point.
This short animation, titled Butterfly, was created using this workflow. I'll be breaking down the process in this blog post.
Quill by Smoothstep
I was walking about with my partner when she stopped to admire an interesting flower. It was this wonderfully mundane moment that I couldn't help but photograph. Later, I did a quick paint-over on my phone using Procreate. I went hard and fast with this: no precision, just vibes. This took about 20 minutes.
It didn't seem worth it to design a new character just for a standalone render, so I decided to re-use this one from a previous illustration.
Her name’s Tife (Pronounced Tee-Feh), she’s fun.
With some concept art to set the end goal, I spent somewhere around four hours building Tife’s rig. Using Quill, I constructed her model, matching it as closely to the 2D illustration as possible.
Quill doesn’t have a traditional rigging system: there are no bones and no skinning. Instead, I have her body split into several layers according to the image below. It’s very stop-motion-ey.
I also made one for the butterfly because why not.
If you’d like to know more about this process, animator Daniel Martin Peixe did a great stream breaking down how the rigging works in Quill.
Anyways, with the rig ready, I can get to animating.
Don’t let the GIF fool you, this took hours.
I animate on 3’s, holding each drawing for three frames, because the lower framerate helps echo the stop-motion aesthetic a bit. And also because I’m lazy.
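As a side illustration (this helper isn’t part of the actual Quill workflow, just a sketch of the idea), animating “on 3’s” simply means each drawing is exposed for three consecutive frames, so at 24 fps you only draw 8 poses per second:

```python
def drawing_for_frame(frame, step=3):
    """Index of the drawing shown on a given frame when animating
    'on N's', i.e. each drawing is held for `step` frames."""
    return frame // step

# At 24 fps, animating on 3's means 24 / 3 = 8 new drawings per second.
drawings_per_second = 24 // 3

# Frames 0-2 show drawing 0, frames 3-5 show drawing 1, and so on.
exposure = [drawing_for_frame(f) for f in range(9)]
```

Animating on 2’s or 1’s works the same way with `step=2` or `step=1`; the bigger the step, the fewer drawings and the chunkier the motion.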
But yeah, with animation complete, we can now export.
Lighting and Rendering
Like in regular 3D animation, I animate the character first and then add lighting later on. However, because I’m using Quill, some considerations need to be made.
I’ll need to explain the sausages.
So because Quill models are built out of individual, unlit brush strokes, you might paint something in VR that looks like this:
But then you take it into Blender, where there’s lighting, and it looks like something from your favourite nightmare:
Yeah. We don’t want that.
To fix this, I had to make custom normals. Tife’s model was built with this in mind, so the brush strokes are painted in ways that look appealing once shading is applied.
Unfortunately, upon importing Tife into Blender, I had to fix face orientations: I did a bunch of mirroring when constructing her model, and I am paying for it now.
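Why does mirroring break face orientations? Negating one axis flips the winding order of every triangle, so its geometric normal points the wrong way. Here’s a minimal sketch in plain Python (not Blender’s actual API) showing the flip and the fix, which is what Blender’s Recalculate Outside effectively does:

```python
def triangle_normal(a, b, c):
    """Unnormalised face normal via the cross product (b - a) x (c - a)."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

tri = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]           # faces +Z
mirrored = [(-x, y, z) for (x, y, z) in tri]       # mirrored across the YZ plane

original = triangle_normal(*tri)                   # points along +Z
flipped = triangle_normal(*mirrored)               # now points along -Z: inside-out

# The fix: reverse the vertex winding so the normal faces outward again.
fixed = triangle_normal(mirrored[0], mirrored[2], mirrored[1])
```

In Blender itself this is just Select All, then Mesh > Normals > Recalculate Outside in Edit Mode.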
After that, all that’s left is to add a Data Transfer modifier to Tife, using a sphere as the source. It’s very similar to what games do to shade dense tree foliage. Here’s a quick tutorial by @ninetydev. It’s a neat trick.
Once I had that set up, I could tweak the custom normals to strike a balance between smooth shading and a hand-painted look. I exported the animation in pieces so I could adjust the normals on a case-by-case basis, which helps me fine-tune how Tife receives lighting.
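Conceptually, transferring normals from a sphere gives every vertex the normal of the nearest point on that sphere, which for a sphere is just the direction from its center to the vertex; blending that with the stroke’s own normal is roughly what the modifier’s mix factor controls. This plain-Python sketch is my own illustration of the idea, not Blender’s implementation:

```python
import math

def sphere_normal(vertex, center):
    """The normal a sphere source assigns to this vertex: the unit
    vector pointing from the sphere's center toward the vertex."""
    d = [vertex[i] - center[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in d))
    return [c / length for c in d]

def blend_normals(painted, spherical, factor):
    """Lerp between the stroke's own normal and the sphere normal,
    then renormalise (factor 0 = original, 1 = fully spherical)."""
    mixed = [(1 - factor) * p + factor * s for p, s in zip(painted, spherical)]
    length = math.sqrt(sum(c * c for c in mixed))
    return [c / length for c in mixed]

# A vertex above the sphere's center picks up a straight-up normal:
n = sphere_normal((0.0, 2.0, 0.0), center=(0.0, 0.0, 0.0))

# Halfway between a sideways painted normal and the spherical one:
half = blend_normals((1.0, 0.0, 0.0), n, factor=0.5)
```

Pushing the factor toward 1 gives the smooth, balloon-like shading; pulling it back preserves more of the painterly stroke direction.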
Next, I took things into Unity to do a lighting pass. There’s no need to build the background in 3D since, apart from some noise, the camera doesn’t really move. Another reason is that I’m lazy.
All that’s left here is to use a bit of EbSynth to push the style further. I painted a couple of keyframes in Photoshop, and Adam composited them into the final video.
There are still a few kinks to work out with this workflow: the cleanup process can be tedious, and renderers like Blender’s Cycles aren’t very kind to models made with this technique.
However, I’m confident that this is a viable way to make animation. With a bit of ingenuity, we’re able to create quality art while working very quickly.