My Stages of Animation Production using VRoid Studio and Unity

What are the normal stages in creating a 3D animated video? There are variations depending on whether you are making a full-length movie, a short film, a TikTok video, or (as in my case) a motion web comic series, but there is more in common than different. So here are the phases of animation production according to others, with my own slant based on my experiences using VRoid Studio and Unity 3D for animation.

What Others Do

There are many blogs on processes used by others which are worth a read.

Animation Production: A Step-By-Step Guide to Making a 3D Animated Film: This post introduces 20 steps:

  • Pre-production: story, storyboarding, editorial (phase 1), visual development, pre-visualization
  • Production: modeling, surfacing, rigging, layout / set dressing / anim prep, character animation, crowds, character effects / simulation, FX, technical director, matte painting, lighting
  • Post-production: compositing, music and sound design, editorial (phase 2), and color grading

This list most closely reflects what I have done, with a few variations: I do color adjustments inside Unity (avoiding a separate color grading stage), and I create a web motion comic without sound (avoiding the audio stage).

What Are the Steps Involved in Creating an Animated Film? This has a similar list, with a few variations.

  • Pre-production: brainstorming, scripting, concept art, story boarding, previsualization
  • Production: layout, modeling, texturing, lighting, rigging, animation, rendering, voice-over
  • Post-production: compositing, sound editing, video editing.

What is an Animation Production Pipeline? This one lists a few different types of animation (hand drawn, 2D, etc.) which I don’t focus on at the moment. I chose to use 3D in Unity, so I am sticking to that area.

  • Pre-production: concept creation, storyboards, rough script reel, character creation
  • Production: 3D modeling, surfacing, rigging, character effects, matte painting, lighting, rendering
  • Post-production: animation, composition, sound design, color grading

How to Create an Animated Short Film shares a storybook writer’s experience of turning their physical book into an animated video. This is not 3D animation, but it was an interesting, different approach.

  • Story, character creation, storyboard, animatics, backgrounds, dope sheets, rough animation, clean-up, inbetweening, background digital inking, character inking, compositing.

8 Steps of the 3D Animation Production Pipeline. Only 8 steps? Does that make it easier? I wish!

  • 3D layout design, 3D modeling, 3D texturing, 3D rigging, 3D animation, VFX, lighting, rendering.

So what are my stages? The following list is geared around my use of VRoid Studio to create characters, the Unity Asset Store to buy 3D models of buildings and props, and Unity to put it all together to create animated video clips. It reduces some of the steps I need to take.

My Series Pre-production

I am creating a series of episodes (not a single film). There are several reasons, one being that I can publish smaller units of content as I go! Creating a single long film with no results until the very end is too daunting a task for me. So there are some activities that I started before creating the first episode. Later episodes also require some series-level efforts, but the results are reused across episodes.

  • Story and plot design: I start with an overall story that I want to tell. Fancy graphics with a bad storyline won’t keep people engaged. I watch a bit (too much?) anime, so I have gone with a series of episodes that form an arc (common in anime series), which allows me to keep episodes shorter. I also include character development in this stage – their personalities, quirks, background story. The overall series then involves these characters developing over time.
  • Character modeling: I use VRoid Studio to create characters, which greatly simplifies the whole process. It handles all the rigging, texture mapping, and blend shapes for facial animation. Using VRoid Studio does limit my artistic freedom in character design; other tools, such as Blender, exist if you want different-looking characters. After creating a base character, I then extend it inside Unity to add special effects.
  • Location modeling: For backgrounds, I use a combination of 3D models purchased from the Unity asset store and terrain generation tools inside Unity. I created a number of locations up front, then plan to create new locations as required over time. Because I am creating a series, I will reuse the same locations many times in different ways. Other animated productions may hand draw backgrounds. I don’t do that (I cannot draw!). I create 3D models of locations and use them.

My Episode Pre-production

  • Scriptwriting / screenplay writing: I start with a script. Having an interesting story to tell is the start of the process. I include a first pass of storyboarding, written as text, in the script (e.g. frontal mid shot of Sam). Later, when I start building an episode inside Unity, I do a storyboarding pass first. I can then use Unity to try different camera positions without me having to draw any pictures.

My Episode Production

  • Shot assembly (rough cut): Once I have a script, I then start assembling shots according to the script. I start with speech bubbles for all shots, which gives me a chance to review the script again, then drop characters into locations, which lets me start finding good camera positions (this is effectively storyboarding, but using Unity to do all the hard work). I may even render the result and view it as an end product, to get the overall feel of pacing and timing.
  • Posing and Animation: After I have a rough cut of the overall script, I go back and flesh out the animation in each shot. This includes using existing animation clips, custom clips, and free webcam-based motion capture options. There are some pretty impressive AI-driven solutions hitting the market these days, frequently still rough around the edges, but definitely improving. I also include posing here, as animation can often be achieved by blending between poses (including posed facial expressions).
  • Cinematography: This includes camera control (shot composition, panning, camera special effects). Unity provides a number of tools here to produce some rather nice looking shots.
  • Lighting and Visual effects: I don’t use a lot of visual effects because of the nature of the series I create, but I do create some effects, such as rain on a window, as required by an episode. I then reuse them across episodes for consistency. I include lighting here because it also helps make a shot “look good” – for example, a sunrise streaming through clouds. I also include comic speech bubbles here.
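The pose-blending idea mentioned under Posing and Animation can be sketched in a few lines. This is just an illustration (the joint names and angles are made up, and real engines such as Unity blend rotations with quaternion interpolation rather than raw angles):

```python
# Sketch of pose blending: many simple animations are just interpolations
# between two stored poses. Joint names and angles here are made up.
def blend_poses(pose_a, pose_b, t):
    """Linearly interpolate each joint angle; t=0 gives pose_a, t=1 gives pose_b."""
    return {joint: (1 - t) * pose_a[joint] + t * pose_b[joint]
            for joint in pose_a}

neutral = {"elbow": 0.0, "shoulder": 10.0}
wave    = {"elbow": 90.0, "shoulder": 45.0}

halfway = blend_poses(neutral, wave, 0.5)
print(halfway)  # {'elbow': 45.0, 'shoulder': 27.5}
```

Animating `t` from 0 to 1 over a second or two is enough to move a character smoothly from one pose to the other, which covers a surprising amount of everyday character animation.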

My Episode Post-Production

  • Video editing: I do some video editing, but I try to generate ready-to-ship video clips from inside Unity to minimize the amount of work I need to do externally. I use automated scripts, when needed, to join video clips together.
  • Publishing: Because existing publishing platforms don’t support my form of motion web comic, I had to develop my own website for viewing episodes. I also publish short videos or similar on social media platforms. I also include discovery and monetization in this bucket.
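The clip-joining step mentioned under Video editing can be scripted around ffmpeg's concat demuxer. A minimal sketch (the clip filenames are placeholders, and it assumes ffmpeg is available on the system):

```python
# Sketch of automating the "join video clips" step with ffmpeg's concat demuxer.
# Clip filenames below are placeholders for clips rendered out of Unity.
from pathlib import Path

def build_concat_command(clips, output, list_file="clips.txt"):
    """Write the concat list file and return the ffmpeg command to run."""
    lines = [f"file '{clip}'" for clip in clips]
    Path(list_file).write_text("\n".join(lines) + "\n")
    # -c copy avoids re-encoding, so the join is fast and lossless.
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file, "-c", "copy", output]

cmd = build_concat_command(["shot01.mp4", "shot02.mp4"], "episode01.mp4")
print(" ".join(cmd))
```

Running the returned command (for example with `subprocess.run(cmd, check=True)`) performs the actual join.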

Things I Don’t Do

There are some activities others have mentioned that I do not do.

  • Concept art: I am not an artist. It might be a great idea, but it’s simply beyond my skill!
  • Visual development: I am using VRoid Studio for characters and Unity for rendering, so there is less choice over design. Unity supports the ability to create your own “shaders”, which can be used to create a particular look and feel, but it’s a lot of work to change everything to use them, so I have generally stayed away from custom shaders. As a result, there is not much I can control in this area. It is more a matter of picking assets from the Unity Asset Store (and other 3D model stores) that make sense being used together. Creating my own 3D models for locations was beyond what I wanted to tackle.
  • Dope sheets: I don’t use dope sheets. I work out the timing of a shot when animating it.
  • Compositing: I animate characters directly in 3D locations inside Unity, so there is no need for a separate compositing phase to combine different elements of the final drawing. I also do all visual effects inside Unity for the same reason. I could use After Effects or similar, but decided not to in order to keep my workflow simpler.
  • Color grading: I don’t do independent color grading – I do it as part of lighting of a shot inside Unity. Check out Shoost by Muro if you want to see what is possible in this area however! Very cool stuff!
  • Background inking / character inking / inbetweening: These are more tasks for 2D animation. Because I am using 3D location sets, these tasks are all done for me. The effort required for background creation should not be underestimated!
  • Audio: I purposely decided to avoid audio, at least for now, as it’s not needed for a web motion comic. I did try audio on a previous project, but I found using friends for voice acting difficult to coordinate when production runs over an extended period of time. I am keeping an eye on computer voice synthesis. It’s still too dry for me (not enough emotion), but there are signs that it is improving.
  • Rendering: Unity provides all the tools here.

Later additions

I found the following after first publishing this article.

  • Bloop Animation has a number of videos, including How to Make an Animated Short Film (it mentions a free course, but when I looked on the site it might now be a paid course – the video was quite old)
