Graduate Portfolio - Technical Animation - Character Rig
Rigging
22/07/25 - 28/09/25
Introduction
This blog post documents the development of the technical animation character rig for the 3D character 'Terrax'. It covers the full journey: integrating the Substance Painter textures into Maya, creating the skeleton, painting skin weights, building the control rig, creating a pose library, setting up a control 'picker' and producing a test animation. The result is a full body and facial rig using a bone-based setup for easy integration into a game engine, which will ultimately be the end point for this project.
Project Management
At this moment in time there is still a significant amount of time left on the project, but also a significant amount of work. The character model and textures took longer than expected; however, the overall quality was good, so it was worth spending this time. For this module I will be producing three animated outcomes: a character rig demo reel, a character performance cutscene and gameplay animations with a character controller in Unreal Engine. This is therefore technically the start of the first animation outcome, and we are already two months into the seven months available. Largely this means the project is relatively on track; however, it is difficult to know how long the next phase will take, given that the previous section of the project overran to focus on quality. Furthermore, I am planning on taking a three-week summer break for wellbeing - this is important because I am working a lot in my teaching role and, outside of work, on this masters. However, I have five weeks off work over summer, meaning that I can work full time on my masters for two weeks, which will offset this time off.
The goal for this project is to produce an advanced character rig, and I am setting myself the target of completing it by the end of September. This module is a double module, so the end of September would mark the end of one module and the beginning of another. This would then give me the equivalent of one module's worth of time to complete the remaining animation outcomes and the reflective blog post. It is tight; however, it is looking achievable.
The above image shows the Trello board and the below image shows the Trello card for this outcome. It is broken down into clear tasks to funnel me through the process. This can be added to and broken down further as needed moving forward using an agile workflow.
Project and Material Setup
To set up the working rig file correctly a Maya project was created to organise all assets into a useable file structure. This would mean that assets like textures would be in the project's file structure and the Maya binary file would be able to easily reference them. This reflects professional workflows and file organisation. After this was set up the low poly assets were imported and materials applied to these so that they were ready for the textures to be applied. As you can see in the image below, Terrax has 3 materials and then there is another for the staff.
The video below (Abe Leal 3D, 2023) was used to understand the most optimal way for textures to be set up in Maya, specifically to be rendered in Arnold. This ultimately used a packed setup where the red, green and blue channels of a single texture carry the ambient occlusion, roughness and metalness maps. This reduces the texture count and ultimately improves performance and file management.
To preview the textures in the renderer I needed a lighting setup. I used the HDRI below (Savva & Guest, 2025) with a Skydome light to give me a simple lighting setup.
Below shows the Hypershade window for one of the material setups. All texture maps bar the colour map are set to the raw colour space. The ambient occlusion map (the red channel of the packed AO/Metalness/Roughness texture) is multiplied with the base colour map and the result is plugged into the base colour input. The green channel of the packed texture drives the roughness and the blue channel drives the metalness.
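For anyone wanting to script this rather than wire it by hand in the Hypershade, the MEL below is a minimal sketch of the same packed-texture setup. The node names and texture paths (M_Terrax_Body, tex_ORM, the file paths) are placeholders rather than the actual project assets, and it assumes Arnold's aiStandardSurface shader.

// Minimal sketch of the packed AO/Roughness/Metalness setup (placeholder names and paths).
string $mtl = `shadingNode -asShader aiStandardSurface -name "M_Terrax_Body"`;

// Base colour map, left in its default colour space.
string $col = `shadingNode -asTexture file -name "tex_baseColour"`;
setAttr -type "string" ($col + ".fileTextureName") "sourceimages/Terrax_Body_BaseColour.png";

// Packed map: AO in red, roughness in green, metalness in blue - set to Raw.
string $orm = `shadingNode -asTexture file -name "tex_ORM"`;
setAttr -type "string" ($orm + ".fileTextureName") "sourceimages/Terrax_Body_ORM.png";
setAttr -type "string" ($orm + ".colorSpace") "Raw";

// Multiply the AO (red channel) with the base colour and feed the result into base colour.
string $mult = `shadingNode -asUtility multiplyDivide -name "mult_AO_body"`;
connectAttr -f ($col + ".outColor") ($mult + ".input1");
connectAttr -f ($orm + ".outColorR") ($mult + ".input2X");
connectAttr -f ($orm + ".outColorR") ($mult + ".input2Y");
connectAttr -f ($orm + ".outColorR") ($mult + ".input2Z");
connectAttr -f ($mult + ".output") ($mtl + ".baseColor");

// Green channel drives roughness, blue channel drives metalness.
connectAttr -f ($orm + ".outColorG") ($mtl + ".specularRoughness");
connectAttr -f ($orm + ".outColorB") ($mtl + ".metalness");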
The below image shows a test render with all materials set up as previously described. This looks really good and matches the renders in Substance Painter, ultimately showing a successful texture setup. I love the stylised, 3D, cartoon-like look of the render and this fills me with confidence that the cutscene animation I will make will look great once the character is rigged and brought to life through animation.
The below image shows experimentation with the render settings to set up post-processing effects for the render so that the emissive maps could be rendered. I wasn't sure if this would be used but it was interesting to explore the setup of this.
Below shows the outcome of this. It worked quite well, although it is a little shiny, so I'm not sure if I will definitely use it, but it was a good experiment.
The next image below shows a mock up of how I imagined the staff to be positioned on the character. Again, I'm not sure if I will use the staff at all, however, it was fun to experiment with and imagine this as a video game character.
Research
One of the key pieces of research I wanted to explore on this project was Lake's (2024) book on technical animation. This provides an in-depth look at common rigging skills but also includes some scripting knowledge that I was hoping to explore. The goal was to build on my existing foundation of rigging and build more professional knowledge on the topic. Here are the things I learnt on first read that are relevant to my project:
Euler rotation order should be set to the axis that will be used the most (e.g. XYZ, YZX, etc.) to reduce issues during animation.
Euler rotations can cause gimbal lock. Quaternions do not, however they are not as user-friendly. Quaternions are best used “under the hood” rather than directly by animators.
ASCII files can be debugged as text-based files, making it easier to compare differences and identify issues.
Use standard naming conventions for consistency and clarity: A_ for animation sequences, BS_ for blendspaces, ABP_ for animation blueprints, SKEL_ for skeletons, SK_ for skeletal meshes, and PHYS_ for physical assets.
Default scene settings should use meters for units and an FPS of 30, even though the runtime will be 60 or 120 FPS.
Use transform limits to constrain values, such as preventing the elbow from rotating backwards. These limits can be set in the Attribute Editor under Limit Information (a short MEL sketch of this is included at the end of this list).
Maya uses a right-handed coordinate system and Unreal uses a left-handed system, so a -90 rotation on the X axis of the root is required to account for this difference.
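The two most immediately actionable points here, rotation order and transform limits, can also be set via MEL rather than through the Attribute Editor. A quick sketch with placeholder joint names and illustrative limit values:

// Rotation order is an enum: 0 = xyz, 1 = yzx, 2 = zxy, 3 = xzy, 4 = yxz, 5 = zyx.
setAttr "jnt_wrist_L.rotateOrder" 1;

// Stop the elbow rotating backwards; the same limits appear in the Attribute Editor under Limit Information.
transformLimits -enableRotationY 1 1 -rotationY 0 150 "jnt_elbow_L";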
Rig Development
I wanted to get stuck into developing the rig and learning as I went. I started by renaming all meshes using a common prefix. This would help with scene management when working with a large number of different digital assets and would enable me to easily tell which nodes are which.
Next I needed to place the joints and build the skeleton. I created the reference board shown below to understand where the joints were in a skeleton in relation to the muscles and skin. Since my character sculpt had good anatomy this process should be fairly seamless and I should be able to achieve good results.
The below image shows the initial development of the core skeleton. This would be later mirrored once it was set up correctly. At this point I was using the joint orient tool to ensure that all joints were oriented to the same axis. This should make for natural deformation in animation.
I noticed a problem with the finger orients not pointing in the direction that they should be and retaining some rotation values. This would cause issues later with inconsistencies in the rig's rotation values if it wasn't resolved. I unparented the finger bones; zeroed out the rotation values; reoriented the joint axes; rotated the joint chain back into the correct position using the main joint; froze transforms; and finally set the preferred angle.
To help make sure that bones in the hand were placed correctly I used the below two images (Swap, 2025) to help understand where the joints should be placed and rotate from. This was particularly useful for the metacarpals that exist within the palm and allow for the hand to curve.
23/07/25
Next, I wanted to make sure that all of the bones were oriented correctly and consistently, as I wasn't confident this was the case initially. The joint orientation needed to follow a clear standard: the Z axis pointing forward, the X axis pointing down the length of the bone, and the Y axis pointing horizontally from the bone. To achieve this, I unparented all of the bones and used aim constraints to align each joint towards the next bone in the hierarchy. This forced the joints into the correct orientation. Once this was done, I froze the transformations to move the physical rotation values into the joint orient attributes instead. As a result, all joints now have a rotation value of 0, with their orientation stored correctly in the joint orient values. The image below shows these orientations on one half of the character:
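A rough MEL sketch of this orientation pass, for a single pair of joints with placeholder names, is below. It assumes the X-down-the-bone convention described above; in practice this was repeated down each unparented chain and the exact up vector depends on the joint.

// Aim the joint's X axis at its child, then delete the temporary constraint.
string $aim[] = `aimConstraint -aimVector 1 0 0 -upVector 0 0 1 -worldUpType "vector" -worldUpVector 0 0 1 "jnt_elbow_L" "jnt_shoulder_L"`;
delete $aim;

// Freezing rotation on a joint pushes the values into jointOrient, leaving rotate at zero.
makeIdentity -apply true -rotate true "jnt_shoulder_L";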
These bones were then mirrored with the addition of some extra bones for the pauldrons and kneepads. The below image shows this:
I noticed an issue on the teeth and tongue where it appeared they were slightly off centre. I investigated this and resolved using repositioning and snapping tools.
Next I combined some of the meshes together to reduce the number of skinned meshes. I did this by material, combining all body meshes together, then everything on the head, and finally the eyes. I also created a bone for the weapon so that this could be animated within the same character rig too.
Face Rigging
Next I wanted to build the rig for the face and I found a paid course by Martin (2022). This appeared to offer a good bone-based system that I knew would easily translate into a game engine, therefore I knew I could learn a lot from this to develop a good facial rig system for my project. Here are some of the key learnings I gleaned from this source:
Group the control and then parent the control to that group. This ensures the group holds the transforms, while the curve itself does not (a MEL sketch of this offset-group pattern is included after this list). I want to double-check this against the workflows shared by Lake (2024).
Snap the controls to the joints.
Shape the curve once it is correctly positioned.
Parent the joint to the control.
When binding, bind to 'selected joints' only so that the end joints are not included.
When skinning, block in weights by painting with Replace set to 1.
When smoothing skin weights, lock everything except the two joints you want to blend between.
Mouth bones act as pivots inside the mouth so they rotate correctly to deform the relevant areas. There are end joints again, but these are not used, as the deformation is driven by the rotation of the internal bones.
Duplicate the jaw joint to allow the mouth bones to move at different percentages relative to the jaw. This avoids the issue where all mouth bones move down uniformly when the jaw moves down.
Mouth controllers have their transforms set up to mirror for better accessibility. For example, moving the left mouth corner to the right causes the right mouth corner to move in the opposite direction.
Use aim constraints with locators so the system knows which direction is “up.”
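To make the first few bullets concrete, this is roughly how that offset-group pattern looks in MEL, using placeholder names (ctrl_jaw, jnt_jaw) rather than anything from the actual scene:

// Build a control curve, wrap it in a group, and snap the group to the joint
// so the group holds the transforms and the curve stays clean.
circle -normal 1 0 0 -radius 2 -name "ctrl_jaw";
group -name "grp_ctrl_jaw" "ctrl_jaw";
matchTransform "grp_ctrl_jaw" "jnt_jaw";

// The tutorial parents the joint to the control; a parent constraint is the
// flatter, constraint-based alternative recommended by Lake (2024).
parentConstraint -maintainOffset "ctrl_jaw" "jnt_jaw";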
Next I wanted to compare this research with further research from Lake's (2024) book, but focussing a little more on the control rig setup. This would allow me to adapt what I was learning to best fit the needs of the project. This is what I learnt:
It is recommended to use 8 influences when binding.
Control rigs work by using group nodes to keep everything organised, with an additional group that houses the controllers.
Using a clear colour language is good practice: central controls are yellow, left controls are red, and right controls are blue. This follows aviation conventions. Some control rigs use green for right-handed controls; however, Maya’s selection colour is green, which can be confusing, so blue has been widely adopted instead.
Use a flat rather than nested control rig setup. In this approach, controls are connected using constraints rather than direct parent-child relationships, which offers more flexibility later on.
Setting up spaces can be useful, particularly for elements like weapons and hands.
Using offset parent groups aligns with the approach shown in the previous tutorial from Martin (2022).
Tomorrow I plan to use all this core knowledge to follow along with, but adapt, the Martin (2022) tutorials. One adaptation is that I will use the constraint-based control rig setup as opposed to the parented one, for greater flexibility further down the rigging pipeline.
24/07/25
The below image shows the creation of the core bones in the head, including the jaw. It also shows the control curves created for each of these bones (neck, head and jaw).
The image below shows the utilisation of drawing overrides to change the colour of the control curves. I will be using yellow for central controls, blue for left handed controls and red for right handed controls.
While rigging the face, I noticed a vertex issue causing non-manifold geometry, which had been introduced during the mirroring process. This was caused by too many vertices being merged too closely together. To resolve this, I removed the non-manifold geometry and cleaned up the mesh, which then allowed me to bind the head mesh to the facial bones correctly.
Once bound, I focused on weight painting, as shown in the image below. I started by blocking in weights with 100% influence and then gradually smoothing them using the 'add' command with a low value. This allowed me to softly build influence across the mesh. To maintain control, I locked all skin weights except for the two bones I was blending between. For example, in the image below I was blending influence between the head and jaw bones. Locking the remaining joints prevented Maya from assigning unwanted influences and helped keep the deformation clean and predictable.
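The same lock-and-flood approach can also be scripted; the MEL below is a rough sketch under assumed names (skinCluster1, head_geo, jnt_head, jnt_jaw) rather than my exact scene.

// Every influence bound to a skinCluster gains a lockInfluenceWeights (liw) attribute.
// Lock them all, then unlock only the two joints being blended between.
string $inf[] = `skinCluster -q -influence "skinCluster1"`;
for ($j in $inf) {
    setAttr ($j + ".liw") 1;
}
setAttr "jnt_head.liw" 0;
setAttr "jnt_jaw.liw" 0;

// Block in a hard 100% weight on a vertex selection, then smooth from there with the paint tool.
skinPercent -transformValue "jnt_jaw" 1.0 "skinCluster1" "head_geo.vtx[200:260]";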
Below shows a gif of the head's core skin weight deformation including the neck and jaw.
The next image shows the development of the mouth controllers. These initially had some awkward alignment, so I needed to reposition the left and right counterparts in a way that allowed them to be selected and moved naturally together in local space. For example, if I grab the first upper-lip controller on each side and move them upwards, they should both move up in the same direction.
I also offset the upper- and lower-lip rotations to improve usability, so that when the upper lip is moved up, the lower lip moves down slightly. There isn’t a perfect way to set this up, as one axis will always behave in the opposite direction, but it is possible to get two axes working consistently. I worked through this systematically, testing different scenarios to ensure the mirrored controllers behaved in a way that felt intuitive for animation. To help reorient the controllers, I used the following MEL command:
"rotate -r -os 0 180 180"
This rotates the object relative to its current orientation in object space, with the X, Y and Z values adjusted as needed.
Next, to connect the mouth control curves to the bones, I used locators as up vectors within aim constraints. This was done to give the joints a stable orientation while still allowing them to aim cleanly at the controls. Without an up vector the joints can roll or flip unpredictably, which is especially noticeable in facial rigs where the movements are subtle. Using locators as up vectors helped keep the mouth bones behaving consistently and ensured cleaner, more reliable deformation when posing and animating the face. The image below shows the setup for the aim constraints, including the locators, as explained.
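A stripped-down version of one of these constraints, with placeholder names, looks something like this (the locator is positioned above the mouth before the constraint is applied):

// Keep the lip joint aiming at its control, with the locator defining "up"
// so the joint cannot roll or flip as the control moves.
spaceLocator -name "loc_mouthCorner_up_L";
aimConstraint -aimVector 0 0 1 -upVector 0 1 0 -worldUpType "object" -worldUpObject "loc_mouthCorner_up_L" -maintainOffset "ctrl_lipCorner_L" "jnt_lipCorner_L";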
The gif below shows the skinning of the mouth to the mouth bones. Overall this is working really well. Using the previously described locking weight influence technique made this process so much easier than it could have been. Without this technique Maya would have been assigning small amounts of influence around the many joint bones and it would have been difficult to manage. Therefore, this process sped up my workflow significantly.
I tested a quick smile pose for the mouth and rendered this in the image below:
25/08/25
Next I moved onto bone creation, control curve setup, constraints and weight painting for the cheeks, ears, teeth, tongue and nose. The below image shows the control curves set up for these.
The image below shows the joint chains that have been set up for these.
26/08/25
The gif below demonstrates the skinning for the cheeks, ears, teeth, tongue and nose. This is working well and feels like a fairly seamless process. I am repeating lots of the techniques I'm learning but applying these to my own model.
27/08/25
Next I moved onto the creation of the eye bones and the look at controller. This controller would allow easy movement of both eyes together as well as individual eye control too.
The gif below shows the outcome of the eye setup. Next I'd move onto the eyelids that would be able to blink but also allow some subtle deformation as the eyes move.
The image below shows the rig setup at this moment in time. Progress is going well, even if it is all in the face area! However, the face has a lot of complexity and is likely to take a long time. Once I move onto the body, progress is likely to be much faster.
29/08/25
The gif below shows the outcome of the eyelid rigging and skinning. As you can see, the character can now blink but also as the eyes roll around in the socket there is some deformation to the skin around the eye which works well because it looks really natural.
I used a 'set driven key' on each eye to drive the blinking, adding a blink attribute to each eye control. This means that you can easily animate this single value to get a clean blink. This is important because achieving a perfect blink shape involves rotation across multiple transforms, which would have been hard to remember. Using a 'set driven key' lets Maya do the remembering and makes the animation process easier. There are also controls for the eyelids that allow you to pose them in any way you wish for different expressions.
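A simplified MEL sketch of the blink set driven key is below; the attribute range, joint names and rotation values are illustrative rather than the exact values in my scene, and the real setup keys several rotation channels per lid.

// Add a 0-10 blink attribute to the eye control.
addAttr -longName "blink" -attributeType "double" -min 0 -max 10 -defaultValue 0 -keyable true "ctrl_eye_L";

// Key the open state (blink = 0) and the closed state (blink = 10) on the upper lid joint.
setAttr "ctrl_eye_L.blink" 0;
setAttr "jnt_eyelidUpper_L.rotateZ" 0;
setDrivenKeyframe -currentDriver "ctrl_eye_L.blink" "jnt_eyelidUpper_L.rotateZ";

setAttr "ctrl_eye_L.blink" 10;
setAttr "jnt_eyelidUpper_L.rotateZ" -35;
setDrivenKeyframe -currentDriver "ctrl_eye_L.blink" "jnt_eyelidUpper_L.rotateZ";

setAttr "ctrl_eye_L.blink" 0;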
The image below shows the setup of the 'set driven key':
This next gif below shows the full eyelid control rig set up functioning with the controls and blink setup visible.
I noticed some mirroring issues across the mesh where the mirror skin weights option wasn't associating the correct verts with the correct mirrored joints. After some experimentation I found that if I labelled all my joints and then set the influence association to 'label' I achieved perfect results. This was a little bit of a time sink; however, since the results were flawless it was worth it. It also filled me with confidence that all skin influences were mirroring properly.
The image below shows the mirror skin weight influence options used to achieve this:
The image below shows how each joint was labelled to work in conjunction with the above image. You can see that it tells the software exactly what side the bone is on and allows custom labels to be created.
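Labelling can also be scripted rather than set joint by joint in the Attribute Editor; the sketch below uses placeholder names and shows the equivalent mirror command run on the selected skinned mesh.

// .side: 0 = Centre, 1 = Left, 2 = Right. .type 18 = "Other", which enables the custom label.
setAttr "jnt_brow_L.side" 1;
setAttr "jnt_brow_L.type" 18;
setAttr -type "string" "jnt_brow_L.otherType" "brow";

setAttr "jnt_brow_R.side" 2;
setAttr "jnt_brow_R.type" 18;
setAttr -type "string" "jnt_brow_R.otherType" "brow";

// Mirror weights across the YZ plane, matching influences by their labels.
copySkinWeights -mirrorMode "YZ" -surfaceAssociation "closestPoint" -influenceAssociation "label";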
The image below shows further development with the face rig, now focussing on the eyebrows.
The gif below shows the functioning eyebrow rig. This enables you to move all controls at once or individual parts of the brow to allow for full expressive possibilities.
I ran another pose test to explore the level of expression I could achieve using this face rig, given that all core elements bar the hair were created. This turned out well, particularly the ears that accentuate the annoyed feeling in this expression.
30/08/25
For the hair, each strand has its own individual control nested inside a double-group setup. These controls are hooked up to each other, allowing them to be driven collectively. A master control is then used to mass-control all strands via the Connection Editor. Both the master control and the base of each hair strand are parent-constrained to the head control. I initially ran into double-transform issues due to the number of constraint and group layers, but once resolved this setup allows each strand to be animated individually or all together. It’s a really flexible system and works well in practice.
The below image shows how the connection editor was used to create an association between the hair controllers and the bones.
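Under the hood this Connection Editor work is just a handful of connectAttr calls; a cut-down example with placeholder names:

// The master hair control drives the offset group of each strand control, so strands
// can still be animated individually on top of the master motion.
connectAttr -force "ctrl_hair_master.rotate" "grp_offset_ctrl_hairStrand_01.rotate";
connectAttr -force "ctrl_hair_master.translate" "grp_offset_ctrl_hairStrand_01.translate";
// ...and so on for each strand's offset group.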
The below image shows the bone setup.
The below gif shows how the hair rig works including the individual controllers and the master controller.
The below gif shows some tweaks to the skinning to ensure that deformation is as good as it can be. The hair meshes are curved meaning that bending them in the opposite direction to their curve will yield slightly less desirable results. However, when animated in motion this should be less noticeable.
This marked the end of the head rig setup. Everything was working well, and this was due to the iterative approach I took of focussing on specific elements and polishing them until they were fully functioning. The next steps are to move onto the body rig and solve the challenge of integrating the head rig onto the body.
Body Rigging
31/08/25
Next I started to develop a wrist corrective joint to improve the deformation around the wrist. This used a leaf joint hooked up to an aim constraint. Essentially, the way this works is that the leaf joint inherits some rotational values but not all. This allows you to paint weights in a way that ensures the volume of the wrist is sustained as it twists and bends.
The below image shows the aim constraint settings for this setup.
The gif below shows a visual demonstration of how the corrective leaf joint behaves using this system. As you can see, it twists with the joint but as the wrist bends it stays still. This allows you to maintain volume in the wrist area to prevent the skin becoming thin as it twists.
Next I moved onto creating an IK/FK setup for the arms using the guidance in Lake's (2024) book. This IK/FK arm setup uses three joint chains: one for FK, one for IK, and a third bind chain (the core skeleton) that blends between the two. Two sets of controls are used so the arm can be animated in either IK or FK, depending on the needs of the shot. To drive all these different joint chains together I used the Connection Editor, as shown in the image below:
The image below shows the FK control curve setup. This is a fairly standard set up with each controller being constrained to the previous controller in the hierarchy.
The next image below shows the IK control curve setup. This setup drives joint rotation from the end of the chain and uses a pole vector constraint to control the overall twist.
This system was working perfectly and drove the skinned joints well. The star-shaped controller was the switch, and I also set up visibility switching so that when you turned the controls to FK, the IK controls would vanish. This cleans up the visual noise and makes the rig more user friendly.
01/09/25
Whilst the previous IK/FK setup was working, as soon as I tried to attach it to the clavicle the system broke. I experimented for some time but I couldn't solve this issue. I was worried that this might be a bottomless pit and I could sink too much time into it and make little progress. Therefore, I decided to explore other options and found this rigging series by Dikko (2022a). The core difference here is that it used a third joint chain that was driven by the IK and FK systems and this joint chain then drove the main skeleton; effectively acting as a middle-man.
I ran into an issue with the legs that caused them to bow outwards. I noticed that the joint chain wasn't straight which was causing this. I went back to the core skeleton and tweaked the legs so that they were straight. This required a lot of readjustment and although this was fiddly it was necessary to produce a good leg rig.
I then had another issue with the arm IK where the joints moved out of position when the pole vector was applied. This meant that they were offset from the main skeleton joints and from the skin itself, so the rotations would be completely out of alignment. To solve this I followed the technique shown in the video below (Harris, 2018). This involved a slightly different order of operations that counters the bone-shifting problem. Once I followed this, everything worked as intended!
The image below shows both the IK and FK systems set up and working. The left side is the FK arms and IK legs and the right side is the IK arms and FK legs. There is a master group at the top of each joint chain, which means you can grab the entire system and place it where you want. The next step will be to position this back on top of the skeleton and set up connections that drive the main joints using the Connection Editor, similar to before!
This image shows both IK and FK systems repositioned to overlap the main skeleton. To do this I used the relative transform option with the same value that I offset them by originally.
This image shows the Connection Editor connections that blend together the rotation values of all the joint chains, so that the switch node can blend between IK and FK and then drive the main joints.
This image below shows some added functionality that I implemented to toggle the visibility of the controllers, so that when you are in FK mode you only see FK control curves.
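I set this up interactively, but the blend and the visibility swap are roughly equivalent to the node wiring sketched below (one joint of one arm, placeholder names, using a blendColors and a reverse node; ctrl_switch_arm_L.ikFk is assumed to be a 0-1 attribute where 0 = FK and 1 = IK).

// IK/FK rotation blend for a single joint - repeated for each joint in the chain.
string $blend = `shadingNode -asUtility blendColors -name "blend_rot_shoulder_L"`;
connectAttr -f "jnt_ik_shoulder_L.rotate" ($blend + ".color1");   // used when blender = 1 (IK)
connectAttr -f "jnt_fk_shoulder_L.rotate" ($blend + ".color2");   // used when blender = 0 (FK)
connectAttr -f "ctrl_switch_arm_L.ikFk" ($blend + ".blender");
connectAttr -f ($blend + ".output") "jnt_bind_shoulder_L.rotate";

// Visibility swap: IK controls show when the switch is 1, FK controls when it is 0.
string $rev = `shadingNode -asUtility reverse -name "rev_vis_arm_L"`;
connectAttr -f "ctrl_switch_arm_L.ikFk" "grp_ikControls_arm_L.visibility";
connectAttr -f "ctrl_switch_arm_L.ikFk" ($rev + ".inputX");
connectAttr -f ($rev + ".outputX") "grp_fkControls_arm_L.visibility";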
The same logic was used for the legs with the system set up in the connection editor again as shown in the image below.
The image below shows the completed FK setup for the arms and legs.
The image below shows the completed IK setup for the arms and legs.
I was confident that this system would work well because it already used a clavicle controller that was present in both IK and FK setups so that the shoulder could be moved. Since both systems worked with this clavicle controller, it should be easy to connect that to the spine and then connect the face rig to the spine too. This should solve the previous issues I was having, where the old IK/FK arm system was inheriting double transforms and offsetting its position when connected.
It is also worth noting that the feet follow the IK/FK system too using a clever constraint order to utilise easy access controls to create a good range of movement including foot roll and reverse foot lock.
02/09/25
Next I continued working with the video series by Dikko (2022b) to develop the spine setup. This demonstrated the creation of an FK/IK spine setup that could be used simultaneously. This meant that I could create a lot of flexibility in the spine poses by initially posing with FK and then tweaking hip or shoulder rotations independently of the rest of the rig with the IK controls.
There were differences between my bone orientation and that in the video, which meant some of the IK Spline axis settings were different from those shown. After experimentation I found that the correct setup for my character's bone orientation was using the X axis as up and the Z axis as forward. The below image shows the spine controls and the spline IK settings:
After all this experimentation I felt like I had solved the issue and finished my spine; however, I noticed that through my experimentation the arm joints were no longer in their original location. This meant that some of the joints wouldn't line up with the skin or controllers and the overall deformation would be wrong. This was frustrating; however, I would try developing the system again tomorrow using the knowledge I learnt from this experimentation session.
03/09/25
When I revisited the steps from the video I was looking out for the point where the skeleton was shifting. I located the issue to be with the application of the spline IK causing the joints to slightly rotate to fit the shape of the curve. Even when I increased the curve resolution there was still a small amount of movement, and across the full skeleton this had a much bigger impact than expected. To resolve this, I temporarily unparented the limb and head joint chains and slightly increased the curve resolution on the spline IK, using two spans instead of one. I didn’t push this too far as higher curve resolution becomes a pain to skin. I then set up the spine system properly using scaling and maths nodes, driving scale and inverse scale from the curve length in the Node Editor to help maintain volume. The twist was set up using what I’d learnt the previous day, and once everything was behaving correctly I reparented all the joints. After bringing in a reference skeleton to check alignment, everything now lines up as expected.
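The stretch/volume part of that node work is roughly the following, sketched in MEL with placeholder names; the real graph also handles the twist and the per-joint inverse scale.

// Spline IK over the spine with a two-span curve.
string $ik[] = `ikHandle -solver "ikSplineSolver" -numSpans 2 -startJoint "jnt_spine_01" -endEffector "jnt_spine_05" -name "ikh_spine"`;
string $spineCurve = $ik[2];   // the spline solver returns the handle, effector and curve

// Measure the curve length and divide by its rest length to get a stretch factor.
string $info = `arclen -ch true $spineCurve`;   // returns a curveInfo node
float $restLength = `getAttr ($info + ".arcLength")`;
string $div = `shadingNode -asUtility multiplyDivide -name "div_spineStretch"`;
setAttr ($div + ".operation") 2;   // divide
connectAttr -f ($info + ".arcLength") ($div + ".input1X");
setAttr ($div + ".input2X") $restLength;

// The stretch factor drives the length-wise scale of each spine joint; an inverse
// of it drives the other two axes to help maintain volume.
connectAttr -f ($div + ".outputX") "jnt_spine_01.scaleX";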
The image below shows the spine setup completed.
The arms are now set up so the IK system is attached to the global control, allowing the limbs to be placed on static surfaces, while the FK systems follow the FK joint chains. The neck is parented to the top chest IK, and overall everything is behaving as expected and looking solid. This means that whilst I did develop a different system to my original IK/FK setup I have resolved the issue and ended up with a more advanced spine system too. Next I would move onto rigging the hands using a simple FK system.
04/09/25
The image below shows the simple hand setup. Controllers are shaped like lollipops and come out of the hand for ease of access. These are then simply constrained to the joints and each other in their FK joint chains to allow for easy rotation.
05/09/25
Now that the majority of the rig was complete, the last system I needed to develop was the pauldron controls. Initially, this was handled by simply attaching a bone from the shoulder, but I wanted to see if I could create a more automatic setup that would drive the pauldron based on the arm’s movement. I came across a written source by Motomura (2018) and a supporting video (MORSOV, 2024) that demonstrated a simple approach to rigging shoulder armour, which was exactly what I was looking for. However, due to a language barrier in both resources, it took some time to fully understand the setup. The core idea uses an aim constraint, with the up vector looking at the forearm joint, one axis set to not inherit rotation, and rotation limits applied. This causes the pauldron bone to always try to aim toward the upper arm, creating the illusion that it naturally follows the arm movement. I then combined this with techniques I’d used elsewhere in the rig to add an extra layer of control, allowing manual adjustments on top of the automatically driven rotation when needed.
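Reduced to a hedged MEL sketch with placeholder joint names, my take on that setup looks roughly like this:

// The pauldron joint constantly tries to aim at the upper arm, with the forearm acting
// as the up-vector object, so it appears to follow the arm automatically. One axis is
// skipped so it does not fully inherit the rotation.
aimConstraint -aimVector 1 0 0 -upVector 0 1 0 -worldUpType "object" -worldUpObject "jnt_forearm_L" -skip "x" -maintainOffset "jnt_upperarm_L" "jnt_pauldron_L";

// Rotation limits stop the pauldron swinging through the shoulder (values are illustrative).
transformLimits -enableRotationZ 1 1 -rotationZ -25 60 "jnt_pauldron_L";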
The gif below shows how this system works. It is such a simple technique; however, the results are extremely satisfying. This will make animation significantly easier through the automatically driven rotation, which I can still manually tweak.
Skinning
Next I turned my attention to skinning. I haven't included too many work-in-progress images since it is a repetitive process; instead, the images below show the character posed, which demonstrates the result of the skinning. The techniques I used for skinning have been discussed earlier in this blog post.
09/09/25
The gif below demonstrates the hand skinning.
12/09/25
The skinning process revealed some more issues, with areas of the mesh losing volume as they rotated. This was primarily the shoulders; however, the legs also suffered. I tried applying my leaf joint knowledge to these areas, however the results with this system were limited, and I decided to gather some more visual reference to help me create a more robust setup.
I found the video below (antCGI, 2021a) that highlights how to create and position the roll joints in these areas.
Then I found the video below (antCGI, 2021b) which built on the previous video by showing the system used to create the roll joints. The core idea is to keep the twist joints behaving naturally as the limb rotates, essentially using multiple joints to distribute the twist along the complete limb when skinning. A locator is created and constrained so that the twist joint always points at it, which stabilises the rotation and prevents unwanted twisting in the shoulder/arm chain. By forcing the twist joint to aim at this reference, the system maintains a steady orientation even as the main joints rotate, making the roll behaviour much cleaner and easier to animate.
The image below shows the creation of these new roll joints within the skeleton.
This gif below shows the roll joints working with the skin. This looks fairly simple, however, it was really complex and results in the skin holding the volume of the limb no matter what angle you pose it in. I was really happy with this outcome because if I couldn't solve this then the overall quality of my next animations would be lower or it would limit my flexibility in poses.
13/09/25
The rig is now finished and the video below shows a simple render of the skin moving in different positions. This will later be presented more formally through a polished render, fading the control curves in and out to showcase the work produced as part of this rig.
The image below shows a static shot of the completed control rig.
The image below shows a static shot of the completed skeleton.
AnimSchool Control Curve Picker
Whilst the control curve setup is really good, I also wanted to look into utilising the Animschool Picker (2025) for further rig accessibility. This enables you to select one or more control curves through a simple 2D interface, which makes it easier to select a full arm or just the IK rather than the FK controls. The image below shows the setup of the picker for my character's main body.
The below image shows a bespoke control curve picker specifically for the face. This will make facial animation easier, especially as there are a lot of curves in close proximity to each other.
This next image below shows how the picker can be used in the scene when animating.
Pose Library
I wanted to look at creating a pose library for the character to be used across multiple animations. I found the below video (Kouhi, 2019) that showed a powerful tool called Studio Library (2025). This would be used mainly for facial expressions and hand poses; however, it could also be used to mirror poses across an animation loop, which would speed up production times.
The below image shows the integration of Studio Library (2025) into my Maya project and the creation of some simple hand poses. I have a default relaxed hand pose and a fist. These are two common poses that you want to flip between, therefore saving them in a pose library will speed up workflows significantly.
The image below demonstrates the fist pose that was saved to Studio Library.
The image below shows the full rig setup for the Terrax character including the pose library functionality, the control rig itself and then the control rig picker. I was super happy with these systems I'd built, particularly the level of flexibility and professionalism in these rigs.
20/09/25
Next I moved onto expanding the facial expression pose library. I used The Artist's Complete Guide to Facial Expression (Faigin, 1990) to inform the poses I created for expressions. I also used the Stop Staring book (Osipa, 2003) to inform the visemes that would be used for mouth shapes in lip sync. I studied the reference images and information in these books carefully to help make believable expression poses. Studio Library (2025) can then be used to ease in, ease out and slightly overshoot these poses, which makes it a powerful animation companion.
The below image shows the full library that was created.
The below four images show some close up examples of some of the poses I created. The first is anger, second is fear, third is happy and fourth is shocked.
Overall the pose library was working really well. Although this took some time to set up it was going to save so much time in the future that it was worth it. Furthermore, it was great to see the flexibility of my facial rig to see my character express emotions so well!
21/09/25
Whilst experimenting with posing the rig I noticed issues with the kneepad straps. I added an additional bone and controller to provide a simple tweak control for the knee straps, which could be used to ensure the poses around the knee worked and looked good with the strap. This was a simple system with no automation, but it served the purpose and slightly expanded the flexibility of the rig to create even stronger poses.
Animation Test
Next, to fully stress test the rig, I wanted to create an animation to check the usability, deformation and overall animation quality I could achieve. This animation could then be used as one of the gameplay animation outcomes when I start developing those. The gif below shows the block out of the walk cycle with some refinement. This character is meant to be quite confident and cocky, so I've gone for a little bit of a strut with the swinging of the arms.
22/09/25
This gif below shows the completed animation from a three quarter view.
This gif below shows the completed animation from a side view.
Overall I was satisfied with the rig's performance and usability. This was a significant concern because if my rig wasn't good then I wouldn't be able to produce the next animation outcomes for the module. Instead I would have needed to use a premade rig, and then all the plans and time invested would have been wasted. Luckily, this wasn't the case and my effective research skills allowed me to develop an advanced character rig.
Rig Presentation
26/09/25
Next I needed to present the character rig professionally, which would form the first animation outcome. I created the following plan for the rig showcase video.
The plan was as follows:
Title card with character posed and concept art of the character.
Mesh showcase with wireframe blending into geometry, blending into the normal map being applied before finally blending the textures in.
Rendered walk showcase blending into overlaying rendered controls.
Rig showcase including: foot setup, IK/FK switches, pauldron setup, advanced twist controls, IK/FK spine setup, hands, hair, ears, etc.
Facial rig demonstration.
This plan needed me to render wireframe so I used the below video (Bittorf, 2021) to learn how I could professionally render this. You create or assign an aiWireframe shader, plug it into your mesh so the render engine recognises only the edges, and tweak settings like edge type and thickness to control how the lines look in the final render. This lets you output a clean wireframe pass showing topology in the rendered frame rather than relying on screen overlays or viewport methods.
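From memory, the shader-based approach boils down to something like the MEL below. The aiWireframe attribute names are as I recall them from MtoA and are worth double-checking, and body_geo is a placeholder for the mesh being rendered.

// Create an Arnold wireframe shader and assign it to the mesh for the wireframe pass.
string $wire = `shadingNode -asShader aiWireframe -name "wireframe_mtl"`;
setAttr ($wire + ".edgeType") 1;   // polygon edges rather than triangles (attribute name assumed)
setAttr ($wire + ".lineWidth") 1.5;
setAttr -type "double3" ($wire + ".fillColor") 0.18 0.18 0.18;
setAttr -type "double3" ($wire + ".lineColor") 1 1 1;

select -replace "body_geo";
hyperShade -assign $wire;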
Next I needed to see if I could render the control curves. I found the below video (Animators Journey, 2023) that showed how to render control curves. The process involves making the NURBS curves renderable in Arnold rather than relying on viewport display. By enabling the render curve option and assigning a simple shader, the control shapes can be included directly in the final render, allowing clean, visible controls to be shown alongside the geometry.
I then used the techniques discussed to execute my plan. This involved rendering the walk cycle multiple times to get a version with no control curves, a version with wireframe and a version with control curves. This enabled me to create smooth transitions to blend different elements in and out which worked really well. I added some music by TyTShala (2024) so that the video wasn't completely silent. The video below shows the finished outcome and execution of the plan I set.
A higher quality version of this video is available on Vimeo : https://vimeo.com/1134625825
Conclusion
This rigging phase has been a really valuable part of the project and a massive learning curve in both technical workflows and problem-solving. I created a full body and facial rig for Terrax, working through bone orientation, joint placement, weight painting, facial rigging, IK/FK systems, corrective joints and flexible control setups. I refined and iterated my outcome as I identified issues like non-manifold geometry or mirrored weighting problems.
Research played a huge role in shaping how I approached each system, from learning about Euler rotation orders and control rig organisation to using reference tutorials to build practical, engine-friendly setups. I also developed tools to speed up future work, like the control curve picker and pose library, and tested the rig with animation to confirm its performance and deformation quality. Overall I'm really happy with the quality and flexibility of the rig I've built, and completing it gives me confidence moving forward into the next animation outcomes. Having the rig in place now makes those phases much more achievable and stronger technically.
I'm super excited to now use this rig to develop animations. The usability of the rig is good and the render quality is coming out great. This now means that the outcomes I make as part of this module will be truly unique, using a character that doesn't exist anywhere else. This will stand out in my portfolio and improve my employability.
References
Abe Leal 3D (2023) How to Connect PBR Textures in Maya. 19th July. Available at: https://youtu.be/Zy0dYnHMRPY (Accessed: 21 December 2025).
Animators Journey (2023) Free Maya Script: Render Curves in Autodesk Maya Arnold. 6th June. Available at: https://youtu.be/vNlBNoRyL68 (Accessed: 21 December 2025).
Animschool Picker (2025) AnimSchool Store. Available at: https://store.animschool.edu/animschool-picker/ (Accessed: 21 December 2025).
antCGI (2021a) #RiggingInMaya | Part 03 | Skeleton Building. 26th November. Available at: https://youtu.be/fGacyVzJGIU (Accessed: 21 December 2025).
antCGI (2021b) #RiggingInMaya | Part 15 | Twist & Roll Joints. 10th December. Available at: https://youtu.be/C4YBoxJdRVA (Accessed: 21 December 2025).
Bittorf, D. (2021) Rendering Wireframe in Maya with Arnold. 20th September. Available at: https://youtu.be/szW4yYwsloE (Accessed: 21 December 2025).
Dikko (2022a) Character Rigging in Maya! Episode 6 - Creating the IK Hand Controls. 9th November. Available at: https://youtu.be/2vJ8pSPcDzM (Accessed: 21 December 2025).
Dikko (2022b) Character Rigging in Maya! Episode 10 - Creating a Robust Spine Rig with SPLINE IK. 15th November. Available at: https://youtu.be/2Cvj4UYlPoU (Accessed: 21 December 2025).
Faigin, G. (1990) The Artist's Complete Guide to Facial Expression. New York: Watson-Guptill Publications.
Harris, T.M. (2018) When adding a pole vector moves joints. (How to Fix) *Maya 2018*. 18th November. Available at: https://youtu.be/GxM4GawKwjQ (Accessed: 21 December 2025).
Kouhi, B. (2019) How to Save Poses and Animation In Maya (Episode 12) | How To Be A 3D Animator 2020. 26th August. Available at: https://youtu.be/dalhhPigwrY (Accessed: 21 December 2025).
Lake, M. (2024) Technical Animation in Video Games. 1st edn. Florida: CRC Press.
MORSOV (2024) Риг наплечника в Майе / Rigging a spaulder in Maya. 18th February. Available at: https://youtu.be/gIAxLNnHNp0 (Accessed: 21 December 2025).
Osipa, J. (2003) Stop Staring: Facial Modeling and Animation Done Right. 1st edn. Sybex.
Savva, D. & Guest, J. (2025) Golden Gate Hills [Polyhaven]. Available at: https://polyhaven.com/a/golden_gate_hills (Accessed: 21 December 2025).
Studio Library (2025) Studio Library Website. Available at: https://www.studiolibrary.com/ (Accessed: 06 December 2025).
Swap N, W.P. (2025) Complete guide to hand anatomy: Parts, Names & Diagram. HumanBodyPartsAnatomy. Available at: https://humanbodypartsanatomy.com/hand-anatomy-parts-functions-diagram/ (Accessed: 21 December 2025).
TyTShala (2024) Sci-fi Terror Loop | Royalty-free music - Pixabay. Available at: https://pixabay.com/music/upbeat-sci-fi-terror-loop-262135/ (Accessed: 21 December 2025).