This is my final outcome. Below is the work-in-progress video where I explain the steps and the full process involved in creating my FMP.
After all animations, simulations, and effects were successfully imported and working in Unreal, it was finally time to render the final sequence. Rendering was another huge challenge. Each shot took a long time to process, and over the course of the project, rendering the entire sequence took around 52 hours. During this time, I encountered multiple crashes and errors that forced me to cancel and restart renders repeatedly. It was stressful, but I had to stay focused and systematic to ensure every shot was captured correctly.
Once the rendering was complete, I brought all the footage into After Effects. Here, I added 2D effects on top of the renders, enhancing moments like the sword clashes, character appearances, and subtle highlights. After finishing the compositing, I moved into Premiere Pro to color grade the entire sequence. This step helped unify the look of the project and gave it a polished, cinematic feel. I also used Photoshop to add 2D impact frames.
Impact Frames


Color Grading & Transitions in Premiere Pro
This project pushed me to my limits in every way. I was forced out of my comfort zone and had to try tools and techniques I had never used before, from Alembic caching and Marvelous Designer simulations to keyframing visual effects, rendering, and compositing. I encountered repeated problems and setbacks, but working through them taught me a tremendous amount. From character creation and rigging to animation, cloth simulation, Unreal Engine setup, visual effects, rendering, and post-production, I learned the full pipeline of bringing a complex animation project to life. It was exhausting, challenging, and sometimes frustrating, but seeing the final result made all the effort completely worth it.
After I had finalized all my animation and cloth simulations in Maya, I moved everything into Unreal Engine. This was another challenging stage of the project because, despite my careful preparation, importing the assets came with its own set of problems. I exported the camera from Maya as an FBX and brought it into Unreal so I could maintain the exact framing I had planned for every shot. The Blender character was imported as an Alembic cache, which meant I couldn’t attach the sword directly to the body. To solve this, I also imported the sword as an Alembic cache and planned its keyframes carefully.
One major challenge was aligning the characters with the Unreal world. When importing the camera and characters, I realized that Unreal placed them at the origin of the world. This meant I had to move my entire environment so that the locations I had planned for each character matched the world center. I created multiple sub-levels and carefully positioned everything, testing placement with reference points to make sure all the shots lined up correctly.
Every shot seemed to bring a new problem. Some FBX files imported fine; others caused errors or didn’t play their animations correctly. Sometimes the rig would break, or the Alembic cache wouldn’t match perfectly with the character’s master control. I realized that the most reliable solution was to import each shot separately as a complete skeletal mesh rather than trying to use one skeleton for multiple animations. This approach was time-consuming but ensured stability.
Once the characters were successfully imported, I moved on to adding visual effects. This included swords glowing, characters appearing and disappearing, and eye glow effects. Each effect had to be carefully keyframed on the timeline to match the animation. It was a painstaking process, but seeing everything come together in Unreal was incredibly satisfying. This stage taught me how complex exporting and importing pipelines can be, and how much patience and iteration are required to get everything working smoothly.
Lighting in Unreal
Creating Effects in Unreal
BLOG 9 – Marvelous Simulations
Once my animation was finalized, I moved on to simulating clothes in Marvelous Designer. This stage was something I had been preparing for throughout the project, and I knew it was going to be a meticulous and time-consuming process. Every shot needed to be prepared carefully, starting with placing the character in a proper A-pose at the beginning. This was crucial because if the clothes didn’t fit correctly from the start, the simulation would behave unpredictably. After setting up the A-pose, I baked the animation and exported Alembic caches so I could start simulating the cloth over the animated movements.
The simulation process was extremely detailed and required a lot of back-and-forth between Maya and Marvelous Designer. Any slight collision between the hands or arms and the clothing could cause the fabric to stretch, stick, or clip unnaturally. This meant that for almost every shot, I had to watch the simulation carefully, note any errors, go back into Maya to tweak the animation slightly, and then return to Marvelous Designer to run the simulation again. It was a tedious and repetitive process, but it was necessary to achieve a realistic and natural movement of the garments.
Some shots were even more challenging because the characters moved quickly, which caused the cloth to break or behave unrealistically. To solve this, I first simulated the clothes on top of the character’s normal movement without moving the master control, which allowed the cloth to settle properly. Once the simulation looked correct, I imported the Alembic cache back into Maya and used the match transformation option to align the clothes with the character’s master control. This added another layer of precision and ensured that the garments stayed consistent across every shot.
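Conceptually, matching the simulated cache back to the master control comes down to applying the control’s world transform to every cached point. The sketch below is purely illustrative (it is not Maya’s actual match transformation code, and all names and values are mine), but it shows the idea with a simple 4x4 matrix multiply:

```python
# Illustrative sketch only: aligning cached cloth points with a master
# control by applying the control's world matrix. Names/values invented.

def apply_matrix(matrix, point):
    """Multiply a 4x4 row-major transform by a 3D point (w = 1)."""
    x, y, z = point
    px = matrix[0][0]*x + matrix[0][1]*y + matrix[0][2]*z + matrix[0][3]
    py = matrix[1][0]*x + matrix[1][1]*y + matrix[1][2]*z + matrix[1][3]
    pz = matrix[2][0]*x + matrix[2][1]*y + matrix[2][2]*z + matrix[2][3]
    return (px, py, pz)

# Master control translated 10 units in X: a cached cloth point shifts with it.
master_ctrl = [
    [1, 0, 0, 10],
    [0, 1, 0,  0],
    [0, 0, 1,  0],
    [0, 0, 0,  1],
]
print(apply_matrix(master_ctrl, (0.0, 1.5, 0.0)))  # → (10.0, 1.5, 0.0)
```

In practice Maya’s match transformation did this for me per shot; the point is that simulating in local space first and transforming afterwards keeps the cloth stable even when the master control moves quickly.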
Even though it was exhausting, seeing the clothes move naturally with the characters was extremely satisfying. Each successful simulation felt like a small victory, and it made me realize how important it is to combine careful animation planning with simulation techniques. This stage not only added realism to my characters but also pushed me to think critically about timing, collisions, and how subtle details can make or break the final look of the scene. Overall, simulating clothes in Marvelous Designer was a challenging yet rewarding part of the pipeline, and it taught me a lot about patience, iteration, and problem-solving in 3D animation workflows.
Cloth Simulation
BLOG 8 – Unreal Character Import Problems
When I first imported my character into Unreal, it looked very bad: dull and whitewashed compared to how it appeared in Maya. I was disappointed and even considered rendering everything in Maya. But I had some effects planned in Unreal, so I spent a few days fixing the lighting and textures until the character looked better.
I exported each shot as a separate FBX with skeletal mesh instead of trying to apply multiple animations to a single skeleton. This solved most of the import problems. Every shot worked correctly after that.
The clothes and the second character posed separate challenges. For the clothes, I imported the Alembic caches and set the time ranges, and it worked. For the other character I downloaded online, FBX import never worked in Unreal, and even Alembic caches gave only one material slot instead of eight. I had to go back to Maya, assign materials individually to each mesh element, triangulate, bake the animation, and then export Alembic. This was a tedious process, but it worked in the end.
Unreal Engine Errors
BLOG 7 – Rendering in Maya
For the first few shots of my project, like the animator sleeping and the Maya character jumping off a building, I decided to do the rendering entirely in Maya. I set up the lighting and environment carefully, making sure everything looked correct in the frame. I double-checked the camera angles, the character’s positioning, and how the shadows fell across the scene. Once I felt everything was ready, I hit render for the first shot.
To my surprise, it took over 13 hours to render just that one shot. I hadn’t anticipated how long each frame would take with the lighting and materials I was using. Seeing the estimated time made me panic a little—I realized that if every shot took this long, I would fall seriously behind schedule.
At first, I thought I could use the render farm to speed things up, but unfortunately, it wasn’t working when I needed it the most. I had to quickly come up with a backup plan. I ended up rendering across 4–5 different PCs in Digital Space, leaving them to run overnight. Even then, one shot that had over 500 frames took around two full days to complete.
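Splitting a long shot across several machines mostly came down to dividing the frame range into even chunks. This is a small sketch of that idea; the function name and the exact frame counts are illustrative rather than taken from any render manager:

```python
# Sketch: dividing a shot's inclusive frame range into near-equal chunks,
# one per render machine, as I did when the render farm was unavailable.

def split_frames(start, end, machines):
    """Divide frames [start, end] into near-equal chunks, one per machine."""
    total = end - start + 1
    base, extra = divmod(total, machines)
    chunks = []
    frame = start
    for i in range(machines):
        size = base + (1 if i < extra else 0)
        if size == 0:
            break
        chunks.append((frame, frame + size - 1))
        frame += size
    return chunks

# A 500-frame shot spread across 5 PCs gives five 100-frame chunks.
print(split_frames(1, 500, 5))
```

Each PC then rendered only its own chunk overnight, and the image sequences were recombined in order afterwards.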
Because of the long render times, I wasn’t able to implement all the feedback I had received for some animation shots. It was frustrating, but I had to make decisions based on time constraints. I realized I wouldn’t be able to complete the rendering, post-processing, and color grading for everything before the deadline, so I applied for a deadline extension. This allowed me to work systematically without rushing and ensured I could maintain the quality I wanted for my final shots.
Looking back, this stage taught me a lot about planning render times, testing setups, and managing resources. I learned that even if the animation and lighting are perfect, render time can completely change your workflow and schedule. It was stressful, but it was also a crucial lesson in patience and problem-solving in a real production scenario.
Lighting in MAYA
After rigging was done, it was finally time for animation, the part I had been waiting for most eagerly. I had my shots planned, including the actions I wanted, which dialogues were needed, and the camera angles I would use. I started looking for references for the fighting scenes, and for dialogue and everyday body-movement scenes, I recorded myself to use as reference.
Before starting the animation, I changed a few shots. I had around 30 different shots to animate, from the animator sleeping to the two characters clashing, and I was panicking since animating characters properly takes a lot of time, especially with three characters. I started animating day and night, focusing only on the animation and ignoring environment work.
A lot of issues came up with my rig. Sometimes the hands or feet would come out of the gloves or shoes if I bent them too much, so I had to go back and forth between fixing the rig and reloading references. Slowly but surely, I fixed these issues and kept animating. It took me almost two months to complete all the animations as polished as I could while keeping the submission deadline in mind.
I got feedback from Ting, George, and peers, and implemented as many changes as I could. For the scenes with swords, I had to use locators to make the swords follow the wrists, but this setup was not supported in Unreal. Since I was planning to render in Unreal, I decided to use Alembic caches to transfer the animation.
Animation
With clean bakes ready, I moved on to texturing in Substance Painter. Importing the character initially gave me errors, but after adjusting a few export settings in Maya, everything loaded properly. I started with the skin because it was the most important part to get right. I followed tutorials from Abe Leal to achieve semi-realistic skin with subsurface scattering and proper roughness variation. This was my first time aiming for a more realistic skin texture, and I was happy with the results.
After finishing the skin, I textured the clothes and hair, which were easier but still time-consuming. Finally, I textured the eyes and created a special glow layer for the iris, which I planned to use in certain scenes.
Texturing
When the texturing was done, I went back into Maya and started rigging using Advanced Skeleton. This was one of the scariest stages for me because I didn’t have much rigging experience. The initial skeleton generation was simple, but the weight painting took days. Advanced Skeleton didn’t generate clean weights for me, so I manually fixed almost everything.
Maya also crashed multiple times during rigging. I faced issues like:
- Controllers not working
- Improperly mirrored joints
- Incorrect skinning
- Unexpected errors during build
I eventually got the body rig to deform nicely, but then a major issue appeared: the clothes wouldn’t deform properly. They moved awkwardly, stretched incorrectly, and didn’t follow the body realistically. I tried re-rigging, using deformers, adjusting weights—nothing solved it. I started worrying I wouldn’t be able to use my character at all.
Rigging
Then I had an idea: instead of rigging the clothes, what if I simulated them?
So I installed Marvelous Designer. I had very little experience with MD, but I recreated the exact garments from my ZBrush sculpt by using the original sculpt as reference. At first, I faced several issues because I was still learning the software, but after watching tutorials and getting some help from Reddit, I was able to rebuild all the clothes and simulate them realistically.
Since the UVs changed, I had to retexture the clothes.
Marvelous Designer Clothes

I also planned out my workflow going forward:
- Animate the character without clothes
- Finalise the animation
- Then simulate the clothing shot by shot depending on the scene
- Apply different simulations for Maya scenes vs Unreal Engine scenes
For the facial rig, Advanced Skeleton created a basic setup, but it had limitations. The eyebrow controls were very simple—just two controls for each brow—so I couldn’t create subtle expressions like squeezing the eyebrows during dialogue. But the rig was good enough to get me through the project, and after some weight painting fixes, I was satisfied with it.
This blog covers one of the toughest stages of my pipeline, but also the most rewarding because I managed to solve a problem that almost stopped my project entirely.
Once the sculpt was finished, I moved into Maya for retopology. I expected this process to be straightforward because I had done retopo before, but I ran into issues almost immediately. The biggest challenge was the clothing. My sculpted outfit had lots of pockets and small folds, which made clean retopology difficult. I struggled for a while, but after watching a few tutorials I finally understood how to simplify shapes and maintain the silhouette without losing detail.
After completing the retopology, I moved on to UV mapping. Before starting, I planned out how many UV sets I would need:
- One UV set for the entire body
- Separate UVs for clothes
- Separate UVs for hair
- A separate UV just for the eyes (because I wanted detailed close-ups)
Retopology & UV unwrapping
Once all the UVs were done, I began the baking process. At first, I tried baking everything inside Substance Painter, but the results were extremely poor. The normal maps weren’t clean, and a lot of details didn’t transfer properly. I didn’t want those problems to affect my textures later, so I switched to Marmoset Toolbag.
Marmoset Toolbag made the baking process so much easier. I simply placed the high-poly meshes in the high folder and the low-poly in the low folder, set the cage distance, and baked everything cleanly. The output included layered Photoshop files, which allowed me to tweak the Ambient Occlusion and Normal Maps manually where needed.
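Marmoset matched my meshes through its high and low folders, but the same pairing is often done by a naming convention with `_high` / `_low` suffixes. This is a small sketch of suffix-based pairing; the mesh names are invented for illustration:

```python
# Sketch: pairing *_high and *_low bake meshes by their shared base name.
# Mesh names below are made up; Marmoset itself matched mine by folder.

def pair_bake_meshes(names):
    """Return {base: (high_name, low_name)} for meshes that have both halves."""
    highs = {n[:-5]: n for n in names if n.endswith("_high")}
    lows = {n[:-4]: n for n in names if n.endswith("_low")}
    return {base: (highs[base], lows[base]) for base in highs if base in lows}

meshes = ["body_high", "body_low", "jacket_high", "jacket_low", "eyes_low"]
print(pair_bake_meshes(meshes))
# "eyes_low" has no matching high-poly mesh, so it is skipped.
```

Keeping the naming consistent like this is also what makes the layered Photoshop output easy to navigate, since each bake group lands under a predictable name.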
This stage took about a week in total, including problem-solving and switching software. It was frustrating at first, but in the end, I got clean bakes that captured all the details from my sculpt.
Baking
BLOG 3 – Sculpting My Character in ZBrush
With my references ready and my previs completed, I began the most time-consuming stage: sculpting my original character in ZBrush. This took me roughly 3–4 weeks, from late July to around mid-August.
Since I wanted realistic body proportions with some stylised features, I paid extremely close attention to anatomy. I created the entire body first, making sure all muscles, joints, and proportions were believable. One of my main concerns during sculpting was making sure the clothes wouldn’t intersect with the body during animation later. So I sculpted the character carefully, considering space between clothing and the base mesh.
After the body sculpt was complete, I added the clothing. I had a very clear image of how I wanted the character to look, inspired by JRPG aesthetics, so I sculpted the garments directly onto the body. The final high-poly sculpt ended up being around 14 million polygons, which captured all the fine details I wanted.
Surprisingly, ZBrush didn’t give me many technical problems. No major crashes, no corrupted files. This part went smoothly, which I was grateful for because I knew the next stages wouldn’t be as forgiving.
This stage ended with a finished high-detail sculpt that I was proud of and ready to take into Maya for retopology.
Sculpting in ZBrush
Screenshots




