Categories
3D Computer Animation Fundamental Immersion

Week 3: Sequencer

This week, we began working with the Unreal Engine 5 Sequencer. I learned how to translate film production techniques into the Unreal Engine Sequencer, with an emphasis on understanding the key differences between traditional film production and the Unreal Engine workflow.

Film

1. Master Scene Technique (Master Shot)

  • This refers to filming an entire scene in a single, continuous shot from a wider angle, usually capturing the entire set and all the actors involved. The master shot provides coverage for the whole scene, establishing the geography of the action and relationships between the characters.

2. Coverage Cameras

  • Coverage refers to additional shots taken from various angles and distances after the master shot. These include close-ups, over-the-shoulder shots, and inserts that help emphasize emotions, reactions, and details. These are edited with the master shot to build the final scene, ensuring smooth continuity and flexibility during the editing process.

3. Linear vs Non-Linear Storytelling

  • Linear: A narrative that progresses in chronological order, moving from beginning to end without jumping in time.
  • Non-Linear: A narrative that doesn’t follow a strict chronological order. It may involve flashbacks, flash-forwards, or scenes out of sequence to create a more dynamic or layered story.

4. Triple Take Technique

  • This is when a director asks for three takes of a specific shot, often with slight variations in performance, framing, or blocking. It gives editors multiple options to choose from during post-production and helps ensure the scene is captured effectively.

5. Overlapping Action

  • A technique where parts of a movement or action are staggered or delayed, making it more realistic. For example, when a character turns, their body, head, and arms don’t move at the same time. This is common in animation, but in live-action, it’s often seen in the timing and natural flow of actions.
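As a toy illustration of the idea (this is not Unreal or film tooling; the body-part names and frame offsets are made-up values), overlapping action amounts to offsetting when each part of the body starts its movement:

```python
# Toy sketch: staggering the start frames of body parts so a turn
# overlaps instead of everything moving at once. The part names and
# offsets below are illustrative assumptions, not values from class.
BASE_START = 10  # frame where the turn begins
OFFSETS = {"hips": 0, "torso": 2, "head": 4, "arms": 6}

def staggered_starts(base_start, offsets):
    """Return the frame on which each body part begins its rotation."""
    return {part: base_start + delay for part, delay in offsets.items()}

starts = staggered_starts(BASE_START, OFFSETS)
print(starts)  # hips lead; head and arms follow a few frames later
```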

6. Hitting Marks

  • In film production, actors are given specific spots (marks) where they need to stand or move during a scene for the best camera angles, lighting, and focus. “Hitting the mark” means landing on that spot accurately while delivering a performance.

7. Traditional Live Action Production Roles

  • Gaffer: The head electrician on set, responsible for managing lighting setups based on the cinematographer’s vision.
  • Grips: Technicians who handle the rigging and setup of equipment (e.g., lights, dollies, scaffolds). Key grips lead this team.
  • Production Manager: Oversees the logistics of the production, ensuring that the schedule, budget, and resources are well managed.
  • Director of Photography (DP): Responsible for the visual look of the film, working closely with the director to choose camera angles, lighting, and shot composition.

These are all crucial elements of traditional live-action filmmaking, giving a solid foundation for understanding how a production comes together from both creative and technical standpoints.

Unreal Engine

1. Sequence-Based Linear Workflow

  • One Level Sequence: This refers to the organization of a scene or project as a single, continuous timeline or sequence. In linear workflows, this sequence progresses from start to finish, without any jumps or re-arrangements. It’s simpler but may lack the flexibility of non-linear editing.
  • Multi-Camera: Several cameras are placed in the scene to capture different perspectives of the same action simultaneously (e.g., wide shots, close-ups), much like a multi-camera film setup.
  • Single Camera Cuts Track: In Sequencer, a single Camera Cuts track determines which of these cameras is active at each point on the timeline, effectively editing the individual shots together in sequence on one track.

2. Shot-Based Non-Linear Workflow

  • Nested Level Sequences: These are sub-sequences or smaller units of a larger project that can be worked on independently and then combined into the main timeline. In a non-linear workflow, scenes can be rearranged, edited out of sequence, or altered without needing to stick to a strict order. Nested sequences are particularly useful for complex projects where different parts of a scene are handled separately and then “nested” back into the main project.
  • Take System: A system used to manage multiple versions (or “takes”) of the same shot or scene. This allows the editor or director to easily switch between different takes to find the best one for a specific moment in the timeline. It’s common in both animation and live-action editing.
  • Sub-Scene Tracks (Optional): Optional tracks that allow specific elements of a scene, such as animation, sound, or visual effects, to be edited and adjusted independently. This provides flexibility for making changes to one part of the scene without affecting the whole.

3. Multi-Artist Collaborative Workflow

  • Sub-Levels: Sub-levels are subdivisions of the main project that different artists can work on independently. In complex projects, having sub-levels ensures that multiple people can collaborate on different parts without interfering with each other’s work.
  • Sub-Scene Tracks: Similar to sub-levels but more granular, sub-scene tracks allow various aspects of a scene (e.g., animation, lighting, or sound) to be separated and worked on individually. This modular approach is useful in a collaborative environment where specialists focus on their specific areas.
  • Visibility Tracks: These control the visibility of certain elements or layers in a scene. For example, an artist may choose to hide specific props, characters, or effects during editing or rendering to focus on a particular aspect of the project.

Workflow Comparison

  • Linear Workflow is straightforward, where everything is worked on in a fixed order from start to finish. It’s more commonly used in traditional filmmaking where the story or scenes are shot in sequence.
  • Non-Linear Workflow allows for flexibility, where shots, scenes, and edits can be arranged, changed, or modified out of order. This is especially useful in animation and VFX-heavy productions, where various artists can work on different parts of the project simultaneously, and the final sequence is pieced together later.

These workflows are critical for large-scale productions, especially when teams need to collaborate efficiently without stepping on each other’s toes or slowing down the process.

After understanding the key differences between traditional film production and Unreal Engine, we began working with the Sequencer. I learned that the Sequencer is Unreal Engine’s Non-Linear Editing Tool, offering features such as:

  • Ground-up Shot Creation: Allows creators to build individual shots from scratch within the Unreal Engine, providing full control over elements like camera angles, lighting, and scene composition.
  • Pre-visualization: A tool used to create rough versions of scenes or sequences before full production. It helps visualize how a final scene will look and aids in planning and decision-making.
  • Full Film Creation: Unreal Engine can be used for the end-to-end creation of entire films, from pre-production to final rendering, providing a virtual production environment.
  • Game Cinematic Creation: The tool is also used to create cinematic sequences for games, helping to craft narrative-driven cutscenes or trailers with high-quality visuals.

This makes Unreal Engine highly versatile for both film and game industries.

I learned that in film, the narrative is typically structured as a collection of sequences, often following a three-act structure. In contrast, Unreal Engine uses a Level Sequence, which organizes Nested Level Sequences to build an entire narrative film.

We were introduced to two new terms: Possessable and Spawnable Actors. The metaphor we were given is that Possessable actors are like zombies, which linger in our scene whether we want them or not, while Spawnable actors are like angels, which we always want in our scene but which only appear when we call on them. In other words, a Possessable actor lives in the level itself and is always present, whereas a Spawnable actor is owned by the Level Sequence and appears in the scene only when the sequence spawns it.
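To make the zombie/angel metaphor concrete, here is a toy Python model of the distinction (this is not the Unreal API; the class and method names are my own invention): possessables already live in the level and the sequence merely animates them, while spawnables exist only while the sequence plays.

```python
# Toy model (NOT the Unreal API) of Possessable vs Spawnable actors.
class Level:
    def __init__(self):
        self.actors = set()

class LevelSequence:
    def __init__(self, level):
        self.level = level
        self.spawnables = set()
        self.possessables = set()

    def add_possessable(self, actor):
        # A possessable must already live in the level; the sequence only animates it.
        assert actor in self.level.actors
        self.possessables.add(actor)

    def add_spawnable(self, actor):
        # A spawnable is owned by the sequence, not the level.
        self.spawnables.add(actor)

    def play(self):
        # While the sequence plays, spawnables are spawned into the level...
        self.level.actors |= self.spawnables

    def stop(self):
        # ...and destroyed again when the sequence stops.
        self.level.actors -= self.spawnables

level = Level()
level.actors.add("Zombie")   # placed in the level: always there
seq = LevelSequence(level)
seq.add_possessable("Zombie")
seq.add_spawnable("Angel")   # exists only while the sequence plays

seq.play()
print(level.actors)          # both actors present during playback
seq.stop()
print(level.actors)          # only the possessable remains
```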

Afterward, we created a sample project named DMX Previs Sample to learn how to create new cameras within the scene and animate their movements, gaining experience with the Sequencer and with adding cameras and other objects to it so they can be keyframed and animated.

We learned to create Spawnable Actors after adding models to the Sequencer. Right-clicking the object we wish to convert and selecting Create Spawnable within the Sequencer turns it into a Spawnable Actor. This ensures the object is always available in the Sequencer whenever we render the scene or open the Sequencer.

We created a Level Sequence and then opened the Sequencer to add a camera to the DMX Previs Sample scene. After adding the camera to the Sequencer, we keyframed its focus property from frame 1 to frame 100 to create a simple animation.
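Under the hood, keyframing a property like focus means the Sequencer interpolates between the keyed values on the frames in between. Here is a minimal sketch of linear interpolation, using hypothetical focus-distance values of my own choosing (Sequencer also supports other interpolation modes):

```python
def lerp_keyframes(start_frame, start_value, end_frame, end_value, frame):
    """Linearly interpolate a keyframed property at a given frame."""
    frame = max(start_frame, min(end_frame, frame))  # clamp to the key range
    t = (frame - start_frame) / (end_frame - start_frame)
    return start_value + t * (end_value - start_value)

# Hypothetical focus distances (cm): rack focus from 300 to 800 over frames 1..100
for f in (1, 50, 100):
    print(f, lerp_keyframes(1, 300.0, 100, 800.0, f))
```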

This week, I experimented with camera settings and movements to create various camera animations while enhancing my understanding of cameras in Unreal Engine, all while learning to use the Sequencer effectively.

Categories
3D Computer Animation Fundamental Immersion

Week 2: Unreal Engine World Building

This week, we delved deeper into Unreal Engine, beginning with the fundamentals such as the tools, user interface, and various features. We also explored the organization of elements within Unreal Engine and learned how to effectively manage levels inside a project.

When doing world partition and world level design, Unreal Engine is faster, easier, and more efficient, and with the introduction of Nanite it can become incredibly detailed.

The assets we import are saved in the Content folder as .uasset files, levels are saved as .umap files, and projects are saved as .uproject files.

If you don’t know where something is placed and can’t find it, just search for it within Unreal Engine.

Upon launching Unreal Engine, the Unreal Project Browser appears, allowing us to choose a location for our project and assign it a name. Today, we selected “Film/Video & Live Events” as the project type and opted for Starter Content. The Raytracing option is optional and can be enabled later within the project.

Content Drawer Shortcut: Ctrl + Space

Pressing F10 enlarges the viewport while still giving access to the Content Drawer and other options, whereas F11 makes the viewport full screen, in which case the Content Drawer cannot be used.

In this lecture, I learned that organizing assets into levels can enhance efficiency by facilitating the separation of different elements, making them easier to locate. Typically, there is a main level, within which we can create sub-levels.

Lighting is essential for creating realistic environments. I explored Unreal’s advanced lighting tools, including the Environment Light Mixer, Directional Light, Sky Atmosphere, Sky Light, Exponential Height Fog, and Volumetric Clouds. Rotating the Directional Light changes the time of day in the scene. Ctrl + L + Left Click Drag is the shortcut for rotating the Directional Light in the scene.

We created a new emissive material by multiplying a colour with a parameter and connecting the Multiply node’s output to the Emissive Color channel.

Shortcut for creating a Three Colour Constant in Materials: 3 + Left Click

Shortcut to create a Parameter: 1 + Left Click

Just drag and drop the emissive material onto any object to apply it.

The Quixel Bridge add-on allows for quick world-building by importing assets into the scene. Assets can be downloaded in Quixel Bridge and then easily dragged and dropped into Unreal Engine.

Props can be merged using the “Merge Actors” option in the Tools menu. Grouping props allows for rearrangement or further editing, but if no changes are needed, they can be merged into a single entity.

Landscapes can be added to a scene, or a new landscape can be created using Landscape Mode, which also allows for sculpting and shaping the terrain.

Categories
3D Computer Animation Fundamental Immersion

Week 1: Introduction to Unreal Engine

The initial week provided a fundamental overview of Unreal Engine 5.4.4, highlighting its applications in the gaming and film industries.

Since I already had an Epic Games account, I simply navigated to the Unreal Engine section to install Unreal Engine 5.4.4 as instructed.

The key difference between Unreal Engine 5.3 and 5.4.4 is the improved performance, featuring a 25% reduction in GPU time and a 50% decrease in render thread time. Additionally, Unreal Engine 5.4 introduces Nanite support for spline meshes, which is useful for modeling roads and landscapes.

The Unreal Engine Marketplace offers a wide range of assets available for purchase, as well as free assets. Additionally, certain assets are made available for free on a monthly basis, with new selections introduced each month.

The Vault functions as a storage location for purchased assets, enabling users to either add them directly to an existing project or create a new project using the items stored in the Vault.

Quixel Megascans is a vast online library of high-resolution, PBR-calibrated surfaces, vegetation, and 3D scans, offering a wide range of textures, assets, plants, and more for import into Unreal Engine. However, Quixel Bridge is required to download and import these resources into projects.

I got to know that Megascans will be transitioning to Fab in October. Fab is Epic Games’ new all-in-one marketplace for discovering, buying, selling, and sharing assets.

We also had a brief introduction to Quixel Bridge, a crucial tool for browsing, downloading, and importing assets directly from Quixel Megascans into Unreal Engine, Maya, or any other 3D software.

Categories
3D Computer Animation Fundamental Animation

Week 12: Showreel Submission

This is my submission for Term 1.

Design Proposal Submission

Categories
3D Computer Animation Fundamental Animation

Week 11: Body Mechanics Spline

This week, I took my blockout animation a step further by converting it to spline. My main focus was on creating smooth arcs in the character’s movement to make the animation feel more natural and dynamic.

While working on this, I faced a few challenges. The character seemed to drift too far to the right while in the air, which threw off the balance of the jump. The exaggeration at the start of the motion didn’t look right either—it felt unnatural and distracted from the overall flow. On top of that, the character stayed in the air for too long, making the jump feel unrealistic.

To fix these issues, I concentrated on refining the motion path and adjusting the character’s poses frame by frame. It took some patience, but these tweaks started to bring the animation closer to what I had envisioned.

This process reminded me how important it is to focus on details like timing and arcs to make animations look believable. It’s been a challenging but rewarding experience, and I’m excited to see how much further I can push this animation.

Categories
3D Computer Animation Fundamental Animation

Week 10: Body Mechanics Blocking

This week, I worked on blockout animation of the character jumping. I struggled with a few things, like getting the character’s position right during the jump. The landing didn’t look natural either, and the timing between the start of the jump and reaching the peak felt slow.

To fix this, I focused on the motion path of the jump and adjusted the landing poses to make the weight shift more realistic. I also sped up the frames between the takeoff and peak to give the jump more energy.

It took some trial and error, but I learned a lot about timing, weight, and motion arcs.

Categories
3D Computer Animation Fundamental Animation

Week 9: Body Mechanics Planning

This week we were told to find references for our body mechanics assignment and make a plan for animating them. We were given different options, from which I selected “Jumping from a ledge”. After deciding on the animation I wanted to do, I started looking for references on YouTube. For my animation I was looking for parkour references and jumps like Spider-Man and Miles Morales. That’s when I came across a YouTuber named ‘Hero DW’ and saw one of his videos in which he does a backflip jump from a high ledge, and I selected that for my animation.

So, after I found the reference for my animation, I started my planning and drew a 2D test animation to check whether I had the timing right.

Categories
3D Computer Animation Fundamental Animation

Week 8: Walk Cycle Spline

This week I got feedback on my walk cycle spline animation and had to make some changes to improve the walk, like making the heel bend a little more and avoiding knee pop.

Working on my spline animation, I noticed a few problems: the feet were going through the ground plane, and there was some knee pop in my walker. I tried to fix everything in the Graph Editor, creating curves in the feet and body movement using the animation curve tools. After a short while I was able to fix most of the issues in my spline animation.

Categories
3D Computer Animation Fundamental Animation

Week 7: Walk Cycle Blocking

This week George told us that there are 4 main key poses for the walk cycle animation, which are:

  1. Contact Pose
  2. Down Pose
  3. Passing Pose
  4. Up Pose

By understanding these basic poses, we started a 24-frame walk cycle, blocking these poses to create a foundation of keyframes. The contact poses went at frames 0, 12, and 24, and the others were distributed evenly between them.

Again for this assignment, I referenced the walker into my scene and started keyframing the contact poses. I keyframed the down, passing, and up poses evenly, with 3 frames between each. After getting that down, it was easy to replicate the next 24 frames from the initial ones; I just needed to change some values and make a few adjustments in the Graph Editor, and with that I had a 48-frame walk cycle ready. After completing the walk, I added some rotation to the ball of the foot so that the walk feels more realistic and the weight shift is visible.
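The pose spacing described above can be sketched as a small helper that lays out one cycle (the frame numbers match the 24-frame spacing from class; the function itself is just my own illustration):

```python
def walk_cycle_frames(cycle_length=24, poses=("Contact", "Down", "Passing", "Up")):
    """Lay out key pose frames for one walk cycle.

    Contact poses land at frames 0, cycle_length/2, and cycle_length;
    the other poses are spaced evenly in between (3 frames apart for
    a 24-frame cycle).
    """
    step = cycle_length // 2 // len(poses)  # 24 // 2 // 4 = 3 frames
    frames = []
    for half_start in (0, cycle_length // 2):
        for i, pose in enumerate(poses):
            frames.append((half_start + i * step, pose))
    frames.append((cycle_length, "Contact"))  # the cycle closes on a contact pose
    return frames

print(walk_cycle_frames())
```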

I got feedback on my walk cycle blockout animation and had some changes to make. After the critique session we went ahead with the lecture, and George told us that our next assignment would be to convert the walk cycle blockout into spline.

Categories
3D Computer Animation Fundamental Animation

Week 6: Weight Shift Spline

This week I converted my blockout animation into spline after making changes to the blockout. The changes George asked me to make helped me a lot, as I did not run into any major issues when I converted the animation from blockout to spline.

When I first converted the animation from blockout to spline, I faced some challenges: the feet placement was not accurate and there was not enough movement in the heel, but those were minor issues which I was able to fix in a short amount of time. After fixing them in the Graph Editor and cleaning up the curves a little, I was somewhat close to achieving smooth transitions between key poses and a realistic weight-shift feel.