Design Proposal Submission
This is my submission for Term 1.
This week I focused on all the Sequencers and on camera focus and timing. For one of the scenes I created a Niagara effect of dust particles bursting out in a circle when the character hits the ground, along with some camera shake to add a dramatic effect to the scene.
After I was done with the focus and timing of the cameras, I added Bloom and LUTs to the cameras, turned on Motion Blur, and put all the Sequencers together in a Master Sequence for rendering.
During rendering I faced some issues with the cloth simulation: the cloth was behaving strangely, and I eventually found out that Anti-Aliasing was causing it. So in the Movie Render Queue I turned off Anti-Aliasing and, to compensate for it, added some extra Console Variables to the renderer to enhance the quality of the renders.
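I did not keep a note of the exact list of variables I ended up with, but the snippet below shows the kind of quality-related console variables that can go into the Movie Render Queue's Console Variables setting when Anti-Aliasing is disabled; the names are real engine CVars, but the values are only illustrative, not my exact configuration.

```
r.MotionBlurQuality 4
r.DepthOfFieldQuality 4
r.ShadowQuality 5
r.ScreenPercentage 125
r.Tonemapper.Sharpen 0.5
```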
I rendered all the scenes in the Digital Space, as my own device was not powerful enough to render all of that. Once the rendering was done, I did some colour correction and added sound effects to the video in Adobe Premiere Pro, and with that I completed my Unreal Engine assignment.
So this week I started with the animation, adding cameras to my scene and creating Sequencers with different cameras and character animation. The problems I faced this week were mostly related to cloth simulation and collision. At first I created a Character Blueprint with the cape added as a modular mesh, but there were two problems with that: it is either not possible or too difficult to add a Character Blueprint with a selected animation to the Sequencer, and even though I assigned the character's skeleton to the cape's skeletal mesh, the cape was not colliding with the character.
Even after watching all the YouTube videos on this issue and trying everything, I was not able to make the cape collide with the character mesh, whose bones had been retargeted to the UE4 Mannequin's bones. So, to solve this issue, I went back to Maya, added the cape to the character's FBX file, rigged it again using AccuRig, and then followed the same steps as before to add the character back into Unreal Engine 5.
After all this I had a character with a cape and no longer needed a Character Blueprint, so the issue with the Sequencer was also solved. I added Cloth Simulation with weights to the cape and adjusted the character's hit box to make the cape behave properly.
For the sword, I created a socket on the character's right hand and attached the sword to it, so the sword now moves with the character and its animations. Overall this week I fixed the cloth simulation, added the sword to the character, and worked on the animations and camera angles to sync them all properly.
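I did the socket setup in the editor (creating the socket on the right-hand bone in the Skeleton editor and dragging the sword onto it), but the same attachment can also be scripted. The sketch below is a rough Unreal Python equivalent; the actor labels and the socket name "RightHandSocket" are placeholders for illustration, not the names used in my project.

```python
import unreal

# Placeholder labels and socket name; the real names depend on the project.
actors = unreal.EditorLevelLibrary.get_all_level_actors()
character = next(a for a in actors if a.get_actor_label() == "Hero_Character")
sword = next(a for a in actors if a.get_actor_label() == "SM_Sword")

# Attach the sword actor to the character at the hand socket, snapping it to the
# socket's transform so it follows the character's animation.
sword.attach_to_actor(
    character,
    "RightHandSocket",
    unreal.AttachmentRule.SNAP_TO_TARGET,
    unreal.AttachmentRule.SNAP_TO_TARGET,
    unreal.AttachmentRule.SNAP_TO_TARGET,
    False,
)
```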
So I finally found a character I wanted to add to my project, because I think it blends very well with its aesthetics. I imported the character into Maya and gave it a basic rig with HumanIK so that I could transfer animations to it and then import it into Unreal. However, everything kept getting more complex: the animations I wanted did not have a T-pose, which made it hard to use HumanIK on them since I was new to this. I tried everything to learn and do it properly, but it always felt like I had hit a wall; no matter what I tried, it never worked, and it was too time consuming. So I made some basic animations myself and later imported the model into AccuRig, a free rigging tool, and then imported that model into Unreal, as I was having problems importing the rig directly from Maya.
After struggling a lot with the character and its rig, I imported it into Unreal Engine along with its animations and then mixed the animations. Since I had two copies of the same character (one with the Maya rig and animation, the other from AccuRig), I used the AccuRig version and retargeted it to the UE4 Mannequin skeleton so that I could use the free Unreal Engine animations from the packages I have, blending them with the Maya animation I had created.
So this week I spent most of my time fixing the character I got from Sketchfab and figuring out how to use animations on it properly.
This week I finished grey boxing, imported the meshes into Unreal Engine, and placed them properly there. I also started looking for a character to animate and add to my project. After importing the basic shapes into Unreal, I started adjusting the atmosphere and sky to set the mood for my project, as I am going for a dark, red, night-like atmosphere.
Next I started combining different meshes from Fab, Quixel Bridge and Sketchfab, making modular meshes to replace the basic shapes. I combined different meshes and tweaked the texture files slightly to blend them into each other, and after that I grouped them into 'Packed Level Instances', which gave me the building blocks for the Cathedral.
This week we all had to show Serra our progress with the project for Term 1, so I got a review of my project and what I have done so far.
This week I focused on creating a map layout for my project, as I am going to build a gothic-style cathedral inspired by 'Elden Ring'. For that I needed a plan and a basic layout.
After I was done with the layout, I started placing basic shapes in place of the buildings to get an idea of their size and how they would look. The basic shapes served as placeholders until I finished grey boxing, so that I would know exactly which detailed meshes to place and where.
Grey boxing will take some time to complete because the building I am going for is quite huge and is going to contain a lot of small modular meshes. I was doing the grey boxing in Maya, and I will then import it into Unreal Engine to replace the basic shapes with the detailed, textured meshes.
This week, I looked into Control Rigs in Unreal Engine 5, learning how to use them for animating characters. It was all about setting up rigs and making sure everything was in place for smooth animations.
First, I added a Control Rig to my project via the Content Browser, under the Animation section. I started by creating a Modular Rig, which involved adding a skeletal mesh into the blueprint; instead of the default mannequin, I used a Paragon character from Fab. I then dragged and dropped the rig modules into the available sockets on the character's skeleton to set it up.
I could also customize the rig by adjusting the controller sizes and colors to fit my preferences, which was super useful for making the rig easier to work with.
Next, I worked with an octopus model to practice a different type of rigging. The first step was editing the skeleton and setting the root bone’s transform to 0. I deleted the existing arm bones but kept the shoulder bone. Then, I added bones starting from the shoulder, creating joints for the arms, and bound the skin to the skeleton.
I added a Control Rig Samples Pack from Fab to my project and created a regular control rig for the octopus. After importing the octopus into the rig hierarchy, I checked out the chain hierarchy to understand how the bones connected.
I created controls for the shoulder and the last joint of each arm by selecting the bones and right-clicking to add a new control. To make sure the control wasn’t a child of another control, I pressed Shift + P. I then set up a parent constraint by assigning the root bone in the name section. By parenting the root bone control and attaching it to the shoulder bone, I ensured that the control could transform the bones it was linked to. I repeated this for all the shoulders.
I experimented with SpringInterp by turning it into a vector. I dragged the last bone of the arm into the graph, got the bone’s data, and connected the translation to the target. Then, I linked the control for the bone to the Set Control node, which applied the changes.
I also set up an Aim Constraint to control the direction the arm would face. I defined the name and target for the arm’s aim, set the Aim Control’s parent to Control, and connected the nodes to make sure everything worked together.
Lastly, I applied what I had learned so far by rigging a human character. Using the nodes and techniques I had practiced, I was able to set up the character’s joints and controls, giving me a much better understanding of how to prepare characters for animation. This emphasized the importance of proper joint placement and control creation to ensure smooth, realistic animations.
This week was a great opportunity to learn how to use Control Rigs effectively in Unreal Engine. It’s exciting to see how these tools help set up characters for animation, and I can’t wait to continue building on this knowledge!
In Week 4, I further explored the Sequencer in Unreal Engine and then progressed to working on creating Materials.
Initially, I learned how to create Shots in Sequencer, which is essential for maintaining organization when working with multiple cameras. To create a Shot, you first need to add a Subsequence track from the Add button in the Sequencer menu. This Subsequence Track serves to organize and manage the various cameras assigned to different tasks.
To create Shots, we first need to generate a Level Sequence from the Cinematics menu and ensure proper naming for organization. These sequences are then added to the Subsequence Track, where their lengths can be adjusted. Once the Shots are in the Subsequence Track, we can assign cameras to them by selecting the camera, cutting it with Ctrl + X, double-clicking the desired Shot to open it, and pasting the camera into it.
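Both of these steps are done directly in the Sequencer UI, but a rough Python sketch of the same idea (a master sequence with a Subsequence track pointing at a shot) is below. The asset paths are made up for illustration, and the exact API calls can differ between engine versions, so treat this as a sketch rather than the workflow we used.

```python
import unreal

# Illustrative asset paths; a real project would use its own naming.
master = unreal.EditorAssetLibrary.load_asset("/Game/Cinematics/MasterSequence")
shot_01 = unreal.EditorAssetLibrary.load_asset("/Game/Cinematics/Shot_01")

# Add a Subsequence track to the master sequence and point a section at the shot.
sub_track = master.add_master_track(unreal.MovieSceneSubTrack)
section = sub_track.add_section()
section.set_sequence(shot_01)   # which Level Sequence this shot plays
section.set_range(0, 120)       # frames the shot occupies in the master timeline
```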
This week, I also learned how to create Materials in Unreal Engine. I began by downloading a Material from Quixel Bridge and importing it into Unreal Engine. I then created a new Material and used the Material’s node editor to add the Normal, Base Colour, and Roughness maps of the imported Material. I learned that the R channel is for Ambient Occlusion, the G channel for Roughness, and the B channel for Displacement. I connected the RGB channel of the Colour Map to the Base Colour, the G channel of the Roughness Map to the Roughness input, and the RGB channel of the Normal Map to the Normal input of the new Material.
Next, I experimented with adjusting the tiling of the Material by adding Texture Coordinate, Multiply, and Add nodes. This allowed me to modify the tiling based on specific needs, and I tested different values to explore various outcomes.
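I built all of this by hand in the Material node editor, but the Python sketch below captures the same wiring for reference. The texture and material paths are placeholders, and the pin names ("RGB", "G", "UVs", "A", "B") are those of the standard Texture Sample and Multiply nodes; it is a minimal sketch, not the exact material from my project.

```python
import unreal

lib = unreal.MaterialEditingLibrary
tools = unreal.AssetToolsHelpers.get_asset_tools()

# Create a new Material asset (placeholder name and path).
material = tools.create_asset("M_Stone", "/Game/Materials", unreal.Material, unreal.MaterialFactoryNew())

# Texture Sample nodes for the imported maps (placeholder texture paths).
base = lib.create_material_expression(material, unreal.MaterialExpressionTextureSample, -600, -200)
base.set_editor_property("texture", unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Stone_D"))

packed = lib.create_material_expression(material, unreal.MaterialExpressionTextureSample, -600, 0)
packed.set_editor_property("texture", unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Stone_ORD"))
packed.set_editor_property("sampler_type", unreal.MaterialSamplerType.SAMPLERTYPE_LINEAR_COLOR)

normal = lib.create_material_expression(material, unreal.MaterialExpressionTextureSample, -600, 200)
normal.set_editor_property("texture", unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Stone_N"))
normal.set_editor_property("sampler_type", unreal.MaterialSamplerType.SAMPLERTYPE_NORMAL)

# RGB -> Base Colour, G (roughness channel of the packed map) -> Roughness, RGB -> Normal.
lib.connect_material_property(base, "RGB", unreal.MaterialProperty.MP_BASE_COLOR)
lib.connect_material_property(packed, "G", unreal.MaterialProperty.MP_ROUGHNESS)
lib.connect_material_property(normal, "RGB", unreal.MaterialProperty.MP_NORMAL)

# Tiling: Texture Coordinate multiplied by a constant, fed into each sample's UVs input.
tex_coord = lib.create_material_expression(material, unreal.MaterialExpressionTextureCoordinate, -900, 0)
multiply = lib.create_material_expression(material, unreal.MaterialExpressionMultiply, -750, 0)
tiling = lib.create_material_expression(material, unreal.MaterialExpressionConstant, -900, 150)
tiling.set_editor_property("r", 4.0)   # tiling factor to experiment with

lib.connect_material_expressions(tex_coord, "", multiply, "A")
lib.connect_material_expressions(tiling, "", multiply, "B")
for sample in (base, packed, normal):
    lib.connect_material_expressions(multiply, "", sample, "UVs")

lib.recompile_material(material)
```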
After creating a Material, we were instructed to generate a Material Instance by right-clicking on the created Material and selecting the Material Instance option. The key difference between a Master Material and a Material Instance is that the Material Instance inherits all the properties from the Master Material and allows for real-time updates and adjustments.
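For completeness, the Material Instance step can also be sketched in Python; again, the asset names and paths are placeholders rather than the ones in my project.

```python
import unreal

tools = unreal.AssetToolsHelpers.get_asset_tools()

# Load the Master Material (placeholder path) and create an instance parented to it.
master_mat = unreal.EditorAssetLibrary.load_asset("/Game/Materials/M_Stone")
instance = tools.create_asset("MI_Stone", "/Game/Materials", unreal.MaterialInstanceConstant,
                              unreal.MaterialInstanceConstantFactoryNew())
unreal.MaterialEditingLibrary.set_material_instance_parent(instance, master_mat)
```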
This week, we began working with the Unreal Engine 5 Sequencer. I learned how to translate film production techniques into the Unreal Engine Sequencer, with an emphasis on understanding the key differences between traditional film production and the Unreal Engine workflow.
Film
1. Master Scene Technique (Master Shot)
2. Coverage Cameras
3. Linear vs Non-Linear Storytelling
4. Triple Take Technique
5. Overlapping Action
6. Hitting Marks
7. Traditional Live Action Production Roles
These are all crucial elements of traditional live-action filmmaking, giving a solid foundation for understanding how a production comes together from both creative and technical standpoints.
Unreal Engine
1. Sequence-Based Linear Workflow
2. Shot-Based Non-Linear Workflow
3. Multi-Artist Collaborative Workflow
Workflow Comparison
These workflows are critical for large-scale productions, especially when teams need to collaborate efficiently without stepping on each other’s toes or slowing down the process.
After understanding the key differences between traditional film production and Unreal Engine, we began working with the Sequencer. I learned that the Sequencer is Unreal Engine's non-linear editing tool, offering a range of non-linear editing features.
This makes Unreal Engine highly versatile for both film and game industries.
I learned that in film, the narrative is typically structured as a collection of sequences, often following a three-act structure. In contrast, Unreal Engine uses a Level Sequence, which organizes Nested Level Sequences to build an entire narrative film.
We were introduced to two new terms: Possessable and Spawnable Actors. In the analogy we were given, Possessable actors are like zombies that we don't necessarily want in our scene, while Spawnable actors are the angels we always want there. Spawnable actors have to be called (spawned by the Sequencer) before they appear in the scene, whereas Possessable actors are always present in the level no matter what.
Afterward, we created a sample project named DMX Previs Sample to learn how to create new cameras within the scene and animate their movements, gaining experience with the Sequencer and with adding cameras and other objects to it so that we can keyframe and animate them.
We learned to create Spawnable Actors after adding the models to the Sequencer. By right-clicking on the object I wish to convert and selecting Create Spawnable within the Sequencer, the object becomes a Spawnable Actor. This ensures that the object is always available in the Sequencer whenever we need to render the scene or open the Sequencer.
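We did the conversion through the Sequencer's right-click menu, but the same idea can be sketched with the editor scripting API; the sequence path and actor label below are placeholders, and this is only an illustrative alternative to the UI workflow we actually used.

```python
import unreal

# Placeholder sequence path and actor label for illustration.
sequence = unreal.EditorAssetLibrary.load_asset("/Game/Cinematics/DMX_Previs_Sequence")
actors = unreal.EditorLevelLibrary.get_all_level_actors()
light_rig = next(a for a in actors if a.get_actor_label() == "LightRig")

# Register the actor with the sequence as a Spawnable, so the Sequencer owns it
# (spawning and destroying it) whenever the sequence is opened or played.
spawnable_binding = sequence.add_spawnable_from_instance(light_rig)
```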
We created a Level Sequence and then opened the Sequencer to add a camera to the DMX Previs Sample scene. After incorporating the camera into the Sequencer, we adjusted the focus property from frame 1 to frame 100 and keyframed it to create a simple animation.
This week, I experimented with camera settings and movements to create various camera animations while enhancing my understanding of cameras in Unreal Engine, all while learning to use the Sequencer effectively.
This week, we delved deeper into Unreal Engine, beginning with the fundamentals such as the tools, user interface, and various features. We also explored the organization of elements within Unreal Engine and learned how to effectively manage levels inside a project.
For World Partition and large-scale level design, Unreal Engine is faster, easier and more efficient, and with the introduction of Nanite it can become incredibly detailed.
The assets we import are saved in the Content folder as .uasset files, levels are saved as .umap files, and projects are saved as .uproject files.
If I don't remember where something is placed and can't find it, I can simply search for it within Unreal Engine.
Upon launching Unreal Engine, the Unreal Project Browser appears, allowing us to choose a location for our project and assign it a name. Today, we selected “Film/Video & Live Events” as the project type and opted for Starter Content. The Raytracing option is optional and can be enabled later within the project.
Content Drawer Shortcut: Ctrl + Space
Pressing F10 enlarges the viewport while still giving access to the Content Drawer and other options, whereas F11 makes the viewport full screen, in which case the Content Drawer can no longer be used.
In this lecture, I learned that organizing assets into levels can enhance efficiency by facilitating the separation of different elements, making them easier to locate. Typically, there is a main level, within which we can create sub-levels.
Lighting is essential for creating realistic environments. I explored Unreal's advanced lighting tools, including the Environmental Light Mixer, Directional Light, Sky Atmosphere, Sky Light, Exponential Height Fog, and Volumetric Clouds. Rotating the Directional Light changes the time of day in the scene, and Ctrl + L + Left Click Drag is the shortcut for rotating the Directional Light in the scene.
Created a new Emissive Material by multiplying a colour constant with a parameter and connecting the output of the Multiply node to the Emissive Colour channel (a rough Python version of this setup is sketched below, after the shortcuts).
Shortcut for creating a three-channel Colour Constant (Constant3Vector) in Materials: 3 + Left Click
Shortcut for creating a single Constant: 1 + Left Click (S + Left Click creates a Scalar Parameter)
Dragging and dropping the Emissive Material onto any object applies it to that object.
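As a reference, here is a minimal Python sketch of the same emissive setup, assuming placeholder asset names and a made-up "EmissiveIntensity" parameter; in the lecture this was all done with nodes in the Material editor.

```python
import unreal

lib = unreal.MaterialEditingLibrary
tools = unreal.AssetToolsHelpers.get_asset_tools()

# Placeholder asset name and path for illustration.
material = tools.create_asset("M_Emissive", "/Game/Materials", unreal.Material, unreal.MaterialFactoryNew())

# A colour constant multiplied by a scalar parameter, driving the Emissive Colour input.
colour = lib.create_material_expression(material, unreal.MaterialExpressionConstant3Vector, -600, -100)
colour.set_editor_property("constant", unreal.LinearColor(1.0, 0.1, 0.1, 1.0))

intensity = lib.create_material_expression(material, unreal.MaterialExpressionScalarParameter, -600, 100)
intensity.set_editor_property("parameter_name", "EmissiveIntensity")
intensity.set_editor_property("default_value", 10.0)

multiply = lib.create_material_expression(material, unreal.MaterialExpressionMultiply, -350, 0)
lib.connect_material_expressions(colour, "", multiply, "A")
lib.connect_material_expressions(intensity, "", multiply, "B")
lib.connect_material_property(multiply, "", unreal.MaterialProperty.MP_EMISSIVE_COLOR)

lib.recompile_material(material)
```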
The Quixel Bridge add-on allows for quick world-building by importing assets into the scene. Assets can be downloaded in Quixel Bridge and then easily dragged and dropped into Unreal Engine.
Props can be merged using the “Merge Actors” option in the Tools menu. Grouping props allows for rearrangement or further editing, but if no changes are needed, they can be merged into a single entity.
Landscapes can be added to a scene, or a new landscape can be created using Landscape Mode, which also allows for sculpting and shaping the terrain.