Performance Test

Hanbean Lee
by KAL on 1 Jun 2024 for Rookie Awards 2024

In a sandy, post-apocalyptic world, a hidden civilization of mankind is about to run a performance test of its AI-controlled humanoid infantry battle platform, the K-810. This is a way-too-detailed entry for my film, <Performance Test>. Please enjoy my process!

Amid the desolate desert dunes lies an underground city, where the last of humanity has developed a combat robot under the command of The Organization. The Organization is now awaiting the results of the "Performance Test".

TEASER

FULL FILM

Introduction

Hello. My name is Hanbean Lee.

I have just finished my thesis film, Performance Test, and will soon graduate. Before that, I am honored to submit my work to the Rookie Awards this year.

Initially, I aimed to become a 3D animator. However, while working on this project, I discovered that what I really want is to direct animation, not just perform it. Yes, this project means a lot to me.

Performance Test deals with an AI-enabled robot with a gun. It only needs to aim at its designated enemies, but will it? I hope viewers will ponder whether it is the robot with the gun that is dangerous, or the human who gave the gun to the robot. After all, we now have to live with AI.

Unreal Engine

One unique aspect of the production process for this project is that I used Unreal Engine 5. In short, Unreal Engine is excellent. Although it is not software designed exclusively for animation and thus requires some effort, it is incredibly fast and versatile. During the one-and-a-half-year production period, I aimed to utilize as many features of Unreal Engine as possible.

As an animation producer, I found the following core features of Unreal Engine particularly noteworthy:

Nanite

 Nanite is simply amazing. It is one of the core features of Unreal Engine 5. Although it has some limitations, it allows for the effective rendering of high-density geometry.

Asset Network Including Quixel

Unreal Engine has a large asset network centered around the UE Marketplace and Quixel. Assets created by various content creators can be brought into an Unreal Engine project in an editable format, which allowed me to maintain a certain level of uniformity by editing the properties of external assets. One of the goals of this project was to create the highest-density project possible with one computer and one person. To achieve this, I decided to utilize every available resource, including assets created by others.

MetaHuman

 MetaHuman provides a template structure for highly detailed human characters. MetaHuman is already being used in various fields, and as soon as I discovered this feature, I knew I had to use it. While it required significant effort in many aspects of expression, I am ultimately satisfied with the final result.

Directing

The initial concept for this project was meant to feel like a sort of prologue. Originally, I had envisioned a much larger story centered around the robot featured in this work. That story was intended to be more of a hopeful, heroic tale. However, this project, compared to the original concept, was designed to be significantly more rigid, suffocating, dark, and ominous.

The robot gradually becomes more adept. Initially, it starts hitting targets one by one slowly, but it progressively gets faster, firing in bursts as needed. Eventually, it starts to independently distinguish between Ally and Enemy and takes action accordingly. However, its judgments begin to diverge from expectations. So yeah, quite a simple story.

To be honest, my directing skills were not very good when I was working on the storyboard for this project. You can see this in the video posted above. When I first rendered that video in December 2023, I was shocked by how terrible it was. At that time, I could only manage basic shot compositions, and I saw my lack of skill in many areas, including layout and pacing. The reason I mention this is that after realizing my shortcomings, I spent a lot of time re-directing the film, which significantly improved the project. I wanted to share this experience of learning the importance of proper directing.

The lesson I learned while overcoming my weak spots is this: "Meaningless direction can nullify all other efforts, including animating, lighting, and rigging." For an animation scene to come to life, each shot must have a clear purpose and focus. I now understand the importance of establishing these elements clearly.

List of full video renders from Dec 2023 to Jun 2024 (left);

Part of my self-direction troubleshooting list (right)

It was December 2023 when I realized these shortcomings. At that point, I had to completely overhaul the project, and time was running short. I benefited greatly from Unreal Engine during this period. Rendering all the scenes took only about 12 hours, allowing me to quickly attempt multiple renders and make numerous directorial adjustments and revisions.

Since I mentioned Unreal Engine, I'll continue with the pre-production process. After creating the storyboard and story reel in Storyboarder, I transitioned to Unreal Engine. I created a simple level and prepared assets for previz, then used the Sequencer feature to start the camera direction of the previz stage.

Choosing to compose shots and go through the previz stage in Unreal Engine was quite a smart decision. This approach allowed me to reuse the pre-composed shots directly for rendering and post-production, rather than leaving them behind in the previz stage. This saved significant time on file transfers and lighting after the animation stage.

Assets

After completing the previz stage, I moved on to the asset creation phase. As I mentioned earlier, I wanted to concentrate my efforts on the project's objectives. Therefore, I created the irreplaceable assets at the highest density possible and substituted as many of the rest as I could by purchasing or downloading them. Here is the list, from the planning stage, of assets I planned to create and those I intended to purchase or download through marketplaces or Quixel:

Assets to Create:

1. Robot (definitely!)

2. Objects prominently featured on camera

3. Machinery requiring rigging

4. Clothing

5. Other hard-to-replace assets


Assets to Purchase or Download:

1. Rat

2. Human characters (MetaHuman)

3. General background assets


The 'choose and focus' plan above turned out to be generally effective. But when it came to execution, things didn't quite go as expected. Let me delve into the details, including this aspect.

Scene

 The background of this film comprises the facilities of an underground city. The outside world is a desert filled with sand, so the background is laden with sand dust. While the facilities are maintained, they appear worn in places, and the city constantly strives to overcome shortages of many supplies. In contrast, the control room of The Organization is different. I aimed to incorporate some decorative and sculptural elements symbolizing the slowly crumbling authority of the old world.

Thus, I envisioned the main stage of the film, the shooting range, as a space littered with sand and worn in spots. The design of the space was structured around the concept of conducting the Performance Test by temporarily altering the layout of an already-used shooting range. It's practical, but it carries a sense of poverty. The guiding principle for this space was "functional, practical, but visibly challenging to maintain, and absolutely devoid of wooden materials."
 The atmospheric reference for such a space was drawn from the game <DarkTide> and the movie <Dune>.

At this stage, I already had a previz level set up. Therefore, my approach was essentially to enhance the existing scene. I focused on increasing the density, bringing in assets, and editing them to achieve harmony.

As you can see, I poured thousands of assets into the scene as if my PC had no VRAM limit (IT HAD A LIMIT, BTW). I purchased and downloaded suitable assets from Quixel, the UE Marketplace, Sketchfab, and TurboSquid without hesitation, continuously placing them until they looked good in the master shot (the first two shots in the process view). Then I kept adding assets until the gaps in the previz camera shots were filled. It was quite fun!

 As a result, my scene now contains 6,500 assets, making it quite heavy. However, I was able to work relatively smoothly by using the Nanite feature. While Nanite is originally designed for use with background assets, I also utilized it later when creating characters.

 After the "major shift" in direction(four months ago) I had to create a new scene. Although the camera angles were not yet finished, the concept of a "Control room" was clearly established. Therefore, I drew simple concept art and gathered references accordingly. To convey an impression of authority and rigidity, I based the cultural motifs on the Cold War-era Soviet Union.

However, unlike the shooting range, this space needed to feel very special, so I couldn't design it with assets alone. Therefore, I poly-modeled the base of the scene in Maya and handled the rest in UE. Bringing the base model into UE made the subsequent steps significantly easier and allowed me to create a dense, detailed space. Unfortunately, the space isn't highlighted much in the final film due to directorial reasons, which is a bit sad :(

Character

The character creation process was both the most ambitious part of the project and the one requiring the most compromises. Some characters were crafted to reflect my highest aspirations, making them look truly impressive, while others involved taking existing assets and adjusting them as much as possible to fit the project.

The Robot

I actually love machines. Among them, I adore robots, and if a robot is about the size of a person and carries an assault rifle, I go absolutely crazy. If you were to read this and assume that maybe I just wanted to create a robot all along, I wouldn't argue with you :)

 The robot featured in my work, "K-810", is actually a simple robot without any personality. I considered the appearance and actions of this robot not as expressions of its own will, but as manifestations of the 'authority' that weaponizes it. The robot and the AI within it merely carry out orders as instrumental entities, just like the other humans in the shooting range.

 The design direction for the robot aimed to appear relatively modern compared to other designs in the work, but I sought to exclude elements like holograms or cutting-edge sensors and instead prioritize physical implementation as much as possible. Think of it as more akin to "Wall-E" than "EVE." For real-world references in robot character design, I looked at Boston Dynamics' "ATLAS" and media such as "Chappie" and "Outside the Wire," set in a contemporary background.

 In terms of the design of this robot, I set a sort of quality benchmark that all other visual elements in the work had to keep up with. It meant that all props and backgrounds had to match the density of the robot. If they fell short, I made efforts to adjust them by any means necessary.

When proceeding with the robot's character animation, each part needed to fit together precisely, so careful consideration was required from the design stage regarding how the robot should move. I referenced the anatomical structure of human joints and incorporated it into the robot's design. The shoulder was particularly challenging because the human collarbone technically floats! So I spent a long time on the movement design of this part from the early stages.

I also remember spending quite a long time on the eyes. The eyes have a structure like a 'reverse chameleon eye', recessed inward. Implementation aside, I was always concerned about whether this design would accurately depict the gaze during the animation stage. I was relieved when it turned out quite well in the end.

Lastly, designing the silhouette of the head and the eyeholes was quite tough too. For the impression of the robot's face, I aimed for a somewhat harmless appearance from below but a significantly threatening one when viewed from above. Honestly, this was something I couldn't confirm during the design stage, so I did my best to reflect an image as close to my vision as possible in the final design sheet and moved on to the next stage.

The poly modeling stage of the robot involved a software called <Plasticity>. This software is NURBS-based and provides powerful Boolean operations with accurate measurements, allowing for the construction of hard-surface models. As I did in UE, I inserted details like a maniac. It made me so happy.

But it wasn't all sunshine and rainbows. Issues that had been pondered during the design phase, such as those concerning the eyes and head, finally surfaced here. This was the last stage where modifications could be made, so there were adjustments to the silhouette, additions and alterations to the design, and so on.

Plasticity supports exporting polygons in Ngon, quad, and tri formats. So first, I exported the mesh as Ngons. Then I unwrapped the UVs of the Ngon model in Maya. After that, I transferred the UVs to the quad-exported model.
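
For anyone curious, the UV hand-off looked roughly like this in Maya. This is a minimal sketch, assuming the Ngon and quad exports overlap perfectly in world space; the object names are illustrative:

```python
# Minimal sketch: copying unwrapped UVs from the Ngon export onto the quad
# export in Maya. World-space sampling works because both meshes are exports
# of the same Plasticity model and occupy identical positions.
import maya.cmds as cmds

ngon_src = "robot_body_ngon"   # Ngon export with the finished UVs
quad_dst = "robot_body_quad"   # quad export that receives them

cmds.transferAttributes(
    ngon_src, quad_dst,
    transferUVs=2,     # transfer all UV sets
    sampleSpace=0,     # world space, since the topologies differ
    searchMethod=3,    # closest point on surface
)
cmds.delete(quad_dst, constructionHistory=True)  # bake the transfer down
```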

Yes, I understand that the proper method is to go through retopology, then unfold the UVs, and send the model to ZBrush for detailing before Substance Painter. Embarrassingly, I didn't have much time, so I had to resort to these shortcuts.

In any case, the UVs were unfolded and the transfer was completed. So, I moved on to the next stage. I designated the materials and separated the mesh for baking the mesh maps in Substance Painter. Although using UDIMs might have made things easier, I refrained from doing so due to concerns about potential technical issues in UE. The mesh map baking was done using the highest-density Ngon output from Plasticity. It turned out quite well!

Courtesy of Alexandre Alves, from the link below!

There is something I wish I had known when texturing assets for export to UE. The color profiles of Unreal Engine and Substance Painter are different, and when imported into Unreal, the colors tend to appear brighter than intended.

To synchronize them, you should download the profile from the link and apply it in Substance Painter.

Additionally, here's another helpful link that you might want to check out.

How to match Substance Painter viewport with UE5 ? By Alexandre Alves

The intent for the robot's texturing was armor plates coated with an enamel-like material alongside other metal parts. I aimed for a contrast between the dark-coated metal and the bright armor plates, so I started with a fiber pattern and stacked paint, desaturation, and dust for detail.

I also exported a 'dusted' edition, in which dust and soot accumulate on the robot's materials, making it appear gradually dirtier as time flows. This was implemented in Unreal Engine using the Material Parameter Collection feature.
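
To illustrate how a Material Parameter Collection can drive this kind of blend, here is a hedged editor-Python sketch; the asset path and parameter name are my own assumptions, and in the actual film the value was keyed over time rather than set once:

```python
# Hedged sketch: pushing a scalar into a Material Parameter Collection so that
# every material reading it blends toward the 'dusted' texture set.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()
mpc = unreal.load_asset("/Game/Robot/MPC_K810")  # hypothetical collection asset
unreal.KismetMaterialLibrary.set_scalar_parameter_value(
    world, mpc, "Dustiness", 0.7)  # 0 = factory clean, 1 = fully dusted
```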


Due to the color profile differences mentioned above, and because I used high-saturation light colors during the process, the pre-set colors of the robot didn't appear as intended and looked almost white. Additionally, the specular highlights were far too intense. Therefore, I redid the texturing, assigning a more yellowish hue among other adjustments.

Rigging

 The rigging of the robot was accomplished with the help of my talented colleague, Kim DaHyun, in our Lab. The rigging process was carried out using Advanced Skeleton 6 (AS6), which is truly impressive. For those unfamiliar, I highly recommend checking it out.

 Before transferring the file, I placed Locators at the robot's joints and provided detailed instructions for each rigging component, as shown above.

 The rigging structure of the robot is not much different from that of a typical biped character. However, I did utilize several constraints to implement secondary movements for areas like the neck and hips.

On the other hand, significant effort went into the eye movements. As previously mentioned, the robot has inverted chameleon-like eyes. To implement the gaze, it ultimately had to be controlled by a conventional Eye Aim setup. The three rings surrounding each eye needed to sync automatically with the eye movements. The challenging part was that each ring had to follow the eye to a different extent; otherwise, the eye movements would resemble a large, erratic disc. DaHyun implemented the following formulas in the eye controller to achieve this:

First ring: movement value = eye movement / 1.2, rotation value = eye rotation × 4

Second ring: movement value = eye movement / 2.2, rotation value = eye rotation × 1

Third ring: movement value = eye movement / 50, rotation value = eye rotation × log(1.1)

 Using this approach, the rings surrounding the eyes were able to follow the eye movements while maintaining their respective offsets, creating a more natural and controlled appearance.
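
If you wanted to reproduce this behaviour, a single Maya expression is enough. A minimal sketch using the formulas above, with hypothetical controller names and only one translate/rotate axis shown:

```python
# Minimal sketch of the ring-follow setup as a Maya expression.
import maya.cmds as cmds

cmds.expression(name="eyeRingFollow", string="""
ring1_ctrl.translateY = eye_ctrl.translateY / 1.2;
ring1_ctrl.rotateZ    = eye_ctrl.rotateZ * 4;
ring2_ctrl.translateY = eye_ctrl.translateY / 2.2;
ring2_ctrl.rotateZ    = eye_ctrl.rotateZ * 1;
ring3_ctrl.translateY = eye_ctrl.translateY / 50;
ring3_ctrl.rotateZ    = eye_ctrl.rotateZ * log(1.1);
""")
```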

 The rigged character, except for tubes and soft objects that required skinning, was entirely bound using constraints. This approach yielded significant synergy when exporting to Unreal Engine, particularly due to the ability to leverage the Nanite functionality.

 The Nanite feature allows for the relatively inexpensive and lossless rendering of high-density geometry. However, Nanite cannot be activated on skinned meshes, which is a significant limitation. By importing the robot's geometry as a static mesh and attaching it to the robot's joints, this limitation can be bypassed. This was implemented through Blueprints and Construction Scripts, and the same technique was applied to all props in the project, including guns and machinery. 
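
To give an idea of what that bypass looks like, here is a hedged editor-Python equivalent of the Construction Script approach; the bone name, asset path, and component names are assumptions:

```python
# Hedged sketch: attaching a Nanite-enabled static mesh to a bone of the
# animated skeletal mesh so the part renders through Nanite.
import unreal

actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
skel = actor.get_component_by_class(unreal.SkeletalMeshComponent)

part = actor.add_component_by_class(
    unreal.StaticMeshComponent, False, unreal.Transform(), False)
part.set_static_mesh(unreal.load_asset("/Game/Robot/SM_Forearm_L"))  # hypothetical
part.attach_to_component(
    skel, "lowerarm_l",  # assumed bone/socket name
    unreal.AttachmentRule.SNAP_TO_TARGET,
    unreal.AttachmentRule.SNAP_TO_TARGET,
    unreal.AttachmentRule.KEEP_WORLD,
    False)
```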

Humans

There are five human characters in this film: the leader, the leader's secretary, and the three main characters, scientists conducting the experiment. I wanted the audience to initially perceive the three scientists as being in control. Towards the end, however, the emphasis shifts to show that the scientists are merely instrumental beings, much like the robot. While they tremble before the robot's heated gun barrel, the leader watches the entire scene from a distant control room and makes decisions without a second thought, while the secretary, like a machine, merely performs her assigned tasks.

MetaHuman was released in early 2022. At that time, I was working on the early concept of my project and thought it would save time to use MetaHuman for the human characters. MetaHuman looked really impressive to me back then, so I actively utilized it from the previz stage.

However, as the scene progressed and the robot was completed and animated, I began to feel a sense of incongruity. The visual quality of the backgrounds, props, and robot had outpaced the MetaHuman characters. This discrepancy was particularly noticeable in the faces and hair, which had a distinctly "template-like" impression. I believe this was because so many works using MetaHuman had emerged while I was developing my project.

 So, I set out to find a solution. I discovered that 3DScanStore offers scanned data specifically for MetaHumans. Considering the attributes of each character, I selected suitable faces and used the Mesh to MetaHuman feature to create new faces and assign textures for each character. Additionally, I acquired new hair for the female character and made adjustments. Thankfully, I was able to complete all these processes in about two days.

As it turned out, the primary cause of the low-quality appearance was the textures. For the young female character, the new face seemed somewhat too mature, which didn't quite fit. So I applied the new texture to the old model, and to my surprise, the quality improved significantly without any sense of inconsistency!

 At first, I intended to dress all the human characters in identical lab coats. I thought that simply adjusting the size of the same outfit would yield satisfactory results. How naive. It didn't work out that way.

 The clothes needed adjustments too. The shape of the clothing had to change according to each character's body type, and the silhouette of the clothes had to be adjusted based on the character's role in the film. At this point, I sought help from my distant friend, Jang Jiye. She assisted me greatly by modifying the assets I had, ensuring they fit each character's body shape perfectly according to my requests.

 When texturing each piece of clothing, I considered that the material of the lab coats could symbolize their respective ranks. Thus, I designated a clean sand color for the leader and the secretary's outfits, aiming for a relatively tidy and well-fitted look. In contrast, I thought it would be fitting for the three scientists to have lab coats that appeared dirty and worn out, almost like rugged work uniforms.

MetaHuman characters can be transferred to other DCC tools via Quixel Bridge. I was able to override their rigging using the name-matcher feature of Advanced Skeleton 6.

There is also a rat character. The rat was planned as a purchase from the beginning, and thanks to the excellent quality of the asset, it was usable without major complications. Of course, additional effort was required to convert the XGen hair data to a Groom via a MEL script for import into Unreal Engine.

To depict the unfortunate fate of the rat, I posed the dead rat, then separated it to sculpt and texture it again. Additionally, for animation, I set up a new rig for each piece of the rat.

Prop

 In the course of creating this film, I designed a total of ten props. Each of these props appears at least once as a key object within the scenes. This was part of my initial plan, as I believe one of the best ways to show characters interacting with their world is through scenes where they handle various objects. To be honest, I also included them simply because I have a deep love for machinery.

Rifle

The rifle appears most frequently in the film and is the prop I invested the most effort into creating. Given that the cultural motif of the world is inspired by the former Soviet Union, I designed the rifle accordingly. However, considering that its user is a robot, I thought it would be interesting to design the rifle so that it doesn't accommodate a human user at all. This added a unique twist to the design.

 During my research, I came across the TKB-022p, an experimental firearm from the former Soviet Union. Its rough, handle-less design struck me as perfect for a robot.

 Since I was using design of an existing firearm as a reference, I didn't feel the need for full concept art. Instead, I focused on editing and refining the existing design to align it with my concept, transforming it into a unique element that fit seamlessly within the context of my project.

The rifle was created using the standard pipeline of DCC tools, ZBrush, and Substance Painter. Since the robot always needed to carry the rifle, I made an additional edition that appears worn and covered in soot, to match the robot's visuals. For the later part of the film, I also added an emissive map to simulate the barrel heating up, allowing the effect to be controlled within Unreal Engine.

Target Wrangler

This machine moves along rails set in each shooting lane. Upon reaching its designated position, it extends its hydraulic arms to display targets. The Target Wrangler consists of grip wheels, continuous tracks, extendable arms, and an adjustable target clamp.

Finding suitable references during the design phase was challenging, so instead of digging through websites I created a more detailed concept design. This included designing the rails and targets associated with the Target Wrangler.

The designs of all machinery featured in this film, including the Target Wrangler, were heavily inspired by the simple yet functional aesthetics found in Dieter Rams' work.

While the implementation of the Target Wrangler was successful, issues arose during the rigging process, specifically with the movement of the tracks. I initially attempted to use the wheel function in Advanced Skeleton but soon ran into difficulties. Fortunately, Mik3 came to the rescue. While the exact details of his solution remain magical to me, it's safe to say he rewrote all the expressions related to the tracks' movement. Thanks, Mik3!

Consoles

 The Consoles in the Film are mostly imported from external sources, with some specially designed for this experiment. Each Console serves a specific role within the artwork.

The Container Module restrains the robot and stores all the equipment and materials it needs. On the side of this module, ammunition is supplied directly; it automatically reloads the magazine and places it in the inner discharge port.

The Target Console is placed at each shooting lane and displays the status of the current Shooting Session. While it features many switches to control the session, they are overridden by the AI within the film, rendering them useless.

The Feed Monitor syncs with the robot's POV for observation by the scientists. Adjacent to the Feed Monitor is an auxiliary monitor used for one-way communication with the Organization, designed based on the switch arrays of broadcast equipment.

 The Behaviour Monitor is one of the devices directly showing AI decision-making processes. Through this device, we can catch a glimpse of K810's struggles.

The Switch Panel is a device that grants authority to the AI and is directly connected to the circuits. It was used as an auxiliary device.

The Power Console grants further authority to the robot's AI and distributes power. It features many mechanical switches, such as keys and levers, along with analog indicators, indirectly depicting the process of granting and revoking the robot's authority.

 Each console is designed to have multiple functions depending on its purpose. While I would love to delve into explaining the mechanical properties and rigging structures associated with each of these props one by one, I understand that even esteemed readers like yourselves must be quite exhausted by now. Therefore, I'll focus on highlighting the notable aspects.

This work features numerous screens, and while the details of each screen may not be individually significant, I believed they added a convincing touch and depth to the film. Therefore, I set them up to display information relevant to the world and the stage of the story, based on the initial concept. Each screen design was created with a primitive 16-bit theme: it went through sketches before being designed and animated in After Effects. These screen images were then rendered as image sequences and used as emissive maps within Unreal Engine.

Document

 The document is the last asset added to this film, but it is also the most important. The contents of this asset symbolize the deep-seated authority and the relationships between the subjects who are subjugated to it, which is a crucial theme of the film.

 When this asset first appears in the film, I wanted it to be a moment where the audience gets a glimpse of the hidden truth happening behind closed doors. To achieve this impression, I thought it would be effective to include seals for confidentiality, memos, and symbols of the 'organization' scattered throughout the document. To accomplish this, I gathered references from confidential documents and old official records from past eras.

 While implementing this asset, I thought it would be compelling to depict the backstory behind this asset, which serves as a powerful symbol in the film. Although this narrative couldn't be fully captured within the film itself, I was able to convey it indirectly through the teaser.

Animating

 Animating was one of the most challenging parts of my project. As I mentioned earlier, my initial direction was quite lacking. And to reiterate, empty direction leads to empty animation. I realized this during the animation process, but I still had the obligation to proceed toward completion.

 Before diving into the animation process, I prepared several preliminary tasks:

1. Export the entire level from Unreal Engine to Maya to prepare the background for animation.

2. Export the camera for each shot from Unreal Engine's Sequencer.

3. Export rig files in FBX format and import them into Unreal Engine as skeletal meshes.

4. Import the background and rig files into a single scene in Maya and create an AnimPrep file.

5. Create an animation scene in Maya, reference the AnimPrep file, and import the camera from step 2 to finalize the animation scene.

6. Once the animation is complete, export the animation from the animation scene in FBX format and import it back into Unreal Engine.
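
As an illustration of step 6, the bake-and-export step can be scripted in Maya. This is a minimal sketch; the namespace, frame range, and output path are illustrative:

```python
# Minimal sketch: baking constrained animation to keys and exporting FBX for
# re-import into Unreal, via Maya's FBX plug-in MEL commands.
import maya.cmds as cmds
import maya.mel as mel

cmds.select("K810_rig:root", hierarchy=True)       # select the whole rig hierarchy
mel.eval('FBXResetExport')
mel.eval('FBXExportBakeComplexAnimation -v true')  # bake constraints to keyframes
mel.eval('FBXExportBakeComplexStart -v 1001')
mel.eval('FBXExportBakeComplexEnd -v 1120')
mel.eval('FBXExport -f "D:/PerformanceTest/anim/sh010_K810.fbx" -s')  # -s = selection
```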

For this animation, motion capture was partially used. With the invaluable help of my research-lab friend Lee Soyoung, I used the Rokoko suit available in our lab to animate the MetaHumans. Ultimately, this motion capture data served as 3D reference for the animation work I did with Nam Hyebin. During cleanup of the motion capture data, I found Animbot's "Smooth Rough" and "Connect to Neighbors" functions particularly useful.

 In addition, I experimented with the MetaHuman Animator feature for facial animation. It's always worth trying new things when given the chance. However, this did not make it into the final film. Unlike body mocap data, exporting it to Maya was challenging, and cleaning up the keys proved to be very tough.

For the robot's animation in the early stage of the film, I aimed to convey a sense of it performing very basic commands with simple movements, reflecting mechanical and somewhat rudimentary behavior. As you can see, the robot retrieves a mounted rifle with slow, segmented actions. This sequence breaks down into three key movements: looking, raising the arm, and rotating the wrist and body.

As the film progresses, the robot's shooting proficiency increases, and its movements become faster and more fluid. Here, the robot performs the action of aiming at a new target, which involves three steps: looking, turning the body, and aiming. Unlike the early scenes, where the movements were slow and segmented by body part, these actions are executed simultaneously, with motions that counteract the inertia generated by the quick movements.

As long as the robot is powered on, the component corresponding to the pupil of its eye is designed to spin rapidly. In this shot, the robot is shown resolving an error on its own and refocusing on the target. The goal of this shot was to convey the robot's way of perceiving and intending through its eyes, which function differently from human eyes.

To implement the gaze, the rings surrounding the robot's eye were animated to move erratically, causing the eye's position and rotation to appear scattered. This depicted the dispersion of its gaze, symbolizing confusion. Additionally, the rotation of the rings was animated to further emphasize the robot's chaotic state.

Conversely, when the robot regains focus, the previously erratic movements were programmed to align suddenly, with both the eye and the surrounding rings swiftly snapping to a forward-facing position. This instant alignment was used to convey that the robot had regained its concentration.

Lighting

 I conceived the lighting direction for this Film in two distinct phases: before and after the Performance Test begins. When the scientist operates the key, the lighting in the shooting range turns off, leaving only the flood light of the container module. This was intended to signify a pivotal transition for the narrative.

The first lighting setup features evenly distributed light, but overall it's not very bright and has a slightly hazy impression. Throughout the scene, a large rect light on the ceiling creates a gentle gradient of light from top to bottom.

 For the second lighting setup, most of the light is muted, leaving only limited but powerful backlight to create a sharp impression. By muting most of the light, it becomes easier for the audience to focus on the areas I intended. With the light becoming sharper, it required careful adjustment of the fill light.

I completed the reference search based on this plan and proceeded with the light-painting work. While the second directing style looked quite appealing, I had some concerns about the slightly faded charm of the first. However, once I started the lighting work in Unreal Engine, the results turned out quite well, with the addition of fog and the depiction of hazy light.
Additionally, as the climax of the film approached, I devised a lighting setup in which the scene is bathed in red light.

VFX/Simulation

This project's VFX and simulations were complex stages that used Unreal's Niagara, Houdini, Maya's MASH, and Marvelous Designer together. Some VFX were implemented from the previz stage, while others were completed at the end of the project.

Niagara

 The muzzle flash, one of the most crucial VFX elements, was created using UE's Niagara. This VFX not only encompasses the implementation of gun muzzle flames but also includes casing ejection and smoke effects. To implement this effect, I started by conceptualizing the necessary components. The effect includes the following elements:

1. Muzzle flash

2. Smoke

3. Casing ejection

To implement the muzzle flash effect, I structured five Niagara modules as follows:

1. Primary muzzle flash effect

2. Secondary muzzle flash effect

3. Muzzle sparkle effect

4. Smoke effect

5. Casing eject effect

To ensure that these five modules execute simultaneously on my signal, I connected the Loop Delay of the Spawn Burst in each module to a variable I set (FireRate, x/sec), allowing me to control them in Sequencer.

One challenging part was the dynamics of the casing-eject effect. Due to the limitations of Niagara's physics, the ejected casings tended to stick vertically into the ground. To address this, I wrote a Module Script that gradually aligns the mesh with the ground based on its collision count. For those interested, I'll include a detailed BP below :)
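
The core idea of that Module Script, re-expressed as plain Python for readability (the real version runs per particle inside Niagara; the bounce threshold and easing curve are my assumptions):

```python
# Hedged sketch of the casing-settle logic: each recorded collision blends the
# casing's pitch further toward lying flat, so it can no longer spear the ground.
def settle_alpha(collision_count: int, max_collisions: int = 5) -> float:
    """Return a 0-1 blend toward ground alignment, growing with each bounce."""
    t = min(collision_count / max_collisions, 1.0)
    return t * t * (3.0 - 2.0 * t)  # smoothstep easing

def settled_pitch(current_pitch: float, collision_count: int) -> float:
    """Blend the particle's pitch toward 0 (flat on the ground)."""
    return current_pitch * (1.0 - settle_alpha(collision_count))
```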

MASH

 I used Maya's MASH feature to implement the simulation. In this entry, I'll highlight a particular case.

To create the effect of ammunition being drawn into the machine, I used two curves placed in a hanging shape. Using the MASH feature, I positioned a model combining ammunition and links along the curves, achieving the basic form of the effect.

I then keyed the animation speed along the curve to give the impression of the ammunition moving along it. For depth, I animated the control vertices of the curve the ammunition was attached to, giving it a sense of swaying as if under tension.
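
For reference, the MASH side of this setup can be scripted. A hedged sketch assuming a combined ammo-and-link mesh and a hanging curve; the names are illustrative, and the exact attribute names may differ between Maya versions:

```python
# Hedged sketch: distributing the ammo-link mesh along a curve with MASH.
import maya.cmds as cmds
import MASH.api as mapi

cmds.select("ammoLink_geo")                 # the combined ammunition/link model
network = mapi.Network()
network.createNetwork(name="ammoBelt")      # builds the MASH network

curve_node = network.addNode("MASH_Curve")  # constrains instances to a curve
cmds.connectAttr("beltCurveShape.worldSpace[0]",
                 curve_node.name + ".inCurves[0]")

# Keying the Curve node's animation speed gives the crawling-belt motion;
# the swaying came from animating the curve's CVs directly.
cmds.setKeyframe(curve_node.name, attribute="animationSpeed", time=1, value=0)
cmds.setKeyframe(curve_node.name, attribute="animationSpeed", time=24, value=1)
```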

Houdini

Indeed, my bright and resilient friend Ham Yusun came to my rescue as I was struggling with VFX. I am truly fortunate :). She had the magical tool Houdini at her disposal, and, rather ungratefully, I asked her to handle the most challenging task: the exploding rat!

Truth be told, there's not much to say. I handed her the rat's head along with the tutorial link above, and she worked tirelessly to provide me with an amazing result.

Compositing

 The compositing process for this project was essential in recovering from mistakes and adjusting the video tone to create a more captivating visual. However, the most critical aspect was implementing the robot's POV scenes.

Here's an example of how I 'recovered from a mistake' in a particular scene:

In this scene, to emphasize the Bloom effect of the lighting, I applied a glow effect to the masked areas. Additionally, I added the missing muzzle flash and effects to simulate bullet trajectories. To align with the lighting direction, I blended two types of LUTs that matched the tone and flow of the scene. Finally, I added film grain to complete the scene.

POV

 On the other hand, implementing the robot's POV required significantly more effort. I approached depicting the robot's viewpoint similarly to how I portrayed the screens on the consoles mentioned earlier. However, unlike the console screens, the robot's POV needed to illustrate how visual input through the robot's eyes is processed and interpreted.

 While pondering the methodology, I stumbled upon the screen display of Tesla's autonomous vehicles. I believed this would serve as an excellent reference for rendering the robot's vision. To implement this effect, I extensively used the RotoBrush and Tracker features in After Effects.

 Additionally, I discovered how to use Unreal Engine's post-process material feature to obtain edited images through various effects from the camera's perspective. This effect proved to be an excellent tool for depicting the robot's status.

In summary, I went through the following steps to implement the robot's POV:

1. Used the post-process material feature in UE to add TV lines across the entire screen and highlight targets.

2. Used the Tracker and RotoBrush tools to mask out the terrain and render it with lines, then added aiming points to the targets.

3. Added HUD information and depicted the real-time changes in the HUD based on the robot's actions.

4. Applied screen effects and a slight chromatic aberration effect.

5. Added LUTs and film grain to finalize the look.

Closing

 It’s finally over. I wonder if anyone is actually still reading this entry. The reason I wrote such an extensive entry is not only to make a good impression on the judges but also because I want my future self to remember the experiences from this project.

 Over the past year and a half, I have learned an incredible amount from working on this project. Some decisions were good, some were unnecessary, and some were outright terrible, requiring significant effort to rectify. Nevertheless, I feel a sense of pride in this film, which is filled with both regret and glory. Through this project, my skills have improved significantly, and my experience has deepened.

 I know that when I graduate and enter the professional field, I will likely make mistakes far worse than those I've made here. Whenever that happens, I will revisit this entry to remind myself of how I overcame challenges. Cheers.

Shot Progress

