Maziar Mazaheri | Technical Artist / FX Artist


by Mazkota on 30 May 2024 for Rookie Awards 2024

I am excited to share my entry for the Rookie Awards 2024. I've worked hard on a selection of different FX projects within Houdini, Unreal Engine and EmberGen. Over the past year I've also collaborated with a team of 3 other talented artists to create Roo, a game about the journey of a red panda through a mystical terrain.


Final Major Project

Synopsis:

Roo, a determined red panda, navigates a Himalayan woodland shattered by a magical storm. With the guidance of a forest wisp, players solve puzzles, collect mystical runes and restore balance to the enchanted world. This adventure game combines elements of fantasy, puzzle-solving and exploration, offering over 1 hour of playtime in an open world with 7 unique areas. Key mechanics include 7 main puzzles, quadruped player controls, character switching and panda NPCs.

Responsibilities:

In this project, I was responsible for all the in-game and cinematic FX elements. The game contains 2 main cinematics: an opening and a closing. The opening cinematic introduces the game with the thunderstorm; lightning strikes the tree, and Roo is then revived by the Wisp.

My other responsibility was Roo's movement mechanics. As we wanted the game to feel more lively, we introduced complex movement controls to the player character such as ledge perching, rolling, slope adjustment, wall mounting, climbing and mantling. 

Finally, I was in charge of creating procedural assets for the project, such as all the rocks for the different environments and various smaller assets. These included the paths, the ruin area, the statues of Roo and all the rocks within the cave where the game begins.

Opening Cinematic:

The game begins when a thunderstorm strikes the mother tree. Perched on this tree, Roo, your main playing character, is awakened and terrified. Most of the other pandas have run away, trying to find shelter from the storm. As Roo tries to cross a bridge, he slips and falls into the river, which leaves him unconscious. You then find his body in a dark cave, where a guardian wisp called Wilo finds and revives him. The game commences when his eyes open and you now play as Roo.

References:

The role of reference was crucial for creating all the cinematic FX elements. Because our game was semi-realistic and stylized, I had to research films in a similar category. 3 films in particular were really useful to analyse for the revival shot: The Son of Bigfoot, Avengers: Age of Ultron and Eternals, focusing mainly on the dynamics, appearance and timing of the effects. As for the lightning shots, focusing on real-life references was key to really understanding how lightning bolts, and the thunderclouds that form in a storm, function.

Lightning

References

As the artist responsible for the lightning and thunder FX in the opening trailer, I found a selection of live-action references that were key to understanding the dynamics and appearance of lightning in real life. Things I considered were the high emissivity of the main lightning bolt, the frequency of secondary branches splitting off the main branch, and the pacing of a real strike.

One issue with this effect was that, in real-life references, the bolt hits the tree within the blink of an eye, but translating this into a shot context required more than a second to really sell the impact of the strike. I also investigated how the wood reacts in the immediate aftermath, paying attention to details like how the wood flies off, and how the tree suddenly turns a bright orange or red from the intense heat radiating from the centre.

The fact that the tree burns ever so slightly from the strike is such a small detail, and the timing of the cinematic is so fast, that early in the process I had overlooked this FX element and left it out of the final render.

References

I was heavily inspired by the healing touch effect in The Son of Bigfoot, mainly because the style of the film was similar to our own, and the purpose of the effect was similar to mine. In the film, the character is also lifted up, and we see a massive smoke cloud appear around him. Breaking down this effect, there are 3 primary layers: the smoke that emanates from the body, the swirling smoke, and the small ribbons that fly around it.

In addition, I studied Wanda's magical powers from Age of Ultron, analysing in particular their wispiness and how they affect the space around them. I was also very inspired by stills of the character Vision in the same film, specifically how he phased through people and objects, studying the translucency and material properties of this effect. I wanted the smoke to be quite wispy, similar to Wanda's powers, and to retain some of the translucency I observed in Vision's, which is where these references became very useful.

(Explanation continued below, after DNEG Masterclass section)

DNEG VFX Masterclass

Brief:

As one of our final year modules at university, we were allowed to choose a brief from a selection of industry clients that best represented our chosen specialisms. The brief I chose was delivered by DNEG, and asked us to design an abstract magical effect taking place between 2 hero characters.

The environments, rigged and animated characters, lighting and layout were provided. The main marking criteria for this brief were the originality of the effect, nodal hygiene in the technical setup, and the proceduralism and complexity of the effect.

Highly inspired by The Last of Us game and episodic series, I decided to create a fungal-based effect for this masterclass, looking at different types of mould and fungus; 2 in particular intrigued me: cordyceps and aspergillus.

Pre-production

Although we were provided with a base environment, it was our decision whether to adapt it to improve the effect's integration and overall cohesiveness. In my shot, I was always considering how the effect could interact with the environment, so this stage of production was essential for planning that out.

Inspirations

As previously mentioned, The Last of Us, Annihilation, Ghost Rider 2 (Blackout's powers in particular) and Resident Evil were very useful film and episodic references for investigating the pacing, dynamics and appearance of the different mould types, and how they propagate onto different types of surfaces.

The episodic series and game cinematics from The Last of Us were my primary references. Not only do they study the same type of fungus, but I was particularly interested in how they handled the dynamics of the fungus on moving objects and humans, which was something I had to consider in my own shot.

While Annihilation did not feature any moving objects with fungal growth, I was very inspired by the still shots of this alien fungus growing on different human bodies, and how the directors managed to integrate such an organic phenomenon so seamlessly with the abandoned environment. Parts of this film were also very useful for planning the territories where fungus could grow, and how it could manifest in a fully grown form.

The final 2 references, albeit not fungus exactly, showed mould growth and interaction. Take Blackout from Ghost Rider 2, whose power was moulding and decaying different objects and humans; just observing how he interacted with organic and man-made materials like wood and plastic was really interesting to study. The pacing of this film was also particularly useful to observe, mainly because fungus and mould grow very slowly, so one of the challenges was finding a way to realistically portray their dynamics at an accelerated speed.

Real Life References

In the pre-production stage, I also studied fungus, mould and decay using both primary and secondary sources of reference. There were so many types of fungus and mould, but narrowing them down to cordyceps, aspergillus and mycelium growth helped me focus on understanding the behaviour and dynamics of just these 3.

Researching where and how cordyceps fungus grows, I found it is common in forests and woodland areas, creeping onto trees and withered branches. One interesting characteristic I observed was how this fungus grows and propagates like a spider web, keeping very close to the host's surface. The colour scheme and texture of this fungus are very yellow and pus-like. A very small detail I found was that the fungus oscillates slightly as it spreads across a surface.

With mycelium or aspergillus growth, which I found is also very similar to household black mould, I conducted my own experimental time-lapse of black mould growing on different organic objects. This, coupled with secondary sources of inspiration such as mould growing on bread or tomatoes, really allowed me to dissect the life cycle of mould growth and spread, and how the host surface decays as an effect of this contamination.

Key observations I made were how the micro-hairs would start to grow, how the host surface would turn white and frilly, and how small spores in a greenish, pus-white colour would engulf the whole surface after a while.

Simulation Tests

Another step I took was running simulation tests, experimenting with different fungus types and studying how to simulate their growth using SOPs in Houdini.

In these experiments, I was always thinking about the proceduralism of my setup, so custom solvers in SOP networks were fundamental to executing this effect.

What I found was that using a POP-based particle network to first populate the mesh with spheres, then conjoining them with a VDB operation, simulated some of the mould lumpiness I had observed in my initial references. Extending this setup with multiple POP networks, each exhibiting slightly altered particle dynamics, was a non-destructive way of introducing break-up into the size and transforms of these lumpy overgrowths on the mesh. This was also the final approach I took for the effect.
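The core idea of that setup can be sketched outside Houdini. This is a minimal, hypothetical pure-Python stand-in (not the production SOP network): scattered particle positions act as sphere centres, and taking the minimum of their signed distances is the same union operation a VDB combine performs to merge overlapping spheres into one lumpy implicit surface.

```python
import math
import random

def sphere_union_sdf(p, centers, radius):
    """Signed distance from point p to the union of spheres — the
    operation a VDB combine (minimum) performs on sphere volumes."""
    return min(math.dist(p, c) - radius for c in centers)

# Scatter "particles" around a unit circle (a stand-in for a POP
# network populating a mesh surface), with per-point jitter for break-up.
random.seed(7)
centers = []
for i in range(24):
    a = 2 * math.pi * i / 24
    jitter = random.uniform(-0.05, 0.05)
    centers.append((math.cos(a) + jitter, math.sin(a) + jitter, 0.0))

# Points on the scatter ring sit inside the blob (negative distance);
# the circle's centre is well outside it (positive distance).
print(sphere_union_sdf((1.0, 0.0, 0.0), centers, 0.3) < 0)
```

Varying `radius` and the jitter per network is the non-destructive break-up described above: each extra POP network just contributes more centres with slightly different transforms.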

Furthermore, I discovered that the pyrosourcespread node, intended for burning and fuel-related systems, was also very useful for infection systems. Applying this logic, coupled with a carve node, enabled me to simulate the cordyceps fungal propagation with a high level of creative control. By repurposing some of the source spread attributes, I could control aspects of the simulation such as the rate of growth, and introduce lots of break-up into the growth with procedural noises.
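The spread logic that makes this repurposing work is proximity-based infection: each frame, points near already-infected points become infected too. A toy sketch of that per-frame rule (an illustration of the principle, not the node's actual implementation):

```python
import math

def spread(points, infected, radius, steps):
    """Minimal proximity-based infection spread: each step, any point
    within `radius` of an infected point becomes infected — the kind of
    per-frame growth the pyrosourcespread node drives on a surface."""
    infected = set(infected)
    for _ in range(steps):
        newly = {
            i for i, p in enumerate(points)
            if i not in infected
            and any(math.dist(p, points[j]) <= radius for j in infected)
        }
        infected |= newly
    return infected

# A line of points; infection starts at the left and creeps right one
# neighbour per step — the growth rate is what the node lets you drive.
pts = [(float(i), 0.0) for i in range(10)]
print(sorted(spread(pts, {0}, 1.0, 3)))  # → [0, 1, 2, 3]
```

Modulating `radius` per point with a noise field gives exactly the break-up in growth described above.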


Shot planning

The final step in the pre-production stage was doing a very rough layout of the effect taking place between the 2 characters. 

One of the challenges with fungal growth was finding a representative effect that encapsulated all the fungus types, which grow in different environments and settings, within one cinematic shot. While all the references were very helpful for particular shot aspects, I also introduced a colour script to the overall shot to better tie the elements together.

Seeing as cordyceps was the main type of growth I was trying to simulate, I chose dark blue to create a colour contrast between the victim and the attacker. This maroon-blue colour really helped distinguish the victim and attacker meshes in the final shot. I also planned how the fungus could be repurposed as a weapon.

Films such as Resident Evil: Biohazard helped in this respect, as the film villainizes the fungus as a corruptive, infectious element. I was partly inspired by its very desaturated colour scheme, but also by the veins and small tendrils bursting out of the victims' bodies to show an infection taking place. This immediately recasts this type of fungus as a weaponizing force.

I was also inspired by the explosive, bursting aspect I saw in films such as Alien: Covenant and Prometheus, studying how these seemingly inanimate organic structures can immediately infect a victim with a lot of dramatic impact. I wanted to introduce some of this impact into my own shot.

Shot Production

Cordyceps Growth

From my pre-production simulation tests, I expanded the pyrosourcespread growth setup to create primary infection masks for the overall spread system. The benefit of this approach was the amount of creative control it exposed over the growth dynamics, and the setup was also very performant.

Using these growth masks, I found that a VDB approach combined with some remeshing helped create the final organic look. The last aspect of this simulation was finding a way to bind the growth to an animated, deforming mesh.

I solved this by using the pointdeform node with rest-state poses of the animated attackers.

Mycelium Growth

This type of fungus was more interesting to simulate, as it shares characteristics with hair and fine, fibrous growth. Thinking procedurally, I created a custom SOP solver containing VEX to simulate this growth on different types of input mesh.
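The solver's per-frame idea can be sketched in a few lines. This is a hedged, hypothetical Python illustration of what such a solver accumulates (the real version is VEX inside a SOP solver): each step, every hair appends one segment along its surface normal, with a little random lateral wobble for the fibrous look.

```python
import random

def grow_hairs(roots, normals, steps, seg_len=0.1, wobble=0.3, seed=1):
    """Per-frame hair growth in the spirit of a SOP solver: each step
    appends one segment per hair along its normal, with random lateral
    wobble for the fine, fibrous mycelium look. 2D for brevity."""
    rng = random.Random(seed)
    hairs = [[r] for r in roots]
    for _ in range(steps):
        for hair, n in zip(hairs, normals):
            x, y = hair[-1]
            hair.append((x + n[0] * seg_len + rng.uniform(-wobble, wobble) * seg_len,
                         y + n[1] * seg_len))
    return hairs

hairs = grow_hairs(roots=[(0.0, 0.0), (1.0, 0.0)],
                   normals=[(0.0, 1.0), (0.0, 1.0)],
                   steps=5)
# Each hair now has 6 points: its root plus 5 grown segments.
print(len(hairs[0]))  # → 6
```

Exposing `seg_len`, `wobble` and `steps` as parameters is what gives the user the customization mentioned below.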

The benefit of this approach was the amount of customization exposed to the user, as well as the setup's flexibility with deforming and animated meshes. Again, the pointdeform node with the rest-state pose of the model helped achieve this outcome.

Aspergillus Growth

The final type of fungus I was aiming to simulate was aspergillus, or black mould. For this effect it was really beneficial to study Blackout's powers in Ghost Rider 2, mainly his interaction with wood in his power reveal scene. While wood is inanimate and dissimilar to the mechanics of hair and skin, it was useful to observe details such as the blackening of the wood grain, followed by the white spores that slowly overrun the surface.

Expanding my pre-production simulation tests further, I decided to use 3 POP networks with different particle dynamics to introduce break-up into the rate of growth and the scale of the spores, plus an overall system to deform the sphere mesh as if overrun by black mould.

To produce smaller surface detail like the very fine spore hairs, I created another set of temperature groups driving vellum hairs, simulating hair physics on the growing spore hairs just like in my primary mould references.

Vellum Explosions

In my previous iterations of this effect, I found that the impact on collision with the victim was not powerful enough. To improve this aspect, I returned to my chosen film references for inspiration on the components the shot was lacking.

A great source of inspiration for this effect were the Alien: Covenant and Prometheus films, when the alien foetus escapes from the pod and infects its host. The behaviour of the skin tearing, and the fibrous saliva strands that hang off the pod, were small details that definitely added to the realism of the overall effect. It was also crucial to understand the way the flesh tore: with more elasticity to start, then stiffening in position after being torn.

To simulate these characteristics, I chose a vellum system because it gave me creative access over elasticity controls, as well as exposing soft body forces. I used the vellum tetrahedral soft-body constraint in particular because it offered greater elasticity and springiness to the ripping tissue.

One issue in this process was combining a hard-body fracture with a soft-body simulation. This was required so that I could create harsher cuts on the spore head, then layer a slimy, flexible soft-body simulation on top to create more hanging strands.

The final aspect of this effect was the rupture on impact. Again, I wanted to replicate that transition from initial elasticity to something very stiff afterwards. Applying glue constraints helped achieve this behaviour. To push it further, I also added distance-to-points constraints to create longer saliva and sinew strands on rupture.
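The elastic-then-broken behaviour of glue constraints boils down to distance constraints that are dropped once over-stretched. A minimal 1D sketch of that rule, assuming a simple relaxation-style solve (an illustration only — vellum's actual constraint solve is more involved):

```python
def step_constraints(positions, constraints, stiffness=0.5, break_strain=1.5):
    """One relaxation pass over 1D distance constraints. A constraint
    whose current length exceeds break_strain * rest length 'ruptures'
    and is dropped — loosely mimicking vellum glue breaking on impact."""
    alive = []
    for (i, j, rest) in constraints:
        xi, xj = positions[i], positions[j]
        length = abs(xj - xi)
        if length > break_strain * rest:
            continue  # glue broken: the strand hangs free from here on
        # Pull both ends toward the rest length (elastic response).
        corr = stiffness * 0.5 * (length - rest) * (1 if xj > xi else -1)
        positions[i] += corr
        positions[j] -= corr
        alive.append((i, j, rest))
    return alive

pos = [0.0, 1.0, 3.5]            # the third point has been yanked away
cons = [(0, 1, 1.0), (1, 2, 1.0)]
cons = step_constraints(pos, cons)
print(len(cons))                 # the over-stretched constraint ruptures → 1
```

Raising `break_strain` gives more initial elasticity before the tear, the transition described above.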

Final Composite


Rendering

Inspired by the different films previously discussed, I paid close attention to the appearance of specific elements of this effect. Specifically, I wanted the main growing mould spore and frilled hairs to look very fungal and froth-like. Choosing pus-like colours, beige yellows and desaturated greens for the main colour scheme really helped sell this effect.

I also wanted a white tint to grow as the mould propagated across the surface of the sphere, so promoting the temperature masks from the object level into the render context was crucial for dynamically controlling the transitioning colours. In addition, the fine hairs that slowly creep across the surface should be a whiter, more desaturated hue of yellow, blending well with the mould around them while remaining distinguishable as a separate layer of the effect.

The choice of render engine played a huge part in selling the sub-surface and transmissive layers of the spore rupture in the shot. Here, I paid close attention to roughness and sub-surface values to give the main spore a flesh-like, gruesome appearance on impact.

Choosing Solaris and Karma also meant I benefited from the optimizations of a GPU renderer, which made shading and look development a lot quicker.

Iteration 1:

In this iteration, the motion of the particles is too fast and they overtake most of the screen, distracting from the other elements of the effect. To improve it, I would reduce the particle speed and the number of spawned particles. Furthermore, the glow on the particles is over-exposed, so some of the colour data is lost; reducing the exposure and grading the shot further would help. There is also not enough explicit impact from the mould on the victim; throwing the mould ball at the victim would make it more obvious that the fungus is destructive and deadly.

Iteration 2:

In this iteration, the shot is under-exposed, and there is a lack of diffuse colour information to really understand the scene. In this version, I decided to remove the environment elements to focus on improving the main effects between the 2 character assets. In addition, the amount of mycelium (hairy fungus) growing on the victim's body should be reduced, as this element is very distracting. Other issues are that the seed is not very visible, and the lighting is not showcasing the rich texture detail of the cordyceps and mycelium growth.

Iteration 3:

This iteration has fewer particles, which distracts less from the rich fungal growth on the character's body. The particles, however, lack glow and magical radiance; perhaps advecting them as smoke and swirling them around the attacker's hands would give them more direction. The environment really adds shot context and integrates the assets and effects slightly better into the scene. There is less mycelium growth on the victim's body, which makes the interaction between the seed and the victim more obvious, though there should be slightly more fungal hair, as the impact of the fungus now feels weak. Perhaps the fungus should also rupture slightly, similar to the effects in Prometheus and Alien: Covenant.

Critical Analysis and Conclusion

In the process, I found that the particle layers and the dry fungal growth effects are not very cohesive, so something I would improve in the future is making the particle effects emanate from the attacker's hands and the texture more powdery and dry. The magic also feels very smooth, lacking grain and random noise to break up the motion of the power. There should also be more glow cast from the particles onto the attacker's and victim's bodies to integrate the FX elements with the secondary shot layers.

In conclusion, I consider the overall effect a success, as I learnt all about the power of reusing FX systems for alternative use cases; in my scenario, I repurposed pyro spread systems to propagate mould across different surfaces. The cordyceps fungus was also successful in that I achieved a matching colour scheme. Better colour grading and compositing would be required, as well as isolating and polishing FX layers such as the particles and some stepping in the seed smoke and rupture FX. Beyond that, simplifying the individual elements and making them more cohesive would be the main improvements for the future.

Procedural Lightning Tool

One of the challenges I faced in Unreal was creating convincing lightning with enough creative control. Researching alternatives, I found that Houdini offers a lightning node which exposes creative direction over the size, taper and number of branches spawned from input geometry. A limitation, however, was the lack of secondary branch controls and of ways to manipulate the noise on the branches. To combat this, I created my own lightning HDA, which exposes primary and secondary noise controls on both primary and secondary branches, as well as a way of positioning the secondary branches along the main source branch, which was very useful in production.
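The controls such an HDA exposes map onto a classic algorithm: recursive midpoint displacement with random secondary forks. A hedged Python sketch of that idea (an illustration of the technique, not the HDA's internals — the parameter names here are hypothetical):

```python
import random

def lightning(start, end, depth=4, offset=0.5, branch_prob=0.3, rng=None):
    """Midpoint-displacement bolt with optional secondary branches.
    `offset` plays the role of primary noise amplitude; `branch_prob`
    controls how often secondary branches fork off the main one."""
    rng = rng or random.Random(42)
    segments = [(start, end)]
    for _ in range(depth):
        new_segments = []
        for (ax, ay), (bx, by) in segments:
            # Displace the midpoint to kink the bolt.
            mx = (ax + bx) / 2 + rng.uniform(-offset, offset)
            my = (ay + by) / 2 + rng.uniform(-offset, offset)
            new_segments += [((ax, ay), (mx, my)), ((mx, my), (bx, by))]
            if rng.random() < branch_prob:
                # Secondary branch: a shorter fork off the midpoint.
                dx, dy = (bx - mx) * 0.7, (by - my) * 0.7
                new_segments.append(((mx, my), (mx + dy, my + dx)))
        segments = new_segments
        offset *= 0.5  # finer displacement at each subdivision level
    return segments

bolt = lightning((0.0, 10.0), (0.0, 0.0))
print(len(bolt))  # main branch subdivided 4 times, plus random forks
```

Exposing `offset` and `branch_prob` per level is the kind of primary/secondary control the HDA adds over the stock node.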

Niagara Systems

I brought the lightning back into Unreal for the cinematic using a Niagara system. Within it, I instanced points over a large bounding box, then spawned the lightning branches using the mesh renderer and sprite renderer options to control the position and size of the individual branches over the span of the effect. To make the effect more convincing, it was also important to add secondary layers to the cinematic strike, such as wood chipping, lightning sparks and very intense emissivity on the main bolt.

Eye of the Sky

Another aspect of this cinematic was creating the atmosphere of a thunderstorm, which involved an eye-of-the-storm effect to introduce the impending thunder. A limitation of Houdini volumes was that they were very performance heavy, and a solid pipeline between Houdini VDBs and Unreal has yet to be established. To counteract this, it was essential to create the effect entirely within Unreal.

Investigating further, I found that a shader was the most optimized way of achieving this effect, and to make the whole process more procedural, a material function was required to expose parameters for controlling the storm's appearance more easily.

Research

To develop this shader, it was important to study this phenomenon in real life and observe the characteristics of a thunderstorm. In a real eye of a storm, the surrounding clouds are affected by the central eye's wind and resistive force. In an area with little light pollution (and to introduce a magical element into the effect), you can also observe stars and nebulae in the sky. If the storm is thunderous, another element is the slight flickering within the clouds around the eye, which pulse from black to white, showing the crack of lightning occurring inside them.

References

Studying different real-world and film references such as Into the Storm (2014) and Journey 2: The Mysterious Island (2012), I made some key findings about the dynamics and appearance of thunderstorms, which I adapted into my shader function in Unreal. One key finding was that the cloud coverage inside the storm is quite billowy and moves very slowly.

The flickering within the eye, which I had previously imagined as frequent and aggressive, was in fact very slow and progressive. In addition, the eye of the storm swirls the clouds with a vortical force. This was a crucial observation: I had previously imagined the clouds moving in random swirl directions, but in reality they all move in the same direction.

There may also be debris swung around: man-made or organic objects uprooted by the storm front. In my effect, this might be tree pieces or leaves or, in the case of a really powerful storm, perhaps a big chunk of the mother tree being uprooted.

Shader Development

During shader development, I was largely guided by the real-world and film observations I had made. One feature of this shader function is the ability to control parameters such as flow maps, storm scale and speed, dithering and cloud-coverage tiling. These parameters build the billowy cloud build-up from cloud maps; a flow map then distorts the clouds around the eye, as well as the tileable clouds in the storm's interior. The dithering controls were significant because they gave me creative access over the circumference and the start and end locations of the eye.
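The core of such a material function is just per-pixel math on UVs, so it can be prototyped outside the engine. A hedged toy version (the parameter names are hypothetical, and a real Unreal material would express this as nodes, not Python): a radial mask defines the eye and drives the dithered edge, while swirled UVs rotate the cloud lookup in one consistent direction, strongest at the eye wall.

```python
import math

def storm_eye_mask(u, v, eye_radius=0.25, falloff=0.15, swirl=2.0):
    """Radial eye mask plus swirled UVs for the cloud texture lookup.
    Returns (mask, swirled_uv): mask is 0 inside the eye, ramping to 1
    past eye_radius + falloff."""
    du, dv = u - 0.5, v - 0.5
    r = math.hypot(du, dv)
    mask = min(max((r - eye_radius) / falloff, 0.0), 1.0)
    # Swirl decays with distance so the vortex is strongest at the eye
    # wall — and every pixel rotates the same way, matching the
    # single-direction swirl observed in real storms.
    angle = swirl * (1.0 - mask)
    cu = 0.5 + du * math.cos(angle) - dv * math.sin(angle)
    cv = 0.5 + du * math.sin(angle) + dv * math.cos(angle)
    return mask, (cu, cv)

mask, uv = storm_eye_mask(0.5, 0.5)   # dead centre: fully inside the eye
print(mask)  # → 0.0
```

Animating `eye_radius` and `falloff` over time is what the dithering controls over the eye's start and end locations amount to.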

Revival Shot

The final aspect of this cinematic was creating the magical power effect emanating from the main character as he levitates in the air and is revived by the wisp. Part of this process was developing a production pipeline for creating the effect. I decided to use Houdini for its procedural benefits, giving me more creative access to the pyro elements and the dynamics of the smoke.

Shot Development:

To develop this effect, I wanted to utilize Houdini's procedural strengths and follow a simple, effective nodal workflow. Investigating a few nodes in detail, I discovered that the sweep node could be used to create looping curve shapes; applying another sweep node in succession meshed and tapered these curves into organic structures. A final carve node in this workflow allowed me to creatively control the dynamics of the curve growth, as well as introduce some easing into the effect.
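What makes the carve node useful for growth is simple: it trims a curve to a parameter range, so animating the end parameter makes the curve appear to grow. A minimal Python sketch of that trim, assuming a uniformly parameterized polyline (an illustration, not Houdini's implementation):

```python
def carve(points, u_end):
    """Trim a polyline to parameter u_end in [0, 1], interpolating the
    final point — essentially what animating the carve node's second U
    does to grow a curve over time."""
    if u_end >= 1.0:
        return list(points)
    t = u_end * (len(points) - 1)
    i = int(t)
    f = t - i
    # Interpolate the last point partway along the current segment.
    px = points[i][0] + f * (points[i + 1][0] - points[i][0])
    py = points[i][1] + f * (points[i + 1][1] - points[i][1])
    return list(points[:i + 1]) + [(px, py)]

curve = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
# Easing the carve parameter over frames eases the growth of the shape.
print(carve(curve, 0.5))  # → [(0.0, 0.0), (1.0, 0.0), (1.5, 0.0)]
```

Feeding an eased value (rather than a linear ramp) into `u_end` produces the easing described above.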

Inspired again by The Son of Bigfoot reference, I found that this shot had a level of directionality to the smoke as it swirled around and encapsulated the host. To replicate similar behaviour, I researched aura-type effects and found the most common workflow was smoke advected from particle velocities and positions. More involved setups used a torus to advect the smoke fields, which created very abstract, magical effects. Following a similar technique, I used the organic growing vein structures discussed previously to populate particles onto the mesh, and then advected them as smoke fields. These smoke fields were then colour graded using their velocity and position values at the shader level.
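The advection step underlying this workflow is easy to sketch: particles are stepped forward through a velocity field, and the smoke sim later sources density and velocity from them. A toy forward-Euler version with a tangential swirl field around the host (a conceptual illustration, not the production DOP setup):

```python
import math

def swirl_velocity(x, y, strength=1.0):
    """Tangential velocity field around the origin — the kind of field
    used to swirl particles (and then smoke) around a host."""
    r = math.hypot(x, y) or 1e-9
    return (-y / r * strength, x / r * strength)

def advect(p, steps=100, dt=0.05):
    """Forward-Euler particle advection through the swirl field."""
    x, y = p
    for _ in range(steps):
        vx, vy = swirl_velocity(x, y)
        x += vx * dt
        y += vy * dt
    return x, y

x, y = advect((1.0, 0.0))
# The particle orbits the centre; with forward Euler its radius drifts
# outward slightly, which here reads as the smoke slowly expanding.
print(round(math.hypot(x, y), 2))
```

Replacing the analytic swirl with velocities sampled from a torus volume gives the more abstract variants mentioned above.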

Shot Production

A difficulty we faced in the process was matching the lighting, camera, character position and effects between the Unreal scene and Houdini. To tackle this, I exported the Unreal camera into Houdini. However, we couldn't find a pipeline for exporting the character from the Unreal scene in the camera's position, so we had to match the character's position manually using the exported camera in Houdini.

The other problem was matching the Unreal scene lighting to the colour scheme and lighting conditions coming from the character's smoke emissivity. Our approach was to colour grade and match similar nebula colours on the water surface and rocks in the Unreal scene using point lights in the same position. We also animated these lights so that they would make sense from the shot's perspective.

Post-production

The most important part of this cinematic was combining the Unreal and Houdini renders in Nuke. By first applying colour correction and grading to the Houdini effect, I could enhance the colour details and give the overall effect richer contrast.

The next step was matching the position of the character in Houdini to his position in the Unreal scene. This also required some grading to blur the transition between elements as well as match the lighting intensity between renders.

Process

Another difficulty was finding the perfect colour balance for the main smoke field. We were stuck between choosing a more nebula-like colour scheme or a blueish tint of purple with pink highlights. The smoke also required greater lift, as it fell slightly behind the background elements.

Procedural Generation - Rocks

One of my other responsibilities on the Final Major Project was developing procedural techniques to shade and create some of the hard-surface and organic assets in the Unreal environment. A primary asset I was in charge of was the procedural generation and texturing of the different rock assets in the level.

References

Primary Research

To assist in the production of the rock tool, I took it upon myself to gather as much primary reference as possible, studying how rocks form, where they originate, and how climate and environment affect their appearance, as well as identifying key characteristics I could use to classify them into categories.

During development, I travelled to different countries at different times of the year to investigate how rocks change, if they do, and what I could learn from these climates and their effects on rock forms. I visited destinations such as the Grand Canyon, Kew Gardens in London, Wales and Scotland to collect a variety of different types of rocks throughout the development process.

From these travels, I managed to create my own classification of rocks, also thinking about their application to different parts of our game. I discovered that rocks from tropical regions tend to have lichen or mossy overgrowth on their main surface. There were also wet patches where rainfall would collect and pool in the indented regions of the rocks, usually the crevices.

Regarding larger rock structures such as boulders or cliff faces, I found they had a lot more convexity, in the form of harsh ridges and cuts swiping across the main exterior. They also had a lot more colour variation in the concave areas, usually grey midtones or beige and dark-brown grades of peach. With these rocks in particular, there were a lot of high-frequency cuts, usually in one direction, similar to stacking patterns.

The final rock type was also from the Grand Canyon, where drought and very harsh weather conditions had weathered the rock. These rocks were usually less bumpy, but with very deep cracks forming vein patterns across the surface. The main surface itself was only slightly displaced; within the cracks, however, there was a lot more interesting bump detail.

Tool Pipeline

Before creating the tool, it was crucial to plan how it would best fit a production pipeline for our major project, as well as to consider its scalability within a nodal workflow for other Houdini artists.

A very important feature of this tool was the ability to provide it with an input mesh and have it output a rock version of that mesh, as well as a proxy version for optimal use in a game-engine setting. While game engines such as Unreal include decimation processes like Nanite, a lower-res version of the rock could also be used as a proxy to bake a higher-res normal or displacement map onto in a real production.

Another level of research concerned which controls were best exposed to the user. Rocks and organic structures are formed from layers of different noises, so to make the tool more user-friendly, I decided to expose noise and volume controls, giving the artist more of a hand in developing the rock aesthetic.

The other aspect was using my earlier classifications as separate section headers in the tool, making it easier for a user to navigate and to produce unique rock variants for different applications. The final tool has 5 main sections, each with its own primary and secondary noise controls that only apply to that specific type of rock.

Tool Process:

Developing this process further, I found that for larger cliff faces, in regions such as the Grand Canyon, it wasn't as important to give users direct access to converting an input mesh into a rock. The approach taken here was instead to expose areas to delete pieces from a proxy mesh; these deleted pieces produce really nice harsh cuts on the rock surface. Applying further volumes and noises to these cuts matched the aesthetic of cliff faces and dry rocks very well.

For other types of rocks, like ground or dry rocks, there is a separate section in the main tool exposing vein controls through primary and secondary noises. This lets an artist displace the rocks based on vein masks or, alternatively, bake onto low-res versions of these craggy ground rocks for optimal usage in a game engine.

Texture Baking

One of the bottlenecks of game engines is the performance overhead incurred by high-res imports, especially very complex shapes such as rocks. As I was also responsible for the procedural texturing of these rocks, it was crucial to find a way of UV unwrapping them, baking out texture maps, and exporting them to Unreal.

An optimization I discussed previously was the tool's ability to preview and output both low- and high-res versions of the rocks, which really helped when baking normal and displacement details onto the low-resolution versions of the mesh.

Every rock surface is unique, so the process of UV unwrapping varied from rock to rock. For very concave or highly irregular volumes, I researched the field and settled on a technique using UV flatten nodes and procedural UV approaches to unwrap the rocks first. Baking out the individual map layers was then easily accomplished with the labsbaketexture node in Houdini.

Tool Structure

The overall tool is organized into 5 categories for different rock use cases: displaced rocks, which converts the input mesh into a rock version; cliff faces, which creates realistic cliff faces and stacked rock layers; SDF rocks, a craggier version of the input mesh; and finally 2 types of ground rock surface, one craggier with veining, the other flatter with stacked and displaced rock layers.

Procedural Texturing

Another responsibility of mine was procedurally texturing these rocks. To optimize the pipeline, I created a selection of VEX-based and MTLX shaders for automatically applying textures and exporting rocks to Unreal. Within these shaders, it was important to use attributes from the obj level, such as surface curvature and custom masks, to introduce break-up into the rock form from the shader context.

Importing Rocks

One of the optimizations we made to the rock pipeline was finding a way of batch importing multiple .fbx files, iterating through all the rock variants in one for-loop structure, and then batch exporting them into their respective locations: one for rock assets and another for texture maps. Part of this process was a Python script in Houdini, which allowed me to import all the rocks into one object context and iterate through each of them.
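The skeleton of such a batch loop can be shown outside Houdini. This hedged sketch (hypothetical directory names) only plans the jobs: it globs every .fbx variant and pairs each with its asset and texture export paths; inside Houdini, the same loop body would create the import node and trigger the exports.

```python
from pathlib import Path
import tempfile

def plan_batch(import_dir, asset_dir, tex_dir):
    """Glob every .fbx rock variant and pair it with its export paths.
    In Houdini, the loop body would additionally import the file into
    one object context and fire the asset/texture exports."""
    jobs = []
    for fbx in sorted(Path(import_dir).glob("*.fbx")):
        jobs.append({
            "source": fbx,
            "asset": Path(asset_dir) / (fbx.stem + ".fbx"),
            "textures": Path(tex_dir) / fbx.stem,
        })
    return jobs

# Demonstrate with a throwaway directory of dummy variant files.
with tempfile.TemporaryDirectory() as tmp:
    for name in ("rock_cliff_01.fbx", "rock_ground_02.fbx"):
        (Path(tmp) / name).touch()
    jobs = plan_batch(tmp, "export/assets", "export/textures")
    print([j["source"].name for j in jobs])
```

Separating planning from execution like this also makes the batch easy to dry-run before committing to long exports.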

