![Digital Dress Replacement on a budget](https://d3stdg5so273ei.cloudfront.net/pasophina/2024-05-28/958039/1400xAUTO/Frame_95_FinalDress_ShadowsAdded-pasophina-crop.0095.jpeg)
Digital Dress Replacement on a budget
Hi, my name is Paulina and this project is the result of my Bachelor semester at the HDA Darmstadt. My goal was to create a fully realized VFX shot with a digital costume applied to a live actress. Enjoy!
This project was created as part of my Bachelor thesis as a student of the Animation and Game Bachelor program at the University of Applied Sciences Darmstadt. In my thesis, I researched the use of digital costumes in TV shows and feature films. Furthermore, I aimed to find a feasible workflow for digital dress replacement for projects with limited technical and financial resources.
In the practical part of my thesis, I created a project to implement the suggested workflow. My goal was to create a finished VFX shot containing a fully digital costume applied to a live-action actress.
The final result including the VFX breakdown
Getting started
At the beginning of this project, I did a lot of research to identify approaches that have previously been used to successfully replace costumes in post-production. Once I had figured out how costume replacement is done, I tried to find ways to apply the relevant concepts using the limited resources I had as a university student (i.e., no funding and limited access to software and technical equipment). Once my game plan was set, I got to work.
I split my project into separate stages. In pre-production I created all the necessary assets I needed, meaning the digital double of the actress and of course the digital dress.
Process view: from the plate footage to the final result
The Dress
I modeled my digital dress after an antique French evening dress from the 1890s, which I found on the official Met Museum website (https://www.metmuseum.org/art/collection/search/107006).
Though the creation of a realistic dress was important for the entire shot to work, the dress was not my top priority in this project. The textures and shading of the dress could definitely be improved upon. Instead, I focused my efforts on the seamless integration and connection of the digital dress with the live-action actress.
Marvelous Designer for Dress Creation
I used Marvelous Designer to model and simulate the dress. I separated the dress into two pieces: the petticoat and the corset. Creating the petticoat was a rather straightforward process, though at some point the mesh density required for realistic folding behaviour really slowed down my viewport performance.
The corset was a lot more difficult to model, and later to simulate. The original dress had some pleated details across the chest, and recreating this look in Marvelous Designer proved quite difficult. The many sewing connections needed to fold the fabric into place became increasingly unstable and unpredictable in their simulation behaviour.
During the final simulation, some constraints around the neckline of the dress also shook uncontrollably, which was unfortunately noticeable in the final result. I tried different methods to fix this problem; in the end, I resorted to separating the front and back parts of the corset. The back part remained the original, simulated version. For the front part, I ended up using just a single static frame from the simulated Alembic cloth cache. Since the corset was supposed to be made of a stiff material anyway, I decided I would prefer a completely static corset over one with shaking constraints that drew the audience's attention to the wrong thing.
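The single-frame workaround above amounts to a simple edit on the cloth cache: pin a chosen set of vertices to their positions at one reference frame while everything else stays animated. A minimal sketch of that idea, assuming the cache is available as per-frame vertex lists (all names here are hypothetical, not from any specific tool):

```python
def freeze_vertices(frames, frozen_ids, ref_frame):
    """Replace the positions of `frozen_ids` in every frame with their
    positions at `ref_frame`, keeping all other vertices animated."""
    reference = frames[ref_frame]
    result = []
    for frame in frames:
        new_frame = list(frame)
        for vid in frozen_ids:
            new_frame[vid] = reference[vid]  # pin to the chosen frame
        result.append(new_frame)
    return result

# Tiny example: 2 frames, 3 vertices; vertex 0 (the "front corset") gets pinned
frames = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)],
    [(0.5, 0.1, 0.0), (1.1, 0.0, 0.0), (2.2, 0.0, 0.0)],
]
pinned = freeze_vertices(frames, frozen_ids={0}, ref_frame=0)
```

Vertex 0 now holds its frame-0 position across the whole cache, while vertices 1 and 2 keep their simulated motion.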
![](https://d3stdg5so273ei.cloudfront.net/pasophina/2024-05-28/271040/1400xAUTO/DD_Progress_Images-pasophina.png)
The Digital Double
I did not have a professional photogrammetry setup, so I used free photo-scanning apps as well as hand-modeling to create my digital double. Due to the limitations of the photo-scanning software, I scanned individual body parts separately and later combined several different scans into one body. I used the meshes generated by the photo scans as a basis for my retopology.
This manual approach worked better than expected, though, of course, inaccuracies remained. This became especially evident during the roto-animation process. In hindsight, it would have been valuable to do some animation tests as soon as the digital double mesh was done, to test its deformation capabilities and make improvements to the body mesh. I also hugely underestimated the impact that detailed weight painting would have on the accuracy of deformations.
Due to time constraints, I did not create a rig from scratch but instead used Mixamo's auto-rigger and the dedicated Blender add-on to create animation controls.
![](https://d3stdg5so273ei.cloudfront.net/pasophina/2024-05-28/589983/1400xAUTO/Screenshot%202023-12-07%20093942-pasophina.png)
Frame from the pre-production workflow test (unfinished dress)
Early workflow tests during Pre-Production
Since my workflow was quite experimental, I did some tests to evaluate the actual usability of the workflow and the planned production pipeline. Not only was the test an important practice opportunity for me (e.g. I had never done roto-animation for VFX before), but it also gave me insight into how long certain workflow steps take and showed me problematic areas of the workflow. The knowledge I gained during the workflow test was very important for creating my production schedule.
Production
Filming for this project took place during one afternoon in December in my university's on-campus studio. Two film majors from my university helped me with everything camera- and on-set-lighting-related (again, a big thank you to Tamara and Daniel; without them, this project would not have been possible!).
In order to accurately recreate the set in 3D space, I took many reference pictures, measured the distances between lights, the camera and the actress and also created some HDRIs.
Screenshots from my Marvelous Designer viewport: for faster simulation and less rework in case of iterations, I simulated the corset and the petticoat separately. I also removed some of the digi double's body mesh to avoid collision problems during the simulation. The armpit area in particular was prone to errors and often tangled with the corset's sleeves.
Post-Production
Match-Moving
After I had recorded my live action plate, things were finally getting serious. I began by recreating my set in Blender, calibrating my virtual camera and aligning my digital double with the live action actress.
Match-moving the acting performance was done entirely by hand, which was a tedious but ultimately fun process. Initially, I had considered using some form of (possibly image-based) motion capture to speed up the match-moving process, but I could not get any setup to work properly in time, so manual roto-animation it was.
Simulation
After the roto-animation was finalized, I exported the body animation to Marvelous Designer and simulated the cloth behaviour there. I chose Marvelous for simulation, as I did not want to introduce yet another 3D application into the production pipeline. The results were fairly good, but in hindsight I would choose a full 3D package that gives more access to simulation settings and offers better options for troubleshooting (as previously mentioned, I encountered some uncontrollably shaking constraints which I could only fix with an uncomfortable workaround).
Rendering and Compositing
I imported the cloth simulations into Blender as MDD point caches for rendering. I chose to render in Cycles, as it is the render engine I was most familiar with. I made some last manual changes to the cloth caches using shape keys to improve the fit of the dress on the actress. After finalizing the shading and texturing of the dress, I matched the lighting and finally rendered the dress. For higher control over my CG elements in compositing, I created separate render layers for the corset, the sleeves and the petticoat. I used multilayer EXRs to store my renders and the ACES colorspace to ensure color consistency throughout the project.
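MDD is one of the simplest point-cache formats: a big-endian binary file with a frame count, a vertex count, one time stamp per frame, and then XYZ floats for every vertex in every frame. A minimal round-trip sketch of that layout (my own reading of the format, not tied to any specific exporter):

```python
import io
import struct

def write_mdd(stream, times, frames):
    """Write an MDD point cache: big-endian header (frame count,
    vertex count), per-frame time stamps, then XYZ per vertex per frame."""
    num_frames = len(frames)
    num_verts = len(frames[0])
    stream.write(struct.pack(">2i", num_frames, num_verts))
    stream.write(struct.pack(">%df" % num_frames, *times))
    for frame in frames:
        for x, y, z in frame:
            stream.write(struct.pack(">3f", x, y, z))

def read_mdd(stream):
    """Read the cache back into (times, frames)."""
    num_frames, num_verts = struct.unpack(">2i", stream.read(8))
    times = list(struct.unpack(">%df" % num_frames, stream.read(4 * num_frames)))
    frames = []
    for _ in range(num_frames):
        data = struct.unpack(">%df" % (3 * num_verts), stream.read(12 * num_verts))
        frames.append([tuple(data[i:i + 3]) for i in range(0, len(data), 3)])
    return times, frames

# Round-trip a two-frame, one-vertex cache at 24 fps
buf = io.BytesIO()
write_mdd(buf, times=[0.0, 1 / 24], frames=[[(0.0, 0.0, 0.0)], [(0.1, 0.0, 0.0)]])
buf.seek(0)
times, frames = read_mdd(buf)
```

Because positions are stored per frame rather than as a deforming rig, a cache like this is easy to edit after the fact, which is what made the shape-key fix-ups possible.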
In Nuke, I used the basic workflow for the integration of CG elements into live-action footage: I shuffled out all of the passes I had rendered and put them back together to create my CG rebuild. I was already pretty content with the lighting and color match I had done in Blender, so I only made some minimal changes to the CG elements to further integrate them into the live-action plate.
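The rebuild step boils down to summing the shuffled-out light passes back into the beauty, which is what makes per-pass grading possible. A toy per-pixel version of that idea (the pass names are illustrative; a real Cycles rebuild also multiplies the de-albedoed direct/indirect passes by their matching colour passes before summing):

```python
def rebuild_beauty(passes):
    """Additively recombine render passes into a single RGB pixel.
    `passes` maps pass name -> (r, g, b)."""
    r = sum(p[0] for p in passes.values())
    g = sum(p[1] for p in passes.values())
    b = sum(p[2] for p in passes.values())
    return (r, g, b)

# One pixel of the dress, split into graded-separately contributions
pixel_passes = {
    "diffuse": (0.20, 0.15, 0.10),
    "glossy": (0.05, 0.05, 0.06),
    "emission": (0.00, 0.00, 0.00),
}
beauty = rebuild_beauty(pixel_passes)
```

Grading a single pass (e.g. lifting only the glossy contribution of the corset) and then re-summing is exactly the kind of control the separate render layers were created for.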
![](https://d3stdg5so273ei.cloudfront.net/pasophina/2024-05-28/592468/1400xAUTO/Nuke_Comp_script-pasophina.png)
Final Nuke Comp Script
Conclusion: Things I would like to improve upon
This was my first real venture into VFX in combination with live-action footage, and many things in this project I did for the first time, so I am pretty pleased with the result. Creating this whole project from start to finish gave me detailed insight into the production pipeline of VFX projects. From dealing with live-action footage, color space and format conversions to juggling large file sizes and keeping orderly naming conventions... I made a lot of mistakes but learned so much from them. Looking back, there is of course still room for improvement.
1. More accurate match-move
The match-move could definitely be improved upon. There are still brief moments where the corset movement in particular does not exactly match the motion of the live actress, making it feel as if the dress is moving on its own. Fixing this issue would elevate the overall look of the final shot tremendously.
2. More detailed rotoscope of the live actress
Sadly, there was not a lot of time for this workflow step during the project, so the rotoscope of the arms is a little rough in some places.
3. Better Integration of the CG elements
The live-action footage and digital dress could be blended together better. The color integration of the rendered dress is acceptable; however, further steps could be taken to support the impression of interaction between the live actor and the digital dress. Though initially planned, a reflection catcher was not implemented in the final result. Casting bounce light off the dress onto the live actor could further help the integration, especially given the reflective fabric the dress is made of. For simplicity during production, camera-specific details such as lens distortion were also not included. Adding those, along with other camera artifacts such as chromatic aberration, could further improve the final look and realism of the digital dress.
Thank you for reading until the end! I hope you enjoyed this project as much as I did :)