To create the zombie version of our NPCs we don’t start from scratch, as we would with a new set of clothes or uninfected heads. At each step of the pipeline we reuse what we already have and build from there.
For the zombie version we only focus on the geometry of the head and arms; the clothes are overhauled during the texturing process. The only parts that need to be brought into ZBrush are the body pieces.
UVs and retopology are already finished and can’t be changed because of the morphing feature. If the sculpt in ZBrush differs from them, the retopologised and UV’d mesh needs to be reimported and the detail of our uninfected head reprojected onto it on a layer. That way we keep the skin detail of our human and can later use it as a base. Disabling the layer lets us work on the bare mesh without destroying the detail when smoothing areas, keeping in mind that altering the geometry too much will heavily deform the UVs (since they are bound to our retopology geo). Whenever we want to, for example, pinch or inflate certain areas of the skin, we make a copy of the mesh, apply the changes and project them back onto the original sculpt. No evenly distributed topology will be harmed during this process.
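Conceptually, a sculpt layer like this can be thought of as a set of per-vertex offsets stored on top of the base mesh: disable it and you edit the bare mesh, re-enable it and the detail rides along on the new shape. The sketch below is a simplified, hypothetical data model to illustrate the idea, not how ZBrush stores layers internally.

```python
# Hypothetical sketch: a detail layer as per-vertex deltas on a base mesh.

class Mesh:
    def __init__(self, positions):
        self.positions = [list(p) for p in positions]

class DetailLayer:
    """Stores skin detail as per-vertex deltas relative to the base mesh."""
    def __init__(self, base, detailed):
        self.deltas = [
            [d[i] - b[i] for i in range(3)]
            for b, d in zip(base.positions, detailed.positions)
        ]
        self.enabled = True

    def apply(self, base):
        # With the layer disabled, we see only the bare base mesh.
        if not self.enabled:
            return [list(p) for p in base.positions]
        return [
            [b[i] + d[i] for i in range(3)]
            for b, d in zip(base.positions, self.deltas)
        ]

# Capture the uninfected detail, disable the layer, sculpt the bare
# mesh, then re-enable the layer to keep the detail on the new shape.
base = Mesh([(0, 0, 0), (1, 0, 0)])
detailed = Mesh([(0, 0.1, 0), (1, -0.05, 0)])
layer = DetailLayer(base, detailed)

layer.enabled = False
base.positions[0][0] += 0.5          # "zombie" edit on the bare mesh
layer.enabled = True
result = layer.apply(base)           # edit plus preserved skin detail
```

The key property is that the detail deltas survive edits to the base shape, which is exactly why smoothing or reshaping the bare mesh doesn’t destroy the projected skin detail.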
If at some point our vertex order gets scrambled, we can fix our blendshapes with the “Reorder Vertices” tool in Maya, as long as the topology itself stays exactly the same.
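The underlying idea of such a fix is to re-establish vertex correspondence between the scrambled mesh and a reference that still has the correct order. A minimal sketch of one possible approach, matching by nearest neutral-pose position, follows; note that Maya’s actual “Reorder Vertices” tool works from a user-picked seed face and edges instead, and this brute-force matcher is purely illustrative.

```python
# Hypothetical sketch: recover a scrambled vertex order by matching each
# vertex to its nearest counterpart in a correctly ordered reference mesh.
# Assumes identical topology and (near-)identical neutral positions.

def reorder_to_match(reference, scrambled, tol=1e-6):
    """Return indices mapping scrambled vertices back to reference order."""
    order = []
    used = set()
    for ref in reference:
        best, best_d = None, None
        for i, v in enumerate(scrambled):
            if i in used:
                continue
            d = sum((a - b) ** 2 for a, b in zip(ref, v))
            if best is None or d < best_d:
                best, best_d = i, d
        if best_d > tol:
            raise ValueError("no matching vertex within tolerance")
        used.add(best)
        order.append(best)
    return order

reference = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
scrambled = [(0.0, 1.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
order = reorder_to_match(reference, scrambled)   # [1, 2, 0]
fixed = [scrambled[i] for i in order]            # back in reference order
```

Once the order matches the reference again, the blendshape deltas line up vertex for vertex, which is why the fix only works when the topology is truly unchanged.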
Since the UVs are the same, we can reuse the Substance Painter project and simply swap out the mesh and baked inputs. Unfortunately, every stroke painted by hand is reprojected onto the mesh instead of following the UV map. So if we want to keep our custom painted masks, we need to export them first and add them to the zombie Substance Painter project as bitmaps.
The uninfected textures act as a good base for the zombie.
Now we can adjust the colours a bit and add any blood stains, wounds, infections etc. on top. And that’s it for texturing the head and arms of the zombie version.
The last step of the AI creation process revolves around additional masks for the clothes, in order to add dirt or blood textures on a subsequent layer in the engine.
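How such a mask drives a subsequent layer in the engine is essentially a per-channel linear blend between the clean cloth and the dirt or blood overlay. The snippet below is an illustrative sketch of that blend; the actual shader is engine-specific, and the colour values are made up for the example.

```python
# Illustrative sketch of a mask-driven layer blend (engine shaders do
# this per pixel on the GPU): result = base * (1 - mask) + overlay * mask

def blend_layer(base_px, overlay_px, mask):
    """Linear blend of one RGB pixel by a 0.0-1.0 mask value."""
    return tuple(b * (1.0 - mask) + o * mask
                 for b, o in zip(base_px, overlay_px))

clean_cloth = (0.8, 0.7, 0.6)    # hypothetical fabric colour
blood       = (0.4, 0.05, 0.05)  # hypothetical blood overlay colour
stained = blend_layer(clean_cloth, blood, 0.5)
```

With a mask value of 0 the cloth stays clean, with 1 it is fully covered, so painting the mask directly controls where and how strongly the dirt or blood appears.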
Delivering the asset
Once everything is named correctly, textures and masks efficiently packed and blendshapes successfully tested, the asset can be delivered to the next department, where it will be rigged and animated.
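“Efficiently packed” here typically means channel packing: storing several grayscale masks in the colour channels of a single texture so the engine samples one file instead of three. Assuming that is what is meant, the idea can be sketched like this (the mask names are placeholders):

```python
# Illustrative sketch of channel packing: three grayscale masks, e.g.
# dirt, blood and wear, interleaved into the R, G and B channels of a
# single texture. Pixel values are 0-255 grayscale.

def pack_masks(dirt, blood, wear):
    """Interleave three grayscale masks into RGB pixel tuples."""
    return [(d, b, w) for d, b, w in zip(dirt, blood, wear)]

def unpack_channel(packed, channel):
    """Recover one mask from a packed texture (0=R, 1=G, 2=B)."""
    return [px[channel] for px in packed]

dirt  = [0, 128, 255]
blood = [255, 0, 0]
wear  = [64, 64, 64]
packed = pack_masks(dirt, blood, wear)
# packed == [(0, 255, 64), (128, 0, 64), (255, 0, 64)]
```

Since masks don’t need colour, this cuts texture count and memory at no quality cost, which is why it is a common delivery convention before handing assets to the next department.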
We hope this article was able to shed some light on the development process of our morphing system. If you haven’t read up on our modularity system yet, you can check out that article here. Make sure to head over to Daniel’s ArtStation page and show him your appreciation by following and liking his work (artists love that stuff).