Character Finished on Real-Time Project

Calling the character for my real-time project done. Had paid an artist for the model (this is one configuration; there are various outfits and styles), it just took some effort to get everything connected in Blender. I'm using tools designed for Unreal Engine, so manual effort was necessary to get it up and running. Probably the biggest hurdle was retargeting the skinned animation, which didn't really work out of the box without plug-ins. It ended up not being a ton of work, though it took time to figure out what the issues were and get comfortable with new tooling.
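For anyone curious why retargeting fails out of the box: a big part of it is simply that the source and target rigs name and structure their bones differently, so animation tracks have nothing to attach to. Here's a minimal pure-Python sketch of that remapping idea, with entirely hypothetical bone names (real rigs like the UE Mannequin or Mixamo each have their own schemes); it's an illustration of the concept, not the actual tooling I used.

```python
# Sketch of the bone-name remapping step behind retargeting.
# All bone names below are hypothetical examples.

# Hypothetical mapping from a source rig's bone names to a target rig's.
BONE_MAP = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "Spine1": "spine_02",
    "LeftArm": "upperarm_l",
    "RightArm": "upperarm_r",
}

def retarget_track_names(tracks):
    """Rename animation tracks keyed by source bone name to target names.

    Tracks with no mapping are dropped -- which is exactly why unmapped
    bones silently lose their animation when retargeting "just fails".
    """
    retargeted = {}
    for bone, keyframes in tracks.items():
        target = BONE_MAP.get(bone)
        if target is not None:
            retargeted[target] = keyframes
    return retargeted

# Example: two mapped tracks survive; the unmapped one is dropped.
source_tracks = {
    "Hips": [(0, (0.0, 1.0, 0.0))],
    "LeftArm": [(0, (0.1, 0.9, 0.0))],
    "Tail": [(0, (0.0, 0.0, 0.0))],  # no equivalent on the target rig
}
result = retarget_track_names(source_tracks)
```

In practice the plug-ins also handle rest-pose and bone-roll differences, which is where most of the real fiddling happens; the name mapping is just the first hurdle.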

Video above shows the first test with the final version of the character model, plus the custom facial and body motion capture I recorded. For the face I'm using Epic's LiveLinkFace app, which thankfully exports to open-format CSV files, and the license allows use outside of Unreal. For the body I recorded regular video on an iPhone and fed it into QuickMagic, which uses AI to analyze the video and export a skinned mesh rig for popular products like Unity. Still needs some clean-up, since the animation appears a bit choppy, but I've confirmed the pipeline works and there were no major technical problems; it just needs some finesse to smooth out the raw output.
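Since the CSV export is what makes the face pipeline portable, here's a rough sketch of reading one blendshape curve out of a LiveLinkFace-style CSV and taming choppy data with a simple moving average. The column names here (`Timecode`, `JawOpen`, etc.) follow ARKit blendshape naming, but treat the exact layout as an assumption to verify against your own export files; the smoothing is my own cheap stand-in for proper clean-up.

```python
import csv
import io

# Hypothetical LiveLinkFace-style export; verify the real column layout
# against your own CSV files.
SAMPLE = """Timecode,BlendshapeCount,JawOpen,EyeBlinkLeft
00:00:00:00,61,0.10,0.00
00:00:00:01,61,0.80,0.00
00:00:00:02,61,0.20,1.00
00:00:00:03,61,0.70,0.00
"""

def read_curve(csv_text, column):
    """Extract one blendshape curve (list of floats) from the CSV text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [float(row[column]) for row in reader]

def smooth(values, window=3):
    """Centered moving average: a cheap way to soften choppy capture data."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

jaw = read_curve(SAMPLE, "JawOpen")
jaw_smoothed = smooth(jaw)
```

A plain moving average will also flatten intentional fast motion (blinks, plosives), so for real clean-up something like a keyframe reduction or a filter that preserves peaks would be a better fit; this just shows the data is trivially scriptable once it's in CSV.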

All the mo-cap animation for this app was custom-recorded using an iPhone. It ended up being a 10-hour recording session with an actress I hired and a large recording studio. Still have to go through all the files and tag them, since there are over 100 video clips for body animation and around 60 for the facial capture. The session covered two projects at once: a mobile horror game, and more of an interactive experience for the web. Might not need all the footage at first, though most of what I checked looked good quality and should be usable.