
MetaHuman Faceware + Cine Cam Live Testing (Unreal)

2022-02-20 (Vimeo)

Warning: Uncanny Valley ahead - intense staring! Here are some screen captures testing live face-cap, puppeteering MetaHumans in Unreal 4.27 using Faceware Studio, with real-time camera control via 'Cine Cam'. Getting close to a viable virtual-cinematography face-cap workflow. Trying to capture some fairly subtle eye movement. (Controlling face-cap AND the camera at the same time is definitely a rub-your-tummy, pat-your-head kind of thing. Actors, you are not replaceable!)

Environment: Unreal's free 'Rural Australia' pack from the Marketplace.

MetaHumans: I customized the female face a little; the male is one of the default MetaHumans.

Improvements:
- I need a 60fps webcam (and a faster PC) to avoid some of the stuttering here; this 2017 GTX 1070 gaming PC is only good for proof of concept.
- Live mouth capture/talking doesn't look all that realistic yet with MetaHumans.
- I should have added a tiny eye-light to highlight the eyes. Possibly reflections are not working in this scene, though.
- I need a way for the Faceware plugin to puppeteer/translate the entire MetaHuman body based on tracked shoulder movement. Currently it can only rotate the head on the neck; it can't translate the whole body to match shoulder movement the way Faceware Studio can. This would be amazing for close-up shots like this, helping simulate breathing or small side-to-side/up-down body movements, which would add a lot of realism. Faceware Studio is processing this translation data, but the current motion logic (supplied, I think, by Glassbox) doesn't pass this body translation on to Unreal. Hopefully that will be coming soon!
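As a rough illustration of the kind of motion logic described above, here is a minimal C++ sketch of applying a tracked shoulder offset to a character's body root with exponential smoothing. This is purely hypothetical: `Vec3`, `SmoothOffset`, and the smoothing approach are my own illustration, not part of the Faceware, Glassbox, or Unreal APIs.

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for an engine vector type (illustrative only).
struct Vec3 { float x, y, z; };

// Exponential smoothing: blend the previously applied body offset toward
// the newly tracked shoulder offset. alpha = 0 freezes the body;
// alpha = 1 passes raw (jittery) tracking straight through.
// Called once per frame, this damps webcam noise while still letting
// breathing and small side-to-side/up-down movements come through.
Vec3 SmoothOffset(const Vec3& previous, const Vec3& tracked, float alpha)
{
    return {
        previous.x + alpha * (tracked.x - previous.x),
        previous.y + alpha * (tracked.y - previous.y),
        previous.z + alpha * (tracked.z - previous.z),
    };
}
```

In practice, the smoothed offset would be added to the character's root or spine bone each tick, with a small alpha (e.g. 0.1 to 0.3 at 60fps) to keep the motion subtle.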
