TNG Adds Motion Capture (MoCap) to Its Services

For a 3D scanning company that primarily scans human bodies and heads, branching out into motion capture was a natural progression. This technology brings static models to life. A 3D scan captures the surface of a person’s skin and clothing. Once the scan is assembled, we unwrap the UVs and remesh the model to prepare it for texturing. At that point the model could be rendered as a still, but the next step is to insert joints (a skeleton) so that there is something to drive the character’s skin into movement. This process is called rigging.
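For readers who work in Maya, here is a minimal sketch of the rigging step using the maya.cmds Python API. The mesh name ("body_mesh"), joint names, and placements are illustrative assumptions, not our production setup:

```python
# Minimal rigging sketch in Maya's Python API (maya.cmds).
# Assumes a scanned mesh named "body_mesh" already exists in the scene.
import maya.cmds as cmds

# Build a simple joint chain for one leg (hip -> knee -> ankle);
# each cmds.joint() call parents the new joint under the previous one.
cmds.select(clear=True)
hip = cmds.joint(name="hip_L", position=(0.1, 1.0, 0.0))
knee = cmds.joint(name="knee_L", position=(0.1, 0.55, 0.02))
ankle = cmds.joint(name="ankle_L", position=(0.1, 0.1, 0.0))

# Bind the mesh to the skeleton so the joints can drive the skin.
cmds.skinCluster(hip, knee, ankle, "body_mesh",
                 toSelectedBones=True, maximumInfluences=4)
```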


Once the skeleton is sitting nicely within our CG digital double (the 3D-scanned human), a process called weighting takes place. Weighting determines how strongly each joint drives the skin around it, and smooth, dialed-in weights produce lifelike movement when the character is animated. To animate the joints without having to grab the joints themselves, we build a GUI (graphical user interface) of control objects connected to the joints via orientation constraints, which lets an animator pose the character more easily and intuitively. After animation, the video is rendered frame by frame on a network of computers called a render farm.
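To make these two steps concrete, here is a hedged sketch in the same maya.cmds API, continuing the rig above. The skin cluster node, vertex index, and control name ("skinCluster1", "knee_ctrl") are placeholders for illustration:

```python
# Sketch of weighting and a GUI control, assuming the leg rig above exists.
import maya.cmds as cmds

# Weighting: split one vertex's influence 30/70 between two joints.
# "skinCluster1" is the node created by the bind step (name is illustrative).
cmds.skinPercent("skinCluster1", "body_mesh.vtx[120]",
                 transformValue=[("hip_L", 0.3), ("knee_L", 0.7)])

# GUI control: a NURBS circle the animator grabs instead of the joint itself.
ctrl = cmds.circle(name="knee_ctrl", normal=(1, 0, 0), radius=0.15)[0]
# Snap the control to the joint, then let it drive the joint's rotation
# via an orientation constraint.
cmds.delete(cmds.parentConstraint("knee_L", ctrl))
cmds.orientConstraint(ctrl, "knee_L", maintainOffset=True)
```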

The MoCap system we are using is Noitom’s Perception Neuron. It is fully wireless, fast to set up, and easy to use. The first step is to have the talent strap on the sensors. We then calibrate the suit to the system and analyze the performer’s gait (walking style). Because every person’s posture varies, the system lets us adjust for imperfections, such as how far the torso leans forward or backward and how heavy or light the heel strike is. These features let us essentially pre-process the animation before it’s even recorded, which results in human-looking motion despite any real-life incongruities.

Once the animation is recorded, we post-process it in Noitom’s proprietary software, focusing on each moment the character’s foot makes and loses contact with the floor to make the motion more realistic and believable. This process is quite easy, and quite fascinating. After that, the animation is brought into a package like MotionBuilder, where further post-processing takes place and the animation is bound to a character. Finally, you can export your realistic human motion capture data along with your CG model and render it in any software you’d like.
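Noitom’s tools handle the foot-contact cleanup for us, but the underlying idea is easy to illustrate. The toy function below (our own sketch, not Noitom’s algorithm) flags frames where an ankle is near the floor and barely moving, which is the kind of contact event the post-processing step locks down:

```python
# Toy foot-contact detection on exported joint data (illustrative only).
from typing import List, Tuple

def foot_contacts(ankle_heights: List[float],
                  height_thresh: float = 0.05,
                  vel_thresh: float = 0.01) -> List[Tuple[int, int]]:
    """Return (start_frame, end_frame) spans where the foot is planted."""
    spans, start = [], None
    for f in range(1, len(ankle_heights)):
        low = ankle_heights[f] < height_thresh      # near the floor
        still = abs(ankle_heights[f] - ankle_heights[f - 1]) < vel_thresh
        if low and still:
            if start is None:
                start = f
        elif start is not None:
            spans.append((start, f - 1))
            start = None
    if start is not None:
        spans.append((start, len(ankle_heights) - 1))
    return spans

# Example: a foot that drops, plants for a few frames, then lifts off.
# Detection begins once the foot is both low and still.
heights = [0.30, 0.15, 0.04, 0.04, 0.04, 0.12, 0.28]
print(foot_contacts(heights))  # -> [(3, 4)]
```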


Where’s your closest TNG location?

[Map: TNG VFX locations]

We welcome your feedback! Comment on our blog or visit our Facebook page.