An Overview of the Current State of 3D Scanning in VFX

Seekscale interviewed Nick Tesi, Founder and President of TNG Visual Effects, a North American 3D scanning company known for its work on many recent hits (Agents of S.H.I.E.L.D., Once Upon a Time and Sleepy Hollow). While 3D scanning technologies are focused on 3D model production, which is a very specific part of a VFX pipeline, they are a great indicator of the global trends at work in the industry.

3D scanned train

I was told by a studio that they used 3D scanning for two applications: creating 3D objects and VFX insertion. What do you scan exactly?

We scan people, animals, cars, buildings, scenes, etc. Animals are typically difficult to scan because of their fur; however, we use different technologies for different scanning targets.

We are a one-stop shop for film, television, and the entire entertainment industry. One amusing point: sometimes we'll scan a character and deliver the 3D model just minutes before their on-screen death. Studios often need a 3D replica right before they blow up or freeze a character, or throw them off a cliff.

We’ll scan anything people need scanned! We take a lot of different types of work. A few weeks ago a friend called us from the California Science Center to 3D scan the Space Shuttle Endeavour.

What is your technology stack? Are you a services company or a technology company? Do you use proprietary algorithms?

We are a 3D scanning services bureau for the entertainment industry.

For example, we provide scanning services to Blur (a lot of cinematics and game trailers), Stargate (episodic television work), ABC and Fox (Sleepy Hollow) to name a few.

For our technology stack, we use LIDAR to scan buildings, structured light scanners for people, and professional photography for texture mapping. We really use different tools for different use cases.

On the software side, most scanners come with their own software, and proprietary algorithms. Once we get a 3D model out of the scanner software, we go to Maya for quality control, then Mari for texture mapping, and then Zbrush to fine-tune the details.

Are your 3D models used for previz or for final results?

The studios don't always tell us what they do with our 3D models, but they can be used for either. If it is a character that we scanned, then usually the next step is to rig the model by inserting digital bones. They may want to add cyber hair, etc., and we also offer 3D scanning of facial expressions to give the character more life.

We could expand into a rigging business, but currently we leave that to the studios because there is no standard yet; everybody does it differently. We usually ask for wireframes so that we can anticipate problems and provide ready-to-use 3D models.

What are the transformative effects of 3D scanning on a VFX pipeline, from the points of view of logistics, timing (scheduling, fixing emergencies, etc.), rendering, and data volumes?

The big studios put together a VFX team for each project. This team goes out and finds production houses with the right skills for different sets of shots. They then come to us and ask us to send our 3D models to the production houses working on those shots, so they can match the plates.

As a result, we don't have much visibility into each house's pipeline, so it's hard for us to answer that. However, one thing is clear: more and more studios now call us at the last minute (one or two days' notice), and we're becoming known for our flexibility and speed of delivery. The industry is moving towards shorter cycles and towards local partners. We try to adapt, and recently launched satellite offices in several locations throughout North America. We think 3D scanning is becoming mature, and as with any mature and efficient technology, people turn to it to assist in their production.

Is 3D scanning technology becoming commoditized?

You see consumer devices that can do some 3D scanning (Kinect, for example), but they are still low-end, not good enough for production purposes. The better the scanner, the better the model, and the less manual work is required. So there is a strong economic incentive to pick top-notch hardware.

For example, Lightstage provides the most expensive 3D scanning solution; then you have other companies doing photogrammetry (mainly in the UK, with investments between $100K and $250K) using 100 or more cameras surrounding the target. You can find hardware at every price point.

When picking hardware, you have two criteria: portability (if you need to go onsite) and resolution. For resolution, we provide three levels of service: low resolution (for an extra in the background), medium-to-high resolution, and high resolution for main characters needing a good amount of screen time.

There was a Star Wars teaser where an actor was moving, and in real time a Stormtrooper mirrored his movements in the movie environment. That is real-time MoCap; could we have real-time 3D scanning soon?

That's a combination of a motion capture device and a rigged 3D model. It's clearly awesome technology, but keep in mind that 3D models have to be rigged first. We can't just scan someone in real time, rig in real time, and then have the person appear in the movie environment. Moreover, doing this at high resolution would be problematic, since there is always a lot of 3D modeling and manual cleanup involved. So unfortunately, I don't think expert manual work will be going away anytime soon!

Where do you see your business going?

Personally, I think all movies will soon be all-VFX. Most characters will be filmed on green screens, and cyber characters will be used more and more. Building digital assets should be quite a good market (and good for us!), with hardware and software improving and artists and technicians gaining even more knowledge in their fields. So I think we shall see ever more amazing VFX!

Written by François Ruty, Seekscale

Where’s your closest TNG location?

TNG VFX locations

We welcome your feedback! Comment on our blog or visit our Facebook page.