Ubisoft - Starlink

From the beginning it was clear that this job was going to be “out of this world”. We were tasked with creating a wide variety of work including spaceships, alien landscapes, creatures, planets, asteroid belts, explosions, rocks, lasers, contrails, monoliths, heat haze, trees, lightning, the list goes on…

Client: Ubisoft
Role: VFX Supervisor
Director: Dan di Felice

The work was a fruitful collaboration between departments: Production Design, the Art Department, photography and post production worked closely together to achieve the final piece. Framestore did extensive work on the model of Zenith (our hero spaceship) with Production Design to build it at real scale, complete with an interior cockpit. The ship was based on the game asset, but it had to look real: lived in, with scrapes, battle damage and so on.

 

“I was blown away by the level of detail inside Zenith when I saw the final model in Prague. The ship could be mounted in several different ways to allow the camera to move freely inside the cockpit.” – Leonardo Costa, VFX Supervisor

 


One thing that allowed us to be so efficient on set was the use of a Theta 360 camera to capture IBLs (image-based lighting) and QuickTime references for lighting.

“We gripped the camera on the top of the cockpit, which enabled us to do our 360s without stopping production. We could record live movies or accurate lighting IBLs per take, and afterwards it would only take a couple of minutes to shoot grey and silver balls and a Macbeth chart at the end of the setup.”
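A minimal sketch of that kind of per-take IBL prep, assuming the bracketed stills from the 360 camera are merged into a single lat-long HDR with OpenCV (the filenames and exposure times below are placeholders, not production values):

```python
import cv2
import numpy as np

# Bracketed exposures of the same 360 equirectangular view (placeholder filenames)
files = ["theta_ev_minus2.jpg", "theta_ev_0.jpg", "theta_ev_plus2.jpg"]
exposure_times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)

images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge the brackets into a linear HDR
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)

merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

# Write a lat-long .hdr that the lighting team can load as an image-based light
cv2.imwrite("ibl_latlong.hdr", hdr)
```

The grey and silver balls and the Macbeth chart shot at the end of each setup can then act as ground truth to check the merged IBL against.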

 

One crucial part of the job was to get interactive lighting on our plates. In order to achieve the correct light, framing and pace, extensive previs was done before the shoot.

 

“By the time we got on set we had a pretty good idea about framing, lenses, light direction and the interactive lighting that was needed. Eighteen SkyPanels were running in cycles and changing colours to simulate lasers and explosions. The main spaceship was mounted on a gimbal to create movement. The sun, an 18K HMI key light, was rigged on a technocrane, and so was the camera. Combining those movements helped us achieve the sense of rotation in space and produce shots where the spaceship performs a full U-turn to outsmart the enemy ship.”
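As an illustration of that kind of cue loop (not the actual desk programming used on set), a colour cycle across the panels could be scripted along these lines; send_rgb here is a hypothetical stand-in for whatever DMX or console interface drives the SkyPanels:

```python
import math
import time

NUM_PANELS = 18
LASER_BLUE = (40, 90, 255)
EXPLOSION_ORANGE = (255, 120, 20)

def send_rgb(panel_id, rgb):
    """Hypothetical stand-in for the real DMX / lighting-console interface."""
    print(f"panel {panel_id:02d} -> {rgb}")

def lerp(a, b, t):
    """Linear blend between two RGB triples."""
    return tuple(int(a[i] + (b[i] - a[i]) * t) for i in range(3))

# Cycle the panels between a laser blue and an explosion orange, offsetting
# each panel's phase so the light appears to travel across the cockpit.
start = time.time()
while time.time() - start < 3.0:              # run the cycle for a few seconds
    elapsed = time.time() - start
    for panel in range(NUM_PANELS):
        phase = (elapsed * 0.5 + panel / NUM_PANELS) % 1.0
        t = 0.5 - 0.5 * math.cos(2.0 * math.pi * phase)   # smooth 0 -> 1 -> 0
        send_rgb(panel, lerp(LASER_BLUE, EXPLOSION_ORANGE, t))
    time.sleep(1.0 / 24.0)                    # roughly one update per film frame
```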

 

The Utah shoot provided us with backplates for the planet sequence. The plates were shot with a drone, and extensive clean-up and sky replacement were done to the shots before adding the CG elements.

 

We used software called Reality Capture to give us the real scale, texture and geometry of the incredible Utah landscape. That helped us achieve accurate tracking for the shots, gain a good understanding of the environment and create precise shadows in CG.
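As a simplified illustration of why a scaled scan matters for tracking (a generic pinhole-projection check, not the actual matchmove pipeline), surveyed points taken from the photogrammetry mesh can be projected through the solved camera and compared against the matching features in the drone plate:

```python
import numpy as np

def project(points_world, R, t, focal_px, principal):
    """Project 3D survey points through a simple pinhole camera:
    X_cam = R @ X_world + t, then perspective divide to pixel coordinates."""
    cam = (R @ points_world.T).T + t
    x = focal_px * cam[:, 0] / cam[:, 2] + principal[0]
    y = focal_px * cam[:, 1] / cam[:, 2] + principal[1]
    return np.stack([x, y], axis=1)

# Placeholder values: a few surveyed points from the scan (in metres), an
# identity-rotation camera five metres back, a focal length in pixels and
# the plate centre for a 1920x1080 frame.
survey_points = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.5], [-1.0, 1.0, 0.2]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
pixels = project(survey_points, R, t, focal_px=2200.0, principal=(960.0, 540.0))

# If the camera solve is good, these 2D positions should sit on the
# corresponding features in the plate to within a pixel or so.
print(pixels)
```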

A similar photogrammetry approach was used to build models of Zenith and of Mason, our main talent. Mason was used as a CG character in some shots, while live-action takes were used for mid-shots and close-ups.

We decided not to have glass in the live action to avoid reflections, so the scan of the ship was essential for tracking and for the placement of the CG glass.

 

The comp team developed a tool named CG munge, which was used in all full-CG shots. The tool combines a set of gizmos such as chroma blur, chromatic aberration, film weave, halation, bloom, convolve and a few other special elements to take the edge off the CG. The main goal was to add imperfections to the image so it looked as if it was shot in camera. We also added glare, lens dirt, lens flares and camera shake to enhance the realism of the shots.
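A rough sketch of how a chain like that could be assembled with Nuke's Python API; the node choices and values here are illustrative assumptions, not the actual CG munge setup:

```python
import nuke

def build_cg_munge(cg_read):
    """Chain a few 'imperfection' nodes onto a full-CG render so it sits
    closer to a photographed plate. Node choices and values are guesses."""
    # Soften the render slightly so CG edges lose their synthetic crispness
    soften = nuke.nodes.Blur(size=1.2, channels="rgb")
    soften.setInput(0, cg_read)

    # Bloom / halation: let bright highlights spill into neighbouring pixels
    bloom = nuke.nodes.Glow()
    bloom.setInput(0, soften)

    # Film grain on top of the clean render
    grain = nuke.nodes.Grain()
    grain.setInput(0, bloom)

    # Film weave: a tiny animated translate to mimic gate instability
    weave = nuke.nodes.Transform()
    weave.setInput(0, grain)
    weave["translate"].setExpression("0.3 * sin(frame / 2)", 0)
    weave["translate"].setExpression("0.2 * sin(frame / 3)", 1)

    return weave

# Example: hang the chain off the currently selected CG Read node
munged = build_cg_munge(nuke.selectedNode())
```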