
How the Oscar-winning VFX giant DNEG streamlined its DevOps processes

"Much of our software stack had dependencies on various other older pieces of software..."

London-headquartered VFX firm DNEG may have won seven Oscars, but that scale of success hasn't spared it from the challenges of legacy software, asset tracking, and platform shifting.

In a conversation with The Stack, Victoria Adeyeri, Software Developer at DNEG, discussed the biggest challenges in its pipeline – and the fixes that worked.

For the company, which has supported films including Dune: Part Two, Furiosa: A Mad Max Saga and Godzilla x Kong: The New Empire, the biggest backend challenges came from attempting to deploy new technology within the organisation.

"Much of our software stack had dependencies on various other older pieces of software," Adeyeri explained. These dependencies in turn made deployment "slow and complicated" on its VFX pipeline.

For an animation and effects studio, the pipeline is the backbone of every process: a series of interconnected, automated tools that let artists and technicians work together.

Victoria Adeyeri, Software Developer at DNEG

For DNEG, the deployment challenge was compounded by an industry-wide transition to the USD (Universal Scene Description) file format for data storage.

USD itself is an open-source file format developed by Pixar for 3D computer graphics scene descriptions. Files in the format can contain data about 3D graphic elements like scene layout, model geometry, material appearance, animations, and virtual cameras.
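USD layers can be authored as human-readable text files (.usda). A minimal, illustrative sketch of such a layer – the prim names and values here are invented for this example, not taken from any DNEG asset:

```usda
#usda 1.0
(
    defaultPrim = "Robot"
    doc = "Illustrative asset layer; names are invented for this sketch"
)

def Xform "Robot"
{
    def Mesh "Body"
    {
        float3[] extent = [(-1, -1, -1), (1, 1, 1)]
    }
}
```

In practice, a layer like this would reference published model geometry, materials, and animation from other layers, which is what makes the format composable across departments.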

The company held metadata on hundreds of millions of unique assets and had to manage 1.5 billion relationships between those assets. According to Adeyeri, the transition to USD increased the complexity of its stored data "by about eight times."

Despite the transition's difficulty, it has been necessary for DNEG and its peers, as the file format is built around data interoperability. In fact, the technology has proved valuable enough that its engineers were recognised by the Academy Awards earlier this year.

Adeyeri noted that "newer pipelines using the USD file format allow data to be shared more easily between the different software programs that our different teams of artists will use."

Part one of streamlining DevOps for DNEG was deploying a new version of its asset-tracking system. The key goals included ensuring ease of deployment using GitOps, easy ways to monitor and output data for analysis, and easy troubleshooting.

"We used Red Hat OpenShift Pipelines (Tekton) to build native container workflows and then Red Hat OpenShift GitOps, ArgoCD + Kustomize, to cleanly deploy our environments across multiple clusters," Adeyeri explained.
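In a GitOps setup of this kind, each environment is typically declared as an Argo CD Application pointing at a Kustomize overlay in Git, which Argo CD then keeps in sync with the cluster. A hedged sketch of such a manifest – the service name, repository URL, and paths are invented for illustration, not DNEG's actual configuration:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: asset-tracker            # hypothetical service name
  namespace: openshift-gitops
spec:
  project: default
  source:
    repoURL: https://git.example.com/vfx/asset-tracker-deploy.git
    targetRevision: main
    path: overlays/production    # Kustomize overlay per environment/cluster
  destination:
    server: https://kubernetes.default.svc
    namespace: asset-tracker
  syncPolicy:
    automated:
      prune: true                # delete resources removed from Git
      selfHeal: true             # revert manual drift on the cluster
```

One such Application per cluster or environment, each pointing at its own overlay, is what allows the same base manifests to be deployed "cleanly across multiple clusters".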

"Once the new asset tracking system was in place, we found that to prevent any one active show from monopolising all the resources of the system, we could introduce the idea of rate limits," she added.

Setting these rate limits was possible through the use of Red Hat 3scale, an API management platform.
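The underlying idea – each show gets a budget of requests so no single production can starve the others – is the classic token-bucket pattern. A minimal sketch in Python, purely to illustrate the concept (the show names and rates are invented; DNEG's actual limits are enforced by 3scale, not application code):

```python
import time


class TokenBucket:
    """Token-bucket rate limiter: tokens refill steadily up to a burst
    capacity, and each request spends one token."""

    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate = rate_per_sec        # tokens added per second
        self.capacity = capacity        # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False


# One bucket per active show (names are illustrative).
buckets = {
    "show_a": TokenBucket(rate_per_sec=50, capacity=100),
    "show_b": TokenBucket(rate_per_sec=50, capacity=100),
}


def handle_request(show: str) -> bool:
    """Admit the request only if the show still has budget."""
    bucket = buckets.get(show)
    return bucket.allow() if bucket else False
```

A managed platform like 3scale applies the same principle per API consumer, without each service having to implement its own limiter.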

DNEG's experience underscores the need for efficient software transitions across the board, and for prioritising data management. For SaaS customers, interoperability and integration remain crucial to any DevOps adoption.

See also: Is DevOps dying as a discipline?
