It’s easy to think theatre is all about traditional skills, handcrafted costumes, and painted scenery.
But for the Royal Ballet and Opera in Covent Garden, digital technology is critical, from running ticketing to broadcasting its performances to using GPUs and AI for ideation.
It also has an extensive collection of photographs, video, and audio that it is tasked with preserving and sharing.
This is all essential to its mission to enrich the cultural life of the nation. Opera and ballet are known for lavish stagings – and the lavish lifestyles of some of its bigger personalities.
But, as head of technology delivery Keith Nolan explains, it is also responsible to the taxpayer.
“We are part funded by the Arts Council, so this is taxpayer funds. And I think we've got a duty to make sure that our journey uses those funds appropriately.”
And that imposes a particular burden on the technology team, because “people would like to feel that that money has been spent primarily on the art form.”
When Nolan joined the organisation from the education sector around six years ago, it was taking a hard look at costs. This resulted in a drive to build out a unified architecture to support its digital operations, one which made extensive use of Nutanix.
Stage by stage
One of its first efforts was to reduce its estate from two datacentres to one, delivering a cost saving of 25 percent. It subsequently drove that down from one to none, resulting in a further 25 percent saving.
In total, Nolan says, infrastructure costs were around £500,000 a year at the beginning of the project. They currently run at £120,000.
It now runs its tech infrastructure largely in the cloud. As a government-backed non-profit, the organisation benefits from the massive framework agreements the government strikes with major providers. But those can also present their own problems.
“At some point the government changes hands,” he explains. “Crown Commercial Services are involved. A contract expires with AWS or with Microsoft, and then you find yourself as a not for profit in a world of costs that you didn't quite expect were going to come your way.”
At the same time, he says, some of the organisation’s infrastructure has to stay on prem.
“When you're in the theatre industry, in the arts sector in particular, a lot of what you do, especially in broadcast, requires very low latency applications. So, we're very much … looking at precision time protocol-based applications. So, master clocks, slave clocks, and everything has to run at sub-millisecond levels.”
The reality is, “A cloud-only approach does not work if you're going to broadcast out to 700 UK cinemas in real time at Christmas. Putting it all in the cloud would not work. You would end up with audio streams being out of sync with video streams.”
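The precision time protocol Nolan refers to is IEEE 1588 (PTP), which disciplines slave clocks against a master by exchanging timestamped messages. As a rough illustration of the underlying arithmetic – a generic sketch of the standard offset/delay calculation, not drawn from the Royal Ballet and Opera's own systems – the exchange looks like this:

```python
# Illustrative sketch of the IEEE 1588 (PTP) offset calculation.
# Four timestamps come from the Sync / Delay_Req message exchange:
#   t1: master sends Sync        t2: slave receives Sync
#   t3: slave sends Delay_Req    t4: master receives Delay_Req
# The maths assumes a symmetric network path in each direction.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Return (offset, delay) in seconds: how far the slave clock is
    ahead of the master, and the one-way path delay."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Hypothetical example: slave clock 150 microseconds ahead of the
# master, 50 microseconds of one-way network delay.
offset, delay = ptp_offset_and_delay(
    t1=0.000000, t2=0.000200,
    t3=0.000400, t4=0.000300)
print(offset, delay)  # ~0.00015 s offset, ~0.00005 s delay
```

Hardware timestamping at the network interface is what lets real PTP deployments hold this error to well under a millisecond – the budget Nolan describes for keeping broadcast audio and video in lockstep.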
That drive to a unified architecture is not purely a tech issue. Theatre spans a collection of skills and crafts, many of which have their own tools and technologies.
So, he explains, lighting, sound, and broadcast might each, historically, have run on a separate network. “So, if you can imagine, you'd have five networks, and then you'd have cables going to exactly the same place, physically next to each other.”
“What we’ve done over the last two years is a solid programme of unification,” he says, working closely with Cisco and Telstra. Again, this has incorporated Nutanix and a shift to a self-service architecture. And it has also uncovered some shadow IT.
“So, we've unified that process now. And what that means is that we can lower total cost of ownership for the business by using the same infrastructure.” It also means it is easier to spot and patch vulnerabilities.
Light and shade
“I think it's something that's particularly unique to the theatre space and opera houses. We're seeing that technology change from traditional protocols that they've used for years, such as DMX, to be more aligned with IP protocols.”
This has coincided with increasing use of AI, he says, for example to track lighting systems. But it also means that the theatre’s creative teams can use GPU-based systems for ideation and other tasks – freeing up space in the real world. The challenge is working out whether this work is best carried out in the cloud, or whether creative staff use dedicated, on-site rigs.
“If you've got a team of five people, do we move them away from using £5,000 or £6,000 machines that need to be refreshed every few years to keep up?” he says. “That's quite a lot of money.” Again, some capacity still has to be kept on-prem, because of the broadcast issue.
This can all mean unsettling change for teams that are aligned to particular tools and particular ways of doing things. But the ultimate direction is clear, he says.
“The digitalization offers us a much more secure way of running our shows, and actually higher resiliency than we've had before. So that's the exciting part. And obviously, it's how you bring people onto the journey. It's making them realize that technology, for all its problems, is actually a good thing.”