In housing-obsessed Britain, Rightmove holds such a special place in the national consciousness that its web metrics track breaking news events.
“When something big happens in the UK, we can tell because the traffic levels drop off and then spike back up again when it stops happening," Andrew Tate, Head of Technology Operations, tells The Stack. "It's a very real national barometer.”
Rightmove is the number one property site in the UK, with 140 million visits per month. SimilarWeb estimates that it's the 12th-ranked website in the whole of Britain and 206th in the world. In the global real estate rankings, it's in fourth place. With 50 million page views and 10,000 properties uploaded every day, its users spend 1.3 billion minutes on the website every month.
Internally, Rightmove has more than 200 engineers working across upwards of 20 product teams, as well as a huge and fast-growing data estate taking up roughly one petabyte of storage space. Put simply: it's a vast property that takes careful management.
The Stack spoke to Tate to discuss his role in leading cloud transformation at this famous website, which achieved £364.3 million of revenue last year - a 10% year-on-year uplift. Over the past year, it has increased tech spend by £2 million, with the increase mostly funding "increased spend on consultancy on AI; migration of our data centres to the Cloud; infrastructure maintenance; and higher costs for software licences following the increased headcount", according to its annual earnings report.
We started by asking Tate about his slightly unusual job title.
“I think it might be a bit old school,” he reflects light-heartedly. “It hasn't evolved like the role has evolved, necessarily.”
As that job title suggests, Tate oversees tech operations, which involves managing infrastructure and data stores, taking responsibility for site reliability, and leading on platform engineering. Tate’s teams provide the platform, tools and services that development and product teams use to deliver functionality to end users.
“It’s a top 10 website in the UK and has been around 20 years," he says. "We have a lot of data. We've got a huge amount of website traffic. And we're constantly shipping product functionality to customers who are estate agents, mortgage lenders, surveyors, and many other property and financial institutions, as well as the consumer side of the website.”
Steering Rightmove towards the cloud
Tate led on the cloud “from the get-go” after joining four years ago. “It became clear that if we really wanted to do something to modernise and simplify our platform, cloud was the really obvious option," he remembers.
Historically, Rightmove operated from three active data centres, with traffic distributed across all of the facilities. This setup was “really reliable” but also “created a lot of complexity and overheads”, especially around database architecture. For instance, data was often pushed out of sync as writes took place in different locations, requiring bespoke services to be built.
Rightmove had an AWS account, but no cloud in its production environment. To change this, Tate first spent a year “understanding the lay of the land: the people, culture, and technology estate.”
The journey began in 2021 with the selection of Google Cloud Platform and the establishment of a secure landing zone for containerized workloads. By the end of 2022, Rightmove expanded the platform to handle large-scale traffic, migrated key services to the cloud, and introduced machine learning for content moderation. In 2023, it migrated 50% of its 180 microservices and built out a new data architecture for improved insights and analytics.
In the past, developers would write code and hand it to the operations team, which deployed it and managed its reliability. Rightmove has since adopted a bounded-context microservice pattern, which enabled dev teams to “take more ownership and be in more control of the end-to-end flow of value from code in their head to functionality on the website”. This means that today, teams are largely running the stuff they build, with the platform teams focused on creating reusable components that enable self-service.
Rightmove now has a “high velocity” continuous integration and continuous delivery/continuous deployment (CI/CD) model for its applications and dev teams. Where database changes were once tied to a two-weekly release schedule, a modern distributed database has removed the need to wait up to 14 days to make changes, allowing faster product development in greenfield areas.
Critically, most of the website now runs in the cloud. “It used to take months to upgrade our on-premise application hosting environment," Tate adds. "It was a lot of different environments and data centres. Now, running Google Kubernetes Engine (GKE), that happens in 40 minutes without anyone having to do very much. So that’s a massive reduction of overheads and complexity."
Much of the cloud deployment was undertaken in-house - with limited outsourcing - in order to upskill the workforce. Tate says this decision to focus on levelling up existing resources has led to “transformed careers” in teams such as database administration (DBA), which was once “a little bit isolated from some of the team” but whose members were among the first to “get into the weeds of understanding the GCP API and Terraform”.
"One of the engineers on the DBE (Database Engineering Team) team is now a really strong cloud engineer with a database specialisation," Tate tells us. "Now the team is writing the Terraform code to integrate cockroach dB, virtual private cloud (VPC) peering, provisioning Cloud SQL instances. They're not just managing databases anymore and have been freed up to focus on higher value tasks."
Real estate data challenges
One of the major challenges Tate faces is handling the vast amount of data that courses through Rightmove's systems. "We've never deleted a property image in the 24-odd years that the company's been around,” he says.
He's leading the implementation of dynamic search filters to sort through the vast amount of unstructured data in property images and expose it to AI models. “There's a huge amount of data to be able to train models off in there," Tate adds. "What we're looking to do is to use that unstructured data to help people search for properties based on what's in the image.”
The result is that images are first scanned by AI and then categorised, so that properties can be surfaced and shown to users based on their preferences or search terms.
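As a rough illustration of that flow, the sketch below shows how image-derived tags could feed a dynamic search filter. The classifier, tag vocabulary and data structures here are hypothetical stand-ins - Rightmove has not published the details of its model or storage layer.

```python
from dataclasses import dataclass, field

def classify_image(image_path: str) -> set[str]:
    """Placeholder for the AI model that scans a property image and returns tags.
    A real system would call a trained vision model here."""
    return {"garden", "bay window"}  # stub output, for illustration only

@dataclass
class Listing:
    listing_id: int
    image_paths: list[str]
    tags: set[str] = field(default_factory=set)

def index_listing(listing: Listing) -> None:
    """Scan every image on a listing and merge the predicted tags into its search index."""
    for path in listing.image_paths:
        listing.tags |= classify_image(path)

def filter_listings(listings: list[Listing], wanted: set[str]) -> list[Listing]:
    """Dynamic search filter: keep listings whose image-derived tags cover the request."""
    return [l for l in listings if wanted <= l.tags]

if __name__ == "__main__":
    home = Listing(listing_id=1, image_paths=["front.jpg", "lounge.jpg"])
    index_listing(home)
    print(filter_listings([home], {"garden"}))  # surfaces the listing for a "garden" search
```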
To help its data management, Rightmove deployed CockroachDB, a distributed database with standard SQL for cloud applications, as a system of record, reducing architectural complexity and enabling multi-cloud and multi-regional resilience.
“One thing we really wanted with a new database provider as we moved to cloud was a simplified replication model to give us the geographical resilience that we needed, but not create complexity for our development team," Tate says. "So having a single endpoint to be able to write to massively simplifies the estate, so that developers don’t have to worry about replication or which URLs or load balancers to use."
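Because CockroachDB speaks the PostgreSQL wire protocol, that single write endpoint looks to application code like an ordinary Postgres connection. The sketch below assumes a standard Python driver (psycopg2) and a hypothetical endpoint, table and credentials - it is illustrative rather than Rightmove's actual code.

```python
import psycopg2  # CockroachDB is PostgreSQL wire-compatible, so standard drivers work

# Hypothetical connection string: in practice this would be a single load-balanced DNS
# name in front of the cluster, which handles replication across regions itself.
SINGLE_ENDPOINT = "postgresql://app_user@db.internal.example:26257/properties?sslmode=require"

def record_property_view(property_id: int) -> None:
    """Write through the single endpoint; no replica routing in application code."""
    with psycopg2.connect(SINGLE_ENDPOINT) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO property_views (property_id, viewed_at) VALUES (%s, now())",
                (property_id,),
            )
        # leaving the connection context commits the transaction

if __name__ == "__main__":
    record_property_view(42)
```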
What can enterprise tech leaders learn from Rightmove?
We asked Tate to pass on his lessons about driving change at a big organisation.
“There are two key aspects to consider,” he says. “First, the people and cultural side, which is often overlooked. It’s crucial to make early decisions about how you want to involve your teams in the transformation. If you want them to be part of it and manage the new system afterwards, you need to invest in them from the start. Get that right, and it becomes a force multiplier. Get it wrong, and it can slow everything down.
"The second aspect is data. It’s important to understand your core business entities and how well your data maps to them before trying to dismantle everything. Especially in a software company with many interdependencies, a good grasp of data requirements - both analytical and operational - will smooth the transformation process. Ensuring you understand your data and its consumers will help avoid roadblocks like entangled data stores used by different teams for various purposes.”