MongoDB Relational Migrator is GA and more, from Adrian Bridgwater in NY.
Data grows up. More specifically, our application structures extend over time and become more complex as the data platforms that serve them evolve. This core truism of course means that enterprises sometimes need to migrate their data to new databases, new data services, new data automation tools and new clouds.
But as we all know, growing up isn’t easy.
The growing pains experienced in data migration aren’t so much a question of adolescent angst (although there are plenty of reasons to worry about fragility, acceptance and usefulness); the challenges come down more centrally to code library disconnects, schema fragmentation and (especially in the age of multi-cloud) infrastructure misconfiguration.
Aiming to counter these inconvenient truths and provide an acne-free way to grow up in the new data universe is MongoDB, Inc.
New York state of data mine
The company used the New York leg of its developer conference programme (normally called MongoDB World, but this year, in the post-pandemic recovery period, run as MongoDB .local) this month to announce the general availability of MongoDB Relational Migrator, a new tool that promises to simplify application migration and transformation. The technology powers migrations from legacy relational deployments to modern document-based data models. In short, it aims to give organisations a streamlined way to improve operational efficiency when faced with what are often inevitable migration responsibilities.
“Customers often tell us it’s crucial that they modernise their legacy applications so they can quickly build new end-user experiences that take advantage of game-changing technologies and ship new features at high velocity. But they also say that it’s too risky, expensive and time-consuming, or that they just don’t know how to get started,” said Sahir Azam, chief product officer at MongoDB.
Data is (of course) the foundation of every application, yet a large portion of it still resides in legacy relational databases where it can’t easily support emerging applications that make use of new technologies. Rather like the point at which your old laptop, tablet or smartphone is no longer capable of running the 2023 iteration of a favourite application or data service, some portions of legacy data structure simply do not dovetail with 'modern' multi-cloud environments typified by their use of Application Programming Interface (API) connections, microservices and serverless provisioning, and let's not even start on generative AI, Large Language Models (LLMs) and open source topographies.
“With MongoDB Relational Migrator, customers can now realise the full potential of software, data and new technologies like generative AI by migrating and modernising their legacy applications with a seamless, zero-downtime migration experience and without the heavy lifting. It’s now easier than ever to modernise applications and create innovative end-user experiences at the speed and scale that modern applications require with MongoDB Atlas,” added Azam.
Okay, so generative AI did have to come in after all, but what matters most here is exactly how enterprises will make use of MongoDB Atlas’s flexible document model and scale-out capabilities. The company says that with MongoDB Relational Migrator, more organisations across all industries can quickly and cost-effectively migrate from legacy databases with little-to-no risk.
Locked-in back-end relief
Pointing once again to the need for progression in light of the new technologies that have surfaced over the last couple of decades in particular, MongoDB says that organisations of all shapes and sizes want to be able to make use of new technologies to transform. However, many companies remain locked into legacy relational databases in the back end of their applications, limiting their ability to adapt and modernise.
But why are legacy databases so bad? Many argue that legacy software is there for a reason, i.e. it still works! Yes, there are monolithic instances of applications and database use that need to be broken down like condemned buildings, but many of these older deployments (very often in government and public service institutions) do still work the way they were supposed to.
See also: MongoDB Atlas gets vector search, streaming and Kubernetes updates
Unperturbed by the 'it still works' argument, MongoDB insists that these legacy databases are rigid, unadaptable and difficult to use for supporting modern applications because of the complexity involved in mapping relationships between data when application requirements inevitably change. Additionally, because legacy databases were designed for earlier times, before the advent of cloud computing, it is difficult to scale these databases without incurring significant costs. As a result, incorporating new technologies, quickly adapting to dynamic market changes, or continuously inventing new experiences for end-users is out of reach.
"For these reasons," says Azam and team. "Customers are increasingly looking to migrate to a more flexible and scalable document-based data model that is easier to use and adapt. However, there is often considerable time, cost and risk associated with these migrations because they require highly specialised tooling and knowledge to assess existing applications and prepare data for migration. Even then, the migration process can result in data loss, application downtime, and a migrated application that does not function as intended.
Well, MongoDB did slot in the 'flexible and scalable document-based data model' line there... which (spoiler alert) is obviously what MongoDB is known for in the first place. That said, as CEO Dev Ittycheria has explained many times, the document model works for developers, i.e. it more closely aligns with the 'way developers think' when forming programming logic, and it has (somewhat undeniably) been a large part of why MongoDB has become as successful as it has.
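To make that point concrete, here is a small hypothetical illustration of our own (the names and values are invented and this is not taken from MongoDB documentation or Relational Migrator output): the same 'customer with orders' data shown first as the joined rows a relational schema would return, then as the single nested document a document model would hold.

```python
# Hypothetical illustration: the same customer record as a relational
# result set versus a single MongoDB-style document.

# Relational world: rows from three joined tables (customers, orders, order_items).
relational_rows = [
    # (customer_id, customer_name, order_id, item_sku, quantity)
    (101, "Ada Lovelace", 5001, "SKU-1", 2),
    (101, "Ada Lovelace", 5001, "SKU-7", 1),
    (101, "Ada Lovelace", 5002, "SKU-3", 4),
]

# Document world: one nested structure that mirrors how the application
# actually thinks about a "customer with orders".
customer_document = {
    "_id": 101,
    "name": "Ada Lovelace",
    "orders": [
        {"order_id": 5001, "items": [{"sku": "SKU-1", "qty": 2},
                                     {"sku": "SKU-7", "qty": 1}]},
        {"order_id": 5002, "items": [{"sku": "SKU-3", "qty": 4}]},
    ],
}

print(customer_document["orders"][0]["items"])
```

The nested form maps directly onto the objects and arrays most application code already works with, which is the 'way developers think' argument in practice.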
In terms of further product functionality, MongoDB Relational Migrator lets users migrate and modernise legacy applications without the time, cost and risk typically associated with these projects, making it significantly faster and easier to optimise business operations and encourage developer innovation.
How migration functions actually work
MongoDB Relational Migrator analyses legacy databases, automatically generates a new data schema and code, and then executes a migration to MongoDB Atlas with no downtime required. Users get started by connecting MongoDB Relational Migrator to their existing application database (e.g. Oracle, Microsoft SQL Server, MySQL or PostgreSQL) for assessment.
After analysing the application data, MongoDB Relational Migrator suggests a new data schema, then transforms and migrates the data to MongoDB Atlas (with the ability to run continuous sync jobs for zero-downtime migrations) and generates optimised code for working with data in the new, modernised application. Users can then run the modernised application in a testing environment to ensure it is operating as intended before deploying it to production.
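To give a feel for the end result, the sketch below is a minimal, hand-written approximation of that kind of relational-to-document transformation. It is not output from Relational Migrator: it uses SQLite purely as a stand-in for the legacy relational source, and it assumes a local MongoDB instance reachable via the pymongo driver.

```python
# Minimal, hand-written sketch of a relational-to-document migration of the
# kind described above. This is NOT Relational Migrator output; it assumes a
# local MongoDB instance on the default port and uses SQLite as a stand-in
# for the legacy relational source.
import sqlite3
from pymongo import MongoClient

# --- stand-in legacy relational source --------------------------------------
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (101, 'Ada Lovelace');
    INSERT INTO orders VALUES (5001, 101, 19.99), (5002, 101, 42.50);
""")

# --- target document database ------------------------------------------------
customers = MongoClient("mongodb://localhost:27017")["shop"]["customers"]

# Fold each customer and its child order rows into one embedded document,
# which is the basic schema transformation the article describes.
for cust_id, name in src.execute("SELECT id, name FROM customers"):
    orders = [
        {"order_id": oid, "total": total}
        for oid, total in src.execute(
            "SELECT id, total FROM orders WHERE customer_id = ?", (cust_id,)
        )
    ]
    customers.replace_one(
        {"_id": cust_id},
        {"_id": cust_id, "name": name, "orders": orders},
        upsert=True,
    )

print(customers.find_one({"_id": 101}))
```

In the actual product, the schema suggestion, the data transfer (including continuous sync) and the generated application code are handled by the tool itself; the sketch only shows the shape of the transformation it performs.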
Using MongoDB Relational Migrator, the company insists, organisations of all shapes and sizes can eliminate the barriers and heavy lifting associated with migrating and modernising applications. The key phrase there is 'heavy lifting', i.e. this falls in line with the imperative to apply automation and autonomous management controls at every layer of the IT stack in the current age of technology, with our ubiquitous predilection for Artificial Intelligence and all the Machine Learning advantages that lie in wait.