In a north tower meeting room at TD Bank in 2021, a team huddled around a conference table, laptops open.
High above downtown Toronto, the group formed the equivalent of a “mission control room,” according to TD Vice President of Growth Marketing, Akif Unal.
Each colleague brought a specific competency: campaign strategy, data analytics, system architecture.
Together, they were part of a group involved in one of the biggest undertakings in the bank’s history: migrating billions of data records into the cloud.
The project would span more than 20 business lines, involve thousands of employees, and require three full years of work. It prompted TD Bank to establish a specialized data FinOps function and to work with some of the biggest players across the tech ecosystem, and it spawned more than 15 separate patent application filings.
“It was a massive effort,” Unal said.
But the bank’s large-scale data migration has quickly paid off, giving the bank the speed and flexibility to adapt to customers’ changing expectations, apply the emerging powers of AI to a wealth of financial data and, in doing so, take a huge leap forward in an increasingly competitive market.
“Customers don’t just compare us to other banks,” said Unal. “They compare us to all of the digitally led players out there.”
Unlocking the library
A bank, in many ways, resembles a massive library.
Each transaction, customer record, and financial forecast represents a unique data point or volume of information.
For an organization like TD, which has been keeping such records since 1955, those data points number in the billions, with more added every day.
Taken as a library, this data has massive potential to help the bank make better decisions based on customer insights.

TD began consolidating its enterprise data nearly a decade ago with an on-premises, Hadoop-based platform. Its recent migration initiative focused on retiring that platform and moving to modern data management tooling in the cloud, giving the bank a powerful new way to unlock its knowledge.
The process also held the promise of freeing the bank from the constraints of on-premises systems, where hardware investments must be made years in advance and server capacity acts as an ever-present constraint.
Like many legacy organizations, TD was contending with an aging system that was straining under its own weight. But moving decades of data into the cloud is not a plug-and-play operation, especially in the world of financial services, where billions of dollars and vital customer trust are both in play.
“You don’t just put data in the cloud and hope it works,” said Jeff Martin, TD Chief Data Officer and CIO of Corporate Platforms.
An engine change midflight
The amount of data that TD needed to migrate into the cloud was massive, by any measure.
In all, it would require moving several petabytes of data to an entirely new system. To put that in perspective, one petabyte is equivalent to 500 billion pages of standard text.
As TD began its migration process in 2021, the bank worked with Microsoft, Accenture, and Databricks, asking them to help TD establish a modern data platform on the Azure cloud. Even for those tech giants, each of which is well-acquainted with enterprise migrations, this program represented a different level of complexity.
“It is one of the largest migrations that we’ve ever been part of,” said Arun Ulagaratchagan, Corporate Vice President, Azure Data at Microsoft.
Michael Abbott, who leads Accenture’s Banking and Capital Markets industry group, likened the task to changing a plane’s engines midflight.
“In essence, we were moving the entire digital memory of TD to the cloud,” he said.
And like most moves, packing was the hardest part.
Martin explained that before even “one byte of customer data” could be moved into the cloud, the bank would need to leverage a series of cyber controls to ensure the information’s security.
The migration initiative included implementing more than 700 of these cyber controls to protect the bank’s data, including measures like encrypting data during storage and transfer, as well as restricting access to data through secure digital safe rooms.
Behind the scenes, the bank also developed new frameworks to streamline data ingestion, curation, orchestration, and job scheduling, functions that previously depended on the performance of various applications and processes.
Systems that previously relied on custom code were standardized, and infrastructure provisioning that previously required physical setup was automated.
“The biggest challenge was just the sheer magnitude and size of it all, and getting it moved over, tested, proven, and making sure it’s accurate,” recalled Martin.
Sylvie Makhzoum, Vice President, Executive Platform Product Owner and Product Technology Lead, said the migration also required a major culture shift at the bank. According to Makhzoum, technology, analytics, and business teams had to operate as a cohesive unit, with each contributing to a tightly integrated delivery process.
To make this possible, the bank heavily invested in upskilling. Starting in 2022, more than 4,000 employees completed training on the new cloud platform, taking part in regular webinars, live demos, Q&A sessions, and workshops.
Eventually, TD moved over 20,000 processes to the cloud, as well as years of customer records and operational data. All of this unfolded, Makhzoum noted, while the bank’s daily operations carried on uninterrupted.
“You have to remember that the world doesn’t stop,” she said. “You’ve got the old system, the new system, and the old one is still getting updated. It’s a moving target.”
The flip side of speed
While the cloud migration offered TD some enormous benefits, it also surfaced unexpected issues.
Makhzoum explained that on-premises servers have built-in capacity limits, part of the problem the bank was trying to address.
But those limits also kept costs predictable.

In contrast, the cloud’s inherent elasticity opened the door to virtually unlimited computing power. Every new process and computation facilitated by the cloud came with a cost, and without careful monitoring, spending could have quickly spiralled out of control.
“In the cloud, we can add more compute very quickly, within seconds if needed,” said Makhzoum. “Operating a data platform on the cloud is a complete paradigm shift compared to on-premises, and ensuring delivery team members take cloud consumption costs into consideration is critical.”
To address this, TD established a specialized data FinOps team one year into the migration. This team worked with engineers to track consumption patterns and optimize how the bank used its computational resources on the cloud.
It implemented a range of strategies, such as using spot instances: unused computing power that a cloud service provider sells at a discount to maximize utilization of its resources. The bank also used different storage tiers depending on need and required service level, and gave its business units visibility into their own cloud usage at a more granular level than was ever possible previously.
The new reality
In June 2024, TD officially shut down its 1,000 legacy servers, marking the final step of the migration.
While decommissioning those servers immediately cut the bank’s infrastructure costs, the true breakthrough, according to Martin and Makhzoum, came from the flood of insights unlocked by the new levels of access to meaningful data that the cloud provides.

For TD, moving to the cloud meant unlocking opportunities and insights its old, siloed systems could never identify.
Now, TD can process months of historical data every day, while adding millions of new pieces of information, from credit card activity to mutual fund transactions.
That aspect of the cloud alone gives TD “a level of agility and speed that really isn’t even possible or viable” with older systems, Makhzoum said.
Unal, in his role as Vice President of Growth Marketing, said that marketing campaigns whose data once took months to source are now rolled out in days. This, among other factors, has allowed his team to deploy 54 percent more campaigns year-over-year with the same resources.
And because the cloud helps TD analyze data more effectively, the bank can send more personalized messages to customers. For instance, if a customer consistently spends on travel, the bank might send them a message about a credit card that offers higher rewards for flights or hotels.
“We’ve seen two to three times higher engagement rates with personalized marketing,” Unal said. To TD, that translates into stronger customer relationships, higher retention rates, and increased revenue from products and services that align more closely with what customers actually want.
The migration was also an investment in protecting the bank’s “digital crown jewels”: its sensitive customer and business data. These include customer account details, transaction records, and proprietary systems, where even a small breach could cause massive financial losses, disrupt operations, and damage the bank’s reputation.
The cloud gives TD the ability to implement heightened security measures to protect these assets, often in less time.
Moving to the cloud also paved the way for practical advancements within TD Bank’s operations. Makhzoum noted that the reusable frameworks developed by its teams helped streamline operations, and many were included in the more than 15 patent applications filed as part of this initiative. Two of those applications have already been granted, one for managing historical data and another for securing access to sensitive data within the platform.
Building for the long game
The cloud is also now the foundation for TD Bank’s next big leap: artificial intelligence.
“To have good AI, you need good data,” Martin explained. “AI without high-quality data would just be artificial. It wouldn’t be very intelligent.”
By centralizing its data on the cloud and making it easier to access trusted data at scale, TD can now experiment with generative AI with the same level of velocity as a FinTech startup.
Last year, the bank introduced a generative AI-powered chatbot for its call centre, designed to help live agents find answers to customer queries with increased speed and accuracy.
“We went from prototype to production in three months,” Martin said, adding that this would not have been possible with the bank’s old setup.
Still, rolling out generative AI in a bank is not taken lightly, especially when handling customer data.
“We don’t assume accuracy, we ensure it,” Martin said, adding that TD made sure every new tool was carefully tested with a human-in-the-loop approach before being used more broadly.
Microsoft’s Ulagaratchagan said “getting enterprise data ready for generative AI has become a critical priority” for the company’s partners.
“TD has really leapfrogged forward by taking the entire data estate, bringing it to the cloud, modernizing it, making it GenAI-ready, and putting a strong digital foundation in place,” he said. “It puts TD in an incredibly powerful position to take advantage of the GenAI transformation.”
Beyond the immediate benefits of moving off-premises, Martin said the migration has also prompted TD to fundamentally rethink how it operates and views itself in the financial services sector.
“To be a great customer service company, you have to have a great handle on data,” he said. “We’ve got a strong enterprise approach to data, where all data is treated as an enterprise asset, and we’ve built out a platform that enables us to use that data to serve our customers better.”
All photos provided by TD Bank. Feature image by TD / BetaKit.