Don’t Let Your Old ETL Tool Hold You Back From Migrating Your Data to the Cloud


Moving from an on-premises data warehouse to one in the cloud is only half the story; don’t forget about how you’re going to keep the data flowing.

Cloud migrations are not a one-and-done sort of project. The usual path tends to have some common stages, although of course each organization is different. There are also some pitfalls to avoid along the way. A fully hybrid integration platform like SnapLogic, equally at home talking to the latest SaaS tool as it is dealing with the most established on-premises suites, can really help at every stage of these projects.

Let’s explore how SnapLogic can help make your cloud data warehouse migration project a success.

Will you be at Snowflake Summit in Las Vegas? Stop by booth #1331 in Basecamp West for a giant Jenga game and a chance to win a limited edition prize. If you’re a SnapLogic customer, reach out to your account manager to collect a special gift at the booth.

Phases of cloud migration

Experimentation

At the beginning, small teams or even individuals start building new services in the cloud. These are typically small demo projects or small-scale utilities. This phase is often overlooked because it happens under the radar, and may even be opposed as “shadow IT”. However, it is crucially important: it is where the organization starts to develop the skills and reflexes needed to work in the very different environment of the cloud.

Early migration

Building on those initial successes, some internal apps that need to be refactored anyway begin to be migrated to the cloud. Not all of these migrations will succeed on the first attempt, because the organization is still learning. A common failure mode is “forklift upgrades” that fail to take advantage of the specific characteristics of cloud infrastructure, instead importing architectures and processes wholesale from the on-premises data center.

Move to cloud-first 

At this point the organization has built up enough comfort with what is required for success in the cloud to adopt a “cloud-first” policy: everything should be built in the cloud unless there is a very good reason for it not to be. The goal of such policies is not just consistency; it is to future-proof the organization by avoiding reliance on technologies whose ongoing support may be coming into question, or where the skills pool is drying up as practitioners retrain or retire.

Strategic migration

Up to this point, the bulk of IT services is probably still running in self-managed infrastructure. This makes sense: services that are used to run the business have been in place for a long time, and have accumulated a lot of data — and a lot of connections to other services. The combination of the risk of failure and the sheer complexity of the task makes IT organizations very change-averse when it comes to these services, even when they have happily adopted cloud services in other parts of the business. 

Finally there comes a moment when the benefits of the cloud — its flexibility, both technical and financial — outweigh those fears. At this point the organization has built up enough skills, and has a deep enough relationship and experience with its cloud suppliers and partners, to be willing to consider migrating strategic run-the-business services to the cloud.

Dependency hairball

This is where things get (even more) complicated. Those strategic legacy services are complicated enough in their own right. None of them are off-the-shelf products; all of them have been customized to represent specific organizational processes and policies. That would still be manageable — but they have also accumulated any number of subsidiary helper tools around them to integrate them with all of the other services the organization relies on.

If we look at a data warehouse such as Teradata, the biggest dependencies are the tools that move data in and out of it. If an organization is reliant on on-premises tools such as DataStage or Informatica PowerCenter, that is a further brake on its ability to move to a cloud data warehouse like, say, Snowflake.
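To make the contrast concrete, here is a minimal sketch of what cloud-native data movement into Snowflake can look like, using the snowflake-connector-python library. The account, stage, and table names are hypothetical placeholders, and in practice this load-then-transform (ELT) pattern would usually be orchestrated by an integration platform rather than hand-written scripts.

```python
# Illustrative ELT sketch against Snowflake.
# All connection details, stages, and tables below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="********",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    # Extract and load: copy raw files straight from a cloud storage stage
    # into a landing table, with no transformation in flight.
    cur.execute("""
        COPY INTO ORDERS_LANDING
        FROM @ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Transform: reshape the data inside the warehouse itself,
    # where the elastic compute lives, rather than in an external ETL engine.
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.ORDERS AS
        SELECT order_id, customer_id, TO_DATE(order_date) AS order_date, amount
        FROM ORDERS_LANDING
        WHERE amount IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```

The design point is the ELT shape itself: legacy ETL engines pull data out of the warehouse, transform it on their own infrastructure, and push it back, whereas cloud-native tooling lands raw data first and lets the warehouse do the transformation work.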

It can be tempting, when faced with a large, complex project such as rehoming a critical business service, to reduce the scope to a “minimum viable refactoring” that brings along as much as possible of the old world: tools, processes, and culture. The risk is a half-hearted migration to the cloud that fails to deliver all the expected benefits.

Unfortunately, the migration projects that take this risk-averse approach also tend to be the ones that need to pause for significant revisions or even reverse course entirely. The result is at best a delayed delivery; at worst, a project may fail completely.

Commit to the cloud

For a successful cloud migration, it’s best to commit all the way. Embrace a cloud data warehouse like Snowflake, and select cloud-native tools that will enable the organization to realize the value it expects from the migration. SnapLogic’s intelligent integration platform as a service (iPaaS) is perfectly placed to support both the migration project and the long-term operation of the cloud data warehouse.

The SnapLogic platform is cloud-native and future-proof, but is also perfectly able to interact with on-premises tools, without onerous connectivity or security requirements. In addition, the ease of use of our low-code/no-code platform ensures a short learning curve, and enables people across the organization — not just in technical roles — to get involved in the project. This wide user base helps speed adoption and delivery, minimizing costly rework and misunderstandings.

When it comes to cloud data warehouse migrations specifically, a strong track record of partnership between SnapLogic and Snowflake, plus a ready-to-use Snap Pack of pre-built actions, make SnapLogic the tool to shorten time-to-value for your Snowflake project, while lowering costs and increasing adoption over time.

Pitney Bowes built a single unified data lake with real-time data access and easy data sharing. Over 500 people across twenty teams work in SnapLogic without hand-coding, leveraging reusable templates and drag-and-drop Snaps to reduce integration time from days to minutes with real-time ETL and ELT.

Similarly, USAA optimized IT and Development teams’ workflows by removing the need for hand-coding. They are now executing over 22 million pipelines per year to automate connectivity across the entire enterprise.

Let’s talk about how SnapLogic can help ensure the success of your data integration project. If you’re going to be at Snowflake Summit, you should meet the SnapLogic team in Las Vegas. Or of course, just get in touch any time!

Dominic Wellington, Enterprise Architect at SnapLogic
Category: Cloud Partners
Topics: Snowflake
