The commoditization of integration

By Dinesh Chandrasekhar

Eight years ago, dozens of integration vendors were offering scores of solutions, all with what seemed to be the same capabilities. Pick any ESB or ETL tool and it seemed to perform the same functions as its competitors. RFPs were no longer a viable way to weed out inferior vendors, as each solution checked all the boxes across the board. Plus, all vendors were ready to lower their prices at the drop of a hat to win your business. By that point, the integration market had truly been commoditized. Consumers could easily pick and choose any solution, as there were no true differentiators amongst them.

But, several factors have changed the landscape since then:

  • NoESB – The NoESB architecture had started gaining interest, pushing the idea that an ESB is irrelevant for many integration scenarios. Yet an API gateway was not the right alternative either.
  • Cloudification – The cloudification of pretty much all your favorite on-premises enterprise applications began around the same time. Enterprises that were thinking of a digital transformation couldn’t get too far without a definitive cloud strategy in place.
  • Convergence of ESB and ETL – The lines between application integration and data integration were blurring. CIOs and IT managers didn’t want to deal with two different sets of integration tools. With the onset of mobile and IoT, data volumes were exploding daily. As a result, even data warehouses moved to the cloud. Traditional, legacy ESB and ETL tools were simply not equipped to serve such big data needs.
  • Agile Integrations – Finally, the DevOps and Agile movements impacted enterprise integration initiatives as well. They gave rise to new user personas in the enterprise – Citizen Integrators, or Citizen Developers. These are the LOB managers and other non-IT personnel who needed quick integrations within their applications to render their data in different views. Reliance on IT to deliver solutions to the business was becoming a major hindrance.

All these factors have influenced the iPaaS (Integration Platform as a Service) market. Now, thousands of companies are already leveraging iPaaS solutions to integrate their cloud and on-premises applications. iPaaS solutions break away from legacy approaches to integration: they are cloud-native, intuitive, fast, and self-starting, support hybrid architectures, and offer connectors to a wide range of on-premises and cloud applications.

Now comes the big question – “Will iPaaS solutions be commoditized, too?” At the moment, the answer is a definite no, and there are multiple reasons why. Beyond scale, latency, tenancy, SLAs, the number of connectors, and so on, one of the key areas that will differentiate iPaaS solutions is the developer experience. The user interface will determine the adoption rate and the value the solution brings to the enterprise. For a citizen integrator to actually use the system, the interface must be intuitive enough to guide them in building integration flows quickly, effectively, and, most importantly, without the assistance of IT. This alone will make or break adoption.

iPaaS vendors are trying to enhance this developer experience with features like drag-and-drop connectors, pipeline snippets, template libraries, starter kits, and mapping enhancements. However, very few vendors offer AI-driven tooling that intelligently predicts the next step in your integration flow, based on learnings from hundreds of other users. AI assistance is a great benefit for citizen integrators, who may be non-technical, and even technically savvy developers welcome the productivity boost. With innovations like this happening, the iPaaS space is quite far from being commoditized. However, enterprises still need to be wary of cloud-washing iPaaS vendors that offer “1000+” connectors, a thick-client IDE, or an ESB wrapped in a cloud blanket. But that is a post for a different day!
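
Vendors don’t publish how such AI assistance works, but the core idea can be illustrated with a simple frequency model: record which step tends to follow which in previously built flows, then rank the candidates. The sketch below is a minimal, hypothetical illustration in Python – the class and step names are invented, and a real product would use far richer signals.

```python
from collections import Counter, defaultdict

class NextStepSuggester:
    """Toy next-step recommender trained on past integration flows."""

    def __init__(self):
        # For each step, count how often every other step followed it.
        self._followers = defaultdict(Counter)

    def train(self, flows):
        """flows: iterable of step-name sequences from historical pipelines."""
        for flow in flows:
            for current, following in zip(flow, flow[1:]):
                self._followers[current][following] += 1

    def suggest(self, current_step, k=3):
        """Return up to k steps that most often followed current_step."""
        return [step for step, _ in self._followers[current_step].most_common(k)]

# Hypothetical usage: learn from two earlier flows, then ask for a suggestion.
suggester = NextStepSuggester()
suggester.train([
    ["salesforce_read", "map_fields", "redshift_write"],
    ["salesforce_read", "map_fields", "filter_rows", "s3_write"],
])
print(suggester.suggest("map_fields"))  # ['redshift_write', 'filter_rows']
```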

Dinesh Chandrasekhar is Director of Product Marketing at SnapLogic. Follow him on Twitter @AppInt4All.

Mossberg out. Enterprise technology still in

By Gaurav Dhillon

A few weeks ago, the legendary tech journalist, Walt Mossberg, penned his last column. Although tech journalism today is vastly different than it was in 1991, when his first column appeared in the Wall Street Journal, or even five or 10 years ago, voices like Walt’s still matter. They matter because history matters – despite what I see as today’s widely held, yet unspoken belief that nothing much important existed prior to the invention of the iPhone.

Unpacking that further, history matters because the people who learn from it, and take their cues from it, are those who will drive the future.

Enterprise tech history is still unfolding

I like to think of myself as one of those people, certainly one who believes that all history is meaningful, including tech history. As tech journalism’s eminence grise, Walt not only chronicled the industry’s history, he also helped to define it. He was at the helm of a loose cadre of tech journalists and industry pundits, from Robert X. Cringely to Esther Dyson, who could make or break a company with just a few paragraphs.

Walt is now retiring. So what can we learn from him? The premise of his farewell column in Recode is that tech is disappearing, in a good way. “[Personal] tech was once always in your way. Soon, it will be almost invisible,” he wrote, and further, “The big software revolutions, like cloud computing, search engines, and social networks are also still growing and improving, but have become largely established.”

I’ll disagree with Walt on the second point. The cloud computing revolution, which is changing the way enterprises think and operate, is just beginning. We are at a juncture populated by unimaginably large quantities of data, coupled with an unquenchable thirst among enterprises to learn from it. The world has gone mad for artificial intelligence (AI) and analytics, every permutation of which is fueled by one thing: data.

The way we use data will become invisible

In his column, Walt observed that personal tech is now almost invisible. We use and benefit from it in an almost passive way. The way data scientists and business users consume data is anything but invisible. Data is still moved around and manually integrated, on-premises and in the cloud, with processes that haven’t changed much since the 1970s. Think about it – the 1970s! It’s no secret that extract, transform, and load (ETL) processes remain the bane of data consumers’ existence, largely because many enterprises are still using 25-year-old solutions to manage ETL and integrate data.


The good news is, data integration is becoming much easier to do, and is well on its way to becoming invisible. Enterprise integration cloud technology promises to replace slow and cumbersome scripting and manual data movement with fast, open, seamless data pipelines, optimized with AI techniques.
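
To picture what a data pipeline is replacing all that scripting with, consider a toy sketch: a source, a chain of transformations, and a sink expressed as one declarative flow. Everything here – the function names, the record shapes – is invented for illustration and implies nothing about any particular vendor’s engine.

```python
from typing import Callable, Iterable

# A pipeline step consumes an iterable of records and yields another.
Step = Callable[[Iterable[dict]], Iterable[dict]]

def run_pipeline(source: Iterable[dict], steps: list[Step]) -> list[dict]:
    """Thread records through each step in order and collect the result."""
    data = source
    for step in steps:
        data = step(data)
    return list(data)

# Hypothetical steps: drop inactive accounts, then reshape for a CRM.
def only_active(records):
    return (r for r in records if r.get("status") == "active")

def to_crm_shape(records):
    return ({"crm_id": r["id"], "name": r["name"].title()} for r in records)

accounts = [
    {"id": 1, "name": "acme corp", "status": "active"},
    {"id": 2, "name": "globex", "status": "closed"},
]
print(run_pipeline(accounts, [only_active, to_crm_shape]))
# [{'crm_id': 1, 'name': 'Acme Corp'}]
```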

Remember how, as Internet use exploded in the late 1990s, the tech industry was abuzz with companies offering all manner of optimization technologies, like load balancing, data mirroring, and throughput optimization? These days you never hear about those companies; we take high-performance internet service for granted, like the old-fashioned dial tone.

I am confident that we are embarking on a similar era for enterprise data integration, one in which modern, cloud-first technologies will make complex data integration processes increasingly invisible, seamlessly baked into the way data is stored and accessed.

Making history with data integration

I had the pleasure of meeting Walt some years ago at his office, a miniature museum with many of the personal tech industry’s greatest inventions on display. There, his love of tech was apparent and abundant. Apple IIe? Nokia Communicator 9000? Palm Treo and original iPod? Of course. If Walt were to stay at his keyboard, in his office, for another couple of years, I’m pretty sure his collection would be joined by a technology with no physical form factor, but of even greater import: the enterprise cloud.

Hats off to you, Walt. And while you may have given your final sign-off, “Mossberg out,” enterprise tech is most definitely still in.


Gaurav Dhillon is CEO of SnapLogic. You can follow him on Twitter @gdhillon.

From helicopter to enabler: The new face of enterprise IT

Can an IT organization effectively run a 2017 business on 25-year-old technology? As someone who played a large hand in developing the data integration technology in question — at Informatica, where I was CTO for nearly two decades — I can tell you that the answer is simple: “No.”

A vastly different primordial landscape

That said, I know that when Informatica was created, it was the best technology for data integration at the time. The world was a lot simpler in 1992: there were five databases that mattered, and they were all pretty similar. There were just a few ERP systems: Oracle, SAP and a young PeopleSoft. Informatica was ideally suited to that software baseline, and the scale-up UNIX platforms of that era. The web, obviously, was not in the picture.

IT organizations were also a lot simpler in 1992. If any business person wanted new tech functionality — a new workstation added to a network, or a new report from a client/server system — they put their request into the IT queue, because that was the only way to get it.

IT is still important; it’s just different

Fast-forward 25 years to 2017. Almost everything about that primordial technology landscape, when Informatica roamed the world, is different. For example, now there’s the web, the cloud, NoSQL databases, and best-of-breed application strategies that are actually viable. None of these existed when Informatica started. Every assumption from that time – the compute platform, scale-up/scale-out, data types, data volumes and data formats – is different.

IT organizations are radically different, too. The command-and-control IT organization of the past has transformed into a critical enablement function. IT still enables core operations by securing the enterprise and establishing a multitude of technology governance frameworks. But the procurement and use of end-user technology – tools for analyzing data aggregated from systems across the enterprise, for example – is increasingly in the hands of business users.

In other words, the role of IT is changing, but the importance of IT isn’t. It’s like parenting; as your kids grow, your role changes. It’s less about helicoptering and more about enabling. Parents don’t become less important, but how we deliver value evolves.

This is a good analog to the changes in enterprise IT. The IT organization wants to enable users because it’s nearly impossible to keep up with the blistering pace of business growth and change. If the IT organization tries to control too much, at some point it starts holding the business back.

Smart IT organizations have realized their role in the modern enterprise is to help their business partners become more successful. SnapLogic delivers a vital piece of the required technology: we help IT organizations give their users the self-service data integration they need, instead of waiting for analysts to run an ETL job through Informatica to pull the requested data together. By enabling self-service, SnapLogic is helping lines of business – most companies’ biggest growth drivers – reach their full potential. If you’re a parent reading this, I know it will sound familiar.

Here’s another way to find out more about why IT organizations are embracing SnapLogic as a critical enabler: read SnapLogic’s new whitepaper, which captures my conversation with Gaurav Dhillon, SnapLogic’s CEO and also an Informatica alumnus: “We left Informatica. Now you can, too.”


Webinar: Introduction to iPaaS – Drivers, Requirements and Use Cases

If you have heard the term “iPaaS” but still aren’t quite sure what it means, join us tomorrow, Wednesday, July 20th, at 10am Pacific, for a webinar in partnership with Synerzip to hear more about this increasingly recognized term and why you might be ready to adopt an integration platform as a service (iPaaS) solution.


8 Reasons Why Your Legacy Data Integration Plan Must Change

It’s great to see more and more discussion about the need to rethink your enterprise integration strategy. This week, I participated in an InfoQ Virtual Panel: The Current State of Integration Platform as a Service (iPaaS). The key takeaways were:

  • iPaaS is no longer just about integration occurring in the cloud. iPaaS takes advantage of inherent cloud platform capabilities such as dynamic scale, containers, resource management and self-service.
  • SaaS connectors are still relevant and a preferred approach over direct API calls for productivity, simplicity, abstraction and security reasons.
  • SOAP, once largely entrenched in ESBs and Integration Brokers, has given way to REST and JSON in iPaaS (see the sketch after this list).
  • The future of iPaaS is connecting and enabling other high value business outcomes related to analytics, machine learning and big data.
  • Digital transformation is a driver for organizations to move from traditional on-premises middleware to iPaaS in order to drive down costs and increase velocity.
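
To make the SOAP-to-REST takeaway concrete, here is a minimal sketch of the kind of call that now sits underneath most iPaaS connectors: one HTTPS request, one JSON document back, with no WSDL contract or XML envelope to manage. The endpoint and field names are placeholders, not a real API.

```python
import json
import urllib.request

# One HTTPS request returning JSON; a SOAP exchange, by contrast, would
# need an XML envelope and a WSDL-generated client stub.
url = "https://api.example.com/v1/orders?status=open"  # placeholder endpoint
request = urllib.request.Request(url, headers={"Accept": "application/json"})

with urllib.request.urlopen(request) as response:
    payload = json.load(response)  # parse the JSON body into Python objects

for order in payload.get("items", []):  # hypothetical response shape
    print(order["id"], order["total"])
```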

You can read the entire discussion here. Thanks to Kent Weare and InfoQ for the opportunity to participate.

10 Modern Data Integration Platform Requirements

SnapLogic’s Vice President of Engineering Vaikom Krishnan recently outlined the 10 New Requirements for Modern Data Integration. Here is a presentation that summarizes these requirements.


What You Need to Know About Modern Data Integration

Last week we hosted a webinar with industry analyst, thought leader, and author David Linthicum, focused on how enterprise problem domains are changing and how data integration technology must change with them. The presentation boiled this down to five critical and lesser-known data integration requirements – what they are, how to understand them, and how to pick the right approaches and technology to address them.

According to David, the 5 most critical things to understand about modern data integration are:

  1. Workloads and data are likely to be distributed across traditional systems, private clouds, and public clouds.
  2. Data is growing quickly, with big data and data lakes common within most enterprises.
  3. Data must be delivered in real time, on demand, in support of most modern applications and business processes.
  4. Security is now systemic; it can no longer be an afterthought.
  5. DevOps is the new standard for building and deploying applications and data stores.
