Mossberg out. Enterprise technology still in

By Gaurav Dhillon

A few weeks ago, the legendary tech journalist, Walt Mossberg, penned his last column. Although tech journalism today is vastly different than it was in 1991, when his first column appeared in the Wall Street Journal, or even five or 10 years ago, voices like Walt’s still matter. They matter because history matters – despite what I see as today’s widely held, yet unspoken belief that nothing much important existed prior to the invention of the iPhone.

Unpacking that further, history matters because the people who learn from it, and take their cues from it, are those who will drive the future.

Enterprise tech history is still unfolding

I like to think of myself as one of those people, certainly one who believes that all history is meaningful, including tech history. As tech journalism’s eminence grise, Walt not only chronicled the industry’s history, he also helped to define it. He was at the helm of a loose cadre of tech journalists and industry pundits, from Robert X. Cringely to Esther Dyson, who could make or break a company with just a few paragraphs.

Walt is now retiring. So what can we learn from him? The premise of his farewell column in Recode is that tech is disappearing, in a good way. "[Personal] tech was once always in your way. Soon, it will be almost invisible," he wrote, and further, "The big software revolutions, like cloud computing, search engines, and social networks are also still growing and improving, but have become largely established."

I’ll disagree with Walt on the second point. The cloud computing revolution, which is changing the way enterprises think and operate, is just beginning. We are at a juncture populated by unimaginably large quantities of data, coupled with an equally unquenchable thirst by enterprises to learn from it. The world has gone mad for artificial intelligence (AI) and analytics, every permutation of which is fueled by one thing: data.

The way we use data will become invisible

In his column, Walt observed that personal tech is now almost invisible. We use and benefit from it in an almost passive way. The way data scientists and business users consume data is anything but. Data is still moved around and manually integrated, on-premises and in the cloud, with processes that haven't changed much since the 1970s. Think about it – the 1970s! It's no secret that extract, transform, and load (ETL) processes remain the bane of data consumers' existence, largely because many enterprises are still using 25-year-old solutions to manage ETL and integrate data.
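To see why that's a bane, here is the shape of a hand-rolled ETL job in miniature; every name, field, and record below is hypothetical, but the extract-transform-load structure is the same one teams have been scripting for decades:

```python
import csv
import io

# Hypothetical hand-rolled ETL: extract rows from a CSV export, transform
# them, and load them into a list standing in for a warehouse table.

RAW_EXPORT = """id,name,revenue
1,Acme,1000
2,Globex,2500
"""

def extract(text):
    # Extract: parse the raw export into row dictionaries.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and normalize the name field.
    return [
        {"id": int(r["id"]), "name": r["name"].upper(), "revenue": float(r["revenue"])}
        for r in rows
    ]

def load(rows, warehouse):
    # Load: append the transformed rows into the target table.
    warehouse.extend(rows)
    return warehouse

warehouse = []
load(transform(extract(RAW_EXPORT)), warehouse)
print(warehouse[0]["name"])  # ACME
```

Multiply that by hundreds of sources, formats, and failure modes, and the appeal of making this layer invisible becomes obvious.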


The good news is, data integration is becoming much easier to do, and is well on its way to becoming invisible. Enterprise integration cloud technology promises to replace slow and cumbersome scripting and manual data movement with fast, open, seamless data pipelines, optimized with AI techniques.

Remember how, as Internet use exploded in the late 1990s, the tech industry was abuzz with companies offering all manner of optimization technologies, like load balancing, data mirroring, and throughput optimization? You never hear about these companies anymore; we take high-performance internet service for granted, like the old-fashioned dial tone.

I am confident that we are embarking on a similar era for enterprise data integration, one in which modern, cloud-first technologies will make complex data integration processes increasingly invisible, seamlessly baked into the way data is stored and accessed.

Making history with data integration

I had the pleasure of meeting Walt some years ago at his office, a miniature museum with many of the personal tech industry's greatest inventions on display. There, his love of tech was apparent and abundant. Apple IIe? Nokia Communicator 9000? Palm Treo and original iPod? Of course. If Walt were to stay at his keyboard, in his office, for another couple of years, I'm pretty sure his collection would be joined by a technology with no physical form factor but of even greater import: the enterprise cloud.

Hats off to you, Walt. And while you may have given your final sign-off, “Mossberg out,” enterprise tech is most definitely still in.


Gaurav Dhillon is CEO of SnapLogic. You can follow him on Twitter @gdhillon.

Cloud fluency: Does your data integration solution speak the truth?

There’s been a lot of cloud-washing in the enterprise data integration space — vendors are heavily promoting their hybrid cloud services, yet for many, only a skinny part of their monolithic apps has been “cloudified.”

In an era of "alternative facts," it's important to make technology decisions based on truths. Here is an important one: A big data integration solution built on genuine, made-for-the-cloud platform as a service (PaaS) technology offers important benefits, including:

  1. Self-service integration by “citizen integrators,” without reliance on IT
  2. For IT organizations, the ability to easily connect multiple data sets, to achieve a bespoke enterprise tech environment

These are in addition to the traditional benefits of cloud solutions: no on-premise installation; continuous, no-fuss upgrades; and the latest software innovation, delivered automatically.

Why “built for the cloud” matters

You can’t get these benefits with “cloudified” software that was originally invented in 1992. Of course, I’m referring to Informatica; while the company promotes its cloud capabilities, the software largely retains a monolithic architecture that resides on-premises, and does most of its work there, too.

In contrast, SnapLogic is purpose-built for the cloud, meaning there are no legacy components that prevent the data and application integration service from running at cloud speed. Data streams between applications, databases, files, social and big data sources via the Snaplex, a self-upgrading, elastic execution grid.

In more everyday terms, SnapLogic has 100% cloud fluency. Our technology was made for the cloud, born in the cloud, and it lives in the cloud.

The consumerization of data integration

Further to the first benefit above, self-service by "citizen integrators": industry futurists like R. "Ray" Wang have been talking about the consumerization of IT for more than half a decade. And that is exactly what SnapLogic has mastered. Our great breakthrough, our big innovation, is that we have consumerized the dark, dungeon-like problem of data integration.

Integration used to be a big, boring problem relegated to the back office. We've brought it from the dungeon to the front office and into the light. It is amazing to see how people use our product. Customers go from one user to hundreds of users as people gain access to data in a secure, organized, and appropriately access-controlled manner. And you don't need a cast of thousands of IT people enabling all this; users merely help themselves. This is the right model for the modern enterprise.

“An ERP of one”

As for the second major benefit of a true cloud solution — a bespoke enterprise tech environment, at a fraction of the time and cost of traditional means — here’s a customer quote from a young CEO of a hot company that’s a household name.

“Look, we’ve got an ‘ERP of one’ by using SnapLogic — a totally customized enterprise information environment. We can buy the best-of-the-best SaaS offerings, and then with SnapLogic, integrate them into a bespoke ERP system that would cost a bajillion dollars to build ourselves. We can custom mix and match the capabilities that uniquely fit us. We got the bespoke suit at off-the-rack prices by using SnapLogic to customize our enterprise environment.”

To my mind, that's the big payoff, and an excellent way to think about SnapLogic's value. We are able to give our customers an "ERP of one" faster and cheaper than they could have ever imagined. This is where the world is going, driven by the vanishingly low prices of compute power and storage, and by cloud computing.

Today you literally can, without a huge outlay, build your own enterprise technology world. But you need the glue to realize the vision, to bring it all together. That glue is SnapLogic.

Find out more about how and why SnapLogic puts best-of-breed enterprise integration within every organization's grasp. Register for this upcoming webinar, a conversation between industry analyst and data integration expert David Linthicum and me, SnapLogic's CEO and an Informatica alumnus: "We left Informatica. Now you can, too."


Gaurav Dhillon is CEO at SnapLogic. Follow him on Twitter @gdhillon.

Why enterprises must meet millennials’ expectations


Millennials’ attitudes in the workplace have gotten a bad rap, the roots of which are explored in this extremely popular video by author and speaker Simon Sinek. But this blog isn’t a slam on millennials’ expectations for job fulfillment. It’s about meeting their expectations of how easy it should be to use enterprise technology — and that’s a good thing.

A very vocal majority

Since 2015, millennials have been the largest demographic group in the US workforce, numbering 53.5 million. They are now mainstream enterprise tech consumers, and there’s a thing or two we can learn. For example, millennials came of age using smartphones. In fact, 97% of millennials aged 25-34 own a smartphone. And I doubt that a single one would want to give up their smartphone for a separate flip phone, music player and camera.

The reality is, we live in an age where people expect multiple utility from technology, a driving force in innovation. Consider a washer that also dries your clothes; that's pretty rad. Or a motorcycle helmet that puts an entire dashboard of information right in front of your eyes; that's radder still.

Expectations for multiple utility are similarly all over the workplace, and millennials are approaching the data consumption challenge with a clean slate. They say it should be easy, like a smartphone, and be self-service. Once again, millennials are clamoring for multiple utility.


SnapLogic meets millennial expectations of modern business

This is an area where SnapLogic trumps legacy technologies. On its best day, the 25-year-old data integration technology offered by Informatica creates ETLs (extract, transform, loads) and has some other capabilities added on. But at its core, Informatica was designed to deal with batch, relational, ETL-like kinds of problems. Unfortunately, no one in the working world, not even retiring Boomers, lives in batch mode. Business change happens in real-time, and our data and analytics need to support that.

From day one, SnapLogic's integration platform has been designed to solve all kinds of data-in-flight problems in the enterprise. These include what we called, in the last century, application integration problems, like connecting Salesforce with SAP, and data integration problems, like providing information feeds to answer modern analytics questions. We can also apply SnapLogic's application integration architecture to technologies that weren't widely available in the last century, like predictive analytics, machine learning, and wiring up large industrial enterprises with IoT sensors, opening new profit pools and helping you build better products.

That's the kind of multiple utility that people expect from their technology – it's not about speeds and feeds, it's about having a smartphone versus a separate phone, camera, and music player. That's just so 1992, you know?


Want to learn more about meeting today’s lofty expectations for enterprise tech? Read SnapLogic’s new whitepaper that captures my conversation with James Markarian, SnapLogic’s CTO and also an Informatica alumnus: “We left Informatica. Now you can, too.”


The need for speed: Why I left Informatica (and you should, too)

Informatica is one of the biggest, oldest names in enterprise technology. It's a company I co-founded in 1992 and left over 10 years ago. Although the reasons why I left can be most easily summarized as "disagreements with the board over the direction of the company," it all boils down to this: aging enterprise technology doesn't move fast enough to keep up with the speed of today's business.

About a year after I left, I founded SnapLogic, a company that has re-invented data integration for the modern enterprise — an enterprise that is increasingly living, working and innovating in the cloud. The pace at which enterprises are shifting operations to the cloud is reflected in stats like this: According to Forrester Research, the global public cloud market will top $146 billion in 2017, up from $87 billion in 2015.

Should you ride a horse to the office?

Given the tidal wave of movement to the cloud, why would a company stick with Informatica? Often, it's based on decisions made in the last century, when CIOs made strategic commitments to this legacy platform. If you're the CIO of that shop today, you may or may not have been the person who made that decision, but here you are, running Informatica.

Going forward, does it make sense to keep running the company on Informatica? The truthful answer is that it can work, just as you can run a modern company on a mainframe. You can also ride a horse to the office. But is it something you should do? That's where I say "no." The direct path between a problem and a solution is to use appropriate technologies that are in sync with the problems being solved, within the time and budget available today. That is the crux of the choice between the Informatica inheritance and the SnapLogic future.

It’s true that the core guts of what is still Informatica — the underlying engine, the metadata, the user interface and so on — have to some extent been replenished. But they are fundamentally still fixed in the past. It’s like a mainframe; you can go from water cooling to air cooling, but fundamentally it’s still a mainframe.

The high price of opportunity cost

IT and business people always think about sunk costs, and they don't want to give up on them. Informatica shops have invested heavily in the application, and in the people, processes, iron, and data centers required to run it; these are sunk costs.

But IT and business leaders need to think about sunk opportunity, and the high price their companies pay for missing out because their antiquated infrastructure — of which Informatica is emblematic — doesn’t allow them to move fast enough to seize opportunity when they see it.

Today, most enterprises are making a conscious decision to stop throwing good money after bad on their application portfolios. They recognize they can't lose out on more opportunities. They are switching to cloud computing and modern enterprise SaaS. As a result, there's been a huge shift toward solutions like Salesforce, Workday, and ServiceNow; companies that swore they would never give up on-premise software are moving their application computing to the cloud.

Game, set, match point

In light of that, in a world that offers new, ultra-modern technology at commodity prices, you start to realize, “We ought to modernize. We should give up on the sunk costs and instead think of the sunk opportunity of persisting with clunky old technology.”

This is the “match point” that SnapLogic can defend into eternity. Hundreds of our customers around the globe testify to that. Almost all of these companies had some flavor of Informatica or its competitor, and they have made the choice to move to SnapLogic. Some have moved completely, in a big bang, and others have side-by-side projects and will migrate completely to SnapLogic over time.

Need more reasons to move fast? Read SnapLogic’s new whitepaper that captures my conversation with James Markarian, SnapLogic’s CTO and also an Informatica alumnus: “We left Informatica. Now you can, too.”


5 Data Predictions for 2016

We all know that huge amounts of data are being turned into information, and then insight, allowing businesses to redefine industries and win over today's consumers. And yet most companies are still dabbling on the edges, still fighting against restrictive silos and old IT. As you evaluate your data strategy for 2016, here are five data predictions to keep in mind, as well as a look at the start-up industry feeding us data solutions.

The Business Internet

In the minds of most people, the Internet is an immense assortment of consumer-oriented Web sites that are accessed by individuals using a browser. Well-known examples of these types of destinations include eBay, Yahoo, Barnes & Noble, Expedia, and so on. In this view, Internet transactions are typically carried out by a person performing a single, discrete interaction such as placing a bid, buying a product, running a query, reserving a flight, and so on. While the Consumer Internet has benefitted from a seemingly never-ending series of exciting innovations, many observers would correctly assess business-oriented applications as, frankly, somewhat staid and stodgy. But things are changing, fast.

The trend towards massive, monolithic packaged software from a handful of mega-vendors has been reversed. Instead, the momentum has shifted to what Marc Andreessen has dubbed the Business Internet, which is exemplified by innovative Software-as-a-Service (SaaS) solutions such as Salesforce, Workday, and NetSuite. Although on-premise applications are still being created and installed – and are certainly present as legacy solutions – the majority of new software is being deployed into the cloud (whether public or private). According to Saugatuck Technology, an IT research consultancy, 40 percent or more of all new business application/solution decisions in the enterprise will be cloud-based by 2014 (up from 15-20 percent in 2009). All customer segments are impacted.

Why the big shift from on-premise applications to the Business Internet? The answer is part technology and part psychology. Ubiquitous, well-proven technology standards such as HTTP/S for transport, RESTful APIs, and the unlimited scale of the underlying infrastructure of the Web have provided the foundation. The other part is psychological. In many cases, the Business Internet – all these new cloud-aware applications – can now simply be thought of as websites, whether they're running inside the firewall or in the cloud. Your employees don't view Salesforce as the CRM application; they view it as the "salesperson's website." And for application managers concerned about the hundreds of highly focused new Web-based applications that have arrived on the market, each with its own unique niche and value proposition, it may help to think of them just as your users do: websites.

And most of these websites have one thing in common: a RESTful API to make integration possible. And since they all follow standards, these new solutions can easily ‘plug in’ to the enterprise’s existing infrastructure. This also validates and preserves the significant investments that many organizations have made in Service Oriented Architecture (SOA).

These cloud-aware, standards-based applications have been particularly disruptive to the way that integration is performed. It’s no longer necessary to expend large parts of the IT budget to create brittle, heavyweight solutions that only work inside the firewall. Instead, you can now perform integration in the cloud. This is much faster, cleaner, and cheaper to build and maintain. Even though the majority of integration projects still occur within the confines of a single organization, the standards I described earlier make it possible for novel cross-enterprise collaboration, which will revolutionize the way businesses work together.

We saw the handwriting on the wall way back in 2006, so we designed our product architecture to thrive in this new environment. Our customers have benefitted from the notably less time and effort needed to connect their applications, no matter where these solutions are hosted. For developers, SnapLogic makes it easier than ever to create new and inventive mashups, using the same standards and APIs that they're already familiar with. Since they're exposed as effortlessly invoked RESTful services, SnapLogic processes can be seamlessly integrated with the rest of the enterprise's infrastructure. By breaking down these barriers, we've made it much easier to interoperate with partners, suppliers, and customers than ever before. The Business Internet is real today – where is your organization taking it tomorrow?


While applications, users, and data continue to float into the Cloud, there’s a boisterous argument underway back on the ground among the architects, designers, and software developers who make all of this magic possible. At its core, the debate centers on the proper technique to use when communicating with services that reside in the Cloud. On one side of the fence are adherents of SOAP-based services. They’re facing off against fans of the Representational State Transfer (REST) style of interoperability. In many ways, though, this is a false dichotomy: the same core service logic can be exposed for concurrent access via many different protocols, including SOAP, REST, JMS, plain XML, and JDBC to name just a few. However, if you’re creating a technology platform for the mass market, you’ll need to select one as your native protocol.

Way back in 2006 we decided that the relatively new REST approach offered distinct advantages over SOAP. Before I tell you why, it’s important to remember that while SnapLogic is built on REST, it’s perfectly capable of interacting with SOAP services (and many other protocols as well).

SOAP is all about servers talking to servers, with rigid standards, extensive design, serious programming, and heavyweight infrastructure all essential parts of the equation. If you’re building a mission-critical distributed application that will spend its life behind your corporate firewall and be serviced by a squadron of highly trained developers who know your environment inside and out, SOAP is a great choice.

Made for the Cloud

On the other hand, if you’re interested in building your applications quickly and with maximum portability – especially if the Cloud (public, private, or hybrid) is in the picture – it’s hard to beat REST. It sports a mere handful of simple HTTP API commands, and every object (known as a ‘resource’) has its own unique Uniform Resource Identifier (URI) that provides a path and distinct name. This should be very familiar to you: every time you access a specific Web page, you tell your browser where the page is hosted along with its name.
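That handful of verbs plus a URI per resource is the whole programming model. Here's a toy, in-process sketch of the idea (not SnapLogic's actual API; every endpoint and record is hypothetical):

```python
# REST's uniform interface in miniature: a small, fixed set of verbs
# applied to resources, each named by its own URI.

store = {}  # URI -> resource representation


def handle(verb, uri, body=None):
    # Dispatch on the verb alone; the URI fully identifies the resource.
    if verb == "PUT":
        store[uri] = body
        return 200, body
    if verb == "GET":
        return (200, store[uri]) if uri in store else (404, None)
    if verb == "DELETE":
        return (200, store.pop(uri)) if uri in store else (404, None)
    return 405, None  # verb outside the uniform interface


# Every resource gets its own URI, just like a web page.
handle("PUT", "/customers/42", {"name": "Acme"})
status, body = handle("GET", "/customers/42")
print(status, body)  # 200 {'name': 'Acme'}
```

A new client needs to learn nothing service-specific beyond the URIs themselves; the verbs behave the same way everywhere.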

Developer-Friendly, Secure & Scalable

This straightforward API and clear, consistent labeling philosophy is far more developer-friendly than SOAP, which mandates deep understanding of site-specific APIs. REST lets you publish your data and have others – regardless of where they might be – work with it. Just looking at the URI gives you an indication of how to proceed. And despite this simplicity, basing SnapLogic on REST gave us the same security and massive scalability as the Web itself.

Uniform Interface

Critics of REST might argue that generalizing the component interface limits the capabilities of the system. These critics fail to see the great power in the simplicity of a uniform interface, especially at larger scales.

Aaron Skonnard explains this beautifully using a LEGO® metaphor. If you’ve played with LEGO® products before, you know there are only a few ways to connect them together, which represents the LEGO® uniform interface. All LEGO® products, regardless of when or where you buy them, can connect with one another through its uniform interface. When new LEGO® products are released, they can be incorporated into existing LEGO® systems and the user doesn’t have to learn anything new. The new components work just like all other LEGO® products. You might wonder if the limited number of connection possibilities will constrain what you’re able to build with LEGO® products. But if you’ve ever been to LEGOLAND®, you’ll know it doesn’t. You’ll find some incredibly complex objects at LEGOLAND® – including replicas of cars, buildings, even dinosaurs – which were all built from a variety of LEGO® products, connected through the LEGO® uniform interface.

For those of you who read my previous post on the Science Behind Snaps, these concepts should sound familiar: Snaps have a uniform interface, and they interoperate with each other because they follow the same pattern, use the same API, and leverage the same underlying infrastructure supplied by the SnapLogic integration platform.
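The LEGO idea translates directly into code: give every component the same interface, and composition becomes trivial. A hypothetical sketch (this is an illustration of the pattern, not the real Snap SDK; all class and method names are invented):

```python
# Uniform component interface: every component exposes the same process()
# method, so any two components snap together in any order.

class UpperCase:
    def process(self, records):
        # Normalize each record to upper case.
        return [r.upper() for r in records]


class Dedupe:
    def process(self, records):
        # Drop duplicates while preserving order.
        seen, out = set(), []
        for r in records:
            if r not in seen:
                seen.add(r)
                out.append(r)
        return out


def run_pipeline(components, records):
    # Because the interface is uniform, running a pipeline is just
    # feeding each component's output into the next.
    for component in components:
        records = component.process(records)
    return records


result = run_pipeline([UpperCase(), Dedupe()], ["a", "b", "a"])
print(result)  # ['A', 'B']
```

A new component added tomorrow plugs into existing pipelines without the user learning anything new, which is exactly the LEGO property described above.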

Our Bet Has Paid Off

I'm happy to report that the bet we made in 2006 has paid off. The market has spoken: RESTful services are much more popular than SOAP-based services, particularly in organizations that have 'grown up' using the Internet. It's a self-reinforcing loop: REST lets you easily integrate the hundreds of data sources already available on the Web, and more arrive each day. To work with them, you only need to know one protocol, a basic naming convention, and a few simple API commands.

How about you? What experiences have you had with REST or SOAP?