Case study: Connecting the dots at Box

“Data needs to be delivered to a user in the right place at the right time in the right volume.”

Those are the words of veteran SnapLogic user Alan Leung, Senior Enterprise Systems Program Manager at Box. In this case study, Alan explains why a cloud-first analytics ecosystem with self-service integration is the right solution for many enterprise companies. Just as Box is on a mission to improve and innovate cloud-based file storage, internally the company has moved toward a cloud-centric infrastructure that benefits from a cloud-based integration platform.

Read the full case study here or take a look at some highlights below:

  • Overall problem: Box needed to more efficiently integrate cloud-based applications, including Salesforce, Zuora, NetSuite, and Tableau, all of which they relied on for daily operations.
  • Challenges: The primary challenge was the APIs – each application’s integration endpoints behaved differently, limiting the team’s ability to build useful connections quickly and resulting in a series of disjointed silos. Manual upload and download processes strained resources and wasted time and effort.
  • Goal: To satisfy the need to aggregate the business data piling up in various applications into a cloud-based warehouse to enable self-service, predictive analytics.
  • Solution needed: A cloud-based integration platform that vastly reduced or eliminated the time-consuming manual processes the users faced.
  • Solution found: With the SnapLogic Elastic Integration Cloud, Alan and his team benefitted from:
    • A platform that did not require sophisticated technical skills
    • The agility to enable quick and efficient integration projects
    • The ability to handle both structured and unstructured data at speed
    • An enhanced ability to quickly analyze and make sense of so much data, allowing the company to “rapidly pivot [our] operations to seize opportunity across every aspects of the business.”

For a quick snapshot, Box currently has 23 applications connected through the platform, resulting in 170 data pipelines processing over 15 billion transactions daily. Box has also eliminated the need to build a single interface internally, and an ongoing benefit of the partnership with SnapLogic is that new Snaps are continually being created and can be put to work for future integration needs.

Learn more about all of our customers here, and stay tuned for more customer stories.

Enterprise iPaaS and API Management

It’s been said we have entered “the API Economy.” As our partner 3scale describes it: “As web-enabled software becomes the standard for business processes, the ways organizations, partners and customers interface with it have become a critical differentiator in the market place.”

In this post I’ll summarize SnapLogic’s native API management capabilities and introduce the best-in-class partnerships we announced today with 3scale and Restlet to extend the cloud and big data integration capabilities that SnapLogic’s Elastic Integration Platform delivers. In the next set of posts, we’ll provide a deeper overview of our API Management partners.

Exposing SnapLogic Pipelines as APIs
SnapLogic’s unified integration platform as a service (iPaaS) allows citizen integrators and developers to build multi-point integration pipelines that connect cloud and on-premises applications as well as disparate enterprise data sources for big data analytics, and expose them as RESTful APIs. These APIs can be invoked by any authorized user, application, web backend or mobile app through a simple and standardized HTTP call, in order to trigger the execution of the SnapLogic pipeline. Here are two options for exposing SnapLogic pipelines as Tasks:

  • Triggered Tasks: Each Task exposes a cloud URL and, if it is to be executed on an on-premises Snaplex (aka Groundplex), it may also have an on-premises URL. These dataflow pipelines may be invoked using REST GET/POST, passing parameters and optionally a payload in and out. These pipelines use SnapLogic’s standard HTTP basic auth authorization scheme.
  • Ultra Pipelines: These Tasks are “always on”: the pipeline is memory-resident, ready to process incoming requests with millisecond-level invocation overhead, which makes Ultra Pipelines ideal for real-time integration processing. For authentication, an Ultra Pipeline may be assigned a “bearer token,” an arbitrary string that must be passed in the HTTP headers of the invoking HTTP/S request. The token is optional, allowing customers to invoke a pipeline with no authentication, which may be appropriate on internal trusted networks. Both invocation styles are sketched in the example below.
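To make the invocation model concrete, here is a minimal sketch of calling both Task types with Python’s requests library. The URLs, credentials, parameters and token below are placeholders for illustration only, not real endpoints.

```python
import requests

# Triggered Task: POST to the Task's cloud (or Groundplex) URL using HTTP basic auth,
# optionally passing pipeline parameters and a payload.
TRIGGERED_URL = "https://example.snaplogic.com/api/1/rest/feed/MyOrg/Demo/LoadAccountsTask"  # placeholder
resp = requests.post(
    TRIGGERED_URL,
    auth=("integration_user@example.com", "s3cret"),   # basic auth credentials (placeholder)
    params={"region": "EMEA"},                         # pipeline parameters
    json={"accounts": [{"id": 1, "name": "Acme"}]},    # optional payload in
)
resp.raise_for_status()
print(resp.json())                                     # optional payload out

# Ultra Pipeline: always-on endpoint; pass the optional bearer token in the HTTP headers.
ULTRA_URL = "https://example.snaplogic.com/api/1/rest/feed/MyOrg/Demo/ScoreLeadUltra"  # placeholder
token = "example-bearer-token"                         # arbitrary string assigned to the Task
ultra_resp = requests.post(
    ULTRA_URL,
    headers={"Authorization": f"Bearer {token}"},
    json={"lead_id": 42},
)
print(ultra_resp.status_code, ultra_resp.json())
```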

SnapLogic Platform APIs
SnapLogic also provides an expanding set of APIs for customer systems to interact with our elastic iPaaS. Examples include:

  • User and Group APIs: Programmatically manage and automate the creation of SnapLogic users and groups.
  • Pipeline Monitoring APIs: Determine pipeline execution status. A common use case is an external enterprise scheduler monitoring your pipelines; a polling sketch follows this list.
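As an illustration of the scheduler use case, the snippet below polls a pipeline-monitoring endpoint until the run reaches a terminal state. The endpoint path, response field and state names are assumptions made for the sketch, not the documented SnapLogic API.

```python
import time
import requests

MONITOR_URL = "https://example.snaplogic.com/api/1/rest/runtime/MyOrg"   # hypothetical endpoint

def wait_for_run(run_id: str, auth: tuple, timeout_s: int = 600, poll_s: int = 15) -> str:
    """Poll the (hypothetical) monitoring endpoint until the run finishes or we time out."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        resp = requests.get(f"{MONITOR_URL}/{run_id}", auth=auth)
        resp.raise_for_status()
        state = resp.json().get("state", "Unknown")   # assumed field: Started / Completed / Failed
        if state in ("Completed", "Failed", "Stopped"):
            return state
        time.sleep(poll_s)                            # back off between polls
    raise TimeoutError(f"Pipeline run {run_id} did not finish within {timeout_s} seconds")

# Example usage with placeholder credentials:
# print(wait_for_run("run-123", ("scheduler@example.com", "s3cret")))
```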

SnapLogic Snaps Consuming Application APIs
SnapLogic Snaps are the intelligent, introspecting, dynamic connectors that provide the building blocks for pipelines. Snaps interface with applications through the APIs those applications provide. As an application’s APIs evolve, SnapLogic takes care of keeping the Snaps up to date, allowing our customers to take care of the business of their business, not the business of API-level integration.

SnapLogic provides a comprehensive set of 300+ pre-built Snaps, but no collection of connectors will ever be complete. To address the expanding cloud and on-premises application and data integration needs of our customers, SnapLogic provides an SDK that lets customers and partners create their own Snaps for custom endpoints or custom transformations.

Best-in-Class API Management Partnerships
From API authoring and authentication to governance and reporting to protocol translation, our enterprise ISV partners extend SnapLogic’s cloud and big data integration capabilities with best-in-class API Management capabilities. From our partners:

“Making it easy to share digital assets is at the core of what we do. Being able to offer our customers this seamless way to expose their data and application and integration pipelines as RESTful APIs is another way to make that happen.”

– Manfred Bortenschlager, API Market Development Director at 3scale

 

“With over 300 pre-built connectors available in SnapLogic’s Elastic Integration Platform, this partnership enables virtually any data sources, from legacy to cloud, big data and social platforms, to be exposed as RESTful APIs.”

– Jerome Louvel, Chief Geek and Founder at Restlet

Read the press releases to learn more about our API Management partners, and Contact Us for more information.

Webinar: It’s the 21st Century – Why Isn’t Your Data Integration Loosely Coupled?

“The problem with traditional connectors is that they are tightly coupled – any change in the data format or interface requirements for either end of any interaction would require an update of the connector, at the risk of a failed interaction.”

– Jason Bloomberg, President, Intellyx

Join us next Tuesday, May 19th, for an interactive webinar with digital transformation and SOA thought leader Jason Bloomberg. In this webinar, Jason will explain how connectors have been a staple enterprise application integration (EAI) tool since the dawn of EAI in the 1990s, and how the rise of SOA and Web Services was in part intended to resolve the limitations of such traditional connectors but often falls short. Additional topics covered will include:

  • A discussion of the age-old problem of implementing loosely coupled data integration
  • An architectural approach to solving this difficult problem
  • A demonstration of SnapLogic’s approach to solving the data integration challenge in a scalable and cloud-friendly manner that aligns with modern application architectures

Before joining the webinar next week, you can also review last week’s Spring 2015 release and learn a little more about Jason Bloomberg here:

Jason Bloomberg is the leading industry analyst and expert on achieving agile digital transformation by architecting business agility in the enterprise. He writes for Forbes, Wired, and his biweekly newsletter, the Cortex. As president of Intellyx, he advises business executives on their digital transformation initiatives, trains architecture teams on Agile Architecture, and helps technology vendors and service providers communicate their agility stories. His latest book is The Agile Architecture Revolution.

A few of our past blog posts also address some of the topics we’ll be diving into next week. Check them out:

Register for the webinar here – we look forward to next week’s interactive discussion.

iPaaS Requirements: Single Platform for Data, Application and API Integration

In this series of posts I’m reviewing 6 essential integration platform as a service (iPaaS) ingredients, as outlined in this post: What to Look for in a Modern Integration Platform. I covered why it’s important that the iPaaS be a fully-functional cloud-based service (based on a software-defined architecture) here. In this post I’ll cover the requirement for a single platform for data, apps and API integration.

We’ve written a lot about the so-called Integrator’s Dilemma faced by enterprise IT organizations whose lightweight SaaS apps are being weighed down by heavyweight data and application integration tools that were built well before the era of big data, social, mobile and cloud computing (SMAC). One CIO I spoke with recently referred to his legacy technologies for connecting SaaS and on-premises applications with each other (and with all of his company’s disparate data sources) as “boat-anchor integration.” The good news is that innovation is back in the integration market and, as sure as death and taxes, disparate silos of integration will converge in the cloud under the more agile iPaaS umbrella. At SnapLogic, we’re talking to customers and prospects about the benefits of an integration platform that is software-defined and built from the ground up to handle both batch and streaming requirements. Yes, these can be very different use cases:

  • Batch data integration is typically about getting data in and out of a SaaS app (initial load or scheduled replication) for analytical requirements. The industry has historically called this capability extraction, transformation and load (ETL), which extends to data cleansing and quality as well as master data management (MDM). The relevance of traditional ETL approaches is now being questioned as Hadoop deployments grow, because legacy tools in this category were built to handle rows, columns and structured data integration requirements. A few months ago we ran a popular webinar with Dave Linthicum called Beyond Batch: Is ETL Dead? At the recent Hadoop Summit, the relationship between the data warehouse, SQL and Hadoop was one of the hottest topics. Check out the tweet stream or headlines like this one: Vendors fret as Hadoop encroaches on lucrative data warehouse business.
  • Real-time streaming integration is typically for operational rather than analytical use cases. Common SaaS application integration use cases include connecting front-office apps like Salesforce and Workday with back-office financial systems like SAP, Oracle EBS and NetSuite. Typically event-based, SaaS application integration continues to require features like Guaranteed Delivery, but the need to adapt quickly to frequent changes, API updates and the demand for end-user self-service has made legacy approaches inadequate. XML and SOAP-based integration tools in this category were designed to handle only small data – high-frequency, low-latency messages. We’ve written about why the heavyweight enterprise service bus (ESB) will become less and less relevant for today’s agile iPaaS requirements in this whitepaper – Thoughts on ESBs: Why Buses Don’t Fly in the Cloud. You can also read about the technical advantages of a JSON-centric iPaaS here. A simple contrast between the batch and streaming styles is sketched below.
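For a rough sense of how different these two styles are in practice, here is a small, generic sketch contrasting batch and record-at-a-time (streaming) processing of JSON documents. The transform, data and field names are illustrative assumptions only and have nothing to do with SnapLogic internals.

```python
import json
from typing import Iterable, Iterator

def transform(doc: dict) -> dict:
    # Document-centric transforms can enrich a record without caring about extra fields.
    doc["full_name"] = f"{doc.get('first', '')} {doc.get('last', '')}".strip()
    return doc

def batch_load(docs: list) -> list:
    """Batch style: materialize the whole data set, transform it, then load it in one pass."""
    return [transform(d) for d in docs]

def stream_process(lines: Iterable[str]) -> Iterator[dict]:
    """Streaming style: handle each event as it arrives, one document at a time."""
    for line in lines:
        yield transform(json.loads(line))

# Batch: e.g. a scheduled replication of every record for analytics.
print(batch_load([{"first": "Ada", "last": "Lovelace"}]))

# Streaming: e.g. events arriving as newline-delimited JSON from an operational system.
for event in stream_process(['{"first": "Grace", "last": "Hopper", "source": "webhook"}']):
    print(event)
```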

As we continue to see the convergence of historically distinct integration categories (data, application, process), we increasingly believe a single platform is the right approach, which is why it is one of the primary requirements for a modern iPaaS solution. In my next post I’ll talk about Elastic Scale. In the meantime, you can check out a demonstration of the SnapLogic Elastic Integration Platform here.

Four Critical iPaaS Requirements You Can Ignore Only At Your Own Peril (Part 1)

The concept of integration platform as a service (iPaaS), as defined by Gartner, Forrester, Ovum and other analyst firms, has gained significant attention among enterprise IT organizations facing a new set of challenges posed by rapid cloud adoption. There are a number of options to consider once you recognize that you need a cloud integration strategy, most of which cater to the classic requirements of:

  • Metadata-driven integrations
  • Drag and drop user experience
  • Pre-built connectivity (no coding necessary)
  • Management & monitoring that includes comprehensive error management
  • Transactional support
  • SOAP and REST API support
  • Data transformation and other operations
  • Hybrid deployment model

In my conversations with IT leaders over the past few months, two recurring themes are clearly driving the urgency of iPaaS in the enterprise:

  • “Cloudification”: Cloud expansion has hit a tipping point and most IT organizations are either running to keep up or trying to get ahead of the pace of this transformation.
  • Agility: Business users’ need for speed stemming from mobile, social and SaaS expectations.

As a result, four new iPaaS requirements have arisen:

  1. Resiliency
  2. Fluidity in hybrid deployments
  3. Minimal lifecycle management of the platform
  4. Future-proofing for the world of social, mobile, analytics, cloud, and internet of things (SMACT)

In this series of posts, I’ll cover each one of these requirements in detail while highlighting the importance of each, starting with Resiliency today.

Resiliency
Changing business requirements have always been the norm; the expectation that IT respond to them in real time is much more recent. More often than not, these changes result in data changes that end up impacting the integration layer. The most common changes on the application, data and API side are additive, where a new column is added to a table or a new field to an API to record or deliver additional information.

iPaaS technologies built in the last decade are strongly typed. This means the integration developer must define, at design time, the exact data structures that will pass through the integration flow. Any departure from this structure results in the integration layer breaking down. If you want to see what I mean, run a simple file-to-file integration using a product built in the 2000s. Now edit the source file, add a couple more unexpected fields, and run the flow again. The integration will fail because it cannot recognize the additional fields. This brittle integration layer can bring your business to its knees before you know it. You should therefore expect your iPaaS to be resilient enough to take such updates and variations in stride, as illustrated in the sketch below. Of course, there are going to be situations where you want to enforce strong typing; a good iPaaS provider will include a data validation step just for that. But the default way of handling such changes should certainly be a lot more resilient.
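To illustrate the difference, here is a small, generic sketch (not SnapLogic code) contrasting a strictly typed mapping, which breaks as soon as an unexpected field appears, with a resilient mapping that lets additive changes pass through. The schema and record are placeholders.

```python
EXPECTED_FIELDS = {"id", "name", "email"}   # the schema the flow was originally designed for

def strict_copy(record: dict) -> dict:
    """Strongly typed style: any field outside the declared schema is treated as an error."""
    unexpected = set(record) - EXPECTED_FIELDS
    if unexpected:
        raise ValueError(f"Integration failed: unexpected fields {sorted(unexpected)}")
    return {k: record[k] for k in EXPECTED_FIELDS if k in record}

def resilient_copy(record: dict) -> dict:
    """Resilient style: map the fields you know about and pass everything else through untouched."""
    return dict(record)

# A source system added a new "region" field (an additive change).
new_record = {"id": 1, "name": "Acme", "email": "ops@acme.example", "region": "EMEA"}

print(resilient_copy(new_record))     # succeeds; the extra field flows through
try:
    strict_copy(new_record)
except ValueError as err:
    print(err)                        # the brittle flow breaks as soon as the schema drifts
```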

By adopting a modern iPaaS, customers can negotiate these new requirements gracefully. There are tremendous cost-saving and business-risk-containment benefits in avoiding the downtime that results from changing APIs and data structures. A resilient iPaaS will keep your business running even when unexpected changes occur, and a high degree of integration agility equates to greater business agility.

In my next post, I’ll talk about Fluidity in a Hybrid Architecture. You may also want to check out how the SnapLogic Integration Cloud handles the broader set of iPaaS requirements in earlier posts and in this technical whitepaper.

Spring into Cloud Integration with SnapLogic

Today SnapLogic announced our Spring 2014 release, which is codenamed “Stallion” for a number of reasons:

  • First and foremost, because it’s focused on making our Elastic Integration platform as a service (iPaaS) even faster.
  • Secondly, because of our continued focus on delivering a single, best-of-breed cloud integration service that can solve the Integrator’s Dilemma by handling diverse requirements: cloud-to-cloud, cloud-to-ground, data integration, application integration, etc.
  • And finally, because it’s the Year of the Horse. Happy Chinese New Year!

SnapLogic Integration Cloud Spring 2014

The SnapLogic Integration Cloud announcement includes elements of our January release and updates that the team is planning to deliver in March. The key innovations that we’ve highlighted here are:

  • Built-in API Management and Development
  • New Performance Monitoring Dashboard
  • Powerful Data Integration for Cloud Analytics
  • Enhanced User Experience for “Citizen Integrators”
  • New and Updated Snaps on the SnapStore
  • Enterprise-Ready Cloud Integration for Hybrid IT

Want to learn more? Be sure to register for the webinar. You can read the press release here and find out what some of our partners are saying about the latest release here. I’ve also embedded a demonstration of the SnapLogic Integration Cloud in action below. Contact Us if you’d like to dive deeper.