Summer Release 2017: Focus on operational efficiency and artificial intelligence

By Dinesh Chandrasekhar

SnapLogic Enterprise Integration Cloud’s Summer 2017 Release (v4.10) is generally available to customers today. We are excited to share many new features and enhancements built around three themes: operational efficiency for digital transformation, continuing innovation in the use of Iris artificial intelligence, and added support for cloud data warehousing and big data. While we stay focused on offering one of the finest and easiest user experiences in the iPaaS space, we wanted to extend that experience to the operational side as well.

A few key highlights from this release that focus on improving operational efficiencies and the use of Artificial Intelligence:

  • AI-Powered Pipeline Optimization: SnapLogic’s Iris technology now evaluates pipeline configuration and operation to recommend specific performance improvements based on machine learning, delivered to users via dashboard alerts and notifications.
  • Local Snaplex Dashboard: Ultra Pipelines are used by our customers in mission-critical use cases that demand uninterrupted data flow across multiple data sources. We have now introduced a local dashboard that provides full visibility into Ultra Pipelines, giving customers increased confidence that their high-priority projects are running without interruption.
  • API Threshold Notifications: In a similar vein, we are adding monitoring capabilities focused on operations and project development. Real-time alerts and notifications give customers an early warning system for complex integration projects. Alerts can be set up to notify stakeholders when API limits are approached or exceeded, when users are added to or removed from the platform, and when projects are modified.
  • Project-level Migration: We’ve been listening to your requests! With the fast pace at which integration projects are being executed and delivered across the enterprise, it is imperative that project lifecycles not become a bottleneck. We’ve introduced a new feature that automates the migration of projects across departments and organizations, eliminating hand-coding and hard-coding and resulting in seamless project migrations.
  • Public API for Groundplex Installation: A new API is now available to automate the installation and configuration of on-premises SnapLogic nodes, eliminating the time and effort of manually downloading and installing additional nodes (see the sketch below).
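
For readers who want a feel for what such automation could look like, here is a minimal, hypothetical sketch in Python. The endpoint path, payload fields, and authentication scheme are illustrative assumptions, not SnapLogic’s documented API; consult the official documentation for the actual calls.

```python
# Hypothetical sketch: automating Groundplex node setup via a public API.
# The endpoint, payload fields, and token handling below are illustrative
# assumptions, not SnapLogic's documented API.
import requests

API_BASE = "https://elastic.snaplogic.com/api/1/rest"  # assumed base URL
TOKEN = "YOUR_API_TOKEN"                               # assumed auth scheme

def register_groundplex_node(org: str, plex_name: str) -> dict:
    """Request a node configuration for an on-premises Snaplex (illustrative)."""
    resp = requests.post(
        f"{API_BASE}/plex/{org}/{plex_name}/nodes",    # hypothetical path
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"node_type": "jcc"},                     # hypothetical payload
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    config = register_groundplex_node("my-org", "prod-groundplex")
    print("Node registered:", config)
```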

A few key highlights from this release that focus on enhancing our already comprehensive integration capabilities into cloud data warehousing and big data:

  • Microsoft Azure SQL Snap Pack: A brand new Snap Pack that allows customers to quickly and easily migrate on-premises databases to Microsoft Azure SQL Database for fast cloud-based reporting and analytics.
  • Teradata Select Snap: Expands the Teradata Snap Pack, allowing users to retrieve data from a Teradata database and display it in a table for easy reporting.
  • Parquet Writer Snap: Allows users to store data in specific partitions in Hadoop HDFS, improving the ability to create reports and derive meaningful insights from big data.
  • Parquet Reader/Writer Snaps: Allow users to read and write Parquet files on Amazon S3, with support for AWS Identity & Access Management (IAM), in addition to HDFS, expanding SnapLogic’s support for cloud-based data warehousing via Redshift (see the sketch following this list).
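
To make the Parquet bullets above concrete, here is a minimal sketch of partitioned Parquet output using the open-source pyarrow library. This is not the Snap itself, just an illustration of the partitioning concept the Parquet Writer Snap applies to HDFS and S3 targets; the column names and values are made up.

```python
# Illustrative sketch (not the Snap itself): writing partitioned Parquet
# data, the concept behind the Parquet Writer Snap's HDFS/S3 output.
import pyarrow as pa
import pyarrow.parquet as pq

# Sample records; "region" becomes the partition column.
table = pa.table({
    "region": ["us-east", "us-west", "us-east"],
    "sales":  [1200, 950, 830],
})

# Writes one directory per region value, e.g. out/region=us-east/...
pq.write_to_dataset(table, root_path="out", partition_cols=["region"])
```

Each distinct region value becomes its own subdirectory (e.g., out/region=us-east/), which is what makes partition-pruned queries and targeted reporting fast.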

Watch the Summer 2017 Release webinar to learn about all the features and enhancements across our Snaps, IoT, CRM, and more.

Dinesh Chandrasekhar is Director of Product Marketing at SnapLogic. Follow him on Twitter @AppInt4All.

SnapLogic seeks partners to help drive LATAM growth

By Carlos Hernandez Saca

I am very excited to announce that SnapLogic is now seeking partners in Latin America (LATAM), including Mexico, Brazil, Argentina, Colombia, and Chile, to support our global expansion. This news follows our successful launch of SnapLogic’s global/US partner program, SnapLogic Partner Connect, just over a year ago.

We’ve got momentum. We’ve seen first-hand that SnapLogic dramatically transforms how companies integrate data, applications, and devices for digital business. In December 2016, SnapLogic secured an additional $40 million in funding to accelerate its global expansion. And we see the channel as an important driver of this expansion.

We’re making this call for partners to help us build a rich ecosystem of technology, consulting, and reseller partners in the LATAM market so that joint customers see the benefits of adopting software-as-a-service (SaaS) applications, cloud data management, and big data technologies. SnapLogic’s technology enables this by accelerating integration and driving digital transformation in the enterprise.

The SnapLogic Partner Connect program includes several categories, reflecting the breadth of the SnapLogic platform and depth of adjacent partner solutions. Partner types include:

  • Technology Partners – These are providers of technologies with which SnapLogic integrates, and include partners such as Salesforce, Workday, AWS, Microsoft, Cloudera, and Snowflake.
  • OEM/Managed Services Partners – OEM partners integrate SnapLogic into their product or service offerings. They include Manthan, Reltio, Planview, and Verizon.
  • Global Systems Integrator Partners – SnapLogic’s GSI partners implement SnapLogic as part of comprehensive digital transformation projects at some of the world’s leading enterprises. These partners include firms such as Accenture, PwC, Cognizant, HCL, Tata Consultancy Services, and Tech Mahindra.
  • Consulting Partners – These are organizations that provide integration consulting and best-practices services, and include partners such as Interworks, Softtek, and Semantix Brazil.
  • Reseller Partners – The roster of SnapLogic international resellers is growing rapidly, increasing SnapLogic’s presence in new markets worldwide. Resellers include MatrixIT in Israel, KPC in France, Sogeti in the United Kingdom, Systemation in the Netherlands, Softtek in Mexico, and DRZ and Semantix in Brazil.

SnapLogic CEO Gaurav Dhillon says, “Extending our partners program into the LATAM channel is a crucial part of our business growth strategy. Working alongside strategic channel partners will provide an exciting opportunity to reach new LATAM customers and help them realize their digital transformation goals through connected data and applications.”

And I agree. The market opportunity for data, application, and device integration is large and growing, representing a unique opportunity for our partners in LATAM. As customers across the region increasingly use our modern, cloud-first platform to integrate and manage data across cloud, big data, IoT, mobile, social, and more, we look forward to working with new partners in LATAM who can help accelerate their digital transformation efforts.

SnapLogic’s self-service integration platform was recognized as a Leader in Gartner’s 2017 Magic Quadrant for Enterprise Integration Platform as a Service (iPaaS). Global enterprise customers such as Adobe, AstraZeneca, Box, Capital One, GameStop, Verizon and Wendy’s rely on SnapLogic to automate business processes, accelerate analytics and drive digital transformation. If you’re interested in partnering with us, we want to hear from you.

Carlos Hernandez Saca is the Sales & Partnership Area Director for Latin America at SnapLogic. You can follow him on Twitter @rivaldo71.

Our live demo series is back

By Rich Dill

By popular demand, we’re bringing back a regular demo series where you get a first-hand look at SnapLogic’s Enterprise Integration Cloud in action, hear from a technology expert, and have your questions answered … live.

If you want to learn more about what SnapLogic can bring to your application and data integration plans, this demo is for you. Join me Wednesday, August 9, from 10:30 – 11:00 AM PT for the kickoff of our Live Demo Series.

In this month’s demo, I will highlight why the Enterprise Integration Cloud is the best option for quickly connecting and efficiently managing legacy and modern data, cloud application, and API integrations.

I will be providing an overview of iPaaS and SnapLogic along with a detailed look inside the industry’s first AI-powered integration technology, Iris. I will also highlight how to connect an application to a database in minutes.

Each month we will showcase our industry-leading integration cloud, so if you can’t attend this demo, future sessions will cover different themes.

Register now for the SnapLogic Live Demo Series

Rich Dill is Enterprise Solutions Architect at SnapLogic. You can follow him on Twitter @richdill.

Get your game plan on: Data warehouse migration to the cloud

By Ravi Dharnikota

You’ve decided to move your data warehouse to the cloud, and want to get started. Great! It’s easy to see why – in addition to the core benefits I wrote about in my last blog post, there are many more benefits associated with cloud data warehousing: incredibly fast processing, speedy deployment, built-in fault tolerance and disaster recovery and, depending on your cloud provider, strong security and governance.

A six-step reality check

But before you get too excited, it’s time for a reality check: moving an existing data warehouse to the cloud is not quick, and it isn’t easy. It is definitely not as simple as exporting data from one platform and loading it into another. Data is only one of the six warehouse components to be migrated.

Tactically and technically, data warehouse migration is an iterative process that involves many steps to migrate all of the components, as illustrated below. Here’s everything you need to consider when migrating your data warehouse to the cloud.

1) Migrating schema: Before moving warehouse data, you’ll need to migrate table structures and specifications. You may also need to make structural changes as part of the migration; for example, do indexing and partitioning schemes need to be rethought for the new platform?

[Figure: Data warehouse migration process]
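
As a rough illustration of step 1, the sketch below translates column types from a source platform to a target platform with a simple lookup table. The type mappings and table definition are illustrative assumptions, not a complete conversion chart for any particular pair of databases.

```python
# Minimal sketch: translating column types when migrating a table definition
# from one warehouse platform to another. The mapping below is illustrative,
# not an authoritative type conversion chart.
TYPE_MAP = {
    "BYTEINT": "SMALLINT",   # e.g., a source type with no direct equivalent
    "NUMBER":  "NUMERIC",
    "CLOB":    "VARCHAR(65535)",
}

def translate_column(name: str, src_type: str) -> str:
    """Return a target-platform column definition for one source column."""
    return f"{name} {TYPE_MAP.get(src_type.upper(), src_type)}"

columns = [("customer_id", "NUMBER"), ("notes", "CLOB"), ("flag", "BYTEINT")]
ddl = "CREATE TABLE customers (\n  " + ",\n  ".join(
    translate_column(n, t) for n, t in columns) + "\n);"
print(ddl)
```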

2) Migrating data: Moving very large volumes of data is process intensive, network intensive, and time-consuming. You’ll need to map out how long it will take to migrate and whether you can accelerate that process. If you restructured the schema during migration, you may also need to transform data as part of the data migration. Can you transform in-stream, or should you pre-process and then migrate?
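
Mapping out the timeline can start with simple arithmetic. The back-of-the-envelope sketch below estimates transfer time from data volume and sustained network throughput; the figures are placeholders, and a real migration also pays for extraction, validation, and retries.

```python
# Back-of-the-envelope sketch: estimating how long a bulk data migration
# might take. Figures are placeholders; substitute your own measurements.
data_tb = 20                     # total data volume to move, in terabytes
effective_mbps = 400             # sustained throughput in megabits/second

data_megabits = data_tb * 1_000_000 * 8   # TB -> megabits (decimal units)
hours = data_megabits / effective_mbps / 3600
print(f"Estimated transfer time: {hours:.1f} hours ({hours/24:.1f} days)")
```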

3) Migrating ETL: Moving data may be the easy part compared to migrating ETL processes. You may need to change the code base to optimize for platform performance, and change data transformations to sync with data restructuring. You’ll need to determine whether data flows should remain intact or be reorganized. As part of the migration, you may need to reduce data latency and deliver near real-time data; if so, would it make sense to migrate ETL processing to the cloud as well? Is there a utility to convert your ETL code?

4) Rebuilding data pipelines: With any substantive change to data flow or data transformation, rebuilding data pipelines may be a better choice than migrating existing ETL. You may be able to isolate individual data transformations and package them as executable modules. You’ll need to understand the dependencies among data transformations to construct an optimal workflow, and to weigh the advantages you may gain – performance, agility, reusability, and maintainability – by rebuilding ETL as modular data pipelines using modern, cloud-friendly technology.
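
The sketch below shows the modular idea in miniature: each transformation is an isolated, reusable step, and the pipeline is just an ordered composition of steps. The function names and record shapes are illustrative.

```python
# Minimal sketch: repackaging ETL logic as small, composable transformation
# steps, per the modular-pipeline approach described above.
from typing import Callable, Iterable

Record = dict
Step = Callable[[Record], Record]

def normalize_dates(rec: Record) -> Record:
    rec["order_date"] = rec["order_date"].replace("/", "-")
    return rec

def add_revenue(rec: Record) -> Record:
    rec["revenue"] = rec["quantity"] * rec["unit_price"]
    return rec

def run_pipeline(records: Iterable[Record], steps: list[Step]) -> list[Record]:
    """Apply each independent transformation step in order."""
    out = []
    for rec in records:
        for step in steps:
            rec = step(rec)
        out.append(rec)
    return out

orders = [{"order_date": "2017/08/01", "quantity": 3, "unit_price": 9.99}]
print(run_pipeline(orders, [normalize_dates, add_revenue]))
```

Because each step is independent, steps can be reordered, unit-tested, and reused across pipelines, which is where the agility and maintainability gains come from.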

5) Migrating metadata: Source-to-target metadata is a crucial part of managing a data warehouse; knowing data lineage is critical for tracing and troubleshooting when problems occur. How readily will this metadata transfer to a new cloud platform? Are all of the mappings, transform logic, dataflows, and workflows locked in proprietary tools or buried in SQL code? You’ll need to determine whether you can export and import the metadata directly, reverse engineer it, or rebuild it from scratch.
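
One way to avoid the lock-in problem is to capture source-to-target mappings in a portable, tool-neutral format. The sketch below records lineage as plain JSON; the field names and mappings are illustrative.

```python
# Minimal sketch: capturing source-to-target mappings in a portable format
# (plain JSON) so lineage metadata isn't locked inside a proprietary tool.
import json

mappings = [
    {
        "source": "legacy_dw.sales.order_amt",
        "target": "cloud_dw.sales.order_amount",
        "transform": "CAST(order_amt AS NUMERIC(12,2))",
    },
    {
        "source": "legacy_dw.sales.cust_no",
        "target": "cloud_dw.sales.customer_id",
        "transform": None,  # straight copy
    },
]

# Exported metadata can be versioned, diffed, and re-imported on the new platform.
with open("lineage_mappings.json", "w") as f:
    json.dump(mappings, f, indent=2)
```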

6) Migrating users and applications: The final step in the process is migrating users and applications to the new cloud data warehouse without interrupting business operations. Security and access authorizations may need to be created or changed, and BI and analytics tools will need to be reconnected. Plan what communication is needed, and with whom, to make the cutover smooth.

Don’t try to do everything at once

A typical enterprise data warehouse contains a large amount of data describing many business subject areas. Migrating an entire data warehouse in a single pass is usually not realistic. Incremental migration is the smart approach when “big bang” migration isn’t practical. Migrating incrementally is a must when undertaking significant design changes as part of the effort.

However, incremental migration brings new considerations. Data location should be transparent from a user point of view throughout the period when some data resides in the legacy data warehouse and some in the new cloud data warehouse. Consider a virtual layer as a point of access to decouple queries from data storage location.

A hybrid strategy is another viable option. With a hybrid approach, your on-premises data warehouse can remain operating as the cloud data warehouse comes online. During this transition phase, you’ll need to synchronize the data between the old on-premises data warehouse and the new one that’s in the cloud.

Cloud migration tools to the rescue

The good news: there are many tools and services that can be invaluable when migrating your legacy data warehouse to the cloud. In my next post, the third and final in this series, I’ll explore tools for data integration, data warehouse automation, and data virtualization, as well as system integrator resources that can speed and de-risk the process.

Learn more at SnapLogic’s upcoming webcast, “Traditional Data Warehousing is Dead: How digital enterprises are scaling their data to infinity and beyond in the Cloud,” on Wednesday, August 16 at 9:00am PT. I’ll be presenting with Dave Wells, Data Management Practice Lead, Eckerson Group, and highlighting tangible business benefits that your organization can achieve by moving your data to the cloud. You’ll learn:

      • Best practices, key technologies to consider, and case studies to get you started
      • The potential pitfalls of “cloud-washed” legacy data integration solutions
      • Cloud data warehousing market trends
      • How SnapLogic’s Enterprise Integration Cloud delivers up to a 10X improvement in the speed and ease of data integration

Sign up today!

Ravi Dharnikota is Chief Enterprise Architect at SnapLogic. Follow him on Twitter @rdharn1

Gaurav Dhillon talks disconnected data on BBC Business Matters radio show

By Scott Behles 

After his inaugural appearance on BBC Business Matters in May, SnapLogic founder and CEO, Gaurav Dhillon, was back on the airwaves with the BBC World Service radio show last week.

Broadcast around the world, BBC Business Matters reaches millions of listeners every day with expert business views from journalists, academics, and business leaders. On this occasion, Gaurav joined BBC host Fergus Nicoll and Professor Jasper Kim of Ewha University in Seoul to discuss the pressing business stories of the day, alongside the findings of SnapLogic’s recent research, The High Cost of Disconnected Data.

As Gaurav explains in the clip below, the cost of disconnected data for businesses is staggering. Large enterprises in the UK and US are wasting $140 billion a year due to missed opportunities and lost productivity. The areas hardest hit by disconnected data? Companies can’t get new products to market quickly enough, and customer experience suffers.

Even though we are, as Gaurav puts it, “living in an age of data enlightenment,” in reality, old data silo habits die hard, with disconnected systems and collaboration challenges across the enterprise still getting in the way. Innovation is the lifeblood of business, and if it’s to thrive, these data barriers need to be torn down.

You can listen to the full broadcast here. The focused discussion about disconnected data starts around the 26:30 mark.

Scott Behles is Head of Corporate Communications at SnapLogic. Follow him on Twitter @sbehles

The commoditization of integration

By Dinesh Chandrasekhar

Eight years ago, dozens of integration vendors were offering scores of solutions, all with what seemed to be the same capabilities. Pick any ESB or ETL tool, and it seemed to perform the same functions as its competitors. RFPs were no longer a viable way to weed out inferior vendors, as every solution checked all the boxes across the board. Plus, all vendors were ready to lower their prices at the drop of a hat to win your business. It was at this time that the integration market had truly reached a level of commoditization: customers could easily pick and choose any solution, as there were no true differentiators among them.

But, several factors have changed the landscape since then:

  • NoESB – The NoESB architecture began gaining interest, pushing the idea that an ESB is irrelevant for many integration scenarios. Yet an API gateway was not the right alternative either.
  • Cloudification – The cloudification of pretty much all your favorite on-premises enterprise applications began around the same time. Enterprises pursuing digital transformation couldn’t get far without a definitive cloud strategy in place.
  • Convergence of ESB and ETL – The lines between application integration and data integration were blurring. CIOs and IT managers didn’t want to deal with two different sets of integration tools. With the onset of mobile and IoT, data volumes were exploding daily, and even data warehouses moved to the cloud. Traditional, legacy ESB and ETL tools were ill-suited to serve such big data needs.
  • Agile Integrations – Finally, the DevOps and Agile movements affected enterprise integration initiatives as well, giving rise to new user personas in the enterprise: citizen integrators, or citizen developers. These are the LOB managers and other non-IT personnel who need quick integrations within their applications to render their data in different views. Reliance on IT to deliver solutions to the business was becoming a major hindrance.

All of these factors have shaped the iPaaS (Integration Platform as a Service) market. Thousands of companies are already leveraging iPaaS solutions to integrate their cloud and on-premises systems. iPaaS solutions break away from legacy approaches to integration: they are cloud-native, intuitive, fast, and self-starting, support hybrid architectures, and offer connectors to a wide range of on-premises and cloud applications.

Now comes the big question: “Will iPaaS solutions be commoditized, too?” At the moment, the answer is a definite no, and there are multiple reasons why. Beyond scale, latency, tenancy, SLAs, the number of connectors, and so on, one of the key areas that will differentiate iPaaS solutions is the developer experience. The user interface of the solution will determine the adoption rate and the value it brings to the enterprise. For a citizen integrator to actually use the system, the interface should be intuitive enough to guide them in building their integration flows quickly, effectively, and, most importantly, without the assistance of IT. This alone will make or break adoption of the system.

iPaaS vendors are trying to enhance this developer experience with features like drag-and-drop connectors, pipeline snippets, a template library, a starter kit, and mapping enhancements. However, very few vendors offer AI-driven tooling that intelligently predicts the next step of your integration flow based on learnings from hundreds of other users (a toy illustration of the idea follows). AI assistance is truly a great benefit for citizen integrators, who may be non-technical, and even technically savvy developers welcome a significant boost in their productivity. With innovations like this happening, the iPaaS space is quite far from being commoditized. However, enterprises still need to be wary of cloud-washing iPaaS vendors that offer “1000+” connectors, a thick-client IDE, or an ESB wrapped in a cloud blanket. But that is a post for a different day!
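
To give a feel for the idea (this is emphatically not SnapLogic’s Iris implementation), the toy sketch below recommends the next step in an integration flow from the frequency of step transitions observed in previously built pipelines; the pipeline step names are made up.

```python
# Toy sketch (not SnapLogic's Iris): recommending the next step in an
# integration flow from the frequency of step transitions observed in
# previously built pipelines. Step names are illustrative.
from collections import Counter, defaultdict

historical_pipelines = [
    ["salesforce_read", "mapper", "redshift_write"],
    ["salesforce_read", "mapper", "filter", "redshift_write"],
    ["oracle_read", "mapper", "s3_write"],
]

# Count how often each step follows each other step across all pipelines.
transitions = defaultdict(Counter)
for pipeline in historical_pipelines:
    for current, nxt in zip(pipeline, pipeline[1:]):
        transitions[current][nxt] += 1

def suggest_next(step):
    """Return the most frequently observed follow-on step, or None."""
    counts = transitions.get(step)
    return counts.most_common(1)[0][0] if counts else None

print(suggest_next("salesforce_read"))  # -> "mapper"
```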

Dinesh Chandrasekhar is Director of Product Marketing at SnapLogic. Follow him on Twitter @AppInt4All.

Why citizen integrators are today’s architects of customer experience

By Nada daVeiga

Lately, I’ve been thinking a lot about customer experience (CX) and the most direct, most effective ways for companies to transform it. As I recently blogged, data is the centerpiece – the metaphorical cake, as it were, compared to the martech frosting – of creating winning customer experiences.

That being said, which internal organization could possibly be better positioned than marketing to shape customer experience?

Nearly every enterprise function shapes CX

As it turns out, there are many teams within the modern enterprise that serve as CX architects. Think of all the different groups that contribute to customer engagement, acquisition, retention, and satisfaction: marketing, sales, service, and support are the most obvious, but what about product development, finance, manufacturing, logistics, and shipping? All of these functions impact the customer experience, directly or indirectly, and thus should be empowered to improve it through unbridled data access.

This point of view is reflected in SnapLogic’s new white paper, “Integration in the age of the customer: The five keys to connecting and elevating customer experience.” From it, a key thought:

[W]ho should corral the data? The best outcomes from customer initiatives happen when the business takes control and leads the initiative. The closer the integrators are to the customer, the better they can put themselves in their customers’ shoes and understand their needs. Often, they have a clear handle on metrics, the business processes, the data, and real-world customer experiences, whether they’re in marketing, sales, or service, and are the first to see how the changes they’re making are improving customer experience — or not.

Democratizing data integration

Because most departmental leaders in sales, service, and marketing are typically not familiar with programming, they look for integration solutions that provide click-not-code graphical user interfaces (GUIs) that enable a visual, intuitive process to democratize customer data integration. SnapLogic believes that GUI-driven, democratic data integration is an essential first step in empowering today’s CX architects to gain the analytic insight they need to improve customer experience.

In short, we believe that “citizen integrator” is really just another name for “citizen innovator”: fast, easy, seamless data integration shatters stubborn barriers to CX innovation by igniting exploration and problem-solving creativity.

To learn how to design your integration strategy to improve customer experience across the organization, download the white paper, “Integration in the age of the customer: The five keys to connecting and elevating customer experience.” In it, you’ll find actionable insights on how to optimize your organization’s data integration strategy to unlock CX innovation, including:

  • Why you need to ensure your organization’s integration strategy is customer-focused
  • How to plan around the entire customer lifecycle
  • Which five integration strategies help speed customer analytics and experience initiatives
  • How to put the odds of customer success in your favor

Nada daVeiga is VP Worldwide Pre-Sales, Customer Success, and Professional Services at SnapLogic. Follow her on Twitter @nrdaveiga.