Workday is rapidly becoming the centerpiece of the enterprise HR ecosystem, and some of the most successful Workday customers are running the SnapLogic Elastic Integration Platform to accelerate their cloud-based human resources management system (HRMS) and ensure maximum end-user adoption. Use cases include new employee on-boarding, sales compensation, payroll, talent analytics, active directory and back-office integration.
However, according to Lisa Rowan, Research Vice President of HR, Talent and Learning Strategies for IDC, “the apparent appetite to change [human capital management] solution providers in the coming 12-18 months is higher than it has ever been. This is both good and bad news. The good news is that there will be ample new business opportunity. The bad news is that there is likely a lot of churn ahead.” What this means for Workday customers and users of the SnapLogic Elastic Integration Platform is that the importance of integration and adaptation for cloud applications has never been greater.
As we’ve mentioned before, at SnapLogic we’re working with leading IT organizations on two primary challenges – the “cloudification” of your enterprise application portfolio, including the addition of SaaS products like Workday, and managing the resulting shift in data gravity. Join us in a few weeks on Wednesday, September 17th for a live webinar to hear more about how our elastic integration platform as a service (iPaaS) is used for integrating Workday with the rest of the enterprise. We’ll also be giving a live demonstration, reviewing common use cases such as employee on-boarding, payroll, talent analytics and active directory. Our integration experts will be sharing their insights and best practices to help ensure that you have a successful journey to the cloud, whether you’re integrating on-premises systems like SAP, other cloud applications like Salesforce and ServiceNow in addition to Workday, or building a next-generation analytics infrastructure running on Hadoop. We encourage HRIS Managers, Operations and Project Leads to attend, as well as application and data integration specialists, solution architects and “citizen integrators” – basically anyone interested in learning more about modern integration solutions!
To summarize, we’ll be discussing integration with Workday HR and Finance, featuring a case study using SnapLogic and Workday for employee on-boarding and providing a live demonstration of the SnapLogic Elastic Integration Platform in action. In the meantime, take a look at more details and examples of Workday integration here, and check out this video featuring an example of an on-boarding solution. And don’t forget to register here!
A few months ago we published the results of a TechValidate survey that found that Cloud Speed and Time to Value are the number one drivers for adopting a modern integration platform as a service (iPaaS). Our team has put together a new series of graphics to highlight the benefits of being able to SNAP IN…to the past, present and future of your data, applications and APIs. I’ve posted a few of the graphics below created by Roger Chan, our graphic design leader. You can download the results from the TechValidate research here.
SNAP IN…to Data, Apps, and APIs
SNAP IN…to AWS Redshift and RDS
SNAP IN…to Workday
SNAP IN…to Big Data
SNAP IN…to Salesforce
Snap Integration from SnapLogic.
And because it’s Friday, here’s one from one of my favorite movies….
To learn more about SnapLogic elastic integration for big data, SaaS and on-premises applications and APIs visit SnapLogic.com. Have a great Labor Day weekend!
Call it what you want:
Whatever your terminology, we’ve clearly come to the end of one technology cycle and entered a new cycle of innovation and business and IT transformation. Whether it’s Social, Mobile, Analytics and Big Data, Cloud Computing or the Internet of Things (SMACT), not to mention Bring Your Own Everything, these are exciting and challenging times for enterprise IT leaders seeking to transform their organizations while improving business alignment. CIOs have become “change agents.” They’ve become Chief Innovation Officers and even Chief Integration Officers, as one of the first places they’re looking to initiate this change is by improving the connections between internal and external data, applications and APIs, and eliminating disconnected data silos.

We’ve written about the Integrator’s Dilemma extensively in recent months, as well as enterprise “cloudification” and the resulting shift in “data gravity.” Increasingly, however, I’m hearing terms like “heavyweight,” “technical debt,” and “IT burden” used to describe the legacy middleware that was purchased and implemented well over 10 years ago. It was built for a different time and is now often seen as the change blocker, not a change agent. On the other hand, in conversations with industry analysts, customers and partners, I’m hearing terms like “fabric,” “agility layer,” and “self-service” used to describe the desire for a more flexible approach to integration and data management.
As integration platform as a service (iPaaS) gains market awareness and acceptance, it’s becoming more and more apparent that a new “middle kingdom” is emerging to address the integration requirements of the modern enterprise. Using Old ETL or EAI to tackle the volume, variety and velocity of today’s data, application and API integration requirements (not to mention IoT) is a bit like driving a Tesla Model S on a dirt road. (Okay, I’m still working on the metaphor, but this image comes to mind.) Recent financial results aside (examples here and here), I’ve put together a list of 10 reasons why old extraction, transformation and loading (ETL) and enterprise application integration (EAI) tools (and solution providers) will continue to struggle in the SMACT era:
- Cannibalization of the Core On-Premises Business: This is the obvious one. Innovator’s Dilemma. It’s hard to “skate where the puck is going” without making big bets on the future and being willing to walk away from underperforming product lines. It can be done (here’s a great example), but I haven’t seen this level of commitment from the incumbents in the data and application integration market. Back in 2007, my friend Ken Rudin presented “SaaS Cannibalization and the Civil War Within.” It still rings true.
- Heritage Matters in the Cloud: I’m generalizing, but if you come from the EAI/ESB/SOA world you typically have an application-centric view of the world. If you come from the world of business intelligence, data warehousing and master data management, you naturally have a data-centric view. Re-thinking, or to use Mary Meeker’s word – re-imagining - integration requires the vision to tackle both equally. Message-oriented EAI tools struggle with bulk/batch data movement and transformations. Rows and columns-based ETL tools struggle with real-time, unstructured and hierarchical data. Modern, JSON-centric, RESTful platforms clearly have an advantage in the new era, whether your requirement is event-based, real-time, streaming or scheduled batch-oriented data integration.
- EAI without the ESB: Features like guaranteed delivery, triggers, monitoring and orchestration are just as important in the cloud as they were on-premises. But the enterprise service bus (ESB) doesn’t fly in the cloud. Here are some comments from Forrester’s Stefan Ried on the topic and here’s why SOA was DOA thanks to the ESB.
- Beyond ETL: Features like re-use, aggregation, joins, union, splitting, SCDs and scheduling are just as important in the cloud as they were on-premises, so I’m not going to jump in on the ETL is Dead debate, but multi-purpose or multimodal integration is clearly the future. This webinar with Dave Linthicum and Gaurav Dhillon provides some good perspective on the topic.
- Point to Point Misses the Point: For legacy ISVs, all roads lead back to point number one – cannibalization – but simply introducing a light-weight cloud (or cloud-washed) version of your on-premises software does not a cloud strategy make. While there is a growing demand for more approachable, easier-to-use cloud integration services, there must also be advanced functionality (see examples of iPaaS requirements here), broad connectivity and the ability to go beyond simple point-to-point integration scenarios. Otherwise, the only thing that’s going hybrid is your integration hairball.
- Franken-tegration: Speaking of hybrid, it’s a hot topic in enterprise IT and cloud computing, but when it comes to an iPaaS solution, hybrid shouldn’t mean half-baked. A cloud integration service must be able to handle complex on-premises integration use cases (cloud to ground) as well as cloud to cloud. It also has to provide more than just monitoring: a sophisticated integration design and administration environment that does not require on-premises tools. More on this topic in the post Fluidity in Hybrid Deployments.
- Big Data Integration is not Core…or Cloud: For EAI vendors, big data integration isn’t an option. For legacy ETL vendors, built to scale up, not out, and designed to primarily handle structured, relational sources and targets, big data is a big problem. And that’s just for the on-premises technology. Cloud computing and big data, not to mention mobile, social, APIs and IoT, are changing the rules and requirements of integration and a modern, elastic platform is going to be necessary to keep up.
- Elastic Scale Out: I wrote about why this is important in this post, iPaaS Requirements: Elastic Scale. Re-purposing legacy technology for integration design or run-time processing guarantees that the solution wasn’t built for the cloud and isn’t elastic to the core.
- An On-Ramp to On-Prem: Some of this comes down to sales compensation plans (paying reps only on first year annual contract value (ACV) and not on multi-year agreements, for example), but often point-to-point cloud integration tools from legacy vendors are used as a loss leader to open doors in new accounts or expand into divisions of accounts. The longer term objective is a bait and switch, which leads me to my final point.
- Focus and DNA: As I mentioned, all roads lead back to point number one and the Innovator’s Dilemma, but there is an Innovator’s Solution and proven techniques to Escape Velocity. Of course, we believe there’s also an Integrator’s Solution, but for legacy ETL and EAI independent software vendors (ISVs), the question is whether they have the wherewithal to adapt and truly embrace the change that the SMACT era will require. It will require not only a complete re-focus on the future and a willingness to walk away from the past, which is particularly difficult for public companies, but also a shift to subscription pricing, a new approach to customer adoption and renewals, and agile development. Above all, it will require a DNA of innovation and a willingness to change. When you’re on the inside, I compare it to the frog in boiling water story, but I think Aaron Levie, the visionary CEO of Box.com, put it best when he recently tweeted:
At SnapLogic, we’re always focused on delivering new Snaps and updating our library of over 160 existing Snaps. Snaps are the building blocks of a pipeline that perform a single function such as read, write or act on data. They are found in the Snap catalog, accessible on the left hand side of the SnapLogic Designer. Drag a Snap from the Snap catalog onto the workspace to use it in a pipeline.
We are pleased to announce the addition of the following Snaps for the SnapLogic Elastic Integration Platform:
PostgreSQL Snap Pack
With this Snap Pack, you can lookup records, fetch data, and execute SQL statements to delete, update or insert data within a specified PostgreSQL table.
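Under the hood, those Snap operations map onto ordinary SQL statements. As a rough, self-contained sketch of the lookup/insert/update/delete pattern (using Python’s built-in sqlite3 as a stand-in for a PostgreSQL connection, with an illustrative table name that is not part of the Snap Pack):

```python
import sqlite3

# sqlite3 stands in for a PostgreSQL connection so this sketch runs
# anywhere; the SQL shape is the same for both databases.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")

# Insert a record into the specified table.
cur.execute("INSERT INTO employees (id, name, dept) VALUES (?, ?, ?)",
            (1, "Ada", "Engineering"))

# Lookup: fetch a record by key.
cur.execute("SELECT name, dept FROM employees WHERE id = ?", (1,))
print(cur.fetchone())  # ('Ada', 'Engineering')

# Update an existing record.
cur.execute("UPDATE employees SET dept = ? WHERE id = ?", ("Operations", 1))

# Delete the record.
cur.execute("DELETE FROM employees WHERE id = ?", (1,))
conn.commit()
```

In a pipeline you configure these operations in the Snap settings rather than writing the SQL by hand, but the effect on the target table is the same.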
JIRA Transition
The latest Snap added to the JIRA Snap Pack, JIRA Transition, adds support for performing a transition on a JIRA issue by supplying the updates in an input document.
SAP IDoc Listener
The SAP Snap Pack was expanded with the SAP IDoc Listener. This Snap receives SAP IDoc requests and sends the IDoc data contained in the requests to the output view.
The following Snap Packs are available as Beta Releases:
- Reltio Snap Pack: Read, write, and delete resource objects in Reltio.
- Xactly Snap Pack: Search data in Xactly.
With the August 2014 Snap Release, we are also delivering minor updates and fixes for Oracle Stored Procedure, File Reader/Writer/Delete, Multi File Reader, ZipFile Read/Write and Fixed Width Parser. These updates will be pushed this evening, with no required downtime. Look for the Release Notes at the time of release for more information. We also have some additional resources below for you to check out:
A few weeks ago, Rishabh Mehan wrote about the new SnapLogic Community – for Developers, By Developers. The initial goal of the community was focused on API development and custom Snap building for customers and partners. The site has expanded and we’re starting to see some great participation. Here’s a brief summary: You can now access all of our documentation for the Elastic Integration Platform and Snap development. You can now download an on-premises Snaplex (known as a “Groundplex”), the Java SDK, public Snap Packs and installers for Mac, Linux and Windows. The Community forum for asking questions and sharing best practices has expanded dramatically. You can now select a category and post your question or answer to the forum. The SnapLogic team has started using the tag didyouknow when posting tips and tricks. Here are some examples:
- Which is the most used Snap?
- How do you remove white spaces from a string?
- If I can’t use pipeline parameters directly in my SQL select statement, how do I do it?
- Can you comment on the suitability of using SnapLogic for moving (not transforming) large binary (100s of MB or GB) files using FTP?
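To take one of those tips, removing white space from a string comes down to a regular-expression replace in most environments. SnapLogic’s expression language has its own syntax, so the snippet below is only a general-purpose illustration of the pattern, in Python:

```python
import re

def remove_whitespace(s: str) -> str:
    """Strip every whitespace character (spaces, tabs, newlines)."""
    return re.sub(r"\s+", "", s)

print(remove_whitespace("  New  Hire\tRecord\n"))  # NewHireRecord
```

The `\s` character class covers spaces, tabs and newlines in one pass, which is usually what you want when normalizing field values.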
In your questions (and answers) you can now add screenshots, documents and formatting to ensure it’s clear and detailed. To help us (and others) know what’s most important, you can vote questions and answers up and down. The team is adding features to the SnapLogic Developer Community every day and we’re excited to see our customers and partners joining the discussion. As we continue to see more and more Citizen Integrators logging into our cloud integration service, we’re looking forward to working with you. Simply log in to Developer.SnapLogic.com with your SnapLogic username and password.
This month’s webinar was all about big data and how customers can use the SnapLogic Elastic Integration Platform and SnapReduce 2.0. In yesterday’s live discussion and demonstration, we talked with SnapLogic Chief Scientist Greg Benson, who is Professor of Computer Science at the University of San Francisco and has worked on research in distributed systems, parallel programming, OS kernels and programming languages for more than 20 years. The webinar went into the details of SnapReduce 2.0 for big data integration (more on that later), but first we talked about Hadoop in terms of where it’s been, where it’s going and what the implications are for traditional enterprise data warehousing. Here’s a brief recap:
- The Big Data journey: Greg talked about early initiatives and use cases and how so much “data exhaust” was getting left on the floor.
- Hadoop and data warehousing: Many believe Hadoop and the Hadoop ecosystem will eventually replace what relational data warehouses do today because of the economics of Hadoop and what has now become possible in terms of data storage. Right now, though, they’re complementary.
- Implications for data integration: There was a good discussion about why old tech won’t work in the new SMACT era, and about the variety of sources and use cases for both streaming and batch data processing.
- The need to acquire, prepare and deliver big data: This includes both batch and streaming processing for a new generation of ETL/ELT.
Following the big data discussion, Greg and the team moved on to SnapReduce 2.0 and the concept of elastic scale out, with a Q&A session to address customer and prospect questions. Check out the presentation slides and questions below:
How do you make SnapLogic work across two clouds…say Salesforce in one cloud and social data in another cloud?
The first thing to understand is that the SnapLogic Snaplex respects data gravity. From this question, it looks like “services” are seen as separate clouds. SnapLogic easily connects separate services and applications and can do so either in our cloud or through a Snaplex running on premises or in a VPC. As we covered in the webinar, with SnapReduce, the Snaplex can also now run natively as a YARN application within a Hadoop cluster.
Is it possible to do transformations on data before it is actually written on to HDFS?
Yes, absolutely. When streaming data into HDFS, the data can be filtered or transformed before writing to HDFS.
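In a SnapLogic pipeline the filter and transform steps are configured visually, but the shape of the data flow is the familiar filter-then-transform-then-write pattern. Here is a rough analogy in Python, with a local file standing in for the HDFS Writer and illustrative field names that are not part of the platform:

```python
import json

# A stream of input records, standing in for documents flowing
# through a pipeline (field names are illustrative).
records = [
    {"user": "alice", "clicks": 12},
    {"user": "bot-7", "clicks": 900},
    {"user": "carol", "clicks": 3},
]

def filter_records(stream):
    # Filter step: drop records we don't want to land in HDFS.
    return (r for r in stream if not r["user"].startswith("bot-"))

def transform_records(stream):
    # Transform step: reshape each record before it is written.
    return ({"user": r["user"].upper(), "clicks": r["clicks"]} for r in stream)

# Write step: a local file stands in for the HDFS Writer here.
with open("out.jsonl", "w") as f:
    for record in transform_records(filter_records(records)):
        f.write(json.dumps(record) + "\n")
```

Because each step is a generator, records stream through one at a time rather than being materialized in memory, which mirrors how streaming into HDFS avoids staging the full data set first.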
Are the data flows (pipelines) converted into jar files or something like pig?
MapReduce code is generated directly and issued to Hadoop as a jar.
Can SnapLogic directly write a .tde file for Tableau or is it a CSV file which Tableau later converts to its native format?
The SnapLogic Tableau Snap directly writes to a TDE.
Once I have read data from HDFS using the HDFS Reader, would I be able to do a join with data sitting in a source database (e.g., Oracle or SQL Server)? If so, where will that pipeline run?
Yes, this can be done and in this scenario the pipeline would run in Hadoop, but on a single Hadoop node. It would not run as a MapReduce job.
Let us know in the Comments section below if you have any more questions, and be sure to watch the webinar recording if you couldn’t make yesterday’s live event and sign up for the Early Access Program. In the meantime, here’s a brief SnapReduce 2.0 demonstration:
“A unified approach to connecting growing data volumes, types and sources as well as new and legacy business applications and proliferating APIs is essential to becoming a truly elastic enterprise.” – Gaurav Dhillon, SnapLogic CEO
Creating a unified platform for connecting data, applications and APIs - this is what we’re focused on here at SnapLogic and we’ve had a busy few weeks with the announcement of our Summer 2014 Release. Check out some of the highlights below, and take a look at our recent webinars to learn more about the specific topics that are most relevant to you and your cloud integration and big data integration challenges, questions and needs.
- Integration Developers News addresses how SnapLogic helps iPaaS “cross the chasm” with visibility, security and pre-built integrations for apps, data, APIs and Hadoop.
- Jason Bloomberg’s article, Because Bulldozers and Buses Don’t Fly, covers traditional ETL and ESB strategies, as well as the elasticity required for today’s cloud integration initiatives. Jason described SnapLogic as being “at the heart of cloud-centric architecture” and says that “it’s the SnapLogic Designer tool that is downright sexy.”
- 451 Research impact report reviews SnapLogic’s growth, readying for ‘big data’ and the summer release. Check out Carl Lehmann’s outlook on cloud integration, at right.
- Gaurav talks SaaS at Enterprise Founders Upfront, speaking about his career and experiences in integration and what he sees happening in the industry.
- SnapLogic was named an AlwaysOn Global 250 Winner, recognized as an elastic integration leader for creating technology innovations for the Global Silicon Valley. As Tony Perkins, founder and editor of AlwaysOn, says, “As the expansion of the mobile and cloud markets continue, the companies on this year’s AlwaysOn Global 250 are racing to provide business and consumer users with the best products and services. The new generation of hardware and software solutions is making it possible for businesses to become even more efficient and profitable.”
Additionally, a few internal highlights from the past few months have been:
You can also find more information on our newest Snaps for cloud integration, and watch the Summer 2014 recorded webinar for more details on all of the new and improved features of last month’s release.