In enterprise IT environments, “the future” rarely extends beyond three years, while legacy applications have a seemingly unlimited lifetime. Strategic visions of “disruption” and “digital transformation” are tempered by the reality of “peaceful coexistence” and the enterprise service bus (ESB). This is the operating environment that any new application and data integration solution will be dropped into. Your company’s chief technology officer (CTO), chief information officer (CIO), and everyone else in the IT organization want to support the business in the most effective, cost-efficient way possible while accommodating the trilogy of legacy, existing, and future technologies.
The right application and data integration solution has a modern integration architecture to span both legacy and cloud-based deployment environments. The platform can coexist with current technology and be used to create new data, application, and process integrations whenever it makes business sense to do so. Instead of a “big bang,” you can phase in the application and data integration technology over time, project by project — without a forklift upgrade.
A checklist for modern enterprise data architecture
The enterprise integration platform should embrace emergent technology such as the Representational State Transfer (REST) API, an architectural style that takes advantage of existing communication protocols and supersedes Simple Object Access Protocol (SOAP) as the enterprise gold standard. In doing so, the integration platform addresses legacy, existing, and future technology generations.
- A 100% REST-based architecture reads and writes data between applications using standard HTTP verbs and Uniform Resource Identifiers (URIs), so integrations work without vendor-specific client libraries.
- Parity between on-premises and cloud deployments smooths the transition between the two environments without short-changing performance on-premises, or in hybrid combinations of the two.
- Elastic scale-out architecture of cloud-based deployments allows new compute resources to be added in near real-time, to seamlessly support new workloads or spikes in processing requirements.
- Multi-tenancy lets development teams create integrations once and then serve multiple businesses and projects, while meeting security, performance, and business requirements.
- API readiness lets users consume the APIs of any application they want to integrate and expose REST APIs to any application, removing the burden of manual updates.
- Native support for poly-structured data (such as hierarchical data [e.g., JSON] and relational data) enhances the performance of the data-interchange formats that are critical enablers of modern application integration.
- Pre-built capabilities for ASCII, EBCDIC, and other common formats ensure that data from mainframe and other legacy applications can be quickly integrated into the evolving environment.
- Multiple integration modes accommodate event-based, real-time, batch, or scheduled activity.
- Big data and IoT integration allows very large amounts of data to be quickly ingested from virtually any source, including IoT device logs.
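The REST-based read/write pattern in the first checklist item can be sketched with nothing more than HTTP verbs, URIs, and JSON. The endpoints and record shape below are hypothetical placeholders rather than any specific product’s API:

```python
import json
import urllib.request

def build_post(target_uri: str, record: dict) -> urllib.request.Request:
    """Build a POST request that writes the record to the target as JSON."""
    return urllib.request.Request(
        target_uri,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def sync_record(source_uri: str, target_uri: str) -> int:
    """Read a record from one application's API and write it to another's."""
    # Read: GET the record as a JSON document from the source application.
    with urllib.request.urlopen(source_uri) as resp:
        record = json.load(resp)
    # Write: POST the same JSON document to the target application.
    with urllib.request.urlopen(build_post(target_uri, record)) as resp:
        return resp.status
```

Because every step is plain HTTP plus a URI, the same pattern works whether the endpoints live on-premises or in the cloud, which is what makes a library-less, 100% REST architecture practical.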
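As a small illustration of the EBCDIC checklist item: common EBCDIC code pages ship with Python’s standard codecs (cp037 for US EBCDIC, for example), so decoding a legacy mainframe extract into modern text is a one-line conversion. The sample account field is invented for illustration:

```python
# Decode EBCDIC bytes from a mainframe extract into ordinary text.
# cp037 is a common US EBCDIC code page included in Python's codecs.
def ebcdic_to_text(raw: bytes, codepage: str = "cp037") -> str:
    return raw.decode(codepage)

# Simulated mainframe bytes: 'A' is 0xC1 in EBCDIC, not 0x41 as in ASCII.
ebcdic_field = "ACCT-00042".encode("cp037")
print(ebcdic_to_text(ebcdic_field))  # ACCT-00042
```

The same decode step slots in ahead of any downstream JSON or relational processing, which is how legacy-format data joins the evolving environment without a separate conversion project.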
How can we get the most performance and reduce IT cost?
Manual application and data integration is a labor-intensive, never-ending IT task. Everyday standard processes in global organizations can each entail hundreds of integration jobs that must be managed, maintained, and upgraded. The right application and data integration solution can deliver immediate value in any enterprise, expanding into new areas to produce cost savings and lower total cost of ownership (TCO) in multiple ways:
- Less time and fewer full-time employees (FTEs) to build, manage, and update individual integrations and pipelines, as usage changes and applications are updated.
- Systemic efficiency gains from newfound collaboration capabilities, including sharing, re-use, and real-time shared development.
- Lower cost of replacing legacy technology by supporting legacy technologies (such as ETL, ESB, File Transfer Protocol [FTP], and XML) within the new integration paradigm, as well as data stored on-premises and in the cloud.