Event streaming between SAP and Azure – how are others doing it?

In the fast-paced IT world, SAP customers are looking into the advantages of cloud solutions. The cloud offers agility and innovation, allowing businesses to run machine learning, AI, data mining and predictive analytics scenarios. To replicate SAP data into the cloud, companies need a robust solution that overcomes the integration challenges, and as requirements change quickly, the technology needs to keep up.

A particularly interesting use case is application integration, which calls for real-time replication of events in SAP. In this blog we will describe a customer use case focused on event streaming. The business scenario, in this case, is streaming shipment events from SAP to native Azure cloud services; the shipment events are then pushed to a mobile application to notify the business user. We will go through the challenges to be considered when integrating SAP systems, and how Datavard Glue tackles them.

Solution for SAP ERP events integration on Azure

For the event streaming scenario, you need a tool for real-time application integration. Datavard Glue is a strong solution for tight, native SAP data integration with data lakes running on big data platforms. It is already used by many customers to replicate meaningful SAP business objects in real time to modern cloud data stores such as Azure Data Lake Storage Gen2. This enables them to analyze their SAP data and discover insights using the modern and powerful analytics services of Azure.

Datavard Glue now goes beyond classic data replication for analytical purposes and supports near real-time application integration with event streaming. This enables customers to build scalable cloud-native applications that require SAP data, without worrying about SAP system utilization or a lengthy development process.

Three critical factors for designing and implementing event streaming

Event streaming is a specific type of data integration required when different applications need to communicate and exchange information in real time (application integration).

We need to consider a few critical requirements when designing and implementing this integration pattern:

  1. We need to capture the event as soon as it is generated on the source system and propagate it in real time. Datavard Glue achieves this by using its stream engine in combination with its change data capture capabilities.
  2. Handling events is far more complex than handling table records. Replicating tables separately with the intention of combining them into a meaningful business object (a combination of information coming from multiple tables) after the data lands on the target storage could create consistency problems. Datavard Glue guarantees transactional consistency by building the business object using BPL business functions directly on the SAP application layer.
  3. Not every table update corresponds to an event being triggered; particular conditions need to be verified (for example, a specific change of a record's status). With Glue, you have the unique capability to enhance the replication with your own business logic to make sure that only relevant information is replicated at the right time (a generic sketch of such a condition check follows this list).
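
To make point 3 more concrete, here is a minimal, purely illustrative Python sketch of such a condition check. It is not Glue's actual API: the field names, the relevant statuses, and the send_to_event_hub callback are hypothetical placeholders.

    # Hypothetical illustration of condition-based event detection (point 3 above).
    # Field names, statuses, and send_to_event_hub are placeholders, not Glue's API.

    RELEVANT_STATUSES = {"SHIPPED", "DELIVERED"}

    def should_emit_event(old_record: dict, new_record: dict) -> bool:
        """Emit an event only when the shipment status changes to a relevant value."""
        old_status = old_record.get("STATUS")
        new_status = new_record.get("STATUS")
        return new_status != old_status and new_status in RELEVANT_STATUSES

    def on_table_update(old_record: dict, new_record: dict, send_to_event_hub) -> None:
        """Called for every captured change; forwards only business-relevant events."""
        if should_emit_event(old_record, new_record):
            send_to_event_hub({
                "shipment_id": new_record.get("SHIPMENT_ID"),
                "status": new_record.get("STATUS"),
                "changed_at": new_record.get("CHANGED_AT"),
            })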

When considering a next-generation data platform for modern data warehousing and advanced analytics, Microsoft Azure is the go-to cloud solution for many customers.

In this specific customer case, leveraging services such as Azure Event Hubs, Cosmos DB, and Azure Functions made it possible to orchestrate a cloud-native architecture designed for the ingestion and processing of a high throughput of small data batches, such as SAP transaction events.

With the recent enhancements, Datavard Glue has the unique capability to integrate SAP directly with Azure Event Hubs through a dedicated Glue consumer. This new integration scenario requires only a lightweight SAP plugin that streams events from SAP straight to Microsoft Azure Event Hubs.

The use case: Application integration in the construction industry

A major company in the construction industry decided to use Datavard Glue to enable application integration between SAP and Azure-based cloud-native applications.

The business scenario, in this case, is streaming shipment events from SAP to native Azure cloud services. The shipment events are then pushed to a mobile application to notify the business user.

There are three challenging requirements which had to be addressed by the architecture:

  1. SAP data needs to be adjusted and combined with IoT sensor data
  2. Fetching SAP data through SAP Gateway or similar technologies negatively impacts the performance of the productive system, which is already under high load
  3. Low latency requirements: only 4 to 6 seconds were acceptable globally until the notification is displayed on the mobile app.

The Azure-based architecture enables pushing events from SAP via Datavard Glue through various Azure platform services, all the way to the mobile application.

Datavard Glue streaming functionality and control UI

Datavard Glue has the option to execute its replications in a streaming mode. When the streaming mode is activated, an optimized process is triggered and replications are executed in near real time, as soon as the business event and the corresponding data are generated in the system.

The process can be configured through a UI, where you can, among other things:

  1. Limit the number of background jobs executed by the streaming engine in parallel
  2. Adjust the replication latency
  3. Quickly edit and monitor the extractors that are executed in streaming mode
  4. Add and remove extractors without disrupting the workflow of other extractors

Azure Event Hubs consumer

Glue connects to Event Hubs directly from the SAP application servers using REST API calls and authenticates with a SAS token. Thanks to Glue's modular and open architecture, it is easy to extend the functionality to support new storage targets with custom fetchers and consumers.
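
For readers who want to reproduce the pattern outside of Glue, the following Python sketch shows the same kind of call: it builds a SAS token and posts a single event to the Event Hubs REST endpoint. The namespace, hub name, policy name, and key are placeholders, and the snippet is only a rough stand-in for Glue's SAP-side implementation, not a copy of it.

    import base64
    import hashlib
    import hmac
    import json
    import time
    import urllib.parse

    import requests

    # Placeholder values: replace with your own namespace, hub, policy and key.
    NAMESPACE = "my-namespace"
    EVENT_HUB = "shipment-events"
    RESOURCE_URI = f"https://{NAMESPACE}.servicebus.windows.net/{EVENT_HUB}"
    POLICY_NAME = "SendPolicy"
    POLICY_KEY = "<shared-access-key>"

    def generate_sas_token(resource_uri: str, key_name: str, key: str, ttl: int = 3600) -> str:
        """Build a Shared Access Signature token for an Event Hubs resource."""
        expiry = int(time.time()) + ttl
        encoded_uri = urllib.parse.quote_plus(resource_uri)
        string_to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
        signature = hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
        encoded_signature = urllib.parse.quote_plus(base64.b64encode(signature))
        return (f"SharedAccessSignature sr={encoded_uri}"
                f"&sig={encoded_signature}&se={expiry}&skn={key_name}")

    # Send one event via the Event Hubs REST endpoint.
    response = requests.post(
        f"{RESOURCE_URI}/messages",
        headers={
            "Authorization": generate_sas_token(RESOURCE_URI, POLICY_NAME, POLICY_KEY),
            "Content-Type": "application/atom+xml;type=entry;charset=utf-8",
        },
        data=json.dumps({"shipment_id": "0080001234", "status": "SHIPPED"}),
    )
    response.raise_for_status()  # 201 Created means the event was accepted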

Summary

Now that all the pieces are in place, we can prepare Glue extractors, define an event replication to Azure Event Hubs, and activate the Datavard Glue streaming mode.

Below you can see a recording of the implementation we prepared in our demo landscape. To push and display the events recorded in Cosmos DB, we used a combination of:

  • Azure Functions deployed to process the incoming events from Event Hubs (a minimal sketch of this processing step follows the list)
  • Cosmos DB to store the events as denormalized records
  • Azure SignalR Service to push notifications to a frontend web application, triggered when a record is created in Cosmos DB
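
In the demo, the processing step was implemented with Azure Functions bindings. As a stand-alone illustration of the same logic, the sketch below uses the azure-eventhub and azure-cosmos Python SDKs to read events from Event Hubs and store them as denormalized Cosmos DB records; the connection strings, database, and container names are placeholders.

    import json

    from azure.cosmos import CosmosClient
    from azure.eventhub import EventHubConsumerClient

    # Placeholder connection details: replace with your own account and hub.
    COSMOS_URL = "https://<account>.documents.azure.com:443/"
    COSMOS_KEY = "<cosmos-key>"
    EVENTHUB_CONN_STR = "<event-hubs-connection-string>"
    EVENT_HUB = "shipment-events"

    cosmos = CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
    container = cosmos.get_database_client("shipments").get_container_client("events")

    def on_event(partition_context, event):
        """Store each incoming shipment event as a denormalized Cosmos DB record."""
        record = json.loads(event.body_as_str())
        # Cosmos DB requires a unique "id" per document.
        record["id"] = f'{record["shipment_id"]}-{event.sequence_number}'
        container.upsert_item(record)

    consumer = EventHubConsumerClient.from_connection_string(
        EVENTHUB_CONN_STR, consumer_group="$Default", eventhub_name=EVENT_HUB
    )
    with consumer:
        # Read from the beginning of each partition; blocks until interrupted.
        consumer.receive(on_event=on_event, starting_position="-1")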

About the authors

Roman Broich is a Senior Cloud Solution Architect at Microsoft, specializing in the SAP and Azure ecosystem. He has over 20 years of experience in business intelligence and data management.

Click here to watch our recent interview with Roman Broich about SAP on Azure – Microsoft’s Cloud Migration Strategies

Roman Broich, Senior Cloud Solution Architect at Microsoft

Mattia Molteni is a Solutions Engineer at Datavard. He has over 7 years of experience working with SAP systems and data analysis.

Mattia Molteni, Solutions Engineer at Datavard