Integrating SAP With Serverless Camel
Camel in RHI allows you to connect to any preferred protocol. Developers can configure connections to the endpoints with their address, credential, or SSL settings.
SAP is the world's leading enterprise business processing solution, and there will always be a need to connect an organization's core to other SaaS offerings, partners, or even another SAP solution. Red Hat Integration (RHI) offers the flexibility, adaptability, and ability to move quickly, with the framework and software to build an event-driven integration architecture: not just connecting systems, but also maintaining data consistency across platforms.
SAP offers HTTP-based interfaces such as OData v4, OData v2, RESTful APIs, and SOAP, as well as the classic RFC (Remote Function Call) and IDoc messages. There is also a recent event enablement add-on that offers the AMQP and MQTT protocols. Camel in RHI allows you to seamlessly connect to any of your preferred protocols. Developers can simply configure connections to the endpoints with their address, credential, and/or SSL settings.
Here is a quick demo of how to integrate SAP with third-party services using Camel K with Knative Eventing.
This demo uses the classic Enterprise Procurement module in SAP ERP Central Component (SAP ECC) and retrieves sales order details through Telegram Messenger. An employee enters the order ID into the chatbot on Telegram and immediately receives the order result back in the same conversation.
To build this true event-driven integration platform in the middle, available on multiple cloud vendors, I have Red Hat OpenShift installed on AWS with Red Hat OpenShift Serverless installed on top of it.
To decouple the functions/services, I created two channels (an event delivery mechanism that can fan out received events, through subscriptions, to multiple destinations, or sinks) and had a Camel (part of Red Hat Integration) application subscribe to the query events in the channel. The ABAP function was exposed as an OData v4 endpoint via the SAP NetWeaver Gateway, so I can simply use the Olingo4 component.
import org.apache.camel.builder.RouteBuilder;

import java.util.HashMap;
import java.util.Map;

public class SapOdata extends RouteBuilder {

    Map<String, String> tempMap = new HashMap<String, String>();

    public void configure() throws Exception {
        from("knative:channel/ordercheck")
            // incoming CloudEvent data is JSON text from the Telegram Kamelet
            .unmarshal().json()
            // use the order ID typed by the user as the OData key predicate
            .setHeader("CamelOlingo4.keyPredicate", simple("'${body[text]}'"))
            // remember the order ID and the Telegram chat ID for the reply
            .bean(this, "create(\"${body[text]}\",\"${body[chat][id]}\")")
            // query the SalesOrder entity set exposed by SAP NetWeaver Gateway
            .to("olingo4://read/SalesOrder")
            // collect the properties we care about from the OData result
            .split(simple("${body.properties}"))
                .choice()
                    .when(simple("${body.name} == 'Customer'"))
                        .bean(this, "aggregate(\"CUSTOMER\",'${body.value}')")
                    .when(simple("${body.name} == 'Transactioncurrency'"))
                        .bean(this, "aggregate(\"CURRENCY\",'${body.value}')")
                    .when(simple("${body.name} == 'Grossamountintransaccurrency'"))
                        .bean(this, "aggregate(\"SUM\",'${body.value}')")
                    .when(simple("${body.name} == 'Taxamountintransactioncurrency'"))
                        .bean(this, "aggregate(\"PRICE\",'${body.value}')")
                .end()
            .end()
            // publish the aggregated result to the reply channel as JSON
            .setBody(method(this, "getSalesOrder()"))
            .marshal().json()
            .log("${body}")
            .to("knative:channel/returntxt");
    }

    public void create(String salesorder, String chatid) {
        tempMap = new HashMap<String, String>();
        tempMap.put("SALESORDER", salesorder);
        tempMap.put("CHATID", chatid);
    }

    public void aggregate(String name, String value) {
        tempMap.put(name, value);
    }

    public Map<String, String> getSalesOrder() {
        return tempMap;
    }
}
The above Camel route simply subscribes to one of the channels, called ordercheck, and queries SAP using the Olingo4 component. I have placed the configuration separately in a configuration file so that it can be changed for different circumstances.
camel.component.olingo4.configuration.serviceUri=https://xxx/sap/opu/odata4/sap/mmm/default/sap/yyy_salesorder/zzzz/
camel.component.olingo4.configuration.httpHeaders[Authorization]=Basic oooooo
OData v4, RESTful APIs, and event protocols are better suited for serverless, as OData v4 performs significantly better than the older version and supports analytical applications. OData v2, RFC, and IDoc are better used in a traditional Camel project.
What Components to Use for SAP Endpoints
SAP Endpoint | Serverless Camel K/Camel Quarkus | Camel
OData V4 | ✓ | ✓
OData V2 | ✕ | ✓
RESTful API | ✓ | ✓
SOAP | O | ✓
RFC/IDoc | ✕ | ✓
Events | ✓ | ✓
Camel can then use its built-in data format components to transform the payloads. In the serverless case, data is mostly in the form of JSON. By marshaling and unmarshaling the incoming payload, we can easily access and retrieve the content values we need.
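As a minimal illustration of that pattern (separate from the demo route, with hypothetical channel and field names), the following sketch unmarshals an incoming JSON payload into a Map, reads one field, and marshals a small result map back to JSON:
import org.apache.camel.builder.RouteBuilder;
import java.util.Map;

public class JsonTransform extends RouteBuilder {

    public void configure() throws Exception {
        // channel and field names are hypothetical, for illustration only
        from("knative:channel/incoming")
            .unmarshal().json()                     // JSON text -> java.util.Map
            .log("received order ${body[orderId]}") // read a value by key
            .setBody(method(this, "toReply"))
            .marshal().json()                       // Map -> JSON text
            .to("knative:channel/outgoing");
    }

    public Map<String, String> toReply(Map<String, Object> in) {
        return Map.of("orderId", String.valueOf(in.get("orderId")), "status", "received");
    }
}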
Once complete, we can simply deploy this route to the cloud using the Camel K CLI or the extension in VS Code. Camel K will deploy it as a serverless application.
kamel run SapOdata.java
We also use Kamelets, a simplified set of templates for connecting sources and sinks for all the functions. They can be pre-built by integration developers (or Camel developers) and then used by other function developers as a ready-made connector, eliminating time-consuming "EventSource" development every single time. We want to grow our list of Kamelets, and the Apache Camel community would love your contribution: https://camel.apache.org/camel-kamelets/latest/
In the demo, we use the Telegram Kamelet and bind it to my demo Telegram bot. Queries from users are passed through the Kamelet Binding, converted to standard CNCF CloudEvents, and become the events that trigger the subscribed functions.
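The reply leg of the demo is not shown here; as a rough sketch (the bot token is a placeholder and the message formatting is my own assumption), a route subscribed to the returntxt channel could push the aggregated result back to the right Telegram conversation via the camel-telegram component:
import org.apache.camel.builder.RouteBuilder;

public class ReplyToTelegram extends RouteBuilder {

    public void configure() throws Exception {
        from("knative:channel/returntxt")
            .unmarshal().json()  // the JSON map produced by the SapOdata route
            // camel-telegram routes the reply to the right conversation via this header
            .setHeader("CamelTelegramChatId", simple("${body[CHATID]}"))
            // message formatting is an assumption; adapt as needed
            .setBody(simple("Order ${body[SALESORDER]}: ${body[CUSTOMER]} ${body[SUM]} ${body[CURRENCY]}"))
            .to("telegram:bots?authorizationToken=RAW(<your-bot-token>)");
    }
}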
Summary: Architectural Overview
This is how Red Hat Integration can help to achieve modernized integration with SAP.
SAP exposes business functionality through the NetWeaver Gateway. Camel K or Camel in RHI can be used by developers to integrate these functionalities bi-directionally. Camel K/Camel not only connects the dots but also provides a set of built-in patterns and data transformation components, making customized integration easy. They can be deployed in the form of a serverless function, a serverless source/sink, or a long-running microservice.
At the time of writing, I don't see complete support from the SAP event enablement add-on, so developers still need to retrieve the actual data via other methods such as OData and APIs. To implement a true event-driven architecture, AMQ Streams (Kafka) can be used as the event stream store to handle the streaming of events, increasing decoupling and achieving near real-time latency.
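As a sketch of that idea (the broker URL, topic names, and the use of MQTT are assumptions, not part of the demo), a Camel route could pick up events published by the SAP event enablement add-on and append them to an AMQ Streams (Kafka) topic:
import org.apache.camel.builder.RouteBuilder;

public class SapEventsToKafka extends RouteBuilder {

    public void configure() throws Exception {
        // broker addresses and topic names are placeholders
        from("paho:sap/businessevents?brokerUrl=tcp://sap-event-broker:1883")
            .log("SAP event received: ${body}")
            .to("kafka:sap-events?brokers=my-cluster-kafka-bootstrap:9092");
    }
}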
Since the system is based on events, we can also capture changes of data state in databases using Debezium, keeping all data consistent by passing the updated state back to SAP.
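A minimal sketch of that pattern with the camel-debezium-postgres component might look like the following; the database coordinates and the downstream channel are placeholders, and in practice the change event would be mapped into whatever format the SAP-bound route expects:
import org.apache.camel.builder.RouteBuilder;

public class CaptureDbChanges extends RouteBuilder {

    public void configure() throws Exception {
        // database coordinates are placeholders; offset storage remembers what was already read
        from("debezium-postgres:orders-connector"
                + "?databaseHostname=orders-db&databasePort=5432"
                + "&databaseUser=cdc&databasePassword=secret"
                + "&databaseDbname=orders&databaseServerName=orders"
                + "&offsetStorageFileName=/tmp/offsets.dat")
            .log("change captured: ${headers.CamelDebeziumSourceMetadata}")
            // forward the change event so another route can update SAP (e.g., via OData)
            .to("knative:channel/dbchanges");
    }
}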
When there is a need to expose any functions or services as API endpoints, we can easily implement them with Camel using the OpenAPI specification, and have the APIs managed and secured by the 3scale API Management platform.
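For example (a sketch only; the paths and the backing route are hypothetical), Camel's REST DSL can expose an endpoint and publish its OpenAPI document, which 3scale can then import, manage, and secure:
import org.apache.camel.builder.RouteBuilder;

public class OrderApi extends RouteBuilder {

    public void configure() throws Exception {
        // serve the OpenAPI document describing the REST endpoints below
        restConfiguration()
            .apiContextPath("/openapi.json")
            .apiProperty("api.title", "Sales Order API")
            .apiProperty("api.version", "1.0");

        // hypothetical endpoint; 3scale sits in front of it for management and security
        rest("/orders")
            .get("/{id}")
                .to("direct:getOrder");

        from("direct:getOrder")
            .log("looking up order ${header.id}")
            .setBody(simple("{\"orderId\": \"${header.id}\"}"));
    }
}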
OpenShift is a platform that can run on major cloud vendors as well as on-premises, so it's truly cloud-agnostic. It provides a serverless platform to deploy and manage all the functions. And with Interconnect, we are able to broadcast events to the closest data center to optimize traffic control.
As a result, the platform is now ready to connect to endless third-party and partner services, stream large amounts of edge signals, and provide real-time processing from edge devices. Legacy and mainframe systems can also be part of the ecosystem. Lastly, this is a good tool for SAP-to-SAP integration too.
See more on Red Hat Integration.