SAP Integration Suite - Course

The document introduces the SAP Integration Suite and discusses distributed architecture, its challenges, and the API-first approach. It defines distributed architecture as a system of interconnected subsystems and outlines challenges such as protocol diversity and error management. The API-first approach emphasizes designing APIs before other software aspects to enhance collaboration, scalability, and developer experience.

Introducing SAP Integration Suite

Explaining Distributed Architecture and Its Challenges

Distributed Architectures and Their Challenges

In this lesson, the following topics are discussed:

 What is a distributed architecture?

 What challenges have to be solved?

What Is a Distributed Architecture?

A distributed IT system is an architectural paradigm and, according to the Encyclopedia of Business Informatics Online Dictionary, is defined as follows:

"A distributed IT system comprises subsystems (components in the


broadest sense) that are coupled together within the framework of a
specific architecture and handle tasks cooperatively."

In contrast, in a monolithic IT system, the functions of a system are bundled (centralized). The logical distribution of system functions among components can be accompanied by a coordinated physical decentralization in a computer network.

We understand all IT installations and services as components: for example, ERP On-Premise, SAP S/4HANA On-Premise, and/or SaaS applications.

An Example of a Distributed Architecture

Below is an example of a customer landscape. You can see that many different technical systems communicate with each other, with all their technical peculiarities.
What Challenges Have to Be Solved?

Due to heterogeneity, expensive and complex solutions must be sought, found, and implemented for the following challenges:

 Many different transport and message protocols

 Release management

 Monitoring

 Error identification and correction

 Latency

 Quality of service

 Security

 Availability of implementations

 Observability

 Documentation

One way to overcome these challenges is to use an API-first approach.

An API-first approach is a design methodology where the primary focus during the development process is on designing the application programming interface (API) before any other aspects of the software solution. This approach emphasizes creating a well-defined and robust API that meets the needs of developers and integrators who use it to build applications, services, or integrations.

Key aspects of an API-first approach include:

1. Design-Centric: API design becomes a central part of the software development life cycle. Design decisions prioritize clarity, consistency, and usability to ensure developers can easily understand and use the API.

2. Iterative Development: Developers iterate on the API design based on feedback and evolving requirements before moving on to the backend implementation or other aspects of the application.

3. Facilitates Collaboration: By defining the API early, different teams (front end, backend, mobile, third-party developers) can work concurrently and in sync. This reduces dependencies and accelerates development.

4. Ensures Scalability and Flexibility: An API-first approach promotes scalability as it encourages modular, reusable components. It also provides flexibility to adapt to changing business requirements and technological advancements.

5. Focus on Developer Experience (DX): DX becomes crucial, aiming to provide developers with clear documentation, SDKs (Software Development Kits), code samples, and testing tools to ease integration and usage of the API.

6. Supports Ecosystem Growth: APIs designed with an API-first approach can foster an ecosystem around the platform or service, allowing for third-party integrations, partnerships, and innovation.

Summary

For automated technical processes, many different software components, with different installations, technologies, and spatial availability, are often coupled via networks with different protocols. The functionalities of these implementations are called services. The interface is provided via APIs (Application Programming Interfaces). The type, location, and implementation of the APIs are irrelevant for now. These APIs are language-agnostic. That's why we also speak of an API-first architecture or approach. Cloud-native architectures are based on APIs. This is accompanied by new challenges.

Describe the Whole Exercise Scenario

Business Example

A well-known retailer faces a major challenge due to global supply chain disruptions, port congestion, and logistical issues. This causes delays in delivering popular products like electronics, home appliances, and specialty items. As a result, many customers are affected, putting customer satisfaction and the company's reliable reputation at risk. To address this situation effectively, the company has devised a comprehensive plan to inform all affected customers about the delays and manage their expectations.

You are part of the IT Team, which must identify all orders impacted by the
delivery delays.

Message Flow of This Exercise

The following figure shows the business example as a step-by-step process:

Description

1. The responsible department creates a list of non-deliverable ProductIDs.

2. The business process gets started with this list.

3. First, the list gets split and processed for each product in the list, one after the other (sequentially). To do this, the process runs through a loop.

4. It needs to be checked whether the processed product can be found in the database. To do this, a request needs to be sent to the database with its ProductID.

5. If the product is available in the database, another request will be sent to the database to get the order data.

6. We get the matching order data for our products from the database.

7. For each product, there is various order information associated with different customers. To find all customers for each piece of order information, another request needs to be sent to the database.

8. The customer information gets stored in a data store.

9. Once all products from the list have been processed and the
customer information has been saved, the process ends.

In a subsequent business process, the customer information is retrieved from the data store and the customers are informed by e-mail about the delayed delivery. However, this process is not the subject of the exercise.

The Solution Architecture

In a previous phase, the corresponding solution architecture was defined with the help of the SAP Integration Solution Advisory Methodology (ISA-M). The entire process is to be realized with the tools and concepts of the SAP Integration Suite.

The list of products that can't be delivered on time is created by the specialist department (1).

The iFlow in Cloud Integration (2) is started via a request. API Management (3) encapsulates the API of the actual order database (4).

Authorization takes place via the SAP Cloud Identity Service (5).
Prerequisites

Knowledge of the design, development, and operation of APIs is helpful.

From a technical point of view, you need one of the following:

 SAP BTP trial account

 Free tier model for SAP BTP

 Enterprise SAP BTP account

The SAP Integration Suite must be provisioned in your subaccount. The following capabilities must be activated in the SAP Integration Suite:

 API Management

 Cloud Integration

The appropriate role collections must then be assigned to the user. We will
build on this later in the exercises.

Set up the prerequisites for your SAP BTP trial account

If you decide to proceed using an SAP BTP trial account, the following blog post will show you how to set up an SAP BTP trial account and how to enable SAP Integration Suite: SAP BTP Trial Account Creation and Enabling Integration Suite service.

Please make sure that everything is set up accordingly before proceeding.

Create an Account on the SAP Gateway Demo System (ES5)

Business Scenario

The SAP Gateway Demo System is based on SAP NetWeaver AS ABAP 7.51. It is used, for example, to try OData services. Various sample services are implemented for this purpose. These services are accessible via the internet. In this exercise, the GWSAMPLE_BASIC service is used. This sample service is based on the Enterprise Procurement Model (EPM).

 Documentation of GWSAMPLE_BASIC: Sample Service - Basic

 Documentation of other services: New SAP Gateway Demo System available | SAP Blogs

Task Flow

In this exercise, you perform the following tasks:

1. Create a new SAP Gateway Demo System Access.

2. Go to SAP Gateway SAP GUI for HTML.


3. Change the password.

4. Open the SAP Fiori launchpad Menu.

Find a matching tutorial: Create an Account on the SAP Gateway Demo System | Tutorials for SAP Developers

Prerequisites

 A browser and internet access.

 You need an SAP account. You created one in the previous task.

 The description is for those who do not yet have a free account on
the SAP Gateway Demo System (ES5).

Outcome After This Exercise

A working account in the SAP Gateway Demo System (ES5) with which
you can consume OData APIs based on the EPM SalesOrder model.

What Do You Learn with This Exercise

You learn how to create a free account in the SAP Gateway Demo System
(ES5).

Task 1: Create a New SAP Gateway Demo System Access

Steps

1. Create a new SAP Gateway Demo System Access.

1. Open a fresh Chrome browser window.

2. Go to https://register.sapdevcenter.com/SUPSignForms/.

3. Enter your S-User or SAP ID User/E-mail to register.

4. Your registered SAP ID Service account retrieves your user details.

5. Check the I have read and understood the Terms and Conditions box.

6. Choose the Register button.

7. You will receive a success message for your registration via e-mail.

8. Do not close this successful registration step page.

9. On that page, select Show password and write it down.


10. Go to https://sapes5.sapdevcenter.com and log in with your credentials.

11. Change your login credentials as necessary.

12. Congratulations, you have successfully registered on the ES5 Gateway System.

Task 2: Open the SAP Fiori Launchpad Menu

Steps

1. Go to the SAP Easy Access page.

1. Open https://sapes5.sapdevcenter.com/sap/bc/gui/sap/its/webgui# and log in with your user and password.
2. Check that you have successfully logged on to the Gateway
Demo System.

3. Open the Fiori launchpad.

4. Choose the Manage Products menu entry.

5. Enter HT as ProductID and press Enter to show the available products.

6. You find the following product details:

We use this function to check whether the ProductIDs we use later are present, in order to verify the processing.

Log in to your SAP Integration Suite

Business Scenario

For an integration developer to work with the SAP Integration Suite, configuration steps within the SAP BTP cockpit are necessary.

Task Flow

In this exercise, you perform the following tasks:

1. Log in to your SAP BTP Trial Account.

2. Log in to the SAP Integration Suite.

Prerequisites

As described in the last exercise, an SAP BTP Trial Account with a configured Integration Suite.

Outcome After This Exercise

You can log in to your SAP BTP Trial Account. You can log in to your configured SAP Integration Suite as a developer. You can identify and test the capabilities assigned to your user.

Task 1: Log in to Your SAP BTP Subaccount

Steps

1. Log in to your SAP BTP Trial Account.


1. Open your browser and insert the given training URL: https://cockpit.hanatrial.ondemand.com/trial/#/home/trial.

2. Log in with your user and password when prompted.

3. Click on the "Go To Your Trial Account" button.

2. Log in to your SAP BTP Trial Subaccount.

1. Click on the tile called "trial". The subaccount overview will appear.
Task 2: Log in to the SAP Integration Suite

Steps

If you have only just created your SAP BTP Trial Account, no API proxies and integration packages are visible when you call up the corresponding capability.

1. Log in to your SAP Integration Suite application.

1. In your SAP BTP subaccount, navigate to Services → Instances and Subscriptions and click on the link Integration Suite.
2. Open the Cloud Integration capability.

1. Open Design → Integrations and APIs at the navigation bar.

3. Open API Management.

1. Open Configure → APIs at the navigation bar.

Introducing a First Approach to an API-First Architecture

The API First Approach

In this lesson, the following topics are discussed:

 What is an API First Approach?

 What are APIs?

 Types of APIs.

 Interface types mainly used in SAP.


 How to design APIs with description languages.

 Fulfill the contract between API Provider and API Consumer.

What Is an API First Approach?

The API First Approach is an approach to software design that focuses on the API to create applications that can be easily connected to each other. API First creates ecosystems of applications that are modular, reusable, and extensible, just like Lego bricks.

An API First Approach means that your APIs are treated as first-class citizens. Everything revolves around the end product being used by mobile devices and client applications. An API First approach involves developing APIs that are consistent and reusable. This is achieved by using an API description language.

Explanation:

API Provider

An API provider provides an interface with technical features (No. 1). Symbolically, this is shown in this picture with the ball (lollipop) notation. This interface can be consumed by a service (No. 5). The representation corresponds to the socket notation (No. 2). The service acts in the role of the API consumer.

API Consumer

The API consumer requires an interface (No. 3). The representation corresponds to the socket notation (No. 4). In this case, the API provider is the service (No. 5). The representation is again made with the ball (lollipop) notation.

Find More Information at:

 Understanding the API-First Approach to Building Products.

 It is not Cloud first or API first but Strategy first. API Management
Strategy in Multicloud Environments | SAP Blogs.

What Are APIs?

API stands for Application Programming Interface. An API specifies the operations as well as the inputs and outputs of a software component. It defines functionalities that are independent of their respective implementations, so that these implementations can vary without affecting the user of the API.

Types of APIs

In the literature, there are many overviews of different types of APIs. Below is a simple overview of the APIs that we must understand for this course.

Here, four different APIs are defined under the superset of APIs, based on
their use.
Data-based APIs

These are intended for file exchange between systems. Files can be, for
example, configuration files.

Object-oriented APIs

These are used in object-oriented programming languages to define the communication of the classes with each other.

Remote APIs (No. 1)

This includes today's important web APIs, such as REST and SOAP APIs.
REST itself is not a protocol but a software architectural style.

Messaging APIs (No. 2)

These are asynchronous APIs that push messages when events occur. They are used in event-driven architectures.

Interface Types Mainly Used in SAP

Technically, a few protocols have been agreed upon. The following overview figure shows the APIs with their relationship to each other.

These are at SAP:

 REST APIs, like RESTful HTTP APIs:

o OData 2.0.

o OData 4.0 (SAP Graph is based on OData v4).

 Remote APIs, like SOAP APIs.

 Messaging APIs, like Event APIs.

The representation formats are either JSON or XML.

How to Design APIs with Description Languages

SOAP-based APIs are described with the Web Services Description Language (WSDL). It is an XML-based interface description language that is used for describing the functionality offered by a web service.

You will find more information about WSDL here: Web Services Description
Language

REST-based APIs can be created with the following main description languages:

 RAML

 OpenAPI

RAML

RAML is a YAML-based language for describing RESTful APIs.

You will find more information about RAML here: https://raml.org/

OpenAPI

The OpenAPI specification, previously known as the Swagger specification, is a specification for a machine-readable interface definition language for describing, producing, consuming, and visualizing RESTful web services.

You will find more information about OpenAPI here: openapis.org

The OpenAPI specification is used in API management.
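For illustration, a minimal OpenAPI 3.0 document might look like the following. The product API shown is a hypothetical example, not an SAP-delivered specification.

Code Snippet

openapi: 3.0.0
info:
  title: Product API   # hypothetical sample API
  version: 1.0.0
paths:
  /products/{productId}:
    get:
      summary: Read a single product
      parameters:
        - name: productId
          in: path
          required: true
          schema:
            type: string
      responses:
        '200':
          description: The requested product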

Fulfill the Contract Between API Provider and API Consumer

The description in RAML or OpenAPI is the contract between the API provider and the API consumer. The API must be implemented along this contract.

Two cases are possible:

Implementation - First Approach

The implementation is first created on the part of the API provider. The description (contract) is then generated automatically. This is used by the API consumer.

Contract - First Approach

The API description is first created with RAML or OpenAPI. Generators automatically create the rudimentary implementation for the API provider and consumer. These rudimentary implementations in different programming languages and concepts must then be fully implemented.
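To illustrate this generation step, the open-source OpenAPI Generator can produce such server and client stubs from a contract; the file name api.yaml and the output folders below are hypothetical.

Code Snippet

# Generate a TypeScript client stub (API consumer) from the contract
openapi-generator-cli generate -i api.yaml -g typescript-fetch -o client/

# Generate a Node.js server stub (API provider) from the same contract
openapi-generator-cli generate -i api.yaml -g nodejs-express-server -o server/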
Summary

In an API First Approach, only the provided APIs are of interest. By using standardized communication protocols and concepts, such as REST and OData, almost any use case can be mapped across all borders. Communication takes place between an API provider, who makes the interface available, and the API consumer, who consumes this interface. The APIs are described with standardized description languages, such as RAML or OpenAPI. The implementation can be created first at the API provider, with the description generated from it (implementation-first approach), or the other way round, by first creating the description and then automatically creating server and client stubs from it (contract-first approach). We see that SOAP APIs, OData 2.0 and 4.0 APIs, as well as SAP Graph play the leading role in the SAP universe.

Explaining the Clean Core Approach

Clean Core Approach

The Clean Core Concept

To thrive in the digital era, organizations must adapt to changing business environments and embrace new capabilities.

Organizations depend on IT to deliver the capabilities that drive their strategic initiatives. Flexibility and speed in supporting strategic changes or fostering innovation are crucial. However, legacy systems burdened with significant technical debt can hinder organizational agility. Modern enterprise resource planning (ERP) software now serves as a dynamic, evolving platform, offering extended capabilities such as built-in insights, automation, workflows, standard integrations, and easy extensibility. These features provide distinctive advantages without the encumbrance of technical debt. To apply these new technologies and enable business evolution, organizations must address the complexities of their legacy systems.

Organizations depend on IT to deliver the capabilities that drive their strategy.

IT must deliver essential capabilities while maintaining organizational agility. The primary goal of IT is to enhance an organization’s competitive edge by equipping the business with the right technological capabilities. Historically, this has led to variations within ERP systems, such as modifications in data, processes, integrations, extensions, and code. In some instances, these changes were crucial for providing business-critical capabilities or integrating disparate systems. However, some changes did not yield valuable outcomes. Regardless of their necessity, the methods used to extend standard functionality often introduced technical debt, demanding significant maintenance efforts.

Changes in both the business and technology landscapes are compelling organizations to address legacy complexities. Disruptions in global supply chains, evolving customer preferences, and shifting employee dynamics require businesses to adapt rapidly to new demands. Technological advancements are delivering new capabilities at an accelerated rate. However, significant technical debt hinders the adoption of these new technologies, with 10 to 20 percent of the technology budget for new products being diverted to resolving issues related to tech debt. This diversion limits the ability to respond effectively to emerging business requirements.

A "core" serves as the foundation of IT's ability to support and enable the
strategy.

It pertains to the dimensions used to deliver capabilities through an ERP system. We consider six dimensions when discussing an organization's core. These technical and procedural aspects work together to equip your business with the capabilities needed to achieve desired outcomes.
The clean core approach aims to create modern, flexible, and cloud-
compliant ERPs. Achieving a clean core involves integrating and extending
a system to ensure it aligns with cloud compliance standards, while
maintaining effective governance of master data and business processes.

A common misconception is that a clean core means a system free of core customization. In reality, a truly "clean" core adheres to standardized guidelines for all its elements. This adherence ensures that when system upgrades are necessary, changes can be implemented with minimal manual effort for testing and adapting existing structures.

Organizations can find it challenging to achieve a perfectly clean core. However, the more they can integrate these elements into their landscape, the greater the benefits they experience in business performance and cloud delivery.

Elements and Criteria for Clean Core

A clean core enhances current operations and establishes a solid foundation for the future.

Adhering to standard guidelines for innovation enables the creation of a competitive edge while sidestepping technical debt. Introducing new capabilities into the organization often yields benefits for both its top and bottom lines. Organizations operating within standard environments can quickly and affordably adopt new capabilities compared to those deviating from standard practices. The benefits projected from the new capabilities are realized more rapidly and extensively when the core is clean. Establishing a clean core, whether in readiness for transitioning to the cloud or already within it, optimizes the advantages of cloud delivery. Transitioning to a clean core necessitates an overarching strategic direction.

Organizations must start by comprehending the extent of necessary changes and the urgency with which they must be implemented.

Amount of Change: Organizations with a significant demand for new capabilities to meet business requirements can contemplate a project to transition to a modern ERP system. Conversely, those already equipped with adequate capabilities prioritize optimization or innovation within the existing environment.

Required Speed of Change: Organizations requiring rapid innovation to adapt to market dynamics need to transition to modern platforms for enhanced capabilities. Conversely, those facing a less urgent need for change seek to enhance access to existing capabilities. These considerations dictate whether the focus can be on optimizing the current core, migrating to a less complex landscape, embarking on a complete transformation with a new greenfield system, or innovating differentiating capabilities beyond the core. Grasping this overarching strategic direction will guide the precise actions needed to enhance agility in the near future and ready the organization for forthcoming initiatives like transitioning to the cloud. Given that each organization varies in its level of standardization, we advise collaborating with SAP to identify the most suitable transformation approach for your specific needs.

Once the strategic direction is defined, organizations need to initiate action.

Certain organizations achieve cleanliness through migration transformation, while others do so through new implementations. Regardless of the approach, maintaining cleanliness requires establishing robust governance. Transitioning to and sustaining a clean core demands dedication from both the business and IT sectors.

Recognizing the potential value will bolster investment in this initiative.

Through collaboration among business, IT, and partners, achieving a clean core becomes achievable. SAP provides a proven methodology to assist organizations in comprehending both business and technology imperatives, along with the complete spectrum of transformation benefits. Our process involves benchmarking business Key Performance Indicators (KPIs) against industry peers and evaluating digital maturity compared to industry standards to offer recommendations on areas to prioritize, along with the necessary enabling capabilities. We work together with you to articulate the qualitative and quantitative benefits of addressing identified gaps.

Tackling the Clean Core is a continuous strategic endeavor.


Conclusion: 'CLEAN CORE' is a method aimed at achieving and preserving
the cleanliness of an organization's core enterprise management systems
to enhance 'maintainability' and reduce the total cost of ownership (TCO).
This encompasses activities across software, data, interfaces, processes,
and operations.
Describing Common Uses of Metadata in XML

Common Uses of Metadata in XML

Metadata in XML

The metadata functionality serves several important functions, especially in contexts where data must be understood, processed, or shared by different systems and users. Here are the common uses of metadata in XML.

Descriptive Information

Purpose: To provide a high-level description of the XML document.

1. Author: Identifies the creator of the document.

2. Title: Provides a brief title or name for the document.

3. Description: Offers a summary of the document's content and purpose.

4. Keywords: Lists keywords relevant to the document, aiding in search and categorization.

Structural Information

Purpose: To describe the structure and format of the XML document.

1. Schema Location: Points to the schema file (XSD) that defines the
structure of the document.

2. Namespaces: Declares the XML namespaces used in the document, ensuring element and attribute names are unique and avoiding conflicts.
Administrative Information

Purpose: To manage and control the usage and versioning of the document.

1. Version: Specifies the version of the document.

2. Creation Date: Indicates when the document was created.

3. Modification Date: Records the last time the document was modified.

4. Access Rights: Defines who can access or modify the document.

Technical Information

Purpose: To provide technical details that assist in the processing of the document.

1. File Size: Indicates the size of the document.

2. Format: Specifies the format or encoding used in the document.

3. Checksum: Provides a checksum value for verifying document integrity.

Provenance Information

Purpose: To track the origin and history of the document.

1. Source: Indicates the source from which the document originated.

2. History: Logs changes and updates made to the document over time.
Rights Management

Purpose: To manage intellectual property rights and usage terms.

1. License: Specifies the licensing terms under which the document can be used.

2. Copyright: States the copyright holder and related information.

3. Usage Restrictions: Lists any restrictions on the use or distribution of the document.
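As an illustration, the following XML fragment embeds several of these metadata categories in one document; apart from the Dublin Core namespace, the element names are hypothetical rather than a fixed standard.

Code Snippet

<?xml version="1.0" encoding="UTF-8"?>
<document xmlns:dc="http://purl.org/dc/elements/1.1/">
  <metadata>
    <dc:creator>Jane Doe</dc:creator>          <!-- descriptive: author -->
    <dc:title>Quarterly Report</dc:title>      <!-- descriptive: title -->
    <version>1.2</version>                     <!-- administrative: version -->
    <created>2024-01-15</created>              <!-- administrative: creation date -->
    <dc:rights>All rights reserved</dc:rights> <!-- rights management: copyright -->
  </metadata>
  <body>...</body>
</document>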

Enhancing Metadata in OData Services

To address the challenges we've encountered, one proposed solution is to extend the $metadata endpoint to function as an OData service itself.

To use the $metadata functionality, use the following GET URL, where you
need to insert your individual parameters as follows:

Code Snippet

http://<yourAPI>:<PORT>/$metadata
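For example, the metadata document of the SAP Gateway demo service used later in this course can be requested with any HTTP client; here with curl (single quotes keep the $ literal, and the demo system asks for your ES5 user):

Code Snippet

curl -u '<your-ES5-user>' 'https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/$metadata'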
Summary

Metadata in XML is essential for providing context, structure, and control over XML documents. It enhances the document's usability by offering descriptive, structural, administrative, technical, provenance, and rights management information. By embedding metadata within XML, users and systems can better understand, manage, and use the data effectively.
Describing Operating Modes of API Architectures

Operating Modes of API Architectures

In this lesson, the following topics are discussed:

 Request-driven architecture.

 Event-driven architecture.

 Combination of request-driven and event-based architecture.

What Is a Request-Driven Architecture?

A request-driven architecture is based on direct communication between a service provider and a service consumer. This communication is synchronous.

Sample Request

A sample SOAP request can look like the following:
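Since the original figure is not reproduced here, the following is a minimal illustrative SOAP 1.1 request for a hypothetical GetProduct operation; the namespace and payload are invented for illustration.

Code Snippet

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header/>
  <soap:Body>
    <GetProduct xmlns="http://example.com/products">
      <ProductID>HT-1000</ProductID>
    </GetProduct>
  </soap:Body>
</soap:Envelope>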

Sample Response

A sample SOAP response can look like the following:
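A matching illustrative response for the same hypothetical operation could be:

Code Snippet

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetProductResponse xmlns="http://example.com/products">
      <ProductID>HT-1000</ProductID>
      <Name>Notebook Basic 15</Name>
      <Price currency="EUR">956.00</Price>
    </GetProductResponse>
  </soap:Body>
</soap:Envelope>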

Event-Driven Architecture

What are events?

From a technical perspective, an event is a small message that provides information about a business occurrence. An event could be, for example, the creation of an order in an SAP S/4HANA system. The event is fired via push (asynchronously) to a broker.

An event can look like the following:
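The figure is not reproduced here; as an illustration, an event in the CloudEvents JSON format (which SAP business events align with) might look like the following, with all values hypothetical:

Code Snippet

{
  "specversion": "1.0",
  "type": "sap.s4.beh.salesorder.v1.SalesOrder.Created.v1",
  "source": "/default/sap.s4.beh/XYZ",
  "id": "a823e884-5edc-4194-9b01-1e7b3f0a3f87",
  "time": "2024-01-15T09:30:00Z",
  "datacontenttype": "application/json",
  "data": {
    "SalesOrder": "0000012345"
  }
}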

Event Producer at SAP

SAP offers the Event Enablement add-on in ECC and SAP S/4HANA (Cloud
and On-Premise) to support you.

What is an event-driven architecture?

The event-driven architecture (EDA) is a software design pattern in which decoupled applications can publish and subscribe asynchronously to events via an event broker. An event broker is a modern, message-oriented middleware, for example, Event Mesh at SAP.

Read more here: What Is EDA (Event-Driven Architecture)?

Pull Variant

The event provider (No. 1) fires an event (No. 2) with a designation called Topic (No. 3). A topic can be, for example, BusinessPartner_Changed. This is an asynchronous communication. The topic is subscribed to by a queue (subscribe to topic), in this case, by queue A (No. 4). The message can now be actively picked up by the event consumer (No. 5). For this purpose, the Event Mesh provides an API. The communication comes from the event consumer, triggered by a pull on the queue. This is a synchronous communication.

Push Variant

The event provider (No. 1) fires an event (No. 2) with a designation called Topic (No. 3). A topic can be, for example, BusinessPartner_Changed. This is an asynchronous communication. The topic is subscribed to by a queue (subscribe to topic), in this case, by queue B (No. 4). A webhook is now assigned to this queue (No. 5). If an event with a topic arrives in the corresponding queue, the webhook is called up and the event is sent directly to the event consumer (No. 6) via push. The communication comes from the Event Mesh, triggered by an incoming event. This is also an asynchronous communication.

What Are Webhooks?

A webhook is an HTTP callback: an HTTP POST that occurs when a change in state is made, that is, an event notification via HTTP POST. Webhooks are used for real-time notifications, so that your system can be updated directly at the time of the event. Basically, a webhook is a simple URL, which you can also call up regularly in the browser. In the context of the SAP Event Mesh, the webhook URL is used when subscribing. This allows the service to know where to send the message with the subscribed topic.

In combination with the Event Mesh, a webhook URL is included in the subscription of a topic to a queue.
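To make the webhook idea concrete, the following is a minimal sketch of an HTTP endpoint that could receive such pushed events. It uses only Node's built-in http module; the port and path are arbitrary examples.

Code Snippet

import { createServer } from "node:http";

// Minimal webhook receiver: the broker POSTs each event to this URL.
createServer((req, res) => {
  if (req.method === "POST" && req.url === "/webhook") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      console.log("Received event:", body); // process the pushed event here
      res.writeHead(204).end();             // acknowledge receipt
    });
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);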

Combination of request-driven and event-based architecture

In reality, classic requests and events are combined. No. 1 to No. 6 work as described in a previous section, but what is new is that the event consumer (No. 7) submits a request to the event producer (No. 1), for example, to read the product data that has been changed. The event consumer can then process the dataset.

Summary

There are two types of operating the API First Approach. The one, request-
driven, is based on synchronous communication of the partners involved.
The second, event-driven, works asynchronously and decouples the
transmitter from the receiver in terms of time and content. This loose
coupling offers many advantages. Both types are often used one after the
other.

Using Common APIs

Introduction to REST

In this lesson, the following topics are discussed:

 What is REST?

 Realization as a Web API.

What Is REST?

Generally, REST describes a machine-to-machine interface. In web development, REST allows content to be rendered when it's requested, often referred to as Dynamic Content. RESTful Dynamic Content uses server-side rendering to generate a website and send the content to the requesting web browser, which interprets the server’s code and renders the page in the user’s web browser.
Architectural properties

The constraints of the REST architectural style affect the following architectural properties:

 Performance in component interactions, which can be the dominant factor in user-perceived performance and network efficiency.

 Scalability, allowing the support of large numbers of components and interactions among components.

 Simplicity of a uniform interface.

 Modifiability of components to meet changing needs (even while the application is running).

 Visibility of communication between components by service agents.

 Portability of components by moving program code with the data.

 Reliability in the resistance to failure at the system level in the presence of failures within components, connectors, or data.

Architectural Constraints

The REST architectural style defines the following six guiding constraints:

 Client–server architecture

 Stateless

 Cache-ability

 Layered system

 Code on demand (optional)

 Uniform interface

Realization as a web API

Web service APIs that adhere to the REST architectural constraints and
properties are called RESTful APIs. HTTP-based RESTful APIs are defined
with the following aspects:

 A base URL, such as http://api.example.com/.

 Standard HTTP methods (for example: GET, POST, PUT, and DELETE).

 A media type that defines state transition data elements (for example, Atom, microformats, application/vnd.collection+json). The current representation tells the client how to compose requests for transitions to all the next available application states.
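As a short illustration of these aspects, the following sketch calls a hypothetical RESTful API with standard HTTP methods, using the base URL from the list above (fetch is built into Node 18+; the paths and payload are invented):

Code Snippet

// Read a resource (GET) and create one (POST) on a hypothetical RESTful API.
const baseUrl = "http://api.example.com";

const res = await fetch(`${baseUrl}/products/HT-1000`); // GET: read a resource
const product = await res.json();
console.log(product);

await fetch(`${baseUrl}/products`, {                    // POST: create a resource
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ id: "HT-1001", name: "New Product" }),
});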
Sources

A great deal of information can be found on the internet in the form of texts, videos, podcasts, and more about REST. There are also various books about this topic from different perspectives:

 Representational state transfer

 REST Services in ABAP

Summary

A RESTful web API is created with the REST software architecture style.
The interface must correspond to both the architectural properties and the
constraints. This results in an implementation that, in addition to a
base URI, uses standard HTTP methods and supports many media types.
This places RESTful Web APIs at the heart of an API First Approach
(architecture).

Introduction to OData

In this lesson, the following topics are discussed:

 What is OData?

 Architectural constraints

 Tutorial: Learn about OData Fundamentals

What Is OData?

In computing, Open Data Protocol (OData) is an open protocol that allows the creation and consumption of queryable and interoperable RESTful APIs in a simple and standard way. OData builds on HTTP, AtomPub, and JSON using URIs to address and access data feed resources. It enables information to be accessed from various sources including (but not limited to) relational databases and file systems.

Architectural Constraints

The following constraints must be fulfilled:

 Resource identification

 Fixed documents

 The service document

 The metadata document

 Dynamic resources

 Resource operation
 Querying

 Resource representation

The following are explanations about the constraints:

Resource identification

OData uses URLs to identify resources. We use the following base URL:

https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/

Fixed documents

The following fixed resources can be found:

The service document

The service document lists entity sets (collections), functions, and singletons that can be retrieved. Clients can use the service document to navigate the model in a hypermedia-driven fashion. The service document is accessed directly with the base URL: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/. Below is an example from a later exercise.

The metadata document

The metadata document describes the types, sets, functions, and actions understood by the OData service. Clients can use the metadata document to understand how to query and interact with entities in the service. The metadata document is available at: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/$metadata.

Dynamic resources

The URLs for the dynamic resources can be computed from the hypermedia information in the service and metadata documents. The data feed for the ProductSet collection also contains links to other entities. The URL is as follows: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet
Resource operation

OData uses the HTTP verbs to indicate the operations on the resources.
This is a REST aspect, as we have already seen.

Querying

URLs requested from an OData endpoint can include query options. The OData protocol specifies various system query options that endpoints can accept; they can be used to filter, order, map, or paginate data. In the following, only the product with the product number HT-1000 is retrieved. The URL is: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet('HT-1000')
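Several system query options can be combined on the same endpoint. The following URLs illustrate common options against the same demo service (the filter value Notebooks is an illustrative category):

Code Snippet

# Only the first two products
https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet?$top=2

# Products filtered by category
https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet?$filter=Category eq 'Notebooks'

# Number of entries in the collection
https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet/$count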
Resource representation

OData uses different formats for representing data and the data model.
In OData protocol version 4.0, JSON format is the standard for
representing data, with the Atom format still being in the committee
specification stage. For representing the data model, the Common
Schema Definition Language (CSDL) is used, which defines
an XML representation of the entity data model exposed
by OData services.

 In JSON

The URL is: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet('HT-1000')?$format=json

 In XML

The URL is: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet('HT-1000')?$format=xml

Tutorial: Learn about OData Fundamentals

There is a fantastic tutorial, Learn about OData Fundamentals, with the following topics:

 Where OData came from and why it’s designed the way that it is.
 What the standard OData operations are and how they relate
to HTTP.

 What the public Northwind OData service has to offer.

 What OData service documents and metadata documents describe.

 The basics of OData entity types, sets, and relationships.

It is recommended that you work through this tutorial to get fully acquainted with OData at SAP.

Summary

The OData protocol describes a RESTful Web API. In addition to the REST principles, it also offers other advantages that are beneficial in the professional environment. The interface itself provides two metadata documents, the service document and the metadata document, which comprehensively describe the interface. A powerful feature is the ability to filter, search, and execute functions such as $count via query parameters on the interface. The representation takes place either in XML or JSON.

Explore the APIs From the SAP Gateway Demo System

Business Scenario

In this exercise, we examine the GWSAMPLE_BASIC service. This sample service is based on the Enterprise Procurement Model (EPM). We want to use the APIs directly to find the customerID and the associated address data for the productID HT-1000.

Documentation of GWSAMPLE_BASIC: Sample Service - Basic

Task Flow

In this exercise, you perform the following tasks:

1. Check if the productID HT-1000 is available.

2. Get the Sales Order IDs to the productID HT-1000.

3. Find the respective customer for each order of the productID HT-1000.

Prerequisites

You have access to the SAP Demo Gateway system ES5.

Outcome After This Exercise

You check whether all OData APIs required later can be called up and that
technical data, for example, the productID HT-1000, is available.
What Do You Learn Within This Exercise

You get to know the OData APIs to be used later. You understand the
OData model.

Task 1: Check If the ProductID HT-1000 Is Available

Steps

1. Check if the productID HT-1000 is available.

1. Choose the following link: Check for HT-1000

2. If the productID exists, you get a data record to this productID.

3. Check out the response.

Task 2: Get the Sales Order IDs to the ProductID HT-1000

Steps

1. How many sales orders are there for the product HT-1000?

1. Choose the following link: Count the SalesOrders

2. Check the result:

Note

It is possible that the count gives you a different value than the one shown
in the screenshot. The reason for that is that the system is reloaded
cyclically with products.
2. Find the Sales Order ID and Item Position for the productID HT-1000.

1. Choose the following link: Check SalesOrderID and ItemPosition

2. The OData navigation property ToHeader is used to find out the customerID. Therefore, we need the SalesOrderID and ItemPosition for every dataset entry.

Task 3: Find the Respective Customer for Each Order of the ProductID HT-1000

Steps

1. Find the CustomerIDs for all SalesorderIDs and ItemPositions.

1. Choose the following link: Check the CustomerID

2. You get a dataset with the CustomerID.

2. Find the address data to one customer.

1. Choose the following link: Check the communication data

2. You get the address data for notification for every customer.

SAP Graph

In this lesson, the following topics are discussed:

 What is SAP Graph?

 SAP Graph is a Data Graph

 Sample Data Graph

 Developing SAP Graph APIs

What Is SAP Graph?

SAP Graph is a unified API for SAP, using modern open standards like
OData v4. SAP Graph is one connection to your business data. SAP Graph
introduces a new, unified API to access all business data as a single,
semantically connected, Business Data Graph.

Summary:
 SAP Graph is based on OData v4.

 Any SAP backend can be used as a data supplier.

 There are already fully implemented APIs at: https://api.sap.com/graph.

 You can create your own APIs with different procedures.

SAP Graph Is a Data Graph

Data graphs support queries that explore the data and the relationships.

A data graph represents entities (data objects) as nodes of a graph:

 Entities are grouped in namespaces.

 The fields of an entity are called attributes.

Edges represent semantic relations:

 Between a root-node and its sub-nodes: a composition.

 Between independent nodes: an association.

Sample Data Graph

An example should illustrate the idea. An SAP Graph API with the name product, using the namespace sap.graph, is linked to product entities from SAP S/4HANA (No. 2) and from the SAP Sales Cloud (No. 3). The new API thus offers an extended view of product data stored in various SAP systems.
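A consumer would then address the unified entity rather than each backend. An illustrative request might look like the following; the host and path prefix are placeholders that depend on your landscape:

Code Snippet

GET https://<your-graph-endpoint>/sap.graph/Product

Here, sap.graph is the namespace and Product the entity from the example above.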

Developing SAP Graph APIs

There are two options to create an SAP Graph API. You can create APIs
directly via an implementation with the SAP Graph module in Node.js.
Under Resources, you will find 26 video tutorials that show all the development steps in detail. A second approach, from the field of low-code, is offered via SAP API Management.

Resources
 Blogs: SAP Graph Multi-Part Tutorial: Information Map

 SAP Graph Community: Graph

Summary

SAP Graph is an API based on OData v4 that connects entities from different sources in one API. For example, product data from various SAP systems, such as SAP S/4HANA, Sales Cloud, and others, is offered in one API. There are already standard SAP Graph APIs for various entities. SAP Graph APIs can be programmed with Node.js as well as created via SAP API Management in low-code mode.
Introducing iPaaS

Describing the Integration Strategy of SAP

The Integration Strategy of SAP

In this lesson, the following topics are discussed:

 The integration strategy for intelligent enterprises of SAP.

 How does SAP fulfill these promises?

SAP’s Integration Strategy for Intelligent Enterprises

The integration strategy is based on four principles:

No. 1: Predefined integration

SAP has outlined its strategy and road map for the integration of end-to-
end processes of its intelligent suite (SAP software to SAP software) based
on well-defined suite qualities. As an example, the alignment of domain
models helps to ensure that master data can be exchanged in an efficient
and convenient way between SAP applications, including prebuilt
integrations in SAP Business Accelerator Hub.

No. 2: Open integration

Besides integrations of SAP software to SAP software and SAP software to partner software, SAP is open for any third-party integration as well as for custom extensions that leverage public APIs. With the Open Connectors capability of SAP Integration Suite, SAP provides feature-rich, prebuilt connectors for more than 170 third-party applications.

No. 3: Holistic integration

SAP provides a holistic integration technology portfolio that covers most flavors of integration required in cloud and hybrid landscapes. Based on SAP Integration Suite, SAP supports various integration use cases, ranging from process, data, user, and "thing" to analytics-centric integration.

No. 4: AI-driven integration

In addition to bringing intelligence into core business processes, SAP is using AI techniques to simplify the development of integration scenarios. One example is the integration advisor capability of SAP Integration Suite.

How Does SAP Fulfill These Promises?

If we look at the positioning of SAP BTP, we see that one of the most
important pillars is integration.

Integrate applications on-premise, in the cloud, or in a hybrid model, while securely connecting applications, processes, and people.

 Integration of SAP and beyond to include third parties, including API management, B2B/B2G support, data integration, event-based integration, IoT support, and process integration.

 Ready-built content that includes integration packs, APIs, business events, and connectors.
 Continuous access to best-practices through pre-packaged SAP
business content.

 AI-powered content advisor to speed integration development and lower ongoing support costs.

Summary

SAP wants to offer its customers a comprehensive and secure integration solution. This is based on four principles. These principles are as follows:

1. Predefined integration

2. Open integration

3. Holistic integration

4. AI-driven integration

This claim is fulfilled by services on SAP BTP.

Positioning of the Integration Suite From a More Technical Perspective

Positioning of the Integration Suite From a More Technical Perspective

In this lesson, the following topics are discussed:

 Positioning of the Integration Suite from a more technical perspective.

 Sources.

Positioning of the Integration Suite from a more technical perspective

SAP Integration Suite is the toolkit recommended by SAP to simplify and accelerate integration for SAP, partner, and third-party integration scenarios. The term iPaaS (Integration Platform as a Service) was also coined for this purpose.

The following shows the SAP Integration Suite with its core capabilities.
The capabilities are as follows:

Integration Assessment

Adopt a structured and well-directed method for designing and implementing your enterprise integration strategy.

1. Enhance integration through optimal best practices: Streamline integration using a structured approach grounded in the SAP Integration Solution Advisory Methodology (ISA-M).

2. Access recommendations for integration: Utilize the latest SAP integration technology recommendations to streamline processes and automate tasks.

3. Produce documentation: Enhance communication between project teams and system integrators by thoroughly documenting your integration strategy and infrastructure.

4. Engage in collaboration with the business: Respond more quickly with a built-in request feature from the business and manage changes more efficiently.

Cloud Integration

Develop and execute integration flows across cloud, on-premise, and hybrid environments for application-to-application (A2A), business-to-business (B2B), and business-to-government (B2G) scenarios.

1. Utilize over 3,200 prebuilt integrations: Streamline integration between SAP and third-party applications and data sources.

2. Accelerate your pace with a web interface assisted by AI: Efficiently design and oversee intuitive integration flows with speed and simplicity.

3. Speed up interface implementations: Utilize crowd-sourced message-mapping recommendations.

4. Revamp integration in the cloud: Transition away from legacy on-premise integration tools like SAP Process Orchestration.

API Management

Ensure the security, governance, and transformation of your APIs through management and delivery processes, featuring an intuitive catalog, comprehensive documentation, and policies and protocols that foster innovation while protecting against threats.

1. Standardize your APIs: Establish consistent URLs and APIs by utilizing your own domain, and expand the prebuilt SAP data model with Graph to facilitate connections between SAP and third-party systems.

2. Safeguard your APIs: Defend against security threats, handle traffic, and cache data at the edge using over 40 preconfigured policies (an example policy follows this list).

3. Release and oversee your APIs: Drive innovation at a rapid pace with user-friendly APIs that can be swiftly published and managed under your own domain and branding.
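As an illustration of such a policy, rate limiting is configured in SAP API Management with an Apigee-style XML policy document; the values below are an arbitrary example, not a recommended setting.

Code Snippet

<!-- Quota policy: allow at most 100 calls per minute per app (illustrative) -->
<Quota async="false" continueOnError="false" enabled="true" type="calendar" xmlns="http://www.sap.com/apimgmt">
  <Allow count="100"/>
  <Interval>1</Interval>
  <TimeUnit>minute</TimeUnit>
  <StartTime>2024-01-01 00:00:00</StartTime>
</Quota>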

Integration Advisor

Leverage a crowdsourced machine learning method to tackle major challenges encountered in business-to-business (B2B), application-to-application (A2A), and business-to-government (B2G) integration scenarios.

1. Collaborate with standardized message formats: Enable the support of various business partners utilizing diverse industry-standard message formats.

2. Achieve greater speed with message type definitions: Reduce implementation time by utilizing a library of prebuilt industry-standard message type definitions.

3. Streamline message implementation: Establish and document message implementation guidelines tailored to industry and geographic content.

4. Accelerate message mapping: Make the creation of message-mapping artifacts easier with AI-generated mapping proposals.
Trading Partner Management

Develop and maintain trading partner profiles to capture the distinctive business-to-business (B2B) requirements of each partner.

1. Streamline partner management: Simplify the management of B2B integration scenarios involving multiple trading partners.

2. Accelerate partner onboarding: Initiate collaboration with new partners swiftly through an intuitive onboarding process.

3. Create partner agreements: Establish trading partner agreements that delineate relationships and structure pertinent business transactions.

4. Monitor integration with partners: Monitor your integrations with business partners to ensure seamless transactions.

5. Exchange data with partners: Leverage robust cloud integration capabilities to share business data with trading partners.

Open Connectors

Streamline connectivity to over 170 third-party applications and solutions catering to collaboration, messaging, CRM, help desk, and various other scenarios.

1. Achieve swift progress with preconfigured connectors: Streamline, standardize, and expedite connectivity with third-party cloud applications.

2. Utilize RESTful APIs and JSON for your work: Benefit from open data formats, irrespective of the underlying architecture of third-party services.

3. Convert data fields: Apply shared resource definitions from one or multiple third-party applications to a standardized format.

4. Provide support for bulk data operations: Normalize the process of uploading and downloading data, irrespective of the underlying service architecture.

Assess Migration Scenarios

Transition your landscape from SAP Process Orchestration to SAP Integration Suite, ensuring future readiness by migrating to an Integration Platform as a Service (iPaaS). Construct an enterprise landscape that is heterogeneous and hybrid, catalyzing the transformation of the organization into an intelligent enterprise.
1. Optimizing solutions: Evaluate your existing integration landscape
and review the key aspects. Estimate the potential migration effort.

2. Enhance project planning: Set up the PI Migration Assessment feature and facilitate secure connections through internal or external access points. Identify technical hurdles and offer potential solutions.

3. Expertise in implementation: Utilize the migration assessment tool and migration tool to provide SAP users with a modern and user-friendly experience, thereby expediting the migration process.

The add-on capabilities are as follows:

Event Mesh

Enable applications to communicate asynchronously in real-time across distributed landscapes using a fully-managed cloud service that embraces event-driven architectures. Implement scalable event-driven process integration patterns and handle peak loads across environments.

1. Natively respond to application events: React more swiftly to events from core SAP solutions like SAP S/4HANA and SAP SuccessFactors solutions, as well as from third-party sources.

2. Establish connections across different landscapes and geographical locations: Distribute and subscribe to business events originating from both SAP and third-party sources across hybrid, multi-cloud, and edge environments.

3. Effectively handle events with confidence: Administer, oversee, and visualize decentralized event streaming throughout distributed landscapes.

SAP Graph

Unified API for accessing SAP-managed data that can be used to create
new extensions and applications using SAP data.

Cloud Transport Management

Management of software products between accounts in different environments by transporting them along defined transport routes.

On-top capabilities are as follows:

SAP Business Accelerator Hub

Jumpstart for integration projects with APIs, packaged integration content, and adapters.

Resources

 Basic- and Standard Editions: SAP-Integration-Suite

 SAP Discovery Center: SAP Integration Suite in Discovery Center

 SAP Community: SAP Business Technology Platform

 SAP product page - Getting started with SAP BTP: Integration

 Technical point of view: Technical Landscape, Cloud Foundry Environment

Summary
We distinguish between core capabilities, add-on capabilities, and on-top capabilities. The core capabilities are implemented in the Integration Suite. The most important capabilities are API Management and Cloud Integration.
Managing APIs

Introducing SAP API Management

Introduction to SAP API Management

In this lesson, the following topics are discussed:

 What is SAP API Management?

 Typical use cases

 User roles

What is SAP API Management?

SAP API Management is a solution that maps the entire life cycle of an API.

In particular, it offers the following features:

Building APIs

API portal is an application that provides a common platform for API designers to define and publish APIs. Every API Management customer is provided with their own API portal application in the cloud. The API portal offers capabilities to configure systems, build and publish APIs, and analyze and test APIs.

Publishing APIs

A Product is a bundle of APIs. It contains metadata specific to your business for monitoring or analytics. For example, all APIs related to CRM can be bundled as one CRM Product. Instead of publishing APIs individually, it is easier to bundle related APIs together as a Product and publish it. After including the required APIs in a Product, the Product is published to the catalog, where it is available for application developers.

Analyzing APIs

API Management provides comprehensive analytics capabilities to understand the various patterns of API consumption. The API Analytics server uses the runtime data of the APIs to analyze the information. The runtime data is gathered, analyzed, and displayed as charts, headers, and key performance indicators (KPIs).

Consuming APIs
API Business Hub Enterprise is an application that provides a common
platform for application developers to consume APIs. Every API
Management customer is provided with their own API Business Hub
Enterprise application in the cloud. The API Business Hub Enterprise offers
capabilities to onboard application developers, explore and test APIs,
create and subscribe to applications.

Monetizing APIs

SAP API Management provides a monetization feature to all API providers to generate revenue on the use of APIs. As an API admin, you can create a rate plan, attach it to a Product in the API Portal, and publish the product in the API Business Hub Enterprise.

You can also view bill details of each developer in the API Portal. As an
application developer, in the API Business Hub Enterprise, you can create
an application and add products to the application. Based on the product
usage, you can view the corresponding bill details.

Discover API Packages

In API Management, you can discover API packages supported by the API Management platform that are available in the SAP Business Accelerator Hub on the API Portal.

API Designer

Model APIs using the API designer. The API designer is based on the OpenAPI Specification (OAS) standard, which is an open source collaborative project. The API designer allows you to seamlessly create and edit APIs, and view the corresponding documentation in a single window frame. It has rich inbuilt capabilities, which include, but are not limited to, converting APIs from one format to another (for example, from RAML to YAML, or JSON to YAML, and vice versa), generating server and client stubs, downloading API specifications, and so on. The OpenAPI specification that is created with the API designer can be published as an API on the SAP Business Accelerator Hub.

Typical Use Cases

The relevant use cases are as follows.


Enterprise Digital Apps

Build Enterprise Digital Apps for employees (Field Sales/Services/Support), customers, and partners.

Real-time API Integrations

Share enterprise data from data lakes and business systems to suppliers, partners, and customers. Expose enterprise transactions and processes as APIs for suppliers and customers.

Enterprise Microservices

Build and manage API-first microservices. Enable DevOps of microservices.

User Roles

For all subsequent work in SAP API Management, you need the APIPortal.Administrator role collection. An overview of all available roles can be found in the second link under Resources.

Resources

The following resources can be found on the SAP Help Portal:

 Overview Page SAP API Management

 Assignment of User Roles

Summary

With API management, the entire life cycle of an API can be mapped. It
begins with the creation, publication, and maintenance over the entire
term. In an API first architecture, API management is the central building
block and is used in every specific use case of a customer.
Describing the Technology in an Overview
Components of SAP API Management

In this lesson, the following topic is discussed:

Components of SAP API Management

Components of SAP API Management

SAP API Management consists of the following components. In the following figure, you will find the component diagram (that we will also use in later exercises).

The important components are numbered. In the following list, you will
find a first overview:

 No. 1: API Provider - Summarizes many different sources

 No. 2: API - The new API with URL (No. 4)

 No. 3: API Designer - An openAPI definition

 No. 4: The new API URL - Acts as proxy


 No. 5: Policies - Edit the request and response message

 No. 6: Product - A bundle of one or more APIs

 No. 7: Application based on a product

 No. 8: Additional services such as monitoring, testing, and more

 No. 9: Entry in API Business HUB Enterprise

We take a closer look at the individual components in the following lessons.

Resources

The following resources are available on the SAP Help Portal:

 Components of API Management

 SAP Help Portal

Summary

SAP API Management consists of various components that provide different capabilities. The most important ones are the API Provider, API, Product, and API Business Hub Enterprise.

Creating an API Provider


API Provider Creation

In this lesson, the following topics are discussed:

 The role of an API provider

 Procedure for creating an API provider

The Role of an API Provider

An API provider defines the connection details for services running on specific hosts whose details you want to access. Use an API provider to define the following:

 Details of the host that you want an application to reach.

 Any further details that are necessary to establish the connection, for example, proxy settings.
An API provider can connect to the following sources:

 No. 1: Open Connectors

 No. 2: Through Cloud Connector to all SAP On-Prem backends (ECC, SAP S/4HANA On-Prem, PI, PO, and more)

 No. 3: Cloud Integration which delivers an OData or SOAP API

 No. 4: APIs from the Internet

 No. 5: From the SAP Business Accelerator Hub for prototyping

Procedure for Creating an API Provider

The following steps must be carried out in order:

1. Start the wizard by choosing the Create button.

2. Enter a name and description in the Overview tab.

3. Enter the connection data in the Connection tab.

Note

You use your own Host details to connect to your backend system.

The following assignment applies:

 Internet: No. 4

 On-Premise: No. 2

 Open Connectors: No. 1

 Cloud Integration: No. 3

Each type uses different configuration data. A detailed list of the parameters that must be set can be found at: Create an API Provider

4. Enter the Catalog Service Settings data in the Catalog Service Settings tab.

The path information (No. 1) is standardized in SAP S/4HANA. The catalog service and path can be found in the transaction /n/IWFND/MAINT_SERVICE on the SAP backend system. Basic authentication is required to access the catalog service.

5. Test your API Provider. When you save the entries, the created API provider can be tested. To do this, use the Save button first. Depending on the Type, a successful test is one of the following:

Type Internet and On-Prem

The HTTP Status code 200 means that the connection to the backend
system is correctly set up:
Type Open Connectors and Cloud integration

The HTTP status code is not 200, but in this case it is correct since this is
only a ping:

Sources

The following sources can be found at the SAP Help Portal: API Providers

Summary

An API provider encapsulates access to APIs from various sources. More than 260 third-party REST-based APIs are connected through Open Connectors. SAP backend systems such as SAP S/4HANA On-Prem or ECC/PI/PO can be connected through the Cloud Connector. SOAP APIs can also be made available through Cloud Integration. Ultimately, almost all APIs can be connected. The procedure for connecting a foreign API is wizard-controlled.

Create an API Provider Based on Your ES5 Demo System

Business Scenario

For the utilization of the GWSAMPLE_BASIC API of the ES5 demo system, we are creating an API provider, which encapsulates the original interface. The API provider components and accompanying artifacts are marked in red in the following diagram.

Task Flow

In this exercise, you will perform the following steps:

1. Create an API Provider based on the ES5 Demo System

2. Test the connection.

Prerequisites

You have a functioning API Management within the SAP Integration Suite.

Outcome after this exercise

A running API Provider based on the ES5 demo system.

What do you learn within this exercise

You get familiar with how to create and use an API provider of type
internet into the API Management.

Task 1: Create an API Provider Based on the ES5 Demo System

Steps

1. Log on to the API Management Portal.

1. Log on to your API Portal within your Integration Suite as already shown several times.

2. Navigate to Configure → APIs


2. Create an API provider.

Choose the left menu on Configure → APIs.

In the Configure tab, choose the API Providers tab to create an API
Provider.

Now, choose the Create button to set up an API Provider. It opens a new
user interface to set up your API Provider.

Use the following data:

Overview tab

Name: SAPGatewayDemoSystemES5_Provider

Connection tab

Type: Internet
Host: sapes5.sapdevcenter.com
Port: 443
Use SSL: flagged

Catalog Service Settings

Path Prefix: /sap/opu/odata
Service Collection URL: /IWFND/CATALOGSERVICE/ServiceCollection
Authentication type: Basic
Username: enter your ES5 user (P/S number)
Password: enter your ES5 password

1. On the Overview tab, in the Name field, enter the name from the preceding table.

Simply override the given entry.

2. Switch to the Connection tab.

Enter the following data (excerpt from the preceding table):

Type: Internet
Host: sapes5.sapdevcenter.com
Port: 443
Use SSL: flagged

3. Switch to the Catalog Service Settings tab.

4. Enter the following data (excerpt from the preceding table):

Path Prefix: /sap/opu/odata
Service Collection URL: /IWFND/CATALOGSERVICE/ServiceCollection
Authentication type: Basic
Username: enter your ES5 user (P/S number)
Password: enter your ES5 password

5. The Catalog URL is automatically created based on the data that you have entered.

6. Choose Save.

Task 2: Test the Connection

Steps

1. Test your API Provider.

1. When it is saved, the Test Connection button becomes active.

2. Choose Test Connection at the top right of the screen.

3. Check that you receive an HTTP status code: 200.


If you don't get an HTTP status code: 200, check the parameters that you
have entered.

4. Navigate back to Configure → APIs Overview.

5. Check that you can see your configured API provider.

Creating an API Proxy


Creation of APIs Based on the API Provider

In this lesson, the following topics are discussed:

 What are the possibilities to create an API?

 Create an API using the Create button.

 Create an API using menu links.

What are the Possibilities to Create an API?

The API, which will be created, is important for further implementations. It acts as a proxy of the actual resource API. The previous name was API Proxy. This name often appears in older documents.

The following options are available:

Create an API using the Create button with the following options:

 API provider (No. 1 at concept diagram)

 URL (No. 2 at concept diagram)

 API proxy

Create an API using menu links with the following options

 Create in API Designer (No. 3 at concept diagram)

 Import an external API


Create an API Using the Create Button

This is probably the most common case. With this option, you can create
an API with an API provider, a provided URL, or an existing API.

Procedure

 Start with Design → APIs to open the Develop screen.

 Start the wizard by choosing the Create button. A new window opens.

Use the API Provider option.

Select the API Provider radio button and open the select box. All API providers are displayed. Choose one, for example, SAPGatewayDemoSystemES5_Provider.

When the API Provider is chosen, a new list box with the name Discover is
available. Some data, such as the host and the type of API, have already
been entered.

When the list box is chosen, all available services listed within the catalog
service are displayed.
What exactly is displayed here depends on the type of API Provider. In the
case of Open Connectors, for example, all instances are displayed. For the
type Cloud Integration, the available integration flows are displayed.

The following figure shows a list of available services, which are usable
from the SAP backend system. The API provider gets defined by choosing
one service from the provided catalog services.

You can choose exactly one of the offered services. After that, further data is added to the form.

When you finish creating this API (proxy), it has to be deployed so that it
can be used. After that, the API (proxy) is ready for testing. The service
type is automatically defined. In this case, it is OData.

Use the URL option to create an API

In case you do not use an API provider, you can directly specify the URL of your source.

In this case, you must enter the data manually (marked). The Service
Type can only be REST or SOAP.

After saving and deploying the API, it can also be tested.

Use the API Proxy option to create an API

In this case, you can copy an existing API. You must enter the data manually (marked). The Service Type can only be REST or SOAP, even if the copied API is of type OData.

Create an API Using Menu Links

Use the Create in API Designer Option

Start creating an API by choosing the menu link, Create in API Designer.
Switch to the openAPI editor. You can manually create your API there
through the openAPI language in YAML. In this case, all entries must be
created manually. The server URL is automatically adjusted after saving.
The Service Type can only be REST.

Before Saving

After Saving

Note

Be aware that the shown URL is a sample and will not work.

Import an External API


Start the creation of an API by choosing the menu link, Import API.

The Service Type corresponds to the imported API.

Resource

Help Portal: Create an API Proxy

Summary

There are several ways to create an API. APIs can be created:

 By using the Create button.

 Based on an existing API provider.

 Directly through a provided URL.

Finally, you can also define an API with an openAPI specification via the Create in API Designer menu link.

Create an API Based on a Predefined API Provider

Business Scenario

The objective is to establish a connection between the API provider, indicated in green, and a new API that we are developing within API Management. The subsequent connection and associated artifacts that emerge from this process are marked in red within the component diagram.

Task flow

In this exercise, you will perform the following tasks:

1. Create an API proxy.

2. Test the API proxy.

Prerequisites

You have successfully completed the previous exercises.

Outcome after this exercise


You have a working API based on the API provider from the ES5 system.
This allows APIs to be called from the ES5 system.

What do you learn within this exercise

After this exercise, you can create and configure an API proxy based on an
API provider. This allows you to call the GWSAMPLE_BASIC OData interface
on the ES5 backend.

Task 1: Create an API Proxy

Steps

1. Create an API proxy based on the naming convention GWSAMPLE_BASIC_date_subaccountnumber.

1. Log in to your SAP Integration Suite.

2. Navigate to Configure → APIs → API Proxies and choose the Create button.

3. In the API Provider field, choose your previously created API provider named SAPGatewayDemoSystemES5_Provider.

4. To fetch the catalog data from the ES5 system, we must select a specific API. Select the Discover button, which is available after choosing the API provider.

5. After selecting the Discover button, a new pop-up window appears to display the available Business APIs on the ES5 system.

6. Search for GWSAMPLE_BASIC, choose the radio button at the start of the row, and then choose the OK button.

7. You return to the initial pop-up window, and most of the fields will be prefilled with data. Please note that the Host Alias depends on your SAP BTP trial account. In some screenshots, x00-cld900.apimanagement.eu12.han.ondemand.com is used. However, this does not affect the functionality.
2. Configure your predefined API proxy.

1. Add your date and subaccount number as a postfix at the end of the Name and Title fields. We are using the API Proxy name: GWSAMPLE_BASIC_23042024_pa. Feel free to use your own name.

2. Enter v1 into the Version field. The v1 is immediately appended as a suffix in the Name field and as a path postfix in the API Base Path field.

3. Add the v1 postfix manually at the end of the Title field.

4. Afterwards, choose the Create button. Subsequently, the wizard pop-up closes, and you are redirected back to the Create API page.
5. Choose the Deploy button to activate the API proxy. If
everything has been correctly set and the API proxy has been
successfully deployed, the API proxy URL can now access the
OData service GWSAMPLE_BASIC at ES5, as displayed.

6. Navigate back to the Configure menu, either by using the breadcrumb navigation at the top left or via the main menu on the left, by selecting Configure → APIs.

Task 2: Test the API Proxy

Steps

1. Test with the proxy resource.


1. Navigate to the detail page of your API proxy.

2. Open the detail page by clicking the row containing your API proxy. Then, switch to the Resources tab.

3. Search and find the entry for ProductSet.

4. Choose the ProductSet resource to unfold the user interface, then choose the GET/ProductSet method area.
5. Choose the Try it out button.

6. Scroll further down until you see the blue Execute bar.
7. Choose the Execute bar to send your request.
8. The request fails with an HTTP code 401 - Unauthorized, as we
have not enabled authorization for the call. We do this in a
later exercise by involving policies. The authorization set
during the creation of the API provider was solely for calling
the Catalog Service.

2. Test your API proxy with the Test functionality.

We have observed that the API proxy URL results in an authorization error
when the resource GET /ProductSet is invoked. This error is due to missing
authentication involving a user and password connected to the original
interface.

Currently, the SwaggerUI does not allow us to implement basic authentication, as there are no fields designed for a user and password, though one could modify the OpenAPI Specification to include these.

Regardless, a standard function does exist that allows us to conduct testing with Basic Authentication.

1. Navigate with the left side menu to Test → APIs.

2. Choose your API proxy from the left side, for example,
GWSAMPLE_BASIC_date_subaccountnumber.

3. In the address navigation menu, add /ProductSet to the end of the path. The GET method is automatically selected.

4. Choose the Authentication: None link above the address bar, then choose Basic Authentication.

5. Enter the user and password for the ES5 system. Afterwards,
choose the OK button.
6. Choose the Send button on the bottom right.

7. If you get an HTTP status code 200, everything works as expected.

If the call was successful, as in the screenshot shown, all the data records
hosted on the database are displayed as a feed in the response. If you
don't get an HTTP status code 200, check your authentication credentials.
Using Policies
Usage of Policies

In this lesson, the following topics are discussed:

 What are policies?

 Policy types.

 Apply pre-built policies using the Policy Designer.

 Use predefined policies.

What Are Policies?

SAP API Management provides capabilities to define the behavior of an API by using policies. A policy is a program that executes a specific function at runtime. Policies provide the flexibility to add common functionality to an API without having to code it individually each time.

Policies provide features to secure APIs, control the API traffic, and transform message formats. You can also customize the behavior of an API by adding scripts and attaching them to policies.

You can apply a policy on the request or response stream. You can also
specify if it is applicable on the proxy endpoint or target endpoint. For
information on the types of policies supported by API Management, see
Policy Types.
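For illustration, the following is a minimal sketch of what a single policy looks like in the policy editor. It shows a Spike Arrest policy that smooths traffic peaks; the rate value is illustrative, and the XML skeleton (attributes and namespace) follows the pattern of the policy snippets used later in this unit:

<!-- Limit the message rate to protect the backend; 30ps means 30 requests per second -->
<SpikeArrest async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
  <Rate>30ps</Rate>
</SpikeArrest>

Attached to the request stream of the proxy endpoint, such a policy rejects calls that exceed the configured rate before they reach the target system.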

You can use the following types of policies:

 Predefined policy templates at SAP Business Accelerator Hub.

 Pre-built policies within the Policy Editor.

Policy types

The following is the list of pre-built policies supported by API Management:

 Access Control

 Access Entity

 Assign Message

 Basic Authentication

 Extract variables

 Invalidate Cache

 JavaScript
 JSON to XML

 Key Value Map Operations

 Lookup Cache

 Message Logging Policy

 OAuth v2.0

 OAuth v2.0 GET

 OAuth v2.0 SET

 Populate Cache

 Python Script

 Quota

 Raise Fault

 Reset Quota

 Service Callout

 Spike Arrest

 SAML Assertion Policy

 SOAP Message Validation Policy

 Verify API Key

 XML to JSON

 XSL Transform

 XML Threat Protection

 Regular Expression Protection

 JSON Threat Protection

 Response Cache

 Statistics Collector Policy

Read more here: Policy Types

Apply Pre-built Policies Using the Policy Designer

To use one of the available policies, it is first necessary to consider where the policy should take effect. The policy editor offers corresponding options on the request and response flows.

Policies can also be applied to all calls (all flows and resources); in that case, you do not select a specific flow. In the following example, there are two flows, CatalogCollection and ServiceCollection. The policies apply to both because neither has been specifically selected.

Security - Policies

SAP BTP, API Management offers many out-of-the-box API security policies based on the Open Web Application Security Project (OWASP). API security best practices can be customized for your enterprise requirements.

There is a blog series that showcases the security policies from SAP BTP, API Management to secure and protect enterprise APIs, as shown in the following figure, SAP Cloud Platform API Management.
You will find the blog series here: SAP Cloud Platform API Management –
API Security Best Practices Blog Series

Logging and Monitoring Policies

The Message Logging policy lets you send syslog messages to third-party
log management services, such as Splunk, SumoLogic, Loggly, or similar
log management services.

A blog with the Message Logging Policy and Splunk can be found
here: Splunk – Part 1 : SAP APIM Logging & Monitoring | SAP Blogs

A blog with the Message Logging Policy and Loggly can be found
here: Part 7 – API Security Best Practices – Log all API interactions | SAP
Blogs

Use Predefined Policies

There are predefined sets of policies for specific applications. They can be
found in the SAP Business Accelerator Hub.

Navigate to https://api.sap.com/ and choose Explore → APIs.

Under the Policy Templates tab of the SAP Business Accelerator Hub, you will find 20 policy templates for immediate use.

Import a Policy Template from SAP Business Accelerator Hub

Search and find the Performance_Traceability policy template at SAP Business Accelerator Hub. Choose the Performance_Traceability tile. You will find the content at the Flow Type.

The following is an example with these two items:

 Flow Type: ProxyEndPoint PreFlow

 Content: JavaScript file


To download the complete policies, choose the Download button in the
upper right corner and save the *.zip file locally to your computer.

Switch to the Develop view and choose the Policy Templates tab.

Then, import the previously locally stored policy template through the Import button.

At the end, the Performance_Traceability template is imported and available among your policy templates.
To place the policy template, navigate to the API in which you want to use
the policy, and navigate to the Policy Editor. Choose Edit so that the Policy
Template button becomes active.

Now, choose the Apply button to import the policy template. Then select
the previously imported policy template and choose Apply.

The policy template has been imported and inserted into the corresponding flow. After updating, saving, and redeploying, the policy template is active.

Summary

SAP API Management provides capabilities to define the behavior of an API by using policies. These capabilities can be used in both the request and the response. There are policies for transforming the payload, for calls to external services, for example, to log in using OAuth 2.0, and much more. In particular, the security policies are useful. SAP offers predefined policies and policy templates for certain use cases. They can be easily imported.

Add Policies for Basic Authentication Against the ES5 Demo System

Business Scenario

To use the interfaces in API Management, authentication against the source interface is necessary, which is accomplished through a policy implementation. The creation of connections and artifacts is indicated with red markings in the following component diagram.

Task flow

In this exercise, you will perform the following tasks:

1. Add the Message Policy.

2. Add the Basic Authentication Policy.

3. Test your policies.

4. Monitor your API calls.

Task 1: Add the Message Policy

Steps

1. Add the Message Policy.

1. Navigate to Configure → APIs and choose the API Proxies tab.


2. Open your API view by choosing the link GWSAMPLE_BASIC_v1_date_subaccountnumber.

3. Choose the Policies button.

4. Choose the Edit button.

5. You can see the grey plus symbols on the right side.
6. Choose the following: Flows → TargetEndpoint → PostFlow. The
plus signs are now black and usable.

7. Find the Assign Message Policy on the left side menu.

Note

To implement the policies in your API proxy, you must have a working
concept on how the policies work.
8. Choose the plus sign at the Assign Message policy symbol and add the following data:

Policy Type: Assign Message
Policy Name: setCredentials
Endpoint Type: TargetEndpoint
Flow Type: Postflow
Stream: Incoming Request

9. Choose the Add button in the pop-up window.

10. In the XML editor, enter the following code via copy and paste:

Note

Be sure to substitute the Username and Password with your own.

Code Snippet

<!-- This policy can be used to create or modify the standard HTTP request and response messages -->
<AssignMessage async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">

  <!-- Sets a new value to the existing parameter -->
  <Set>
    <Payload contentType="application/json" variablePrefix="@" variableSuffix="#">{"name":"foo", "type":"@apiproxy.name#"}</Payload>
  </Set>

  <AssignVariable>
    <Name>request.header.username</Name>
    <Value>Your username from your GWSAMPLE_BASIC backend system</Value>
  </AssignVariable>

  <AssignVariable>
    <Name>request.header.password</Name>
    <Value>Your password from your GWSAMPLE_BASIC backend system</Value>
  </AssignVariable>

  <IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
  <AssignTo createNew="false" type="request">request</AssignTo>
</AssignMessage>


Note

You can also download the code snippets via Github for this learning
journey:

integration-suite-learning-journey/src/rev_20 at main ·
SAP-samples/integration-suite-learning-journey · GitHub

11. Enter your username and password in plain text format. It is also possible to set both values encrypted from a keystore.

Now, you have defined two variables, request.header.username and request.header.password.

12. Before you update and save this entry, be aware that a second policy must be set that uses these variables for basic authentication.

Task 2: Add the Basic Authentication Policy

Steps

In this step, the previously defined variables are set as authorization parameters in the HTTP request header.

1. Add the Basic Authentication Policy.

1. On the right side, choose the Basic Authentication policy and choose the plus button.

2. Add/check the following data:

Policy Type: Basic Authentication
Policy Name: setBasicAuthentication
Endpoint Type: TargetEndpoint
Flow Type: Postflow
Stream: Incoming Request

3. Choose the Add button within the pop-up window.

4. Check the entries and choose the Update button (top right of the screen). Switch back to the detail view of your API Proxy and choose the Save button.

5. After saving, a navigation bar appears at the top of the API Proxy details window with a request to deploy the API Proxy.

6. Choose the Click to Deploy link to confirm the deployment via the detail pop-up window.
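For reference, the generated policy XML looks roughly like the following sketch (the exact XML is produced by the policy editor; the element values here mirror the variables defined in the setCredentials policy):

<!-- Encode the two variables into a Basic Authentication header -->
<BasicAuthentication async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
  <Operation>Encode</Operation>
  <IgnoreUnresolvedVariables>false</IgnoreUnresolvedVariables>
  <User ref="request.header.username"/>
  <Password ref="request.header.password"/>
  <!-- Write the encoded value into the Authorization request header -->
  <AssignTo createNew="false">request.header.Authorization</AssignTo>
</BasicAuthentication>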
Task 3: Test Your Policies

Steps

After setting up automatic authentication, you can now test your configured policies via the resources. You should receive status code 200.

1. Test your policies via resources.

1. Choose the Resources tab. Choose the first entry GET/BusinessPartnerSet, select the Try out button, and then choose the Execute button.

2. You receive HTTP status code 200 and the corresponding response body.

3. Check out the result.


Note

If you don't get HTTP status code 200, check your username and password in the policy. Be sure that your backend system account is not blocked by too many failed logons.

Task 4: Monitor Your API Calls

Steps

We use the API monitor to examine the metrics of the API calls made so far. A dedicated monitoring app is available.

1. Analyze your API calls.

1. Navigate to Monitor → APIs.


Note

Your monitor can look different.

Explore the API, Policies, and More at SAP Business Accelerator Hub

Business Scenario

Exploring the SAP Business Accelerator Hub to identify available APIs, policies, and other artifacts enables you to expedite your integrations, extensions, and innovations.

Task flow

In this exercise, you will perform the following tasks:

1. Log on to the SAP Business Accelerator Hub (https://api.sap.com).

2. Explore the policies.

3. Explore the APIs at the SAP Business Accelerator Hub.

Prerequisites

You have successfully completed the previous exercise.

Outcome after this exercise

Gain a comprehensive understanding of the SAP Business Accelerator Hub and its extensive collection of available APIs for your utilization.
What do you learn within this exercise

By exploring the SAP Business Accelerator Hub, you can discover and
analyze API Management Policies, allowing you to determine their
suitability for various purposes based on your needs and objectives.

Task 1: Log on to SAP Business Accelerator Hub

Steps

1. Log on to the SAP Business Accelerator Hub.

1. Open the link: https://api.sap.com

2. Log in through the Login button to try out the APIs.

3. Navigate to Explore → APIs.

Task 2: Explore the Policies


Steps

1. Explore the policies.

1. Choose the Policy Template tab.

2. Now, you can check out all the available policies, which you
can use in your SAP API Management.

3. Find the Performance_Traceability policy and open it.


4. Check out the documentation for more information.

5. Review the configuration of the policies, especially the proxy_request_retriving_latency.

Task 3: Explore the APIs at SAP Business Accelerator Hub

Steps

1. Explore the APIs.

1. Navigate back to https://api.sap.com and choose Explore → APIs.

2. Choose ODATA V2.

3. In the search bar, enter Purchase Order and choose Enter.

You see a tile with the name: Purchase Order.

4. Choose the Purchase Order tile. You will find a lot of information there.
Note

The Purchase Order ODATA V2 API is marked as deprecated, but don't worry, it works, so try it out.

5. Choose the Try Out button.

6. On the left side, find Purchase Order → GET /A_PurchaseOrder and choose it. On top of the page, you see the chosen context, /A_PurchaseOrder.

7. Choose the Run button.

You get an HTTP Status Code 200 and a filled Response Body and
Response Header.
Editing APIs
Edit an API

In this lesson, the following topics are discussed:

 Explore the API view.

 Explore the Notification Area (No. 3).

 Explore the API URL - Proxy URL (No. 1).

 Explore the Navigation tabs (No. 2).

 Create or edit an API from API Designer.

Explore the API View

When you create and deploy an API, it is displayed in the API view. The
following is the example of the GWSAMPLE_BASIC API.

The following areas are marked to be examined in more detail in the following list:

 No. 1: API URL - Proxy URL

 No. 2: Navigation Tabs

 No. 3: Notification area

Explore the Notification Area (No. 3)

On the right panel, you find the API Health, active calls made, and related
usage information about the API.

Since we have not yet used this API, there is no usage information
available.
API URL - Proxy URL (No. 1)

At No. 1, you can see the new URL (proxy URL) with which you can now
call the original source API. The URL consists of the following elements:

 API URL: https://group-cld900-d052537.prod01.apimanagement.eu10.hana.ondemand.com:443/GWSAMPLE_BASIC

 Application protocol: https

 Virtual Host: group-cld900-d052537

 API Host: prod01.apimanagement.eu10.hana.ondemand.com

 API Port: 443

 API Name: GWSAMPLE_BASIC

Virtual Host

The virtual host was created during the provisioning of API management
and can be changed at any time using Settings → APIs. Check and see
your Host Alias name.
API Host

Depends on your subaccount. It can also be your own custom domain name.

Navigation Tabs (No. 2)

There are five tabs with the following names:

 Overview

 Proxy Endpoint

 Target EndPoint

 Resources

 Revisions
Tab 1: Overview

In this Overview tab, you will find all major information about your API.

These are as follows:

 Title

 Host Alias, that is the host from your Proxy URL on top of this page

 API Base Path

 API State (Active, Alpha, Beta, Deprecated, Decommissioned)

 Description
On the bottom of the interface, there is a Product Associated area. Later,
we create a product based on our API. Every entry can be changed.

Tab 2: Proxy EndPoint

Here, you can add some Proxy Endpoint Properties and Route Rules. Read
more here: API Proxy Structure

Tab 3: Target EndPoint

Here, you find the configured API Provider or the URL. In this case, we see
the SAPGatewayDemoSystemES5_Provider. It is also possible to use Load
Balancing.

Tab 4: Resources

This is the most important area of an API. Using a Swagger UI, it shows all possible resource paths and REST actions (GET, PUT, DELETE, and so on) with all necessary parameters.

The following figure gives us the example of a resource path, /ProductSet, and the REST action GET with predefined query parameters.
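For illustration, a call to this resource through the proxy URL could look as follows; the host is a placeholder, and $top and $format are standard OData query parameters:

GET https://<your-virtual-host>.prod01.apimanagement.eu10.hana.ondemand.com/GWSAMPLE_BASIC_v1/ProductSet?$top=2&$format=json

This returns the first two products as JSON.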

Where do these entries in this tab come from?

For SOAP and REST APIs, resources are not auto-generated; you must add them manually. For OData APIs, resources can be auto-generated in some cases, for example, if you use an API provider of type On-Premise with an SAP backend, or one of type Open Connectors. The visualization of the resources is carried out by the Swagger UI implementation, which interprets the openAPI spec of this API.

Tab 5: Revisions

With API revisions, you can make incremental changes to an API proxy
without disrupting the deployed API. You can access previous changes
made to the API proxy and even restore the API to any of its earlier states.

Revisions typically consist of small, incremental, and compatible changes, such as adding a property, a new resource, or a policy to an API proxy. Revisions are created when changes do not disrupt existing consumption flows. They are independent of the actual URL used for consuming the API. Because the deployed revision is the one being consumed, there is no need to access it separately. The API proxy URL remains consistent across different revisions.
Only one revision of an API proxy exists in the runtime environment. In the
design phase, you can view and compare the contents of different
revisions.

For more information on creating API revisions, visit the help.sap.com website: Creating API Revisions | SAP Help Portal

Create or Edit an API from the API Designer

Since it is not always possible to have the resources generated automatically, SAP API Management also offers you the ability to do it manually. The resources visualize the openAPI spec created in the API Designer.

Procedure for creating an API

In the Configure view of your APIs, choose the link, Create in API Designer,
to open the API Designer. You will find a simple start template in YAML.

You can now start to write your own openAPI spec. To edit an openAPI spec, you use the same editor. You can also use other editors, such as IDEs or Visual Studio Code, and copy the result into the API Designer.
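As a minimal, illustrative example (the title, path, and server URL are placeholders, not the generated spec of the exercise API), an openAPI spec in the API Designer could start like this:

openapi: 3.0.0
info:
  title: Sample Product API
  version: "1.0"
servers:
  - url: https://sandbox.example.com/v1
paths:
  /ProductSet:
    get:
      summary: Read the list of products
      responses:
        "200":
          description: A list of products

The paths block is what the Swagger UI later renders as resources in the Resources tab.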

Resources

You find the whole openAPI documentation here:

 OpenAPI Specification

 Create an API from API Designer

 https://swagger.io/docs/specification/about/
 Help Portal: Edit an API Proxy

Summary

The proxy URL is the new URL to ultimately consume the resource API. The
virtual host name is defined by you. It is used as an API host (API proxy
URL) in the subaccount. There can also be a custom domain here. SAP API
Management offers different tabs with different functionalities in
the View API. The Resources tab is the most important. The resources
describe the REST functionalities (GET, POST, and so on) and the paths to
the actual data (/ProductSet, /BusinessPartnerSet...). The description is
based on the openAPI specification. The visualization of the openAPI
specification is carried out with the Swagger UI. The Swagger UI is an
open-source JavaScript framework to make APIs tangible.

Explore the API Designer

Business Scenario

Our objective is to apply the OpenAPI format, a vendor-neutral description standard, to define new APIs, or modify existing ones. The following component diagram illustrates the connections and artifacts that are generated, as indicated by the grey shading.

Task flow

In this exercise, you will perform the following tasks:


1. Explore the API Designer - Design-First Approach.

2. Explore the openAPI spec of your API Proxy.

Prerequisites

You have successfully finished the previous exercise.

Outcome after this exercise

Familiarize yourself with the OpenAPI specification and its implementation using the API Designer tool.

What do you learn within this exercise

You will learn how to use the API Designer to work with your APIs in SAP API Management.

Steps

1. Explore the API Designer, a design-first approach.

1. Navigate to Configure → APIs.

2. Choose your API Proxy.

3. Choose Edit → Edit in API Designer to open the API Designer.

4. Check out the API Designer.

 No. 1: Import openAPI spec in YAML or JSON

 No. 2: Download openAPI spec in YAML or JSON

 No. 3: Paste different formats (RAML, Data, JSON)

 No. 4: Generate Server Stub generates a server implementation in different languages and frameworks (Node.js, Java - Spring, Java EE)

5. Choose every menu entry to check out how they work.

2. Explore the API specifications.

1. The editor's area describes the definition of the API in YAML (YAML Ain't Markup Language) format. More information can be found at: https://yaml.org/

2. OpenAPI is the standard for describing REST-based interfaces. More information can be found here: https://www.openapis.org/. Here, the OpenAPI Specification 3.0.0 (1) is used.
3. There are several blocks within this API description. One is
the API Server (2). This is the original source API of the ES5
system.

4. Path blocks describe the resource context. Every resource context is visualized with the Swagger UI (3). You can find more here: https://swagger.io/tools/swagger-ui/

Creating a Product
Product Creation

In this lesson, the following topics are discussed:

 What are Products in the context of SAP API Management?

 Create and publish a Product

 Show Products at API Business Hub Enterprise Portal

 Navigate to your Product

 Navigate to your API

 API Business Hub Enterprise portal in a glance

What Are Products in the Context of SAP API Management?

Products are artifacts that appear on the SAP API Business Hub
Enterprise portal. The SAP API Business Hub Enterprise portal is accessed
using its own URL. It is accessible in the SAP Integration Suite
cockpit through the navigation in the upper right corner of the interface.

After opening the API Business Hub Enterprise portal, the products are
displayed as tiles. The API used under a product corresponds to the API
Proxy URL of the corresponding API.

Needed Roles (Role Collection) to Use the API Business Hub Enterprise

To open the API Business Hub Enterprise portal, one of the following role
collections is required:

 AuthGroup.API.Admin
 AuthGroup.API.ApplicationDeveloper

We have already assigned both role collections to the user when provisioning the SAP Integration Suite capabilities.

If you are coming via learning.sap.com, then you have to assign your user
account to the mentioned role collections.

Open an API Business Hub Enterprise portal with published products as a sample

In this screenshot, you can see an already created product named Product based on the GWSAMPLE_BASIC_v1 API.

This page is empty, if you have not yet created a product.

The products can then be searched for, found, and consumed by developers.

Procedure for creating a Product

Note

You perform this step in the SAP Integration Suite cockpit.

The following steps must be carried out in the following order:

 Create a Product using the Create button

 Add the entries under the Overview tab

 Add an API under the APIs tab

 Choose the Publish button

 Open the API Business Hub Enterprise portal

 Check out your Product

Create and Publish a Product

A prerequisite for creating a product is a working API. The creation is started using Engage. Then, navigate to the Products tab.

Choose the Create button to start the procedure.

At least the following entries must be made:

Tab: Overview

The Name and Title should be the same. The Title is the heading of the
tile. The description is also displayed on the tiles and is intended to give
the user the most important information about the API.

Example

Name: P_GWSAMPLE_BASIC_v1

Title: P_GWSAMPLE_BASIC_v1

Description: An API based on the Enterprise Procurement Model (EPM). Authentication is done using policies. No additional authentication is required.

The other entries such as Quota, Requests Every, and Scope are optional
and must be defined by policies.

A sample setting for that is found here: Create a Product

Tab: APIs

Here, you can choose your previously created API proxy, which you can
add to your API product. When you select the Add button, all available
APIs are displayed. You can assign any combination of the displayed APIs.
It is also possible to combine individual resources.

In the following case, the entire GWSAMPLE_BASIC_v1 API (all resources) is added.
Entries under the tabs Permissions, Rate plans, and Custom Attributes are
optional.

A sample setting of custom attributes is described here: Custom Attributes

Publish Your Product

After the product has been configured, it must be published on the API Business Hub Enterprise portal. This step is called publishing.

Show Products at API Business Hub Enterprise Portal

The API Business Hub Enterprise portal is its own application. It was provisioned together with SAP API Management.

The API Business Hub Enterprise portal is connected to API Management using its own URL. This can be checked under Settings → APIs if problems occur.
After opening the API Business Hub Enterprise portal, as previously
described, all published products display as tiles.

Navigate to Your Product

At the moment, you still have the possibility to choose between two
representations. We use the new design. Navigate into the tile, you are
routed to the Test Environment tab.

Explanations:
 No. 1: Here is your description of the product

 No. 2: Here is the metadata of the product

 No. 3: The design decision

 No. 4: The logged-in user

To test the API, navigate to the APIs tab. Here, you can now see the title of
the assigned API. In this case, it is GWSAMPLE_BASIC. The name of this API
is GWSAMPLE_BASIC_v1. The product name is P_GWSAMPLE_BASIC_v1.

Navigate to Your API

When you select the tile with the title of the associated API, you are in the
API. Under the tab, API Reference, you will find the Swagger UI for calling
the assigned resources.
If you successfully test a selected resource, here GET/ProductSet, you will
see the well-known Proxy URL from SAP API Management as a Request
URL.

API Business Hub Enterprise Portal at a Glance

The API Business Hub Enterprise portal offers a platform to deploy products with your associated APIs, centrally in your enterprise. This is the only place where developers search, find, and test APIs, and ultimately consume the corresponding proxy URL in their own processes. For this purpose, the API Business Hub Enterprise portal offers the following:

 Its own user management

 Role-based access to the products

 An additional authentication layer with its own API key

 Self-registration for unknown developers
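The additional authentication layer mentioned above is typically enforced with a Verify API Key policy on the API proxy. As a minimal sketch (the query parameter name apikey is an assumption; it must match what your applications actually send):

<!-- Reject calls that do not carry a valid application key -->
<VerifyAPIKey async="true" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
  <APIKey ref="request.queryparam.apikey"/>
</VerifyAPIKey>

Application developers obtain the key by creating an application in the API Business Hub Enterprise and subscribing it to the product.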

Resources

Resources are available at SAP Help: SAP Help Portal

Resources are also available at Blogs: Protect Your API Proxy by Adding
Application Key Verification | Tutorials for SAP Developers

Summary

A product in the context of SAP API Management is its own artifact that
encapsulates created APIs or parts of them (resources). The product is
configured and deployed (published) on the API Business Hub Enterprise
portal. The product can be tested on the API Business Hub Enterprise
portal. Access to the products is restricted through roles.

Create a Product Based on Your Created API

Business Scenario

The API Business Hub Enterprise is a robust platform designed to centralize and streamline the management of APIs for your deployed products. This business case helps you understand how to navigate the API catalog, use the available APIs effectively, and employ the built-in monitoring tools to gain valuable insights into API performance and usage when using APIs in your enterprise.

Task flow

In this exercise, you will perform the following tasks:

1. Create and publish a Product to an API.

2. Test your deployed API in the API Business Hub Enterprise.

Prerequisites

You have successfully completed the previous exercise.

Outcome of this exercise

By the end of this learning, you will be equipped with the knowledge and
skills to maximize the potential of the API Business Hub Enterprise,
enhancing your ability to streamline integrations and drive innovation in
your organization.
What do you learn within this exercise

You can create and publish a Product based on your API and consume it through the API Business Hub Enterprise portal.

Task 1: Create and Publish a Product

Steps

1. Create a Product based on your API Proxy.

1. Navigate to Engage and choose Create.

2. Start the creation process with the Overview tab.

Name: API_GWSAMPLE_BASIC_XXX
Title: API_GWSAMPLE_BASIC_XXX
Description: API based on your API Proxy. The authorization against the source interface is realized via policies. No extra credentials needed.

3. Choose the APIs tab and the Add button, choose your API Proxy, and then choose the OK button.

The Permission, Rate plans, and Custom Attribute tabs are not necessary for this exercise and can be skipped.

1. Permission: Whenever you create or edit a draft product, you can add permissions to the product. Use this procedure to grant user roles the necessary permissions for discovering and subscribing to the product in the API Business Hub Enterprise. Only users assigned the required role are able to discover and subscribe to the product.

2. Rate plans: API Management enables users to create rate plans and attach them to products. With a rate plan, you can charge application developers for using your APIs.

3. Custom Attribute: Custom attributes can be used to influence the runtime behavior of API proxy execution. These attributes can be set at the product level or at the application level (when an application is created by an admin on behalf of a developer). They offer the flexibility to extend functionality based on attribute values that can be set or read during the API proxy execution flow. These attributes can be accessed during an API call through the following policies: Verify API Key, Access Token, and Access Entity.

4. Choose the Publish button on the top-right side.

5. Choose the top-right navigation breadcrumb Explore our Ecosystem to log on to the API Business Hub Enterprise.

6. Within the API Business Hub Enterprise, choose the created API_GWSAMPLE_BASIC_XXX Product.

7. Choose the APIs menu tab.

8. Then, choose the GWSAMPLE_BASIC_XXX Product tile.

9. If everything works correctly, you will see the entry with the API proxy URL and the status of the published Product.

10. Now, choose the API Reference menu tab.

11. Navigate to My Workspace using the top-left navigation menu.

Task 2: Test your deployed API in the API Business Hub Enterprise

Steps

1. Test your Product.

1. Choose the resource GET/ProductSet.

2. Choose Try out.

3. Afterwards, choose Execute.

4. At (1), you can see the API proxy URL that you know from API Management.

5. At (2), you can see the response that comes from the ES5 system.
Using Logging and Monitoring
Logging and Monitoring

In this lesson, the following topics are discussed:

 Health Monitoring with SAP Cloud ALM

 Analyze API usage and performance with the built-in Advanced API Analytics

 Logging with Message Logging Policy

 Inspection

Health Monitoring with SAP Cloud ALM

In the Health Monitoring application, you can check the health of your
monitored cloud service and technical systems from an application and
customer perspective. Technical metrics are collected regularly and can be
used to calculate the overall health of the monitored object. The
monitored metrics are defined by the service itself and can differ for each
service type.

At the moment, only health monitoring with the SAP Cloud ALM solution is
possible.

Analyze API usage and performance with the built-in Advanced API Analytics

Advanced API Analytics brings to you an all-new analytics dashboard, providing powerful tools and in-depth reports for analyzing your API usage and performance. The reports are categorized across report pages, with each report page providing information about key API metrics, which are relevant for both business users and API developers.

Navigate to Monitor → APIs. The dashboard opens.

There are many views and setting options to visualize relevant information.

Logging with Message Logging Policy

If you want to use a logging solution, the SAP API Management lets you
send syslog messages to the third-party log management service. If you
want to send syslog to a third-party service, follow the service
documentation.

Third-party log management services are as follows:

 Splunk

 Sumo Logic

 Loggly

 Others

A syslog message contains the following elements and attributes of the request and/or response, depending on the position in the flow:

 Message (payload)

 Host

 Port

 Protocol
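As a minimal sketch of such a policy (host, port, and the message format are placeholders that depend on your log management service):

<!-- Send one syslog line per call to an external log management service -->
<MessageLogging async="false" continueOnError="false" enabled="true" xmlns="http://www.sap.com/apimgmt">
  <Syslog>
    <Message>[{apiproxy.name}] status={message.status.code}</Message>
    <Host>logs.example.com</Host>
    <Port>6514</Port>
    <Protocol>TCP</Protocol>
  </Syslog>
</MessageLogging>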

As a result, the following screenshot shows the use of Splunk:


Resources

Health Monitoring with SAP Cloud ALM

 Read more here: Health Monitoring.

 Read more here: Supported Solutions.

Analyze API usage and performance with the built-in Advanced API Analytics

Read more here: Analyze APIs.

Logging with Message Logging Policy

 Read more in a blog to use Splunk and the Message Logging Policy: Splunk – Part 1: SAP APIM Logging & Monitoring | SAP Blogs.

 Read more in a blog to use Loggly and Message Logging Policy: Part
7 – API Security Best Practices – Log all API interactions | SAP Blogs.

 Read more about the Message Logging Policy: Message Logging Policy.

Inspection
 Read more in a blog to use: Inspecting and Understanding Resource Consumption ... - SAP Community

 Read more at help.sap.com: Inspect Resource Consumption for Individual Integration Flow | SAP Help Portal

Summary

Metrics, usage, and performance of individual API calls can be examined on the one hand with the built-in Advanced API Analytics and on the other hand with the SAP Cloud ALM product. Logging of the communication parameters and payload is done with the Message Logging Policy. It compiles the corresponding data and uses an external solution, such as Loggly or others, to visualize it.
Managing Cloud Integration

Introducing Cloud Integration


Introduction to Cloud Integration

In this lesson, the following topics are discussed:

 What is Cloud Integration?

 Key Features

 Explore predefined Integration Content in SAP Business Accelerator Hub

What is SAP Cloud Integration?

The following statements try to answer the question:

 It is one of the core capabilities of the SAP Integration Suite

 It is based on the open source framework Camel from the Apache Software Foundation

 SAP Cloud Integration supports end-to-end process integration through the exchange of messages

 The development, deployment, and monitoring takes place in the browser with graphical tools

 It is one of the Low Code/No Code tools

Key Features
An integration flow has zero or one sender adapter. The message is delivered via an endpoint if an adapter is configured. Various sender adapters are available on the sender side (No. 1). After receipt of the message, the process is started via a start event (Start). This is followed by predefined processing steps (No. 2). There is a wide range of integration capabilities that define different ways that messages can be processed on the integration platform. Ultimately, receiver adapters can be configured to complete the business process. Message processing can be carried out synchronously or asynchronously. With this concept, many well-known enterprise integration patterns can be mapped.

Connectivity

The sender and receiver adapters are different. You are able to build your
own adapter. To do this, you can use the provided Software Development
Kit.

To determine which adapters are available depending on your license, you can display the adapters after creating an empty project template, as described in the following exercise. Perform the following steps:
1. Start with an editable empty project template:

2. Draw a line from the channel to the Start event. The available
adapters are displayed:
3. Proceed with the same procedure on the recipient's side:

Integration Capabilities

All integration capabilities are categorized. Among them are the predefined processing steps. An integration flow combines the individual capabilities to, for example, map a technical process. There are almost no limits to the possibilities of combination. To examine all available integration capabilities with the assigned individual steps, you can again start with the empty process template. You will find the tool palette on top of your screen. Every icon describes a functionality in the cloud integration user interface.

In the following case, this becomes visible at the transformation capability.

Predefined Integration Content


As previously indicated, the art lies in combining the integration capabilities in such a way that they map a business process. The integration flows can be complex. SAP offers over 400 predefined integration flows that can be consumed.

There are two ways to investigate the predefined integration content. A path directly via the Integration Suite is shown at the end of the course. The other variant is described in the following paragraph.

Explore Predefined Integration Content in SAP Business Accelerator Hub

All available predefined integration content is listed in the SAP Business Accelerator Hub, depending on the products to be integrated.

Procedure

The following steps must be carried out in order:

 Open the SAP Business Accelerator Hub at: API.SAP.com

 Navigate to the Discover Integrations tab

 Choose the first product

 Choose the second product which will be integrated with the first
one

 Find all available predefined integration content as an integration


package, based on your selection

 Navigate deeper in an integration package and find all available


integration flows

 Navigate deeper in an integration flow to find out the complete


configuration

Further details about the steps:


1. Discover the Integrations tab:

2. Choose the first product, for example, SAP S/4HANA:


3. Choose the second product, for example, SAP SuccessFactors:

4. Navigate to an integration package, for example, SAP


SuccessFactors Employee Central Integration with SAP ERP or SAP
S/4HANA, Employee Data:
5. There is only one integration flow available. Navigate to this
integration flow:

Here, you will find all the information to understand this integration flow:

 The configurations of all steps

 The business documentations

 And more

To consume this integration package or integration flow, you have to use


the Discover menu within the Integration Suite. This is demonstrated with an example at the end of the exercises.

Sources

Read more:

 About key features:

o A complete overview of the Enterprise Integration Patterns


can be found
here: https://www.enterpriseintegrationpatterns.com/

o Home - Apache Camel

 About connectivity:

o A complete overview of the currently available adapters can


be found here: Connectivity (Adapters)

o More information can be found here: Developing Custom


Adapters
 About integration capabilities: The complete overview can be found
here: Integration Capabilities

Summary

Individual integration flows are compiled via predefined functional steps.


They are divided into categories such as mapping, routing, and others,
and provided as a palette. The process is started via exactly one incoming
message. The contents of this message can then be manipulated in
various ways in the process itself. The connectivity and flexibility come
from many sender and receiver adapters. In addition to creating the
individual integration flow, SAP offers over 400 predefined integration
flows, as they are often needed in the SAP environment.

Describe the iFlow Process Created in the Next Exercises

Business Scenario

An external consultant has presented an integration concept to your


department manager, who wishes for you to implement it to identify a list
of affected customers to be immediately informed. You will have access to
the SAP system via an application interface to process the data.
Management expects a prompt solution. The basic concept is presented in
the next steps.

The consultant's integration concept is divided into seven key steps, which
will be presented to you.

Task Flow

In this exercise, you will perform the following steps in one task:

1. Send a List with ProductIDs as a SOAP message to the integration


flow

2. Iterate over the Product List

3. Check each ProductID to see if it is available in the database


4. Decide whether the ProductID exists in the database or not

5. Get the SalesOrderLineItems (SalesOrderID, ItemPosition) as a feed


for each ProductID

6. Select the entry SalesHeader (CustomerID) for every


SalesOrderLineItems (SalesOrderID, ItemPosition)

7. In every SalesHeader, select the Customer Name and write it to a


datastore

What do you learn within this exercise?

We discuss the integration process theoretically, highlighting all technical


points.

Task 1: Prepare the Next Exercises

Steps

1. Send a list with ProductIDs as a SOAP message to the integration


flow.

1. Send an XML message list via HTTPS and SOAP message to


the deployed integration flow. The Message Exchange Pattern
is One-Way (asynchronous, fire-and-forget).

2. The XML message contains the ProductIDs, for example HT-


1000.

Code Snippet


<soapenv:Envelope
xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
<soapenv:Header/>

<soapenv:Body>

<List>

<Product>

<ProductID>HT-2001</ProductID>

</Product>

<Product>

<ProductID>HT-1020</ProductID>

</Product>

<Product>

<ProductID>HT-1035</ProductID>

</Product>

<Product>

<ProductID>HT-6101</ProductID>

</Product>

<Product>

<ProductID>HT-7000</ProductID>

</Product>

<Product>

<ProductID>HT-2026</ProductID>

</Product>

<Product>

<ProductID>HT-6100</ProductID>

</Product>

<Product>

<ProductID>HT-9994</ProductID>

</Product>

<Product>

<ProductID>HT-1254</ProductID>
</Product>

<Product>

<ProductID>HT-1031</ProductID>

</Product>

<Product>

<ProductID>HT-1036</ProductID>

</Product>

</List>

</soapenv:Body>

</soapenv:Envelope>

2. Iterate over the Product List.

1. Each ProductID from the list is processed individually, one by


one.

2. For example, if the list has five ProductIDs, the following


processing steps are executed five times (loop five times).

3. Check each ProductID to see if it is available in the database.

1. Call the following URL:

Code Snippet


https://< your host >/v1/GWSAMPLE_BASIC/ProductSet('< ProductID >')


for example:
https://group00-xxxxxxxx-
yyyyyy.prod.apimanagement.eu10.hana.ondemand.com/v1/
GWSAMPLE_BASIC/ProductSet('HT-1000')

2. The ProductID comes from the product list.

3. If a ProductID, for example, HT-1000 exists in the database,


the following response is returned.

Code Snippet


<ProductSet>

<Product>

<Category>Notebooks</Category>

<ProductID>HT-1000</ProductID>

<Name>Notebook Basic 15</Name>

</Product>

</ProductSet>

4. Decide whether the ProductID exists in the database or not.

1. If the response is empty, the current loop is terminated and the next one is started.

2. If the ProductID is available, the response is processed further. A sketch of a possible Router condition for this check follows below.
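In Cloud Integration, this kind of decision is typically modeled with a Router step. Assuming the response structure shown above, one possible XML route condition is the following; this is an illustration, not the exercise solution:

Code Snippet

count(/ProductSet/Product) > 0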

5. Get the SalesOrderLineItems (SalesOrderID, ItemPosition) as a feed


for each ProductID.

1. If the ProductID is available, the response is processed further.

2. Check that the feed for each ProductID displays when calling the following URL:

Code Snippet


https://< host >/v1/GWSAMPLE_BASIC/ProductSet('< ProductID >')/ToSalesOrderLineItems?$select=SalesOrderID,ItemPosition,DeliveryDate
for example:

https://group00-cld900-
d052537.prod.apimanagement.eu10.hana.ondemand.com/v1/
GWSAMPLE_BASIC/ProductSet('HT-1035')/ToSalesOrderLineItems?
$select=SalesOrderID,ItemPosition,DeliveryDate

3. Check that the feed is provided as a response. Below is an


example with one entry:

There may well be several thousand entries.

4. Note that the important properties are SalesOrderID and ItemPosition.

5. Determine the number of entries in advance with the following URL:

Code Snippet


https://< your host >/v1/GWSAMPLE_BASIC/ProductSet('< ProductID >')/ToSalesOrderLineItems/$count

for example:

https://group00-cld900-
d052537.prod.apimanagement.eu10.hana.ondemand.com/v1/
GWSAMPLE_BASIC/ProductSet('HT-1035')/ToSalesOrderLineItems/$count

6. Enter the SalesHeader (CustomerID) for every SalesOrderLineItems


(SalesOrderID, ItemPosition).
1. Call the following URL.
The SalesOrderID and ItemPosition came from the last API call.

Code Snippet


https://< host >/v1/GWSAMPLE_BASIC/SalesOrderLineItemSet(SalesOrderID='< SalesOrderID >',ItemPosition='< ItemPosition >')/ToHeader?$select=CustomerID,CustomerName,DeliveryStatus

for example:

https://group00-cld900-d052537.prod.apimanagement.eu10.hana.ondemand.com/v1/GWSAMPLE_BASIC/SalesOrderLineItemSet(SalesOrderID='0500000001',ItemPosition='0000000040')/ToHeader?$select=CustomerID,CustomerName,DeliveryStatus

2. An XML is provided as a response. Below is an example with


one entry.
3. The important response property is CustomerName of the
orderer.

7. In every SalesHeader, select the CustomerName, and write it to a


datastore.

1. Select the CustomerName and write it to the database.

2. As a result, a Data Store is created in cloud integration that


links the Customer Names to the list of delivered ProductIDs.

3. Below is the Data Store with the customer IDs matching the
list.

4. The generated list can now undergo further processing, with the API used to identify the products and communication data associated with each CustomerID and CustomerName, allowing the customers to be promptly informed of any potential delays. A configuration sketch of the Data Store Write step follows.
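As a rough configuration sketch of such a Data Store Write step (from the Persistence palette; all names are illustrative and not taken from the exercise solution):

Code Snippet

Data Store Name: DelayedDeliveryCustomers
Visibility:      Integration Flow
Entry ID:        ${property.ProductID}

The Entry ID here uses an expression to key each entry by the ProductID currently being processed.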

Explore the Cloud Integration

Business Scenario

You are interested in learning how to construct an integration flow using


the cloud integration.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the Cloud Integration

2. Explore the Cloud Integration

Prerequisites

 You have a working Integration Suite.

 You have a working Cloud Integration.

Outcome after this exercise

You have acquired some preliminary experience with Cloud Integration.

What do you learn within this exercise?

Get familiar with the key navigation elements in cloud integration and
their functions.

Task 1: Log on to the Cloud Integration

Steps

1. Log on to the Integration Suite.

1. Navigate to your subaccount to Instances and


Subscriptions → Integration Suite.
2. Now, navigate to the individual areas.

Task 2: Explore the Cloud Integration

Steps

1. Explore the Discover area.

1. Navigate to Discover → Integrations.

2. You will find all available predefined integration scenarios


here, just as you do in the SAP Business Accelerator Hub.

2. Explore the Design area.

1. Navigate to Design → Integrations and APIs.

2. This is the most important development area, where you can


create, deploy, and manage integration flows and other
artifacts.
3. Explore the Monitor area.

1. Navigate to Monitor → Integrations and APIs.

2. Here, you can:

 Monitor Message Processing

 Manage Integration Content

 Manage Security

 Manage Stores

 Access Logs

 Manage Locks
 View Usage Details

4. Explore the Settings area.

1. Navigate to Settings → Integrations.

2. Here, you can manage:

 Runtime Profiles

 Transport

 System
 Custom Tags

 Malware Scanner

 Software Updates

 Design Guidelines

5. Find more information within the online guide.

1. Access the online guide via user symbol → Online Guide.

2. You can navigate through every area here to access further


information.
Explaining the Development Cycle
The Development Cycle

In this lesson, the following topics are discussed:

 Development with Cloud Integration

 Technical Implementation

 Development Cycle for Creating integration flow

 Versioning of your integration flow

 Developer test with real Deployment and Debugging of your


integration flow

 Developer test with Simulations of your integration and components

Development with Cloud Integration

Cloud integration is a subscription service, which means that graphical


modeling and processing of the process steps, among other things, occur
on the subaccount where you registered for the service. The consumer
and provider subaccounts are located in the same region, such as eu10
(Frankfurt), and communication takes place directly via the browser.
Therefore, all content displayed in the browser is delivered as an HTML
data stream. The URLs for different levels are provided below, and in all
cases, we are located in the eu10 region. The cloud integration URL
begins with the subdomain of your subaccount, followed by the specific
domain and the context path:

 URL of your subaccount (No. 1): https://emea.cockpit.btp.cloud.sap

 URL of the Integration Suite (No. 2): https://< subdomain >.integrationsuite.cfapps.eu10-003.hana.ondemand.com/shell/home

 URL of the Cloud Integration (No. 2): https://< subdomain >.integrationsuite.cfapps.eu10-003.hana.ondemand.com/shell/design
When working, this means:

 All the available capabilities are used via the browser.

 The browser decides on the presentation. That's why SAP


recommends the latest Chrome browser.

 The browser sets timeouts.

 With poor network connection, there may be greater time delays of


the response.

 You can work with any device as long as the screen is large enough,
an updated Chrome browser is available, and there is a sufficient
Internet connection.

You need at least the PI_Integration_Developer Role Collection.

Technical Implementation

As mentioned at the outset, the core of the system is the Camel


integration framework. SAP enhances the Camel framework with a
graphical client and various security features. The complete
implementation is a Java application and comprises the following
components:
The first component (No. 1), is your browser, which accesses the
implementation via the Cloud Integration URL to create and manage the
integration flow. The second component (No. 2), is the graphical interface.

Once the integration flow is created, if it is deployed as a Java application


on the runtime (Cloud Foundry, Kyma), (No. 4) messages can be
transmitted using the sender component (No. 3), and received using the
receiver component (No. 5).

A load balancer is connected to the sender input (No. 3), and interestingly, messages do not go directly to the runtime.

Resources on a Tenant

The resources for a Cloud Integration implementation are limited.


Development Cycle for Creating Integration Flow

In the related exercises, we follow the principle of building professional


integration flows.

To create a development cycle, the following steps must be carried out in


order:

 Understand your use case.

 Configure the SAP BTP subaccount and the Integration Suite.

 Find the list of required APIs with all their metadata, such as credentials, headers, and more.

 Start in the Cloud Integration with an empty template.

 Modeling your processes.

 Build the integration flow piece by piece.

 Repeat the steps.

 What comes next?

Here are explanations about the steps:

Understand your Use Case


Together with all parties involved, the use case is thoroughly analyzed and the SAP Integration Solution Advisory Methodology is applied to address all requirements.

Configure the SAP BTP subaccount and the Integration Suite

The next step involves providing integration developers with the relevant
role collections, enabling them to work on the appropriate Integration
Suite. This is coordinated with the administrators.

Find the List of Required APIs with All Their Metadata, Such as Credentials, Headers, and More

If all APIs are listed in an API Business Hub Enterprise, you are fortunate: the work of obtaining the necessary URLs and parameters is already done. If not, plan enough time to obtain this data and test the interfaces.

Start in the Cloud Integration with an Empty Template.

Create a package with a meaningful name. Here is a proposal for name


conventions: Naming Conventions.

To start, select the integration flow artifact and an empty template will be
created automatically. If an incoming message is needed, it can be
simulated using a Timer event to start and a Content Modifier to simulate
the message. This approach facilitates faster and easier development
cycles.

Modeling your Processes

It can be difficult to establish clear criteria for process development.


Sometimes a process can initially seem simple, but it can later be broken
down into multiple individual processes. It is important to consider that
the process must be understood by the specialist staff in the future. In the
exercises of this training, we focus on implementing one process.
However, it is possible to outsource API calls to separate processes with
their own error handling.

Build the Integration Flow Piece by Piece

There are various ways to develop integration flows depending on the use
case. For the practical exercise, it is recommended to start with the API
calls. Once the connections are established, it becomes easier to
determine the required input and output. Unlike XI or PI with its XI
message protocol, there is no internal format in cloud integration. Thus, it
is important to consider the internal formats and transformations needed.
The help section for each integration flow component can be used to find
the appropriate configurations. This process is also demonstrated in the
exercises. After configuring a component, it is essential to debug and
verify that the output meets our expectations. Generally, there are two
ways to test our integration flow.

These are:

 Simulation of integration flows.

 Test with real deploying and debugging. This approach is used in the
exercises.

Both topics are examined in more detail next.

Repeat the Steps

Iterate through the steps until your integration flow functions as intended.

What Comes Next?

The first step is to test the process; various testing procedures are discussed in detail later. Once the testing is successful, the integration flow is transported to the production subaccounts. Continuous monitoring of the processing, and the implementation of alert management to respond to unforeseen events, is the responsibility of the administrators and is not addressed separately here.

Versioning of Your Integration Flows

It is important to version the development state periodically to allow for


the possibility of reverting back to a previous version if necessary.

Procedure to Version Integration Flow:

 Start in your editable integration flow.

 Choose the Save as version button on the top right.

 Enter a meaningful comment.

This approach is used in the exercises.

Procedure to Switch to Former Version:

 Start at your package.

 Mark your artifact (integration flow) in the list of available artifacts.

 Go to the Version column.

 Choose the version number.

 Choose a former version.

 Choose the symbol to change back to former version.


Developer Test with Real Deployment and Debugging of Your Integration
Flow

Before examining the integration flow, it must be deployed in the


monitoring environment. The graphical model is converted into a Java
application and placed in the runtime, allowing the integration flow to be
started. If the deployment is successful, the integration flow will either
execute immediately if a timer event is used, or it waits for an incoming
message. Cloud integration offers a trace log level that provides insight
into the processing of each integration flow component.

To Perform a Developer Test, the Following Steps Must Be Carried Out in


Order:

 Start at your integration flow.

 Choose the Deploy button.

 Choose a spot in the white space outside the integration flow swim
lane.

 Choose the Deployment Status in the Integration Flow configuration


area.

 If your integration flow is successfully deployed, you will see a Navigate to Manage Integration Content link.

 Choose this link to jump to Monitor Artifacts → Overview → Manage


Integration Content.

 Change the log level to trace.

 Deploy again if you use a timer starting event. Otherwise, send a


message to the endpoint.

 If you deploy again, come back to Monitor


Artifacts → Overview → Manage Integration Content.

 Here, choose the Monitor Message Processing link.

 In the new window, choose Monitor Artifacts → Overview → Monitor Message Processing. Select the last message in the message list.

 Choose the Trace link to jump directly to Monitor


Artifacts → Overview → Monitor Message Processing → Message
Processing Run.

 Explore the trace of your flow.


This approach is used in the exercises.

Developer Test with Simulations of Your Integration Flow and Components

Simulating individual parts or the entire integration flow can be useful to


verify if values are correctly set in a content modifier or if a script or
mapping is executed as expected. However, not all integration flow
components are supported for simulation.

Here is the list of supported integration flow components: Simulation of an


Integration Flow

Example

In the DelayedDelivery_Process, we want to check through a simulation whether the ProductID is set correctly in the Modify_setProductIDAsProperty step.

To Perform Developer Tests with Simulations, the Following Steps Must Be


Carried Out in Order:

 Choose a place on the line in front of


the Splitter_iterateOverProducts component.

 Set the starting point via the context menu.

 Add the input message as a payload (content).

 Choose the line after


the Modify_setProductIDAsProperty component.

 Set the end point of the simulation.


 The simulation navigation bar is now active.

 Start the simulation with the Start button of the navigation bar.

 Choose all envelopes between the start point and the endpoint to
explore the results.

 After the testing, choose the Clear button of the navigation bar.
Summary
The process of creating an integration flow involves using a graphical
editor in the remote cloud integration application. Simulations can be
conducted on individual parts or the entire integration flow to verify that
values are correctly set in content modifiers, scripts, or mappings. Once
the integration flow is complete, it is versioned and deployed, resulting in
the creation and deployment of a Java application in a runtime. The
integration flow can then be executed. The development process can be
approached as cycles, where the placement and configuration of
components, debugging using trace log levels, and testing are repeated
until the desired result is achieved.

Create an Integration Package and Integration Flow

Business Scenario

You have the implementation concept at hand and want to start


developing the integration flow. For this, an integration package must be
created that contains the essential artifact.

Task Flow

In this exercise, you will perform the following steps:

1. Log on to the Design View of Cloud Integration.

2. Create an Integration Package.

3. Create an integration flow within the Integration Package.

Prerequisites

 You have a working Integration Suite.

 You are able to log on to the Design View of Cloud Integration.

Outcome after this exercise

A functional integration flow has been established within the designated


package.

What do you learn within this exercise?

You are capable of creating and working with an initial integration flow.

Task 1: Create an Integration Package and Integration Flow

Steps

1. Log on to the Design view of Cloud Integration.

1. Log on to the Integration Suite welcome page.

2. Navigate to Design → Integrations and APIs.


2. Create an Integration Package with the
name DelayedDelivery_Package_CLD900_Date_Number.

3. If you are visiting the Classroom Training, your trainer provides you
with a number to use for the creation of your iFlow package.

1. Choose the Create button on the top right.

2. Under the Header tab, enter the following data:

Field              Input

Name               DelayedDelivery_Package_CLD900_Date_Number

Technical Name     DelayedDeliveryPackageCLD900DateNumber (created automatically)

Short Description  Package for exercises with the CLD900 course.

Version            1.0.0

Vendor             SAP

3. Choose the Save button and then choose the Artifacts tab to be able to add an artifact.

4. Create an integration flow within the Integration Package with the


name DelayedDelivery_Process_Number.

1. Switch to the Artifacts tab.

2. Open the Add list and choose Integration Flow.

3. Add the following data:

Field Name   Input

Name         DelayedDelivery_Process_Number

ID           (will be filled out automatically)

Description  Processes a list of ProductIDs which are affected by the delayed delivery.

4. Choose the Add button to create the initial integration flow.

5. Choose the DelayedDelivery_Process_Number row to open your initial integration flow.
Using Message Monitoring and Logging

Using Message Monitoring and Logging

In this lesson, the following topics are discussed:

 Types of Monitoring.

 Build-In Monitor for Message Monitoring.

 System Log Files.

 SAP Cloud Integration API for Message Processing Logs.

 External Logging.

Types of Monitoring

In addition to the standard built-in monitor, various tools can be used with
Cloud Integration Monitoring. It offers different categories including
message monitoring, content management, and alert management. In
this discussion, we focus on message monitoring using the built-in monitor
for messages and the SAP Cloud Integration API for Message Processing
Logs.

There is a helpful overview section that highlights the strengths of each


tool.

 SAP Cloud ALM

 SAP Solution Manager

 SAP Focused Run

 SAP Analytics Cloud

 SAP Application Interface Framework

 Cloud Integration OData APIs

Built-In Monitor for Message Monitoring

The lesson applies to the PI_Integration_Developer Role Collection, while the exercises use the PI_Administrator Role Collection, which may lead to differences in the presentation. Upon logging in through the monitor menu, you are presented with the following desktop, where any view can be individually configured by clicking on the plus sign.
No. 1: Monitor Message Processing, No. 2: Manage Integration Content,
No. 3: Manage Security - Admin Place, No. 4: Manage Stores, No. 5: Access
Log.

Here, we are only interested in the jump to message monitoring (No. 1) and the Access Log (No. 5). All other areas mainly concern integration admins.

Monitor Message Processing Area

The overview displays all processed messages in all deployed integration


flows and can be filtered. It also allows direct access to the processed
message.

The Message at the Status Column Can Have Different Values

 Completed

 Processing

 Retry

 Escalated

 Failed

 Canceled

 Discarded

 Abandoned

Sample with Message Status Completed


1. Open Monitor Artifacts → Overview → Manage Integration
Content:

No. 1: the filter bar, No. 2: a message with status Completed.

2. Choose the message with status Completed:

On the right side of the window, there is another section that provides a wealth of information about message processing. When the log level is set to info, this section serves as the primary monitor.

However, if the log level is set to trace, you can access the debugging
mode of the process flow, which we have already encountered in the
exercise.
Jump in Directly from Your Integration Flow

You can navigate from the integration flow to the monitor using the Navigate to Manage Integration Content link, which opens the monitor in a new tab for easier navigation. This approach is used in the exercises of this training, and all information about the processing can be accessed via the different tabs.

System Log Files

The Access Logs section in the Built-In Monitor provides direct access to the system log files via the System Log Files tile. These files include the HTTP access and trace logs, and are retained for 30 days.
After having opened the tile, you have access to the system logs.

SAP Cloud Integration API for Message Processing Logs

The Monitor itself is based on a cloud integration API. To investigate it,


navigate to: api.sap.com and search for the SAP Cloud Integration API.
Find the APIs here: API.SAP.com.
Open the tile Message Processing Logs → API Reference. Now, we can
examine the API with the Swagger UI.

Try out GET /MessageProcessingLogs.


At (No. 1), you find the metadata of this message. At (No. 2), the
navigation is on the left to examine specific information.

Configure the API Access

If a user is assigned the PI_Administrator or PI_Integration_Developer Role


Collection, they can directly access the API using their credentials.

The URL Scheme is:

https://{{host}}/api/v1/

Sample for the Resource LogFiles

Call the following URL with your own subdomain and region: https://< subdomain >.< region >.hana.ondemand.com:443/api/v1/LogFiles
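Since this is an OData API, the standard OData query options can also be applied to the resources. A hypothetical example that returns the ten most recent failed messages (the property names can be verified in the API Reference):

Code Snippet

https://< host >/api/v1/MessageProcessingLogs?$filter=Status eq 'FAILED'&$top=10&$orderby=LogEnd desc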

External Logging

To be independent of the size of the SAP Cloud Integration database, and


manage a large number of log files, customers can use the external
logging feature to send message processing logs to an external system.
This section provides instructions on how to enable this feature on your
tenant.

Resources

Read more:

Types of Monitoring

Read the whole story: Monitoring tools for Cloud Integration Capability of
SAP Integration Suite | SAP Blogs
Built-In Monitor for Message Monitoring

Here is an overview about the sections on monitor start page: Monitoring

Message Status

Find a complete overview and description at: Message Status

External Logging

External Logging

Summary

SAP provides various products to support message monitoring such as


SAP Cloud ALM, SAP Solution Manager, SAP Focused Run, SAP Analytics
Cloud and more. In addition, SAP Cloud Integration provides its own
graphical monitor, which provides a comprehensive view of message
processing. The monitor enables users to read system log files and is built
on an API, which can be directly called. This API enables users to create
their own customized monitor.

Create a Timer Event in Place of the Message Start Event

Business Scenario

You have implemented the initial integration process and now you want to
start developing the actual integration process. To do this, you will learn
the most important basic functions, such as saving and deploying your
developments, and examining them in monitoring.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the Integration Flow DelayedDelivery_Process.

2. Explore the design view.

3. Replace the message start event by a timer event.

4. Version and deploy the integration flow.

5. Use the monitor to check out the result.

Prerequisites

 You have a working Integration Suite.

 You are able to log on to the design view of Cloud Integration.

 You have successfully finished the step Create an Integration


Package and Integration Flow.
Outcome of this exercise

A functional integration flow with a timer-based start event.

What do you learn within this exercise?

The objective is to comprehend the concept of events and


create/configure a timer event. Also, the user can deploy the integration
flow and monitor it.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via the SAP


Integration Suite.

1. Navigate within the Integration Suite Welcome page


to Design → Integrations and
APIs → DelayedDelivery_Package_CLD900_Date_Number → Art
ifacts → DelayedDelivery_Process_Number.

2. Choose the Edit button on the top right. The palette with
the Integration Flow Components is now active.

Task 2: Explore the Design View

Steps

1. Explore the palette with Integration Flow Components.


1. Choose the context sensitive help button on the bottom right.
For each iFlow component, you can access the corresponding
help directly via this help button.

2. All integration flow components are displayed here:

3. Choose the Participant button. Here, you can


configure Sender and Receiver aliases.
You can choose the Sender or Receiver Artifact from the Help icon as previously mentioned.

4. Open the context-sensitive help, navigate to Assign Sender


and Receiver Components, and read the description.

5. Choose the Process button.


6. Open the context-sensitive help menu on the left side,
navigate to Define Process Shapes, and read the description.

7. Continue in the same way with the other pallet elements:

 Event

 Connectors

 Mapping

 Message Transformers

 Call
 Message Routing

 Security Elements

 Persistence

 Message Validators

2. Explore the navigation bar on the top.

1. Choose the Edit button.

Before you can work on the integration flow again, you must set the
integration flow into the edit mode.

2. Explore the navigation bar on the top.

3. Choose the Save button. You get a notification. The navigation


bar does not change.
4. Choose the Save as version button.

A pop-up to enter values for a new version displays.

5. A new version is always necessary if you have a working


integration flow that you may want to jump back to.

6. The navigation bar changes. The Save and Save as version buttons are substituted with the Edit button. A new button, Configure, displays.

7. We use the Deploy button in the next part of the exercise.

8. The Cancel button and the Delete button are self-explanatory.


3. Explore the navigation elements on the right side. These elements
help you to fit your work within the screen.

1. Choose the center of the No. 1. The integration flow is


centered.

2. Choose the + and - symbols of the No. 2 to zoom your


integration flow.

4. Explore the navigation bar on the bottom. This element helps you to
work efficiently and with high performance.

1. Double-click on the white area, somewhere outside the


integration flow.

2. The configuration bar displays at the lower part of the screen.


It is helpful for configuring the integration flow components,
but takes up a lot of space on the screen.

3. If you do not choose any special components, the


configuration of the entire integration flow is called.
4. To hide the configuration bar, choose the minus icon on the
right-side of the configuration bar.

5. Also, try the Maximize button.

6. Also, try the Show Overview button.

Task 3: Replace the Message Start Event by a Timer Event

Steps

1. Replace the message start event by a timer event.

1. Ensure that the process is in editing mode.

2. Choose the Event button and scroll down and find


the Timer entry.
3. To add a timer event to your integration flow, choose
the Timer symbol and navigate along the blue line that
connects the Start and End event in the swim lane of your
integration flow. Once the blue line becomes thicker, position
the timer symbol on the line, and select it using the mouse. Or
you can use the navigation menu by clicking on the blue line,
then choosing Add Flow Step and searching for the flow step
you want to use.

4. The integration flow component can be seen to be placed


exactly at this point by the input arrow pointing to the
currently placed integration flow component.

5. To hide the context menu, choose the white area within the
swim lane.

6. Choose the Start event to display the context menu.


7. Choose the trash can to remove the Start event.

8. Double-click on the Start Timer 1 symbol to display the


configuration bar as previously described.

9. Choose the Scheduler tab. Accept the settings.


Task 4: Version and Deploy the Integration Flow

Steps

1. Version your work.

1. Before deploying, we always create a new version to ensure


that we can revert to the previous version if any issues occur.

2. Choose the Save as version button on top, and enter a


comment, for example, Add and configure a timer.

3. The top navigation bar changed, as previously described. The


version number is set automatically.
2. Deploy your integration flow for the first time.

1. Choose the Deploy button. You will get a demand. Confirm this
by choosing the Yes button.

2. You will get a second notification. Confirm it by choosing


the OK button.

3. Your integration flow is compiled and packed as a .jar file. This .jar file is called from the Java SDK.

Task 5: Use the Monitor to Check Out the Result

Steps
1. Jump to the monitor from your integration flow – the easy way.

1. Access the configuration bar of the Integration Flow as


previously described. Choose the Deployment Status tab.

Note

The menu options display when you click in the white canvas space as
previously mentioned.

2. After deploying your integration flow, you will receive an


active link Navigate to Manage Integration Content that you
can use to access the monitor.

3. Choose the Navigate to Manage Integration Content link and a


new tab opens in your browser, displaying the Overview page
within the Manage Integration Content section of the monitor
area.

4. A new tab will always open when you choose this link.

5. You will find your integration flow, its status, and a link for
monitoring your integration flow.
6. Choose the active Monitor Message Processing link. You are
switched to a new page Overview → Monitor Message
Processing.

7. Choose the most recent message with a Completed status to


navigate further into the process.

8. The details about the process displayed on the right are


limited because the debug level is set to info.

9. There is a second introduction to monitoring.

10. Jump back to your integration flow by choosing Navigate


to Artifact Editor.

2. Access the monitor from the left-side navigation bar - a more


complex method.

1. On the left side, choose Monitor → Integrations and APIs.


2. The Overview page opens.

3. Choose the Manage Integration Content tile. Now,


the Overview → Manage Integration Content page opens. It
can also be accessed directly from the integration flow, as
previously described.
4. Let's try deploying and monitoring using the described
procedure several times, as it is a common practice going
forward.

5. The key distinction between the two methods of accessing the


monitor is that, in the first case, the monitor opens in a
separate browser window, making it easy to switch back and
forth between the integration flow and monitor.
Explaining the Camel Data Model and
Simple Expression Language
Camel Data Model and Simple Expression Language

In this lesson, the following topics are discussed:

 The Camel Data Model.

 Simple Expression Language.

The Camel Data Model

A Camel message is a container for the following elements:

 Headers

 Properties

 Attachments

 Body

 Others

Here are the explanations:

Headers
Header data contains information related to the message, such as the
message sender's address, and is automatically included in any
subsequent HTTP call.

Properties

More data can be temporarily stored during message processing in the


form of context objects.

Attachments

Contain optional data that is to be attached to the message.

Body

The payload to be transferred in a message is contained within the body.


During message processing, an Exchange container is also available,
which can store extra data besides the message. This container is
uniquely identified by an Exchange ID and can hold temporary data in the
Properties area during message processing. The data stored in the
Exchange container is available for the entire duration of the message
exchange and is included in the container when the next processing step
is called.

Manipulation of the Exchange Parameters

Exchange parameters, including the payload, are automatically set by


incoming messages. However, these parameters can also be manually
manipulated by reading and writing. Various methods are available for
manipulating exchange parameters:

 Use of the Content Modifier component

 Use the Groovy SDK

 Use the JavaScript SDK

 Use of UDF in Message Mapping

 Use of XSLT Mapping

 And even more

Set Exchange Parameters with Content Modifier Component

The Content Modifier element offers a graphical way to manipulate the Exchange parameters; a configuration sketch follows the list below.
You can manipulate:

 Header

 Properties

 Body
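As a minimal configuration sketch (the names are illustrative), an Exchange Property that stores a ProductID taken from the payload could be defined on the Exchange Property tab as follows:

Code Snippet

Action:       Create
Name:         ProductID
Source Type:  XPath
Source Value: //ProductID
Data Type:    java.lang.String

The property can then be read anywhere later in the flow with the Simple expression ${property.ProductID}.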

Set Exchange Parameters with Groovy SDK

The com.sap.gateway.ip.core.customdev.util.Message class offers


methods to manipulate the parameters.

The same applies to JavaScript.
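The following minimal Groovy script sketch shows how these methods are typically used in a Script step; the property and header names are illustrative:

Code Snippet

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Read the current payload as a string
    def body = message.getBody(java.lang.String)

    // Set an exchange property, available for the rest of the processing
    message.setProperty("ProductID", "HT-1000")

    // Set a header, included in subsequent HTTP calls
    message.setHeader("Content-Type", "application/xml")

    // Write the (possibly modified) payload back
    message.setBody(body)
    return message
}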

Simple Expression Language

The Simple Expression Language is used to parameterize Exchange parameters. It also offers several built-in parameters, such as timestamps, error messages, and more. This means you have read-only access to, for example, the Exchange parameters.
The general scheme is the ${ } placeholder containing a built-in variable
or Exchange Parameters. In Cloud Integration, the ${ } placeholder can be
inserted in, for example, the payload in a Content Modifier step, or applied
in the Query Editor, adding dynamic values to an OData resource path.

The ${ } placeholder can also be combined with operators to produce


boolean expressions, which you can then use as conditions
in Routers, Filters, and more integration flow components.

Samples:

Code Snippet

${property.MyNumericProperty} > 0

${property.MyStringProperty} contains 'test'

${property.ProductCode} regex '[a-z]{5}\d{3}'

${date:now:dd-MM-yyyy HH:mm}

Specials Within Daily Business

The Message Body

${in.body}

Properties

${property.< name >}

Message Headers

${header.< name >}

The Simple Expression Language Can Be Used:

 In Scripting with Groovy or JavaScript.

 Within some integration flow components like Router, Content


Modifier, and Message Mapping as user defined functions.

 In XSLT Mappings.

 In some adapters for querying.

Resources

Read more here:

Guidelines for modifying content


Guidelines for Modifying Content

Exchange Parameters with Content Modifier Component

 Define Content Modifier

 SAP Integration Suite - Deep dive into Content Modifier | SAP Blogs

Using Camels Simple in CPI Groovy Scripts

Using Camel’s Simple in CPI Groovy scripts | SAP Blogs

Simple Expression Language

 Simple Expression Language

 Using Camel Simple Expression Language

 Get to know Camel’s Simple expression language in SAP Cloud


Integration | SAP Blogs

Summary

The Camel Data Model is used to manage temporary data during


processing in the individual integration flow components. This data model
includes not only the payload (body) but also properties and header data,
which are automatically included in an HTTP call.

The Exchange container is passed from the predecessor to the next processing step with each processing step. Exchange parameters are set automatically, for instance when a message is received, and manually through components like the Content Modifier or the Groovy SDK, among others.

Accessing the Exchange Parameters for reading is done through the


Simple Expression Language, which not only includes built-in parameters
but also allows for modeling complex regex expressions.

Create a Content Modifier with Sample Data as Payload (XML)

Business Scenario

For development, you receive a customized local ProductID list in XML


format. It is used for development and testing. The ProductIDs correspond
to those in the production system.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.


2. Place and configure a Content Modifier component.

3. Save as version, deploy, and debug your integration process.

Prerequisites

The step of creating a timer event instead of the message start event has
been completed.

Outcome after this exercise

A Content Modifier with sample data as the payload is provided for use.

What do you learn within this exercise?

 Learn to use and configure a Content Modifier.

 Learn to trace your process.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via


Integration Suite.

1. Navigate within the Integration Suite Welcome page


to Design → Integrations and
APIs → DelayedDelivery_Package_CLD900_Date_Number → Art
ifacts → DelayedDelivery_Process.

2. This is the status after the last exercise step.

3. Check that the integration flow is in the editing status.

Task 2: Place and Configure a Content Modifier Component

Steps

1. Place a Transformation → Content Modifier component after


the Start Timer 1 event.

1. Place a Content Modifier component, as shown at the


beginning.
2. The result is a placed Content Modifier component.

2. Configure the Content Modifier component.

1. Call the configuration bar of the Content Modifier component


with a double-click on the component.

2. Choose the General tab and enter the following name.

Field Name   Input

Name         Modify_setPayload

3. Switch to the Message Body tab and enter the following SOAP payload.

Code Snippet


<soapenv:Envelope
xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">

<soapenv:Header/>

<soapenv:Body>

<List>

<Product>

<ProductID>HT-2001</ProductID>

</Product>

<Product>

<ProductID>HT-1020</ProductID>

</Product>

<Product>

<ProductID>HT-1035</ProductID>

</Product>

<Product>

<ProductID>HT-6101</ProductID>

</Product>

<Product>

<ProductID>HT-7000</ProductID>

</Product>

<Product>

<ProductID>HT-2026</ProductID>

</Product>

<Product>
<ProductID>HT-6100</ProductID>

</Product>

<Product>

<ProductID>HT-9994</ProductID>

</Product>

<Product>

<ProductID>HT-1254</ProductID>

</Product>

<Product>

<ProductID>HT-1031</ProductID>

</Product>

<Product>

<ProductID>HT-1035</ProductID>

</Product>

<Product>

<ProductID>HT-1601</ProductID>

</Product>

<Product>

<ProductID>HT-2025</ProductID>

</Product>

<Product>

<ProductID>HT-1067</ProductID>

</Product>

</List>

</soapenv:Body>

</soapenv:Envelope>

4. The requested products with their ProductIDs must already exist in the database. You can test this beforehand.

5. To ensure that the ProductIDs used return data, you can check with the API beforehand. For example: https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/ProductSet('HT-1000')

Note

Be careful when copying the URLs. Ensure that the URL contains straight single quotes (') and not typographic apostrophes.

Task 3: Save as Version, Deploy, and Debug Your Integration Process

Steps

1. Save as version and deploy your integration flow.

1. As described in the last step, save as version, and deploy.

2. Wait until the integration flow is successfully deployed.

2. Set the log level to trace.


1. Go to Monitor → Integrations and APIs → Manage Integration
Content from your Integration Flow configuration bar.

2. Change the Log Configuration → Log Level to Trace. With trace


level you see the payload.

3. Switch to the browser tab where your integration flow is open


and initiate the deployment process to start the debugging.

You must confirm the change of the Log Level within a pop-up message.

Note that the modified log level is only applicable from the next instance of the deployed process. In this particular case, it is effective immediately after deployment. This means you must redeploy your integration flow before you can trace your flow messages.

The log level Trace expires 10 minutes after activation!


4. From the deployed integration flow, directly go back to the
current Monitor → Integrations and APIs → Manage Integration
Content.

5. A new browser tab opens, displaying the current instance of


the Monitor → Integrations and APIs → Manage Integration
Content.

6. Choose the Monitor Message Processing link.

7. Select the message with the status Completed. Then, in the


detailed view of the message that appears on the right,
choose the Log Level link.

8. You are in the Trace view of your monitor:


3. Explore the trace capabilities.

1. On No. 1, you see the steps.

2. On No. 2, you see the Integration Flow Model – corresponding


with the tab, Integration Flow Model.

3. On No. 3, you see the log content of the step.

4. On No. 4, you see the payload.

5. Choose the step End (No. 1). You see that the corresponding
part of the Integration Flow Model is also marked.

6. Choose the Log Content tab (No. 3). You see the log entries for this step.
7. Choose the Message Content tab (No. 4).

8. Choose Message before Step → Payload to see the Payload as


a result of the last step – here Modify_setPayload.
Modeling Processes

Modeling Integration Flows in an


Overview
Integration Flows Modeling

In this lesson, the following topics are discussed:

 Integration Flow Design Guidelines Overview.

 Learn the Basics.

 Guidelines to Design Enterprise-Grade Integration Flows.

 Guidelines to Implement Specific Integration Patterns.

 Integrated Design Guidelines.

Integration Flow Design Guidelines Overview

Integration developers must ensure that integration flows are designed in


a robust manner to protect their company's mission-critical business
processes. As each use case can be unique, there are as many integration
flow models as there are use cases. SAP provides solutions for recurring
requirements through the Integration Flow Design Guidelines, which are
well-documented and implemented. These example integration flows can
be tested directly.

This section provides guidelines for integration developers covering the


following three main aspects:

 Learn the Basics.

 Guidelines to Design Enterprise-Grade Integration Flows.

 Guidelines to Implement Specific Integration Patterns.

The integration flows are designed to meet the following requirements:

 One specific guideline or pattern is the focus of each integration


flow, making it easy for you to understand the topic.

 You can easily deploy and execute each integration flow with
minimal effort, allowing you to test each guideline or pattern on
your own.
 Each reference integration flow can serve as a foundation for
developing more intricate scenarios.

Implement the sample packages

The last three exercises taught you how to implement and use a sample
integration flow from the "Learn the Basics" package.

Learn the Basics

Here, the following topics are discussed based on sample


implementations:

 Start your integration flow design journey by getting familiar with a


set of three simple integration flows that demonstrate the basic
features of message processing. These flows progressively increase
in complexity.

 Learn how to access (and set) headers and properties.

 Learn how to design integration scenarios with integration flow to


integration flow communication.

 Learn how to configure adapters.

 Learn how to transport integration content from a source to a target


tenant.

 Learn how to implement different scenarios to decouple sender and


integration flow processing.

 Learn how to retrieve only delta data from the source system using
the current date or the latest date in the payload.

 Learn how to handle exceptions with an exception subprocess.

 Learn how to use the monitor application to analyze the behavior of


an integration flow at runtime.

 Learn how to modify content to use different integration flow steps


(for example, the content modifier or the content enricher) to
modify the message content:

o Learn how to convert data from a source into a target format.

o Learn how to encode and decode content.

o Learn how to handle message mappings.

 Learn how to use steps that store the message on the tenant
database.

 Learn how to transfer files.


Guidelines to Design Enterprise-Grade Integration Flows

Here, the following topics are discussed based on sample


implementations:

 Learn to design with high availability.

 Learn to design with resilience.

 Learn to deal with limited resources.

 Learn to design loose coupling.

 Learn to handle failures gracefully.

 Learn to design flows readability.

 Learn to use prepackaged integration content.

Guidelines to Implement Specific Integration Patterns

Based on sample implementations, the following topics are discussed


here:

 Learn to implement and use an Aggregator pattern.

 Learn to implement and use a Composed Message


Processor pattern.

 Learn to implement and use a Content based routing pattern.

 Learn to implement and use a Content Enricher pattern.

 Learn to implement and use a Content Filter pattern.

 Learn to implement and use a Message Filter pattern.

 Learn to implement and use a Recipient List pattern.

 Learn to implement and use a Resequencer pattern.

 Learn to implement and use a Scatter-Gather pattern.

 Learn to implement and use a Splitter pattern.

 Learn to implement and use a Quality of Service pattern.

Integrated Design Guidelines Check

Within the integration flow editor, you can now execute design guideline checks on your integration flow, with an analysis view and a report of violating components.
Resources

Read more here:


Basic Documentation

Description of the integration flows at: Learn the Basics

Documentation About Guidelines to Design Enterprise-Grade Integration


Flows

Description of the integration flows at: Guidelines to Design Enterprise-


Grade Integration Flows

Documentation About Guidelines to Implement Specific Integration


Patterns

Description of the package, including further information about the links


to the single patterns: Guidelines to Implement Specific Integration
Patterns

Summary

SAP offers various solutions for common technical requirements through


documentation and implementation examples. These areas include Learn
the Basics, Guidelines for Designing Enterprise-Grade Integration Flows,
and Guidelines for Implementing Specific Integration Patterns. You can
research and use these examples in your own projects.

Create and Configure a General Splitter

Business Scenario

To process every ProductID of the incoming list, you aim to implement a


splitter (iterator) to further expand your integration process.
Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a General Splitter.

3. Save as version, deploy, and debug your integration process.

4. Learn more about the splitter component.

Prerequisites

The step of creating a Content Modifier with sample data as payload (XML)
has been completed.

Outcome after this exercise

A General Splitter component is currently running.

What do you learn within this exercise?

Learn to use and configure a General Splitter.

Steps

1. Log on to the integration flow DelayedDelivery_Process via the SAP


Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page


to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

3. Ensure that the integration flow is in edit mode.

2. Create and configure a General Splitter.

1. Get a General Splitter from Routing → Splitter → General


Splitter and place it after the Modify_setPayload component,
as described.

2. Expand the General Splitter by choosing the artifact. The configuration tab pops up; change the name to Splitter_iterateOverProducts.
3. Switch to the Processing tab and enter the following data:

Field Name         Input

Expression Type    XPath

XPath Expression   //Product

Grouping           1

Streaming          flagged

Stop on Exception  flagged

4. The Product element is used in the product list to encapsulate the ProductIDs. With grouping set to 1, each ProductID is processed sequentially, as illustrated below.
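To illustrate the effect with the sample payload from the Modify_setPayload step: because a General Splitter keeps the enveloping elements, each split message should look roughly like this (here the first iteration):

Code Snippet

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <List>
      <Product>
        <ProductID>HT-2001</ProductID>
      </Product>
    </List>
  </soapenv:Body>
</soapenv:Envelope>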

3. Save as version, deploy, and debug your integration process.

1. Perform the following steps.

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.


2. Navigate to your Integration Flow Model
in Overview → Monitor Message Processing → Message
Processing Run.

3. Choose the Splitter_iterateOverProducts Segment 1 (No. 1)


option and navigate to the Message Content tab. You can view
the payload from the previous step, which includes several
products.

4. By choosing the first End artifact, you notice that the message
is being processed. You can check the payload of
multiple End artifacts to see the single product by using
the navigation menu on the top-left side.
5. Choose the last End step to see a different product.

6. In summary, it can be stated that the General Splitter works


as expected.

4. Learn more about the splitter component.

1. Navigate back to your integration process.

2. Open the configuration bar of the General Splitter, and choose


the question mark symbol.

3. Choosing the question mark symbol in the configuration bar of the Splitter takes you directly to the help page of the General Splitter.
Learning the Basics
The Basics of Process Modeling

In this lesson, the following topics are discussed:

 Handle Attachments.

 File Transfer.

 Decouple integration flows.

 Use Converters.

Handle Attachments

In this lesson, you are not using attachments, but you learn how to
handle them effectively in integration flows: a file in a format such as
plain text is attached to the message exchange as a parameter.

Read more here:

 Create Attachments

 Replace Body with Content of Attachment

 Read Multiple Attachments

 Read Attachment Based on Filter Criteria

 Read Multiple Attachments Based on Filter Criteria

File Transfer

This lesson covers transferring files from a server, although it does not
involve the use of attachments.

Read more here:

 Poll File by Done File

 Poll Folder by Fixed Done File

 Concatenating Files via Poll Enrich

 Combine XML Files via Poll Enrich

 Poll and Merge Folder

Decouple integration flows

In this lesson, you learn about decoupled processing, which involves the
asynchronous decoupling of integration scenarios between the sender and
the integration flow.
Read more here:

 Decouple Sender and Flows Without Persistence

 Decouple Sender and Flows Using Persistence (using either the data
store or JMS message queues)

Decouple with SOAP Adapter

This lesson uses a specific configuration for the SOAP adapter, which
invokes the integration flow asynchronously. The configuration is as
follows:

 Message Exchange Pattern: One-Way.

 Process Settings: WS Standard.

Use Converters

This lesson provides guidance on how to perform file format conversions.
Converters are necessary because, unlike Process Integration with its XI
message protocol, Cloud Integration has no internal message format. XML
is the most commonly used format, since it is required for Message
Mapping and XPath operations; the JSON format is also widely used.

Read more here:

 Use the CSV to XML Converter

 Use the XML to CSV Converter

 Use the JSON to XML and XML to JSON Converter
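As a standalone illustration of why a common, well-supported format matters (this is plain Groovy, not converter flow-step code), a few lines suffice to turn a small XML structure into JSON:

import groovy.json.JsonOutput

// Made-up sample input for this sketch
def xml = '<Order><ProductID>HT-1000</ProductID><Quantity>2</Quantity></Order>'
def order = new XmlSlurper().parseText(xml)

// Build a map from the XML and serialize it as JSON
def json = JsonOutput.toJson([ProductID: order.ProductID.text(),
                              Quantity : order.Quantity.text() as int])
println json   // {"ProductID":"HT-1000","Quantity":2}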

Summary
Attachments can be created, modified, and combined, while files can be
retrieved using the SFTP adapter in various ways. Asynchronous
decoupling of integration flows separates the call time from the
processing time. It's also important to note that XML or JSON formats are
necessary for message mapping and XPATH operations.

Create and Configure a Content Modifier

Business Scenario

To expand your integration process, you now want to store the currently
processed ProductID and use it to enrich the message. Setting an
Exchange Property in a Content Modifier enables this.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a Content Modifier.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the task of creating and configuring an OData call.

Outcome after this exercise

The Exchange Property is used to store the ProductID that is being
processed.

What do you learn within this exercise?

You learn how to use and configure a Content Modifier with an Exchange
Property.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the SAP Integration Suite welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

3. Imagine that the integration flow has been edited further.

Task 2: Create and Configure a Content Modifier

Steps

1. Enlarge the swim lane of your processes.

1. Keep adjusting the right border of your swim lane until the
border line becomes active.

2. Use the mouse to drag the center point towards the right side.

3. Proceed in the same manner to the right of the End_Message_Event.

2. Create and configure a Content Modifier.

1. Position an additional Content Modifier object after
the General Splitter, just like at the beginning.
2. Name it Modify_setProductIDasProperty.

3. Switch to the Exchange Property tab.

4. Choose the Add button, and enter the following data.

Field Name Value

Name: ProductID
Source Type: XPath
Source Value: //ProductID
Data Type: java.lang.String (S in upper case)

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

3. Choose End_Message_Event from the first loop.


4. Switch to the Message Content tab, and choose the Exchange
Property link.

Now, you see the ProductID as a property, which is set correctly.

2. Learn more about the Content Modifier component.

1. Return to your Integration Process.

2. Access the configuration bar of the Content Modifier.

3. Choose the question mark symbol.

4. You are taken directly to the help site of the Content Modifier.
Using Adapters
Adapters

In this lesson, the following topics are discussed:

 Adapters in an overview.

 Sample OData receiver adapter.

Adapters in an overview

A wide variety of prebuilt adapters is available, with a differentiation
made between inbound (sender) and outbound (receiver) adapters. These
adapters support various application and transport protocols, as well as
message protocols, and are configured based on their intended function.
Adapters can be broadly categorized into two groups:

 TCP-based.

 Non TCP-based.

Often these adapters are simply called: HTTP-based and non HTTP-based.

Example: OData Adapter

Example: Details of an OData Adapter

Detail Outcome

Category: HTTP-based
Transport protocol: TCP/IP
Application protocol: HTTP/HTTPS
Message protocol: Atom Pub as XML or JSON representation

Overview of Available Adapters

You have the option to visit the help page or view the available adapters
on an integration flow, based on your license, as demonstrated earlier
with connectivity.

Read more here: Configure Adapter in Communication Channels


Custom Adapter

In case the current adapters do not meet your requirements, it is possible
to import adapters from third-party sources.

Read more here: Importing Custom Integration Adapter in the Cloud
Foundry Environment

Develop your own adapter

In case none of the previously mentioned sources are helpful in finding the
desired adapter, you can also create your own adapter.

Read more here: Developing Custom Adapters

Difference from the Adapters in Process Integration

Within PI, every incoming message format is automatically transformed
into the internal XI message protocol. Cloud Integration, by contrast,
has no native format. Essentially, this implies that if a message is
sent in a binary format, it is forwarded as such to the next integration
flow component without any modification. That's where the converters
come into play. Using the XML message format enables excellent support
(XPath) in the individual integration flow components.

Sample OData Receiver Adapter

The OData adapter is applied in the exercises. What sets the OData
adapter apart, and why is it selected for use in the exercise? This is
discussed in the following paragraphs.

The Query Wizard

The OData receiver adapter features a wizard that allows users to explore
the interface to be accessed, based on its metadata document. This
approach makes it possible to configure the adapter even if the interface
details are not fully known. However, this wizard is restricted to OData
V2.0 and is only appropriate for small hierarchy structures.
The Page Processing Mode

OData interfaces transmit data in the form of a feed using the Atom Pub
protocol, with namespaces and their respective prefixes used for added
clarity.

Sometimes, the number of entries in the feed can be substantial,
potentially resulting in technical issues due to message size limitations.
To overcome this challenge, page processing is employed. It allows the
total quantity of entries to be read in packages or pages, which are then
processed sequentially. Nevertheless, this procedure requires an
additional design decision.

Sample with Page Processing Mode

No Page Processing

At No. 1, the Scenario has a looping Process Call. At No. 2, the OData
Adapter calls an OData API.
The Result is:
With Page Processing

Similar to the previous scenario, the OData configuration is situated
below. In this instance, each call is expected to deliver only 5 items.

The Result is:

Automatic Removal of the Namespaces from the Response

Apart from the option to gather data in pages, the namespaces and their
prefixes are automatically removed from the response. This enables the
data to be processed directly with XPath. During the practice session,
the HTTP adapter is used for the other calls; however, it does not
automatically remove the namespaces, which requires an extra mapping step.

Read more here: Configure the OData Sender Adapter

Summary
SAP provides a range of adapters in Cloud Integration, which vary based
on their direction (inbound or outbound) and the transport, application,
and message protocols employed. Broadly speaking, these adapters can
be classified as HTTP-based, TCP/IP-based, or non-HTTP/non-TCP/IP-based.
For instance, the OData receiver adapter offers several unique features,
such as a wizard that facilitates easy configuration of the OData API to be
called and the capability for page processing to handle large data
volumes. Also, the adapter removes namespaces and their respective
prefixes from the response, which is another significant advantage.

Create a Request and Reply to an External Call (OData Adapter)

Business Scenario

In your integration process, various ProductIDs are used. To check
whether the supplied ProductIDs are correct, you must validate them. To
do this, you set up a query against the existing database, using the API
you created in API Management. This verification is performed through an
OData protocol query.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure an OData Adapter.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the final step of creating and configuring a Content
Modifier.

Outcome after this exercise

You implement the first external API call.


What do you learn within this exercise?

Learn to configure an OData adapter within an external call to an API.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Number → DelayedDelivery_Process_Number.

2. Check the status after the last exercise step.

3. Imagine that the integration flow has been edited further.

Task 2: Create and Configure an OData Adapter

Steps

1. Set a Receiver.

1. To create a receiver, choose the Participants menu and
select Receiver.

2. Move the receiver to the position you want.

3. Name the receiver API_SalesOrder_ProductSet.

2. Set an External Call.

1. Choose from the palette Call → External Call → Request
Reply and place it after the Modify_setProductIDasProperty step.

2. Name it Call_checkProductID.
3. Configure the OData Adapter.

1. Drag the arrow downwards to the
receiver API_SalesOrder_ProductSet from the context menu.

2. Select OData → OData V2 as the adapter.

You can also choose OData V4 if you want.

3. If the configuration bar does not display automatically, choose
the connection line
from Call_checkProductID to API_SalesOrder_ProductSet.

4. Select the Connection tab on the configuration bar, and enter
the following data:
Name Input

Address: https://<your host from your API>/v1/GWSAMPLE_BASIC

for example: https://group00-xxxxx-
yyyyyy.prod.apimanagement.eu10.hana.ondemand.com/v1/GWSAMPLE_BASIC

DO NOT COPY THIS LINK

Proxy Type: Internet
Authentication: None
Reuse Connection: flagged

5. Make sure to remove the checkmark for "CSRF Protected".

6. Your API URL can be found in the Configure → APIs → API
Proxies section.

7. Navigate to the Processing tab. Here, you find an OData
wizard that assists in configuring the entities and queries.

8. Choose the Select button.

9. Accept the offered entries, and choose the Step 2 button.

10. In this step, choose the magnifying glass next to
the Select Entity field. Choose ProductSet.

11. Select the fields of this entity that should be read. These
include:

 ProductID

 Category

 Name

12. The selected data displays below as a query.

13. Choose the Step 3 button.

14. To ensure that the interface only returns a result
if the ProductID stored as an Exchange Property is found in the
database, we must specify this as a condition.

15. For No. 1, select the ProductID from the interface.

16. For No. 2, select Equal from the list of conditions.

17. For No. 3, enter manually: ${property.ProductID}.

Be careful to write the Simple Expression Notation correctly, as mentioned
in the previous step.

The Simple Expression Language can be used to access the stored
property.

18. Choose Finish.

The fields Operation Details, Resource Path, and Query Options are
automatically set. Make sure that the Content Type and Timeout are
also taken over.

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps.

1. Save as version.
2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model
in Overview → Monitor Message Processing → Message
Processing Run.

3. Choose the End artifact and choose the Message Content tab.

4. Choose the Payload tab to see the message.

5. If a ProductID is stored in the database, the API call functions
properly. However, you must also verify whether the API call
works when the ProductID is not stored in the database.

6. Go back to your integration process and choose Edit.

7. Choose the Modify_setPayload component, open the Message
Body, and change the second ProductID to HT-xxxx.
8. Save and deploy again. Check your trace. The result must be
without any product entries.

9. Don't revert the incorrect ProductID, as we must use it
for the next step of the verification process.

2. Learn more about the OData Adapter component.

1. Return to your integration process.

2. Open the configuration bar of the OData component and
choose the question mark symbol.

3. Here, you find all the information about the different adapters
and how they can be used, in the familiar way.

Create and Configure a Router

Business Scenario

After extra Exchange Properties have been set with the Content Modifier
and the message query has been expanded, empty results must be
filtered out. This is achieved using the Router artifact.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a Router.

3. Save as version, deploy, and debug your integration process.

Prerequisites

The creation of a Content Modifier with the Exchange Properties has been
completed by you in the last step.

Outcome after this exercise

A working integration flow with a filtering of non-existent ProductIDs.

What do you learn within this exercise?


You get familiar with how to use and set up a Router that consists of two routes.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

3. Imagine that the integration flow has been edited further.

Task 2: Create and Configure a Router

Steps

1. Set a Router.

1. Place a Router artifact, in the familiar way, after
the Modify_setProductIDasProperty component, and name
it Router_ProductID.

2. Rename the existing End Message
Event to End_Message_with_ProductID.
2. Create the following Condition in the router:

1. Choose the line between the Router and
the End_Message_with_ProductID artifact. Choose the General
tab and change the name from Route 1 to ProductID_Available.

2. Switch to the Processing tab. In the Condition field, enter the
condition below. It means: if at least one Product child element
exists, take this path.

Name Input

Condition: /ProductSet/count(Product)>0
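Conceptually, the condition performs the following check (a standalone Groovy sketch with a made-up payload, not router code):

// The ProductID_Available route is taken when the XPath
// /ProductSet/count(Product) > 0 holds, that is, when the payload
// contains at least one Product element.
def payload = '<ProductSet><Product><ProductID>HT-1000</ProductID></Product></ProductSet>'
def root = new XmlSlurper().parseText(payload)
assert root.Product.size() > 0   // condition fulfilled: take this route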

3. Create the default route.

1. A default route is essential in all cases, and for this purpose,
we can use a second Message End event. To achieve this, choose
the plus navigation menu and add an End Message Event artifact.
Now, you can move the ProductID_Available route slightly downwards.

2. Change the name of this new route to No_ProductID_Available.

3. Name the new End event End_Message_without_ProductID.

4. Go to the Processing tab of the No_ProductID_Available route,
and set this route as the default one.

5. Now, your process should look like this one:

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

3. As the product list contains both existing and non-existent IDs,
it is necessary to follow both paths.

Go to the monitoring and first choose
the End_Message_without_ProductID artifact. Select the Message
Content tab and check the payload: for the non-existent ProductIDs,
you don't see any product entries.

Now, check the monitor trace messages with ProductIDs. To do
this, select the End_Message_with_ProductID artifact and
choose the Message Content tab to check the payload with
ProductIDs.

Now, you see some entries with ProductID information.


4. Learn more about the Router component.

1. To access the help page for the Router, navigate to your
integration process, open the configuration bar of the Router,
and choose the question mark symbol. This brings up the help
page in the usual manner.

Create a Request and Reply to an External Call (2. OData Adapter)

Business Scenario

To obtain the customer names in a list, you must retrieve more detailed
information via the GWSAMPLE API. Specifically, you need the
SalesOrderID, ItemPosition, and ProductID, which can be found in the
ToSalesOrderLineItems of the ProductSet resource.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create a request and reply to an external OData call.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You completed the creation and configuration of a Router in the last step.

Outcome after this exercise

An external OData adapter has been configured for a second Request and
Reply call that is currently running.
What do you learn within this exercise?

Get familiar with how to use and configure an OData adapter.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. Your current integration flow should look like this.

3. Imagine that the integration flow has been edited further.

Task 2: Create a Request and Reply to an External OData Call

Steps

1. Expand the workspace.

1. We are working on the ProductID_Available route.


2. Set a second Receiver component.

1. Add a Receiver component.

2. Rename the receiver
to API_SalesOrder_ProductSet_ToSalesOrderLineItems. Ensure
that each name is distinct.

3. Add a Call → External Call → Request Reply.

1. Manually arrange the components to create a visually pleasing
flow. Alternatively, there is an automatic layout feature in the
context menu on the integration process line.

2. Change the name to Call_fetchLineItems.

3. Again, drag the arrow from the context menu
of Call_fetchLineItems
to API_SalesOrder_ProductSet_ToSalesOrderLineItems.

4. Select the OData adapter.


5. Select the OData connection line and access the connection
tab to configure the adapter. Once again, we use the Simple
Expression Language to read the Exchange Property.

You get your API address from your SAP Integration Suite
under Configure → APIs

Field Name Input

Address: https://<yourAPI>/<version>/GWSAMPLE_BASIC
Proxy Type: Internet
Authentication: None
CSRF Protected: flagged
Reuse Connection: flagged

Switch to the Processing tab to configure the processing details.

Choose the Select button to configure your resource path.

Field Name Input

Operation Details: Query (Get)
Resource Path: Choose the Select button.

All needed information under Connect to System should be automatically
filled in. Choose the Step 2 button to continue.

In this section, you only need to choose the operation Query (GET),
the Sub Levels, and the Entity with its needed Fields.

Field Name Input

Connection Source: Remote
Address: *automatically filled in*
Proxy Type: Internet
Authentication: None
In the Fields section, choose ToSalesOrderLineItems and the
subfields SalesOrderID, ItemPosition, DeliveryDate.

Field Name Input

Operation: Query (Get)
Sub Levels: 1
Select Entity: ProductSet

Now, the wizard generates a query for you. Check your query, and if
everything matches, choose the Step 3 button.

Field Name Input

Fields: ToSalesOrderLineItems
Subfield: SalesOrderID
Subfield: ItemPosition
Subfield: DeliveryDate
The last step is configuring a filter with your already set Exchange
Property.

In the Filter by section, choose:

ProductID <Equal> ${property.ProductID}

When you have done this, choose the Finish button.

Be aware that you type the Exchange Property correctly in the
field: ${property.ProductID}!

6. That is how your OData adapter configuration should look.

4. Now, you can verify the number of available data records with a
request as follows:

1. In your browser tab, enter the following URL and replace
${property.ProductID} with HT-1000:

https://<your API Host>/ProductSet('${property.ProductID}')/ToSalesOrderLineItems/$count

2. Currently, there are 1400 data records available for the
ProductID "HT-1000", but it is also possible that there is only
one dataset. Keep in mind that the database is filled with
data every day, so if you only have a few records, try again
later.

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps
1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

3. A ProductID is being processed.

4. Choose the End_Message_with_ProductID event, open
the Message Content tab, and choose the Payload link.

5. If the payload is too large, you cannot view it immediately. To
view it anyway, download it, or limit the message output with the
query option "?$top=2".

6. Replace ${property.ProductID} with HT-1000:

https://<your API>/ProductSet('${property.ProductID}')/ToSalesOrderLineItems?$top=2

Here, you can check and see that the system has limited the message
output to two entries. Try it out.

2. Learn more about the OData Adapter component.

1. Navigate back to your integration process.

2. Open the configuration bar of the OData Adapter and choose
the question mark symbol.
Using Mappings
Mappings

In this lesson, the following topic is discussed:

Mappings in an overview.

Mappings in an Overview

The following mapping types are available:

 Message Mapping.

 XSLT Mapping.

 Mapping with Scripting.

 Operation Mapping from Enterprise Service Repository (On-Premise).

Message Mapping

The Java SDK for message mapping and user-defined functions (UDFs) is
the same as for the process integrations. To use them, you need the body
in XML or JSON format. The source and destination mapping can be
defined using one of the following file types:

 XML Schema Definitions (XSD)

 OData V2/V4 metadata files with .edmx or .xml extensions

 WSDL

 Swagger/OpenAPI Spec JSON file

Mapping Editor

This lesson does not involve the use of a message mapping. The mapping
editor, however, provides all the necessary tools to map XML or JSON
messages.
 No. 1: Adding the source structure.

 No. 2: Adding the target structure.

 No. 3: The actual assignment of the values depending on the context.

 No. 4: Other representation of the mapping.

 No. 5: Simulation of the mapping with a source file.

 No. 6: Define a user defined function (UDF).
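As an illustration of No. 6: in Cloud Integration, a user-defined (custom) mapping function is a plain Groovy method that you assign to a target field in the mapping editor. A minimal sketch with a made-up function name; the optional MappingContext parameter gives access to headers and properties, but treat the exact signature as an assumption:

import com.sap.it.api.mapping.MappingContext

// Hypothetical custom function: trims and upper-cases a ProductID.
// Assign it to the target field in the mapping editor.
def String normalizeProductId(String productId, MappingContext context) {
    return productId == null ? '' : productId.trim().toUpperCase()
}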

Testing

We can test/simulate the mapping using the monitor.

Implemented Samples via Guidelines

 Guidelines to Implement Specific Integration Patterns Content


Filter → Variant: Message Mapping: Guidelines to Implement Specific
Integration Patterns

 Guidelines to Implement Specific Integration


Patterns Splitter → Variant with Message Mapping: Variant with
Message Mapping

 Learn the Basics → Access Header and Properties → Access Header


and Properties in Message Mapping: Access Header and Properties
in Message Mapping

Implemented Samples via Tutorials, Missions, and Blogs


SAP Cloud Platform Integration (CPI): Use cases of node functions in
message mapping | SAP Blogs

Videos

Message Mapping in SAP CPI || Step by Step guide

Read more here:

 Message Mapping

 SAP Cloud Integration – Swagger/OpenAPI Spec JSON in Message


Mapping | SAP Blogs

XSLT Mapping

XSLT (Extensible Stylesheet Language Transformations) is a language
originally designed for transforming XML documents into
other XML documents, or other formats such as HTML for web pages, plain
text, or XSL Formatting Objects, which can subsequently be converted to
other formats, such as PDF. A stylesheet is processed by an XSLT
processor such as Xalan or Saxon; the Java SDK ships with a built-in
XSLT 1.0 processor derived from Xalan, while Saxon is a separate library.
We use XSLT mapping in the exercises. There is a helpful online
editor, Groovy IDE, for easy development and testing of your scripts.

Sample

This example copies the content of the source file without any
namespaces and their corresponding prefixes, and generates a target file.

The namespaces are:

 xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"

 xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"

 xml:base="https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/GWSAMPLE_BASIC/"

This is necessary to be able to access the contents of the response
via XPath.

The XSL stylesheet creates the result document through a template that
only includes the original element names and attributes.

The result after transformation:
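The stylesheet and its result appear as screenshots in the course material. To experiment locally, outside Cloud Integration, a namespace-stripping transformation of this kind can be tried with a few lines of Groovy; the stylesheet below is a generic sketch of the approach described above, not the course's exact stylesheet:

import javax.xml.transform.TransformerFactory
import javax.xml.transform.stream.StreamResult
import javax.xml.transform.stream.StreamSource

// Generic namespace-stripping stylesheet: keeps element names and
// attributes, drops namespaces and prefixes.
def xslt = '''<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="*">
    <xsl:element name="{local-name()}">
      <xsl:apply-templates select="@*|node()"/>
    </xsl:element>
  </xsl:template>
  <xsl:template match="@*">
    <xsl:attribute name="{local-name()}"><xsl:value-of select="."/></xsl:attribute>
  </xsl:template>
</xsl:stylesheet>'''

// Tiny sample input using one of the namespaces listed above
def xml = '<d:Product xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">' +
          '<d:ProductID>HT-1000</d:ProductID></d:Product>'

def transformer = TransformerFactory.newInstance()
        .newTransformer(new StreamSource(new StringReader(xslt)))
def out = new StringWriter()
transformer.transform(new StreamSource(new StringReader(xml)), new StreamResult(out))
println out   // roughly: <Product><ProductID>HT-1000</ProductID></Product>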

Testing and Limitations


Testing XSLT mapping requires deploying the integration flow or using
external tools. Without deployment, testing is not possible. To speed up
the tasks related to integration flow transport, testing, and error handling,
you can use tools like the DOST Add-on.

Implemented Samples via Guidelines

Learn the Basics → Access Header and Properties → Access Header and
Properties in XSLT Mapping: Access Header and Properties in XSLT
Mapping

Blogs

 I heart XSLT mappings | SAP Blogs

 XSLT mapping for Batch requests in SAP Cloud Platform Integration (


CPI ) | SAP Blogs

Read more here:

 XSL Transformations (XSLT) Version 3.0

 Create XSLT Mapping

Mapping with Scripting

Mappings can also be implemented using Groovy or JavaScript.

Sample

In this example, the list of order items is restructured in a different way.

The source JSON Payload:
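The actual payload appears as a screenshot in the course material. As an illustration only, here is a hypothetical order-item list and one way to restructure it with Groovy's JsonSlurper and JsonBuilder (all field names are made up for this sketch):

import groovy.json.JsonBuilder
import groovy.json.JsonSlurper

// Hypothetical source payload for this sketch
def source = '''{"orderItems":[
  {"salesOrderId":"0500000000","itemPosition":"10","productId":"HT-1000"},
  {"salesOrderId":"0500000000","itemPosition":"20","productId":"HT-1001"}
]}'''

def items = new JsonSlurper().parseText(source).orderItems

// Restructure: group the item positions under their sales order
def restructured = items.groupBy { it.salesOrderId }.collect { orderId, lines ->
    [salesOrderId: orderId,
     items: lines.collect { [position: it.itemPosition, product: it.productId] }]
}

println new JsonBuilder(restructured).toPrettyString()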


Testing

Testing of Groovy script mapping can be done by deploying the integration
flow or using external tools. It is not possible to test without deploying it.

Read more here:

 SAP HCI/CPI - Cloud Platform Integration: SAP CPI message mapping


groovy script for different scenarios with context handlings

 XSLT Mapping or Scripts in SAP CPI - TechTalkZone

Operation Mapping from Enterprise Service Repository (On-Premise)


To save time and effort when creating integration content, you can import
existing content from your ES repository directly into Cloud Integration. To
do so, you must configure the connection settings to connect to the On-
Premise ES system through Cloud Connector. This allows you to reuse
previously created integration content and avoid redundant work in the
Cloud Integration web application.

Import Content from ESR

Once you have configured the connection to the ES Repository, you can
proceed to import the content from it through the "Resources" tab in the
integration flow editor. Now, you are able to import:

 Message mapping

 Value mapping

 Operation mapping

 WSDL

Read more here:

 Importing Content from ES Repository

 Configuring Connectivity to ES Repository

 Importing Mapping Content from ES Repository

 Importing Message Mapping from ES Repository in SAP Cloud


Integration | SAP Blogs

 Cloud Connector Configuration to Import Message Mapping from ES


Repository into SAP Cloud Platform Integration | SAP Blogs

Summary

Mapping is the process of converting a source format into a different
target format using various techniques in SAP Cloud Integration. The
source and target structures must first be defined, which can be done via
XSDs, WSDLs, and other definitions in message mapping. The Message
Mapping Editor, which offers context handling, user-defined functions, and
testing options, is used for the built-in variant of mapping in XML and
JSON formats. The XSLT procedure, which requires XML as an input format
and offers a simple editor, can create more target formats and is useful for
creating attachments. Testing is done via integration flow deployment or
external tools like XML Spy. Mapping via scripting offers the greatest
degree of freedom in terms of source and target formats and requires
language support for those formats, such as XmlSlurper for the
Groovy scripting language. Testing here is also done via integration flow
deployment or external tools like IntelliJ IDEA. If all data structures such as
XSDs and WSDLs are already created in the Enterprise Service Repository,
the message mapping content can be imported and reused.

Create and Configure a Content Modifier for SalesOrderID and
ItemPosition

Business Scenario

To expand your integration process, you now want to store the currently
processed SalesOrderID and ItemPosition and use them to enrich the
message. Setting Exchange Properties in a Content Modifier enables this.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a Content Modifier.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the task of creating and configuring an OData call.

Outcome after this exercise

The Exchange Properties are used to store the SalesOrderID and
ItemPosition that are being processed.

What do you learn within this exercise?

You learn how to use and configure a Content Modifier with an Exchange
Property.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps
1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the Integration Suite welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

Task 2: Create and Configure a Content Modifier

Steps

1. Enlarge the swim lane of your processes.

1. Keep adjusting the right border of your swim lane until the
border line becomes active.

2. Use the mouse to drag the center point towards the right side.

3. Proceed in the same manner to the right of the End_Message_Events.

2. Create and configure a Content Modifier.

1. Position an additional Content Modifier object after
the Call_fetchLineItems Request Reply, in the familiar way.

2. Name it Modify_setSalesOrderIDandItemPositionIDasProperty.

3. Switch to the Exchange Property tab.


4. Choose the Add button, and enter the following data.

Field Name Value

First Value

Action: Create
Name: SalesOrderID
Source Type: XPath
Source Value: //SalesOrderID
Data Type: java.lang.String (S in upper case)

Second Value

Action: Create
Name: ItemPositionID
Source Type: XPath
Source Value: //ItemPositionID
Data Type: java.lang.String (S in upper case)

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

3. Choose the End_Message_with_ProductID event from the first
loop.

4. Switch to the Message Content tab, and choose the Exchange
Properties link.

Now, you see the cached ProductID, SalesOrderID, and ItemPositionID as
properties, which are set correctly.

5. Switch back to the Integration Flow Model tab.

6. Now, choose the Message Content tab and then
choose Exchange Properties.

2. Learn more about the Content Modifier component.

1. Return to your Integration Process.

2. Access the configuration bar of the Content Modifier.

3. Choose the question mark symbol.

4. You are taken directly to the Help site of the Content Modifier.
Create and Configure a second Router

Business Scenario

After extra Exchange Properties have been set with the Content Modifier
and the message query has been expanded, empty results must be
filtered out. This is achieved using the Router artifact.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a Router.

3. Save as version, deploy, and debug your integration process.

Prerequisites

The creation of a Content Modifier with the Exchange Properties has been
completed by you in the last step.

Outcome after this exercise

A working integration flow with a filtering of non-existent SalesOrderIDs.


What do you learn within this exercise?

You get familiar with how to use and set up a Router that consists of two
routes.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

Task 2: Create and Configure a Router

Steps

1. Set a Router.

1. Place a Router artifact, in the familiar way, after
the Modify_setSalesOrderIDandItemPositionIDasProperty component,
and name it Router_SalesOrderID.

2. Rename the existing End Message
Event to End_Message_with_ProductID.

2. Create the following Condition in the router:

1. Choose the line between the Router and
the End_Message_with_ProductID artifact. Choose the General
tab and change the name from Route 1 to SalesOrderID_Available.

2. Switch to the Processing tab. In the Condition field, enter the
following (meaning: if at least one SalesOrderID child element exists,
take this path):

Name Input

Condition: /ProductSet/Product/ToSalesOrderLineItems/SalesOrderLineItem/count(SalesOrderID)>0

3. Create the default route.

1. A default route is essential in all cases, and for this purpose,
we can use a second Message End event. To achieve this, choose
the plus navigation menu and add an End Message Event artifact.
Now, you can move the SalesOrderID_Available route slightly downwards.

2. Change the name of this new route
to No_SalesOrderID_Available.

3. Name the new End event End_Message_without_SalesOrderID.

4. Rename the existing End_Message_with_ProductID event
to End_Message_with_SalesOrderID.

5. Go to the Processing tab, and set
the No_SalesOrderID_Available route as the default one.

Task 3: Save as Version, Deploy, and Debug your Integration Process

Steps
1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

3. As the product list contains both existing and non-existent IDs,
it is necessary to follow both paths.

Go to the monitoring and first choose
the End_Message_with_SalesOrderID artifact. Choose the Message
Content tab and check the payload: you see entries with SalesOrderID
information.

Now, check the monitor trace messages without SalesOrderIDs. To do
this, choose the End_Message_without_SalesOrderID artifact and
choose the Message Content tab: these payloads contain no
SalesOrderID entries.
4. Learn more about the Router component.

1. To access the help page for the Router, navigate to your
integration process, open the configuration bar of the Router,
and choose the question mark symbol. This brings up the help
page in the usual manner.

Using Adapter Outbound Security


Adapter Outbound Security

In this lesson, the following topics are discussed:

 Outbound Security for Adapters.

 Establishing a secure connection to the receiver using certificates.

 Implement the required authentication and authorization process for
the OData adapter to communicate with the receiver.

Outbound Security for Adapters

The procedures for implementing authentication and authorization against
the receiver vary depending on the type of adapter used. However, there
are similarities when using TCP-based adapters: the process involves
establishing an HTTPS connection via certificates and then performing the
actual authentication. In practice, this means that the recipient's
certificate must be imported into the Cloud Integration tenant.

The establishment of a secure TCP connection requires the use of TLS with
certificates. SAP provides a dedicated tool for verifying and importing the
necessary certificates specific to the recipient.
The authentication and authorization process is adapter-specific and is
described below for the OData adapter.

Establishing a Secure Connection to the Receiver Using Certificates

How can we ensure that the message is delivered to the recipient
properly? In this scenario, the connection is established directly between
the receiver adapter and the receiver.

To establish a secure connection with the receiver, it is necessary to
perform authentication and authorization. This process also involves
setting up an HTTPS connection through certificates, which can be used
for further authentication and authorization. Ultimately, the type of
authentication and authorization used is decided by the receiver.

We demonstrate this again with the example of the OData adapter. In this
training's exercises, we have set up policies in API Management to
avoid the need for authentication.

Locate and Import the Certificates for the Receiver and the Certificate
Chain for the Server

We can use a helpful tool in Cloud Integration called Test Connectivity to
find and import the required receiver certificates and their server
certificate chain.

Procedure

 Navigate to Monitor → Integrations → Manage Security → Test
Connectivity.

 Choose your protocol.

 Fill in the necessary data.

 Choose the Send button.

 Download the certificates.

 Import the certificates at Monitor → Integrations → Manage
Security → Manage Keystore → Add → Certificate.

Note

The following screenshots address twitter.com as receiver.

Further explanations:

Choose the Protocol and Enter all Necessary Data

Choose the protocol and enter all necessary data:

Choosing the Send button provides the certificates. Choose
the Download button:

Decompress the downloaded file:

Navigate to Monitor → Integrations → Manage Security → Manage
Keystore → Add → Certificate. Add all certificates separately from your
decompressed file.
The server certificates chain:

The Twitter certificate:

The imported certificates:

A secure HTTPS connection to twitter.com can now be established from


your integration flow.

Implement the Necessary Authentication and Authorization Against the
Receiver for OData Adapters

As previously mentioned, certificates are primarily used to establish the
HTTPS connection. So, further procedures are often required for
authentication and authorization.

The Connection tab of the OData Adapter offers various options for
authentication and authorization.

These are:

 Basic

 Client Certificate

 None
 OAuth2 Client Credentials

 OAuth2 SAML Bearer Assertion

All these options must first be configured
under Monitor → Integrations → Manage Security → Manage Security
Material. Except for the client certificate, all authentication options can be
found there.

Implement an API key based authentication and authorization

It is common to use an API key for authentication and authorization, even
though there is no configuration option for it in the settings of the
OData adapter. It is demonstrated here:

Procedure

 Copy the API key from your API.

 Place and configure a Content Modifier in front of
the call component with the OData adapter.

 Enter a Message Header with the API key value.

 On the Connection tab of the OData adapter, set Authentication
to None.

 On the Processing tab, enter the API key header in the Request
Headers field.
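Alternatively to the Content Modifier, a Script step could set the same header. A minimal Groovy sketch using the standard script skeleton; the header name APIKey and its value are assumptions of this example:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Hypothetical header name; use whatever key header your API proxy expects.
    // In production, read the key from secure material instead of hard-coding it.
    message.setHeader("APIKey", "<your API key>")
    return message
}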
Summary

The process of establishing a secure connection and the actual
authentication must be distinguished. Initially, a TLS connection is
established, similar to the inbound case. However, in this scenario, the
communication and exchange of certificates occur directly between the
Cloud Integration tenant (subaccount) and the receiver. To identify and
import these certificates, SAP provides the Test Connectivity tool. The
actual authentication is performed by the adapter, and various options are
available, such as those provided by the OData adapter:

 Basic

 Client Certificate

 None
 OAuth2 Client Credentials

 OAuth2 SAML Bearer Assertion

Create a Request and Reply to an External Call (3. OData Adapter)

Business Scenario

The used and cached SalesOrderIDs and ItemPositionIDs are required to
identify the customer name and customer number. In this step, it is
essential to carry out another query via an OData adapter on the backend
system (API) and to expand the query to include the customer name and
customer number.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure an OData Adapter.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the final step of creating and configuring a Router.

Outcome After This Exercise

Another external API call will be implemented by you.


What Do You Learn Within This Exercise?

Learn to configure the OData adapter within an external call to an API.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_CLD900
_Number → DelayedDelivery_Process_Number.

2. Check the status after the last exercise step.

3. Imagine that the integration flow has been edited further.

Task 2: Create and Configure an OData Adapter

Steps

1. Set a Request Reply.

1. To create a request and reply, choose the Call menu and
select External Call → Request Reply, or use the context menu.

2. Select the line between the Router and
the End_Message_with_SalesOrderID.

3. Name the request reply: Call_fetch_ToHeader.

2. Set the Receiver via the context menu.

1. Rename the Receiver to API_SalesOrder_ProductSet_ToHeader.

2. Connect the Request Reply with the Receiver by using the
interactive context menu, as shown in the screenshot.

3. Select the OData V2 adapter.

4. Select the OData connection between the Request Reply and
the Receiver.

5. Set your known API URL from API Management on the
OData Connection tab (e.g. "<API URL>/<version>/GWSAMPLE_BASIC").

6. Your API URL can be found in the Configure → APIs → API
Proxies section.

7. Switch to the Processing tab and choose the Select button.

3. Configure the OData Adapter.

1. All required information is automatically filled in. Choose
the Step 2 button to move forward.

2. Select Sub Level 1 and select the
Entity SalesOrderLineItemSet.

3. Now, select the Fields you want to get via the API from your
backend system.

4. Use the following selection for the Entity and Operations. After
you have selected all, move forward by choosing the Step
3 button.

Select the following fields:

 ToHeader

 CustomerID

 CustomerName

 DeliveryStatus

5. Set two filters on the Exchange Properties cached in the
Content Modifier.

Field Name Input

First Filter

Filter: SalesOrderID
Operation: Equal
Value: ${property.SalesOrderID}

Second Filter

Filter: ItemPositionID
Operation: Equal
Value: ${property.ItemPositionID}

Be careful to write the Simple Expression Notation correctly, as mentioned
in the table above.

6. Your OData adapter configuration under Processing looks like this:

Task 3: Save as Version, Deploy, and Debug Your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps.

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model
in Overview → Monitor Message Processing → Message
Processing Run.

3. Choose the End_Message_with_SalesOrderID artifact and
select the Message Content tab.
4. To see the message, choose the Payload tab.

5. Save and deploy again. Check your trace. The result must
contain some CustomerName and CustomerID entries.

2. Learn more about the OData Adapter component.

1. Return to your integration process.

2. Open the configuration bar of the OData component and
choose the question mark symbol.

3. Here, you find all the information about the different adapters
and how they can be used, in the familiar way.

Create and Configure a Content Modifier for the CustomerName

Business Scenario

You want to read the customer name via an exchange property. To do this,
you use a Content Modifier.

Task Flow
In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure a Content Modifier.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the step of creating an OData call.

Outcome After This Exercise

The CustomerName values are saved as Exchange Properties using
a Content Modifier.

What do you learn within this exercise?

You review how to create and configure a Content Modifier component.

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the Integration Suite Welcome page
to Design → Integrations → DelayedDelivery_Package_Date_Number →
DelayedDelivery_Process_Number.

2. This is the status after the last exercise step.

3. Be aware that the integration flow is in the edit state.

2. Create and configure a Content Modifier.

1. Add a Content Modifier component after
the Call_fetch_ToHeader OData call component.

2. Rename it to: Modify_setCustomerNameasProperty.

3. Configure an Exchange Property with the following entries.

Field Name Value

Name: CustomerID
Source Type: XPath
Source Value: //CustomerID
Data Type: java.lang.String (S in upper case)

3. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.
3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content .

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.

4. Check out the Exchange Properties.

1. Navigate to the Message Content tab and first choose
the End_Message_with_SalesOrderID.

2. Select the Exchange Properties tab and search for the
CustomerName.

Create a Data Store Operation

Business Scenario
In this step, the CustomerNames stored as Exchange Properties in each
loop are written to a Data Store, and duplicates are removed.
Task Flow

In this exercise, you will perform the following steps in one task:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create a Data Store Operation.

3. Configure a Data Store Operation.

4. Save as version, deploy, and debug your integration process.

5. Check the Data Store.

6. Delete the data store entries.

7. Learn more about Data Store operations.

Prerequisites

The step to create and configure a Content Modifier has been completed
by you.

Outcome After This Exercise

You add a Data Store containing all CustomerNames for the
given ProductIDs to your integration flow.
What do you learn within this exercise?

Learn to create and use a Data Store → Write operation.

Task 1: Create a Data Store Operation

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP
Integration Suite.

1. Navigate within the Integration Suite Welcome page
to Design → Integrations and APIs →
DelayedDelivery_Package_CLD900_Date_Number →
DelayedDelivery_Process_Number.

2. This is the status after the last exercise step. Imagine
that the integration flow has been further modified.

2. Create a Data Store Operation.

1. Set a Persistence → Data Store Operations → Write component
after Modify_setCustomerNameasProperty, or use the context
menu as shown.

2. Rename it to Write_CustomerName.
3. Configure a Data Store Operation.

1. Switch to the Processing tab, and enter the following data:

Field Name Value

Data Store Name: DelayedDelivery_CustomerName_List_Number
Visibility: Global (external processes can read these entries)
Entry ID: ${property.CustomerID} (is shown as the header)
Retention Threshold for Alerting (in d): 2
Expiration Period (in d): 3
Encrypt Stored Message: flagged
Overwrite Existing Message: flagged (removes duplicates)
Include Message Headers: flagged

2. Again, we use the Simple Expression Language for addressing
the stored property, as in ${property.CustomerID} for the Entry ID.

3. Rename
the End_Message_with_SalesOrderID to End_Message_with_CustomerName.

4. Save as version, deploy, and debug your integration process.

1. Perform the following steps.

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.

2. Navigate to your Integration Flow Model


in Overview → Monitor Message Processing → Message
Processing Run.
5. Check the Data Store.

1. Navigate to Monitor → Integrations and APIs from the left-side
menu.

2. Find Manage Stores → Data Stores.

3. Choose the Data Store tile.

4. Choose the correct Data Store name; you are then able to
see the entries.

5. Now, check the payload in the monitoring by selecting
the End_Message_with_CustomerName artifact; choose
the Message Content tab and switch to the Payload tab.
6. Learn more about Data Store operations.

1. Navigate back to your Integration Process, open the
configuration bar of the Write_CustomerName component,
and choose the question mark symbol.

Performing Exception Handling


Exception Handling

In this lesson, the following topics are discussed:

 What is an Exception?

 Define an Error Configuration for one integration flow to inform the
Sender.

 Define an Exception Subprocess.

 Error Handler.

 Error and Escalation events.


What is an Exception?

There are typically two types of exceptions: expected and unexpected.
Expected exceptions can include different values in a message field or
empty values, and can be handled through the integration flow design.
Unexpected exceptions, on the other hand, are technical in nature, such
as connection failures or errors in scripts, and can lead to program
termination. To prevent this, unexpected errors must be intercepted and
handled appropriately to allow the program to continue running.

The focus now shifts to unexpected exceptions and their handling, and for
the remainder of this discussion, the terms "unexpected exceptions",
"exceptions", and "errors" will be used interchangeably.

Define Error Configuration for One Integration Flow to Inform the Sender

You can specify the error handling mechanism for handling runtime
failures during message processing. The primary objective of this
approach is to communicate error details to the sender for better
awareness. To achieve this, you can enable the Return Exception to
Sender flag in the integration flow settings.

Read More Here:

Define Error Configuration

Define an Exception Subprocess

An extra subprocess can be defined within an integration flow, which is
invoked whenever an unexpected error occurs.

Procedure

 In an editable flow, identify the exception you want to catch, so
that the original process can complete without errors.

 Place an Exception Subprocess.

 Define your error handler.

 Save, deploy, and run your integration flow.

Sample

Your integration flow has encountered an error because the connection
parameters are not functional.

The error in the message monitor after deployment:

Usage of an Exception Subprocess:

The starting event is an Error Start event. This component catches the
exception.
The monitor with the exception subprocess:

You can view the detailed error information inside the Message Processing
Run.

Error Handler

If the error is caught by an Error Start event, any further processing can
be implemented as in a regular process. Scripts, like the one
demonstrated in Handle Exceptions, are especially common.
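For illustration, a minimal Groovy sketch of such an error handler, following the standard script-step skeleton; the property name ErrorText is made up for this example, while CamelExceptionCaught is the exchange property under which the caught exception is made available:

import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Inside an Exception Subprocess, the caught exception is available
    // as the exchange property CamelExceptionCaught
    def ex = message.getProperty("CamelExceptionCaught")
    if (ex != null) {
        // Store the error text for later steps (property name is an example)
        message.setProperty("ErrorText", ex.getMessage())
    }
    return message
}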

Error and Escalation Events


The Error Start event always triggers the Exception Subprocess, which
intercepts the error and starts its execution. If you want to terminate
the Exception Subprocess without the status Failed, you must define
an End Message, as shown in the previous example.

Manual Generation of Error and Escalation End Events

Setting an Error End event at the end of an integration flow will always
cause it to enter the Failed status, which can be used as a design element.
The Error End event serves as the endpoint of the main process, and in
case of any errors, the Exception Subprocess always intercepts them.

Run to an Error End Event:

If you get the Failed status, then the process worked correctly.

The Escalation End event generates an error without interrupting the main
process.

Run to an Escalation End Event:

If you get the Escalated status, then the process worked correctly.
Summary

A special error subprocess can intercept an unexpected error using an


Exception Start Event. After interception, various processing steps can be
implemented. For instance, it is appropriate to store process values or
message content following an error. Informing the sender about the
error can also be configured.

Create an Exception Process

Business Scenario

We will now add error handling to intercept unexpected errors, which is


necessary for every OData call since the connection may not be
established from time to time. To achieve this, we apply our own
Exception Subprocess.

Task Flow

In this exercise, you will perform the following steps in one task:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create and configure an Exception Subprocess.

3. Learn more about the Exception Subprocess.

Prerequisites

You will add a subprocess to handle exceptions.

Outcome After This Exercise

You have implemented an Exception Subprocess that handles errors in


your integration flow.
What Do You Learn Within This Exercise?

Learn to create and use an Exception Subprocess.

Task 1: Create an Exception Process

Steps

1. Log on to the integration flow DelayedDelivery_Process via


Integration Suite.

1. Navigate within the Integration Suite Welcome page


to → Design → Integrations → DelayedDelivery_Process_Packa
ge_CLD900_Date_Number → DelayedDelivery_Process_Numbe
r.

2. You continue from the status after the last exercise step.

3. Imagine that the integration flow has been further edited.

2. Create and configure an Exception Subprocess component.

1. Choose Process → Exception Subprocess.


2. Ensure that there is enough space in the swim lane to
accommodate the subprocess before placing it in a free area.

3. Rename it to Exception Subprocess.

4. In case of an error, the subprocess is traversed, but it does not
yet perform any actions.

3. Learn more about the Exception Subprocess.

1. Go back to the Design view of your iflow. Navigate


to Exception Subprocess, open the configuration bar of
the Exception Subprocess component, and choose
the question mark symbol.
Using Scripting
Scripting

In this lesson, the following topics are discussed:

 Scripting overview.

 Developing Groovy Scripts with the inline editor.

 Create Groovy Scripts with an external editor.

Scripting Overview

You can use Java or Groovy scripts for message processing, which can be
useful in the following scenarios:

Add Information to the Message Processing Log

You can use the Script step to add information to the message processing
log (MPL).
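For example, a minimal script step could write a custom property and an attachment to the MPL; the property name, attachment title, and value used here are illustrative:

Code Snippet

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    // messageLogFactory is injected into every script step.
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        // Appears as a custom property of the MPL entry.
        messageLog.setStringProperty("ProcessedBy", "Script_logDemo");
        // Appears as an attachment of the MPL entry.
        messageLog.addAttachmentAsString("Payload Snapshot",
            message.getBody(java.lang.String), "text/plain");
    }
    return message;
}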

Read and Modify Message Header, Message Body, and Exchange


Properties

You can use the Script step to address (get, add, modify, or delete) the
message header, the message body, and exchange properties, using the
interface Message object.
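To make these accessors concrete, here is a minimal Groovy sketch; the custom header, the property names, and the uppercase transformation are illustrative:

Code Snippet

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    // Read the body as a String.
    def body = message.getBody(java.lang.String);

    // Read and write message headers.
    def contentType = message.getHeaders().get("Content-Type");
    message.setHeader("MyCustomHeader", "42");

    // Read and write exchange properties.
    def productId = message.getProperties().get("ProductID");
    message.setProperty("Processed", "true");

    // Modify the body.
    message.setBody(body?.toUpperCase());
    return message;
}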

Read and Modify SOAP Headers

You can use the Script step to address SOAP headers.

Read and Modify Partner Directory Content

You can use the Script step to address Partner Directory content.

Handle Exceptions

You can use the Script step to identify exceptions that arise when sending
messages using the HTTP or OData V2 receiver adapter.

Read Security-Related Artifacts

You can use the Script step to address security-related artifacts (for
example, keystore entries).

Additional Use Cases

Java Docs for com.sap.gateway.ip.core.customdev.util packages

The Java doc can be found here: com.sap.it.script.custom-development
2.7.1 API.
Developing Groovy Scripts with the Inline Editor

You can use the inline editor directly, as follows.

Procedure

 Position a Groovy Script component on the expiration path.

 Choose the Create button from the context menu.

 After that, you are in the inline editor of the Groovy script. There is
already a basic script created, on which you can build (see the sketch
after this list).

 You will have code completion and more.
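The generated skeleton looks roughly like this (the exact template can vary by release):

Code Snippet

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;

def Message processData(Message message) {
    // The message is passed in, can be modified, and must be returned.
    def body = message.getBody();
    return message;
}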


Then, proceed with the standard simulation and/or testing process by
deploying and tracing the flow.

Create Groovy Scripts with an External Editor

You can build a local development environment with Eclipse, IntelliJ,


or Visual Code. You can get the corresponding SDK here: SAP
Development Tools at Using Script API → Script API.

There is also an online editor with which you can write and test directly.
Use Groovy IDE for easy development and testing of your scripts. All
necessary SDKs are already implemented.
After copying the code, paste it into the scripting component, and proceed
as usual.

Resources

There is a whole range of examples and demos in the SAP Help Portal:

 Blogs:

o Understanding Groovy Scripting for SAP Integration Suite –


Part 1: https://blogs.sap.com/2021/04/08/understanding-
groovy-scripting-for-sap-integration-suite-part-1/

o SAP Cloud Platform Integration (CPI) || Part 7 || Maintaining


logs by using "Groovy Scripts", even if the integration flow is
not on trace mode: https://blogs.sap.com/2020/01/09/sap-
cloud-platform-integration-cpi-part-7-maintaining-logs-using-
groovy-scripts-even-if-the-iflow-is-not-on-trace-mode./

o SAP Cloud Integration (CPI/HCI) || Writing Groovy Scripts _


With Basic Examples | SAP Blogs

 Samples

o Get an environment to Edit SAP CPI scripts in Groovy in 20


minutes.

o SAP CPI development in Git and debug your Groovy scripts.

o Groovy Scripting in SAP CPI | First Program | Groovy IDE |


[email protected].

 Script Collections: Using Script Collection across various Integration


Flows in a Package in SAP
CPI: https://blogs.sap.com/2021/06/07/using-script-collection-across-
various-packages-in-sap-cpi/.

 Script API: https://tools.hana.ondemand.com/#cloudintegration,
at Using Script API.

Summary

Scripts in Java or Groovy can be created using a Script SDK, which allows
for processing messages and their metadata in various ways. The SDK
allows for setting and reading exchange parameters, writing logs,
intercepting exceptions, and more. These scripts can be created using
both the Inline Editor and an online IDE that includes all the necessary
SDKs, making it possible to test the script directly.

Create a Groovy Script for Error Handling


Business Scenario

A Groovy Script is required for error handling to work and take effect. It is
used to record error information in the event of an error. To do this, the
script must be stored within the exception subprocess.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create a Groovy Script.

3. Save as version, deploy, and debug your integration process.

Prerequisites

The last step of creating an Exception Process has been completed.

Outcome After This Exercise

You will have implemented a Groovy Script within the Exception


Subprocess.

What Do You Learn Within This Exercise?

Learn the process of creating and using a Groovy Script.

Task 1: Log on to the Integration Flow DelayedDelivery_Process

Steps
1. Log on to the integration flow DelayedDelivery_Process via
Integration Suite.

1. Navigate within the Integration Suite Welcome page


to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. You continue from the status after the last exercise step.

3. Let's assume that the integration flow has been further edited.

Task 2: Create a Groovy Script

Steps

1. Create a Groovy Script.

1. Set Transformation → Script → Groovy Script component within


the Exception Subprocess.

2. Or you can use the context menu by choosing the line


between the Error Start and End event. Then, to add a flow
step, choose the plus symbol.
3. Rename it to Script_readExceptionMessagesToPayload.

2. Write a Groovy Script.

1. Use the Create icon from the context menu of


the Script_readExceptionMessagesToPayload component.
2. Enter the following data:

Code Snippet

import com.sap.gateway.ip.core.customdev.util.Message;
import groovy.xml.MarkupBuilder;

def Message processData(Message message) {

    // Read the exception details from the exchange properties
    def props = message.getProperties();
    def exStacktrace = props.get("exStacktrace");
    def exMessage = props.get("exMessage");

    // Build an XML document from the exception details
    def stringWriter = new StringWriter();
    def exceptionBuilder = new MarkupBuilder(stringWriter);
    exceptionBuilder.exception {
        exceptionMessage(exMessage)
        stacktrace(exStacktrace)
    }

    // Return the XML as the response body with HTTP status code 500
    message.setHeader("Content-Type", "application/xml");
    message.setHeader("CamelHttpResponseCode", 500);
    message.setBody(stringWriter.toString());

    // Attach the error details to the message processing log
    def messageLog = messageLogFactory.getMessageLog(message);
    if (messageLog != null) {
        messageLog.addAttachmentAsString("Exception Messages",
            message.getBody(), "text/plain");
    }

    return message;
}
3. Choose the OK button to save your work.

4. Navigate back to your Integration Process via the breadcrumb


navigation.

Task 3: Save as Version, Deploy, and Debug Your Integration Process

Steps

1. Save as version, deploy, and debug your integration process.

1. Perform the following steps:

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

5. Deploy again.

6. Jump again to Overview → Manage Integration Content.


2. Navigate to your integration flow Model in Overview → Monitor
Message Processing → Message Processing Run.

2. Deploy it without an exception and check the Exception
Subprocess in the monitoring, which has no messages to work with.

3. Deploy it with an implemented exception error.

1. Navigate to the Call_checkProductID OData adapter. At


the Connection tab, change the URL so that it no longer works
by adding an x.
2. Save and Deploy again, and check your Monitor.

3. At Overview → Monitor Message Processing.

4. The message status is Completed and an error message has


been created. Check the messages and the attachments
under Monitor Message Processing.

5. An Exception occurs and the Exception Subprocess is called.

6. Everything works as expected.

7. Remove the error in the URL again so that everything runs


without errors again.
4. Learn more about the Groovy Scripting operations.

1. Navigate back to your Integration Process, open the


configuration bar of
the Script_readExceptionMessagesToPayload component, and
choose the question mark symbol.

Replace the Timer Event by a Message Start Event

Business Scenario

In this step, our objective is to replace the timer event with a message
start event and initiate the process through an incoming SOAP message.
Also, we remove the initial content modifier, Modify_setPayload, as it was
created for assistance purposes.

Task Flow

In this exercise, you will perform the following steps in one task:

1. Log on to the integration flow DelayedDelivery_Process.

2. Substitute the timer event with a message start event.

3. Save as version, deploy, and debug your integration process.

Prerequisites

You have completed the final step of writing the Customer ID.

Outcome After This Exercise

A message start event has been added at the beginning and


the Modify_setPayload component has been removed.
What Do You Learn Within This Exercise?

Learn how to replace a timer event with a message start event.

Task 1: Replace the Timer Event by a Message Start Event

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP


Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page


to Design → Integrations → DelayedDelivery_Package_CLD900_
Date_Number → DelayedDelivery_Process_Number.

2. You continue from the status after the last exercise step.


3. Imagine that the integration flow has been further modified.

2. Substitute the timer event with a message start event.

1. Choose Event → Start Message or use the context menu as


previously mentioned in the preceding exercises.

2. Delete the Timer Event.


3. Rename the start event to Event_startMessage.
3. Delete the Modify_setPayload component.

1. This segment of your integration flow looks like this now.


2. Choose the Save button.

Using Adapter Inbound Security


Inbound Security Adapter

In this lesson, the following topics are discussed:

 Inbound Security for Adapters.

 Inbound Security for an HTTP-based Adapter.

Inbound Security for Adapters

Two cases must be distinguished: establishing a secure connection to the


load balancer and authenticating at the Cloud Integration tenant where
the integration flow is implemented. To establish a secure connection,
a Transport Layer Security (TLS) procedure is used over TCP (Transmission
Control Protocol)-based connections, and certificates are employed. This
procedure is used in HTTPS, IMAPS, POP3S, SMTPS, FTPS, and other
protocols. In the exercises of this training, we use the SOAP adapter.

We want to explore the available options for ensuring that only authorized
senders can send messages to our integration flow. This topic is referred
to as Inbound Security.

 The certificates required to establish an HTTPS connection between


the sender and the load balancer are necessary.

 The sender's authorization is validated against the integration flow


endpoint.

Establishment of the HTTPS Connection


Recall the technical overview discussed in a previous lesson, which shows
a load balancer between the sender and the endpoint URL of our deployed
integration flow. As a result, it is necessary to establish the TLS connection
between the sender and the load balancer.

The responsibility of importing all the required certificates lies with SAP, as
we do not have access to the load balancer in Cloud Integration.

Authorization of the Sender Against the Endpoint of the Integration Flow

There are also two options available here:

 Authentication can be performed directly against the remote tenant


where the integration flows are deployed.

 Usage of an authentication client (OAuth) on your own tenant.


Authentication can be performed directly against the remote tenant where
the Integration Flows are deployed

This is the most common scenario during development; however, it is not


recommended for production use. This option is marked in red as No. 1 in
the preceding picture.

Procedure

 Create your integration flow sender adapter and choose User


Role as Authorization.

 At User Role, choose the available user roles.

 The default user role is ESBMessaging.send.

 Assign the user role ESBMessaging.send to a Role Collection.

 Assign your Role Collection to a user at your subaccount.

 Call the endpoint with a user who is assigned to your Role


Collection with the ESBMessaging.send role included.

Note

It is not recommended to use client certificates for authorization. This


approach requires importing a client certificate directly into the
configuration.

The Role Collection A_sendMessagesToCI with the
role MessagingSend assigned to two users.
Note

The role MessagingSend in the context of this situation is equivalent to the


role ESBMessaging.send in Cloud Integration.

You get a successful call to the endpoint using one of the assigned users.

Set your own User Role at Cloud Integration

You can use your own user roles. To set your own user role, navigate
to Monitor Artifacts → Integrations → Manage Security → User Roles, and set
your own role.

Set your own user role, for example, Peter1:


Then, as previously described, you can configure your own user role:

 Create a Role Collection.

 Assign your own user role to your own Role Collection.

 Assign the Role Collection to a user.

Usage of an Authentication (OAuth) Client on your own Tenant

The method of directly calling an integration flow via the role-based


approach shown uses personalized users and basic authentication, which
are not suitable for productive purposes. For better authentication
methods, we must use a self-configured OAuth2.0 client that can be
created on our own subaccount.

To accomplish this, we must set up a Process Integration Runtime instance


on our subaccount, and associate it with the integration flow plan. This
instance can then be customized with various client credentials. These
correspond to No. 1 and No. 2, marked in blue in the preceding picture.

You can choose the following grant types:

 Authorization Code

 Client Credentials

 Password

 Refresh Token

 SAML2 Bearer
 JWT Bearer

Selection of grant types when configuring the local Process Integration
Runtime instance.

Procedure

 Create a local Process Integration Runtime instance.

 Configure the appropriate grant type.

 Create a key.

 Use the key parameters for authorization.

The new Process Integration Runtime instance of plan integration-flow.


Sample - Log on with ClientID and ClientSecret

Use the clientID as the user and the clientSecret as the password.

Sample - Log in with Bearer Token

Use the tokenURL with the clientID as user and the clientSecret as password
to generate a Bearer token.
Use the Bearer token for authentication.

Sample - Log in with OAuth 2.0 Authentication

You can use OAuth 2.0 for authentication, which involves two steps: first,
generating a token, and second, using that token for authorization.
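To make the two steps concrete, here is a minimal Groovy sketch, assuming placeholder values taken from the service key of your Process Integration Runtime instance (tokenurl, clientid, clientsecret); the host names, the address, and the payload are illustrative:

Code Snippet

import groovy.json.JsonSlurper

// Placeholder values from the service key - replace with your own.
def tokenUrl     = "https://<subdomain>.authentication.eu10.hana.ondemand.com/oauth/token"
def clientId     = "<clientid>"
def clientSecret = "<clientsecret>"

// Step 1: Obtain a Bearer token via the client_credentials grant.
def tokenConn = new URL(tokenUrl).openConnection()
tokenConn.setRequestMethod("POST")
tokenConn.setDoOutput(true)
def basic = "${clientId}:${clientSecret}".bytes.encodeBase64().toString()
tokenConn.setRequestProperty("Authorization", "Basic " + basic)
tokenConn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded")
tokenConn.outputStream.withWriter { it << "grant_type=client_credentials" }
def token = new JsonSlurper().parse(tokenConn.inputStream).access_token

// Step 2: Call the integration flow endpoint with the Bearer token.
def flowConn = new URL("https://<runtime-host>/cxf/<your_address>").openConnection()
flowConn.setRequestMethod("POST")
flowConn.setDoOutput(true)
flowConn.setRequestProperty("Authorization", "Bearer " + token)
flowConn.setRequestProperty("Content-Type", "text/xml")
flowConn.outputStream.withWriter { it << "<soapenv:Envelope>...</soapenv:Envelope>" }
println "HTTP status: ${flowConn.responseCode}"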
Resources

Read more here:

 Client Credentials

 Configuration Checklist for Inbound Authentication

Summary

Secure communication involves two steps: establishing a secure


connection to the load balancer and authenticating at the Cloud
Integration tenant where the integration flow is implemented. This is
achieved by using certificates in TCP (Transmission Control Protocol)-based
connections via the TLS (Transport Layer Security) procedure. These
protocols include HTTPS, IMAPS, POP3S, SMTPS, FTPS, and others. Many
message protocols, such as SOAP, OData, and HTTP, also use these secure
communication protocols.

Authentication at the endpoint of the integration flow can be


accomplished in two ways: direct assignment of a user role to a user or
the use of a local OAuth 2.0 client that offers extra authentication options,
such as ClientID/ClientSecret, Bearer token, or OAuth 2.0. These methods
are not personalized and are more secure than the first option.

Create an Inbound SOAP Adapter

Business Scenario

To initiate the process by receiving a SOAP message containing the list of


ProductIDs, it's necessary to set up a SOAP-type inbound adapter.
Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the integration flow DelayedDelivery_Process.

2. Create an Inbound SOAP Adapter.

3. Save as version and deploy your integration process.

Prerequisites

The task of substituting the timer event with a message start event has
been completed.

Outcome After This Exercise

You will get familiar with how to use and configure an inbound SOAP
adapter in the integration flow.

What do you Learn Within This Exercise?

You will gain familiarity with configuring and using an inbound SOAP
adapter in the integration flow.

Steps

1. Log on to the integration flow DelayedDelivery_Process via SAP


Integration Suite.

1. Navigate within the SAP Integration Suite Welcome page


to Design → Integrations → DelayedDelivery_Package_CLD900
_Date_Number → DelayedDelivery_Process_Number.

2. You continue from the status after the last exercise step.


3. Imagine that the integration flow has been further edited.

2. Rename the Sender.

1. Choose the Sender component and change the name


to Sender_startSOAPMessage.

3. Create an Inbound SOAP Adapter.

1. Create and configure an Inbound SOAP Adapter.

2. Drag the arrow from the context menu of


the Sender_startSOAPMessage component to Event_startMessage.
3. Choose SOAP → SOAP 1.x.

4. Switch to the Connection tab of the configuration bar. Enter


the following data.
Address: /receive/message/DelayedDelivery_ProductIDs_Date_Number
(must be unique)

Service Definition: Manual

Use WS-Addressing: unflagged

Message Exchange Pattern: One-Way (starting asynchronous)

Processing Settings: WS Standard

Authorization: User Role

User Role: ESBMessaging.send

4. Save as version, deploy, and debug your integration process.

1. Perform the following steps.

1. Save as version.

2. Deploy.

3. Jump to Overview → Manage Integration Content.

4. Set the log level to trace.

2. The process is initiated and waits for an incoming SOAP


message. The endpoint to start the process looks like this:

Code Snippet


https://int-cust-demo-store-xxxxxxxxxxxxxxxxxxxt.cfapps.eu10-
003.hana.ondemand.com/cxf/send/
<your_defined_address_in_the_soap_adapter>
3. Currently, we do not have the option to send a message to the
endpoint.

Allow the Sender to Send Messages to the Endpoint of the iFlow

Business Scenario

We want to enable a channel to send messages to the endpoint, and to do


this we must assign a user the role of ESBMessaging.send, which is part of
a Role Collection. Another way to define a role is by using a local Process
Integration Runtime instance, which can be assigned to a user.

Task Flow

In this exercise, you will perform the following tasks:

1. Check the role ESBMessaging.send at Cloud Integration.

2. Create a Role Collection at your SAP BTP subaccount.

Prerequisites

You have completed the final step of creating an Inbound SOAP Adapter.

Outcome After This Exercise

You will have configured security for your endpoint, which allows your
integration flow to receive messages.

What do you Learn Within This Exercise?

You will learn how to implement role-based inbound security for an


integration flow.

Task 1: Check the Role of ESBMessaging.send at Cloud Integration

Steps

1. Find the configured role ESBMessaging.send at Cloud Integration.

1. Choose Monitor → Integrations and APIs.


2. Find Manage Security → User Roles.

3. The role is defined but not yet assigned to a user.

Task 2: Create a Role Collection in the SAP BTP Subaccount

Steps

1. Create a Role Collection in your SAP BTP subaccount.

1. Navigate to the overview side of your SAP BTP subaccount.


2. Choose the Role Collections menu entry.

3. Choose the Create button for a new Role Collection. Set the
name SendMessagesToCI.

4. Use the search bar to navigate to your new Role Collection


with the name SendMessagesToCI.
5. Choose the Edit button and choose the Role Name search.

6. To take over the entry for MessagingSend, search for it and


select the checkbox in the corresponding role. Only then will
the role be applied.

7. Choose the Add button at the bottom right.

Note

The role MessagingSend corresponds to the Cloud Integration


role ESBMessaging.send.

2. Assign the Role Collection to a user.

1. In the ID area, assign a user to this Role Collection by typing
the ID into the ID field. To add the ID, press Enter.
2. Choose the Save button.

3. Now, we have a defined MessagingSend role that corresponds


to the ESBMessaging.send role at No. 1, and an assigned user
to this role via a Role Collection at No. 2.

Create an HTTP Client to Send SOAP Messages to an iFlow

Business Scenario

We will now explore one method to send messages to an integration flow -


using Postman. It is important to be cautious while entering user and
password details. Entering an incorrect password three times results in the
user being blocked for an hour.

Task Flow

In this exercise, you will perform the following optional tasks:

1. Use the Postman HTTP Client to send a message to the integration


flow.

2. Use curl to send a message to the integration flow.

3. Create an HTTP Client with Business Application Studio.

Prerequisites

The last step has been completed, which involved creating an Inbound
SOAP Adapter. As a result, your integration flow has been deployed and
now has an endpoint.

Outcome After This Exercise

You now have the capability to send messages to the


DelayedDelivery_Process using various HTTP clients.

What do you Learn Within This Exercise?

You will learn how to configure and use the Postman HTTP client for
sending a SOAP message to the DelayedDelivery_Process.
Task 1: Use Postman HTTP Client to Send a Message to the Integration
Flow

Steps

1. Get the Postman app or use it via the web.

1. You can download the Postman API platform


from: https://www.postman.com/downloads/ for free.

2. Choose Workspaces.

3. Choose the Create Workspace button.

2. Configure a workspace.

1. Enter a name and a description.

2. Choose the Create Workspace button.

3. To create a collection in your new


workspace DelayedDelivery_Process, choose the New button.
3. Configure a collection within the workspace.

1. Choose the Collection button.

2. Rename the collection to DelayedDelivery_Process_Collection.


You now have a Workspace at No. 1 and a collection at No. 2.

4. Configure an HTTP request.

1. Choose the New button and choose HTTP Request.


2. Enter your Endpoint URL of your DelayedDelivery_Process.

3. Change the method to POST.

4. Choose Authorization → Inherit auth fr.. → Basic Auth, and
enter the user and password of the user to whom the Role
Collection SendMessagesToCI is assigned.
5. Choose the Body tab.

6. Choose raw → XML.

7. Enter the following SOAP message.

Note

It is possible to verify that the Product IDs exist in the database.

Code Snippet

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Header/>
  <soapenv:Body>
    <List>
      <Product><ProductID>HT-2001</ProductID></Product>
      <Product><ProductID>HT-1020</ProductID></Product>
      <Product><ProductID>HT-1035</ProductID></Product>
      <Product><ProductID>HT-6101</ProductID></Product>
      <Product><ProductID>HT-7000</ProductID></Product>
      <Product><ProductID>HT-2026</ProductID></Product>
      <Product><ProductID>HT-6100</ProductID></Product>
      <Product><ProductID>HT-9994</ProductID></Product>
      <Product><ProductID>HT-1254</ProductID></Product>
      <Product><ProductID>HT-1031</ProductID></Product>
      <Product><ProductID>HT-1035</ProductID></Product>
      <Product><ProductID>HT-1601</ProductID></Product>
      <Product><ProductID>HT-2025</ProductID></Product>
      <Product><ProductID>HT-1067</ProductID></Product>
    </List>
  </soapenv:Body>
</soapenv:Envelope>

8. Choose the Save button and rename your HTTP request while
saving.

5. Send a Message.

1. Choose the Send button.


6. Navigate to Integration Suite → Monitor → Overview → Manage
Integration Content.

1. Choose Monitor Message Processing.

2. Choose Completed → Trace.


3. The process runs smoothly without any errors.
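If you prefer a scripted client to Postman, the same request can be sent with a few lines of Groovy. This is a minimal sketch, assuming a placeholder endpoint, placeholder credentials for a user that holds the SendMessagesToCI Role Collection, and the SOAP message above saved locally:

Code Snippet

// Placeholder endpoint and credentials - replace with your own values.
def url      = "https://<runtime-host>/cxf/<your_defined_address_in_the_soap_adapter>"
def user     = "<user assigned to SendMessagesToCI>"
def password = "<password>"

// Read the SOAP message shown above from a local file.
def payload = new File("soap_message.xml").text

def conn = new URL(url).openConnection()
conn.setRequestMethod("POST")
conn.setDoOutput(true)
def basic = "${user}:${password}".bytes.encodeBase64().toString()
conn.setRequestProperty("Authorization", "Basic " + basic)
conn.setRequestProperty("Content-Type", "text/xml")
conn.outputStream.withWriter { it << payload }
println "HTTP status: ${conn.responseCode}"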

Using Integration Patterns


Integration Patterns

In this lesson, the following topic is discussed:

Show Integration Patterns.

Show Integration Patterns

The following Integration Patterns are included in the example package:

 Aggregator

 Composed Message Processor

 Content-Based Routing

 Content Enricher

 Content Filter
 Message Filter

 Recipient List

 Resequencer

 Scatter-Gather

 Splitter

 Quality of Service Exactly Once

The Integration Patterns in Detail:

Aggregator

To process related individual messages in bulk, you can use an Aggregator


pattern. This pattern involves collecting and storing individual messages
until a complete set of related messages is received. The aggregated
message is then sent to the intended receiver.

Composed Message Processor

The Composed Message Processor pattern is useful when you must


process a message that contains multiple elements, each requiring
different processing. The pattern involves splitting the message into
submessages, routing each submessage to a different destination, and
then reaggregating the responses back into a single message.

Content-Based Routing

Assuming you have an order process in which the inventory system


handling the order depends on the shipping address, you can use content-
based routing to direct the message to the appropriate recipient based on
its content. In this exercise, we use a content-based router.
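To illustrate the idea, the following minimal Groovy sketch routes an order by country; the country codes and receiver names are made up, and in an integration flow this logic lives in a Router step with XPath conditions rather than in a script:

Code Snippet

// Made-up order payload; XmlSlurper is available by default in Groovy.
def order = new XmlSlurper().parseText(
    '<Order><ShipTo><Country>US</Country></ShipTo></Order>')

def receiver
switch (order.ShipTo.Country.text()) {
    case 'US': receiver = 'Inventory_US'; break
    case 'DE': receiver = 'Inventory_EU'; break
    default  : receiver = 'Inventory_Default'   // Variant: Send to Default Receiver
}
println "Routing order to ${receiver}"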

Read more here:

 Variant: Send to Default Receiver

 Variant: Ignore

 Variant: Raise an Error

Content Enricher

Suppose you must send an order to a supplier, but you don't have all the
required information for the receiver system to process it. For example,
the product items only have a category code, and the main category
name is missing. In this case, you can use a Content Enricher pattern that
reads data synchronously from an external system and appends the
missing information to the original message before forwarding it to the
receiver.

Content Filter

Assuming you receive an order from a partner in a standardized format


with many fields, but your backend system only needs a small fraction of
them. You have two options to implement this scenario:

 Using a Filter step.

 Using Message Mapping.

Read more here:

 Variant: Content Filter Step

 Variant: Message Mapping

Message Filter
You can implement the Message Filter pattern to remove unwanted data
from a channel. For example, if you must send product information to an
inventory system, but the inventory system only handles a specific range
of products based on product category, you can apply a Message Filter to
discard any irrelevant data. The Message Filter is a subtype of the
Message Router pattern, with only one receiver channel. It evaluates
incoming messages and routes them to the receiver channel only if they
meet the specified criteria; otherwise, they are discarded.

Recipient List

Assuming that you want to find the best quote for an order by sending it
to several suppliers, but not all suppliers are relevant for every product in
the order. In this case, the suppliers that should receive the order are
dynamically determined based on the specific products being ordered. To
achieve this, you can use the Dynamic Router pattern, which sends a copy
of the message to multiple receivers based on dynamically determined
criteria. Unlike the content-based router, which forwards the original
message to a single receiver, the dynamic router sends a copy of the
message to multiple receivers.

Read more here:

 Variant: Static Routing

 Variant: Dynamic Routing

 Variant: Dynamic Routing Using JMS Message Queues

Resequencer

If you must rearrange messages that were received by Cloud Integration


in an incorrect order, you can employ the Aggregator pattern. This pattern
enables you to gather individual messages in batches that are sorted by
sequence number. To transform the message batches back into separate
messages, you can use the Splitter pattern and send the individual
messages to the intended receiver.
Scatter-Gather

The Scatter-Gather pattern allows you to send a message to multiple


recipients and collect their replies. By broadcasting a message to multiple
recipients, the pattern enables parallel processing of the message by all
recipients. After receiving the responses, the pattern reaggregates them
back into a single message.

Splitter

The Splitter pattern can be used when a message contains multiple


elements that require different processing. The pattern can break up the
message into individual messages based on the number of elements (see
the sketch at the end of this section). However, in the given exercise, the
Splitter pattern is not used.

A distinction is made between the following use cases:

 Splitting a bulk order message into multiple orders.

 Splitting a single order with multiple items.

Read more here:

 Variant with Iterating Splitter

 Variant with General Splitter

 Variant with Message Mapping
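As a small illustration of the splitting idea (not the graphical Splitter step itself), the following Groovy sketch breaks the Product list payload from the earlier SOAP exercise into one unit per Product element:

Code Snippet

def payload = '''
<List>
  <Product><ProductID>HT-2001</ProductID></Product>
  <Product><ProductID>HT-1020</ProductID></Product>
  <Product><ProductID>HT-1035</ProductID></Product>
</List>'''

// Each Product element would become a separate message downstream.
new XmlSlurper().parseText(payload).Product.each { product ->
    println "Split message for ProductID: ${product.ProductID.text()}"
}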

Quality of Service Exactly Once

You want to guarantee that a message is delivered and processed by the


receiver exactly once. This requirement can be achieved by combining the
following two enterprise integration patterns:
Guaranteed Delivery

The Guaranteed Delivery pattern guarantees that a message will


eventually be delivered to a receiver, even if there are temporary failures
in the messaging system. However, because of the possibility of
redeliveries, this pattern can result in a message being delivered multiple
times.

Idempotent Receiver

The Duplicate Message Suppression pattern addresses the issue of


handling duplicate messages and ensures that if a component receives
the same message multiple times, it processes it only once.

Read more here: Quality of Service Exactly Once
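The following Groovy sketch illustrates only the idea of the Idempotent Receiver; in Cloud Integration you would use the built-in duplicate handling, and the in-memory set below merely stands in for a persistent store of already-processed message IDs:

Code Snippet

def processedIds = [] as Set

// Process a message only if its ID has not been seen before.
def processOnce = { String messageId, Closure work ->
    if (!processedIds.add(messageId)) {
        println "Duplicate ${messageId} discarded"
    } else {
        work()
    }
}

// Guaranteed Delivery may redeliver order-42; it is processed only once.
processOnce("order-42") { println "Processing order-42" }
processOnce("order-42") { println "Processing order-42" }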

Summary

This lesson describes various integration patterns, including Aggregator,


Composed Message Processor, Content-Based Routing, Content Enricher,
Content Filter, Message Filter, Recipient List, Resequencer, Scatter-Gather,
Splitter, and Quality of Service Exactly Once.

Install Example Integration Flows

Business Scenario

The enterprise patterns, including aggregator, content enricher, and


dynamic routing, among others, have specific technical requirements. To
learn and apply these patterns effectively, it is useful to have a collection
of best practices, tips, and tricks. SAP offers self-study options, such as a
complete implementation as integration flow, a comprehensive
description on the SAP Help Portal, and a Postman Collection for testing the deployed
integration flows. You can find all this knowledge in the integration flow
Design Guidelines, which we can use to implement selected scenarios.

Task Flow

In this exercise, you will perform the following tasks:

1. Log on to the Discover → Integration view.

2. Search, find, and copy the example integration flows to your Design
area.

3. Check out the documentation.

4. Check out the help pages.

Prerequisites
A functional Integration Suite → Cloud Integration with an appropriate user
is all that is required.

Outcome After This Exercise

You have discovered and reviewed the Integration Flow Design Guidelines.

What do you Learn Within This Exercise?

To gain knowledge about integration patterns, it is highly recommended to


locate and explore the integration flow design guidelines, which serve as
an excellent source of information.

Task 1: Log On to the Integration View

Steps

1. Log on to the Discover → Integration view.

1. Navigate to Discover → Integrations.

Task 2: Search, Find, and Copy the Example Integration Flows to Your
Design Area

Steps

1. Search for Examples.

1. In the search bar, enter Examples, and choose the magnifying


glass icon to search for examples.
2. Locate the eight Integration Flow Design Guidelines.

2. Copy the content of Integration Flow Design Guidelines - Learn the


Basics to your Design area.

1. Select the Integration Flow Design Guidelines link from the


search result list.

2. Choose Artifacts (33).


3. There are more than 30 integration flows available for
exploration. There is also a Documents tab, which we will
explore later.

4. On the top right, choose the Copy button to copy the complete
package to your Design area.

5. From the left-side menu, open Design → Integrations.

6. Choose the Integration Flow Design Guidelines - Learn the


Basics package.

7. In this section, you find the Artifacts and Documents tabs.


8. Choose the Artifacts tab.

9. First, locate the Generic Receiver and choose it. This artifact is
a prerequisite for all other artifacts in this package, so it must
be installed first.

10. Use the Action → Deploy button to deploy the Generic


Receiver.

Task 3: Check Out the Documentation

Steps

1. Check out the Documentation.

1. Navigate to the Documents tab.

 No. 1: The Postman Collection (all needed Messages are


ready for use out of the box within the HTTP Client
Postman).

 No. 2: Jump to the Help Portal.

 No. 3: Jump to the installations tutorial for the


Integration Flow Design Guidelines.

Task 4: Check Out the Help Pages

Steps
1. Check out the How to Work with the Example Integration Flows (No.
3).

1. Choose the How to Work with the Example Integration


Flows link (No. 3).

2. Check out this page. We will do the same in the next task.

2. Check out Package Documentation on SAP Help Portal (No. 2).

1. Choose the Package Documentation on SAP Help Portal link


and access the documentation of each integration flow.

2. Go to the left-side menu and navigate


to Development → Integration Flow Design Guidelines.

Note

The Integration Flow Design Guidelines cover several topics, including


Decouple Processing, Decouple Sender and Flows Using Persistence, and
Decouple Sender and Flows Using JMS Message Queues.
3. Refer to this page for the upcoming task where we will
implement and execute this scenario.

Install the Postman Collection

Business Scenario

The following steps can be followed to try out selected scenarios and
study their mode of operation using the appropriate messages sent to the
endpoints of the processes via HTTPS. SAP provides a predefined
collection of messages for this purpose. It is also possible to use other
HTTP clients, such as Insomnia.

Task Flow

In the task of this exercise, you will perform the following steps:

1. Download the Postman Collection JSON file.

2. Open the Postman HTTP Client.

3. Install the Postman Collection.

Prerequisites

 You have completed the task Create an HTTP Client to send SOAP
messages to an integration flow using Postman. To use Postman,
you must have it installed on your machine. Alternatively, you can
use Insomnia and import the JSON collection of messages there.

Outcome After This Exercise

A well-prepared Postman collection is available for testing all


the Integration Flow Design Guidelines.
What do you Learn Within This Exercise?

Learn how to download and install the predefined Postman collection for
testing the Integration Flow Design Guidelines.

Task 1: Install the Postman Collection

Steps

1. Download the Postman Collection JSON file.

1. Navigate to Integration Suite → Design → Integrations and


open the copied package Integration Flow Design Guidelines -
Learn the Basics.

2. Navigate to the Documents tab and choose Modeling Basics -


Postman Collection.

The download of the zipped JSON


file ModelingBasics.postman_collection.zip starts.

3. Unzip this file locally on your computer.

2. Open Postman HTTP Client with your workspace.

1. Open Postman HTTP Client and navigate to your


workspace DelayedDelivery_Process.
3. Install the Postman Collection.

1. In your workspace, choose the Import button.

2. At the File tab, choose Choose Files and import


the ModelingBasics.postman_collection.json file.

3. Choose the Import button.


4. You have now imported the new collection ModelingBasics into
your workspace.

5. Now, you have a collection of messages that can be sent to


the provided integration flows.
