
Celonis Product Documentation

Process Connectors Best Practices

On the following page, the App Store team provides best practices on different topics related to the Transformations used in the Process Connectors.

Activity Master Data Table

On this page, the App Store team provides an overview of how to use the Activity Master Data Table. It can be used to:

  • Differentiate activities that happen at the item or header level according to our standard script.

  • Provide estimates of how long it takes to complete an activity, which can be used for savings-potential calculations.

  • Add additional columns to your projects, such as documentation.

For more information, please contact the App Store team.

Some of the values used are best guesses or assumptions. Feel free to adapt the hard-coded values according to your rollout projects and customer feedback.

Motivation

Customers frequently ask how to know whether an activity was performed on the item or header level. This is relevant for multiple topics, for instance:

  • Automation: Customers often object to the automation ratios provided by Celonis. Their main objection is that our analyses do not take into account that one manually performed activity on the header level might trigger the execution of many automated activities on the item level.

  • Business Case Calculations: You can take into account activities at the header or item level to avoid counting activities multiple times. Furthermore, the time spent performing activities can be taken into account to calculate savings, FTEs, and other figures.

  • Activity calculation in the script: Instead of going through the script to understand at which level an activity is calculated, you can simply check the level in the Master Data Table.

Contents of the table

The Activity Master Data Table has a very simple structure. It consists of the following columns:

  • "LEVEL" - The document level of the activity. Currently, only 'Item', 'Header', and 'Custom' are used. The sole purpose of the latter is to catch activities for which a level has not been specified.

  • "AVG_TIME" - An estimate, in minutes, of how long it takes to complete the activity.

  • "ACTIVITY_DE" - The German activity name.

  • "ACTIVITY_EN" - The English activity name.

Template

Vertica transformation script template - Create Table: Activity Master Data

DROP TABLE IF EXISTS _CEL_XXX_ACTIVITY_MASTER_DATA;

CREATE TABLE _CEL_XXX_ACTIVITY_MASTER_DATA AS (

SELECT DISTINCT
    "ACTIVITY_DE" AS "ACTIVITY_DE"
    ,"ACTIVITY_EN" AS "ACTIVITY_EN"
    ,CASE
         WHEN "ACTIVITY_DE" IN (/* list of header-level activities */)
         THEN 'Header'
         WHEN "ACTIVITY_DE" IN (/* list of item-level activities */)
         THEN 'Item'
         ELSE 'Custom'
         END AS "LEVEL"
    ,CASE
         WHEN "ACTIVITY_DE" IN (/* activities with an estimated duration of 5 minutes */)
         THEN 5
         WHEN "ACTIVITY_DE" IN (/* activities with an estimated duration of 60 minutes */)
         THEN 60
         END AS "AVG_TIME"

FROM _CEL_XXX_ACTIVITIES);
How to connect the table to the existing data model
  1. Define your activities, their level (header or item for SAP systems), and any attributes you would like to include in the activities table. This step should be performed outside of Celonis.

  2. Create the table and populate it using a Transformation, or use the Import file functionality to import a CSV file with all the information.

  3. Connect the new table in the Data Model.

The Activity Master Data table is simply connected to the designated Activity Table by using 'ACTIVITY_EN' (the activity name) as a foreign key.
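The join can be sketched in a few lines of Python with SQLite (the table and column names follow this page; the sample rows are made up for illustration):

```python
import sqlite3

# In-memory sketch: the Activity Master Data table joins to the
# activity table on ACTIVITY_EN, the activity name.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE _CEL_P2P_ACTIVITIES (CASE_KEY TEXT, ACTIVITY_EN TEXT);
CREATE TABLE _CEL_P2P_ACTIVITY_MASTER_DATA (ACTIVITY_EN TEXT, LEVEL TEXT, AVG_TIME INT);
INSERT INTO _CEL_P2P_ACTIVITIES VALUES ('4711', 'Send Purchase Order'),
                                       ('4711', 'Record Goods Receipt');
INSERT INTO _CEL_P2P_ACTIVITY_MASTER_DATA VALUES ('Send Purchase Order', 'Item', 15),
                                                 ('Record Goods Receipt', 'Item', 5);
""")
rows = con.execute("""
    SELECT a.CASE_KEY, a.ACTIVITY_EN, m.LEVEL, m.AVG_TIME
    FROM _CEL_P2P_ACTIVITIES a
    LEFT JOIN _CEL_P2P_ACTIVITY_MASTER_DATA m ON a.ACTIVITY_EN = m.ACTIVITY_EN
    ORDER BY a.ACTIVITY_EN
""").fetchall()
print(rows)
```

A LEFT JOIN keeps activities without a master data entry, which then show up without a level or time, matching the 'Custom' fallback idea.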

Example of how it is implemented in Purchase-to-Pay for SAP ECC and Order-to-Cash SAP ECC

Vertica transformation script P2P SAP ECC - Create Table: Activity Master Data

DROP TABLE IF EXISTS _CEL_P2P_ACTIVITY_MASTER_DATA;
 
CREATE TABLE _CEL_P2P_ACTIVITY_MASTER_DATA AS (
SELECT DISTINCT
    "ACTIVITY_DE" AS "ACTIVITY_DE"
    ,"ACTIVITY_EN" AS "ACTIVITY_EN"
    ,CASE
         WHEN "ACTIVITY_EN" IN ('Cancel Invoice Receipt' ,'Change Approval for Purchase Order', 'Cancel Currency','Change Final Invoice Indicator' , 'Change Vendor' ,
                                  'Clear Invoice' , 'Record Invoice Receipt' , 'Send Purchase Order Update' ,'Set Payment Block' , 'Waiting for Approval (Initial)')
         THEN 'Header'
         WHEN "ACTIVITY_EN" IN ('Block Purchase Order Item', 'Cancel Goods Receipt' , 'Change Delivery Date (actual)' , 'Change Delivery Date (scheduled)' ,
                               'Change Delivery Indicator' , 'Change Outward Delivery Indicator' , 'Change Rejection Indicator' , 'Change Price' ,
                                  'Change Quantity' , 'Change Storage Location' , 'Create Purchase Order Item' , 'Create Purchase Requisition Item',
                               'Delete Purchase Order Item' , 'PO approved (1st stage)', 'PO approved (2nd stage)' , 'PO approved (direct)' ,
                               'Reactivate Purchase Order Item' , 'Receive Order Confirmation' , 'Record Goods Receipt' , 'Remove Payment Block' , 'Send Purchase Order')
         THEN 'Item'
         ELSE 'Custom'
         END AS "LEVEL"
    ,CASE
         WHEN "ACTIVITY_EN" IN ('Create Purchase Order Item' , 'Create Purchase Requisition Item' , 'Delete Purchase Order Item' , 'PO approved (1st stage)' ,
                               'Receive Order Confirmation' , 'Record Goods Receipt' , 'Record Invoice Receipt' , 'Remove Payment Block' , 'Send Purchase Order Update' ,
                                   'Set Payment Block' , 'Waiting for Approval (Initial)' , 'Change Outward Delivery Indicator')
         THEN 5
         WHEN "ACTIVITY_EN" IN ('Change Approval for Purchase Order' , 'Cancel Currency' , 'Change Delivery Date (actual)' , 'Change Delivery Date (scheduled)' ,
                              'Change Delivery Indicator' , 'Change Final Invoice Indicator' , 'Change Rejection Indicator' , 'Clear Invoice' )
         THEN 10
         WHEN "ACTIVITY_EN" = 'Send Purchase Order'
         THEN 15
         WHEN "ACTIVITY_EN" IN ('Cancel Goods Receipt' , 'Record Invoice Receipt' , 'Change Price' , 'Change Quantity' , 'Change Storage Location' ,
                                'PO approved (2nd stage)' , 'PO approved (direct)' , 'Reactivate Purchase Order Item')
         THEN 30
         WHEN "ACTIVITY_EN" IN ('Block Purchase Order Item' , 'Change Vendor')
         THEN 60
         ELSE 30
         END AS "AVG_TIME"
FROM _CEL_P2P_ACTIVITIES);

HANA transformation script P2P SAP ECC - Create Table: Activity Master Data

-- Query No: 33
CALL {{TARGET_SCHEMA}}.DROP_IF_EXISTS ('{{TARGET_SCHEMA}}', '_CEL_P2P_ACTIVITY_MASTER_DATA');
-- Query No: 34
CREATE COLUMN TABLE {{TARGET_SCHEMA}}._CEL_P2P_ACTIVITY_MASTER_DATA AS (
SELECT  DISTINCT 
        ACTIVITY_DE AS "ACTIVITY_DE"
    ,ACTIVITY_EN AS "ACTIVITY_EN"
    ,CASE
         WHEN ACTIVITY_EN IN ('Cancel Invoice Receipt' ,'Change Approval for Purchase Order', 'Cancel Currency','Change Final Invoice Indicator' , 'Change Vendor' ,
                                  'Clear Invoice' , 'Record Invoice Receipt' , 'Send Purchase Order Update' ,'Set Payment Block' , 'Waiting for Approval (Initial)')
         THEN 'Header'
         WHEN ACTIVITY_EN IN ('Block Purchase Order Item', 'Cancel Goods Receipt' , 'Change Delivery Date (actual)' , 'Change Delivery Date (scheduled)' ,
                               'Change Delivery Indicator' , 'Change Outward Delivery Indicator' , 'Change Rejection Indicator' , 'Change Price' ,
                                  'Change Quantity' , 'Change Storage Location' , 'Create Purchase Order Item' , 'Create Purchase Requisition Item',
                               'Delete Purchase Order Item' , 'PO approved (1st stage)', 'PO approved (2nd stage)' , 'PO approved (direct)' ,
                               'Reactivate Purchase Order Item' , 'Receive Order Confirmation' , 'Record Goods Receipt' , 'Remove Payment Block' , 'Send Purchase Order')
         THEN 'Item'
         ELSE 'Custom'
         END AS "LEVEL"
    ,CASE
         WHEN ACTIVITY_EN IN ('Create Purchase Order Item' , 'Create Purchase Requisition Item' , 'Delete Purchase Order Item' , 'PO approved (1st stage)' ,
                               'Receive Order Confirmation' , 'Record Goods Receipt' , 'Record Invoice Receipt' , 'Remove Payment Block' , 'Send Purchase Order Update' ,
                                   'Set Payment Block' , 'Waiting for Approval (Initial)' , 'Change Outward Delivery Indicator')
         THEN 5
         WHEN "ACTIVITY_EN" IN ('Change Approval for Purchase Order' , 'Cancel Currency' , 'Change Delivery Date (actual)' , 'Change Delivery Date (scheduled)' ,
                              'Change Delivery Indicator' , 'Change Final Invoice Indicator' , 'Change Rejection Indicator' , 'Clear Invoice' )
         THEN 10
         WHEN "ACTIVITY_EN" = 'Send Purchase Order'
         THEN 15
         WHEN "ACTIVITY_EN" IN ('Cancel Goods Receipt' , 'Record Invoice Receipt' , 'Change Price' , 'Change Quantity' , 'Change Storage Location' ,
                                'PO approved (2nd stage)' , 'PO approved (direct)' , 'Reactivate Purchase Order Item')
         THEN 30
         WHEN "ACTIVITY_EN" IN ('Block Purchase Order Item' , 'Change Vendor')
         THEN 60
         ELSE 30
         END AS "AVG_TIME"
FROM {{TARGET_SCHEMA}}._CEL_P2P_ACTIVITIES
);

Vertica transformation script O2C SAP ECC - Create Table: Activity Master Data

DROP TABLE IF EXISTS _CEL_O2C_ACTIVITY_MASTER_DATA;

CREATE TABLE _CEL_O2C_ACTIVITY_MASTER_DATA AS (
SELECT DISTINCT
    "ACTIVITY_DE" AS "ACTIVITY_DE"
    ,"ACTIVITY_EN" AS "ACTIVITY_EN"
    ,CASE
         WHEN "ACTIVITY_EN" IN ('Create Quotation', 'Create Purchase Order', 'Create Picking', 'Quotation Statuses', 'Sales Order Statuses',
                                 'Approve Credit Check', 'Deny Credit Check', 'Partially approve Credit Check', 'Set Delivery Block', 'Change Delivery Block',
                                'Remove Delivery Block', 'Delete Delivery Note', 'Record Goods Issue', 'Cancel Goods Issue', 'Create Billing Documents',
                                'Create Other Billing Document', 'Change Shipping Type', 'Change Freight Terms', 'Change Payment Terms', 'Change Inco Terms (Part 1)', 'Change Inco Terms (Part 2)',
                                'Due Date Passed', 'Material Availability Date Passed', 'Clear Invoice')
         THEN 'Header'
         WHEN "ACTIVITY_EN" IN ('Create Sales Order Item', 'Set Reason for Rejection' , 'Change Reason for Rejection' , 'Cancel Reason for Rejection' ,
                               'Change Price' , 'Change Division' , 'Book Delivery Shortage' , 'Change Delivery Amount' ,
                                  'Change Material' , 'Change Plant' , 'Change Route' , 'Create Delivery Documents',
                               'Create Other Delivery Documents')
         THEN 'Item'
         ELSE 'Custom'
         END AS "LEVEL"
    ,CASE
         WHEN "ACTIVITY_EN" IN ('Due Date Passed', 'Material Availability Date Passed')
         THEN 0
         WHEN "ACTIVITY_EN" IN ('Create Quotation' ,'Create Purchase Order', 'Quotation Statuses' , 'Sales Order Statuses' ,'Record Goods Issue', 'Create Billing Documents', 
              'Create Other Billing Document', 'Create Sales Order Item', 'Book Delivery Shortage' , 'Create Delivery Documents', 'Create Other Delivery Documents')
         THEN 5
         WHEN "ACTIVITY_EN" IN ('Partially approve Credit Check','Clear Invoice','Cancel Goods Issue')
         THEN 10
         WHEN "ACTIVITY_EN" IN ('Change Delivery Block', 'Remove Delivery Block', 'Delete Delivery Note', 'Change Delivery Amount', 'Change Material', 'Change Plant',
         'Change Route', 'Create Picking', 'Set Reason for Rejection', 'Change Reason for Rejection', 'Cancel Reason for Rejection', 'Change Price', 'Change Division')
         THEN 30
         WHEN "ACTIVITY_EN" IN ('Approve Credit Check' , 'Deny Credit Check' , 'Change Shipping Type', 'Change Freight Terms', 'Change Payment Terms', 'Change Inco Terms (Part 1)', 
         'Change Inco Terms (Part 2)','Set Delivery Block')
         THEN 60
         ELSE 30
         END AS "AVG_TIME" --Minutes
FROM _CEL_O2C_ACTIVITIES
);

HANA transformation script O2C SAP ECC - Create Table: Activity Master Data

-- Statement: Create Table: _CEL_O2C_ACTIVITY_MASTER_DATA
-- Query No: 1
CALL {{TARGET_SCHEMA}}.DROP_IF_EXISTS ('{{TARGET_SCHEMA}}', '_CEL_O2C_ACTIVITY_MASTER_DATA');
-- Query No: 2
CREATE COLUMN TABLE {{TARGET_SCHEMA}}._CEL_O2C_ACTIVITY_MASTER_DATA AS (
SELECT  DISTINCT 
        ACTIVITY_DE AS "ACTIVITY_DE"
    ,ACTIVITY_EN AS "ACTIVITY_EN"
    ,CASE
         WHEN ACTIVITY_EN IN ('Create Quotation', 'Create Purchase Order', 'Create Picking', 'Quotation Statuses', 'Sales Order Statuses',
                                 'Approve Credit Check', 'Deny Credit Check', 'Partially approve Credit Check', 'Set Delivery Block', 'Change Delivery Block',
                              'Remove Delivery Block', 'Delete Delivery Note', 'Record Goods Issue', 'Cancel Goods Issue', 'Create Billing Documents',
                              'Create Other Billing Document', 'Change Shipping Type', 'Change Freight Terms', 'Change Payment Terms', 'Change Inco Terms (Part 1)', 'Change Inco Terms (Part 2)',
                              'Due Date Passed', 'Material Availability Date Passed', 'Clear Invoice')
         THEN 'Header'
         WHEN ACTIVITY_EN IN ('Create Sales Order Item', 'Set Reason for Rejection' , 'Change Reason for Rejection' , 'Cancel Reason for Rejection' ,
                               'Change Price' , 'Change Division' , 'Book Delivery Shortage' , 'Change Delivery Amount' ,
                               'Change Material' , 'Change Plant' , 'Change Route' , 'Create Delivery Documents',
                               'Create Other Delivery Documents')
         THEN 'Item'
         ELSE 'Custom'
         END AS "LEVEL"
    ,CASE
         WHEN "ACTIVITY_EN" IN ('Due Date Passed', 'Material Availability Date Passed')
         THEN 0
         WHEN "ACTIVITY_EN" IN ('Create Quotation' ,'Create Purchase Order', 'Quotation Statuses' , 'Sales Order Statuses' ,'Record Goods Issue', 'Create Billing Documents', 
              'Create Other Billing Document', 'Create Sales Order Item', 'Book Delivery Shortage' , 'Create Delivery Documents', 'Create Other Delivery Documents')
         THEN 5
         WHEN "ACTIVITY_EN" IN ('Partially approve Credit Check','Clear Invoice','Cancel Goods Issue')
         THEN 10
         WHEN "ACTIVITY_EN" IN ('Change Delivery Block', 'Remove Delivery Block', 'Delete Delivery Note', 'Change Delivery Amount', 'Change Material', 'Change Plant',
         'Change Route', 'Create Picking', 'Set Reason for Rejection', 'Change Reason for Rejection', 'Cancel Reason for Rejection', 'Change Price', 'Change Division')
         THEN 30
         WHEN "ACTIVITY_EN" IN ('Approve Credit Check' , 'Deny Credit Check' , 'Change Shipping Type', 'Change Freight Terms', 'Change Payment Terms', 'Change Inco Terms (Part 1)', 
         'Change Inco Terms (Part 2)','Set Delivery Block')
         THEN 60
         ELSE 30
         END AS "AVG_TIME" --Minutes
FROM {{TARGET_SCHEMA}}._CEL_O2C_ACTIVITIES

);
Outcome of the exemplary transformation snippet once it was executed:

| ACTIVITY_DE | ACTIVITY_EN | LEVEL | AVG_TIME |
| --- | --- | --- | --- |
| Ändere Lieferplandatum | Change Schedule Line Date | Custom | 30 |
| Sende Bestellung | Send Purchase Order | Item | 15 |
| Lege BANF Position an | Create Purchase Requisition Item | Item | 5 |
| Erfasse Wareneingang | Record Goods Receipt | Item | 5 |
| Erfasse Leistungserfassungsblatt | Record Service Entry Sheet | Custom | 30 |
| Erfasse Rechnungseingang | Record Invoice Receipt | Header | 5 |
| Lege Bestellposition an | Create Purchase Order Item | Item | 5 |
| Lösche Bestellposition | Delete Purchase Order Item | Item | 5 |
| Sende Bestelländerung | Send Purchase Order Update | Header | 5 |
| Storniere Rechnungseingang | Cancel Invoice Receipt | Header | 30 |
| Ändere Preis | Change Price | Header | 30 |
| Ändere Freigabe für Bestellung | Change Approval for Purchase Order | Header | 10 |
| Storniere Wareneingang | Cancel Goods Receipt | Item | 30 |
| Ändere Menge | Change Quantity | Item | 30 |

Using the Activity Master Data Table in KPI calculation (Exemplary use case: Ratio of automated activities)

One of the benefits of this helper table is being able to distinguish activities that are either performed on the header or item level. This applies to most of our use cases as we are still mainly focused on document data from SAP ECC.

Currently - in most cases - we do not distinguish between the document header and item level when calculating a KPI that is based on activity. One example of this is the calculation of the automation ratio. The current formula does not account for whether an activity is performed on the document header or on the document item level.

As a result, many customers object to the automation ratio calculation provided by Celonis.

PQL Formula App Store Analysis "Automation":

Ratio of automated activities (Automation Ratio)

AVG(
CASE
    WHEN "_CEL_P2P_ACTIVITIES"."USER_TYPE" IN (<%= auto_user_type %>)
    THEN
        1.0
    ELSE
        0.0
END
)

Using the "LEVEL" column from the Activity Master Data Table, the PQL statement can be adjusted to include a weighting of activities that are performed on the document header or document item level.

In development: Automation 2.0:

New weighted ratio of automated activities (Automation Ratio)

SUM(
        CASE 
           WHEN "_CEL_P2P_ACTIVITIES"."USER_TYPE" IN (<%= auto_user_type %>) AND "_CEL_P2P_ACTIVITY_MASTER_DATA"."LEVEL" = 'Item' 
           THEN 1.0
           WHEN "_CEL_P2P_ACTIVITIES"."USER_TYPE" IN (<%= auto_user_type %>) AND "_CEL_P2P_ACTIVITY_MASTER_DATA"."LEVEL" = 'Header' 
           THEN 1.0 / (PU_COUNT("EKKO", "EKPO"."EBELP"))
        ELSE 0.0
        END
    )
    /
SUM(
        CASE 
           WHEN "_CEL_P2P_ACTIVITY_MASTER_DATA"."LEVEL" = 'Item' 
           THEN 1.0
           WHEN "_CEL_P2P_ACTIVITY_MASTER_DATA"."LEVEL" = 'Header' 
           THEN 1.0 / (PU_COUNT("EKKO", "EKPO"."EBELP"))
        ELSE 0.0
        END
   )

This approach is not only limited to the automation use case. It can be applied to any use case that focuses specifically on an activity-based KPI.
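The weighted ratio above can also be reproduced outside PQL. A minimal Python sketch, under the assumption that each activity row carries its user type, its level from the master data table, and the number of items of its purchase order (which is what PU_COUNT("EKKO", "EKPO"."EBELP") contributes per case):

```python
def weighted_automation_ratio(rows, auto_user_types=("B",)):
    """rows: (user_type, level, items_in_order) per activity row.
    In an item-level data model, a header activity is duplicated once
    per item, so it is down-weighted by the number of items in its
    order; 'Custom' activities get weight 0, as in the PQL ELSE branch."""
    num = den = 0.0
    for user_type, level, items in rows:
        if level == "Item":
            weight = 1.0
        elif level == "Header":
            weight = 1.0 / items
        else:
            weight = 0.0
        den += weight
        if user_type in auto_user_types:
            num += weight
    return num / den

# A 4-item purchase order: one manual header activity (appearing 4x,
# once per item case) plus one automated item activity per item.
rows = [("A", "Header", 4)] * 4 + [("B", "Item", 4)] * 4
print(weighted_automation_ratio(rows))  # 0.8; the unweighted ratio would be 4/8 = 0.5
```

The manual header activity now counts once instead of once per item, so it no longer drags down the ratio four times over.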

Benefits in business case calculation

Another benefit of this table is an arguably more precise business case calculation within a Celonis analysis. Assuming that most of the processing times are estimated correctly and have been validated by the customer, the potential savings calculated this way are more precise and less prone to objections by a customer.

The count of each distinct activity specified in the helper table is multiplied by its individual processing time. The result is a total processing time, which is then multiplied by the average cost of labor per minute.
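As a quick sketch of that calculation (the activity counts and labor cost are hypothetical; the AVG_TIME values are taken from the master data examples above):

```python
# AVG_TIME values in minutes, as in the master data table.
avg_time_min = {"Record Goods Receipt": 5, "Send Purchase Order": 15, "Change Vendor": 60}

# Hypothetical activity counts and labor cost for illustration only.
counts = {"Record Goods Receipt": 1200, "Send Purchase Order": 800, "Change Vendor": 50}
labor_cost_per_min = 0.75  # assumed average cost of labor per minute (EUR)

# Total processing time = sum of (count x processing time) per activity.
total_minutes = sum(counts[a] * avg_time_min[a] for a in counts)
total_cost = total_minutes * labor_cost_per_min
print(total_minutes, total_cost)  # 21000 15750.0
```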

Q&A

Q: In Analysis X, the custom table _CEL_P2P_ACTIVITY_MASTER_DATA uses proxies of 5-60 for the average time. I believe this is minutes? Is it a good benchmark to consider Y minutes for 'Activity A'?

A: The assumptions made in this additional table are more than debatable. Please feel free to change them during your rollout projects. We are also happy to adjust the hard-coded durations and levels of activities in the standard scripts based on customer feedback.

Automation Rate - SAP ECC

On this page, the App Store Team provides a short overview on how the Automation Rate is calculated in the SAP P2P process and how this is implemented in our standard scripts.


SAP Tables

The following tables are needed in order to calculate the Automation Rate in the SAP P2P process:

| Table | Description |
| --- | --- |
| USR02 | The standard SAP table USR02 contains information about the logon data in SAP ERP systems. For the Automation Rate, we make use of USR02.USTYP, which indicates whether a transaction was executed by a dialog user ('A') or a batch user ('B'). |
| EKKO | The standard SAP table EKKO contains information about the purchasing document header in SAP ERP systems and therefore contains information about the purchase order. For the Automation Rate, we make use of EKKO.STATU = 'B', which indicates that a purchasing document was created by a batch user. |
| EBAN | The standard SAP table EBAN contains information about the purchase requisition in SAP ERP systems and is linked to the purchase order. For the Automation Rate, we make use of EBAN.STATU = 'B', which indicates that a purchase requisition was created by a batch user. |
| EKES | The standard SAP table EKES contains information about the vendor confirmations in SAP ERP systems and is linked to the purchase order. For the Automation Rate, we make use of EKES.ESTKZ = 3, which indicates that a vendor confirmation was created by a batch user. |
| NAST | The standard SAP table NAST contains information about the internal message status in SAP ERP systems. For the Automation Rate, we make use of NAST.MANUE = 'X', which indicates that a transaction was executed by a dialog user. |

Calculation of the Celonis Automation Rate
User Types

Previously (and still in most cases), only USR02.USTYP was considered for the calculation of the Automation Rate. Unfortunately, this leads to misleading results because in many SAP systems the user type is not reliable: even when the type is set to 'A', the activity might still be executed as a batch job, since not all batch activities are registered in the USR02 table. Some activities, like SAP Workflows and recurring events that can be seen as semi-automated or automated, were originally initiated by a dialog user.

User Names

Standard user types in SAP:

| Type (in USER_TYPE) | Description |
| --- | --- |
| A-User | Individual, interactive system access. |
| B-User | Background processing and communication within a system (such as RFC users for ALE, Workflow, TMS, and CUA). |
| C-User | Dialog-free communication for external RFC calls. |
| L-User | Dialog user available to a larger, anonymous group of users. |
| S-User | General, non-person-related users that allow the assignment of additional identical authorizations. No logon is possible. |
| R-User | Artificial user type provided by a Celonis transformation script to identify mass transactions triggered by a dialog user (A-User). |

User names may be obtained from the table responsible for the activity, which is mostly the case table. In general, SAP distinguishes between different types of users:

  • Dialog Users, which are normal persons and interact via the GUI. (U1234, amerkel)

  • Interface Users, which are used by a third party tool to feed data from another system (e.g. Ariba) or Background processing and communication within a system (such as RFC users for ALE, Workflow, TMS, and CUA).

  • Service Users, which are available to a larger group of users.

Examples for User Names that indicate a batch activity may be 'BatchUser', 'System', 'B', 'Generic', 'Robot' and so on.
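Such a user-name check can be sketched as a simple (and deliberately naive) heuristic; the pattern list below is an assumption and must be adapted to the customer system:

```python
import re

# Assumed name fragments suggesting batch/interface users; adapt per customer.
BATCH_NAME_PATTERNS = re.compile(r"batch|system|robot|generic|rfc", re.IGNORECASE)

def looks_like_batch_user(user_name: str) -> bool:
    """Flag user names that suggest a batch or interface user rather than a person."""
    return bool(BATCH_NAME_PATTERNS.search(user_name))

print(looks_like_batch_user("BatchUser"))  # True
print(looks_like_batch_user("amerkel"))    # False
```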

Extending the Activity Table by adding a dedicated column for the identification of automated activities

In order to calculate the new Automation Rate for the P2P process, it is necessary to extend the existing transformation scripts. The activity table is extended by a field indicating automation, taking into account the previously mentioned fields and approaches.

When the activities are created, a top-down approach using CASE WHEN statements is used to find the right value for this newly created field. The following example shows the procedure for the creation of a purchase order item:
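The actual standard script is not reproduced on this page; purely as an illustration of the top-down precedence idea, here is a sketch in Python. The field names EKKO.STATU, NAST.MANUE, and USR02.USTYP come from the table overview above, but the exact order and rules in the shipped script may differ:

```python
def user_type_for_po_creation(ekko_statu, usr02_ustyp, nast_manue):
    """Illustrative top-down cascade, mirroring CASE WHEN semantics:
    the first matching rule wins."""
    if ekko_statu == "B":      # purchasing document created by a batch run
        return "B"
    if nast_manue == "X":      # message handled manually by a dialog user
        return "A"
    return usr02_ustyp or "A"  # fall back to the logon user type

print(user_type_for_po_creation("B", "A", None))   # batch creation wins -> 'B'
print(user_type_for_po_creation(None, "A", "X"))   # manual message -> 'A'
```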

Activity Master Data

In order to calculate the Automation Rate and its implications on the business case at a more granular level, another table is added to the data model: the _CEL_P2P_ACTIVITY_MASTER_DATA table. This table provides information on whether an activity was executed on the header or item level, in order to take into account the duplication of cases that occurs on the header level. Furthermore, the estimated activity time, which should be validated by the customer, is added.

Q&A
Q: How can I use the improved Automation Rate for my P2P process?

In order to use the extended calculation of the Automation Rate KPI, you need to expand the data model of your P2P process as described in this article.

Q: Is it also possible to use the improved Automation Rate for other processes?

So far, new indicators for the calculation of the Automation Rate KPI have been added only for P2P.

Q: Why don't we just compare the number of timestamps per minute for a user in order to find automatic activities?

When a dialog user saves a transaction in SAP, multiple activities can be created at the same time without this transaction being a batch job. Furthermore, semi-automatic transactions like SAP Workflows, which require the execution of a dialog user, can create multiple different activities at the same time.

Currency Conversion - SAP ECC

On this page, the App Store Team provides you with a short overview of what Currency Conversion looks like in SAP ECC and how it is implemented in our standard scripts.


Related SAP Tables

The main tables that are needed to convert from one currency to another are:

TCURR

Contains the absolute exchange rates from one currency to another in a specific timeframe

→ Most important field: UKURS

To avoid the necessity of rounding exchange rates, SAP stores some values as negative numbers, which means that the inverse of the absolute value is the actual exchange rate (see example below).

Example: Converting US Dollars to Euro. Current exchange rate: 1 $ = 0,87719 €.

Solution: To increase accuracy (only 5 decimal places are possible!) and to avoid rounding errors, SAP stores the inverse, e.g. -1,14, which means 1/1,14 = 0,8771929824561...

The exchange rates contained in table TCURR cannot be interpreted on their own, since they always refer to a data record in the factor table TCURF.
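Decoding a stored UKURS value can be sketched as follows (the TCURF factors come in later, see below):

```python
def effective_rate(ukurs: float) -> float:
    """A negative UKURS means SAP stored the inverse of the actual rate."""
    return 1.0 / abs(ukurs) if ukurs < 0 else ukurs

print(effective_rate(-1.14))  # 1/1,14 ~ 0.87719...
print(effective_rate(0.5))    # positive rates are used as-is
```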

| Attribute | Explanation |
| --- | --- |
| MANDT | SAP client (Mandant) |
| KURST | Exchange rate type. Can differ from customer to customer. The standard script uses KURST = 'M', which is defined as the average exchange rate type. |
| FCURR | FROM currency - the initial currency you want to convert from |
| TCURR | TO currency - the output currency you want to convert to |
| GDATU | The date on which the exchange rate is valid |
| UKURS | The absolute exchange rate on the specific date (GDATU). The UKURS values are stored in the numerical format XXXX,XXXXX, so the exchange rates can have values from 0,00001 to 9999,99999. |
| FFACT* | Ratio for the "From" currency units |
| TFACT* | Ratio for the "To" currency units |

* The correct values for FFACT and TFACT are usually taken from the table TCURF.

TCURF

Contains the conversion of currencies in ratios; needed to provide a more accurate conversion

→ Most important fields: TFACT, FFACT

The UKURS values are stored in the numerical format XXXX,XXXXX, so the exchange rates can have values from 0,00001 to 9999,99999. To increase the range, TCURF.TFACT and TCURF.FFACT add a factor to the UKURS (see example below).

Example: Converting Indonesian Rupiahs to Euro. The current exchange rate is around 1 IDR = 0,000079... €. As a result of the maximum number of decimal places, TCURR.UKURS would only contain the value 0,00008, which is not very accurate.

Solution:

  1. To avoid rounding errors, SAP stores the inverse, e.g. -12,54 (taken from real Customer Data), resulting in 1/12,54 = 0,07974481658692...

    As the real exchange rate is way smaller than the one stored in TCURR.UKURS (0,000079 vs. 0,079745), an additional factor is needed. That is where TCURF steps in:

  2. TCURF.FFACT (IDR) = 1000 → 0,079745/1000 = 0,000079745

    With the help of the FFACT, we are able to get the exchange rate we actually need.
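The two steps combined, as a quick numeric check of the example above:

```python
ukurs = -12.54  # stored inverse rate (from the example)
ffact = 1000    # TCURF.FFACT for IDR

# Step 1: invert the negative rate; step 2: apply the FROM factor.
rate = (1 / abs(ukurs)) / ffact
print(rate)  # ~ 0.000079745 EUR per IDR
```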

| Attribute | Explanation |
| --- | --- |
| MANDT | SAP client (Mandant) |
| KURST | Exchange rate type. Can differ from customer to customer. The standard script uses KURST = 'M', which is defined as the average exchange rate type. |
| FCURR | FROM currency - the initial currency you want to convert from |
| TCURR | TO currency - the output currency you want to convert to |
| GDATU | The date on which the exchange rate is valid |
| FFACT | Ratio for the "From" currency units |
| TFACT | Ratio for the "To" currency units |
| ABWCT* | Alternative exchange rate type |
| ABWGA* | Date from which the alternative exchange rate type is valid |

* Not taken into account.

TCURX

Contains information about the number of decimal places of different currencies.

→ Most important field: CURRDEC

Example: The Japanese Yen (JPY) does not have any decimal places → TCURX.CURRDEC = 0. SAP stores every value as the result of the following formula: stored value = actual value / 10^(2 - CURRDEC)

When 100 JPY is entered in the SAP GUI, the value stored in the backend table is 100 JPY / 10^(2-0) = 1,00, because TCURX.CURRDEC (JPY) = 0.

The Tunisian Dinar (TND) has 3 decimal places → TCURX.CURRDEC = 3

When 100 TND is entered in the SAP GUI, the value stored in the backend table is 100 TND / 10^(2-3) = 1000, because TCURX.CURRDEC (TND) = 3.

Solution: To get the real value back, we need to use the information stored in TCURX.CURRDEC. This is done with the calculated TDEC value.

The SAP standard is 2 decimal places, so you will not find information for all currencies here.

| Attribute | Explanation |
| --- | --- |
| CURRKEY | Currency abbreviation |
| CURRDEC | Number of decimal places the currency contains |

Currency Conversion in the App Store Standard Scripts

The currency conversion in the standard scripts is calculated as follows:

Currency Conversion SAP - Example: P2P Standard, Transformation Script -> Table: Cases

....
,CASE 
                WHEN E.WAERS = '<%=currency%>' THEN E.NETWR*ISNULL(TCURX.TDEC,1) 
                WHEN TCURR_CC.UKURS < 0 THEN (E.NETWR*ISNULL(TCURX.TDEC,1))*(1/ABS(TCURR_CC.UKURS)/(CASE WHEN COALESCE(TCURF_CC.FFACT, TCURR_CC.FFACT, 0) = 0 THEN 1 ELSE COALESCE(TCURF_CC.FFACT, TCURR_CC.FFACT) END))*(CASE WHEN COALESCE(TCURF_CC.TFACT, TCURR_CC.TFACT, 0) = 0 THEN 1 ELSE COALESCE(TCURF_CC.TFACT, TCURR_CC.TFACT) END)
                WHEN TCURR_CC.UKURS > 0 THEN (E.NETWR*ISNULL(TCURX.TDEC,1))*(TCURR_CC.UKURS/CASE WHEN COALESCE(TCURF_CC.FFACT, TCURR_CC.FFACT, 0) = 0 THEN 1 ELSE COALESCE(TCURF_CC.FFACT, TCURR_CC.FFACT) END*CASE WHEN COALESCE(TCURF_CC.TFACT, TCURR_CC.TFACT, 0) = 0 THEN 1 ELSE COALESCE(TCURF_CC.TFACT, TCURR_CC.TFACT) END)
        END AS NETWR_CONVERTED
...
...
FROM 
        ...
JOIN ...
LEFT JOIN (
        SELECT 
                TCURX.CURRKEY
                ,CAST(TCURX.CURRDEC AS INT) AS CURRDEC
                ,POWER(CAST(10 AS FLOAT),(2-CURRDEC)) AS TDEC 
        FROM 
                TCURX) AS TCURX ON 1=1
        AND TCURR_CC.FCURR = E.WAERS
Prerequisite: TDEC

See Lines 3, 4, 5, 16 in SQL Script above

SAP stores every value according to its decimal places (e.g. 100 JPY will be stored as 1,00 JPY = 1 JPY, see above). To calculate these numbers back to the real value, we need TDEC.

TDEC is calculated as follows:

TDEC = 10^(2 - CURRDEC)

As the standard amount of decimal places is 2, we calculate TDEC with 2 minus the number of decimal places to get back the correct amount.

Example:

100 JPY → SAP: 1,00 JPY = 1 JPY → 1 · 10^(2 - 0) = 1 · 100 = 100 JPY

100 TND → SAP: 1000 TND → 1000 · 10^(2 - 3) = 1000 · 0,1 = 100 TND
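The TDEC shift can be expressed as a one-line helper; the two examples above serve as a check:

```python
def tdec(currdec: int) -> float:
    """Factor that converts a stored SAP amount back to the real amount:
    TDEC = 10^(2 - CURRDEC)."""
    return 10.0 ** (2 - currdec)

real_jpy = 1.00 * tdec(0)  # JPY (CURRDEC = 0): stored 1,00 -> real 100
real_tnd = 1000 * tdec(3)  # TND (CURRDEC = 3): stored 1000 -> real 100
print(real_jpy, real_tnd)
```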

CASE 1 - The currency we want to convert already fits the desired output currency

See Line 3 in SQL Script above

Scenario: FROM currency = TO currency

The conversion is calculated as follows:

NETWR_CONVERTED = E.NETWR · ISNULL(TCURX.TDEC, 1)

Example 1: From EUR to EUR,

Given Data:

Value = 100 €

TCURX.CURRDEC (EUR) = NULL → For common currencies with 2 decimal places, the TCURX usually does not contain values!

→ TDEC: TCURX.TDEC = 10^(2 - NULL) = NULL → ISNULL(TCURX.TDEC, 1) = 1

NETWR_CONVERTED: 100 € → SAP: 100 €, as EUR has 2 decimal places anyway → 100€ · 1 = 100 €

Example 2: From JPY to JPY

Given Data:

Value = 100

TCURX.CURRDEC (JPY) = 0

→ TDEC: 10^(2 - 0) = 100 (not NULL) → ISNULL(TCURX.TDEC, 1) = 100

→ NETWR_CONVERTED: 100 JPY → SAP: 1,00 JPY = 1 JPY → 1 JPY · 10^(2 - 0) = 1 JPY · 100 = 100 JPY
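For the same-currency case, only the decimal correction applies; a minimal Python sketch (illustrative, mirroring Line 3 of the SQL script):

```python
def convert_same_currency(stored_value, currdec=None):
    # FROM currency == TO currency: no exchange rate, only the TDEC correction
    factor = 1 if currdec is None else 10 ** (2 - currdec)
    return stored_value * factor

print(convert_same_currency(100))   # EUR -> EUR: 100
print(convert_same_currency(1, 0))  # JPY stored as 1 -> 100 JPY
```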

CASE 2 - The currency we want to convert does not match the desired output currency and the exchange rate (TCURR.UKURS) is < 0

See Line 4 in SQL Script above

Scenario: FROM currency != TO currency, TCURR.UKURS < 0

What does UKURS < 0 mean in SAP? If the exchange rate is < 0, SAP uses the inverse of that rate, meaning that UKURS = -0,003 is used as 1/0,003. That is why we have to distinguish between cases with positive and negative exchange rates.

We also need to take into account that the conversion factors might differ between the FROM and the TO currency. Therefore we divide the FROM value by the FROM factor and multiply the result by the TO factor.

For negative exchange rates, the conversion is calculated as follows:

Example 1: From USD to EUR

Given Data:

Value = 100 USD

TCURX.CURRDEC (USD) = NULL

TCURR.UKURS = -1,15002

TCURF.FFACT = 1

TCURF.TFACT = 1

TDEC: 10^(2 - NULL) = NULL → ISNULL(TCURX.TDEC, 1) = 1

41193412.png

Example 2: From Indonesian Rupiahs (IDR) to Euro (EUR)

Given Data:

Value = 100 IDR

TCURX.CURRDEC (IDR) = 0

TCURR.UKURS = -12,54

TCURF.FFACT = 1000

TCURF.TFACT = 1

TDEC: 10^(2 - 0) = 100 → ISNULL(TCURX.TDEC, 1) = 100

41193413.png
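
The negative-rate branch can be sketched in Python (illustrative translation of Line 4 of the SQL script; falsy factors mimic the COALESCE fallback to 1):

```python
def convert_negative_rate(stored_value, currdec, ukurs, ffact, tfact):
    """UKURS < 0 means SAP stores the inverse rate, so divide by |UKURS|.
    FFACT/TFACT fall back to 1 when 0 or missing, like the COALESCE in the SQL."""
    tdec = 1 if currdec is None else 10 ** (2 - currdec)
    ffact = ffact if ffact else 1
    tfact = tfact if tfact else 1
    return stored_value * tdec * (1 / abs(ukurs)) / ffact * tfact

# Example 1: 100 USD at UKURS = -1.15002 -> ~86.96 EUR
print(round(convert_negative_rate(100, None, -1.15002, 1, 1), 2))
# Example 2: 100 IDR (stored) at UKURS = -12.54, FFACT = 1000 -> ~0.7974 EUR
print(round(convert_negative_rate(100, 0, -12.54, 1000, 1), 4))
```
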
CASE 3 - The currency we want to convert does not match the desired output currency and the exchange rate (TCURR.UKURS) is > 0

See Line 5 in SQL Script above

Scenario: FROM currency != TO currency, TCURR.UKURS > 0

We need to take into account that the conversion factors might differ between the FROM and the TO currency. Therefore we divide the FROM value by the FROM factor and multiply the result by the TO factor.

For positive exchange rates, the conversion is calculated as follows:

41193414.png

Example 1: From British Pounds (GBP) to Euro (EUR)

Given Data:

Value = 100 GBP

TCURX.CURRDEC (GBP) = NULL

TCURR.UKURS = 1,39751

TCURF.FFACT = 1

TCURF.TFACT = 1

TDEC: 10^(2 - NULL) = NULL → ISNULL(TCURX.TDEC, 1) = 1

41193415.png

Example 2: From Indonesian Rupiahs (IDR) to Euro (EUR)

Given Data:

Value = 100 IDR

TCURX.CURRDEC (IDR) = 0

TCURR.UKURS = 0,1073

TCURF.FFACT = 1000

TCURF.TFACT = 1

TDEC: 10^(2 - 0) = 100 → ISNULL(TCURX.TDEC, 1) = 100

41193416.png
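The positive-rate branch can likewise be sketched in Python (illustrative translation of Line 5 of the SQL script):

```python
def convert_positive_rate(stored_value, currdec, ukurs, ffact, tfact):
    """UKURS > 0: multiply by the rate directly.
    FFACT/TFACT fall back to 1 when 0 or missing, like the COALESCE in the SQL."""
    tdec = 1 if currdec is None else 10 ** (2 - currdec)
    ffact = ffact if ffact else 1
    tfact = tfact if tfact else 1
    return stored_value * tdec * ukurs / ffact * tfact

# Example 1: 100 GBP at UKURS = 1.39751 -> ~139.751 EUR
print(round(convert_positive_rate(100, None, 1.39751, 1, 1), 3))
# Example 2: 100 IDR (stored) at UKURS = 0.1073, FFACT = 1000 -> ~1.073 EUR
print(round(convert_positive_rate(100, 0, 0.1073, 1000, 1), 4))
```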
Q + A

Q: After I ran the transformations, the NETWR_CONVERTED column does not show any data for some or most of the conversions, especially for one or a few specific currencies. What is wrong?

A: This can have several reasons. In recent months, the following fixes resolved conversion problems:

  1. The exchange rate type (KURST = 'M') used in the standard script does not fit the customer's standard exchange rate type → Ask the customer which KURST is used and change the parameter in the EMS Data Integration! Make sure to rerun the transformations afterwards and to completely reload the data model to apply the changes.

  2. The date of the currency exchange is wrong, or there is no data in the conversion tables for the specific date → This problem is fixed by using a date range instead of a specific date. That is one of the main reasons why we add VALID_START and VALID_END to the Currency Conversion Tables (TCURR_CC and TCURF_CC). If there are still problems, let us know!

Q: In the scripts we talk about TCURR_CC and TCURF_CC, whereas in the article we talk about TCURR and TCURF. What is the difference?

A: TCURR_CC and TCURF_CC are created manually by us. They are based on the SAP source tables TCURR and TCURF and are enriched with additional data, like the valid start and valid end of the exchange rates, or a ranking of the exchange rates when there is more than one possibility. However, for the sake of simplicity, the article uses the SAP standard tables for the explanations. The logic is the same, as the _CC tables contain the same fields as the source tables.

Q: Why don't we use other fields - like EKKO.WKURS - to calculate the exchange rate?

A: The SAP standard way of converting currencies is the one implemented in the App Store Scripts. In some rare cases, customers used the EKKO.WKURS field to convert currencies. We are not 100% sure how valid the data stored in this field is. However, we believe that the data in WKURS may come from manual input during the purchasing process, so it may not represent the correct exchange rate.

Q: Why is TCURF needed when FFACT and TFACT also exist in TCURR?

A: The two fields in TCURR usually have no importance. Nevertheless, to handle eventualities, we included them in the standard scripts as the "second choice" for applying the factor to the UKURS.

Q: What does UKURS < 0 in SAP mean?

A: If the exchange rate is < 0, SAP uses the inverse of that rate, meaning that UKURS = -0,003 is used as 1/0,003. That is why we have to distinguish between cases with positive and negative exchange rates.

Q: The absolute exchange rates (column UKURS) in the table TCURR contain negative values - is that an indicator for bad data?

A: No. Negative exchange rates in SAP are the representation of the inverse. Example: UKURS = -1,1145 → exchange rate = 1/1,1145

Q: Why does SAP sometimes store exchange rates in negative and sometimes in positive values?

A: We currently do not know why this differs from case to case. Explanations are highly appreciated!

Multi-Event-Log Connectors for SAP ECC P2P-AP and O2C-AR

This documentation covers content, reasoning, and general concepts behind 2 process connectors in the Marketplace:

  • Multi-Event-Log Order-to-Cash & Accounts-Receivable (SAP ECC)

  • Multi-Event-Log Purchase-to-Pay & Accounts-Payable (SAP ECC)

Context

The 2 connectors unify the content of our four most popular process connectors:

  • SAP ECC - Order-to-Cash (O2C)

  • SAP ECC - Accounts-Receivable (AR)

  • SAP ECC - Purchase-to-Pay (P2P)

  • SAP ECC - Accounts-Payable (AP)

The reasoning behind this is that O2C and AR, as well as P2P and AP, are closely interconnected processes and are therefore often combined with each other. Combining them into one Data Model allows you to analyze the full end-to-end process. With the 2 ready-to-use connectors, we simplify the set-up and implementation effort.

Concepts

The benefits of the connectors can be tied to 3 key concepts which are applied throughout the templates.

  • Multi-Event-Log: The templates make full use of our Multi-Event-Log capabilities to provide end-to-end process insights. Every event has been assigned to an individual activity table and case table based on where the event logically takes place. Activities were even differentiated on header & item level (e.g. an event log for Purchase Order Headers and an event log for Purchase Order Items). The single activity tables are merged together at the data model level using the event log auto-merge functionality. This allows you to analyze the process either individually, by looking at the single event logs, or jointly, by looking at the combined activity table.

    P2P-AP: 26 activities (events) in 8 activity tables (event logs)

    O2C-AR: 29 activities (events) in 1 activity table (event log)

  • Object Modularization: The connector was modularized in a way that makes it easy to enable/disable certain parts based on your needs. An Object contains at least one Activity Table and has a dedicated Data Job that can be executed independently from the others. In addition, full support of the Replication Cockpit (Real-Time Extractions & Real-Time Transformations) is included. As Data Jobs can be enabled and disabled, tailoring the connectors to specific business use cases is now much simpler.

    The picture below gives an overview of how the defined Objects are organized into separate Data Jobs.

    60364240.png
  • Simplification of Transformations: By introducing the Multi-Event-Log and modularization approaches, the transformations could be simplified significantly. Several joins and filters were removed from the transformations and moved into the Data Model.

Content

The following objects are included:

MEL Process   "Source" Process   Object
P2P-AP        AP                 Credit Memo (Payable)
P2P-AP        AP                 Invoice (Payable)
P2P-AP        P2P                Purchasing Contract
P2P-AP        P2P                Purchase Orders
P2P-AP        P2P                Purchase Requisitions
P2P-AP        P2P                Quality Inspection
O2C-AR        AR                 Credit Memo (Receivable)
O2C-AR        AR                 Invoice (Receivable)
O2C-AR        O2C                Delivery
O2C-AR        O2C                Billing
O2C-AR        O2C                Quotation
O2C-AR        O2C                Sales Order
O2C-AR        O2C                Shipment

Data Model P2P-AP
60364241.png
Data Model O2C-AR
60364242.png
Optimal Data Job Scheduling

If you have multiple scheduled Data Jobs, the following are low-hanging fruit that will speed up your data pipeline and increase the freshness of your data.

Don’t schedule everything at the same time

Whether in the cloud or not, there is always a computer on the backend processing all the tasks. If it receives too many tasks at once, it becomes overloaded. This results in extended processing times and an increased risk of errors.

People instinctively schedule everything at full hours (e.g. 10:00). This results in spikes in the utilization of compute resources, while before the full hour the resources are underutilized.

55706985.png

If you have scheduled more than one job, make sure they are not set to run at the same time. Let's assume you have 3 jobs, each running every hour on the hour for 20 minutes. The compute resource is used heavily for 20 minutes and remains idle for 40 minutes every hour. Instead, schedule the jobs to run one after another, i.e. one job at the full hour (x:00), one 20 minutes past (x:20), and one 40 minutes past (x:40). After the change, you will most likely notice that the tasks run in less than 20 minutes. To check at what times the jobs run, check the Schedules and the Data Job logs.
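Staggered start times like the x:00/x:20/x:40 pattern above can be computed generically; a small illustrative sketch:

```python
def staggered_offsets(num_jobs, cycle_minutes=60):
    # Spread job start times evenly across one scheduling cycle
    step = cycle_minutes // num_jobs
    return [i * step for i in range(num_jobs)]

print(staggered_offsets(3))  # [0, 20, 40] -> schedule at x:00, x:20, x:40
```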

55706986.png
In case of idle time, increase schedule frequency

If you run your data jobs, for example, once a day and they take 3 hours to complete (after implementing the recommendation above), the compute resource remains idle for 21 hours. If that's the case, why not increase the schedule frequency? The data will be more up to date, and in every company there is always a time when someone needs fresher data. Schedule your jobs so that the idle periods are short, with a "buffer" break between jobs to compensate for fluctuations in duration. How long should the buffer be? That depends on how large the fluctuations in your data job run times are; you can calculate them based on the data in the Data Job logs.

Table statistics

This can help you shorten data job run times (transformations), reduce infrastructure workload, and solve out-of-memory issues.

Many queries can be executed in several different ways. For example, the order in which tables are inner-joined can vary without any impact on the end result. However, as tables differ in size, the join order can have an impact on query run times and on CPU and memory consumption. To find the optimal way of executing a query, databases estimate the cost of all operations and choose the execution plan with the lowest cost. To estimate the cost, databases rely on table statistics, i.e. metadata with basic characteristics of a table (e.g. the row count). The more accurate and up-to-date the statistics, the better the cost estimation and the more optimal and faster the query execution. Analyzing tables for statistics is usually a matter of seconds, even for the biggest tables.

When should you analyze tables for statistics

How to check if the table has statistics

Before running a SELECT query, add EXPLAIN at the very beginning (EXPLAIN SELECT ...). This produces a query plan. If no statistics are available, this information will be included in the query plan next to the table name. In the example below, table BKPF has no statistics:

Access Path:
+-STORAGE ACCESS for bkpf [Cost: 798, Rows: 7K (NO STATISTICS)] (PATH ID: 1)

How to get or update the table statistics

In Vertica, run the query: SELECT ANALYZE_STATISTICS ('table_name');

For example:

SELECT ANALYZE_STATISTICS ('BKPF');

How to integrate the statistics query in process connectors

  • Intermediary tables created in transformations (primary use case)

If you create a new table and populate it with data in the transformations, simply add a SELECT ANALYZE_STATISTICS ('table_name'); statement to the task, after the queries that create the table and insert the data. It doesn't make sense to analyze tables that are not used anywhere else in the transformations.

  • Extracted tables

You might analyze tables after extracting them, if the extracted tables do not already have statistics (check this using the EXPLAIN command). As of June 11th 2021, all newly extracted tables should have statistics by default. Should the statistics not be available, you might want to create a new task, the first one in the transformations, with multiple statistics queries, one for each table.

50727273.png

Content of the Analyze extracted tables task:

select analyze_statistics ('bkpf');

select analyze_statistics ('bsid');

select analyze_statistics ('bsad');

select analyze_statistics ('cdhdr');

select analyze_statistics ('cdpos');

select analyze_statistics ('dd02t');

select analyze_statistics ('dd03m');

select analyze_statistics ('kna1');

select analyze_statistics ('knb1');

select analyze_statistics ('knb5');

select analyze_statistics ('likp');

  • Tables modified in transformations

If you change a table (add columns, rows, etc.), include the statistics query after the query that performs the change.
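For any of the cases above, the statements can also be generated programmatically when you maintain many tables. A hypothetical Python helper (the table names are only examples):

```python
def analyze_statements(tables):
    # Build one Vertica ANALYZE_STATISTICS call per table name
    return ["SELECT ANALYZE_STATISTICS ('{}');".format(t) for t in tables]

# Example: generate the statements for a few extracted tables
for stmt in analyze_statements(["bkpf", "bsid", "bsad"]):
    print(stmt)
```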

Further reading

https://www.vertica.com/docs/9.2.x/HTML/Content/Authoring/AdministratorsGuide/Statistics/AnalyzingTableStatistics.htm

TS Columns

On this page, the App Store team provides an overview of the TS Columns.

What is a TS Column?

TS Columns (TimeStamp Columns) can be seen in the Standard Transformation scripts. They contain an original column explicitly cast to the Timestamp, Date, or Time data type. For instance, a view containing the column AEDAT will also contain the column TS_AEDAT. TS Columns are inserted into the Standard Views so that they are part of the Standard Data Models.

Why are they needed in App Store Apps and Standard Transformations?

These columns are needed because, in many cases, customers store timestamps, dates, or times as Strings or other non-standard data types. To avoid altering the original column in the tables, TS Columns are inserted into the Standard Views, and App Store Apps use these columns instead of the original ones. This guarantees that the information is in the right format and that the different KPIs and calculations can be performed without having to use casting functions inside an App.

These columns ensure compatibility and the proper out-of-the-box use of the standard date, timestamp, and time columns in the App Store Apps. If the TS Columns are removed from the Transformations and not loaded into the Data Models, the App Store Apps will not work, and the Date, Timestamp, and Time columns would need to be altered in every component using them.

Example - Order to Cash - SAP ECC

The following snippet was taken from the Standard Order to Cash Vertica Script. It contains the creation of the VBAP View that is inserted into the Order to Cash Data Model. In this case, the table contains 5 date columns (ABDAT, AEDAT, ERDAT, STADAT, and STDAT). These dates are explicitly cast to the Date type and included in the View with the "TS_" naming convention. This gives users access to both the original column and the cast column while building Apps.

22118629.png
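The idea behind a TS column, translated to Python for illustration (SAP commonly stores dates as 'YYYYMMDD' strings; the SQL views perform the equivalent cast, and '00000000' handling is an assumption about SAP's initial value):

```python
from datetime import date, datetime

def ts_column(sap_date):
    """Mimic a TS_ cast: SAP date string 'YYYYMMDD' -> a real date value.
    '00000000' is SAP's initial value and maps to no date (None)."""
    if not sap_date or sap_date == "00000000":
        return None
    return datetime.strptime(sap_date, "%Y%m%d").date()

print(ts_column("20210611"))  # 2021-06-11
print(ts_column("00000000"))  # None
```
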
Conclusion

It is recommended not to remove the TS columns from the Standard Transformations. If you exclude these columns, be aware that the Apps will not work and will require modifications.