Celonis Product Documentation

Real Time Transformations FAQ

General

What are Real Time Transformations?

Real-Time Transformations are a new concept that enables transforming only the records that have not been processed yet. This avoids dropping and recreating data that has already been processed, with the benefits of:

  • Fast and effective transformations

  • Robust and error-free transformations

  • Predictable runtimes

Who should consider using Real Time Transformations?

The feature is tailored to support real-time connectivity use cases. Anyone who plans to use Celonis applications and Action Engine should first achieve real-time connectivity.

In general, Real Time Transformations are an effective countermeasure against long-running and unstable transformations.

Which source systems are supported?

Currently, Real Time Transformations are only supported for SAP.

Will a migration be required for existing customers?

Real Time Transformations follow a different processing logic compared to transformations executed in Data Jobs. Migrating requires some effort and changes to the transformation scripts.

Further reference: Real Time Transformations: Setup Guide

Which process templates are available to import?

Templates for O2C and P2P (both SAP ECC) can be installed via the Marketplace or via the import functionality in the Replication Cockpit.

Note

If you are interested in the AP or AR real-time process templates, we are happy to collaborate with you.

In this case, please reach out to us via Service Desk.

Technical

What are the pre-requisites?

Real-Time Transformations are part of the Replication Cockpit. Therefore, the same prerequisites apply:

  • Full Cloud

  • SAP Real Time Extractor already installed

  • Min. version of RFC Module: 2.0.0

  • Min. version of Extractor: 2020-10-19

How do Real Time Transformations differentiate from transformations that are executed in Data Jobs?

Batch Transformation (Data Jobs):

  • With the Batch Transformation approach, the target table of the transformation is dropped and re-created from scratch for every execution.

  • All records are processed with each execution.

Real Time Transformation:

  • With the Real Time Transformation approach, this table is not dropped anymore.

  • Instead, the new/changed entries (the delta) are inserted or merged into the table (see the sketch below).
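
As a minimal sketch of the difference (the target table O2C_VBAK and the selected columns are illustrative assumptions; the staging table name follows the _CELONIS_TMP_<TABLE>_TRANSFORM_DATA pattern explained in the staging table question below):

    -- Batch Transformation (Data Jobs): the target is rebuilt from scratch on every run
    DROP TABLE IF EXISTS O2C_VBAK;
    CREATE TABLE O2C_VBAK AS
    SELECT MANDT, VBELN, ERDAT, AUART
    FROM VBAK;

    -- Real Time Transformation (Replication Cockpit): the target is kept and only
    -- the extracted delta is merged in (delete-then-insert by document key)
    DELETE FROM O2C_VBAK
    WHERE VBELN IN (SELECT VBELN FROM _CELONIS_TMP_VBAK_TRANSFORM_DATA);

    INSERT INTO O2C_VBAK (MANDT, VBELN, ERDAT, AUART)
    SELECT MANDT, VBELN, ERDAT, AUART
    FROM _CELONIS_TMP_VBAK_TRANSFORM_DATA;

The delete-then-insert pair acts as a merge: changed documents are replaced with their latest version, new documents are simply appended.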

Should I move all transformations in the Replication Cockpit to run in Real Time mode?

In theory, every transformation can be converted. However, there are some scripts for which it does not make sense, such as the creation of the currency conversion tables; those should remain in Data Jobs.

Also, a Full Load is currently not supported in the Replication Cockpit. This use case still needs to be covered by the Data Jobs.

What is a Trigger table?

Real Time Transformations are executed each time new records are extracted to a specific table. This means that you need to map each transformation statement to a table whose extraction should trigger the transformation. This table is called a Trigger Table.

What is a Staging table?

Staging tables are intermediary tables that allow you to execute Real Time Transformations. They contain only a specific set of records. For every table that is extracted via the Replication Cockpit, two corresponding staging tables exist:

  • Transform Staging Table: contains only the records that have been created or updated after the last extraction (e.g. for KNA1 it is _CELONIS_TMP_KNA1_TRANSFORM_DATA). This table is deleted after each successful Replication execution.

  • Delete Staging Table: contains only the records that have been deleted from the table in the source system (e.g. for KNA1 it is KNA1_DELETED_DATA). This table needs to be cleared manually after the data has been used. Deletions are currently not part of the standard delta transformation approach, but they can be added if needed.

What are Dependent tables?

The dependent tables of a transformation are all tables that need to be up to date at the point when the transformation is executed. The Replication Cockpit automatically applies a special logic so that only those records of the staging table are taken into account for which corresponding updates exist in the dependent tables.
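
As a rough illustration of the effect (the Cockpit applies this logic automatically once the dependency is configured, so you do not write the filter yourself; the target table P2P_ORDER_ITEMS and the column list are assumptions made for the example):

    -- Transformation triggered by EKPO with EKKO configured as a dependent table:
    -- only EKPO delta records with a corresponding EKKO header are taken into account
    INSERT INTO P2P_ORDER_ITEMS (MANDT, EBELN, EBELP, MATNR, BUKRS)
    SELECT EKPO.MANDT, EKPO.EBELN, EKPO.EBELP, EKPO.MATNR, EKKO.BUKRS
    FROM _CELONIS_TMP_EKPO_TRANSFORM_DATA AS EKPO
    INNER JOIN EKKO
        ON EKKO.MANDT = EKPO.MANDT
       AND EKKO.EBELN = EKPO.EBELN;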

Do Real Time Transformations support parameters?

In the Replication Cockpit, you can use and reference all Data Pool Parameters. However, unlike Data Jobs with their job-specific parameters, there is no concept of replication-specific parameters.
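
For illustration, a short sketch assuming the <%=parameter_name%> placeholder syntax known from Data Jobs transformations; the parameter sales_orgs and the target table O2C_VBAK are made-up examples:

    -- Restrict the processed delta by a Data Pool Parameter (here: a hypothetical
    -- list of sales organizations called sales_orgs)
    INSERT INTO O2C_VBAK (MANDT, VBELN, ERDAT, AUART)
    SELECT MANDT, VBELN, ERDAT, AUART
    FROM _CELONIS_TMP_VBAK_TRANSFORM_DATA
    WHERE VKORG IN (<%=sales_orgs%>);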

Do you know how long the transformations will take with this tool?

We have done several pilots. Here are the overall numbers:

Pilot A

  • 33 transformations in 23 tables

  • longest-running table: CDHDR

  • 6 transformation scripts

  • Avg: 3.5 min (350 records)

Pilot B

  • 16 transformations in 8 tables

  • longest-running table: VBAK

  • 2 transformation scripts

  • Avg: 1 min (100 records)

Pilot C

  • 44 transformations in 22 tables

  • longest-running table: CDHDR

  • 7 transformation scripts

  • Avg: 10 min (2,500 records)

Could it happen that the Replication Time is quicker than the Transformation - so that the staging table changes in the middle of the Transformation?

The Transformation is always part of the Replication (Replication = Extraction + Transformation). This means that a new extraction will not be executed (and hence the staging table will not change) until the transformation has finished.

How are those staging tables created, especially for custom tables that are customer-specific?

The staging table is created automatically before executing the transformation. You don't need to do anything manually.

What if we only need a LEFT JOIN (e.g. from VBAP to VBAK): will the dependencies still be applied, and will entries from VBAP that are not in VBAK also be cut?

It is up to the user whether to define the dependency; this is done manually. For an INNER JOIN, defining the dependency is a must; for a LEFT JOIN it is optional, assuming you are okay with having NULL values. If you want to enforce consistency and avoid NULLs, you have to define the dependency, and the technology behind the scenes will make sure that only the records from VBAP that have equivalents in VBAK are processed.
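
A minimal sketch of both variants; the target table O2C_SALES_ORDER_ITEMS and the column selection are assumptions made for the example:

    -- Without a dependency on VBAK: every VBAP delta record is processed,
    -- header columns such as AUART may be NULL if VBAK is not yet up to date
    INSERT INTO O2C_SALES_ORDER_ITEMS (MANDT, VBELN, POSNR, MATNR, AUART)
    SELECT VBAP.MANDT, VBAP.VBELN, VBAP.POSNR, VBAP.MATNR, VBAK.AUART
    FROM _CELONIS_TMP_VBAP_TRANSFORM_DATA AS VBAP
    LEFT JOIN VBAK
        ON VBAK.MANDT = VBAP.MANDT
       AND VBAK.VBELN = VBAP.VBELN;

    -- With VBAK defined as a dependent table: an INNER JOIN is safe, because only
    -- VBAP delta records with a matching VBAK header are taken into account
    INSERT INTO O2C_SALES_ORDER_ITEMS (MANDT, VBELN, POSNR, MATNR, AUART)
    SELECT VBAP.MANDT, VBAP.VBELN, VBAP.POSNR, VBAP.MATNR, VBAK.AUART
    FROM _CELONIS_TMP_VBAP_TRANSFORM_DATA AS VBAP
    INNER JOIN VBAK
        ON VBAK.MANDT = VBAP.MANDT
       AND VBAK.VBELN = VBAP.VBELN;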

Do Real-Time Transformations also handle deleted records?

Yes, deletions can be handled too, but a different approach is required. The deleted records are pushed to a separate staging table, e.g. VBAP_DELETED_DATA, and then you need to set up transformations that remove these records from the source and data model tables. We have not done this for our standard process connectors, but it works in principle (see the sketch below).

You can also choose to delete the records directly from the table, i.e. VBAP. In that case you lose the flexibility of executing a transformation against them. However, you can set up nightly or weekly drop/re-create transformations that make sure the deleted records are handled.
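
A minimal sketch of the staging-table approach mentioned above; O2C_SALES_ORDER_ITEMS is an assumed transformed table, while VBAP_DELETED_DATA is the Delete Staging Table named in the answer:

    -- Remove deleted order items from the raw table and from the (assumed)
    -- transformed table, using the keys collected in the Delete Staging Table
    DELETE FROM VBAP
    WHERE EXISTS (
        SELECT 1 FROM VBAP_DELETED_DATA AS del
        WHERE del.MANDT = VBAP.MANDT
          AND del.VBELN = VBAP.VBELN
          AND del.POSNR = VBAP.POSNR
    );

    DELETE FROM O2C_SALES_ORDER_ITEMS
    WHERE EXISTS (
        SELECT 1 FROM VBAP_DELETED_DATA AS del
        WHERE del.MANDT = O2C_SALES_ORDER_ITEMS.MANDT
          AND del.VBELN = O2C_SALES_ORDER_ITEMS.VBELN
          AND del.POSNR = O2C_SALES_ORDER_ITEMS.POSNR
    );

    -- The Delete Staging Table has to be cleared manually once it has been consumed
    DELETE FROM VBAP_DELETED_DATA;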

Do we have an update/insert indicator in the trigger table to distinguish between new and updated entries?

Yes, there is a column CL_CHANGE_TYPE with the values I (insert) and U (update).
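
For example, a transformation could use the indicator to treat inserts and updates differently. A minimal sketch, assuming the column is available on the Transform Staging Table of the trigger table and that O2C_ACTIVITIES is a made-up activity table:

    -- Write a "Create Sales Order" activity only for newly inserted headers;
    -- updated records (CL_CHANGE_TYPE = 'U') are deliberately skipped here
    INSERT INTO O2C_ACTIVITIES (MANDT, VBELN, ACTIVITY_EN, EVENTTIME)
    SELECT MANDT, VBELN, 'Create Sales Order', ERDAT
    FROM _CELONIS_TMP_VBAK_TRANSFORM_DATA
    WHERE CL_CHANGE_TYPE = 'I';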