consenso Intelligent Data Modelling

Our digital building blocks for optimising the creation and maintenance of master and transaction data in your SAP system.

Data quality as a success factor. The key to increasing sales and reducing costs

Deficiencies in your master data processes and in the quality of your master and transaction data lead to lost sales and higher costs down the line: unreached customers, incorrect prices, problems in the supply chain, and increased costs caused by the complex and often manual (re-)maintenance of data.

These problems can be quantified: insufficient data quality causes losses of 1.3 trillion dollars per year in US companies. 47% of companies state that they have lost customers due to poor data, and 33% report a negative impact on sales.

With Intelligent Data Modelling, consenso offers you a consulting solution to sustainably improve the quality of your master and transaction data in your SAP S/4HANA system - integrated into standard SAP and without additional licence costs.

One service - many modules. Solutions to meet your requirements

One of the many benefits of our Intelligent Data Modelling service is its modular structure. Our service provides you with a toolbox of several independent modules supporting you throughout the entire data maintenance process:

EXTRACT - TRANSFORM - LOAD (ETL)

Our solution for data integration. With ETL, you can load data of widely varying formats and content in real time and carry out comprehensive validation and cleansing of that data.

The details:

Our ETL tool allows you to process files comprehensively, regardless of their format or structure. Structured database tables as well as semi-structured Excel, CSV or TXT files can be processed seamlessly, making it much easier to integrate data from different sources. In addition, various types of interfaces can be connected.

The flexibility of ETL allows you to map and adapt the fields supplied in the input file (e.g. supplier price lists) as required. This is done in a user-friendly way via drag & drop. A variety of transformation functions is available to convert the data in the source file so that it matches the target data schemas and business requirements. Conditions can be linked to these functions to further refine the output.
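
To make this concrete, the following minimal Python sketch shows how such a field mapping with transformation functions and conditions can be thought of. The field names, the FieldMapping structure and the gross-to-net conversion are purely illustrative assumptions; in the actual solution the mapping is configured via drag & drop in the Fiori app, not written as code.

```python
# Illustrative sketch of a field mapping with transformation functions and
# optional conditions, applied to a supplier price list row by row.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class FieldMapping:
    source: str                               # column in the supplier file
    target: str                               # field in the target data schema
    transform: Callable[[str], object] = str  # transformation function
    condition: Optional[Callable[[dict], bool]] = None  # apply only if true

# Example mappings: rename fields, normalise the price format and
# convert gross to net prices only for rows flagged as gross.
mappings = [
    FieldMapping("Artikelnummer", "MATNR", transform=lambda v: v.strip().zfill(18)),
    FieldMapping("Preis", "NETPR",
                 transform=lambda v: round(float(v.replace(",", ".")), 2)),
    FieldMapping("Preis", "NETPR",
                 transform=lambda v: round(float(v.replace(",", ".")) / 1.19, 2),
                 condition=lambda row: row.get("Preisart") == "Brutto"),
    FieldMapping("Waehrung", "WAERS", transform=str.upper),
]

def apply_mappings(row: dict) -> dict:
    """Transform one input row into the target schema."""
    out = {}
    for m in mappings:
        if m.condition is None or m.condition(row):
            out[m.target] = m.transform(row[m.source])
    return out

supplier_row = {"Artikelnummer": "4711", "Preis": "11,90",
                "Preisart": "Brutto", "Waehrung": "eur"}
print(apply_mappings(supplier_row))
# {'MATNR': '000000000000004711', 'NETPR': 10.0, 'WAERS': 'EUR'}
```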

The structured and intuitive interface of the Fiori app also allows users without IT knowledge to define the transformation steps and when they are executed. To process a new price list, you simply create and trigger a run that executes the predefined mapping steps. Delta loading creates or updates only the data that has changed since the last update. Once the initial definition has been made, the ETL tool can integrate further data into the system automatically.
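
The delta-loading idea can be illustrated with a small Python sketch. The key field and the in-memory snapshot of the previous run are simplifying assumptions; in the real solution the comparison is carried out against the tables in the SAP system.

```python
# Minimal sketch of delta loading: compare the incoming records with the
# state of the previous run and touch only what actually changed.
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Stable hash of a record's content, used to detect changes."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def delta(previous: dict[str, dict], incoming: list[dict], key: str = "MATNR"):
    to_create, to_update = [], []
    for record in incoming:
        old = previous.get(record[key])
        if old is None:
            to_create.append(record)               # new record
        elif fingerprint(old) != fingerprint(record):
            to_update.append(record)               # changed record
        # unchanged records are skipped entirely
    return to_create, to_update

previous_run = {"A1": {"MATNR": "A1", "NETPR": 9.99}}
new_file     = [{"MATNR": "A1", "NETPR": 10.49},   # price changed -> update
                {"MATNR": "B2", "NETPR": 4.20}]    # unknown       -> create
to_create, to_update = delta(previous_run, new_file)
print(to_create, to_update)
```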


DATA MINING & PROFILING (DMP)

DMP helps you analyse data in order to gain a clear understanding and overview of its structure, content and quality.

The details:

The definition of data quality rules and their systematic classification in the repository creates a clear structure for the quality guidelines. Both the BRF+ included in S/4HANA and an integrated DMP function enable intuitive modelling and management in order to implement the rules and integrate complex business logic into the system. The user interface of the Fiori app also allows users without IT knowledge to manage the quality check of the data.

An integral part of rule management is the configuration of a data quality index, which defines how the index is calculated from the master data in conjunction with the defined data quality rules. The data quality index provides a quantitative metric for evaluating data quality - the basis for introducing targeted measures to improve it.
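
As an illustration, the following Python sketch computes such an index as the weighted share of passed rule checks across a set of material records. The rules, weights and field names are illustrative assumptions only; in the solution itself the rules are modelled in BRF+ or the DMP repository and the index configuration is maintained there.

```python
# Illustrative data quality index: each rule checks one aspect of a record,
# and the index is the weighted share of passed checks across all records.
rules = [
    ("EAN is 13 digits",      2.0, lambda r: len(r.get("EAN", "")) == 13 and r["EAN"].isdigit()),
    ("Net price is positive", 3.0, lambda r: r.get("NETPR", 0) > 0),
    ("Material group filled", 1.0, lambda r: bool(r.get("MATKL"))),
]

def quality_index(records: list[dict]) -> float:
    """Weighted percentage of rule checks that pass over all records."""
    total = passed = 0.0
    for record in records:
        for _, weight, check in rules:
            total += weight
            if check(record):
                passed += weight
    return round(100.0 * passed / total, 1) if total else 100.0

materials = [
    {"EAN": "4006381333931", "NETPR": 12.5, "MATKL": "BEVERAGES"},
    {"EAN": "12345",         "NETPR": 0.0,  "MATKL": ""},   # violates all three rules
]
print(quality_index(materials), "%")   # 50.0 %
```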

The defined data quality rules are used for active analyses and checks of data. Check runs can be controlled easily and efficiently in a clear user interface and set up as a job. This enables continuous monitoring and evaluation of data quality. Potential quality problems can be recognised and rectified at an early stage.

Integration into the Data Process Stream (DPS) module enables automated cleansing of incorrect data. In addition, connecting other systems via mechanisms such as Smart Data Access enables access to external data and corresponding checks. By integrating the defined rules into the new creation process, potentially incorrect entries can be prevented right from the start.


DATA DUPLICATE CHECK (DDC)

Our DDC supports you in identifying and processing duplicates in your master and transaction data.

The details:

DDC enables you to efficiently manage individual duplicate runs. Customer-specific evaluations with various check rules are defined via customizing. The module allows you to check all SAP tables - and therefore both master and transaction data - using these rules. Once the initial definition has been made, the checks can be carried out automatically via a user-friendly interface.

In addition to checking internal data in the system, an integrated import function also enables data to be imported directly from an Excel file, for example, and compared with the database tables in the system. Use cases include classic migration work (e.g. as part of an S/4HANA conversion) or price comparisons (e.g. matching article prices obtained via web crawling against articles already in the system).

When running a check, conditions are set to limit the data analysed and further refine the result. Once the duplicate check has been successfully completed, clear result values are presented for each potential duplicate. These results provide a structured basis for the targeted processing and cleansing of duplicates.
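
The principle behind such a duplicate run can be sketched in a few lines of Python: restrict the data set via a condition, score each candidate pair with check rules and report pairs above a threshold. The fields, weights and the threshold used here are illustrative assumptions and not the actual DDC customizing.

```python
# Sketch of a simple duplicate run over business partner records.
from difflib import SequenceMatcher
from itertools import combinations

def name_similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def duplicate_run(records: list[dict], condition, threshold: float = 0.8):
    candidates = [r for r in records if condition(r)]   # limit the data analysed
    results = []
    for left, right in combinations(candidates, 2):
        # weighted combination of check rules gives a result value per pair
        score = (0.7 * name_similarity(left["NAME"], right["NAME"])
                 + 0.3 * (1.0 if left["CITY"] == right["CITY"] else 0.0))
        if score >= threshold:
            results.append((left["ID"], right["ID"], round(score, 2)))
    return results

business_partners = [
    {"ID": "100001", "NAME": "Muster GmbH",      "CITY": "Bielefeld", "COUNTRY": "DE"},
    {"ID": "100002", "NAME": "Muster GmbH & Co", "CITY": "Bielefeld", "COUNTRY": "DE"},
    {"ID": "100003", "NAME": "Other AG",         "CITY": "Hamburg",   "COUNTRY": "DE"},
]
print(duplicate_run(business_partners, condition=lambda r: r["COUNTRY"] == "DE"))
# [('100001', '100002', 0.87)]
```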

The duplicate check can not only be applied to existing data, but can also be integrated into the dialogue for creating new data.

ANOMALY DETECTION | RULE-MINING (DEA)

DEA (Data Embedding and Augmentation) can be used to recognise hidden patterns in large master data sets in order to generate correction suggestions for master and transaction data and support decisions with the help of artificial intelligence.

The details:

In addition to our existing set of rules, we also offer extended data mining solutions based on PAL (Predictive Analysis Library) and APL (Automated Predictive Library). These solutions enable multi-faceted data analysis, including classification, clustering (profiling) and anomaly detection.

The Predictive Analysis Library offers powerful statistical algorithms for comprehensive data analyses. Its association algorithm for rule mining recognises patterns and rules in the data, evaluates them statistically and thus makes hidden correlations and anomalies visible. After plausibility checks, the identified rules provide clear insights into how the data is related. In master and transaction data, the association algorithm reveals how and why certain data elements are linked to each other - a practical example is analysing shopping baskets to determine which items frequently appear together in transactions. Users thus benefit from automated analyses that uncover correlations in their data.

In addition to the PAL algorithms, we use various statistical methods, including AI libraries in Python, to identify anomalies in the data. An example use case is the analysis of deviations in ordering or bank transfer processes.
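
To illustrate what the association algorithm described above computes, the following pure-Python sketch derives simple pair rules with support and confidence from a handful of shopping baskets. It is a conceptual illustration only; the productive analyses run on the PAL and APL algorithms in SAP HANA.

```python
# Conceptual sketch of association rule mining on shopping baskets:
# count how often item pairs occur together and derive rules with
# support (share of baskets containing both items) and confidence.
from collections import Counter
from itertools import combinations

baskets = [
    {"beer", "crisps", "lemons"},
    {"beer", "crisps"},
    {"beer", "nappies"},
    {"crisps", "lemons"},
]

item_counts = Counter(item for basket in baskets for item in basket)
pair_counts = Counter(frozenset(p) for basket in baskets
                      for p in combinations(sorted(basket), 2))

n = len(baskets)
for pair, together in pair_counts.items():
    a, b = sorted(pair)
    support = together / n                   # share of baskets with both items
    confidence = together / item_counts[a]   # P(b in basket | a in basket)
    if support >= 0.5 and confidence >= 0.6:
        print(f"{a} -> {b}: support={support:.2f}, confidence={confidence:.2f}")
# beer -> crisps: support=0.50, confidence=0.67
# crisps -> lemons: support=0.50, confidence=0.67
```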

The results are presented in clear dashboards that enable user-friendly visualisation and interpretation. From there, users can seamlessly process the data to quickly make informed decisions and derive measures. Depending on the application, we integrate the results into the various modules of our iDM toolkit.


DATA PROCESS STREAM (DPS)

DPS acts as the bracket that holds the modules of our Intelligent Data Modelling together: transformed data from ETL is checked by DMP, matched with suitable articles (DDC), enriched with the help of AI (DEA) and integrated into the system.

The details:

DPS enables the mapping of processes to check and cleanse data and carry out transactions. This makes it possible to update or create master data or generate transaction data such as purchase orders.

The individual steps and their sequence can be freely configured via customizing. Our "Data Mining and Profiling" and "Duplicate Check" applications can be seamlessly integrated, and existing analyses can be reused in the various apps. The integration of AI algorithms makes it possible, for example, to make intelligent predictions for product groups or to shorten material descriptions to 40 characters. Suggestions made by the system can be overridden by the user at any time.
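
Conceptually, such a process stream can be pictured as an ordered list of steps applied to each record, with suggestions that the user can override before posting. The following Python sketch uses invented step names and a simple 40-character truncation in place of the AI suggestion; the real sequence of steps is maintained in customizing, not in code.

```python
# Sketch of a configurable process stream: each step may attach messages or
# suggestions, and the user can override suggestions before posting.
def quality_check(record):
    record.setdefault("messages", []).append(
        "OK" if record.get("NETPR", 0) > 0 else "Price missing")
    return record

def suggest_short_text(record):
    # stands in for an AI-generated suggestion; here simply truncated to 40 chars
    record["MAKTX_SUGGESTED"] = record["DESCRIPTION"][:40]
    return record

def apply_user_overrides(record):
    # suggestions made by the system can be overridden at any time
    record["MAKTX"] = record.get("MAKTX_OVERRIDE") or record["MAKTX_SUGGESTED"]
    return record

# In the real solution the sequence of steps is configuration, not code.
process_stream = [quality_check, suggest_short_text, apply_user_overrides]

record = {"DESCRIPTION": "Organic wheat beer, alcohol-free, returnable bottle 0.5l",
          "NETPR": 11.9}
for step in process_stream:
    record = step(record)
print(record["MAKTX"])   # 'Organic wheat beer, alcohol-free, return'
```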

The application therefore offers the possibility of mapping numerous processes in which data is transformed, prepared and then further processed in the system. In addition to the creation of articles and business partners, other examples include the entry of orders via Excel and the checking and posting of sales data from various transactions.


Key facts in a nutshell

We would be happy to give you a personal presentation on the many functions and potential benefits of consenso Intelligent Data Modelling and evaluate with you how we can best meet your individual requirements.

Just get in touch with us - we look forward to hearing from you! To the contact form

consenso Intelligent Data Modelling. Successfully proven in practice...

Use the building blocks of our digital service consenso Intelligent Data Modelling when creating and maintaining your master and transaction data. Not only will you increase your digital maturity level, but you will also quickly create measurable benefits.

Examples of specific use cases can be found in our projects:

 

  • During article creation, various supplier price lists are transformed using ETL, compared with existing articles using DDC, checked using DMP and finally integrated into the system.
  • In property management, service charge settlements with different structures are transformed using ETL, checked for anomalies, then automatically assigned to accounts and allocated to cost centres.
  • During order creation, orders with different formats and structures are loaded into the system using ETL. DDC is used to check whether articles need to be updated or newly created before a purchase order is automatically generated.
  • Partners upload their sales data daily, which is automatically transformed by ETL, checked for anomalies and correctness and then posted in the system.

Our customers speak for us! Examples of the use of Intelligent Data Modelling modules in practice can also be found in our reference projects:

  • In September 2023, the Bavarian ERDINGER wheat beer brewery commissioned consenso to analyse its current business partner and material master data. The goal: to significantly improve their quality and identify potential to increase it permanently. The status: Mission successfully completed!
    Learn more
  • To ensure that article maintenance at AGRAVIS Raiffeisen AG can be carried out in the long term with as little effort as possible, the manual work of the master data management team will in future be supported by artificial intelligence.
    Learn more

You can find more reports about us and our projects in our project stories.