consenso Intelligent Data Modelling

Our digital building blocks for optimising the creation and maintenance of master and transaction data in your SAP system.

Data quality as a success factor. The key to increasing sales and reducing costs

Deficiencies in your master data processes and in the quality of your master and transaction data lead to lost sales and higher costs down the line: unreached customers, incorrect prices, problems in the supply chain, and increased expense for the complex, often manual (re-)maintenance of data.

The problems can be quantified: studies estimate that insufficient data quality costs US companies 1.3 trillion dollars per year, 47% of companies report having lost customers due to poor data, and 33% report a negative impact on sales.

Digitalisation, too, is closely linked to data quality. If you want to benefit quickly and permanently from the advantages of artificial intelligence in your company, high-quality data is the key to success. Without high-quality, structured and relevant data, even the best AI model cannot deliver reliable or economically viable results.

With Intelligent Data Modelling (iDM), consenso offers you a consulting solution to sustainably improve the quality of your master and transaction data in your SAP S/4HANA system - integrated into standard SAP, applicable to all SAP objects and without additional licence costs.

One service - many modules. Solutions to meet your requirements, proven in practice many times over

One of the many benefits of our Intelligent Data Modelling service is its modular structure. Our service comes as a modular system with several independent modules or apps that can be used flexibly, either individually or as part of a process. A no-code approach ensures independent, individual use without IT support.

Our reference examples illustrate the areas in which the modules of our Intelligent Data Modelling toolkit are already being used to benefit our customers:

Use Cases DATA MINING & PROFILING

  • SALES PRICES: Are there fixed sales prices for all items? Is there a valid price information record for each item?
  • MATERIAL MASTER RECORD: Is the brand stored in accordance with the brand directory? Is the manufacturer specified for the material?
  • BUSINESS PARTNER: Do first and last names contain special characters? Does the postcode comply with the country-specific syntax? (A minimal sketch of such a check follows this list.)
  • SERIAL NUMBER: Are identical serial numbers assigned to different devices? If so, the affected devices are flagged with a corresponding message.
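
To illustrate what such a rule can look like, here is a minimal Python sketch of a country-specific postcode syntax check. The country codes, patterns and field names are illustrative assumptions, not the contents of the DMP rule repository.

```python
import re

# Illustrative country-specific postcode patterns (assumptions, not DMP rules).
POSTCODE_PATTERNS = {
    "DE": re.compile(r"\d{5}"),            # Germany: exactly five digits
    "NL": re.compile(r"\d{4} ?[A-Z]{2}"),  # Netherlands: e.g. 1234 AB
}

def postcode_is_valid(country: str, postcode: str) -> bool:
    """Return True if the postcode matches the country-specific syntax."""
    pattern = POSTCODE_PATTERNS.get(country)
    if pattern is None:
        return True  # no rule defined for this country: do not flag
    return bool(pattern.fullmatch(postcode.strip()))

# Example: flag business partners whose postcode violates the rule.
partners = [
    {"id": "BP001", "country": "DE", "postcode": "33602"},
    {"id": "BP002", "country": "DE", "postcode": "3360A"},
]
print([p["id"] for p in partners
       if not postcode_is_valid(p["country"], p["postcode"])])  # ['BP002']
```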

Use Cases AI & ANOMALIES

  • PRODUCT GROUPS & SERIAL NUMBERS: AI-based product group assignments and automated AI classification of serial numbers optimise quality standards.
  • BASE AND SALES QUANTITIES: AI-supported classification automatically distinguishes base and sales quantity units, enabling precise inventory management.
  • PAYMENT TRANSACTIONS: Duplicate payment transactions are automatically detected and outliers (anomalies) in payment processes are identified (see the sketch after this list).
  • INVENTORY DISCREPANCIES: Data science methods detect discrepancies between system and actual inventory in order to identify anomalies in inventory-changing merchandise management processes at an early stage.
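
As a rough illustration of the anomaly detection idea, the following Python sketch flags outlier payment amounts with an Isolation Forest. The library choice, the single amount feature and the contamination setting are assumptions for demonstration purposes, not the method actually used in iDM.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy payment amounts; one transaction is an obvious outlier (assumed data).
amounts = np.array([120.0, 118.5, 121.3, 119.8, 9800.0, 120.7]).reshape(-1, 1)

# An Isolation Forest isolates anomalies via short random partitioning paths.
model = IsolationForest(contamination=0.2, random_state=42)
labels = model.fit_predict(amounts)  # -1 = anomaly, 1 = normal

for amount, label in zip(amounts.ravel(), labels):
    if label == -1:
        print(f"Suspicious payment: {amount:.2f}")
```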

Use Cases INTELLIGENT DATA MODELLING

  • ARTICLE ASSET: Various supplier price lists are transformed using ETL (Extract-Transform-Load), compared with existing articles via DDC (Data Duplicate Check), and checked and integrated into the system using DMP (Data Mining & Profiling).
  • REAL ESTATE MANAGEMENT: Various utility bills with different structures are transformed using ETL, checked for anomalies and then automatically assigned to cost centres.
  • ORDER SYSTEM: Orders with different formats and structures are loaded into the system via ETL. DDC is used to check whether items need to be updated or newly created before an order is automatically generated.
  • SALES DATA: Partners upload their sales data daily, which is automatically transformed by ETL, checked for anomalies and correctness, and then posted to the system.

Our customers speak for us: Successful iDM projects

To secure article maintenance at AGRAVIS Raiffeisen AG in the long term with as little effort as possible, artificial intelligence will in future support the manual work of the employees in master data management.


Master data optimisation at our customer ERDINGER Weißbräu took place in two phases:

  • Phase 1 focused on the targeted improvement of the quality of business partner and article master data using our Data Mining & Profiling (DMP) tool.
  • In phase 2, we concentrated intensively on the creation of new materials, which enabled us to fundamentally optimise the process.


With the active support of our ETL (Extract-Transform-Load) tool, we were able to improve the processes surrounding master and transaction data for our customer SAGAFLOR.


The modules of consenso Intelligent Data Modelling in detail

EXTRACT - TRANSFORM - LOAD (ETL)

Our solution for data integration. With ETL, you can load real-time data in a wide variety of formats and with varied content, and carry out comprehensive validation and cleansing of the data.

The details:

Our ETL tool allows you to process files comprehensively, regardless of their format or structure. Structured database tables as well as semi-structured Excel, CSV or TXT files can be processed seamlessly, making it much easier to integrate data from different sources. In addition, you have the option of connecting different types of interfaces.

ETL's flexibility allows you to map and customise the fields supplied in the input file (e.g. price lists from suppliers) as required. This is done in a user-friendly way using drag & drop. A variety of functions are available for transforming the data in the source file so that it conforms to the relevant data schemas and business requirements. Conditions can be attached to these functions to further refine the output.
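
Conceptually, such a mapping is a declarative list of field rules, each with an optional transformation and condition. The following Python sketch is a simplified illustration of that idea with hypothetical field names; it is not the ETL tool's actual configuration format.

```python
# Each rule: source field -> target field, optional transform and condition.
# Field names, transforms and conditions are hypothetical examples.
mapping_rules = [
    {"source": "ArtNr", "target": "MATNR",
     "transform": lambda v: v.zfill(18)},
    {"source": "Preis", "target": "PRICE",
     "transform": lambda v: round(float(v.replace(",", ".")), 2)},
    {"source": "Marke", "target": "BRAND", "transform": str.upper,
     "condition": lambda row: row.get("Marke", "") != ""},
]

def apply_mapping(row: dict) -> dict:
    """Apply every rule whose condition holds to one input row."""
    target = {}
    for rule in mapping_rules:
        condition = rule.get("condition", lambda r: True)
        if condition(row) and rule["source"] in row:
            target[rule["target"]] = rule["transform"](row[rule["source"]])
    return target

# Example row from a supplier price list (comma as decimal separator).
print(apply_mapping({"ArtNr": "4711", "Preis": "12,95", "Marke": "acme"}))
# {'MATNR': '000000000000004711', 'PRICE': 12.95, 'BRAND': 'ACME'}
```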

The structured and intuitive interface of the Fiori app also allows users without IT knowledge to define the transformation steps and when they are executed. To process a new price list, you simply create and trigger a run that executes the predefined mapping steps. Delta loading creates or updates only the data that has changed since the last run. Once the initial definition has been made, the ETL tool can integrate further data into the system automatically.
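
One common way to implement delta loading is to keep a fingerprint per record key and reprocess a record only when its fingerprint changes. The Python sketch below illustrates this under that assumption; the key field and the in-memory store are illustrative, not the tool's internals.

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Stable fingerprint of a record's content."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def delta_load(records: list[dict], seen: dict[str, str]) -> list[dict]:
    """Return only new or changed records; update the fingerprint store."""
    changed = []
    for rec in records:
        key = rec["ArtNr"]            # hypothetical record key
        digest = record_hash(rec)
        if seen.get(key) != digest:   # new record or changed content
            changed.append(rec)
            seen[key] = digest
    return changed

seen: dict[str, str] = {}
first = delta_load([{"ArtNr": "4711", "Preis": "12,95"}], seen)   # loaded
second = delta_load([{"ArtNr": "4711", "Preis": "12,95"}], seen)  # skipped
print(len(first), len(second))  # 1 0
```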


ARTICLE MANAGER & REQUEST MANAGER ARTICLE (ARM & RQM AR)

While the Article Manager supports the departments in recording material data with AI assistance and duplicate checks, the Request Manager provides an overview of requests for new entries and changes and helps to control the associated processes.

The details:

With the help of the Article Manager (ARM), new entries and changes to articles (materials) are recorded by the various departments in a uniform layout.

Specific workflows can be defined depending on the material type and other properties. Field properties can be controlled with different authorisations depending on the status.
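
As a schematic illustration of such status-dependent field control, the short Python sketch below maps workflow statuses to the fields that may still be edited; the statuses and field names are hypothetical, not the module's configuration.

```python
# Hypothetical status-dependent field control for an article workflow.
FIELD_CONTROL = {
    "DRAFT":     {"description", "brand", "product_group", "price"},
    "IN_REVIEW": {"product_group"},
    "RELEASED":  set(),  # released articles are read-only
}

def field_editable(status: str, field: str) -> bool:
    """Return True if the field may be edited in the given status."""
    return field in FIELD_CONTROL.get(status, set())

print(field_editable("DRAFT", "price"))      # True
print(field_editable("IN_REVIEW", "price"))  # False
```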

The integration of additional iDM apps enables check rules, AI-supported enrichment (e.g. product groups) and automatic duplicate checks to support the process.


DATA MINING & PROFILING (DMP)

DMP helps you analyse data in order to gain a clear understanding and overview of its structure, content and quality.

The details:

The definition of data quality rules and their systematic classification in the repository creates a clear structure for the quality guidelines. Both the BRF+ included in S/4HANA and an integrated DMP function enable intuitive modelling and management in order to implement the rules and integrate complex business logic into the system. The user interface of the Fiori app also allows users without IT knowledge to manage the quality check of the data.

An integral part of the rule management process is the configuration of a data quality index, which defines how the index is calculated from the master data in conjunction with the defined data quality rules. The data quality index provides a quantitative metric for evaluating the quality of the data - the basis for introducing targeted measures to improve it.
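
An index of this kind is often computed as the weighted share of rule checks that pass. The following Python sketch illustrates that idea with made-up rules and weights; it is a simplified stand-in, not the DMP configuration itself.

```python
# Hypothetical rules: name, weight, and a predicate over one material record.
RULES = [
    ("brand_present",    2.0, lambda m: bool(m.get("brand"))),
    ("price_positive",   1.0, lambda m: m.get("price", 0) > 0),
    ("manufacturer_set", 1.0, lambda m: bool(m.get("manufacturer"))),
]

def quality_index(materials: list[dict]) -> float:
    """Weighted share of passed rule checks, in percent."""
    total = sum(w for _, w, _ in RULES) * len(materials)
    passed = sum(w for m in materials for _, w, check in RULES if check(m))
    return 100.0 * passed / total if total else 100.0

materials = [
    {"brand": "ACME", "price": 12.95, "manufacturer": "ACME GmbH"},
    {"brand": "",     "price": 0,     "manufacturer": "Other Ltd"},
]
print(f"{quality_index(materials):.1f}%")  # 62.5%
```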

The defined data quality rules are used for active analyses and checks of data. Check runs can be controlled easily and efficiently in a clear user interface and set up as a job. This enables continuous monitoring and evaluation of data quality. Potential quality problems can be recognised and rectified at an early stage.

Integration into the Data Process Stream (DPS) module enables automated cleansing of incorrect data. In addition, connecting other systems via mechanisms such as Smart Data Access enables access to external data and corresponding checks. By integrating the defined rules into the new creation process, potentially incorrect entries can be prevented right from the start.


DATA DUPLICATE CHECK (DDC)

Our Data Duplicate Check helps you detect and process duplicates in your master and transaction data.

The details:

DDC enables you to efficiently manage individual duplicate runs. Customer-specific evaluations with various check rules are defined via customizing. The module allows you to check all SAP tables - and therefore both master and transaction data - using these rules. Once the initial definition has been made, the checks can be carried out automatically via a user-friendly interface.

In addition to checking internal data in the system, the integration of an import function also enables data to be imported directly from an Excel file, for example, and compared with the database tables in the system. Use cases include classic migration work (e.g. as part of an S/4HANA conversion) or price comparisons (e.g. allocation of article prices from web crawling and existing articles already in the system).

When running a check, conditions are set to limit the data analysed and further refine the result. Once the duplicate check has been successfully completed, clear result values are presented for each potential duplicate. These results provide a structured basis for the targeted processing and cleansing of duplicates.
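
Duplicate scoring of this kind typically compares candidate records field by field and aggregates the similarity values into one result value. The Python sketch below illustrates this with difflib string similarity and invented fields and weights; it stands in for, rather than reproduces, the DDC check rules.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """String similarity in [0, 1] based on matching subsequences."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def duplicate_score(rec_a: dict, rec_b: dict, weights: dict) -> float:
    """Weighted field-by-field similarity of two candidate records."""
    score = sum(w * similarity(str(rec_a.get(f, "")), str(rec_b.get(f, "")))
                for f, w in weights.items())
    return score / sum(weights.values())

# Hypothetical business partner records and field weights.
weights = {"name": 3.0, "city": 1.0, "postcode": 1.0}
a = {"name": "Mueller GmbH", "city": "Bielefeld", "postcode": "33602"}
b = {"name": "Müller GmbH",  "city": "Bielefeld", "postcode": "33602"}

# A score above a chosen threshold (e.g. 0.85) is shown as a potential duplicate.
print(f"score = {duplicate_score(a, b, weights):.2f}")
```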

The duplicate check can not only be applied to existing data, but can also be integrated into the dialogue for creating new data.

DATA EXPLORER (DEX)

Data Explorer supports anomaly detection and rule mining. Hidden patterns in large master data sets can be identified in order to generate correction suggestions for master and transaction data with the help of artificial intelligence, derive rules and define further steps based on these.

The details:

In addition to our existing set of rules, we also present extended solutions in the area of data mining based on PAL (Predictive Analysis Library) and APL (Automated Predictive Library). These solutions enable multi-faceted data analysis, including classification, clustering (profiling) and anomaly detection.

The Predictive Analysis Library offers powerful statistical algorithms for comprehensive data analyses. In rule mining, the association algorithm recognises patterns and rules in data, evaluates them statistically and thereby brings hidden patterns and anomalies to light. After plausibility checks, the identified rules provide clear insights into data correlations: in master and transaction data, the association algorithm reveals how and why certain data elements are linked to each other. A practical example is shopping basket analysis, which determines which items frequently appear together in transactions. Users thus benefit from automated analyses that uncover correlations in their data.

In addition to the PAL algorithms, we use various statistical methods, including AI libraries in Python, to identify anomalies in the data. An example use case is the analysis of deviations in order or bank transfer processes.
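
To make the association idea concrete, here is a small pure-Python sketch that derives single-item association rules with support and confidence from toy transactions. It is a didactic stand-in, not the PAL association algorithm.

```python
from itertools import permutations

# Toy shopping baskets (assumed data).
baskets = [
    {"beer", "crisps"},
    {"beer", "crisps", "lemonade"},
    {"beer", "lemonade"},
    {"crisps"},
]

def association_rules(baskets, min_support=0.5, min_confidence=0.6):
    """Yield rules (a -> b, support, confidence) over single items."""
    n = len(baskets)
    items = set().union(*baskets)
    for a, b in permutations(items, 2):
        both = sum(1 for t in baskets if a in t and b in t)
        only_a = sum(1 for t in baskets if a in t)
        support = both / n
        confidence = both / only_a if only_a else 0.0
        if support >= min_support and confidence >= min_confidence:
            yield a, b, support, confidence

for a, b, s, c in association_rules(baskets):
    print(f"{a} -> {b}: support={s:.2f}, confidence={c:.2f}")
```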

The data presented is displayed in clear dashboards that enable user-friendly visualisation and interpretation. From here, users can seamlessly process the data to quickly make informed decisions and derive measures. Depending on the application, we integrate the results into the various modules of our iDM toolkit.


DATA PROCESS STREAM (DPS)

DPS can be understood as the umbrella spanning the modules of our Intelligent Data Modelling: transformed data from ETL is checked by DMP, matched against suitable articles (DDC), enriched with the help of AI and integrated into the system.

The details:

DPS enables the mapping of processes to check and cleanse data and carry out transactions. This makes it possible to update or create master data or generate transaction data such as purchase orders.

The individual steps and their sequence can be individually configured via customizing. Our "Data Mining and Profiling" and "Duplicate Check" applications can be seamlessly integrated and existing analyses can be reused in the various apps. The integration of AI algorithms makes it possible, for example, to make intelligent predictions for product groups or to shorten material descriptions to 40 characters. Suggestions made by the system can be overridden by the user at any time.
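
Schematically, a DPS run can be pictured as a configurable sequence of steps through which each record flows. The following Python sketch illustrates this with simplified stand-ins for the transform, check, duplicate and enrichment steps, including the 40-character description shortening mentioned above; all names and rules here are assumptions for illustration.

```python
# Simplified stand-ins for the individual process steps (illustrative only).
def transform(rec):           # ETL: normalise the raw input
    rec["description"] = rec["description"].strip()
    return rec

def check_quality(rec):       # DMP: a single toy quality rule
    rec["quality_ok"] = bool(rec.get("brand"))
    return rec

def check_duplicates(rec):    # DDC: pretend lookup against existing articles
    rec["duplicate"] = rec["description"].lower() in {"existing article"}
    return rec

def enrich(rec):              # AI step: shorten the description to 40 characters
    rec["short_text"] = rec["description"][:40]  # user may override this suggestion
    return rec

# The step sequence is configurable, analogous to Customizing in the real module.
PIPELINE = [transform, check_quality, check_duplicates, enrich]

def run_dps(record: dict) -> dict:
    for step in PIPELINE:
        record = step(record)
    return record

print(run_dps({"description": "  Premium Bavarian wheat beer crate 20 x 0.5 l returnable ",
               "brand": "ERDINGER"}))
```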

The application therefore offers the possibility of mapping numerous processes in which data is transformed, prepared and then further processed in the system. In addition to the creation of articles and business partners, other examples include the entry of orders via Excel and the checking and posting of sales data from various transactions.


MASS DATA EDITOR (MDE)

The mass change mode can automatically make identical changes to multiple items or business partners – quickly, efficiently and with quality assurance.

The details:

Similar change requests that affect multiple items or business partners are time-consuming and prone to errors. With the support of our MDE app for mass changes, the effort required can be reduced while significantly improving quality.

The app can be accessed directly from the Article Manager or Business Partner Manager. There, identical adjustments can be made to multiple articles or business partners at the same time using the mass change mode.
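
As a minimal illustration of a mass change, the Python sketch below applies one identical field change to a selection of articles and returns a change log; the field names and records are hypothetical, and the separate approval step is only indicated by a comment.

```python
def mass_change(articles: list[dict], field: str, new_value) -> list[dict]:
    """Apply one identical change to all selected articles; return a change log."""
    log = []
    for article in articles:
        old_value = article.get(field)
        if old_value != new_value:
            article[field] = new_value
            log.append({"id": article["id"], "field": field,
                        "old": old_value, "new": new_value})
    # In the real app, the change set would now pass a separate approval step
    # (dual control principle) before becoming effective.
    return log

articles = [
    {"id": "A1", "brand": "acme"},
    {"id": "A2", "brand": "ACME"},
]
print(mass_change(articles, "brand", "ACME"))
# [{'id': 'A1', 'field': 'brand', 'old': 'acme', 'new': 'ACME'}]
```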

A separate approval process ensures the dual control principle. The integration of additional iDM apps ensures quality standards for changes.


BUSINESS PARTNER MANAGER & REQUEST MANAGER BUSINESS PARTNER (BPM & RQM BP)

An app specifically designed for creating and maintaining business partner data – supplemented by workflows that efficiently support business partner-related processes.

The details:

Business Partner Manager makes it easier to create new business partner data and modify existing data. With the help of this solution, data is recorded by the various departments in a uniform layout.

Depending on the type of partner, specific workflows, including approval, can be controlled in a targeted manner. And by integrating additional iDM apps, the process is optimised through validation rules, AI-supported enrichment and an automatic duplicate check.


Come and talk to us

We would be happy to give you a personal presentation on the many functions and potential benefits of consenso Intelligent Data Modelling and evaluate with you how we can best meet your individual requirements.
