Master data management (MDM) is an increasingly important focal point at management level. It enables organizations to keep the steadily growing volume of data under control and to secure and strengthen their competitiveness in the long term through better evaluation and use of internal and external company data. But does that mean that, as soon as the need has been identified and an MDM strategy successfully implemented, optimal corporate decisions are made on the basis of the analysis data that structured data processing makes available?
The last few years have been characterized by a focus on data quality and on an overarching strategic vision that includes the group-wide definition of targets for the management and use of master data. In addition, an attempt has been made to involve specialist areas in the structuring of system landscapes, as they are directly affected by business processes.
This article shows which current technologies and concepts can help meet these goals and raise data quality even further, enabling companies to position themselves even more successfully in planning, procurement, production, delivery and complaints handling.
Development of Master Data Management
Despite MDM, most organizations still rely on a high level of manual work to update and maintain their systems, and isolated solutions remain in place for department-specific processes. Many central teams have been set up to relieve the pressure on specialist areas, but they simultaneously create isolated pockets of data literacy, which considerably limits usability, at least in operational business. Extremely extensive operational processes have been developed to support this structure, resulting in inefficiencies and error-prone handover steps. Business rules intended to safeguard processes and data quality also play a key role. The effort required to make these rules appear sensible and necessary to end users, and to ensure and monitor compliance with them, has been significantly underestimated to date.
Alongside complexity (data harmonization and updates, as well as governance decisions), additional challenges include
- constantly changing requirements;
- tight schedules for set-up;
- personnel planning and enabling;
- extra workload; and
- IT infrastructure (hardware and software).
These factors have an impact on the employees who grapple with operational processes on a daily basis. Accordingly, they affect value creation and the quality of the data those individuals produce.
The new target concept for companies that have already set up a master data management system is geared toward data availability, the reduction of manual work and data literacy. It is more important than ever for corporate success along the supply chain and in production processes that employees have quick and easy access to all the data they require and that they can draw the correct conclusions from this data.
The approaches presented below minimize the volume of work and the susceptibility to errors, and reorient the structure, responsibility and usability of master data.
How Companies Can Achieve the Next Level of Master Data Management
Since data quality is generally already solid, the architects of system landscapes can now devote themselves to further necessary development steps. The focus should be on the ongoing improvement of data quality. The aim of this is to enable data-based decisions for an increase in efficiency, productivity and service quality. In addition, the simplification of collaboration with business partners and the decentralization of data literacy (returning it to specialist areas) should be targeted.
The following tools, technologies and approaches are a good way-marker for this. One important topic is the as-a-service approach, which may relate to software, data or data governance. The benefits of the concept are that it can dramatically reduce the specific pressure on the operational structure of an organization during implementation and that it includes an advisory component for strategic concerns.
Data governance as a service
Particularly in the newer concept of data governance as a service, in contrast to the traditional individual approach, the requirements of employees are increasingly the focus of attention, alongside risk minimization. After all, only the competent and error-free use of data management systems guarantees high data quality, which in turn forms the basis for accurate analyses and insights.
A fundamental development, which admittedly is not MDM-specific but is still highly relevant for reaching the next development stages, is the promotion of automation. By substituting or simplifying repetitive, manual transmission and consolidation tasks through the automated execution of process steps, companies gain an effective lever to support MDM optimization, resulting in a further increase in data quality and less work for employees. The basic prerequisite for automation is a functioning workflow whose manual steps can potentially be replaced, leading to an increase in efficiency. One example is the automatic import of a PDF supplier master data sheet into an ERP/MDG system, which saves time and eliminates transmission errors caused by manual processing. Another area of application is the automatic querying of third-party providers, who supplement data entered in SAP MDG with values such as credit information or address validation. Many further applications have already been implemented; others are still to come. Once a solid level of automation has been achieved, it can be extended through the introduction of robotic process automation (RPA). More on this in the blog post robotic process automation for MDM.
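The supplier data sheet example above can be sketched in a few lines. The field labels, record layout and validation rules below are invented for illustration; a real import would parse the PDF itself (e.g. via a PDF library) and apply the organization's own governance rules.

```python
import re

# Hypothetical plain-text export of a supplier master data sheet.
# Labels and layout are assumptions for illustration only.
SHEET = """\
Supplier Name: Acme Fasteners GmbH
VAT ID: DE123456789
City: Hamburg
IBAN: DE89370400440532013000
"""

# Map sheet labels to (assumed) ERP/MDG field names.
FIELD_MAP = {
    "Supplier Name": "name",
    "VAT ID": "vat_id",
    "City": "city",
    "IBAN": "iban",
}

def parse_sheet(text: str) -> dict:
    """Turn 'Label: value' lines into an import-ready record."""
    record = {}
    for line in text.splitlines():
        label, _, value = line.partition(":")
        key = FIELD_MAP.get(label.strip())
        if key:
            record[key] = value.strip()
    return record

def validate(record: dict) -> list:
    """Flag issues that would otherwise surface as manual rework."""
    errors = []
    if not re.fullmatch(r"[A-Z]{2}[0-9A-Z]+", record.get("vat_id", "")):
        errors.append("vat_id: unexpected format")
    if len(record.get("iban", "")) < 15:
        errors.append("iban: too short")
    return errors

record = parse_sheet(SHEET)
errors = validate(record)
```

The point of the sketch is the shape of the workflow, not the rules themselves: a deterministic parse step, a validation gate, and only then the hand-off to the target system, so transmission errors are caught before they reach the master data.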
One important topic in the design of master data management is the decision about whether updates should be carried out centrally or decentralized within the company. A central approach may be more effective in the long term, particularly if a high level of data security is required. On the other hand, a decentralized approach could be easier to manage within a larger organization; it leaves data in the places where it is created and used. A data fabric can contribute to decentralization. This is a combination of an architecture approach and a technology set that eliminates data silos and makes relevant data accessible, processable and employable for users throughout the company. In addition, the concept of data sharing contributes to the exchange of data within and outside individual organizations and so enables new areas of business and application.
One example of exchange beyond organizational boundaries is the Catena-X Automotive Network industry association. In Business Partner Data Management, harmonized business partner data records from different sources can be reliably identified as so-called “Golden Records” and made available to all members. The prerequisite for exchange and shared use is the standardization of the data in question. In other words, it must be ensured that everyone derives the same information from specific data and that a standard data record scope is defined.
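A minimal sketch of such "Golden Record" consolidation, under simplified assumptions: the matching key (a normalized VAT ID), the source records and the survivorship rule are all invented for illustration and stand in for the far richer matching logic used in practice.

```python
from collections import defaultdict

# Business partner records from two (hypothetical) source systems.
records = [
    {"source": "ERP", "name": "ACME Fasteners GmbH",
     "vat_id": "de 123456789", "city": ""},
    {"source": "CRM", "name": "Acme Fasteners",
     "vat_id": "DE123456789", "city": "Hamburg"},
]

def normalize_vat(vat: str) -> str:
    """Standardize the key so both sources read the same identifier."""
    return vat.replace(" ", "").upper()

def consolidate(records: list) -> dict:
    """Group records on the normalized key and merge them field by field."""
    groups = defaultdict(list)
    for r in records:
        groups[normalize_vat(r["vat_id"])].append(r)
    golden = {}
    for vat, group in groups.items():
        merged = {"vat_id": vat}
        for field in ("name", "city"):
            # Survivorship rule: first non-empty value wins; real
            # implementations also weigh source trust and recency.
            merged[field] = next((r[field] for r in group if r[field]), "")
        golden[vat] = merged
    return golden

golden = consolidate(records)
```

The standardization step is exactly what the paragraph above demands: only once every participant normalizes the identifier in the same way can two records from different organizations be recognized as the same business partner.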
The data literacy of consumers is, in turn, a prerequisite for this. Put simply, this describes the ability to interpret data correctly. This should be structurally promoted in order to ensure the correct interpretation of data and the ability of data consumers to analyze it, as well as application skills. Further information is available in these articles on data literacy as a concept and different data literacy levels.
Data literacy is also required for the self-service approach, which is intended to relieve the pressure on IT professionals and transfers data responsibility to end users, i.e. those who create, call up and use the data. What is important for this is the use of a data catalog tool to present required data in a comprehensible way and to make it easily accessible for operational business areas.
Self-service analytics tools build on the data self-service approach. They enable non-technical users to access data from different systems or business areas themselves, to run queries and to create reports, on the basis of which they can make effective business decisions. Purchasing decisions, for example, are among the decentralized analysis tasks handled during the operational decision-making of individual business areas.
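As a stand-in for such a self-service query, the sketch below aggregates spend per supplier with an in-memory SQLite database; the table, columns and figures are invented for illustration. The point is that a business user can answer the question directly, without routing a ticket to the data team.

```python
import sqlite3

# Hypothetical order data a purchasing department might query itself.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (supplier TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [
    ("Acme Fasteners", 1200.0),
    ("Acme Fasteners", 800.0),
    ("Bolt & Co", 500.0),
])

# Self-service question: which supplier do we spend the most with?
rows = con.execute(
    "SELECT supplier, SUM(amount) AS spend "
    "FROM orders GROUP BY supplier ORDER BY spend DESC"
).fetchall()
# rows → [('Acme Fasteners', 2000.0), ('Bolt & Co', 500.0)]
```

In practice the query would run against a governed semantic layer or data catalog rather than a raw table, which is what keeps the self-service result consistent with the master data.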
This concept relieves pressure on data/analysis experts in their everyday work, as they no longer have to process analysis queries. It also increases decision-making speed and quality thanks to the guaranteed currency of analysis results, freedom from transmission errors and the clear definition of targets.
How Companies Can Fully Exploit Their MDM Potential Now
It is evident that MDM is an extremely necessary framework for controlling the increasing quantity and complexity of data, but one that does require constant attention to operational maintenance and strategic further development.
Alongside automation, which is sure to remain a focal topic for some time, we believe that recent developments, such as cloud-based services and self-service approaches, offer concrete opportunities and benefits. They can take into account the individual circumstances, needs and opportunities of individual companies even better than before. There is potential for resource and capacity preservation, cost reductions and a further increase in user friendliness, data quality and hence the usability of data.
The developments shown here indicate the direction in which MDM will evolve over the next few years, and may well become the next standard solutions.
For this reason, companies should start setting the course for their future MDM system now in order to meet new requirements and to leverage potential at an early stage. As a first step, it is worth reviewing the current MDM maturity level of one’s own company. Building on this, options can be developed to take the next step toward the MDM of the future.