
Good data management is the basis for the business models of tomorrow

Mr. Budke, data management is becoming increasingly important, not least because of AI. What do you think are the biggest challenges companies currently face when handling large amounts of data?

One of the biggest challenges in today's business world is ensuring data quality. Given the immense, exponentially growing volumes of data that companies generate and store every day, it is becoming increasingly difficult to keep this data consistent, correct and usable. At the same time, data security requirements are growing: protecting sensitive data from misuse and unauthorized access has become a central task in an increasingly complex data landscape. One example is instant payments, where transactions are executed in seconds, yet reliable automatic fraud detection must be guaranteed within that same window. This requires not only modern security solutions, but also comprehensive governance processes within companies. I would particularly like to emphasize the importance of clean metadata management, i.e. the systematic handling of data about data, for example through a data catalog.

Data management is thus developing into a challenge that is not only technical but increasingly strategic, and one that must be addressed at the highest management level. Good data management is the solid basis for tomorrow's business models. If the data foundation is not solid, even the best artificial intelligence cannot achieve anything. Only if data quality is right, especially when models draw on company data (for example through Retrieval-Augmented Generation, RAG), can an LLM (Large Language Model) provide well-founded answers.
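To make the RAG point concrete, here is a minimal, purely illustrative sketch of the retrieval step: whatever documents are retrieved end up in the model's prompt, so their quality directly determines the quality of the answer. The in-memory document store and the word-overlap scoring are simplified assumptions, not a production setup.

```python
# Minimal sketch of the retrieval step in Retrieval-Augmented Generation (RAG).
# The in-memory "document store" and the word-overlap scoring are illustrative
# placeholders; a real setup would use an indexed vector store and an embedding model.

COMPANY_DOCS = [
    "Instant payments must be screened for fraud within seconds.",
    "Customer master data is owned by the sales domain.",
    "The data catalog lists owner, source and quality score for every dataset.",
]

def retrieve(question: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def build_prompt(question: str, docs: list[str]) -> str:
    """Ground the model in company data: whatever is retrieved ends up in the prompt,
    so inconsistent or outdated documents lead directly to poor answers."""
    context = "\n".join(f"- {d}" for d in retrieve(question, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How fast must fraud detection react to instant payments?", COMPANY_DOCS))
```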

You advocate the use of data platforms combined with continuous real-time data processing. What are the advantages?

With a data platform, companies can make data available across the entire organization instead of limiting access to information to individual applications. This eliminates the need for the expensive, complex and often tedious point-to-point connections traditionally used to link systems. Such connections often lead to a fragile and unstable infrastructure in which even a small disruption can have a significant impact. The result is usually a chaotic data architecture whose complexity and lack of clarity resemble a plate of spaghetti. A data platform at the center, by contrast, promotes structured, robust and future-proof data management.

The great advantage of data platforms with real-time data processing, also known as data streaming, lies not only in the immediate availability of the data, but also in its consistency and therefore its quality. With data streaming, data is not collected and processed overnight in a batch run; it is processed continuously as it arrives. This simplifies the individual processing steps, keeps the data current and consistent, and allows other applications to use it operationally. The result is greater efficiency and the ability to make informed, data-based decisions at any time.
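As a rough illustration of the difference to a nightly batch run, the following sketch consumes events continuously from a Kafka-compatible streaming platform, assuming the kafka-python client; the topic name, broker address and threshold are placeholders.

```python
# Minimal sketch of continuous (streaming) processing instead of a nightly batch run,
# assuming a Kafka-compatible platform and the kafka-python client.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payments",                              # hypothetical topic fed by the source system
    bootstrap_servers="localhost:9092",      # placeholder broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Each event is handled as it arrives, so downstream applications always see
# the current, consistent state instead of yesterday's batch extract.
for message in consumer:
    payment = message.value
    if payment.get("amount", 0) > 10_000:    # placeholder rule standing in for real fraud checks
        print("flag for fraud review:", payment)
```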

At the beginning, you also talked about the importance of data quality and data consistency. How do you ensure these aspects?

A clear and well-thought-out data model is essential to create a stable foundation for all processes in the company. It must be defined which data objects are of central importance, such as customers, products or materials, and these must be treated uniformly throughout the company, or at least within the respective domain. It is not enough to expect an artificial intelligence or any other system to cope with inconsistent data on its own. Uniform standards and clear definitions ensure data quality and consistency in the long term. Such discipline in data management makes it possible to perform reliable analyses, make informed decisions and ultimately realize the full potential of modern technologies, including AI. A well-structured data model is therefore not just a technical requirement, but a crucial factor for business success.
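A minimal sketch of what such a shared definition can look like in code; the field names and validation rules are illustrative assumptions, not an official model.

```python
# Minimal sketch of a uniform, company-wide definition of a central data object.
# Field names and rules are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Customer:
    customer_id: str   # one agreed identifier, used identically in every system
    name: str
    country: str       # ISO 3166-1 alpha-2 code, e.g. "CH"

    def __post_init__(self) -> None:
        # Enforce the shared definition at the point of creation,
        # instead of hoping downstream systems or an AI can repair inconsistencies.
        if not self.customer_id:
            raise ValueError("customer_id must not be empty")
        if len(self.country) != 2 or not self.country.isupper():
            raise ValueError("country must be an ISO 3166-1 alpha-2 code")

Customer(customer_id="C-1001", name="Example AG", country="CH")    # valid
# Customer(customer_id="", name="???", country="Switzerland")      # would raise ValueError
```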

Can you strengthen the resilience of your business by building an internal LLM?

For most companies in Switzerland, building their own internal LLM is not a realistic or sensible option. The development effort, the necessary resources and the specialized know-how required far exceed the capacities and budgets of most organizations. However, there are alternatives. On the one hand, companies can use publicly available LLMs for internal purposes. These models offer a cost-effective and flexible alternative because they can be adapted to specific needs without the huge effort of developing a model from scratch. In addition, using such models allows greater control over data and processes because they can be run entirely within your own infrastructure. Another trend I am seeing is the move away from large LLMs towards small language models (SLMs), which are easier to run locally and require fewer resources. However, even this is often not realistic for small and medium-sized enterprises (SMEs). Instead, SMEs should rely on cloud providers they can trust with their data. It is important to ensure that the data is neither passed on to third parties nor used to train the providers' models. In this way, SMEs can also give their employees access to up-to-date AI models and dedicated AI applications built on them, thus increasing resilience.
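As a rough illustration of running an open small model locally, assuming the Hugging Face transformers library is installed; the model name here is a tiny placeholder model and would in practice be replaced by whichever vetted, instruction-tuned open model the company chooses.

```python
# Minimal sketch of running an open small language model (SLM) locally,
# assuming the Hugging Face `transformers` library.
# "distilgpt2" is only a tiny placeholder; substitute a vetted open model.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

# Prompts and outputs stay within the company's own infrastructure.
result = generator(
    "Summarize our data governance guideline in two sentences:",
    max_new_tokens=80,
)
print(result[0]["generated_text"])
```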

Can a company simply do nothing? Isn't that the cheapest alternative?

If companies do nothing, they risk employees independently using AI tools they have found somewhere, along the lines of “bring your own AI”. In that case, the company loses all control over what happens to sensitive company data. This can lead to significant security risks with serious consequences.

How can a company make data-driven decisions, and how can organizations prepare themselves and their data for the AI era today?

To make data-driven decisions and to prepare for the AI era, several steps are crucial. First, a comprehensive assessment of the current situation and of the organization's data maturity should be carried out together with the most important stakeholders in data management. It is essential to understand where your own organization stands and to identify gaps in order to drive targeted improvements.

Companies must begin to view data as a valuable asset that requires careful maintenance and management; the immense value of existing data is often underestimated. A data-conscious mindset should be established in which data is treated as an internal product with a clear life cycle (“data as a product”). Each team should take responsibility for data quality and its use. It is also advisable to list the metadata in a catalog. This approach helps to recognize and exploit the full value of the data.

Another goal should be to avoid maintaining multiple copies of data and instead to establish a central data hub based on data streaming. Such a platform makes data easily accessible and ensures that it can be used consistently and correctly, which significantly increases efficiency and keeps the data landscape manageable. Once the data is available, it should be evaluated and used effectively. The previous approach of manually copying or extracting data back and forth should be a thing of the past; instead, the process of making data available to applications and for AI purposes must be clearly defined and automated.

These steps lay the foundation for a data-driven culture and optimally prepare the company for the AI era. This is how new insights are derived from data, and how decisions and the right actions follow from them.
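As a simple illustration of the “data as a product” and catalog idea, the following sketch registers dataset metadata in a minimal in-memory catalog; the fields and the registry are illustrative assumptions, and real data catalogs provide this through their own tooling.

```python
# Minimal sketch of recording dataset metadata in a catalog ("data as a product").
# The fields and the in-memory registry are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class DatasetEntry:
    name: str
    owner_team: str      # the team responsible for quality and life cycle
    source_system: str
    refresh: str         # e.g. "streaming" or "daily batch"
    last_reviewed: date

catalog: list[DatasetEntry] = []

def register(entry: DatasetEntry) -> None:
    """Every dataset gets an owner and a documented life cycle before it is used."""
    catalog.append(entry)

register(DatasetEntry(
    name="customer_master",
    owner_team="sales-data",
    source_system="CRM",
    refresh="streaming",
    last_reviewed=date.today(),
))

for entry in catalog:
    print(f"{entry.name}: owned by {entry.owner_team}, refresh={entry.refresh}")
```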
