Industry 4.0 solves quality problems

Most companies have already relentlessly improved quality internally. Beyond company boundaries, however, it is often another story: in the past, reaching past internal policies, computer systems and data networks was simply too complicated. Now change is bound to happen. Looking at the value chain of glass refinement, you see many different stakeholders, all with the same goal: to supply goods of the highest quality with the lowest possible reject rate. The temperature of the raw material plays a major role in quality, so one would want to integrate data along the entire value chain in order to identify possible sources of error and reduce reject rates.


The Data Intelligence Hub turns the Internet of Things into the Internet of Production, for example at GlasGo, one of the world's leading glass refineries. Despite flawless incoming-goods inspections at the company, customers occasionally returned glasses because of fading colours.

One suspect is the exposure of colour shipments to low temperatures: lacquers are damaged at temperatures below 2 °C. All is well in the warehouses of GlasGo and its suppliers. But what about the critical shipping link, the transport on the German Autobahn all the way from Bavaria to the Westerwald? With the Data Intelligence Hub, it is now possible to quickly connect the data captured by the Cloud of Things end to end along the entire supply chain and monitor key variables: from vendors via highways and storage locations all the way to the receiving dock.
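The monitoring step described above amounts to checking sensor telemetry against the 2 °C lacquer-damage threshold. A minimal sketch in Python, assuming hypothetical segment names and readings (the function name, data layout and values are illustrative, not the Hub's actual API):

```python
# Hypothetical sketch: flag shipments whose telemetry ever drops below
# the 2 °C lacquer-damage threshold. Segment names and readings are
# illustrative, not real Data Intelligence Hub data.

LACQUER_MIN_TEMP_C = 2.0

def flag_cold_exposure(readings, threshold=LACQUER_MIN_TEMP_C):
    """Return the route segments where the temperature fell below threshold.

    `readings` is a list of (segment, temp_c) tuples, e.g. from IoT sensors
    reporting along the route from vendor to receiving dock.
    """
    return [(segment, temp) for segment, temp in readings if temp < threshold]

shipment = [
    ("vendor warehouse", 5.1),
    ("Autobahn A3", 1.4),      # cold snap during transport
    ("receiving dock", 4.8),
]

print(flag_cold_exposure(shipment))  # [('Autobahn A3', 1.4)]
```

Run end to end over the whole supply chain, a check like this would point directly at the transport leg as the place where the lacquers were damaged.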


The Data Intelligence Hub serves not only as a data pool but also as a workshop: with artificial intelligence and machine learning, errors can be quickly identified, causes found and solutions deduced. This means GlasGo can reduce rejects and returns. In addition, the company saves money because it can react to the environmental data and adjust the transport conditions. What is particularly practical is that all the components can easily be combined or extended on a modular basis. The value-generating analytics can be developed by impartial data scientists and made available to all users, and this neutrality makes it easier for members to share their data. All users retain data sovereignty at all times and determine how long and for what purpose their data may be used; Telekom guarantees that the data is used only for the intended purpose. In addition, customers have the option of simply, transparently and, of course, securely tapping additional sources of income by monetizing their own databases.
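The cause-finding analysis described above can be as simple as comparing return rates between shipments with and without cold exposure. A minimal sketch, with entirely made-up records (the field names and figures are illustrative assumptions, not GlasGo data):

```python
# Hypothetical sketch of the kind of analysis a data scientist might run on
# pooled supply-chain data: compare return rates for shipments with and
# without sub-2 °C exposure. All records and field names are illustrative.

shipments = [
    {"id": 1, "cold_exposure": True,  "returned": True},
    {"id": 2, "cold_exposure": True,  "returned": True},
    {"id": 3, "cold_exposure": True,  "returned": False},
    {"id": 4, "cold_exposure": False, "returned": False},
    {"id": 5, "cold_exposure": False, "returned": True},
    {"id": 6, "cold_exposure": False, "returned": False},
]

def return_rate(records):
    """Fraction of records that ended in a customer return."""
    return sum(r["returned"] for r in records) / len(records)

cold = [s for s in shipments if s["cold_exposure"]]
warm = [s for s in shipments if not s["cold_exposure"]]

print(f"cold-exposed return rate: {return_rate(cold):.0%}")  # 67%
print(f"no-exposure return rate:  {return_rate(warm):.0%}")  # 33%
```

A gap like this between the two groups is the kind of signal that would single out cold exposure in transit as the root cause worth fixing.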

Read more about the application of the Data Intelligence Hub in the glass industry in the T-Systems magazine Best Practice, issue 2/2019, p. 48.

Chris S. Langdon

Business Lead, Data Analytics Executive, Catena-X Product Manager
