Data continuity in manufacturing industry: How software and IT are changing
10 min read
Published 12/14/2021

The business media keep flashing the same warning: “Industry has to speed up its digital transformation!” Yet it is also true that the platform economy has already brought many changes to the IT landscape. To help you better understand these changes, we asked Digital Factory expert Nikolai D’Agostino about the IT trends most relevant to managers at manufacturing companies. Read on for his seven theses describing and evaluating these developments.

For industrial enterprises, the Covid-19 crisis is a challenge unlike any other. Sales and entire markets are collapsing, and value chains are faltering. Over the past years, many decision-makers have been busy debating the pros and purported cons of digitalization in manufacturing. The Fourth Industrial Revolution, proclaimed in 2011 (the first industrial revolution, incidentally, to be proclaimed before it happened), is not progressing as expected. But the Coronavirus crisis has put us in a situation that could strongly accelerate the transition.

An important aspect in this context is the gap between IT (in terms of the automation pyramid, the upper levels) and operational technology (OT), which controls the manufacturing-relevant processes at the field level.

Operational technology seems to lag several years behind IT when it comes to software topics such as service orientation, object orientation, or communication in distributed systems (publish-subscribe vs. client-server).

To a large extent, this is a consequence of the long product lifecycles of high-value capital goods such as automated plants, as well as the special requirements for time-deterministic (real-time) behavior.

This is proving to be an obstacle to the advent of integrated systems in line with Industry 4.0, which requires a convergence of IT and OT.

 

  1. Open Source
    Industrial enterprises are choosing more and more open-source applications to drive their automation and production projects. Many applications in the machine learning and AI sphere already rely on open source (e.g. RapidMiner or TensorFlow).

    This trend requires some rethinking, in development as well as in how software is deployed and used. Open source is an important prerequisite for the creation of open data markets and client platforms. At the same time, open-source approaches bring fresh momentum from the IT industry, which may encourage the desired convergence of OT and IT.

    There’s another reason why open standards and open source are spreading: Businesses don’t want to become dependent on a single provider. And the Covid crisis may even encourage this trend.
     
  2. Low Code
    A further trend associated with open source is low code. This technology uses visual approaches to develop applications that previously could only be realized via control structures in a high-level programming language. An example would be the visual wiring of function blocks.

    Low code lets engineers who lack deep coding skills develop their own applications and software components. For skilled users, this new form of programming enables rapid development of prototypes that can be tested and fine-tuned together with the customer.

    The use of low code thus increases flexibility, reduces development times and costs, and accelerates application development on platforms, enabling rapid testing and further evolution of business models.
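
    To make the idea concrete, here is a deliberately tiny Python sketch of what “wiring functional blocks” amounts to under the hood: reusable blocks with defined inputs and outputs that are connected rather than hand-coded. It is purely illustrative and does not represent any particular low-code product.

        # Illustrative only: function blocks as reusable units that a low-code tool
        # would let a user wire together graphically instead of writing control flow.

        class Threshold:
            """Block: emits True when the input value exceeds a limit."""
            def __init__(self, limit):
                self.limit = limit
            def __call__(self, value):
                return value > self.limit

        class Alarm:
            """Block: turns a boolean signal into a readable message."""
            def __call__(self, triggered):
                return "ALARM: limit exceeded" if triggered else "OK"

        def wire(*blocks):
            """'Wiring': chain blocks so each output feeds the next input."""
            def pipeline(value):
                for block in blocks:
                    value = block(value)
                return value
            return pipeline

        # Wiring Threshold -> Alarm corresponds to drawing one connection line
        # in a graphical editor.
        check_temperature = wire(Threshold(limit=80.0), Alarm())
        print(check_temperature(92.5))  # -> ALARM: limit exceeded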
     

  3. Software-Defined Manufacturing
    The convergence of IT and OT is particularly significant for line-control programming. Currently, a large part of the business logic lies deeply hidden in the PLC (programmable logic controller). Control programs based on the IEC 61131-3 standard are usually hard-coded and offer only a limited degree of flexibility.

    Here it would be more efficient to implement only the basic functions with real-time requirements directly in the controller and to connect, via an API, to event-driven business logic in higher-level systems. The business logic could then be adapted far more flexibly in a high-level programming language or via low code (see the sketch at the end of this section).

    The approach of shifting ever more functionality of production systems into software is reflected in the steadily growing share of value that software contributes to the development of production facilities.

    The concept of “software-defined X” is already established in various fields. In essence, software-defined X means that functions are realized in software, and software alone, without modifying the underlying hardware. Examples include smartphones, software-defined networking, software-defined radio, software-defined infrastructure, and so on.

    Software-defined manufacturing transfers this approach to production technology. Major roles in this development fall to the digital twin concept in production and to the engineering of automation facilities, both discussed below.
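
    As a rough illustration of the split described above, here is a minimal Python sketch: the hard real-time logic is assumed to remain on the PLC, while a gateway publishes controller events through a narrow API and the event-driven business logic reacts outside the controller. All names (EventBus, MachineEvent, the event types) are hypothetical and not tied to any vendor or protocol.

        # Sketch of the IT/OT split: real-time logic stays on the PLC; event-driven
        # business logic runs outside the controller and talks to it through a
        # narrow API. All names here are hypothetical.

        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class MachineEvent:
            machine_id: str
            name: str        # e.g. "part_finished", "tool_worn"
            payload: dict

        class EventBus:
            """Minimal publish/subscribe API between a controller gateway and IT systems."""
            def __init__(self):
                self._handlers: Dict[str, List[Callable[[MachineEvent], None]]] = {}

            def subscribe(self, event_name: str, handler: Callable[[MachineEvent], None]):
                self._handlers.setdefault(event_name, []).append(handler)

            def publish(self, event: MachineEvent):
                for handler in self._handlers.get(event.name, []):
                    handler(event)

        # The business logic lives here, not in IEC 61131-3 code, and can be
        # changed without touching the real-time program on the controller.
        def reorder_tool(event: MachineEvent):
            print(f"{event.machine_id}: ordering replacement for {event.payload['tool']}")

        bus = EventBus()
        bus.subscribe("tool_worn", reorder_tool)

        # A gateway next to the PLC would publish this when the controller raises the flag.
        bus.publish(MachineEvent("press-01", "tool_worn", {"tool": "die-7"}))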
     

  4. Middleware
    To realize such concepts of convergent communication, a unifying communication layer is required that addresses all connected systems via a common software interface (API). This also permits service-oriented architectures that enable loose coupling of systems. Here too, the open-source community offers a wide range of so-called middleware systems, such as Apache Camel, Apache Kafka, and many others.

    The ability of software systems to integrate with such middleware creates the foundation for data-consistent communication systems, as exemplified by an enterprise or manufacturing service bus.
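
    As a minimal sketch of what such an integration can look like in practice, the following Python snippet publishes a machine event to Apache Kafka using the open-source kafka-python client. The broker address and topic name are placeholders; a real deployment would add schema management, security, and error handling.

        # Minimal sketch: pushing a machine event onto Kafka as the middleware layer.
        # Requires the kafka-python package and a reachable broker; the broker
        # address and topic name below are placeholders.

        import json
        from kafka import KafkaProducer

        producer = KafkaProducer(
            bootstrap_servers="broker.example.local:9092",          # placeholder
            value_serializer=lambda obj: json.dumps(obj).encode(),  # send dicts as JSON
        )

        event = {"machine": "press-01", "signal": "cycle_complete", "count": 4711}
        producer.send("shopfloor.events", value=event)  # topic name is an assumption
        producer.flush()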
     

  5. Connectivity
    Given the range of middleware systems now available, we must ask how one can ensure connectivity and interoperability across widely divergent systems.

    Standardization plays a major role in this respect, and OPC UA is among the most important of these standards for industrial communication. Thanks to its integrated data model, OPC UA permits connectivity with a high degree of data abstraction.

    Industry associations such as the VDW are doing development work here. The VDW’s UMATI initiative was established to promote a common data model for data exchange in the machine tool domain. The outcome will be a so-called OPC UA Companion Specification, i.e. a specific data model for this application domain.
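
    As a small example of what OPC UA connectivity looks like from the IT side, the following Python snippet reads a single value with the open-source python-opcua library. The endpoint URL and node id are placeholders; the actual address space depends on the server and, where applicable, its Companion Specification.

        # Minimal sketch of reading one value from an OPC UA server with the
        # open-source python-opcua library. Endpoint and node id are placeholders.

        from opcua import Client

        client = Client("opc.tcp://machine.example.local:4840")  # placeholder endpoint
        client.connect()
        try:
            # NodeId of e.g. a spindle-speed variable; depends on the server's model.
            node = client.get_node("ns=2;s=Machine.Spindle.Speed")
            print("Current value:", node.get_value())
        finally:
            client.disconnect()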
     

  6. Interoperability
    Another major standardization initiative is the so-called Asset Administration Shell (AAS), which is derived from RAMI 4.0 (Reference Architecture Model Industry 4.0). The AAS aims to enable systems with such a degree of integration that production processes can organize themselves, which would form the foundation of a truly smart factory.

    The Asset Administration Shell can be viewed as a standardized version of the digital twin. All AAS specifications are public and freely available, and with the open-source AASX Package Explorer anyone can try out the AAS concept and build a few initial prototypes.

    In my opinion, there is to date no unequivocal definition of the digital twin. At a very generic level, a digital twin may be seen simply as the digital representation of an object that is relevant in the application domain under consideration.

    Thus, a 3D kinematic simulation model can be a digital twin of a corresponding physical production plant, as can its logical control model; each exists as a sub-model within the common AAS. The AAS represents the physical production plant in the digital world, where it can provide runtime information and serve as an interface for controlling interventions from the digital world. This property makes the AAS, in conjunction with the represented asset, an important building block of a future smart factory; together they can be viewed as a cyber-physical system.

    An easily grasped analogy to the digital twin is a LinkedIn profile. People present their specific capabilities and offerings in the digital world and can thereby network with other members via their own digital representation.

    Applying this concept to production technology, the digital twin of a product will be able to negotiate the terms of its own production with the digital twin of a production facility.

    Technical approaches that would enable such activities, like capability-based manufacturing planning, are currently in development. A useful analogy here is the USB interface for computer peripherals, which permits automated configuration of a wide range of devices based on self-description mechanisms. One day we will find this approach in automation technology as well. There is, however, a prerequisite: seamless semantic data exchange.
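
    To give a feel for the structure, here is a greatly simplified Python sketch of the idea behind the AAS: an identified asset plus a set of self-describing sub-models that other systems (or other twins) can query. This is not the official AAS metamodel and not a specific SDK; all identifiers and values are invented.

        # Greatly simplified illustration of the Asset Administration Shell idea.
        # Not the official AAS metamodel; all identifiers are invented.

        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class Submodel:
            id_short: str                  # e.g. "TechnicalData", "Capabilities"
            properties: Dict[str, object] = field(default_factory=dict)

        @dataclass
        class AssetAdministrationShell:
            asset_id: str                  # globally unique id of the physical asset
            submodels: List[Submodel] = field(default_factory=list)

            def describe(self) -> Dict[str, object]:
                """Self-description that other systems or twins can query."""
                return {sm.id_short: sm.properties for sm in self.submodels}

        press_aas = AssetAdministrationShell(
            asset_id="urn:example:press-01",
            submodels=[
                Submodel("TechnicalData", {"max_force_kN": 2500, "manufacturer": "ExampleCorp"}),
                Submodel("Capabilities", {"operations": ["deep_drawing", "punching"]}),
            ],
        )
        print(press_aas.describe())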
     

  7. Seamless Semantic Data Exchange
    For data exchange, the AAS requires data formats that can be understood and processed by all participants. A promising approach is the AutomationML standard, which in concert with the AAS permits data serialization and thus complete, lossless exchange even of complex internal data structures.

    To achieve this, AutomationML relies on a concept that draws on relevant libraries to also communicate the semantics of the data objects, enabling the self-description mechanism described above. In this way, the so-called sub-models which make up an AAS can contain complete descriptions of automation components. In addition to 3D geometry, these may contain kinematics, behavioral models and control programs. AutomationML is a neutral, XML-based data format for storing and exchanging plant planning data, freely available as an open standard.

    The original goal of the AutomationML standard was to enable the exchange of engineering data in a heterogeneous environment of engineering tools across disciplines such as mechanical design, electrical design, HMI development, PLC programming, and robot control.

    However, AutomationML is increasingly used as a metadata format that can serve as a universal tool for resolving data exchange problems. A corresponding OPC UA Companion Specification is now available to map the AutomationML metadata model to the specific data model representation (namespace) of an OPC UA server. This permits automated provision of data models described in AutomationML via an OPC UA server, a useful complement to both standards.
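
    As a rough impression of what such an exchange file looks like, the following Python snippet generates a stripped-down AutomationML-style (CAEX-based) structure with the standard library. Real AutomationML files additionally reference the CAEX schema as well as role-class and system-unit-class libraries, which are omitted here; all names and values are invented.

        # Rough sketch of generating an AutomationML-style (CAEX-based) XML structure.
        # Schema references and class libraries are omitted; names are invented.

        import xml.etree.ElementTree as ET

        caex_file = ET.Element("CAEXFile", FileName="plant_example.aml")
        hierarchy = ET.SubElement(caex_file, "InstanceHierarchy", Name="ExamplePlant")

        station = ET.SubElement(hierarchy, "InternalElement", Name="PressStation01")
        attribute = ET.SubElement(station, "Attribute", Name="MaxForce", Unit="kN")
        ET.SubElement(attribute, "Value").text = "2500"

        print(ET.tostring(caex_file, encoding="unicode"))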

 

So much for my views on the major current IT trends in industry. Have I missed a topic? Would you like further details? Please get in touch with me!
