Document Type
- Article (1)
- Book (1)
- Part of a Book (4)
- Conference Proceeding (31)
- Contribution to a Periodical (8)
- Lecture (2)
- Internet Paper (1)
- Working Paper (3)
Keywords
- 02 (3)
- 03 (1)
- 04 (1)
- 5G (2)
- AI (2)
- Architektur <Informatik> (1)
- Artificial intelligence (1)
- Auftragsabwicklung (1)
- Auto-ML (2)
- Beratung (1)
- Big Data (2)
- CPS (3)
- Collaborative Planning (1)
- Compliance (2)
- Conversational interfaces (1)
- Cyber Security (2)
- Cyber-Security (1)
- Data ecosystem (1)
- Datenanalyse (1)
- Datenaustausch (1)
- Datenauswertung (1)
- Datenmanagement (1)
- Datensouveränität (1)
- Delphi study (1)
- Digital Transformation (1)
- Digital sovereignty (1)
- Digital technologies (2)
- Digital transformation (1)
- Digitaler Schatten (2)
- Digitalisierung (1)
- Digitalization (2)
- EMISA (1)
- ERP (1)
- Echtzeitfähigkeit (1)
- Energieflexibilitäten (1)
- Energiemanagement (3)
- Energy Management (1)
- Enterprise-Resource-Planning (1)
- Event-driven IT-Architecture (1)
- Fallstudien (1)
- Finanzkrise (1)
- FlAixEnergy (1)
- General Engineering (1)
- Geschäftsmodelle (1)
- GradeIT (1)
- ILN (1)
- IS Landscape (2)
- IS-architecture of manufacturing companies (1)
- IT OT Integration (1)
- IT complexity (3)
- IT organization (1)
- IT-Security (1)
- IT-Service-Management (1)
- IT-Unterstützung (1)
- Industrie 4.0 (9)
- Industrie 5.0 (1)
- Industrie-4.0-Environments (1)
- Industry 4.0 (2)
- Industry 5.0 (1)
- Information System Architecture (1)
- Information Systems Integration (1)
- Information systems (3)
- Informationslogistik (1)
- Informationsqualität (2)
- Informationssicherheit (1)
- Informationstechnologie (1)
- Intelligente Produkte (1)
- Intelligentes Stromnetz (1)
- Internet of Production (1)
- KI (2)
- KMU (1)
- Konzepte (1)
- Krise (1)
- Künstliche Intelligenz (4)
- Lastmanagement (1)
- Lastverteilung <Energietechnik> (1)
- Literature Review (2)
- Load Management (1)
- Machine Learning (2)
- Management (1)
- Management Science and Operations Research (1)
- Manufacturing Companies (2)
- Manufacturing companies (1)
- Maturity Index (1)
- Modellierung (1)
- Natural-Language-Processing (1)
- PLM (2)
- Platform (1)
- ProSense (2)
- Product-Lifecycle-Management (1)
- Product-Service-Systems (1)
- Produktdatenmanagement (1)
- Produktion (2)
- Produktionsplanung (1)
- Produktionssteuerung (1)
- Produktlebenszyklus (2)
- RFID (1)
- Regulation (2)
- Rezepte (1)
- SDM (1)
- SGAM (1)
- SME (1)
- SMEs (2)
- SV7185 (1)
- SV7313 (1)
- SV7427 (1)
- SV7459 (1)
- Self-managed (1)
- Sensorsystem (2)
- Smart Data (1)
- Smart Machines (1)
- Smart Products (2)
- Smartification (1)
- Stammdaten (1)
- Stammdatenmanagement (1)
- Strategy and Management (1)
- Studie (2)
- Störungsmanagement (1)
- Subscription Business (1)
- Subscription Business Models (1)
- Supply-Chain-Management (2)
- Supply-Chain-Planning (1)
- TechFit (2)
- Technologiemanagement (2)
- Trends (1)
- Unternehmensberatung (1)
- Wandlungsfähigkeit (1)
- Whitepaper (1)
- Wirtschaftskrise (1)
- Zielsystem (1)
- agile and learning companies (1)
- agile development (1)
- artificial intelligence (1)
- artificial intelligence lifecycle (1)
- cyber-physische Systeme (1)
- data mining (1)
- decision-maker (1)
- development process (1)
- digital technologies (1)
- digital transformation (2)
- eMobility (1)
- economic quantification (1)
- electric vehicle communication (1)
- energy management (3)
- energy management use cases (1)
- energy monitoring (2)
- fix and intersection point of eMobility (1)
- grid management (1)
- information logistics (1)
- information quality (1)
- information system architecture (1)
- load management (2)
- machine learning (1)
- manufacturing companies (2)
- manufacturing company (2)
- manufacturing industry (1)
- product development process (1)
- renewable energies (1)
- rev (8)
- serious gaming (1)
- smart grid architecture model (1)
- smart products (2)
- smart services (1)
- smartification (1)
- supply chain event management (1)
- text data (1)
- text mining (1)
- use case modeling (1)
Institute
- Informationsmanagement (51)
In the age of digitalization, manufacturing companies are under increased pressure to change due to product complexity, growing customer requirements and digital business models. The increasing digitization of processes and products opens up numerous opportunities for mechanical engineering companies to exploit the resulting potential for value creation. Subscription business is a new form of business model in the mechanical engineering industry that aims to continuously increase customer benefit, aligning the interests of companies and customers. Characterized by permanent data exchange, data-based learning about customer behavior, and the transfer into continuous innovations that increase customer value, subscription business helps to make Industry 4.0 profitable. The fact that machines and plants are connected to the internet and exchange large amounts of data results in critical information security risks. The loss of knowledge and control, data misuse and espionage, as well as the manipulation of transaction or production data are particularly high risks in the context of subscription business. In addition to direct and obvious consequences such as loss of production, attacks are increasingly shifting towards non-transparent, creeping impairments of production or product quality, which only become apparent at a late stage, or towards the manipulation of payment flows. A transparent presentation of possible risks, their scope and their interrelationships does not yet exist. This paper presents a research approach in which the structure of subscription models and their different manifestations are characterized based on their risks and vulnerabilities. This allows suitable cyber security measures to be taken at an early stage. On this basis, companies can secure existing or planned subscription business models and thus strengthen the trust of business partners and customers.
Subscription business transforms the traditional business models of machinery and plant engineering. Many manufacturing companies struggle to exploit the potential created by Industry 4.0 and make it economically usable. In addition to technological innovations, it is necessary to transform the business model. This leads to a shift from ownership-based and product-centric business models to outcome-based business models, which focus on the customer's value and thus realize a unique value proposition and competitive advantage – the outcome economy. Based on a case study analysis among manufacturing companies, this paper provides further clarification, including a definition and the constituent characteristics of subscription business models in machinery and plant engineering.
The adoption of artificial intelligence (AI) technologies in manufacturing companies is challenging, particularly for SMEs that lack the necessary skills to develop and integrate AI-based applications (AI applications) into their existing IT system landscape. To address this challenge, the research project VoBAKI (IGF-Project No.: 22009 N) aims to enable SMEs to identify and close skill gaps related to AI application development and implementation using proper sourcing strategies. This paper presents the interim results from the second phase of the project, which involves identifying the tasks in the lifecycle of AI applications and determining the specific skills required for executing these tasks. The presented results provide a detailed lifecycle including the phases for the development and usage of AI applications, as well as the specific tasks that SMEs must consider when implementing an AI application. These results serve as the foundation for future research regarding the required skills to execute the presented tasks and provide a roadmap for SMEs to close skill gaps and successfully implement AI applications.
The challenges of the future will be shaped by digitally refined products of the highest quality and a high number of variants, combined with ever smaller lot sizes. Conventional development methods are reaching their limits at the product level due to increasing complexity and shortening lifecycles. As a result, in customer-specific production the effort for work planning and preparation grows disproportionately. One possible solution is the generative creation of the production bill of materials during assembly. Through event-based "recording of the assembly", administrative and planning-intensive processes in work preparation are reduced disproportionately, and the creation of the bill of materials is transferred into manual assembly.
Companies in all industries are confronted with ever new requirements on the product development process. To remain competitive, they must offer their customers a greater variety of variants while reducing product development and time-to-market. To achieve these goals, they respond by introducing modular product platforms and establishing globally distributed value creation networks.
Efficient and end-to-end support of business functions requires the integration and harmonious interplay of the IT systems. A mandatory prerequisite for achieving this integration is the standardization and maintenance of the foundation of the system landscape: the master data.
The economic environment of manufacturing companies is currently shaped more than ever by unforeseeable and far-reaching changes. In the future, German industry must master this dynamic on its own. Partially disadvantageous location factors must be compensated in order to secure production in Germany in the long term. Adaptability and real-time capability in processes and structures are the central enablers for mastering the product-production system.
Holistic PLM Model
(2010)
Product Lifecycle Management (PLM) is a widely discussed topic concerning the increase of efficiency of product development in terms of time to market as well as adequately customizing products to the different needs of customers worldwide. Historically, PLM has focused on the early phases of the product's lifecycle, namely the product development phase. The roots of PLM lie in supporting the information logistics of product data: consistent data sets should be available to all stakeholders in the different departments at all times. Due to increasing product complexity, PLM has to be extended in the temporal dimension (not limited to the product development phase) and the systemic dimension (not limited to the information logistics aspect). In this paper the authors derive a holistic framework for Product Lifecycle Management by analysing existing integrated management approaches. The framework consists of four dimensions: PLM strategy, PLM process, product structure and PLM IT architecture. The sustainability and benefits of the framework are demonstrated by applying it to the communication service provider (CSP) industry.
Manufacturing companies face the challenge of managing vast amounts of unstructured data generated by various sources such as social media, customer feedback, product reviews, and supplier data. Text mining, a branch of data mining and natural language processing, provides a solution to extract valuable insights from unstructured data, enabling manufacturing companies to make informed decisions and improve their processes. Despite the potential benefits of text-mining technology, many manufacturing companies struggle to implement use cases for various reasons. Therefore, the project VoBAKI (IGF-Project No.: 22009 N) aims to enable manufacturing companies to identify and implement text mining use cases in their production and decision-making processes. The paper presents an analysis of text mining use cases in manufacturing companies using Mayring's content analysis and case study research. The study explores how text mining can be used effectively to improve production processes and decision-making in manufacturing companies.
The smartification and digital refinement of products to enable the design of smart products is a pivotal challenge in the manufacturing industry. Companies fail to design smart products due to a lack of knowledge of digital technologies and of their integral role in product development processes. This paper presents a methodology that enables the derivation of digital functions for smart products, based on selected manufacturing use cases. We develop a morphology that consists of digital functions for smartification. In this context, we explain and derive its characteristics using a set of examples of smart products in the manufacturing industry. Our methodology reduces the time spent initiating a development project with a focus on smartification.
The number of cyber-attacks on small and medium-sized enterprises (SMEs) is constantly increasing. SMEs often do not recognize the attacks until the damage has occurred. Only then do they respond with measures to increase IT security and IT safety. Many studies conclude that this is due to a lack of budget, expertise and awareness of the need for IT security. There are many compendia with recommendations for action, but they are too comprehensive and unspecific for the individual needs of SMEs. In this paper, we present the results of a research activity on the gaps that address the challenges faced by SMEs. In addition, we develop a concept for a serious gaming approach that includes an economic perspective on IT-security measures and shows how SMEs can derive their own IT-security target state.
Manufacturing companies face the challenge of selecting digitalization measures that fit their strategy. Measures that are initiated without being aligned with the company's strategy carry the risk of failing due to a lack of relevance. This leads to an ineffective use of scarce human and financial resources. This paper presents a target system to help companies select relevant digitalization measures compliant with their strategy for IT-OT integration projects. The target system was developed based on literature research and expert interviews, and later validated in two use cases. The target system considers the goals of production companies and combines them with digitalization measures. The measures are classified by the different maturity levels required for their realization. Thus, the target system enables manufacturing companies to evaluate digitalization measures with regard to their strategic relevance and the Industrie 4.0 maturity level required for their realization. This ensures an effective use of resources.
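The selection logic described in the abstract can be sketched in a few lines. This is an illustrative stand-in, not the authors' actual target system: all goal and measure names are invented, and only the published names of the acatech Industrie 4.0 maturity stages are taken as given.

```python
# Hypothetical sketch: each measure is linked to strategic goals and to the
# maturity level required for its realization; only measures that serve a
# company goal and do not exceed the company's maturity are proposed.
MATURITY = ["computerisation", "connectivity", "visibility", "transparency",
            "predictive capacity", "adaptability"]  # acatech Industrie 4.0 stages

measures = [  # invented example measures
    {"name": "retrofit machine sensors", "goals": {"transparency in production"}, "level": "visibility"},
    {"name": "predictive maintenance",   "goals": {"reduce downtime"},            "level": "predictive capacity"},
]

def relevant_measures(company_goals, company_level):
    """Return measures matching at least one goal and the reachable maturity."""
    max_idx = MATURITY.index(company_level)
    return [
        m["name"] for m in measures
        if m["goals"] & company_goals and MATURITY.index(m["level"]) <= max_idx
    ]

# A company at maturity "visibility" pursuing both example goals:
selected = relevant_measures({"transparency in production", "reduce downtime"}, "visibility")
```

Here "predictive maintenance" is filtered out because it requires a higher maturity stage than the company has reached, mirroring the paper's idea of evaluating measures against both strategy and maturity.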
The number of available technologies is constantly rising, be it additive manufacturing, artificial intelligence (AI) or distributed ledger technologies. The choice of the right technologies may decide the fate of a company. Due to the overwhelming number of information sources, regular technology market research becomes increasingly challenging, especially for SMEs. In order to assist the technology management process, the authors introduce the architecture of an automated, AI-based technology radar. The architecture automatically collects data from relevant sources, assesses the relevance of the respective technology (i.e. its maturity level) and then visualizes it on the radar map.
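The assess-and-visualize step of such a radar can be sketched as follows. This is a minimal illustration under assumed conventions (ring names and thresholds are invented, not taken from the paper): technologies are placed on rings by a maturity score and ordered by how often they appear in the collected sources.

```python
# Hypothetical radar placement: maturity in [0, 1] decides the ring,
# mention counts from collected sources decide the order within a ring.
RINGS = ["hold", "assess", "trial", "adopt"]

def ring_for(maturity: float) -> str:
    """Map an assumed maturity score in [0, 1] to a radar ring."""
    if maturity >= 0.75:
        return "adopt"
    if maturity >= 0.5:
        return "trial"
    if maturity >= 0.25:
        return "assess"
    return "hold"

def build_radar(observations):
    """observations: list of (technology, mention_count, maturity) tuples."""
    radar = {ring: [] for ring in RINGS}
    for tech, mentions, maturity in observations:
        radar[ring_for(maturity)].append((tech, mentions))
    for ring in radar:  # most-mentioned technologies first within each ring
        radar[ring].sort(key=lambda t: -t[1])
    return radar

radar = build_radar([
    ("additive manufacturing", 42, 0.8),
    ("distributed ledger", 17, 0.3),
    ("federated learning", 25, 0.55),
])
```

The automated collection and AI-based relevance assessment described in the abstract would feed the `observations` list; the sketch only shows the final placement logic.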
Numerous traditional, agile and hybrid development approaches have been proposed for the development of CPS. As the choice of development process is crucial to the success of development projects, it has become a major challenge to identify the best-suited process. This paper introduces a methodology for identifying the best-suited CPS development process, based on the individual boundary conditions for a certain development project within a company. The authors used a set of eight indicators to assess a CPS-development project. The results of the assessment were matched with CPS-development approaches. Based on the matching results a best-suited development process was selected. The application is shown for a use case in the German manufacturing industry. The developed method aims to reduce the risk of project failure due to the wrong choice of development process.
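The matching of a project assessment to development approaches can be sketched as a nearest-profile lookup. This is an assumed illustration: the paper uses eight indicators, while the indicator names, reference profiles and the mean-absolute-difference metric below are invented for brevity.

```python
# Hypothetical matching: a project is scored on indicators in [0, 1] and
# matched to the approach whose reference profile is closest.
APPROACH_PROFILES = {  # assumed reference profiles per development approach
    "traditional (V-model)": {"requirements_stability": 0.9, "hardware_share": 0.8, "team_distribution": 0.3},
    "agile (Scrum-based)":   {"requirements_stability": 0.2, "hardware_share": 0.2, "team_distribution": 0.6},
    "hybrid":                {"requirements_stability": 0.5, "hardware_share": 0.5, "team_distribution": 0.5},
}

def distance(project, profile):
    """Mean absolute difference between project and reference profile."""
    keys = profile.keys()
    return sum(abs(project[k] - profile[k]) for k in keys) / len(keys)

def best_suited_process(project):
    return min(APPROACH_PROFILES, key=lambda a: distance(project, APPROACH_PROFILES[a]))

# A hardware-heavy project with stable requirements:
choice = best_suited_process(
    {"requirements_stability": 0.85, "hardware_share": 0.9, "team_distribution": 0.4}
)
```

A hardware-heavy, requirements-stable project lands on the traditional profile, which is the kind of boundary-condition-driven selection the methodology aims at.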
Methods of machine learning (ML) are difficult for manufacturing companies to employ productively. Data science is not their core skill, and acquiring talent is expensive. Automated machine learning (Auto-ML) aims to alleviate this, democratizing machine learning by introducing elements such as low-code or no-code functionalities into its model creation process. Due to the dynamic vendor market of Auto-ML, it is difficult for manufacturing companies to successfully implement this technology. Different solutions as well as constantly changing requirements and functional scopes make a correct software selection difficult. This paper aims to alleviate said challenge by providing a longlist of requirements that companies should pay attention to when selecting a solution for their use case. The paper is part of a larger research effort, in which a structured selection process for Auto-ML solutions in manufacturing companies is designed. The longlist itself is the result of six case studies of different manufacturing companies, following the method of case study research by Eisenhardt. A total of 75 distinct requirements were identified, spanning the entire machine learning and modeling pipeline.
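One way such a requirements longlist could be operationalised in a selection process is a weighted coverage score per candidate solution. The requirement names and weights below are invented examples, not items from the paper's longlist of 75 requirements.

```python
# Hypothetical shortlisting: each company weights its requirements, each
# candidate Auto-ML solution is checked against them, and solutions are
# ranked by the weighted share of requirements they fulfil.
def score(solution_features: set, weighted_requirements: dict) -> float:
    """Weighted share of requirements a solution fulfils."""
    total = sum(weighted_requirements.values())
    met = sum(w for req, w in weighted_requirements.items() if req in solution_features)
    return met / total

requirements = {"no-code UI": 3, "time-series support": 2,
                "on-premise deployment": 2, "model export": 1}
solutions = {  # invented vendors with invented feature sets
    "vendor A": {"no-code UI", "model export"},
    "vendor B": {"no-code UI", "time-series support", "on-premise deployment"},
}
shortlist = sorted(solutions, key=lambda s: score(solutions[s], requirements), reverse=True)
```

The ranking makes the trade-offs between solutions explicit, which is the purpose the longlist serves in the structured selection process the paper describes.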
A transformation is complete once the desired target state has been reached. What about the digital transformation? Given ongoing technological developments, can it ever come to an end? Or does a company find itself in a continuous transformation driven by the further evolution of digitalization? If so, how can a company deal with this constant change efficiently and securely? (Source: https://link.springer.com/chapter/10.1007/978-3-662-63758-6_17)
Summary of the volume:
Networked digitalization has fundamentally changed the manufacturing industry. It continuously opens up new opportunities for manufacturing companies to achieve economic success in an increasingly dynamic competition shaped by the internet. Due to the changed conditions of networked digitalization, however, manufacturing companies must pursue new approaches to organizing the digital transformation: they must shape digitalization management as a new leadership task. In doing so, digitalization management must cover a broad variety of tasks.
This book enables manufacturing companies to shape the digital transformation successfully. To this end, the benefits and workings of the essential tasks of digitalization and information management are presented in a practice-oriented way. A digitalization and information management model, developed specifically for manufacturing companies aiming at a digital transformation, finally links the contents.
This book has been developed as a reference work for executives and decision-makers who want to tackle the challenges of realizing digital business models, digitalized products and digital business processes. The methods in this book help to pursue the right management tasks and to implement them in the company organization. The interfaces between strategic digitalization management and tactical to operational information management are also addressed. The book offers quick and easy access to the most important methods and many supporting examples. It is part of the series "Handbuch Produktion und Management" and complements its framework.
(Source: https://link.springer.com/book/10.1007/978-3-662-63758-6)
Nowadays, the market for information and communication technologies used for IoT applications grows daily. Since companies need technologies to transform their business processes in line with the digital revolution, they need to know which technologies are available and which fit their use case best. Their initial issue is the lack of an overview of technologies suitable for connecting their production or logistics. Hence, this paper presents a methodology for selecting technologies (and combinations of them) based on their functions. It differentiates between information and communication technologies, digital technologies and connecting technologies by their physical function and their role in a cyber-physical system. Because the applicability of every technology varies with the use case, the paper illustrates a ranked qualification of the technologies for typical use cases, focusing on tracking and tracing issues in the intralogistics of producing companies. The evaluation is based on literature research, a market study to identify suitable technologies, and various expert interviews to assess the applicability of the technologies.
Digital technologies have gained significant importance in the course of the fourth industrial revolution and are widely implemented nowadays. However, it is necessary to bear in mind that ill-considered use can quickly have a negative impact on the environment in which a technology is used. For more responsible and sustainable use, the regulation of digital technologies is therefore necessary today. Since governments are responding very slowly, as the example of the AI Act shows, companies need to take action themselves. In this context, one of the central questions for companies is: "Which digital technologies are relevant for manufacturing companies in terms of regulation?" To answer this question, this paper presents a quantitative Delphi study. The results of the Delphi study are presented and evaluated within the framework of a data analysis. In addition, it is discussed how to proceed with the results so that manufacturing companies can benefit from them. Furthermore, the paper contributes to the development of an AI platform in the German research project PAIRS by investigating the compliance relevance of artificial intelligence applications.
Feasibility Analysis of Entity Recognition as a Means to Create an Autonomous Technology Radar
(2021)
Keeping up with the latest technology trends is a crucial task for manufacturing companies in order to remain successful in a globally competitive market. Creating a technology radar is an established, but mostly manual, process for visualizing the latest technology trends.
The project TechRad addresses the challenge of identifying and visualizing technologies by using machine learning to realize an autonomous technology scouting radar. One of its core functions is the identification of technologies in text documents, realized through natural language processing (NLP).
This paper summarizes the challenges and possible solutions for using entity recognition to identify relevant technologies in text documents. The authors present an early stage of the implementation of the entity recognition model. This includes the selection of transfer learning as a suitable method, the creation of a dataset composed of different data sources, and the applied model training process. Finally, the performance of the chosen method is tested and evaluated in a series of experiments.
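The project itself trains a transfer-learning NER model; as a minimal stand-in for the identification task, the sketch below uses a simple gazetteer matcher. The technology lexicon and the example sentence are assumptions for illustration only.

```python
# Hypothetical baseline for the task "find technology mentions in text":
# case-insensitive matching against an assumed lexicon of technology names.
# A trained NER model, as in TechRad, would generalize beyond such a list.
import re

TECH_LEXICON = {"machine learning", "digital twin", "5G", "edge computing"}

def find_technologies(text: str):
    """Return (technology, start_offset) pairs found in the text."""
    hits = []
    for tech in TECH_LEXICON:
        for m in re.finditer(re.escape(tech), text, flags=re.IGNORECASE):
            hits.append((tech, m.start()))
    return sorted(hits, key=lambda h: h[1])

doc = "Edge computing and 5G enable the digital twin on the shop floor."
entities = find_technologies(doc)
```

The limitation of such a gazetteer (it only finds terms it already knows) is precisely why the paper opts for transfer learning on an annotated dataset instead.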
The digital transformation is changing the way companies think about and design their manufacturing environment. Due both to the increasing number of connections between IoT devices, tooling machines and production lines, and to the convergence of IT and OT, systems are becoming more complex than they were years ago. Organizational and cultural changes within manufacturing companies strengthen this trend and form Industry 4.0 environments and cyber-physical production systems (CPPS). As these systems no longer stand alone but are connected to each other and to the outside of the company, the size of the potential attack surface is increasing as well. Beyond that, manufacturing companies, small and medium-sized ones in particular, face complex challenges based on a lack of knowledge, budget and time to understand and interpret their current situation and risk level and, therefore, to derive the necessary counter-measures. Efficient and pragmatic tools and methods for these companies do not exist. This paper presents a research approach in which the company-specific set-up of Industry 4.0 environments and CPPS is characterized by its potential vulnerabilities. This enables companies to evaluate their risk potential before setting up such environments and to understand the potential consequences more precisely. By doing so, companies can derive and prioritize important counter-measures and thus strengthen their level of cyber-security efficiently. This will decrease the number of cyber-security attacks and increase the company's competitiveness.
Klar Schiff
(2009)
As part of this study, the Forschungsinstitut für Rationalisierung e. V. at RWTH Aachen and the University of St. Gallen (Chair of Production Management) examined 24 publications by 11 consulting firms. More than 200 statements on coping with the crisis were identified and analyzed in an evaluation-neutral manner.
Assessment of IS Integration Efforts to Implement the Internet of Production Reference Architecture
(2018)
As part of a collaborative network, manufacturing companies are required to be agile and to accelerate their decision making. To do so, a large amount of data is available and needs to be utilized. To enable this from a company-internal information system perspective, the Internet of Production (IoP) describes a future information system (IS) architecture. The core element of the IoP is a digital platform that forms the basis for a network of cognitive systems. To implement and continuously develop the IoP, manufacturing companies need to make architecture-related decisions concerning the accessibility of data, the processing of that data, and the visualization of the resulting information. The goal of this research is the development of a decision-support methodology for making those decisions, taking the evaluated IS integration effort into consideration. To this end, this paper describes the allocation of IS functions and identifies the effort drivers for the respective IS integration by analyzing the integration possibilities. Finally, this approach is validated in a case study.
Methods of machine learning (ML) are notoriously difficult for enterprises to employ productively. Data science is not a core skill of most companies, and acquiring external talent is expensive. Automated machine learning (Auto-ML) aims to alleviate this, democratising machine learning by introducing elements such as low-code / no-code functionalities into its model creation process. Multiple applications are possible for Auto-ML, such as natural language processing (NLP), predictive modelling and optimization. However, employing Auto-ML still proves difficult for companies due to the dynamic vendor market: the solutions vary in scope and functionality, while providers do little to delineate their offerings from related solutions like industrial IoT platforms. Additionally, current research on Auto-ML focuses on the mathematical optimization of the underlying algorithms, with diminishing returns for end users. The aim of this paper is to provide an overview of available, user-friendly ML technology through a descriptive model of the functions of current Auto-ML solutions. The model was created based on case studies of available solutions and an analysis of the relevant literature. This method yielded a comprehensive function tree for Auto-ML solutions, along with a methodology to update the descriptive model in case the dynamic provider market changes. Thus, the paper catalyses the use of ML in companies by providing companies and stakeholders with a framework to assess the functional scope of Auto-ML solutions.
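A function tree like the one the paper describes can be represented very simply, and a solution's functional scope checked against it. The branches and leaves below are a small invented excerpt, not the paper's actual model.

```python
# Hypothetical excerpt of a function tree for Auto-ML solutions:
# branches group leaf functions; a solution's scope is the fraction of
# leaves it covers per branch.
FUNCTION_TREE = {
    "data preparation": ["data cleaning", "feature engineering"],
    "modelling": ["algorithm selection", "hyperparameter optimisation"],
    "deployment": ["model export", "monitoring"],
}

def functional_scope(solution_functions: set) -> dict:
    """Per branch, the fraction of leaf functions a solution covers."""
    return {
        branch: sum(leaf in solution_functions for leaf in leaves) / len(leaves)
        for branch, leaves in FUNCTION_TREE.items()
    }

# An invented solution strong in modelling but without deployment support:
scope = functional_scope({"data cleaning", "algorithm selection",
                          "hyperparameter optimisation"})
```

Updating the descriptive model when the vendor market changes then amounts to editing the tree, while the scoring stays the same.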
Growing information systems (IS) often come along with growing IT complexity because of emerging patchwork landscapes. This development causes rising IT costs and dependencies, which hinder the maintenance and expansion of the IS landscape. This article outlines the current research on published and presented methods to manage rising IT complexity in a literature review. Because definitions of "IT complexity" vary widely in the literature, the paper also includes a definition of the term. It further presents the research methodology used. Subsequently, it presents the findings in the literature, highlights the research gap and, based on the literature analysis, presents the steps that need to be taken. A discussion of the results and a summary complete the article.
In this paper, an approach towards energy management 4.0 is presented. Energy management 4.0 is understood as an encompassing, energy-data-based concept for manufacturing companies acting in a flexible energy grid of the future, with the final goal of autonomous self-optimization. Controlling, supervising and scheduling production and logistics steps is executed based on a reliable communication infrastructure and real-time data in order to achieve maximum profitability with regard to the human factor.
Guided by the four maturity levels of the "acatech Industrie 4.0 Maturity Index" developed by the German National Academy of Science and Engineering (acatech), different use cases are presented according to the steps of visibility, transparency, prognostic capacity and self-optimization. The basic idea of energy management 4.0 is described, and an outlook on the further steps that need to be evaluated for an implementation is given.
In the context of Industrie 4.0, capturing the data generated in production and putting it to use is of central importance. Analyses of operational data, generated at different levels, allow conclusions and insights for better decision-making. The basis for applying methods of data analysis and evaluation is a sufficiently accurate representation of the relevant data, the digital shadow, in order processing, production, development and adjacent areas.
This paper presents a model for the digital shadow in order processing, which forms the basis for the implementation of data analytics methods.
In the course of networked digitalization, small and medium-sized IT organizations and IT service providers in particular face the major challenge of delivering high-quality services in an increasingly dynamic environment. Linking these services with the business processes and business models they support is difficult and requires a service- and process-oriented mindset.
To master these challenges and implement this service- and process-oriented thinking, IT service management (ITSM) offers methods and measures for the customer-oriented, process-driven and transparent delivery of IT services. Despite existing ITSM-specific reference models and frameworks, the described methods are hardly used by small and medium-sized IT organizations and IT service providers. One reason is the high complexity of the frameworks and the associated implementation effort. What is missing is an approach that takes the capabilities and resources of small and medium-sized enterprises (SMEs) into account, allowing them to independently assess and optimize their IT processes with regard to service orientation.
The result of the research project "GradeIT" is a procedure that supports SMEs in identifying the IT service processes relevant to them, assessing them independently, and deriving promising recommendations for action on the basis of transparently presented cause-effect relationships for specific influencing factors.
Due to the drastically increasing amount of data, decision-making in companies relies heavily on having the right data available. Likewise, because of the increasing complexity of structures and processes, quick and precise flows of information are becoming more important.
This paper introduces a new approach for modelling information flows, creating a basis for efficient information management. It can be used to structure information requirements and to identify gaps in information processing.
To demonstrate its benefits, the proposed Information Logistics Notation (ILN) is applied to the information logistics of today's and future energy market and grid stability management, both processes of increasing complexity.
The manufacturing industry has to exploit trends like "Industrie 4.0" and digitization not only to make production more efficient, but also to create and develop new and innovative business models. New business models ensure that even SMEs are able to open up new markets and attract new customers. This means that in order to stay competitive, SMEs must transform their existing business models.
The creation of new business models requires smart products. SMEs alone cannot provide the data base required for new business models, but smart products can provide that foundation through the smart data and smart services they enable. These services then expand the functions and functionality of smart products and define new business models.
However, the development of smart products by small and medium-sized enterprises is still lined with obstacles. For the product development process, the inclusion of smart products means that new domains unfamiliar to SMEs enter the process. Although there are many models of this process, they largely fail to take into account the competencies enabled by the implementation of digital technologies. Hence, several SME-supporting approaches fail to address the two major challenges these enterprises face. This paper describes valid objectives, including the relevant stakeholders and their allocation to the phases of the product life cycle.
Within each objective, the potential benefit for customers and producers is analyzed. The model given in this paper helps SMEs to define the initiation of a product development project more precisely and thus also eases project scoping and targeting for the smartification of an already existing product.
In order to introduce load management in the manufacturing industry, some obstacles need to be addressed. This paper presents a feasible approach for implementing load management measures in companies.
To this end, load management and energy management are explained and distinguished in a first step. Subsequently, the implementation method is introduced, enabling companies to use load management measures and significantly reduce their energy costs. In the second part of the paper, the introduced approach is applied.
To this end, a use case of a manufacturing company is described. Alongside energy analyses based on consumption data, specific measures are presented.
Nowadays, cyber-physical systems help improve efficiency in intralogistics by autonomously controlling and manipulating the production and logistics environment. Due to the complexity of individual production processes, designing suitable cyber-physical systems based on the existing production environment is a challenge for companies.
This paper presents a new methodology for conceptually designing cyber-physical systems to suit an individual production environment. Compared to existing design approaches, this methodology directly matches the required functions to existing information and communication technology (ICT) components while insisting on a neutral capture of requirements.
To this end, the requirement specification relates the functions needed to the functions offered by ICT components. The paper focuses on the use case of implementing a cutting-edge mobile network technology into an existing tracking-and-tracing process.
In order to introduce load management in the manufacturing industry, some obstacles need to be addressed. This paper presents a feasible approach for implementing load management measures in companies. To do so, load management and energy management are explained and distinguished in a first step. Subsequently, the implementation method is introduced, enabling companies to use load management measures and reduce their energy costs significantly.
Managing information and the IT systems in which it is stored is becoming a crucial capability for industry. However, companies are struggling to manage the various requirements and frequent changes of technology; IT complexity has thus become a major challenge. At the same time, manufacturing companies in particular are striving to implement Industrie 4.0 concepts. Many of them have even developed an Industrie 4.0 roadmap comprising various projects to transform the company. Companies can develop such roadmaps by applying the Industrie 4.0 Maturity Index, which gives a broad view of the capabilities necessary for Industrie 4.0.
In our research, we analyzed data sets from more than ten manufacturing companies that have performed an Industrie 4.0 maturity assessment. Our hypothesis was that IT complexity challenges significantly hinder the implementation of Industrie 4.0 roadmaps. We were able to confirm this hypothesis, at least for the companies analyzed, and provide insights into the specific challenges. Based on our analysis, we conclude the article with concrete recommendations on how to tackle IT complexity.
Digital technologies such as 5G, augmented reality, and artificial intelligence (AI) are currently being used in various ways by manufacturing companies. As the fourth industrial revolution progresses, it has become apparent that reckless use and inadequate regulation of these technologies have a detrimental effect on the environments in which they are employed. Regulation of digital technologies is therefore imperative today to ensure more responsible and sustainable use. While governments usually establish regulations, progress is not keeping pace with the demands and hazards of employing digital technologies. The European AI law serves as an example of the considerable distance yet to be covered before binding guidelines are established. Consequently, companies must take proactive measures today to ensure that they use digital technologies responsibly in their environments. In this context, it is crucial to identify which digital technologies are pertinent to manufacturing companies in terms of regulation. Furthermore, a comprehensive approach is required to design compliance holistically for digital technologies and to systematically derive the corresponding guidelines. This paper introduces a set of models that not only determine the importance of compliance in the application of different technologies but also present a framework for methodically designing compliance. Furthermore, the paper contributes to the development of an AI platform in the German research project PAIRS by investigating the compliance relevance of applications such as artificial intelligence.
The main challenge in all application areas of EV usage is still the storage of energy within, as well as the transmission of energy into, an EV. However, this storage and transmission of energy also allows for synergies with a smart grid if information is adequately exchanged between the roles of the energy and mobility sectors. Since energy transmission is a so-called "fixed and intersection point" of e-mobility, interoperability is required not only on an electrical level (e.g. plugs) but also on an informational level. Standardization efforts are currently underway (e.g. ISO 15118), yet a comprehensive, consolidating view of the information system around energy transmission is missing. Therefore, this paper suggests a generic information system architecture for e-mobility (EM-ISA) derived from the Smart Grid Architecture Model (SGAM). EM-ISA is intended as a basis for companies to develop innovative services for their particular ICT-enabled e-mobility application area while remaining informationally interoperable at the fixed and intersection point of energy transmission.
The high-resolution supply chain management approach allows companies to react to dynamic market influences in real time.
Real-time-capable planning and control processes can reduce planning effort while improving planning quality through the real-time information available.
Planning processes based on real-time information require a consistent exchange of information between the different planning levels as well as a high degree of autonomy within the individual planning instances.
PLM meets ERP
(2013)
Managing the product lifecycle is a complex task whose full potential is only realized through the integration of the entire company. To ensure the involvement of all departments, a potential analysis is necessary in which challenges and possible improvements along the entire product lifecycle are examined. The PLM QuickCheck, jointly developed by FIR at RWTH Aachen University and WZL of RWTH Aachen University, offers one possible approach.
The term "Digital Shadow" denotes a sufficiently accurate digital image of a company's processes, information, and data. This image is needed to create a real-time-capable evaluation basis of all relevant data, from which recommendations for action can ultimately be derived. Creating the Digital Shadow is thus a central field of action of Industrie 4.0 and forms the foundation for all further activities.
Digital technologies have become an essential part of the value chain in industrial practice. Over the past decades, digitalization has influenced production and the modern workplace in a way no other technical development can match, and it is now paving the way for the fourth industrial revolution.
The essence of Industrie 4.0 is the networking of production systems by means of IT and the Internet of Things in order to become capable of prediction and to make production more efficient and flexible. Key enablers of this vision are data from processes, equipment, and resources, from which decision-critical information is derived for the company. From this, insights can be gained that bring previously hidden cause-and-effect relationships to light.
Based on these insights, prediction models calculate possible future scenarios and assign probabilities to their occurrence. By networking the information of different tasks, functions, and domains, recommendations for action can be substantiated while a vast number of relevant parameters is taken into account. Much like in motor racing, production is shown an ideal line it can follow to achieve optimized results in the shortest possible time.
In recent years, supply chain participants have increasingly suffered the effects of disturbances in transportation supply chains. Both dynamics in consumer demand and global supply chains lead to a growth in unplanned supply chain events. These range from rather manageable disturbances to complete breakdowns of transportation chains, resulting in high follow-up and penalty costs.
Consequently, concepts for efficient supply chain disturbance management are needed, preferably with real-time identification of and reaction to disturbance events. The following paper therefore presents the results of the German research project Smart Logistic Grids, focusing on the design of an integrated model for real-time disturbance management in transportation supply networks. This includes the introduction of elaborated classification models for disturbances and action patterns as well as an associated cost and performance measurement system. Finally, a procedure model for disturbance management is presented.
Systematization models for tailor-made sensor system applications and sensor data fit in production
(2015)
Industrial digitalization to realize smart factories is driven by an informational base of high-resolution data provided by sensor systems at the shop-floor level. The challenge of the technical availability of fitting measurement solutions has nowadays turned into the struggle of finding the optimal solution for a specific task in an ever-growing sensor market. This paper analyzes and specifies the models necessary to systematically derive and describe organizational, technical, and informational requirements for sensor system applications, increasing the technological fit for faster integration and lower misinvestment rates.
Companies are transforming from transactional sales to providing solutions for their customers. In most cases, smart products, which enable companies to enhance their products by providing smart services to their customers, are a key building block in this transformation. However, the development of a smart product requires digital skills and knowledge that many companies do not have. To facilitate the design and conceptualization of smart products, this paper presents a use-case-based information systems architecture prototype for smart products. Furthermore, the paper features the application and evaluation of the architecture in two different smart product projects. Using such an architecture as a reference in smart product development is a considerable advantage and accelerator for inexperienced companies, allowing faster entry into this new field of business. [https://link.springer.com/chapter/10.1007/978-3-031-14844-6_16]
With the development of publicly accessible broker systems within the last decade, the complexity of data-driven ecosystems is expected to become manageable for self-managed digitalisation. Having identified event-driven IT architectures as a suitable answer to the architectural requirements of Industry 4.0, the producing industry is now offered a relevant alternative to prominent third-party ecosystems. Although the technical components are readily available, the realisation of an event-driven IT architecture in production is often hindered by a lack of reference projects and hence by uncertainty about its success and risks. The research institute FIR and the IT experts of synyx are therefore developing an event-driven IT architecture in the Center Smart Logistics' producing factory, which is designed to be a multi-agent testbed for members of the cluster. Drawing on the experience gained in industrial projects, a target IT architecture was conceptualised that proposes a solution for a self-managed data ecosystem based on open-source technologies. Through the iterative integration of factory-relevant Industry 4.0 use cases, the target is continuously realised and validated. The paper presents the developed solution for a self-managed event-driven IT architecture and discusses the implications of the decisions made. Furthermore, the progress of two use cases, namely an IT-OT integration and a smart product demonstrator for the research project BlueSAM, is presented to highlight the iterative technical implementability and the merits enabled by the architecture.
Industry 4.0 is driven by cyber-physical systems and smart products. Smart products provide value to both their users and their manufacturers through a closer connection to the customer and their data as well as through the smart services provided. However, many companies, especially SMEs, struggle with the transformation of their existing product portfolio into smart products. To facilitate this process, this paper presents a set of smart product use cases from a manufacturer's perspective. These use cases can guide the definition of a smart product and be used during its architecture development and realization. The paper first gives an introduction to the field of smart products. After that, the research results, based on case-study research, are presented, including the methodological approach and the case-study data collection and analysis. Finally, a set of use cases, their definitions, and their components are presented and highlighted from the perspective of a smart product manufacturer.
Artificial intelligence (AI) has steadily developed into a topic of strategic priority for companies over recent years. This is reflected not least in the increased willingness of German companies to invest in AI projects. Economic actors have recognized that the sensible use of AI technologies can yield competitive advantages. The present study focuses on the industrial use of an AI technology already employed successfully by many companies today: natural language processing (NLP). The economic potential of the technology lies in its ability to automate business processes and to improve and simplify the interface between humans and machines. The aim of the study is to make the potential of NLP technology usable for companies by presenting concrete use cases and general recommendations for action, as well as benefits and risks.
Principles for the successful implementation of AI business model innovations
In times of increasing global competition and highly interconnected value chains, artificial intelligence is becoming an ever more important competitive factor for companies in Germany as a business location. Using AI methods not only allows internal business processes to be optimized at lower cost, but also opens up new digital business fields and models. On the one hand, trends can be identified that the use of AI in German companies follows. On the other hand, AI affects the various dimensions of innovative business models to differing degrees. Overall, principles can be derived that describe the successful implementation of AI business model innovations.
New technology and application trends characterize AI usage
The actual AI landscape in the value chains of AI-using companies is characterized by trends, which can be divided into technology trends and application trends. Expert interviews show, for example, that AI applications are preferably developed and deployed on cloud infrastructures, which in turn brings the question of preserving data sovereignty to the fore. AI tends to be applied for prediction and monitoring.
Six principles influence the successful implementation of AI business model innovations
Case studies across a broad spectrum of the German economy shed light on which aspects of an AI-based business model have the greatest effect on the company. A particularly strong influence of AI can be observed on the value proposition of companies' novel digital offerings to their customers. Six success principles for the successful implementation of AI technologies can thus be identified, aimed at further increasing the economic use of AI for companies in Germany in global competition. For example, in addition to selecting the right AI use case, it is advisable to ensure that the AI application benefits both providers and users. These and further success principles are described in detail in the study "Künstliche Intelligenz – Geschäftsmodellinnovationen und Entwicklungstrends".