Machine learning (ML) methods are difficult for manufacturing companies to employ productively: data science is not their core competence, and acquiring talent is expensive. Automated machine learning (Auto-ML) aims to alleviate this, democratizing machine learning by introducing elements such as low-code or no-code functionalities into the model creation process. The dynamic Auto-ML vendor market, however, makes it difficult for manufacturing companies to implement this technology successfully: diverse solutions as well as constantly changing requirements and functional scopes complicate a correct software selection. This paper addresses that challenge by providing a longlist of requirements that companies should pay attention to when selecting a solution for their use case. The paper is part of a larger research effort in which a structured selection process for Auto-ML solutions in manufacturing companies is designed. The longlist itself is the result of six case studies of different manufacturing companies, following Eisenhardt's method of case study research. A total of 75 distinct requirements were identified, spanning the entire machine learning and modeling pipeline.
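The paper's selection process itself is not reproduced here. One common way such a requirements longlist is operationalized is a weighted-scoring (utility value) analysis of candidate solutions; a minimal sketch, in which all vendor names, requirement names, and weights are invented for illustration:

```python
def rank_solutions(solutions: dict, weights: dict) -> list:
    """Rank candidate solutions by weighted requirement fulfilment.

    solutions: {name: {requirement: fulfilment score in [0, 1]}}
    weights:   {requirement: importance weight}
    Requirements missing from a solution's profile count as 0.
    Returns (name, total_score) pairs sorted best-first.
    """
    ranked = []
    for name, profile in solutions.items():
        total = sum(weights[req] * profile.get(req, 0.0) for req in weights)
        ranked.append((name, round(total, 3)))
    ranked.sort(key=lambda pair: pair[1], reverse=True)
    return ranked

# Hypothetical example: two requirements from a longlist, two vendors.
weights = {"no-code model creation": 0.6, "anomaly detection support": 0.4}
candidates = {
    "Vendor A": {"no-code model creation": 1.0, "anomaly detection support": 0.2},
    "Vendor B": {"no-code model creation": 0.5, "anomaly detection support": 1.0},
}
print(rank_solutions(candidates, weights))
# → [('Vendor B', 0.7), ('Vendor A', 0.68)]
```

In practice the weights would come from the company's prioritization of the 75 requirements, not from the two placeholder values above.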
More and more manufacturing companies are starting to transform their transaction-based business model into a customer-value-based subscription business to monetize the potential of digitization in times of saturated markets. However, historically evolved, linear acquisition processes, focused on transaction-oriented product sales, substantially impede this development. Elemental features of the subscription business such as recurring payments, short-term release cycles, data-driven learning, and a focus on customer success are not considered in this approach. Since existing transaction-driven acquisition approaches are not successfully applicable to the subscription business, a systematic approach to an acquisition cycle for the subscription business in the manufacturing industry is presented, aiming at a long-term participative business. Applying a grounded theory approach, a task-oriented model for the manufacturing industry was developed.
The model, consisting of five main tasks and 14 basic tasks, serves as a best practice to support manufacturing companies in adapting or redesigning acquisition activities for their subscription business models.
Producing companies are confronted with a growing number of product ramp-ups, since product life cycles are decreasing and product diversity is increasing. Production Planning and Control (PPC) of ramp-up products is particularly challenging, as there is a significant lack of reliable experience data.
The information deficit is exceptionally high for the first step of the PPC process, namely Production Program Planning (PPP). This paper proposes an innovative approach to cybernetic PPP that enables companies with numerous ramp-ups to design reliable and fast PPP processes that can react highly adaptively to unpredictable environmental disturbances. The Viable System Model (VSM) is used as a frame of reference for the design of PPP processes in line with principles from management cybernetics.
As industrial service portfolios grow, many companies overlook the implications for their business operations: rising complexity and the resulting complexity costs. One reason is the lack of tools that help service managers decide, with adequate effort during planning phases, about the implications that variety and complexity decisions have on the complexity costs of their portfolio. This paper depicts the challenges service companies face in this context and presents the concept of a heuristic approach to evaluating the complexity costs of industrial services. The concept is being developed in close cooperation with industrial partners.
For a considerable time, European companies in the capital goods industry have experienced stagnating growth in material goods markets. Moreover, increasing international competition forces European companies to improve their market position. In order to stay successful, an increasing number of companies transform their businesses from manufacturer to service provider. Unfortunately, the number of companies that manage to turn their portfolio change into a competitive advantage is comparatively low. This paper therefore focuses on the development of a framework for positioning as an industrial services provider. In addition, it provides support for management in shaping the changes that accompany the transformation.
This paper presents a simulation approach for service production processes on the basis of which an optimal operating point for service systems can be identified. The approach specifically takes into account the characteristics of human behavior. The simulation is based on a system theory approach to the service delivery process. A specific use case of the simulation approach is presented in detail to illustrate how characteristic curves are deduced and an optimal operating point is obtained.
Production systems are exposed to increasing planning-related uncertainty and susceptibility to disturbances. Inter-company coordination has not been sufficiently considered in contemporary concepts of supply chain management. Against this background, it is crucial to provide a suitable tool that increases the planning capability of the players and the robustness of the supply chain as a whole. This article therefore identifies the relevant causes and effects of planning uncertainties within production planning and, based on these, presents an inter-company supply chain planning concept.
Companies in high-wage countries are increasingly confronted with the challenge of optimizing economies of scope and economies of scale simultaneously to succeed in a global marketplace. An integrated assessment of production systems facing this challenge is essential to evaluate the actual state of a company and to provide a basis for drawing the right conclusions to reconfigure production systems successfully.
In this paper, an integrated model for measuring economies of scope as well as economies of scale is introduced, defining the fundamental domains of a production system. The major objectives resulting from the overall scale-scope dilemma are broken down for each domain, and the main dimensions for an assessment of each domain are defined. A new measure, named Degree of Efficiency, is defined, quantifying the fulfillment of the opposing objectives in each domain and hence the contribution to an overall resolution of the scale-scope dilemma.
Dealing efficiently with the dynamic environment of production industries is one of the most challenging tasks of supply chain management in high-wage countries. Relevant, current information is still not used sufficiently to handle the influence of the dynamic environment on intra- and inter-company order processing adequately. Among other things, the problem is caused by missing or delayed feedback of relevant data. As a consequence, planning results differ from the actual situation of production. High Resolution Supply Chain Management describes an approach aiming at high information transparency in supply chains in combination with decentralized, self-optimizing control loops for Production Planning and Control. The final objective is to enable manufacturing companies to produce efficiently and to react to order variations at any time, which requires highly flexible process structures.
Companies in the manufacturing sector are confronted with an increasingly dynamic environment. Thus, corporate processes and, consequently, the supporting IT landscape must change. This need is not yet fully met in the development of information systems. While best-of-breed approaches are available, monolithic systems that no longer meet the manufacturing industry's requirements are still prevalent in practice. A modular structure of IT landscapes could combine the advantages of individual and standard information systems and meet the need for adaptability. At present, however, there is no established standard for the modular design of IT landscapes in the field of manufacturing companies' information systems. This paper presents different approaches to the modular design of IT landscapes and information systems and analyzes their objects of modularization. For this purpose, a systematic literature review is carried out in the subject area of software and modularization. Starting from the V-model as a reference model, a framework for different levels of modularization was developed, identifying that most scientific approaches carry out modularization at the data-structure-based and source-code-based levels. Only a few sources address modularization at the software-environment-based and software-function-based levels. In particular, no domain-specific application of these levels of modularization, e.g. for manufacturing, was identified. (Literature base: https://epub.fir.de/frontdoor/index/index/docId/2704)
Pricing is one of the most important, but underestimated, tools to enhance a company's profitability. Value-based pricing in particular has high potential to reach higher levels of satisfaction because it balances the needs of providers and customers. Even though it is a well-known pricing model and promises higher satisfaction, many companies struggle to implement it. The manufacturing industry in particular is characterized by cost-plus pricing and competition-based pricing. For digital products, however, these pricing strategies are insufficient. Therefore, this paper aims at exploring the design fields for value-based pricing of digital products in the manufacturing industry. To achieve this, the basics of digital products and value-based pricing are explored. Furthermore, an expert workshop is conducted that follows a framework for value-based pricing consisting of four consecutive steps (analysis, price strategy, pricing, and market launch) to capture the design fields. The paper concludes with limitations as well as practical and research implications.
Reinforced by the pandemic and shaped by digitalization, today's professional working environment is in a state of transformation. Working remotely has become a vital component of many professions' regular routines. The design of remote work environments presents challenges to organizations of all sizes. By providing a classification, this paper conveys a comprehensive understanding of the fields of design to be considered to establish lasting remote work concepts in organizations. A hierarchical classification with four dimensions (human, technology, organization, and culture), seven design elements, and twenty design parameters indicates to organizations the fields of design that need to be examined. To serve both the theoretical foundation and practical application, the design elements are derived through a systematic literature review representing key areas of interest for remote work. Additionally, they are verified and complemented by dedicated case study research to incorporate practice-oriented design parameters.
Gap Analysis for CO2 Accounting Tool by Integrating Enterprise Resource Planning System Information
(2023)
Detailed carbon accounting is the foundation for reducing CO2 emissions in manufacturing companies. However, existing accounting approaches are primarily based on manual data preparation, although manufacturing companies already have a variety of IT systems and resulting data available. A gap analysis carried out on the basis of the GHG Protocol and a reference ERP system shows how much of the information required for CO2 accounting can be integrated from an ERP system. The ERP system can cover 20 % of the required information. Information availability can be increased to 49 % through additionally identified modifications of the ERP system. Integrating the CO2 accounting tool with other systems of the IT landscape, e.g. an Energy Information System, enables a further increase.
Manufacturing companies face the challenge of managing vast amounts of unstructured data generated by various sources such as social media, customer feedback, product reviews, and supplier data. Text mining, a branch of data mining and natural language processing, provides a means to extract valuable insights from unstructured data, enabling manufacturing companies to make informed decisions and improve their processes. Despite its potential benefits, many manufacturing companies struggle to implement text mining use cases for various reasons. The project VoBAKI (IGF Project No. 22009 N) therefore aims to enable manufacturing companies to identify and implement text mining use cases in their processes and decision-making. This paper presents an analysis of text mining use cases in manufacturing companies using Mayring's content analysis and case study research. The study explores how text mining can be used effectively to improve production processes and decision-making in manufacturing companies.
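The paper's method is qualitative (content analysis and case studies) and is not reproduced here. As a minimal, generic illustration of how text mining surfaces signals in unstructured feedback, a term-frequency sketch can be given; the sample texts and stop-word list below are invented:

```python
import re
from collections import Counter

# Hypothetical minimal stop-word list; real pipelines use curated lists.
STOP_WORDS = {"the", "is", "a", "and", "of", "on", "was", "again"}

def top_terms(texts, n=3):
    """Return the n most frequent non-stop-word terms across all texts."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return [term for term, _ in counts.most_common(n)]

# Invented customer-feedback snippets.
feedback = [
    "Delivery was late and the delivery packaging was damaged.",
    "Late delivery again; support response was slow.",
    "Packaging damaged on arrival.",
]
print(top_terms(feedback))  # → ['delivery', 'late', 'packaging']
```

Even this crude frequency count points a service manager at recurring themes ("delivery", "packaging"); production use cases would add stemming, phrase detection, and topic modeling on top.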
With the development of publicly accessible broker systems within the last decade, the complexity of data-driven ecosystems is expected to become manageable for self-managed digitalisation. Having identified event-driven IT architectures as a suitable solution for the architectural requirements of Industry 4.0, the producing industry is now offered a relevant alternative to prominent third-party ecosystems. Although the technical components are readily available, the realisation of an event-driven IT architecture in production is often hindered by a lack of reference projects and, hence, uncertainty about its success and risks. The research institute FIR and the IT expert synyx are thus developing an event-driven IT architecture in the Center Smart Logistics' producing factory, which is designed to be a multi-agent testbed for members of the cluster. With the experience gained in industrial projects, a target IT architecture was conceptualised that proposes a solution for a self-managed data ecosystem based on open-source technologies. Through the iterative integration of factory-relevant Industry 4.0 use cases, the target is continuously realised and validated. The paper presents the developed solution for a self-managed event-driven IT architecture and the implications of the decisions made. Furthermore, the progress of two use cases, namely an IT-OT integration and a smart product demonstrator for the research project BlueSAM, is presented to highlight the iterative technical implementability and the merits enabled by the architecture.
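The concrete broker technology used in the project is not specified above. The underlying publish/subscribe pattern can be illustrated with a minimal in-process event bus; this is a didactic stand-in only (topic names invented), and a production setup would use a real broker such as Kafka or an MQTT broker:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process publish/subscribe bus (stand-in for a real broker)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every handler registered for the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
# Hypothetical IT-OT use case: shop-floor sensors publish, planning consumes.
bus.subscribe("machine/temperature", received.append)
bus.publish("machine/temperature", {"machine_id": "M1", "celsius": 72.5})
print(received)  # → [{'machine_id': 'M1', 'celsius': 72.5}]
```

The decoupling shown here (publishers know topics, not consumers) is the property that makes event-driven architectures attractive for incrementally integrating Industry 4.0 use cases.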
Assets of integrated production systems, especially in heavy industry, face high requirements in terms of reliability and availability. In case of a component breakdown, the operating firm is confronted with high costs due to downtime and loss of production. Modern maintenance concepts in combination with advanced technologies can help to improve plant availability and reduce the downtime costs caused by unplanned breakdowns. Against this background, the research institutes FIR and IMR of RWTH Aachen University, Germany, are collaborating within the research project “SiZu”. The project deals with the integration of a condition monitoring system and real-time simulation to assess the condition of components and to support failure cause analysis.
Driven by increasingly complex value creation networks, more and more event-based systems are being used for decision support. One example of a category of event-based systems is supply chain event management. Its aim is to enable the best possible reaction to critical exceptional events based on event data. The central element is the event, which represents the information basis for mapping and matching the process flows in event-based systems. However, since data quality is insufficient in numerous application cases and the identification of incorrect data in supply chain event management has received little attention in the literature, this paper deals with the theoretical derivation of the data attributes necessary for identifying incorrect event data. In particular, the types of errors that require complex identification strategies are considered. Accordingly, the relevant existing error types of event data are specified into subtypes. Subsequently, the necessary information requirements and the information available for identification are compared in a gap analysis. From this gap, the necessary data attributes are derived. Finally, an approach is presented that enables the generation of the complete data set. This serves as a basis for recognizing erroneous events, as distinct from standard and exception events, and filtering them out.
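The paper derives which data attributes an event must carry; the downstream filtering step it enables can be sketched as a simple completeness check. All attribute names below are invented for illustration, not taken from the paper:

```python
# Hypothetical required attributes derived from a gap analysis.
REQUIRED_ATTRIBUTES = {"event_id", "timestamp", "location", "order_id"}

def split_events(events):
    """Separate events missing required attributes from complete ones."""
    valid, erroneous = [], []
    for event in events:
        missing = REQUIRED_ATTRIBUTES - event.keys()
        (erroneous if missing else valid).append(event)
    return valid, erroneous

events = [
    {"event_id": 1, "timestamp": "2024-01-01T08:00",
     "location": "DC-1", "order_id": "A7"},
    {"event_id": 2, "timestamp": "2024-01-01T09:00",
     "location": "DC-1"},  # order_id missing → erroneous
]
valid, erroneous = split_events(events)
print(len(valid), len(erroneous))  # → 1 1
```

Completeness is only the simplest error subtype; plausibility errors (e.g. a timestamp before the preceding process step) require the more complex identification strategies the paper focuses on.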
The complexity and volatility of companies' environment increase the relevance of preparing for disruptions. Resilience enables companies to deal with disruptions, reduce their impact, and ensure competitiveness. Especially in procurement, disruptions can cause major challenges, while resilience contributes to ensuring material availability. Even though past disruptions have posed various challenges and companies have recognized the need to increase resilience, resilience is often not designed systematically. One major challenge is the number of potential measures to increase resilience. The systematic design of resilience thus requires a detailed understanding of domain-specific measures, including their contribution to different resilience components and their interdependencies. This paper proposes a systematic approach for configuring resilience in procurement that enables the evaluation and selection of resilience measures. Based on a resilience framework, a resilience configurator is developed. The configurator is based on resilience potentials that have been characterized and clustered. Overarching approaches to designing resilience and indicators to evaluate resilience are presented. Moreover, a procedure is proposed to ensure practical applicability. To evaluate the results, two case studies are conducted. The results enable companies to systematically design their resilience in procurement.
One of the major tasks of operations managers is to boost uptime while staying within budget. To meet this challenge, they are discovering reliability-based management as a strategic factor to improve performance. But which parameters are the key to “reliability excellence” and drive a company's performance? What are the relevant levers to pull in reliability-based management?
To answer these questions, McKinsey & Company partnered with Aachen University to launch a global reliability survey in the process industries. The objective of the initiative is to provide a statistically proven picture of the key factors that drive maintenance and reliability excellence. Furthermore, benchmarks and best practices concerning overall operational performance are identified. The study follows a questionnaire-based approach that addresses all relevant departments within a company, complemented by best practice analyses.
This paper provides results of the survey. The results demonstrate that reliability pays off. Some unproven beliefs have been confirmed (e.g. good reliability performance results in low spare part inventory), but surprises such as a correlation between safety and performance were also identified. The analysis also shows that structural differences such as company size or geography do not influence reliability performance.
Holistic PLM Model
(2010)
Product Lifecycle Management (PLM) is a widely discussed topic concerning both increasing the efficiency of product development in terms of time to market and adequately customizing products to the different needs of customers worldwide. Historically, PLM has focused on the early phases of the product's lifecycle, namely the product development phase. The roots of PLM lie in supporting the information logistics of product data: consistent data sets should be available to all stakeholders in the different departments at all times. Due to increasing product complexity, PLM has to be extended in the temporal dimension (not limited to the product development phase) and the systemic dimension (not limited to the information logistics aspect). In this paper, the authors derive a holistic framework for Product Lifecycle Management by analysing existing integrated management approaches. The framework consists of four dimensions: PLM strategy, PLM process, product structure, and PLM IT architecture. The sustainability and benefits of the framework are demonstrated by applying it to the communication service provider (CSP) industry.