Systematisation Approach
(2023)
Current megatrends such as globalisation and digitalisation are increasing complexity, making systems for well-founded, short-term decision support indispensable. A necessary condition for reliable decision-making is high data quality. Practice repeatedly shows that data quality is insufficient, especially in master and transaction data. Moreover, emerging approaches for data-based decisions consistently raise the required level of data quality. Hence, handling insufficient data quality is, and will remain, of elementary importance. Since the literature does not systematically consider the options available in the case of insufficient data quality, this paper presents a general model and systematic approach for handling such cases in real-world scenarios. The model developed here presents the various possibilities for handling insufficient data quality in a process-based approach, serving as a framework for decision support. The individual aspects of the model are examined in more detail along the process chain from data acquisition to final data processing. Subsequently, the systematic approach is applied and contextualised for production planning and supply chain event management, respectively. Due to their general validity, the results enable companies to manage insufficient data quality systematically.
Long-term production management defines the future production structure and ensures long-term competitiveness. Companies around the world currently face the challenge of making decisions in an uncertain and rapidly changing environment. Decision quality suffers from rapidly changing global market requirements and from the uniqueness and infrequency of the decisions involved. Since decisions in long-term production management can rarely be reversed and are associated with high costs, an increase in decision quality is urgently needed. To this end, four different applications are presented in the following, which support the decision process by increasing decision quality and making uncertainty manageable. For each of the applications presented, a separate digital shadow was built with the objective of enabling better decisions from existing data from production and its environment. In addition, a linking of the applications is being pursued:
The Best Practice Sharing App creates transparency about existing production knowledge through the data-based identification of comparable production processes in the production network and helps to share best practices between sites. With the Supply Chain Cockpit, resilience can be increased through a data-based design of the procurement strategy that enables companies to manage disruptions. By adapting the procurement strategy, for example by choosing suppliers at different locations, the impact of disruptions can be reduced. While the Supply Chain Cockpit focuses on strategy and on decisions that affect external partners (e.g., suppliers), the Data-Driven Site Selection concentrates on determining the sites of the company-internal global production network by creating transparency in the decision process of site selection. Different external data from various sources are analyzed and visualized in an appropriate way to support the decision process. Finally, the issue of sustainability is also crucial for successful long-term production management. Thus, the Sustainable Footprint Design App presents an approach that takes key sustainability indicators into account for network design. [https://link.springer.com/referenceworkentry/10.1007/978-3-030-98062-7_15-1]
In short-term production management of the Internet of Production (IoP), the vision of a Production Control Center is pursued, in which interlinked decision-support applications contribute to increasing decision-making quality and speed. The applications developed focus in particular on use cases near the shop floor, with an emphasis on the key topics of production planning and control, production system configuration, and quality control loops.
Within the Predictive Quality application, predictive models are used to derive insights from production data and subsequently improve the process- and product-related quality as well as enable automated Root Cause Analysis. The Parameter Prediction application uses invertible neural networks to predict process parameters that can be used to produce components with desired quality properties. The application Production Scheduling investigates the feasibility of applying reinforcement learning to common scheduling tasks in production and compares the performance of trained reinforcement learning agents to traditional methods. In the two applications Deviation Detection and Process Analyzer, the potentials of process mining in the context of production management are investigated. While the Deviation Detection application is designed to identify and mitigate performance and compliance deviations in production systems, the Process Analyzer concept enables the semi-automated detection of weaknesses in business and production processes utilizing event logs.
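The Production Scheduling application above can be illustrated with a toy sketch (not the chapter's implementation): tabular Q-learning on a hypothetical four-job, single-machine instance, compared against the classic shortest-processing-time (SPT) dispatching rule as the traditional baseline. Job names and processing times are invented for illustration.

```python
import random

# Hypothetical single-machine instance: job -> processing time (minutes).
PROC = {"A": 4, "B": 2, "C": 5, "D": 1}

def flowtime(order):
    """Total flow time (sum of completion times) of a job sequence."""
    t = total = 0
    for job in order:
        t += PROC[job]
        total += t
    return total

# Tabular Q-learning: state = frozenset of unscheduled jobs, action = next job,
# reward = negated completion time of the chosen job (minimising flow time).
Q = {}
ALPHA, EPS = 0.1, 0.2
random.seed(0)
for _ in range(5000):
    remaining, t = frozenset(PROC), 0
    while remaining:
        acts = sorted(remaining)
        if random.random() < EPS:                 # epsilon-greedy exploration
            a = random.choice(acts)
        else:
            a = max(acts, key=lambda j: Q.get((remaining, j), 0.0))
        reward = -(t + PROC[a])
        nxt = remaining - {a}
        best_next = max((Q.get((nxt, j), 0.0) for j in nxt), default=0.0)
        old = Q.get((remaining, a), 0.0)
        Q[(remaining, a)] = old + ALPHA * (reward + best_next - old)
        t += PROC[a]
        remaining = nxt

# Greedy rollout of the learned policy vs. the SPT dispatching rule.
remaining, schedule = frozenset(PROC), []
while remaining:
    a = max(sorted(remaining), key=lambda j: Q.get((remaining, j), 0.0))
    schedule.append(a)
    remaining -= {a}
spt = sorted(PROC, key=PROC.get)
```

On this tiny instance the learned policy recovers the SPT sequence, which is known to minimise total flow time on a single machine; real scheduling tasks in the application involve far larger state spaces and function approximation.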
With regard to the overall vision of the IoP, the developed applications contribute significantly to the intended interdisciplinarity of production and information technology. For example, application-specific digital shadows are drafted based on the ongoing research work, and the applications are prototypically embedded in the IoP.
Companies in the manufacturing sector are confronted with an increasingly dynamic environment. Thus, corporate processes and, consequently, the supporting IT landscape must change. This need is not yet fully met in the development of information systems. While best-of-breed approaches are available, monolithic systems that no longer meet the manufacturing industry's requirements are still prevalent in practical use. A modular structure of IT landscapes could combine the advantages of individual and standard information systems and meet the need for adaptability. At present, however, there is no established standard for the modular design of IT landscapes in the field of manufacturing companies' information systems. This paper presents different ways of modularly designing IT landscapes and information systems and analyzes their objects of modularization. For this purpose, a systematic literature review is carried out in the subject area of software and modularization. Starting from the V-model as a reference model, a framework for different levels of modularization is developed, identifying that most scientific approaches carry out modularization at the data structure-based and source code-based levels. Only a few sources consider modularization at the software environment-based and software function-based levels. In particular, no domain-specific application of these levels of modularization, e.g., for manufacturing, was identified. (Literature base: https://epub.fir.de/frontdoor/index/index/docId/2704)
Gap Analysis for CO2 Accounting Tool by Integrating Enterprise Resource Planning System Information
(2023)
Detailed carbon accounting is the foundation for reducing CO2 emissions in manufacturing companies. However, existing accounting approaches are primarily based on manual data preparation, although manufacturing companies already have a variety of IT systems and resulting data available. The gap analysis carried out based on the GHG Protocol and a reference ERP system shows how much of the required information for CO2 accounting can be integrated from an ERP system. The ERP system can cover 20 % of the required information. The information availability can be increased to 49 % through additionally identified modifications of the ERP system. Integrating the CO2 accounting tool with other systems of the IT landscape, e.g. an Energy Information System, enables an additional increase.
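The gap analysis described above can be sketched as a set comparison between required information items and items an ERP system can serve. The field names and counts below are illustrative assumptions, not the study's actual catalogue or its 20 %/49 % figures.

```python
# Hypothetical CO2-accounting information requirements (GHG-Protocol style).
required = {
    "material_master", "bill_of_materials", "supplier_master",
    "purchase_orders", "transport_routes", "machine_runtimes",
    "energy_consumption", "waste_quantities", "emission_factors",
    "business_travel",
}

# Items a standard ERP system covers, and additional coverage after
# hypothetical modifications (e.g. extended master-data fields).
erp_standard = {"material_master", "bill_of_materials",
                "supplier_master", "purchase_orders"}
erp_modified = erp_standard | {"transport_routes", "machine_runtimes"}

def coverage(available, needed):
    """Share of needed information items a system can provide."""
    return len(needed & available) / len(needed)

# Remaining gap that other systems (e.g. an Energy Information System)
# would have to close.
gap_after = required - erp_modified
```

The remaining gap set directly identifies which information must come from systems outside the ERP, mirroring the paper's argument for integrating further IT-landscape components.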
Based on the increasingly complex value creation networks, more and more event-based systems are being used for decision support. One example of a category of event-based systems is supply chain event management. The aim is to enable the best possible reaction to critical exceptional events based on event data. The central element is the event, which represents the information basis for mapping and matching the process flows in the event-based systems. However, since data quality is insufficient in numerous application cases and the identification of incorrect data in supply chain event management is insufficiently considered in the literature, this paper deals with the theoretical derivation of the necessary data attributes for the identification of incorrect event data. In particular, the types of errors that require complex identification strategies are considered. Accordingly, the relevant existing error types of event data are specified in subtypes in this paper. Subsequently, the necessary information requirements and the information available for identification are compared using a GAP analysis. Based on this gap, the necessary data attributes can then be derived. Finally, an approach is presented that enables the generation of the complete data set. This serves as a basis for recognising and filtering out erroneous events, in contrast to standard and exception events.
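The attribute-based identification described above can be sketched as a validation step over incoming events. The required attributes and the plausibility rule below are illustrative assumptions, not the attributes derived in the paper.

```python
# Hypothetical minimal attribute set an SCEM event must carry.
REQUIRED = {"event_id", "object_id", "event_type", "timestamp", "location"}

def classify_event(event, previous_timestamp=None):
    """Classify an event dict as 'ok' or 'erroneous' with reasons.

    Checks completeness (all required attributes present) and one simple
    plausibility rule (timestamps must not run backwards per object).
    """
    reasons = []
    missing = REQUIRED - event.keys()
    if missing:
        reasons.append(f"missing attributes: {sorted(missing)}")
    ts = event.get("timestamp")
    if ts is not None and previous_timestamp is not None and ts < previous_timestamp:
        reasons.append("timestamp earlier than preceding event")
    return ("erroneous", reasons) if reasons else ("ok", [])
```

In a full system, such checks would be one filter stage before events are matched against planned process flows, so that erroneous events do not trigger incorrect reactions.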
The complexity and volatility of companies' environment increase the relevance of preparing for disruptions. Resilience enables companies to deal with disruptions, reduce their impact, and ensure competitiveness. Especially in the context of procurement, disruptions can cause major challenges, while resilience contributes to ensuring material availability. Even though past disruptions have posed various challenges and companies have recognized the need to increase resilience, resilience is often not designed systematically. One major challenge is the number of potential measures for increasing resilience. The systematic design of resilience thus requires a detailed understanding of domain-specific measures, including their contribution to different resilience components and their interdependencies. This paper proposes a systematic approach for configuring resilience in procurement which enables the evaluation and selection of resilience measures. Based on a resilience framework, a resilience configurator is developed. The basis of the configurator is a set of resilience potentials that have been characterized and clustered. Overarching approaches to designing resilience and indicators for evaluating resilience are presented. Moreover, a procedure is proposed to ensure practical applicability. To evaluate the results, two case studies are conducted. The results enable companies to systematically design their resilience in procurement.
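The evaluation-and-selection step of such a configurator can be sketched as a weighted scoring of candidate measures against resilience components. The measures, their component contributions, and the weights below are illustrative assumptions, not the paper's configurator.

```python
# Hypothetical resilience components with company-specific weights (sum to 1).
COMPONENT_WEIGHTS = {"anticipation": 0.2, "robustness": 0.3,
                     "responsiveness": 0.3, "recovery": 0.2}

# Hypothetical procurement measures: contribution per component in [0, 1].
MEASURES = {
    "dual sourcing":       {"anticipation": 0.1, "robustness": 0.9,
                            "responsiveness": 0.4, "recovery": 0.5},
    "safety stock":        {"anticipation": 0.0, "robustness": 0.8,
                            "responsiveness": 0.6, "recovery": 0.2},
    "supplier monitoring": {"anticipation": 0.9, "robustness": 0.2,
                            "responsiveness": 0.5, "recovery": 0.1},
}

def score(measure):
    """Weighted resilience contribution of a single measure."""
    contrib = MEASURES[measure]
    return sum(COMPONENT_WEIGHTS[c] * contrib.get(c, 0.0)
               for c in COMPONENT_WEIGHTS)

# Rank measures by their weighted contribution, highest first.
ranking = sorted(MEASURES, key=score, reverse=True)
```

A real configurator would additionally capture interdependencies between measures (e.g. diminishing returns of safety stock once dual sourcing is in place), which a purely additive score cannot express.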
Due to shorter product life cycles and the increasing internationalization of competition, companies are confronted with increasing complexity in supply chain management. Event-based systems are used to reduce this complexity and to support employees' decisions. Such event-based systems include tracking & tracing systems on the one hand and supply chain event management on the other. Tracking & tracing systems only offer functions for monitoring and reporting deviations, whereas supply chain event management systems additionally provide simulation, control, and measurement functions. The central element connecting these systems is the event. It forms the information basis for mapping and matching the process sequences in the event-based systems. The events received from the supply chain partner form the basis for all downstream steps and must, therefore, contain correct data. Since data quality is insufficient in numerous use cases and incorrect data in supply chain event management is not considered in the literature, this paper deals with the description and typification of incorrect event data. Based on a systematic literature review, typical sources of errors in the acquisition and transmission of event data are discussed. The results are then applied to event data so that a typification of incorrect event types is possible. The results help to significantly improve event-based systems for use in practice by preventing incorrect reactions through the detection of incorrect event data.
Companies operate in an increasingly volatile environment where developments like shorter product lifecycles, the demand for customized products, and globalization increase the complexity and interconnectivity of supply chains. Recent events like Brexit, the COVID-19 pandemic, or the blockade of the Suez Canal have caused major disruptions in supply chains. This demonstrates that many companies are insufficiently prepared for disruptions. As disruptions in supply chains are expected to occur even more frequently in the future, the need for sufficient preparation increases. Increasing resilience provides one way of dealing with disruptions. Resilience can be understood as the ability of a system to cope with disruptions and to ensure the competitiveness of a company. In particular, it enables preparation for unexpected disruptions. The level of resilience is thereby significantly influenced by actions initiated prior to a disruption. Although companies recognize the need to increase their resilience, it is not systematically implemented. One major challenge is the multidimensionality and complexity of the resilience construct. To systematically design resilience, an understanding of its components is required. However, a common understanding of the constituent parts of resilience is currently lacking. This paper, therefore, proposes a general framework for structuring resilience by decomposing the multidimensional concept into its individual components. The framework contributes to an understanding of the interrelationships between the individual components and identifies resilience principles as target directions for the design of resilience. It thus sets the basis for a qualitative assessment of resilience and enables the analysis of resilience-building measures in terms of their impact on resilience.
Moreover, an approach for applying the framework to different contexts is presented and then used to detail the framework for the context of procurement.
Generation of a Data Model for Quotation Costing of Make-to-Order Manufacturers from Case Studies
(2022)
For contract or make-to-order manufacturers, quotation costing is a complex process that is mainly performed based on experience. Due to the high diversity of the product range of these mostly small or medium-sized companies (SMEs) and the poor data situation at the time of quotation preparation, the quality of the calculation is subject to strong variations and uncertainties. The gap between the initial quotation costing and the actual costs incurred (pre- and post-calculation) can be critical to the existence of SMEs. Digitalization in general can help companies to gain a better understanding of processes and to generate data. For improving these processes, an understanding of the data relevant to the specific process is crucial. Accurate quotation costing for customized products is time-consuming and resource-intensive, as an overview of the data to be used within the process is lacking. This paper therefore derives a data model for supporting quotation costing in the company, based on literature-based costing procedures and recorded case studies for quotation and calculation. Based on the results, SMEs gain a first overview of the data needed for quotation costing to optimize their calculation process.
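The shape of such a data model can be sketched with a few entities from common costing procedures. The entity names, overhead rate, and margin below are illustrative assumptions, not the data model derived in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class CostItem:
    """A single direct-cost position, e.g. a material or labour entry."""
    description: str
    quantity: float
    unit_cost: float

    @property
    def total(self):
        return self.quantity * self.unit_cost

@dataclass
class Quotation:
    """Minimal quotation entity linking direct costs to a quoted price."""
    customer: str
    material: list = field(default_factory=list)   # CostItem entries
    labour: list = field(default_factory=list)     # CostItem entries (hours x rate)
    overhead_rate: float = 0.15   # surcharge on direct costs (assumed)
    margin: float = 0.10          # target profit margin (assumed)

    def direct_costs(self):
        return sum(item.total for item in self.material + self.labour)

    def price(self):
        base = self.direct_costs() * (1 + self.overhead_rate)
        return base * (1 + self.margin)
```

Recording pre-calculation (this model) and post-calculation (actual costs) against the same entities is what makes the gap between the two measurable per quotation.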