Prof. Dr.-Ing. Boris Otto

Supply Net Order Management
TU Dortmund University

  • A federated infrastructure for European data spaces
    Otto, B.
    Communications of the ACM 65 (2022)
    doi: 10.1145/3512341
  • An Investigation of Antecedents for Data Governance Adoption in the Rail Industry—Findings From a Case Study at Thales
    Lis, D. and Arbter, M. and Spindler, M. and Otto, B.
    IEEE Transactions on Engineering Management (2022)
    The role of data governance is experiencing a paradigm shift as organizations increasingly incorporate data governance to encourage the strategic utilization of data and, therefore, promote data-driven innovation. However, the opportunities arising from technological advancements and novel value propositions based on data come with implications that often stem from external and internal contingent factors, e.g., industry characteristics or organizational structures. In combination with inadequate practices regarding the conduct of data, difficulties in the adoption of data governance can increase. This article draws upon established practices from information technology governance and organizational theory, specifically contingency theory, to examine occurring antecedents in the adoption of data governance at Thales Ground Transportation Systems, a manufacturer of solutions for the railway infrastructure. By investigating the nomological link between antecedents, adoption, and consequences, associated implications for data governance can be taken into account in early phases of adoption to promote data-driven innovation. The article proposes new antecedents evolving from interorganizational dynamics such as data collaborations in the respective ecosystem. IEEE
    doi: 10.1109/TEM.2022.3166109
  • Archetypes of open-source business models
    Duparc, E. and Möller, F. and Jussen, I. and Stachon, M. and Algac, S. and Otto, B.
    Electronic Markets 32 (2022)
    The open-source paradigm offers a plethora of opportunities for innovative business models (BMs) as the underlying codebase of the technology is accessible and extendable by external developers. However, finding the proper configuration of open-source business models (OSBMs) is challenging, as existing literature gives guidance through commonly used BMs but does not describe underlying design elements. The present study generates a taxonomy following an iterative development process based on established guidelines by analyzing 120 OSBMs to complement the taxonomy's conceptually-grounded design elements. Then, a cluster-based approach is used to develop archetypes derived from dominant features. The results show that OSBMs can be classified into seven archetypical patterns: open-source platform BM, funding-based BM, infrastructure BM, Open Innovation BM, Open Core BM, proprietary-like BM, and traditional open-source software (OSS) BM. The results can act as a starting point for further investigation regarding the use of the open-source paradigm in the era of digital entrepreneurship. Practitioners can find guidance in designing OSBMs. © 2022, The Author(s).
    doi: 10.1007/s12525-022-00557-9
  • Design Principles for Boundary Spanning in Transdisciplinary Design Science Research
    Möller, F. and Chandra Kruse, L. and Schoormann, T. and Otto, B.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 13229 LNCS (2022)
    Design principles capture prescriptive design knowledge to guide design science researchers and design professionals in their design works. In the context of a transdisciplinary team, design principles can also be a powerful vehicle to bridge knowledge barriers and facilitate collaboration among team members with different backgrounds and expertise. These heterogeneous actors use design principles as a boundary object which helps to mediate their diverse perspectives. The paper draws from boundary object theory to explore the goals and the mechanisms of boundary spanning through ‘design principles-in-use’ and ‘design principles-in-formulation’. We discuss the applicability of our findings using a case of formulation and application of design principles for data spaces in a transdisciplinary research consortium. Our results add the layers of transdisciplinary collaboration to the ongoing discourse on design principles and design knowledge accumulation and evolution. © 2022, Springer Nature Switzerland AG.
    doi: 10.1007/978-3-031-06516-3_4
  • Design Principles for Industrial Data-Driven Services
    Azkan, C. and Möller, F. and Iggena, L. and Otto, B.
    IEEE Transactions on Engineering Management (2022)
    The continuously growing availability and volume of data pressure companies to leverage them economically. Subsequently, companies must find strategies to incorporate data sensibly for internal optimization and find new business opportunities in data-driven business models. In this article, we focus on using data and data analytics in product-oriented industrial companies. Although data-driven services are becoming increasingly important, little is known about their systematic design and development in research. Surprisingly, many companies face significant challenges and fail to create these services successfully. Against this background, this article presents findings from a multi-case study based on qualitative interviews and workshops with experts from different industrial sectors. We propose ten design principles and corresponding design features to successfully design industrial data-driven services in this context. These design principles help practitioners and researchers to understand the peculiarities of creating data-driven services more in-depth on a conceptual, technical, and organizational level. Author
    doi: 10.1109/TEM.2022.3167737
  • Design Principles for Shared Digital Twins in Distributed Systems
    Haße, H. and van der Valk, H. and Möller, F. and Otto, B.
    Business and Information Systems Engineering (2022)
    Digital Twins offer considerable potential for cross-company networks. Recent research primarily focuses on using Digital Twins within the limits of a single organization. However, Shared Digital Twins extend application boundaries to cross-company utilization through their ability to act as a hub to share data. This results in the need to consider additional design dimensions which help practitioners design Digital Twins tailored for inter-company use. The article addresses precisely that issue as it investigates how Shared Digital Twins should be designed to achieve business success. For this purpose, the article proposes a set of design principles for Shared Digital Twins stemming from a qualitative interview study with 18 industry experts. The interview study is the primary data source for formulating and evaluating the design principles. © 2022, The Author(s).
    doi: 10.1007/s12599-022-00751-1
  • Knowledge-Driven Data Ecosystems Toward Data Transparency
    Geisler, S. and Vidal, M.-E. and Cappiello, C. and Lóscio, B.F. and Gal, A. and Jarke, M. and Lenzerini, M. and Missier, P. and Otto, B. and Paja, E. and Pernici, B. and Rehof, J.
    Journal of Data and Information Quality 14 (2022)
    A data ecosystem (DE) offers a keystone-player or alliance-driven infrastructure that enables the interaction of different stakeholders and the resolution of interoperability issues among shared data. However, despite years of research in data governance and management, trustability is still affected by the absence of transparent and traceable data-driven pipelines. In this work, we focus on requirements and challenges that DEs face when ensuring data transparency. Requirements are derived from the data and organizational management, as well as from broader legal and ethical considerations. We propose a novel knowledge-driven DE architecture, providing the pillars for satisfying the analyzed requirements. We illustrate the potential of our proposal in a real-world scenario. Last, we discuss and rate the potential of the proposed architecture in the fulfillment of these requirements. © 2021 Copyright held by the owner/author(s). Publication rights licensed to ACM.
    doi: 10.1145/3467022
  • Requirements for DataOps to foster Dynamic Capabilities in Organizations - A mixed methods approach
    Gür, I. and Möller, F. and Hupperz, M. and Uzun, D. and Otto, B.
    Proceedings - 2022 IEEE 24th Conference on Business Informatics, CBI 2022 1 (2022)
    doi: 10.1109/CBI54897.2022.00025
  • Archetypes of Digital Twins
    van der Valk, H. and Haße, H. and Möller, F. and Otto, B.
    Business and Information Systems Engineering (2021)
    Currently, Digital Twins receive considerable attention from practitioners and in research. A Digital Twin describes a concept that connects physical and virtual objects through a data linkage. However, Digital Twins are highly dependent on their individual use case, which leads to a plethora of Digital Twin configurations. Based on a thorough literature analysis and two interview series with experts from various electrical and mechanical engineering companies, this paper proposes a set of archetypes of Digital Twins for individual use cases. It delimits the Digital Twins from related concepts, e.g., Digital Threads. The paper delivers profound insights into the domain of Digital Twins and, thus, helps the reader to identify the different archetypical patterns. © 2021, The Author(s).
    doi: 10.1007/s12599-021-00727-7
  • Data Strategy Development: A Taxonomy for Data Strategy Tools and Methodologies in the Economy
    Gür, I. and Spiekermann, M. and Arbter, M. and Otto, B.
    Lecture Notes in Information Systems and Organisation 46 (2021)
    Data are a key driver of the digital era. They shift the strategic landscape of organizations and change how companies approach their business. Nevertheless, existing approaches on data strategies vary vastly and little common ground is visible. Therefore, we develop a comprehensive taxonomy for data strategy tools and methodologies in order to identify characteristics and relevant properties of data strategy. We derived the taxonomy inductively by analyzing existing data strategy tools and methodologies offered in the current economy and deductively by conducting a structured literature review on the existing body of knowledge in the scientific literature. It serves as a scientific instrument to profoundly assess and create data strategies and work towards a consensus in the respective research field. © 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.
    doi: 10.1007/978-3-030-86790-4_30
  • Design of Goal-Oriented Artifacts from Morphological Taxonomies: Progression from Descriptive to Prescriptive Design Knowledge
    Möller, F. and Haße, H. and Azkan, C. and van der Valk, H. and Otto, B.
    Lecture Notes in Information Systems and Organisation 46 (2021)
    Morphological Taxonomies are a widely popular tool in Information Systems to systematically deconstruct an artifact into designable dimensions and characteristics. Subsequently, these taxonomies have engraved in them knowledge about the design of artifacts, i.e., descriptive design knowledge. Most studies producing morphological taxonomies refrain from giving prescriptive advice about the design, i.e., the specific morphological configuration of an artifact, but rather stay descriptive. The paper proposes a framework for knowledge and artifact transformation originating in morphological taxonomies and ending in design principles. We develop a framework that assists researchers and practitioners by showing clear paths on transforming descriptive design knowledge engraved in taxonomies to prescriptive knowledge as design principles. © 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.
    doi: 10.1007/978-3-030-86790-4_36
  • Designing business model taxonomies – synthesis and guidance from information systems research
    Möller, F. and Stachon, M. and Azkan, C. and Schoormann, T. and Otto, B.
    Electronic Markets (2021)
    Classification is an essential approach in business model research. Empirical classifications, termed taxonomies, are widespread in and beyond Information Systems (IS) and enjoy high popularity as both stand-alone artifacts and the foundation for further application. In this article, we focus on the study of empirical business model taxonomies for two reasons. Firstly, as these taxonomies serve as a tool to store empirical data about business models, we investigate their coverage of different industries and technologies. Secondly, as they are emerging artifacts in IS research, we aim to strengthen rigor in their design by illustrating essential design dimensions and characteristics. In doing this, we contribute to research and practice by synthesizing the diffusion of business model taxonomies that helps to draw on the available body of empirical knowledge and providing artifact-specific guidance for building taxonomies in the context of business models. © 2021, The Author(s).
    doi: 10.1007/s12525-021-00507-x
  • European data infrastructures: Approaches and tools for the use of data for the benefit of the individual and the community [Europäische Dateninfrastrukturen: Ansätze und Werkzeuge zur Nutzung von Daten zum Wohl von Individuum und Gemeinschaft]
    Otto, B. and Burmann, A.
    Informatik-Spektrum 44 (2021)
    doi: 10.1007/s00287-021-01386-4
  • How to Design IIoT-Platforms Your Partners are Eager to Join: Learnings from an Emerging Ecosystem
    Guggenberger, T.M. and Hunke, F. and Möller, F. and Eimer, A.-C. and Satzger, G. and Otto, B.
    Lecture Notes in Information Systems and Organisation 48 LNISO (2021)
    Building and sustaining a successful platform business remains one of the biggest challenges in the age of digitalization and platformization, particularly in the manufacturing industry. The art of managing the partner ecosystem to create and distribute mutual benefits depends on the design of the platform – thus, on the implemented mechanisms and functionalities, typically complemented by third-party applications. Therefore, it is eminently important to attract potential partners to enter the ecosystem. With this article, we provide substantial insight into the case of an emerging platform and its respective ecosystem of stakeholders. We analyze their individual requirements, abstract them into general key requirements, and finally develop design principles. Thus, our research, on the one hand, extends the current knowledge of platform literature with new, generalized knowledge about platform design, especially in the development phase. On the other hand, we contribute to the emerging field of participant attraction previously focusing on complementors. © 2021, The Author(s), under exclusive license to Springer Nature Switzerland AG.
    doi: 10.1007/978-3-030-86800-0_34
  • Optimisation model for multi-item multi-echelon supply chains with nested multi-level products
    Quetschlich, M. and Moetz, A. and Otto, B.
    European Journal of Operational Research 290 (2021)
    This study proposes a mathematical optimisation model for the horizontal integration of the basic processes of production, inventory, and transportation planning on a tactical level. The model enables planning for cutting, assembly, transportation, and inventory while considering process times. Several items (raw materials, intermediate goods, finished products, and transport vehicles) are routed through a network. The general structure is defined by the layout of the supply network and the configurable multi-level bills of materials of products and transportation equipment, which can be converging, diverging, or mixed. To stress the practical applications of this study, we demonstrate how this generic model can be used and customised to solve an actual planning task in a multi-stage automotive production network using real industry data. © 2020 The Authors
    doi: 10.1016/j.ejor.2020.08.005
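    The abstract above describes a multi-echelon production, inventory, and transportation planning model. Purely as an illustration of that model class, and not as a reproduction of the authors' formulation, the sketch below states a deliberately tiny two-echelon, multi-period production and transport planning problem in Python with PuLP; all data, variable names, and the single plant-to-distribution-centre route are invented for illustration.

    # Hedged sketch: a toy two-echelon production/transport planning model, not the paper's model.
    from pulp import LpMinimize, LpProblem, LpVariable, lpSum

    periods = range(4)
    demand = [20, 30, 25, 40]                 # hypothetical finished-product demand per period
    prod_cost, trans_cost, hold_cost = 5.0, 2.0, 1.0
    capacity = 60                             # plant capacity per period

    m = LpProblem("two_echelon_planning", LpMinimize)
    produce = {t: LpVariable(f"produce_{t}", lowBound=0) for t in periods}     # plant output
    ship = {t: LpVariable(f"ship_{t}", lowBound=0) for t in periods}           # plant -> distribution centre
    inv_plant = {t: LpVariable(f"inv_plant_{t}", lowBound=0) for t in periods}
    inv_dc = {t: LpVariable(f"inv_dc_{t}", lowBound=0) for t in periods}

    # objective: production + transport + holding costs over the planning horizon
    m += lpSum(prod_cost * produce[t] + trans_cost * ship[t]
               + hold_cost * (inv_plant[t] + inv_dc[t]) for t in periods)

    for t in periods:
        prev_plant = inv_plant[t - 1] if t > 0 else 0
        prev_dc = inv_dc[t - 1] if t > 0 else 0
        m += prev_plant + produce[t] == ship[t] + inv_plant[t]    # flow balance at the plant
        m += prev_dc + ship[t] == demand[t] + inv_dc[t]           # flow balance and demand at the DC
        m += produce[t] <= capacity                               # plant capacity limit

    m.solve()
    print({t: produce[t].value() for t in periods})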
  • WFDU-net: A workflow notation for sovereign data exchange
    Pettenpohl, H. and Tebernum, D. and Otto, B.
    Proceedings of the 10th International Conference on Data Science, Technology and Applications, DATA 2021 (2021)
    Data is the main driver of the digital economy. Accordingly, companies are interested in maintaining technical control over the usage of their data at any given time. The International Data Spaces initiative addresses exactly this aspect of data sovereignty with usage control enforcement. In this paper, we introduce the so-called Workflow with Data and Usage control network (WFDU-net) model. The data consumer can visually define his or her workflow using the WFDU-net model and annotate the data operations and context. With model checking we validate that the WFDU-net follows the usage policies defined by the data owner. Afterwards, the compliant WFDU-net can be executed by exporting the WFDU-net in a Petri Net Markup Language (PNML). We evaluated our approach by using our example WFDU-net in a data analytics use case. Copyright © 2021 by SCITEPRESS - Science and Technology Publications, Lda. All rights reserved
    doi: 10.5220/0010550402310240
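    Since the abstract mentions exporting the validated workflow in the Petri Net Markup Language (PNML), the sketch below shows, under stated assumptions, how a minimal place/transition net could be serialised to PNML with only the Python standard library. The element layout follows the basic PNML grammar; the WFDU-net-specific usage-control annotations described in the paper are only hinted at by a hypothetical <toolspecific> placeholder.

    # Hedged sketch: serialise a minimal two-step workflow net to PNML.
    import xml.etree.ElementTree as ET

    pnml = ET.Element("pnml", xmlns="http://www.pnml.org/version-2009/grammar/pnml")
    net = ET.SubElement(pnml, "net", id="wfdu-sketch",
                        type="http://www.pnml.org/version-2009/grammar/ptnet")
    page = ET.SubElement(net, "page", id="page0")

    def add_place(pid):
        p = ET.SubElement(page, "place", id=pid)
        ET.SubElement(ET.SubElement(p, "name"), "text").text = pid

    def add_transition(tid, operation):
        t = ET.SubElement(page, "transition", id=tid)
        ET.SubElement(ET.SubElement(t, "name"), "text").text = tid
        # hypothetical slot for a data-operation / usage-context annotation
        ET.SubElement(t, "toolspecific", tool="wfdu-sketch", version="0.1").text = operation

    add_place("data_received")
    add_transition("anonymise", "remove personal identifiers")
    add_place("data_ready")

    for i, (src, dst) in enumerate([("data_received", "anonymise"),
                                    ("anonymise", "data_ready")]):
        ET.SubElement(page, "arc", id=f"a{i}", source=src, target=dst)

    print(ET.tostring(pnml, encoding="unicode"))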
  • ‘Caution – Principle Under Construction’: A Visual Inquiry Tool for Developing Design Principles
    Möller, F. and Schoormann, T. and Otto, B.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 12807 LNCS (2021)
    Researchers and practitioners often face challenges in structuring larger design projects and, therefore, struggle to capture, discuss, and reflect on essential components that should be considered. These first steps are, however, of great importance because decisions such as in terms of selecting an underpinning (kernel) theory, following certain development approaches, or specifying knowledge sources impact the resulting design solution. To provide a frame for developing one of the dominant forms of prescriptive knowledge in information systems (IS), we present the ‘Principle Constructor’ that seeks to support the iterative endeavor of formulating design principles. This so-called visual inquiry tool is grounded in previous research on design knowledge and an empirical analysis of IS articles that present principles, built according to available guidance for this class of tools, and evaluated through several workshops. Doing this, we provide an underlying structure with building blocks for creating design principles and complement research on their anatomy and development procedures. © 2021, Springer Nature Switzerland AG.
    doi: 10.1007/978-3-030-82405-1_23
  • Accumulating design knowledge with reference models: Insights from 12 years’ research into data management
    Legner, C. and Pentek, T. and Otto, B.
    Journal of the Association for Information Systems 21 (2020)
    Over the past several decades, digital technologies have evolved from supporting business processes and decision-making to becoming an integral part of business strategies. Although the IS discipline has extensive experience with digitalization and designing sociotechnical artifacts, the underlying design knowledge is seldom systematically accumulated across different settings and projects, and thus cannot be transferred and reused in new contexts. Motivated by this gap in the research, we turn to the data management field, where reference models have become important sources of descriptive and prescriptive domain knowledge. To study knowledge accumulation in reference models, we analyze the revelatory and extreme case of a longitudinal DSR process involving more than 30 European companies and 15 researchers from three universities over 12 years. The insights into reference model development allow us to theorize about knowledge accumulation mechanisms from both a process perspective and an artifact perspective: First, we observe that knowledge accumulation occurs in stages in response to technology’s evolving roles in business (problem space) and as a result of maturing design knowledge (solution space). Second, we find that reference models act as design boundary objects; they explicate and integrate knowledge from different disciplines and allow for the development of design knowledge over time—from descriptive (conceptual) models to prescriptive (capability or maturity) ones. To cope with fundamental changes in the problem space, these models require continuous updating as well as transfer/exaptation to new problem spaces. Our findings inform the IS community about the fundamental logic of knowledge accumulation in longitudinal DSR processes. © 2020 by the Association for Information Systems.
    doi: 10.17705/1jais.00618
  • Challenges of data management in Industry 4.0: A single case study of the material retrieval process
    Amadori, A. and Altendeitering, M. and Otto, B.
    Lecture Notes in Business Information Processing 389 LNBIP (2020)
    The trend towards Industry 4.0 amplifies existing data management challenges and requires suitable data governance and data quality measures. Although these topics have been previously discussed in literature, companies are still struggling to cope with the resulting challenges and fully exploit the benefits of Industry 4.0. In this paper, we conducted a single case study in an automotive company. We exemplarily used the material retrieval process in automotive manufacturing to uncover which challenges hinder the utilization of Industry 4.0. We were able to identify six major challenges in the domains of data quality and data governance. © Springer Nature Switzerland AG 2020.
    doi: 10.1007/978-3-030-53337-3_28
  • Design principles for route optimization business models: A grounded theory study of user feedback
    Möller, F. and Guggenberger, T.M. and Otto, B.
    Proceedings of the 15th International Conference on Business Information Systems 2020 "Developments, Opportunities and Challenges of Digitization", WIRTSCHAFTSINFORMATIK 2020 (2020)
    The article generates design principles for business models offering route optimization software. Route optimization is a widely relevant activity in logistics, as its purpose is to optimize the amount of time and the associated cost that a vehicle needs to reach its intended destination. The article presents a Grounded Theory study of user reviews drawn from the comparison portal Capterra to uncover possible design principles for these business models. User reviews are suitable data for that purpose as they are a rich source to derive basic requirements for a product. The data are analyzed qualitatively, adhering to established coding guidelines, and aligned with established business model components to achieve a link between empirical research and the underlying knowledge base. The synthesis of the data to design principles enables practitioners to adequately process and instantiate them into real-life design decisions that enhance the possibility of designing a successful business model. © Proceedings of the 15th International Conference on Business Information Systems 2020 "Developments, Opportunities and Challenges of Digitization", WIRTSCHAFTSINFORMATIK 2020.
    doi: 10.30844/wi_2020_j10
  • Digital Twins in Simulative Applications: A Taxonomy
    van der Valk, H. and Hunker, J. and Rabe, M. and Otto, B.
    Proceedings - Winter Simulation Conference 2020-December (2020)
    With the advances in information technology, the concept of Digital Twins has gained wide attention in both practice and research in recent years. A Digital Twin is a virtual representation of a physical object or system and is connected in a bi-directional way with the physical counterpart. The aim of a Digital Twin is to support all stakeholders during the whole lifecycle of such a system or object. One of the core aspects of a Digital Twin is modeling and simulation, which is a well-established process, e.g., in the development of systems. Simulation models can be distinguished on the basis of different dimensions, e.g., on the basis of their time perspective. The existing literature reviews have paid little to no attention to this simulation aspect of a Digital Twin. In order to address this, the authors have developed a taxonomy based on an extended literature review to bridge the aforementioned gap. © 2020 IEEE.
    doi: 10.1109/WSC48552.2020.9384051
  • From crisis to bottleneck management: Integration of social media analysis for early response in the automotive supply chain event management
    Tietze, A.-C. and Cirullies, J. and Otto, B. and Holtz, A.
    International Journal of Integrated Supply Management 13 (2020)
    Numerous use cases exist in industrial practice in which data availability, up-to-dateness, accessibility and volume are insufficient to solve time-critical problems optimally. One example is bottleneck management (BM) in the automotive industry, where information is often time-delayed and incomplete. Time-critical volume coverage for supply risks leads to the question of how and where to obtain the information. Improved data accuracy therefore enables faster and more precise decisions. The literature offers approaches for the use of social media (SM) for risk assessment, which are reviewed regarding their suitability for transfer to BM in the automotive industry. Adapted from a multiple-case study, this paper presents the first application of using external data generated by SM formats and develops a new, obstacle-mitigated method of SM-knowledge-based BM. The goal of this study is to apply SM in the BM of the automotive industry and, hence, to enhance the typical current BM process. Copyright © 2020 Inderscience Enterprises Ltd.
    doi: 10.1504/IJISM.2020.110741
  • Towards a Method for Design Principle Development in Information Systems
    Möller, F. and Guggenberger, T.M. and Otto, B.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 12388 LNCS (2020)
    A central goal of doing research is to make findings available to the academic and practitioner community in order to extend the current knowledge base. The notion of how to generalize, abstract, and codify knowledge gained in design endeavors is a vital issue in design science, especially in the strand of design theory. Design principles provide a medium to make such design knowledge available to others and to make it transferable from a single application onto more scenarios that are subject to similar boundary conditions. The study proposes a preliminary method for the development of design principles based on a structured literature review and the inductive derivation of methodological components from it. The purpose of the method is to give researchers and practitioners executable steps to generate design principles. © 2020, Springer Nature Switzerland AG.
    doi: 10.1007/978-3-030-64823-7_20
  • Data Sovereignty and Data Space Ecosystems
    Jarke, M. and Otto, B. and Ram, S.
    Business and Information Systems Engineering 61 (2019)
    doi: 10.1007/s12599-019-00614-2
  • Designing a multi-sided data platform: findings from the International Data Spaces case
    Otto, B. and Jarke, M.
    Electronic Markets 29 (2019)
    The paper presents the findings from a 3-year single-case study conducted in connection with the International Data Spaces (IDS) initiative. The IDS represents a multi-sided platform (MSP) for secure and trusted data exchange, which is governed by an institutionalized alliance of different stakeholder organizations. The paper delivers insights gained during the early stages of the platform’s lifecycle (i.e. the platform design process). More specifically, it provides answers to three research questions, namely how alliance-driven MSPs come into existence and evolve, how different stakeholder groups use certain governance mechanisms during the platform design process, and how this process is influenced by regulatory instruments. By contrasting the case of an alliance-driven MSP with the more common approach of the keystone-driven MSP, the results of the case study suggest that different evolutionary paths can be pursued during the early stages of an MSP’s lifecycle. Furthermore, the IDS initiative considers trust and data sovereignty more relevant regulatory instruments compared to pricing, for example. Finally, the study advances the body of scientific knowledge with regard to data being a boundary resource on MSPs. © 2019, The Author(s).
    doi: 10.1007/s12525-019-00362-x
  • Industrie 4.0 process transformation: findings from a case study in automotive logistics
    Hermann, M. and Bücker, I. and Otto, B.
    Journal of Manufacturing Technology Management 31 (2019)
    Purpose: The purpose of this paper is to explore the transformation of logistics processes to meet requirements of Industrie 4.0. Design/methodology/approach: The authors follow the principles of action design research to conduct a single-case study investigating four logistics processes at a leading German car manufacturer. For the development of artifacts, the authors used Method Engineering. Findings: The case study reveals a set of Industrie 4.0 process design principles, providing guidelines for the design and management of Industrie 4.0 compliant processes. In the second step, the authors use these process design principles for the development of a first version of a method for Industrie 4.0 process transformation. Research limitations/implications: In the light of limited scientific knowledge about Industrie 4.0 process transformation, the paper uses a single-case study design. This is adequate considering the research goal at hand and the richness of empirical insight the authors had access to. However, a single-case design is limited regarding generalizability and demand for future qualitative and quantitative research. Practical implications: The Industrie 4.0 process design principles support practitioners in the design and management of Industrie 4.0 compliant processes. In addition, the method developed by the authors supports enterprises in the transformation of their current processes toward Industrie 4.0. Originality/value: The paper describes the first attempt – as far as the authors are aware – to derive guidelines for the design and management of Industrie 4.0 processes from the analysis of a real-world industrial setting. Likewise, the method for Industrie 4.0 process transformation presented in this paper is presumed to be the first such method developed in full accordance with the principles of Method Engineering. © 2019, Emerald Publishing Limited.
    doi: 10.1108/JMTM-08-2018-0274
  • Interview with Reinhold Achatz on “Data Sovereignty and Data Ecosystems”
    Otto, B.
    Business and Information Systems Engineering 61 (2019)
    doi: 10.1007/s12599-019-00609-z
  • Reference Architecture framework for enhanced social media data analytics for Predictive Maintenance models
    Grambau, J. and Hitzges, A. and Otto, B.
    Proceedings - 2019 IEEE International Conference on Engineering, Technology and Innovation, ICE/ITMC 2019 (2019)
    Social Media data contains a lot of hidden information which is currently rarely used for service topics at the product level. However, a deep analysis of existing predictive maintenance models shows that the combined use of social media data with already existing product data or internal service data can improve existing and new analytical models for enhanced predictive maintenance. Therefore, this framework paper describes an approach for gathering, processing, and analyzing Social Media data related to the products of a power tool producer. The defined processes are executed with Azure Machine Learning Studio and visualized with Power BI. The main result of this paper is the Reference Architecture, which combines several processes and heterogeneous data sources and enables the First Time to Incident algorithm, which helps companies to increase the precision of Predictive Maintenance. © 2019 IEEE.
    doi: 10.1109/ICE.2019.8792678
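    The paper's pipeline is built with Azure Machine Learning Studio and Power BI; purely as a stand-in illustration of the underlying analysis step of separating service-relevant posts from other social media content, the sketch below trains a generic scikit-learn text classifier on a handful of made-up posts. It is not the paper's method and the data are invented.

    # Hedged sketch with invented data; not the paper's Azure-based pipeline.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "battery dies after ten minutes of drilling",
        "love the new colour of this drill",
        "chuck keeps slipping under load",
        "great price during the holiday sale",
    ]
    labels = [1, 0, 1, 0]   # 1 = service-relevant incident report, 0 = other

    classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    classifier.fit(posts, labels)

    # classify a new, unseen post (the output depends on the toy training data)
    print(classifier.predict(["drilling stops because the battery overheats"]))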
  • Schedule instability in automotive production networks: The Development of a network-oriented resequencing method
    Moetz, A. and Stylos-Duesmann, P. and Otto, B.
    IFAC-PapersOnLine 52 (2019)
    Schedule instability in production networks is a highly relevant topic for practitioners and academics that lacks research, especially from a network perspective. This paper addresses the phenomenon of schedule instability in automotive production networks by proposing a network-oriented, partial-car-sequencing-based resequencing method that integrates both procurement- and distribution-related information. Based on this network-oriented approach, schedule instability stemming from short-term disruptions is reduced. © 2019, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
    doi: 10.1016/j.ifacol.2019.11.634
  • Semi-automatic ontology-driven development documentation: Generating documents from RDF data and DITA templates
    Pikus, Y. and Weißenberg, N. and Holtkamp, B. and Otto, B.
    Proceedings of the ACM Symposium on Applied Computing Part F147772 (2019)
    For a data-driven economy, digitization of product information throughout the entire product lifecycle is key to agility and efficiency of product-related processes. Documenting products and their development, e.g., creating requirement specifications, is an indispensable, time-consuming and resource-intensive activity in large organizations. A vast amount of related information often emerges across several siloed lifecycle tools, and only a portion of it is available in the post-hoc documentation. Additionally, numerous product lines and versions increase the documentation effort. To tackle these issues in a research project, we developed a semi-automatic end-to-end documentation system, able to generate documents based on templates and structured data. As a use case for document generation, we employ the RDF-based lifecycle tool integration standard OSLC and add extended publishing information. In order to generate target documents, we leverage DITA, an established digital publishing standard. A pilot implementation demonstrates that the approach is able to extract distributed lifecycle data and to generate several types of documents in multiple formats. Since the method can also be used to generate documents from arbitrary RDF graphs, the results can be generalized to other domains beyond software development. We believe that the results support the change from a document-driven to a data-driven documentation paradigm in large organizations. © 2019 Association for Computing Machinery. ACM
    doi: 10.1145/3297280.3297508
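    As a rough illustration of the document-generation idea described above, the sketch below queries a hypothetical RDF export of lifecycle data with rdflib and fills a hand-rolled DITA topic template; the file name, vocabulary, and template are assumptions, and the paper's actual OSLC- and DITA-based pipeline is considerably richer.

    # Hedged sketch: turn RDF requirement descriptions into minimal DITA topics.
    from rdflib import Graph

    g = Graph()
    g.parse("requirements.ttl", format="turtle")   # hypothetical lifecycle-data export

    rows = g.query("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?req ?title ?desc WHERE {
            ?req dcterms:title ?title ;
                 dcterms:description ?desc .
        }""")

    DITA_TOPIC = """<topic id="{id}">
      <title>{title}</title>
      <body><p>{desc}</p></body>
    </topic>"""

    for req, title, desc in rows:
        # derive a topic id from the resource URI and fill the template
        print(DITA_TOPIC.format(id=str(req).rsplit("/", 1)[-1], title=title, desc=desc))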
  • Usage control architecture options for data sovereignty in business ecosystems
    Zrenner, J. and Möller, F.O. and Jung, C. and Eitel, A. and Otto, B.
    Journal of Enterprise Information Management 32 (2019)
    Purpose: Current business challenges force companies to exchange critical and sensitive data. The data provider pays great attention to the usage of their data and wants to control it by policies. The purpose of this paper is to develop usage control architecture options to enable data sovereignty in business ecosystems. Design/methodology/approach: The architecture options are developed following the design science research process. Based on requirements from an automotive use case, the authors develop architecture options. The different architecture options are demonstrated and evaluated based on the case study with practitioners from the automotive industry. Findings: This paper introduces different architecture options for implementing usage control (UC). The proposed architecture options represent solutions for UC in business ecosystems. The comparison of the architecture options shows the respective advantages and disadvantages for data provider and data consumer. Research limitations/implications: In this work, the authors address only one case stemming from the German automotive sector. Practical implications: Technical enforcement of data providers' policies instead of relying on trust to support collaborative data exchange between companies. Originality/value: This research is among the first to introduce architecture options that provide a technical concept for the implementation of data sovereignty in business ecosystems using UC. Consequently, it supports the decision process for the technical implementation of data sovereignty. © 2019, Emerald Publishing Limited.
    doi: 10.1108/JEIM-03-2018-0058
  • Layout optimization of a system for successive laser scanner detection and control of mobile robots
    Halmheu, R. and Otto, B. and Hegel, J.
    Robotics and Autonomous Systems 101 (2018)
    This paper describes a new, innovative method by which multiple mobile robots can be detected by a laser scanner. Each robot incorporates bars in its construction which generate a significant pattern in the laser scan. The proposed technique allows a robust detection and control of successively moving robots, despite the partial shadowing through the bars. In this paper the optimal number of bars and their best arrangement for detection is shown. Furthermore the impact of different bar diameters is described. Increased visibility of the bars by the use of multiple laser scanners and their positioning to ensure detection of the robots is also described. Finally it describes the accuracy that can be achieved with this system. The position accuracy was determined by trials on an actual system. Simulations and experiments confirm that this is a reliable and precise method for position determination of multiple robots using a single sensor. © 2017 Elsevier B.V.
    doi: 10.1016/j.robot.2017.12.007
  • Schedule Instability in Production Networks: An empirical Study on Drivers and Mechanisms within the Automotive Industry
    Moetz, A. and Meinlschmidt, J. and Otto, B.
    IFAC-PapersOnLine 51 (2018)
    Schedule instability in production networks is a highly relevant topic for practitioners and academics that lacks research, especially from a network perspective. This paper addresses the phenomenon of schedule instability in automotive production networks via an abductive case study approach. The objective of this paper is threefold: to present a structure of the different drivers influencing schedule instability, to present a model of the different mechanisms to tackle these instabilities, and to derive the interdependencies between instability drivers and mechanisms. © 2018
    doi: 10.1016/j.ifacol.2018.08.463
  • Towards utilizing Customer Data for Business Model Innovation: The Case of a German Manufacturer
    Michalik, A. and Möller, F. and Henke, M. and Otto, B.
    Procedia CIRP 73 (2018)
    Product-Service Systems (PSS) are a promising opportunity for manufacturing companies to realign their value creation process towards customer-oriented and resource efficient solutions. However, many companies still struggle to create new business models based on providing efficient solutions instead of physical products. The concept of PSS has been the subject of research for many years, yet there is a severe lack of utilizing the most crucial resource of the information age: Data. The authors propose products enriched with digital services based on customer data to develop a cohesive value proposition as the basis for business model innovation in manufacturing companies. © 2018 The Authors. Published by Elsevier B.V.
    doi: 10.1016/j.procir.2018.04.006
  • Design principles for Industrie 4.0 scenarios
    Hermann, M. and Pentek, T. and Otto, B.
    Proceedings of the Annual Hawaii International Conference on System Sciences 2016-March (2016)
    The increasing integration of the Internet of Everything into the industrial value chain has built the foundation for the next industrial revolution called Industrie 4.0. Although Industrie 4.0 is currently a top priority for many companies, research centers, and universities, a generally accepted understanding of the term does not exist. As a result, discussing the topic on an academic level is difficult, and so is implementing Industrie 4.0 scenarios. Based on a quantitative text analysis and a qualitative literature review, the paper identifies design principles of Industrie 4.0. Taking into account these principles, academics may be enabled to further investigate the topic, while practitioners may find assistance in identifying appropriate scenarios. A case study illustrates how the identified design principles support practitioners in identifying Industrie 4.0 scenarios. © 2016 IEEE.
    doi: 10.1109/HICSS.2016.488
  • Towards a methodology for Industrie 4.0 transformation
    Bücker, I. and Hermann, M. and Pentek, T. and Otto, B.
    Lecture Notes in Business Information Processing 255 (2016)
    Implications of market and environmental changes have always influenced the industrial world. Combined with new technologies, the current changes are summarized under the term Industrie 4.0. Since the first announcement, Industrie 4.0 has been one of the most discussed topics in research and industry. However, for companies in the industrial sector, it is a challenge to assess the implications of Industrie 4.0 for their organizations, and to decide whether and how to respond. Therefore, a methodology to transform an organization towards Industrie 4.0 is required. This paper provides a metamodel for the transformation of organizations towards Industrie 4.0 as well as the first technique of this method, a framework, to structure the implications of Industrie 4.0 and to identify Industrie 4.0 action fields as a first step towards Industrie 4.0 transformation. Furthermore, it provides an outlook on how to implement the identified action fields systematically in existing process change management. © Springer International Publishing Switzerland 2016.
    doi: 10.1007/978-3-319-39426-8_17
  • CC Chemokine Ligand 18 in ANCA-Associated Crescentic GN
    Brix, S. R. and Stege, G. and Disteldorf, E. and Hoxha, E. and Krebs, C. and Krohn, S. and Otto, B. and Klatschke, K. and Herden, E. and Heymann, F. and Lira, S. A. and Tacke, F. and Wolf, G. and Busch, M. and Jabs, W. J. and Ozcan, F. and Keller, F. and Beige, J. and Wagner, K. and Helmchen, U. and Noriega, M. and Wiech, T. and Panzer, U. and Stahl, R. A. K.
    Journal of the American Society of Nephrology 26 (2015)
    ANCA-associated vasculitis is the most frequent cause of crescentic GN. To define new molecular and/or cellular biomarkers of this disease in the kidney, we performed microarray analyses of renal biopsy samples from patients with ANCA-associated crescentic GN. Expression profiles were correlated with clinical data in a prospective study of patients with renal ANCA disease. CC chemokine ligand 18 (CCL18), acting through CC chemokine receptor 8 (CCR8) on mononuclear cells, was identified as the most upregulated chemotactic cytokine in patients with newly diagnosed ANCA-associated crescentic GN. Macrophages and myeloid dendritic cells in the kidney were detected as CCL18-producing cells. The density of CCL18(+) cells correlated with crescent formation, interstitial inflammation, and impairment of renal function. CCL18 protein levels were higher in sera of patients with renal ANCA disease compared with those in sera of patients with other forms of crescentic GN. CCL18 serum levels were higher in patients who suffered from ANCA-associated renal relapses compared with those in patients who remained in remission. Using a murine model of crescentic GN, we explored the effects of the CCL18 murine functional analog CCL8 and its receptor CCR8 on kidney function and morphology. Compared with wild-type mice, Ccr8(-/-) mice had significantly less infiltration of pathogenic mononuclear phagocytes. Furthermore, Ccr8(-/-) mice maintained renal function better and had reduced renal tissue injury. In summary, our data indicate that CCL18 drives renal inflammation through CCR8-expressing cells and could serve as a biomarker for disease activity and renal relapse in ANCA-associated crescentic GN.
    doi: 10.1681/ASN.2014040407
  • Proposing a Capability Perspective on Digital Business Models
    Bärenfänger, R. and Otto, B.
    Proceedings - 17th IEEE Conference on Business Informatics, CBI 2015 1 (2015)
    Business models comprehensively describe the functioning of businesses in contemporary economic, technological, and societal environments. This paper focuses on the characteristics of digital business models from the perspective of capability research and develops a capability model for digital businesses. Following the design science research (DSR) methodology, multiple evaluation and design iterations were performed. Contributions to the design process came from IS/IT practice and the research base on business models and capabilities. © 2015 IEEE.
    doi: 10.1109/CBI.2015.18
  • Quality and Value of the Data Resource in Large Enterprises
    Otto, B.
    Information Systems Management 32 (2015)
    Enterprises are facing problems in managing the quality and value of their key data objects (often referred to as master data). This article presents the findings from a case study comprising six large enterprises. The study results point to the importance of the situational nature of master data as a strategic resource, which must be considered when analyzing how the quality of data affects its value for business. Copyright © Taylor & Francis Group, LLC.
    doi: 10.1080/10580530.2015.1044344
  • Big data analytics for supply chain management
    Leveling, J. and Edelbrock, M. and Otto, B.
    IEEE International Conference on Industrial Engineering and Engineering Management 2015-January (2014)
    A high number of business cases are characterized by an expanded complexity. This is based on increased collaboration between companies, customers and governmental organizations on one hand and more individual products and services on the other hand. Due to that, companies are planning to address these issues with Big Data solutions. This paper deals with Big Data solutions focusing on Supply Chains, which represents a key discipline for handling the increased collaboration next to vast amounts of exchanged data. Today, the main focus lies on optimizing Supply Chain Visibility to handle complexity and to support decision making for handling risks and interruptions along supply chains. Therefore, Big Data concepts and technologies will play a key role. This paper describes the current situation and actual solutions and presents exemplary use cases for illustration. A classification regarding the area of application and potential benefits arising from Big Data Analytics are also given. Furthermore, this paper outlines general technologies to show capabilities of Big Data analytics. © 2014 IEEE.
    doi: 10.1109/IEEM.2014.7058772
  • Business value of in-memory technology-Multiple-case study insights
    Bärenfänger, R. and Otto, B. and Österle, H.
    Industrial Management and Data Systems 114 (2014)
    Purpose: The purpose of this paper is to assess the business value of in-memory computing (IMC) technology by analyzing its organizational impact in different application scenarios. Design/methodology/approach: This research applies a multiple-case study methodology analyzing five cases of IMC application scenarios in five large European industrial and service-sector companies. Findings: Results show that IMC can deliver business value in various applications ranging from advanced analytic insights to support of real-time processes. This enables higher-level organizational advantages like data-driven decision making, superior transparency of operations, and experience with Big Data technology. The findings are summarized in a business value generation model which captures the business benefits along with preceding enabling changes in the organizational environment. Practical implications: Results aid managers in identifying different application scenarios where IMC technology may generate value for their organizations from business and IT management perspectives. The research also sheds light on the socio-technical factors that influence the likelihood of success or failure of IMC initiatives. Originality/value: This research is among the first to model the business value creation process of in-memory technology based on insights from multiple implemented applications in different industries. © Emerald Group Publishing Limited.
    doi: 10.1108/IMDS-07-2014-0212
  • Concept of a proactive risk management in logistics networks
    Leveling, J. and Schier, A. and Luciano, F. and Otto, B.
    Logistics Journal 2014 (2014)
    Logistics business networks are growing rapidly and becoming more and more complex. Companies often do not know on which other companies they depend and which business-critical risks result from these dependencies. For this reason, a concept of proactive risk management in logistics networks is presented in this article. The concept is based on Big Data technology and is used for the identification of risks and the development of a logistics network, using external data such as social media platforms or other data portals in addition to internal company data. These data are analyzed and risky relationships are graphically displayed to the operator. In addition, the system can identify possible alternatives for the user to avoid these risks and can thus be used for decision support. © 2014 Logistics Journal: Proceedings.
    doi: 10.2195/lj_Proc_leveling_de_201411_01
  • Toward a decision model for master data application architecture
    Baghi, E. and Schlosser, S. and Ebner, V. and Otto, B. and Oesterle, H.
    Proceedings of the Annual Hawaii International Conference on System Sciences (2014)
    Requirements such as an integrated view of the customer or global business process integration make enterprise-wide management of master data a prerequisite for achieving business goals. The master data application architecture, as a part of enterprise master data management, plays a critical role in enterprises. Choosing the right master data application architecture is a controversial subject in many enterprises. Unfortunately, the current state of the art in research does not provide sufficient guidance for enterprises dealing with this subject. The paper aims at overcoming this gap in research by presenting a decision model for supporting enterprises in the decision-making process regarding the choice of the right master data application architecture. To design the model, Multiple-Criteria Decision Analysis and Design Science Research Methodology were applied. © 2014 IEEE.
    doi: 10.1109/HICSS.2014.475
  • Toward a functional reference model for business rules management
    Schlosser, S. and Baghi, E. and Otto, B. and Oesterle, H.
    Proceedings of the Annual Hawaii International Conference on System Sciences (2014)
    Business rules can be crucial to an organization's business operations. In view of a growing number of internal and external challenges (such as compliance with regulations, the need for organizational agility, or the need to retain organizational knowledge), organizations increasingly are forced to actively manage their business rules in order to stay successful. However, business rules management (BRM) is an organizational task that cannot be encountered simply by implementing a software system. The paper describes the design process toward a functional reference model for business rules management. The model provides three perspectives on tasks and functions to successfully manage business rules. Practitioners may use the model to establish BRM in their organizations, facilitate communication between business and IT, and evaluate software solutions for BRM. From a scientific perspective, the model is a design artifact, representing a theory for designing and developing information systems with the objective of managing business rules. © 2014 IEEE.
    doi: 10.1109/HICSS.2014.476
  • Management of the master data lifecycle: A framework for analysis
    Ofner, M.H. and Straub, K. and Otto, B. and Oesterle, H.
    Journal of Enterprise Information Management 26 (2013)
    Purpose: The purpose of the paper is to propose a reference model describing a holistic view of the master data lifecycle, including strategic, tactical and operational aspects. The Master Data Lifecycle Management (MDLM) map provides a structured approach to analyze the master data lifecycle. Design/methodology/approach: Embedded in a design oriented research process, the paper applies the Component Business Model (CBM) method and suggests a reference model which identifies the business components required to manage the master data lifecycle. CBM is a patented IBM method to analyze the key components of a business domain. The paper uses a participative case study to evaluate the suggested model. Findings: Based on a participative case study, the paper shows how the reference model makes it possible to analyze the master data lifecycle on a strategic, a tactical and an operational level, and how it helps identify areas of improvement. Research limitations/implications: The paper presents design work and a participative case study. The reference model is grounded in existing literature and represents a comprehensive framework forming the foundation for future analysis of the master data lifecycle. Furthermore, the model represents an abstraction of an organization's master data lifecycle. Hence, it forms a "theory for designing". More research is needed in order to more thoroughly evaluate the presented model in a variety of real-life settings. Practical implications: The paper shows how the reference model enables practitioners to analyze the master data lifecycle and how it helps identify areas of improvement. Originality/value: The paper reports on an attempt to establish a holistic view of the master data lifecycle, including strategic, tactical and operational aspects, in order to provide more comprehensive support for its analysis and improvement. © Emerald Group Publishing Limited.
    doi: 10.1108/JEIM-05-2013-0026
  • Packaging and Modular Assembly of Large-Area and Fine-Pitch 2-D Ultrasonic Transducer Arrays
    Lin, D. S. and Wodnicki, R. and Zhuang, X. F. and Woychik, C. and Thomenius, K. E. and Fisher, R. A. and Mills, D. M. and Byun, A. J. and Burdick, W. and Khuri-Yakub, P. and Bonitz, B. and Davies, T. and Thomas, G. and Otto, B. and Topper, M. and Fritzsch, T. and Ehrmann, O.
    IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control 60 (2013)
    A promising transducer architecture for large-area arrays employs 2-D capacitive micromachined ultrasound transducer (CMUT) devices with backside trench-frame pillar interconnects. Reconfigurable array (RA) application-specific integrated circuits (ASICs) can provide efficient interfacing between these high-element-count transducer arrays and standard ultrasound systems. Standard electronic assembly techniques such as flip-chip and ball grid array (BGA) attachment, along with organic laminate substrate carriers, can be leveraged to create large-area arrays composed of tiled modules of CMUT chips and interface ASICs. A large-scale, fully populated and integrated 2-D CMUT array with 32 by 192 elements was developed and demonstrates the feasibility of these techniques to yield future large-area arrays. This study demonstrates a flexible and reliable integration approach by successfully combining a simple under-bump metallization (UBM) process and a stacked CMUT/interposer/ASIC module architecture. The results show high shear strength of the UBM (26.5 g for 70-µm balls), high interconnect yield, and excellent CMUT resonance uniformity (s = 0.02 MHz). A multi-row linear array was constructed using the new CMUT/interposer/ASIC process using acoustically active trench-frame CMUT devices and mechanical/nonfunctional Si backside ASICs. Imaging results with the completed probe assembly demonstrate a functioning device based on the modular assembly architecture.
    view abstract10.1109/TUFFC.2013.2709
  • Toward a business model reference for interoperability services
    Otto, B. and Ebner, V. and Baghi, E. and Bittmann, R.M.
    Computers in Industry 64 (2013)
    The importance of interoperability for businesses is undoubted. After an evolution from electronic data interchange to interoperability in electronic business and enterprise interoperability, both the scientific and the practitioners' communities are today discussing the notion of interoperability service utilities. Furthermore, researchers are studying decentralized and distributed interoperability approaches such as peer-to-peer networks. However, a comprehensive investigation of business models for such decentralized approaches to interoperability is still missing. Drawing from recent literature on business modeling on the one hand and interoperability research on the other hand, this paper designs a business model reference for interoperability services. The business model reference assumes interoperability information as an economic good and is applied in two case studies and evaluated from multiple perspectives. The paper contributes to the scientific body of knowledge as it proposes a novel design artifact which lays the foundation for a number of future research opportunities. © 2013 Elsevier B.V.
    view abstract10.1016/j.compind.2013.06.017
  • A characteristics framework for Semantic Information Systems Standards
    Otto, B. and Folmer, E. and Ebner, V.
    Information Systems and e-Business Management 10 (2012)
    Semantic Information Systems (IS) Standards play a critical role in the development of the networked economy. While their importance is undoubted by all stakeholders, such as businesses, policy makers, researchers, and developers, the current state of research leaves a number of questions unaddressed. Terminological confusion exists around the notions of "business semantics", "business-to-business interoperability", and "interoperability standards", amongst others. Moreover, a comprehensive understanding of the characteristics of Semantic IS Standards is missing. The paper addresses this gap in literature by developing a characteristics framework for Semantic IS Standards. Two case studies are used to check the applicability of the framework in a "real-life" context. The framework lays the foundation for future research in an important field of the IS discipline and supports practitioners in their efforts to analyze, compare, and evaluate Semantic IS Standards. © 2011 The Author(s).
    view abstract10.1007/s10257-011-0183-3
  • Conceptualizing data in multinational enterprises: Model design and application
    Ebner, V. and Otto, B. and Österle, H.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 7532 LNCS (2012)
    Collaboration and coordination within multinational enterprises need unambiguous semantics of data across business units, legal contexts, cultures etc. Therefore data management has to provide enterprise-wide data ownership, an unambiguous distinction between "global" and "local" data, business-driven data quality specifications, and data consistency across multiple applications. Data architecture design aims at addressing these challenges. Particularly multinational enterprises, however, encounter difficulties in identifying, describing and designing the complex set of data architectural dimensions. The paper responds to the research question of what concepts need to be involved to support comprehensive data architecture design in multinational enterprises. It develops a conceptual model, which covers all requirements for defining, governing, using, and storing data. The conceptual model is applied in a case study conducted at a multinational corporation. Well-grounded in the existing body of knowledge, the paper contributes by identifying, describing, and aggregating a set of concepts enabling multinational enterprises to meet business requirements. © 2012 Springer-Verlag.
    view abstract10.1007/978-3-642-34002-4_42
  • How to design the master data architecture: Findings from a case study at Bosch
    Otto, B.
    International Journal of Information Management 32 (2012)
    Master data management (MDM) is a topic of increasing prominence both in the scientific and in the practitioners' information systems (IS) community. As a prerequisite for meeting strategic business requirements, such as compliance with regulations, business integration, or integrated customer management, MDM comprises numerous activities. One of the central activities is designing and maintaining the master data architecture. Interestingly, though, the scientific community has remained almost silent with regard to the question as to how companies should proceed when designing the master data architecture. In order to shed light on this unexplored topic, the paper at hand presents the findings of a case study at Bosch Group. The case study shows that designing the master data architecture is a multidimensional task which requires balancing the interests of various organizational stakeholders, managing an array of technical opportunities, and meeting requirements of numerous master data classes. Also, the case study suggests that taking advantage of architectural design patterns may be an appropriate way to adequately address the complexity of the task. © 2011 Elsevier Ltd. All rights reserved.
    view abstract10.1016/j.ijinfomgt.2011.11.018
  • Integrating a data quality perspective into business process management
    Ofner, M.H. and Otto, B. and Österle, H.
    Business Process Management Journal 18 (2012)
    Purpose: The purpose of this paper is to conceptualize data quality (DQ) in the context of business process management and to propose a DQ oriented approach for business process modeling. The approach is based on key concepts and metrics from the data quality management domain and supports decision-making in process re-design projects on the basis of process models. Design/methodology/approach: The paper applies a design oriented research approach, in the course of which a modeling method is developed as a design artifact. To do so, method engineering is used as a design technique. The artifact is theoretically founded and incorporates DQ considerations into process re-design. Furthermore, the paper uses a case study to evaluate the suggested approach. Findings: The paper shows that the DQ oriented process modeling approach facilitates and improves managerial decision-making in the context of process re-design. Data quality is considered as a success factor for business processes and is conceptualized using a rule-based approach. Research limitations/implications: The paper presents design research and a case study. More research is needed to triangulate the findings and to allow generalizability of the results. Practical implications: The paper supports decision-makers in enterprises in taking a DQ perspective in business process re-design initiatives. Originality/value: The paper reports on integrating DQ considerations into business process management in general and into process modeling in particular, in order to provide more comprehensive decision-making support in process re-design projects. The paper represents one of the first contributions to literature regarding a contemporary phenomenon of high practical and scientific relevance. © Emerald Group Publishing Limited.
    view abstract10.1108/14637151211283401
  • Managing the business benefits of product data management: The case of Festo
    Otto, B.
    Journal of Enterprise Information Management 25 (2012)
    Purpose: The paper seeks to investigate the question as to how the business benefits of product data management (PDM) can be assessed and realized. In particular, it aims at understanding the means-end relationship between PDM and product data on the one hand and a company's business goals on the other hand. Design/methodology/approach: The paper uses a case study research approach. The case of Festo is unique and allows for detailed examination of both the business benefits of PDM and of the inter-dependencies of various business benefit enablers. Due to the limited amount of scientific knowledge with regard to the management of PDM business benefits, the study is exploratory in nature. The conceptual framework used to guide the study combines business engineering concepts and the business dependency network technique. Findings: The findings are threefold. First, the paper explicates and details the understanding of the nature of PDM business benefits. Second, it provides insight into the complexity and interdependency of various "means" - such as data ownership, product data standards, for example - and the "ends" of PDM, namely the contribution to a company's business goals. Third, the paper forms the baseline for a comprehensive method supporting the management of PDM business benefits. Research limitations/implications: Single-case studies require further validation of findings. Thus, future research should aim at replicating the findings and at developing a comprehensive method for the management of PDM business benefits. Practical implications: Companies may take up the results as a "blueprint" for their own PDM activities and may reflect their own business benefits against the case of Festo. Originality/value: The paper is one of the first contributions focusing on the means-end relationship between PDM and product data on the one hand and a company's business goals on the other. © Emerald Group Publishing Limited.
    view abstract10.1108/17410391211224426
  • Toward a functional reference model for master data quality management
    Otto, B. and Hüner, K.M. and Österle, H.
    Information Systems and e-Business Management 10 (2012)
    The quality of master data has become an issue of increasing prominence in companies. One reason for that is the growing number of regulatory and legal provisions companies need to comply with. Another reason is the growing importance of information systems supporting decision-making, requiring master data that is up-to-date, accurate and complete. While improving and maintaining master data quality is an organizational task that cannot be accomplished simply by implementing a suitable software system, system support is mandatory in order to be able to meet challenges efficiently and make for good results. This paper describes the design process toward a functional reference model for master data quality management (MDQM). The model design process spanned several iterations comprising multiple design and evaluation cycles, including the model's application in a participative case study at consumer goods manufacturer Beiersdorf. Practitioners may use the reference model as an instrument for the analysis, design and implementation of a company's MDQM system landscape. Moreover, the reference model facilitates evaluation of software systems and supports company-internal and external communication. From a scientific perspective, the reference model is a design artifact; hence it represents a theory for designing information systems in the area of MDQM. © 2011 Springer-Verlag.
    view abstract10.1007/s10257-011-0178-0
  • Collaborative management of business metadata
    Hüner, K.M. and Otto, B. and Österle, H.
    International Journal of Information Management 31 (2011)
    Legal provisions, cross-company data exchange and intra-company reporting or planning procedures require comprehensively, timely, unambiguously and understandably specified business objects (e.g. materials, customers, and suppliers). On the one hand, this business metadata has to cover miscellaneous regional peculiarities in order to enable business activities anywhere in the world. On the other hand, data structures need to be standardized throughout the entire company in order to be able to perform global spend analysis, for example. In addition, business objects should adapt to new market conditions or regulatory requirements as quickly and consistently as possible. Centrally organized corporate metadata managers (e.g. within a central IT department) are hardly able to meet all these demands. They should be supported by key users from several business divisions and regions, who contribute expert knowledge. However, despite the advantages regarding high metadata quality on a corporate level, a collaborative metadata management approach of this kind has to ensure low effort for knowledge contributors as in most cases these regional or divisional experts do not benefit from metadata quality themselves. Therefore, the paper at hand identifies requirements to be met by a business metadata repository, which is a tool that can effectively support collaborative management of business metadata. In addition, the paper presents the results of an evaluation of these requirements with business experts from various companies and of scenario tests with a wiki-based prototype at the company Bayer CropScience AG. The evaluation shows two things: First, collaboration is a success factor when it comes to establishing effective business metadata management and integrating metadata with enterprise systems, and second, semantic wikis are well suited to realizing business metadata repositories. © 2010 Elsevier Ltd.
    view abstract10.1016/j.ijinfomgt.2010.12.002
  • Data governance
    Otto, B.
    Business and Information Systems Engineering 3 (2011)
    view abstract10.1007/s12599-011-0162-8
  • Data quality requirements of collaborative business processes
    Falge, C. and Otto, B. and Österle, H.
    Proceedings of the Annual Hawaii International Conference on System Sciences (2011)
    High-quality data is not just a competitive factor for individual companies but also an enabler for collaboration in business networks. The paper takes a cross-company perspective on data quality and identifies requirements collaborative business processes pose to data quality. A qualitative content analysis on Business Networking case studies is conducted. The results show which combinations of data classes (e.g. order data, forecast data) and quality dimensions (e.g. business rule conformity) are crucial for the different collaborative business processes in business networks. The paper interprets the results and closes with a discussion of current data quality trends for collaborative processes. © 2012 IEEE.
    view abstract10.1109/HICSS.2012.8
  • Information and data quality in business networking: A key concept for enterprises in its early stages of development
    Otto, B. and Lee, Y.W. and Caballero, I.
    Electronic Markets 21 (2011)
    Information and data of high quality are critical for successful business performance in general and Business Networking in particular. As the trend toward sharing information between business partners and value networks is still increasing, the position paper aims at providing a comprehensive perspective on the state of research with regard to information and data quality in Business Networking. The paper shows that much has been achieved, but that fundamental aspects still remain unaddressed. Based on the results of a literature review, the paper identifies consequential areas of research and makes six propositions for future research. In doing so, the position paper aims at offering novel perspectives and at introducing new areas of research in a field of particularly high relevance in the networked business and electronic markets domain. © Institute of Information Management, University of St. Gallen 2011.
    view abstract10.1007/s12525-011-0063-1
  • Information and data quality in networked business
    Otto, B. and Lee, Y.W. and Caballero, I.
    Electronic Markets 21 (2011)
    view abstract10.1007/s12525-011-0062-2
  • Product data quality in supply chains: The case of Beiersdorf
    Hüner, K.M. and Schierning, A. and Otto, B. and Österle, H.
    Electronic Markets 21 (2011)
    A number of business requirements (e.g. compliance with regulatory and legal provisions, diffusion of global standards, supply chain integration) are forcing consumer goods manufacturers to increase their efforts to provide product data (e.g. product identifiers, dimensions) at business-to-business interfaces in a timely and accurate manner. The quality of such data is a critical success factor for efficient and effective cross-company collaboration. If compliance-relevant data (e.g. dangerous goods indicators) is missing or false, consumer goods manufacturers risk being fined and seeing their company's image damaged. And if logistics data (e.g. product dimensions, gross weight) is inaccurate or not provided in time, business with key account trading partners is endangered. To be able to manage the risk of business-critical data defects, companies must be able to a) identify such data defects, and b) specify and use metrics that allow them to monitor the data's quality. As scientific research on both these issues has come up with only a few results so far, this case study explores the process of identifying business-critical product data defects at the German consumer goods manufacturer Beiersdorf AG. Despite advanced data quality management structures, such defects still occur and can result in complaints, service level impairment and avoidable costs. The case study analyzes product data use and maintenance in Beiersdorf's ecosystem, identifies typical product data defects, and proposes a set of data quality metrics for monitoring those defects. © Institute of Information Management, University of St. Gallen 2011.
    view abstract10.1007/s12525-011-0059-x
  • A Method for Researcher-Practitioner Collaboration in Design-Oriented IS Research
    Österle, H. and Otto, B.
    Business & Information Systems Engineering 2 (2010)
    Design-oriented research in the Information Systems (IS) domain aims at delivering results which are both of scientific rigor and of relevance for practitioners. Today, however, academic researchers are facing the challenge of gaining access to and capturing knowledge from the practitioner community. Against this background, the paper proposes a method for Consortium Research, which is supposed to facilitate multilateral collaboration of researchers and practitioners during the research process. The method's design is based on a self-evaluating design process which was carried out over a period of 20 years. The paper's contribution is twofold. First, it addresses the science of design, since it proposes guidance to researchers for practitioner collaboration during the process of artifact design. Second, the method is an artifact itself, hence, the result of a design-oriented research process.
    view abstract10.1007/s12599-010-0119-3
  • Consortium research: A method for researcher-practitioner collaboration in design-oriented IS research
    Österle, H. and Otto, B.
    Business and Information Systems Engineering 52 (2010)
    Design-oriented research in the Information Systems (IS) domain aims at delivering results which are both of scientific rigor and of relevance for practitioners. Today, however, academic researchers are facing the challenge of gaining access to and capturing knowledge from the practitioner community. Against this background, the paper proposes a method for Consortium Research, which is supposed to facilitate multilateral collaboration of researchers and practitioners during the research process. The method's design is based on a self-evaluating design process which was carried out over a period of 20 years. The paper's contribution is twofold. First, it addresses the science of design, since it proposes guidance to researchers for practitioner collaboration during the process of artifact design. Second, the method is an artifact itself, hence, the result of a design-oriented research process. © 2010 Gabler Verlag.
    view abstract10.1007/s11576-010-0238-y
  • Integrating information systems: Case studies on current challenges
    Schmidt, A. and Otto, B. and Österle, H.
    Electronic Markets 20 (2010)
    Integration of information systems is a prerequisite for efficient collaboration within and between organizations. Despite intensive research on integration issues in Information Systems research, companies nowadays still encounter considerable challenges in integration projects. The reasons for this still engage researchers and practitioners. The paper at hand investigates current integration problems by deriving a framework for integration problems from literature research and applying it for the analysis of concrete cases from business practice. The examples are described in detail in explorative case studies examining three integration projects from different industries and with different objectives. From the comparison of theoretically derived and practically proven integration problems, the paper identifies integration problems that have been insufficiently addressed so far and discusses open challenges of integration in the Information Systems discipline. © Institute of Information Management, University of St. Gallen 2010.
    view abstract10.1007/s12525-010-0037-8
  • Organizing master data management: Findings from an expert survey
    Otto, B. and Reichert, A.
    Proceedings of the ACM Symposium on Applied Computing (2010)
    Master data management (MDM) is defined as an application-independent process which describes, owns and manages core business data entities. The establishment of the MDM process is a Business Engineering (BE) task which requires organizational design. This paper reports on the results of a questionnaire survey among large enterprises aiming at delivering insight into what tasks and master data classes MDM organizations cover ("scope") and how many people they employ ("size"). The nature of the study is descriptive, i.e. it allows for the identification of patterns and trends in organizing the MDM process. © 2010 ACM.
    view abstract10.1145/1774088.1774111
  • Relevance through consortium research? Findings from an expert interview study
    Otto, B. and Österle, H.
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 6105 LNCS (2010)
    The Information Systems (IS) community is discussing the relevance of its research. Design-oriented IS research is considered a promising approach since it combines practical relevance and scientific rigor. Only limited guidance, however, is available for the researcher to gain access to and exchange knowledge from the practitioners' domain. This is surprising insofar as the IS "ecosystem" is changing and research and innovation largely take place in the practitioners' community. Consortium research addresses the issue of getting access to and exchanging knowledge from the practitioners' community. It supports the development of artifacts and is characterized by close cooperation between the university and its partners in all stages of the design-oriented research process, practical validation of research results with partner companies, and a focus on the practical benefits of the research, with all research activities being funded by the consortium partners. The research question posed in this paper is what consortium research contributes to design-oriented IS research against the background of the aforementioned phenomena. The paper presents the findings from an expert interview study among professors of the German-speaking IS community in Europe. © 2010 Springer-Verlag.
    view abstract10.1007/978-3-642-13335-0_2
  • business and logistics networks

  • supply chain management
