Abstract
Data models are the backbone of digital information exchange, since they define the structure of the data to be exchanged. Just as information requirements vary, so do data models – in level of detail, level of abstraction, and domain coverage. Due to these differences, communication between some data models is easy, while between others it is difficult. Within the Building Information Modelling (BIM) initiative, the IFC standard data model varies in level of detail and abstraction and covers a large domain. In contrast, the Austrian ÖGG guideline has a consistent level of detail and abstraction and focuses on a single domain, subsurface engineering. For loss- and distortion-free information exchange, reliable translation across this multitude of data models is indispensable.
In this paper, we present formal criteria for distinguishing between semantics-carrying data models, such as IFC and ÖGG, and translating data models, such as CAEX and SIMULTAN, that provide reliable communication bridges between them. We show that translating data models are an indispensable part of the data model infrastructure, even within a single domain. For evaluation purposes, we demonstrate our approach on a subsurface engineering use case.
Data Drops for Tunnel Information Modelling
Paskaleva, Galina; Niedermoser, Christoph; Vierhauser, Michael ...
Geomechanik und Tunnelbau, June 2022, Volume 15, Issue 3
Journal Article
Peer reviewed
In the Architecture, Engineering and Construction (AEC) industry, as well as in the tunnelling domain, inter-company processes between partners in different roles in large-scale construction projects still exhibit great potential for digitalisation. Ideally, information is shared seamlessly between partners according to the Building Information Modelling (BIM) paradigm. Today, different types of artefacts (e.g., models, plans, documents) are shared at different points in time, differing in terms of requirements, information content, and data formats. In this article, we extend and prototypically implement the concept of Data Drops to provide those artefacts in digitalised form via a shared Data Drop management platform. For this purpose, we have developed a formal, well-defined indexed data structure at the metadata level. This not only facilitates traceability, but also enables searching for specific meta-information and provides a common view on Data Drops. In addition, a networked view between different drops can be provided. The approach is evaluated on the use case of a real tunnel construction project.
Data Drops in Tunnel Construction
In the AEC industry (architecture, engineering and construction), as in tunnelling, inter-company processes between partners in different roles in large construction projects still hold great potential with regard to digitalisation. Information should be exchangeable seamlessly between the partners according to the Building Information Modelling (BIM) paradigm. Today, different types of artefacts (e.g., models, plans, documents) are exchanged at different points in time, differing in terms of requirements, information content, and data formats. In this article, the concept of Data Drops is extended and prototypically implemented in order to make these artefacts available in digitalised form via a Data Drop management platform. To this end, we develop a formally well-defined indexed data structure at the metadata level. This enables traceability, findability, and a common view on the Data Drops. In addition, a networked view between different drops can be provided. The approach is evaluated on the use case of a real tunnel construction project.
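An indexed metadata structure of the kind described above could be sketched as follows. This is purely illustrative: the class and field names (`DataDrop`, `DropIndex`, `artefact_type`, `linked_drops`) are assumptions for demonstration, not the schema from the paper.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DataDrop:
    """Hypothetical metadata record for one shared artefact."""
    drop_id: str
    project: str
    partner: str            # issuing project partner / role
    artefact_type: str      # e.g. "model", "plan", "document"
    created: date
    linked_drops: tuple = ()  # references to related drops (networked view)

class DropIndex:
    """Common metadata-level view over all shared Data Drops."""
    def __init__(self):
        self._drops = {}

    def add(self, drop: DataDrop):
        self._drops[drop.drop_id] = drop

    def search(self, **criteria):
        # Return drops whose metadata matches every given criterion.
        return [d for d in self._drops.values()
                if all(getattr(d, k) == v for k, v in criteria.items())]

    def network(self, drop_id: str):
        # Networked view: follow links from one drop to related drops.
        return [self._drops[i] for i in self._drops[drop_id].linked_drops
                if i in self._drops]
```

The index alone already supports the three uses named in the abstract: searching by meta-information, a common view over all drops, and a networked view between them.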
The documentation of the tunneling process is a crucial task in every tunnel construction project. It provides evidence of the work performed and thus serves as a basis for invoicing and for several further analyses. Continuous digitalisation of this documentation is therefore essential. For this purpose, we provide a digital Tunneling Information Management System (TIMS), a prototypically implemented software tool for replacing the still common paper-based documentation process of tunneling projects using the New Austrian Tunneling Method (NATM). The data model presented here defines the data structures managed by this tool. Based on this, the software architecture and the implementation of TIMS are shown.
The Tunneling Information Management System – A Tool for Documenting Tunnel Construction Processes in NATM Projects
The documentation of tunnel construction is an important task in every tunnelling project. It provides evidence of the work performed and thus serves as the basis for invoicing and for various further analyses. End-to-end digitalisation of this documentation is therefore essential. For this purpose, we provide a digital Tunneling Information Management System (TIMS), a prototypically implemented software tool for replacing the still common paper-based documentation process of tunnelling projects using the New Austrian Tunneling Method (NATM). The data model presented defines the data structures managed by this tool. Building on this, the software architecture and the implementation of TIMS are presented.
Data exchange and management methods are of paramount importance in areas as complex as the Architecture, Engineering and Construction industries and Facility Management. For example, Big Open BIM requires seamless information flow among an arbitrary number of applications. The backbone of such information flow is robust integration, whose tasks include overcoming technological as well as semantic and pragmatic gaps and conflicts, both within and between data models. In this work, we introduce a method for integrating the pragmatics of independent applications at design time and their semantics at run time into so-called "integration facades". We utilize Model-Driven Engineering for the automatic discovery of functionalities and data models, and for finding a user-guided consensus. We present a case study involving the domains of architecture, building physics and structural engineering to evaluate our approach in object-oriented as well as data-oriented programming environments. The results produce, for each scenario, a single integration facade that acts as a single source of truth in the data exchange process.
• Data model integration facades involve syntax-independent semantic consolidation and pragmatic conflict resolution.
• Modeling framework based on MDE that enforces clean separation of syntax, semantics, and representation.
• Multi-case study with different tools in the AEC domains, in object-oriented as well as data-oriented programming environments.
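The facade idea can be illustrated with a minimal sketch. Everything here is an assumption for demonstration: the two tool stubs, their mismatched vocabularies and units, and the `IntegrationFacade` name are hypothetical, not the paper's generated facades.

```python
class ArchitectureTool:
    """Stub for an architecture application that reports in centimetres."""
    def wall_thickness_cm(self, wall_id):
        return 30  # illustrative fixed value

class PhysicsTool:
    """Stub for a building-physics application that reports in metres."""
    def thickness_m(self, element_id):
        return 0.30  # illustrative fixed value

class IntegrationFacade:
    """Single source of truth: one semantic name, per-tool adapters that
    reconcile naming and unit conflicts behind a common interface."""
    def __init__(self, arch, physics):
        self._readers = {
            "arch": lambda eid: arch.wall_thickness_cm(eid) / 100.0,
            "physics": lambda eid: physics.thickness_m(eid),
        }

    def thickness_m(self, element_id, source="arch"):
        # Callers always see one name and one unit, regardless of source.
        return self._readers[source](element_id)
```

The point of the sketch is the consolidation step: both tools expose the same quantity under different names and units, and the facade resolves that conflict once, so every downstream consumer reads a single, consistent value.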
For continuously checking and updating the virtual representation of a real system during operation, continuous sensing and interpretation of raw sensor data is a must. The challenge is to bundle sensor value streams (e.g., from IoT networks) and aggregate them to a higher logical state level, to enable process-oriented viewpoints and to handle uncertainties about sensor measurements and state-realization precision. To address these uncertainties, so-called "tolerance ranges" must be defined within which logical states are detected during operation with acceptable deviations. Specifying such tolerance ranges manually is a time-consuming, error-prone task and often not feasible due to the huge associated value search space. To tackle this challenge, we turn the problem into an optimization problem. We present a framework based on meta-heuristic search that enables the automatic configuration of tolerance ranges based on available execution traces of multiple sensor value streams. An exploratory study evaluates the approach: we implemented a lab-sized demonstrator of a five-axis grip-arm robot, which we continuously monitored during operation in a simulated environment. The evaluation shows the advantage of using meta-heuristic optimizers such as Harmony Search or a Genetic Algorithm to automatically identify stable tolerance ranges for state detection at runtime. Note to Practitioners – Monitoring sensor value streams is nowadays a frequently employed technique in many automation domains. However, combining and mapping single value streams to higher-level state-based representations, such as state machines or other design-time models, is a major challenge due to measurement and realization-precision uncertainties. Thus, simply mapping monitored raw data to these design descriptions can lead to falsely identified or missed states.
To improve this situation, we present an approach that provides a mechanism to continuously analyze data streams during operation by automatically finding appropriate tolerance ranges to detect realized system states. The approach uses a small set of annotated execution traces and meta-heuristic search to derive optimal tolerance ranges, which provide high correctness and completeness of the identified system states. This approach forms the basis for building a "vertical bridge" from the operation-technology layer, with its raw sensor data streams, to the IT layer, where state-based process views are provided for monitoring and analytics, e.g., via process mining.
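The optimization problem described above can be sketched in a few lines. This is a deliberately simplified illustration under stated assumptions: one sensor, a symmetric range `[nominal - w, nominal + w]`, and plain random search standing in for the paper's meta-heuristics (Harmony Search, Genetic Algorithm); the function names and trace format are hypothetical.

```python
import random

def f1_for_width(width, nominal, trace):
    """Score one candidate tolerance width against an annotated trace
    of (sensor_value, is_target_state) pairs using the F1 measure."""
    tp = fp = fn = 0
    for value, in_state in trace:
        detected = abs(value - nominal) <= width
        if detected and in_state:
            tp += 1
        elif detected:
            fp += 1
        elif in_state:
            fn += 1
    if tp == 0:
        return 0.0
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def search_tolerance(nominal, trace, iterations=200, seed=0):
    """Random search over candidate widths; a meta-heuristic would
    explore this space more cleverly, but the objective is the same."""
    rng = random.Random(seed)
    spread = max(abs(v - nominal) for v, _ in trace) or 1.0
    best_w, best_f1 = 0.0, 0.0
    for _ in range(iterations):
        w = rng.uniform(0.0, spread)
        f1 = f1_for_width(w, nominal, trace)
        if f1 > best_f1:
            best_w, best_f1 = w, f1
    return best_w, best_f1
```

A real configuration would optimize many widths jointly (one per sensor and state), which is exactly why the search space becomes too large for manual tuning.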
Industry 4.0 production systems must support flexibility in various dimensions, such as the products to be produced, the production processes to be applied, and the available machinery. In this article, we present a novel approach to designing and controlling smart manufacturing systems. The approach is reactive, that is, it responds to unplanned situations, and it implements an iterative refinement technique, that is, it optimizes itself during runtime to better accommodate production goals. For realizing these advances, we present a model-driven methodology and provide a prototypical implementation of such a production system. In particular, we employ the Planning Domain Definition Language (PDDL) as our artificial-intelligence environment for automated planning of production processes and combine it with one of the most prominent Industry 4.0 standards for the fundamental production system model: IEC 62264. We show how to plan the assembly of small trucks from available components and how to assign specific production operations to available production resources, including robotic manipulators and transportation-system shuttles. The evaluation results indicate that the presented approach is feasible and can significantly strengthen the flexibility of production systems during runtime. Note to Practitioners – Smart production is an umbrella term for a number of shifts and initiatives that deal with the digitization of manufacturing/production systems and related issues and potentials. In this work, we present an approach for utilizing automated planning to create production plans. This is in contrast to the traditional approach, where recipes are programmed into the production system ahead of time. However, automated planning relies on specific languages and tools that are hard for non-experts to master, a factor that has strongly limited the use of plan-driven approaches in industrial automation practice.
Thus, we propose to generate planning tasks automatically with model-driven engineering techniques. We utilize the industrial standard IEC 62264 for the description of the production system and the academic standard Planning Domain Definition Language (PDDL) for planning. PDDL is handled completely transparently; that is, the user is shielded from its complexity by employing the IEC 62264 model as the sole frontend.
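The model-to-planner step can be illustrated with a toy generator. The input fields (`equipment`, `materials`) stand in for a much richer IEC 62264 resource model, and the emitted `assembly` domain is invented for the example; none of this is the authors' actual transformation.

```python
def to_pddl_problem(name, equipment, materials, goal_products):
    """Emit a minimal PDDL problem string from a simplified,
    IEC 62264-style description of resources and goals."""
    objs = " ".join(f"{e} - equipment" for e in equipment) + " " + \
           " ".join(f"{m} - material" for m in materials)
    init = " ".join(f"(available {e})" for e in equipment)
    goal = " ".join(f"(produced {p})" for p in goal_products)
    return (f"(define (problem {name}) (:domain assembly)\n"
            f"  (:objects {objs})\n"
            f"  (:init {init})\n"
            f"  (:goal (and {goal})))")
```

Because the PDDL text is generated, the engineer only ever edits the production system model; the planner input is a derived artifact, which is the "sole frontend" idea in miniature.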
Sustainability is defined by current research as an interdisciplinary field comprising environmental, social, and economic aspects. This paper presents a systematic literature review, following the PRISMA guidelines, investigating how authors currently view sustainability issues in the specific context of tunneling. We introduce a new methodology for reviewing sustainability aspects in an interdisciplinary way, in which key bibliographic metrics are derived from the metadata of the reviewed literature. Regarding the content of the articles, we cluster sustainability aspects into specific topics and discuss challenges and solutions. In addition, we examine the role of digital technologies applied in sustainable tunneling. Our results show that there is a lack of interdisciplinary studies and that current research does not represent all three dimensions of sustainability equally. Current research focuses on assessing the status quo instead of presenting specific solutions. Finally, we see great potential to further leverage digital tools to enable sustainable tunneling.
The German working committee for "Industrie 4.0" identified horizontal integration throughout value networks and vertical integration of networked manufacturing systems as key issues in the context of smart factories. To this end, we aim for a universal model-driven industrial engineering framework spanning production chains and value networks. We build on the Resource Event Agent (REA) business ontology (ISO/IEC 15944-4) to describe external activities requiring horizontal integration with business partners, and internal activities serving vertical integration within a manufacturing enterprise. We plan to apply the ISA-95 industry standard (ANSI/ISA-95; DIN EN 62264) to describe vertical integration within an enterprise and its decentralized, networked production plants. As a first step, presented in this paper, we extend the REA ontology with useful concepts from ISA-95 towards an integrating modeling framework.
The development of software tools is a collaborative process involving both domain experts and software engineers. This requires efficient communication that accounts for their different expertise and perspectives. Additionally, the two groups use language and communication tools in disparate ways, which may lead to hidden misunderstandings in the requirement analysis phase and potentially to implementation problems later on that are difficult and costly to correct. In this paper, we demonstrate this challenge via a use case from the tunneling domain, in particular the requirement analysis phase for software capable of handling the data model of the subsoil. Domain experts can best express the complexity of their field by describing its artifacts, which in most cases are incomprehensible to software engineers. We outline a method that interleaves requirement analysis and software modeling to enable an iterative increase in the accuracy and completeness of the information extracted from those artifacts and integrated into a flexible software model, which can produce testable software code automatically. Furthermore, we present a prototypical implementation of our method and a preliminary evaluation of the approach.