•Heterogeneity in industry is increasing and poses new research challenges.
•TSN and SDN are future-proof standards and paradigms for industrial cabled networks.
•Wireless technologies are increasingly becoming an integral part of today's factories.
•Requirements of industrial networks strongly depend on the application context.
•Intelligent algorithms/networks will help to manage their ever-increasing complexity.
The real and effective ground of all new concepts dedicated to current advanced factories, as well as to the future digital ones, is the close cooperation of distributed applications in highly heterogeneous systems. Communication is the key enabling component, and all new approaches are shaped in practice by the demanding characteristics of industrial networks. These networks, together with new technologies derived from distant application fields, are the main technological means to accelerate the fast evolution of modern factory systems. Because communication requirements vary with the plurality of structures, components and application contexts, communication subsystems must be increasingly heterogeneous. Let us state it clearly: this evolution cannot be stopped at this stage, no single universal solution is possible, and the idea of one homogeneous network for everything is wishful thinking. This paper analyzes the state of the art in heterogeneous networking in industry. It investigates in depth both wired and wireless technologies from the point of view of technological aspects and relevant key performance indicators, such as those related to dependability, and it offers a prospective estimation of future trends.
Hospitals that routinely share patients are those that most critically need to engage in electronic health information exchange (HIE) with each other to ensure clinical information is available to inform treatment decisions. We surveyed pairs of hospitals in a nationwide sample to describe whether and how hospitals within each hospital referral region (HRR) that have the highest shared patient (HSP) volume engaged in HIE with each other.
We used Medicare's Physician Shared Patient Patterns data to identify the hospital pairs with the highest shared patient volume in each HRR. We surveyed a purposeful sample of pairs and then calculated descriptive statistics to compare: (1) HIE with the HSP hospital versus HIE with other hospitals, and (2) HIE with the HSP hospital versus federal measures of HIE engagement that are not partner-specific.
We received responses from 25.5% of contacted hospitals and 33.5% of contacted pairs, allowing us to examine information sharing among 68 hospitals in 63 pairs. Among respondents, 23% reported worse information sharing with their HSP hospital than with other hospitals, while 17% indicated better sharing with their HSP hospital and 48% indicated no difference. Our HSP-specific measures of HIE differed from federal measures of HIE engagement: while 97% of respondents are classified as routinely sending information electronically in federal measures, in our data only 63% did so with their HSP hospital.
Despite increased HIE engagement, our descriptive results indicate that HIE is not developing in a way that facilitates information exchange where it might benefit the most patients. New policy efforts, particularly those emerging from the 21st Century Cures Act, need to explicitly pursue strategies that ensure that HSP providers engage in exchange with each other.
Background: The extensive use of ICT in healthcare delivery has become important globally. As a result, patient data in many countries, including Rwanda, are dispersed across different Electronic Health Record (EHR) systems. Often these systems lack interoperability among healthcare organizations, resulting in several silos of isolated patient data.
Methods: In this study, the research strategy employed was a survey, with data collection conducted through semistructured interviews and an examination of documents. The gathered data were then analyzed thematically.
Results: The findings revealed that existing national-level policies do not explicitly promote health information exchange (HIE) in Rwanda. Moreover, there is no common data model supporting interoperability between healthcare facilities, and EHRs are not interoperable because there is no national HIE mediator. As a result, caregivers cannot access transferred patients' complete medical histories. Every healthcare facility has its own EHR system for data storage, and some also suffer frequent internet outages.
Conclusion: This study identified HIE challenges at different levels of interoperability in Rwandan EHR systems. Several of these challenges can be addressed by the new guidelines and standards for HIE released by the African Union, which will greatly help Rwanda and other African countries address them. Clear strategies for implementing interoperable EHRs can supplement the existing HIE guidelines and standards.
Cyber-physical systems (CPS) are interconnected architectures that employ analog and digital components as well as communication and computational resources for their operation and interaction with the physical environment. CPS constitute the backbone of enterprise (e.g., smart cities), industrial (e.g., smart manufacturing), and critical infrastructure (e.g., energy systems). Thus, their vital importance, interoperability, and plurality of computing devices make them prominent targets for malicious attacks aiming to disrupt their operations. Attacks targeting cyber-physical energy systems (CPES), given their mission-critical nature within the power grid infrastructure, can lead to disastrous consequences. The security of CPES can be enhanced by leveraging testbed capabilities to replicate and understand power system operating conditions, discover vulnerabilities, develop security countermeasures, and evaluate grid operation under fault-induced or maliciously constructed scenarios. Adequately modeling and reproducing the behavior of CPS can be a challenging task. In this paper, we provide a comprehensive overview of the CPS security landscape with an emphasis on CPES. Specifically, we demonstrate a threat modeling methodology to accurately represent the CPS elements, their interdependencies, as well as the possible attack entry points and system vulnerabilities. Leveraging the threat model formulation, we present a CPS framework designed to delineate the hardware, software, and modeling resources required to simulate the CPS and construct high-fidelity models that can be used to evaluate the system's performance under adverse scenarios. The system performance is assessed using scenario-specific metrics, while risk assessment enables prioritization of system vulnerabilities according to their impact on system operation.
The overarching framework for modeling, simulating, assessing, and mitigating attacks in a CPS is illustrated using four representative attack scenarios targeting CPES. The key objective of this paper is to demonstrate a step-by-step process that can be used to enact in-depth cybersecurity analyses, thus leading to more resilient and secure CPS.
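The risk-assessment step described above, ranking vulnerabilities by their impact on system operation, can be illustrated with a minimal sketch. The vulnerability names, likelihoods, and impact scores below are hypothetical, and the simple likelihood-times-impact score is one common convention, not necessarily the metric the paper uses:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    name: str
    likelihood: float  # estimated probability of exploitation (0..1)
    impact: float      # scenario-specific impact on grid operation (0..1)

def prioritize(vulns):
    """Rank vulnerabilities by a simple risk score = likelihood * impact."""
    return sorted(vulns, key=lambda v: v.likelihood * v.impact, reverse=True)

# Hypothetical CPES attack entry points with illustrative scores.
vulns = [
    Vulnerability("relay firmware tampering", 0.2, 0.9),
    Vulnerability("SCADA credential theft", 0.5, 0.7),
    Vulnerability("HMI operator phishing", 0.6, 0.4),
]
ranked = prioritize(vulns)
```

Under this scoring, a moderately likely attack with high operational impact (credential theft, 0.35) outranks a severe but unlikely one (firmware tampering, 0.18), which is the kind of prioritization the framework's risk assessment is meant to support.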
We developed and piloted a process for sharing guideline-based clinical decision support (CDS) across institutions, using health screening of newly arrived refugees as a case example.
We developed CDS to support care of newly arrived refugees through a systematic process including a needs assessment, a 2-phase cognitive task analysis, structured preimplementation testing, local implementation, and staged dissemination. We sought consensus from prospective users on CDS scope, applicable content, basic supported workflows, and final structure. We documented processes and developed sharable artifacts from each phase of development. We publicly shared CDS artifacts through online dissemination platforms. We collected feedback and implementation data from implementation sites.
Responses from 19 organizations demonstrated a need for improved CDS for newly arrived refugee patients. A guided multicenter workflow analysis identified 2 main workflows used by organizations that would need to be supported by shared CDS. We developed CDS through an iterative design process, which was successfully disseminated to other sites using online dissemination repositories. Implementation sites had a small-to-modest analyst time commitment but reported a good match between CDS and workflow.
Sharing of CDS requires overcoming technical and workflow barriers. We used a guided multicenter workflow analysis and online dissemination repositories to create flexible CDS that has been adapted at 3 sites. Organizations looking to develop sharable CDS should consider evaluating the workflows of multiple institutions and collecting feedback on scope, design, and content in order to make a more generalizable product.
In the context of Industry 4.0, smart factories use advanced sensing and data analytic technologies to understand and monitor the manufacturing processes. To enhance production efficiency and reliability, statistical Artificial Intelligence (AI) technologies such as machine learning and data mining are used to detect and predict potential anomalies within manufacturing processes. However, due to the heterogeneous nature of industrial data, the knowledge extracted from it is sometimes presented in a complex structure. This gives rise to the semantic gap issue, that is, a lack of interoperability among different manufacturing systems. Furthermore, as Cyber-Physical Systems (CPS) become more knowledge-intensive, uniform knowledge representation of physical resources and real-time reasoning capabilities for analytic tasks are needed to automate the decision-making processes for these systems. These requirements highlight the potential of using symbolic AI for predictive maintenance.
To automate and facilitate predictive analytics in Industry 4.0, this paper presents a novel Knowledge-based System for Predictive Maintenance in Industry 4.0 (KSPMI). KSPMI is built on a novel hybrid approach that leverages both statistical and symbolic AI technologies. The statistical side uses machine learning and chronicle mining (a special type of sequential pattern mining) to extract machine degradation models from industrial data. The symbolic side, based on domain ontologies and logic rules, uses the extracted chronicle patterns to query and reason over system input data with rich domain and contextual knowledge. Specifically, Semantic Web Rule Language (SWRL) rules generated from chronicle patterns are combined with domain ontologies to perform ontology reasoning, which enables the automatic detection of machinery anomalies and the prediction of future events. KSPMI is evaluated and tested on both real-world and synthetic data sets.
•A novel hybrid approach combining statistical and symbolic AI for predictive maintenance is proposed.
•KSPMI is a novel software for Industry 4.0 predictive maintenance.
•KSPMI is evaluated and tested on both real-world and synthetic data sets.
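The core idea of chronicle mining, recognizing an ordered set of event types whose pairwise time gaps fall within learned bounds, can be sketched in a few lines. The event names, timestamps, and gap constraints below are hypothetical, and KSPMI itself compiles such patterns into SWRL rules for ontology reasoning rather than matching them procedurally as done here:

```python
def matches_chronicle(events, pattern, constraints):
    """Check whether a timestamped event log contains an occurrence of a
    chronicle: an ordered list of event types with pairwise time-gap
    constraints (min_gap, max_gap) between pattern positions."""
    def ok(times):
        return all(lo <= times[j] - times[i] <= hi
                   for (i, j), (lo, hi) in constraints.items())
    def search(idx, start, times):
        if idx == len(pattern):
            return ok(times)
        for k in range(start, len(events)):
            etype, t = events[k]
            if etype == pattern[idx]:
                if search(idx + 1, k + 1, times + [t]):
                    return True
        return False
    return search(0, 0, [])

# Hypothetical degradation log: (event type, timestamp in minutes).
log = [("overheat", 0), ("normal", 3), ("vibration", 5), ("failure", 9)]
pattern = ["overheat", "vibration", "failure"]
# Overheat followed by vibration within 1-6 min, then failure within 1-5 min.
anomaly = matches_chronicle(log, pattern, {(0, 1): (1, 6), (1, 2): (1, 5)})
```

When such a chronicle matches a live event stream up to its final event, the system can predict the occurrence of that final event (here, a failure) within the learned time window.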
Modelling with AAS and RDF in Industry 4.0
Rongen, Sjoerd; Nikolova, Nikoletta; van der Pas, Mark
Computers in Industry, Volume 148, June 2023
Journal Article · Peer-reviewed · Open access
Industry 4.0 has proposed the Asset Administration Shell (AAS) model for digital twins. This model should help to solve interoperability issues, a topic that is also addressed by the Semantic Web and its Resource Description Framework (RDF). AAS and RDF-based models have their own strengths. AAS models are easier to integrate with operational technologies in a production environment, whereas RDF-based models offer more semantic expressiveness and advanced querying. In the Horizon MAS4AI project we found that both modelling paradigms can complement each other to develop agent-based digital twins for modular production environments. In this work we propose two different approaches to bridge the two modelling paradigms. First, we define a set of mapping rules to generate an AAS model from a given RDF-based model, supporting model development. Second, we propose to use RDF-based models to generate a digital shadow of AASs to improve semantic discoverability. Preliminary results demonstrate that heterogeneous metamodels do not preclude semantic interoperability, and that greater functionality can be achieved than by using either model in isolation. The solutions will be further developed in collaboration with pilot lines in the MAS4AI project.
•Bridging the gap between semantic metamodels.
•Developed conversion from RDF to AAS model.
•Solution design combining RDF and AAS.
•Application in industrial pilot lines.
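One possible flavour of such a mapping rule can be sketched as follows: each RDF property of a subject becomes a Property-typed SubmodelElement, with its idShort derived from the predicate's local name. The IRIs and values are hypothetical, and the actual MAS4AI rule set is considerably richer (handling types, references, and nesting):

```python
def rdf_to_aas(triples, subject):
    """Map the RDF triples about `subject` to an AAS submodel skeleton:
    each (subject, predicate, object) triple becomes a SubmodelElement
    whose idShort is the predicate's local name."""
    def local_name(iri):
        # Take the fragment after '#', else the last path segment.
        return iri.rsplit("#", 1)[-1].rsplit("/", 1)[-1]
    elements = [
        {"idShort": local_name(p), "modelType": "Property", "value": o}
        for s, p, o in triples if s == subject
    ]
    return {"idShort": local_name(subject), "modelType": "Submodel",
            "submodelElements": elements}

# Hypothetical RDF triples describing a production asset.
triples = [
    ("http://ex.org/Robot1", "http://ex.org/hasSpeed", "1.5"),
    ("http://ex.org/Robot1", "http://ex.org/locatedIn", "Cell3"),
    ("http://ex.org/Robot2", "http://ex.org/hasSpeed", "0.8"),
]
submodel = rdf_to_aas(triples, "http://ex.org/Robot1")
```

The inverse direction described in the abstract, generating an RDF digital shadow of existing AASs, would traverse such a submodel and emit triples, making the AAS content queryable with SPARQL.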
The main aim of this research is to enhance the integration of life cycle assessment (LCA) and life cycle costing (LCC) methodologies with building information modelling (BIM), as existing approaches still have limitations (e.g. interoperability issues, non-editable databases). For that purpose, an automatic LCA/LCC analysis within a BIM-based environment is proposed. A BIM-based environmental and economic life cycle assessment (BIMEELCA) prototype tool was developed by the authors to improve the integration of BIM with LCA and LCC. Moreover, a pilot case study located in the Netherlands was used to validate the work developed in this research, thus demonstrating the usefulness of this approach for the construction industry. This study unveiled the potential of BIM-based simulations for the assessment of the environmental and economic impacts of buildings by integrating semantic information in the model. The work presented in this research is expected to contribute to the development of automatic sustainability simulations, creation of tailor-made BIM objects’ libraries, and use of historical data contained within data-rich models for predictive analyses (i.e. analyses that identify the most suitable solutions based on the data from previous projects).
•LCA and LCC analyses were performed on an office building in the Netherlands.
•Environmental, economic, and physical information were inserted in the BIM objects.
•An automatic streamlined LCA/LCC analysis is possible in semantically rich models.
•The BIM-LCA/LCC approach was compared with two other traditional approaches.
•This work is expected to contribute to the development of green BIM libraries.
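The essence of such an automatic BIM-based LCA/LCC analysis, reading quantities from semantically enriched model objects and multiplying them by per-unit environmental and economic factors, can be sketched minimally. The object names, quantities, and factors below are illustrative only and are not taken from the BIMEELCA tool or its databases:

```python
def assess(bim_objects):
    """Aggregate embodied carbon and life cycle cost from BIM objects
    enriched with per-unit environmental and economic impact factors."""
    co2 = sum(o["quantity"] * o["co2_per_unit"] for o in bim_objects)
    cost = sum(o["quantity"] * o["cost_per_unit"] for o in bim_objects)
    return {"embodied_co2_kg": co2, "life_cycle_cost": cost}

# Hypothetical BIM objects: quantities in m3 / t, factors per unit.
objects = [
    {"name": "concrete slab", "quantity": 120,  # m3
     "co2_per_unit": 300, "cost_per_unit": 150},
    {"name": "steel beams", "quantity": 8,      # t
     "co2_per_unit": 1900, "cost_per_unit": 900},
]
result = assess(objects)
```

Embedding the impact factors directly in the BIM objects, rather than in a separate non-editable database, is what allows the assessment to update automatically as the design model changes.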
Complex system development and maintenance face the challenge of dealing with different types of models, due to language affordances, preferences, sizes, and so forth, which involves interaction between users with different levels of proficiency. Current conceptual data modelling tools do not fully support these modes of working: the interaction between multiple models in multiple languages must be clearly specified to ensure the models keep their intended semantics, which extant tools lack. The key objective is to devise a mechanism to support semantic interoperability in hybrid tools for multi-modal modelling in a plurality of paradigms, all within one system. We propose FaCIL, a framework for such hybrid modelling tools. FaCIL maps UML, ER and ORM2 into a common metamodel with rules that provide the central point of management among the models and that link to the formalisation and logic-based automated reasoning. FaCIL can represent models in different formats while preserving their semantics, supports several editing workflows, and maintains a clear separation of concerns for typical conceptual modelling activities in an interoperable and extensible way. It structures and facilitates the interaction between visual and textual conceptual models, their formal specifications, and abstractions, as well as tracking and propagating updates across all representations. FaCIL is compared against the requirements and assessed with a use case; the proof-of-concept implementation in the web-based modelling tool crowd 2.0 demonstrates its viability, meets the requirements, and fully supports the use case.
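The common-metamodel idea behind FaCIL, normalising constructs from UML, ER, and ORM2 to shared kinds so that edits can be managed and propagated centrally, can be sketched as a simple mapping table. The construct names and kinds below are a hypothetical fragment; the actual FaCIL metamodel and its logic-based formalisation are far richer:

```python
# Hypothetical fragment of a common metamodel: language-specific
# constructs are normalised to shared kinds.
COMMON_KIND = {
    ("UML", "Class"): "ObjectType",
    ("ER", "Entity"): "ObjectType",
    ("ORM2", "EntityType"): "ObjectType",
    ("UML", "Attribute"): "Attribute",
    ("ER", "Attribute"): "Attribute",
    ("ORM2", "ValueType"): "Attribute",
}

def to_common(language, construct, name):
    """Normalise a language-specific construct to a common-metamodel element,
    recording its source language for round-tripping."""
    kind = COMMON_KIND.get((language, construct))
    if kind is None:
        raise ValueError(f"No mapping for {construct} in {language}")
    return {"kind": kind, "name": name, "source": language}
```

Because a UML class, an ER entity, and an ORM2 entity type all normalise to the same common element, a rename applied to that element can be propagated back to every representation while preserving the intended semantics.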
This article puts forth the concept of protocol power as the disproportionate influence of dominant platform actors to shape and set industry-wide standards, and thus determine certain rules of ...inclusion at the technical, existential level of protocol. In this way, protocol power shapes and prefigures dynamics of platform power and intermediation as ever more objects are made “smart” and connected through such standards. To examine protocol power, I take up the case of Matter, an emerging Internet of Things connectivity standard led by Big American Tech promising to make all smart things interoperable across all platform ecosystems in the face of growing critiques from regulators, developers, and users. Increased levels of interoperability and connectivity are opening up new sites of data production and accumulation, and provide more opportunities for service and subscription provision by dominant platforms. This article argues that Matter and its promises exemplify processes of industry self-regulation, networked governance, and power-sharing among dominant actors in the tech industry in efforts to maintain and expand their market and platform power.