Interoperability (IOP) is the ability of a product or system – whose interfaces (APIs) are publicly documented – to connect to and operate with other products or systems without restrictions. Interoperability further enables information and usable data to be properly exchanged and ensures the alignment of different business processes in critical sectors. In addition, it is a prerequisite for transparent, domain-agnostic, and sustainable public sector digital services, where Public Administrations (PA) can efficiently interact across borders and domains by using common frameworks, standards, and processes for sharing information and data. The European Interoperability Framework (EIF) enables interoperability through guidelines for digital services. Alignment with the EIF is therefore pivotal for European Union (EU) countries, since regulations that facilitate and impose the implementation of European policies, such as the Single Digital Gateway (SDG) regulation and the Once-Only Principle (OOP), consider IOP a crucial technical and operational component of government digital services. This article proposes an update of the Greek National Interoperability Framework (NIF) based on the guidelines of the EIF, the OOP, and other technological trends, in conjunction with new legal and policy provisions. The proposed assessment methodology can be reused in other countries and can be further adapted for updating the EIF.
• Interoperability is embedded in digital services and is a prerequisite for transparent, domain-agnostic, and sustainable public sector digital services.
• This proposal contributes to digital transformation and interoperability knowledge by providing a systematic methodology for identifying gaps and proposing concrete improvements and recommendations.
• The methodology incorporates technological trends such as the Once-Only Principle (OOP), the CEF Building Blocks (BB), and the Single Digital Gateway (SDG) regulation, in conjunction with new legal and policy provisions, and can be reused in other countries and for updating the EIF.
In recent years, blockchain technology has drawn a lot of attention, especially in the field of decentralised finance (DeFi). However, scalability problems have come to light as a significant obstacle to the broad use of blockchain-based applications. To address scalability, this paper presents a decentralised finance application with three main components: network-agnostic token transfers between accounts, a Polygon Proof-of-Stake bridge to enable efficient asset transfers, and additional liquidity for the swapping application. The first feature, network-agnostic interoperability, facilitates token transfers between blockchain networks, allowing users to access and transact across them with ease. The second component, the Polygon Proof-of-Stake bridge, makes asset transfers more efficient by taking advantage of the Polygon network's scalability, which drastically lowers transaction costs and processing times. Finally, adding more liquidity to the swapping programme makes it more scalable by guaranteeing that sufficient funds are available for transactions, which prevents delays and bottlenecks. Adding these three features to the decentralised finance application efficiently addresses blockchain's scalability issue, creating new opportunities for the mass adoption and use of blockchain-based financial services.
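The liquidity point can be illustrated with a constant-product market maker, the most common swap design; the abstract does not state which pool formula the application uses, so the Uniswap-style formula, function name, and reserve figures below are illustrative assumptions only:

```python
# Minimal sketch of a constant-product swap pool (x * y = k). The AMM formula is
# assumed, not taken from the paper; all numbers are illustrative.

def swap_out(reserve_in: float, reserve_out: float, amount_in: float) -> float:
    """Tokens received for amount_in, keeping reserve_in * reserve_out constant."""
    k = reserve_in * reserve_out
    new_reserve_in = reserve_in + amount_in
    return reserve_out - k / new_reserve_in

# Deeper liquidity -> less slippage for the same trade size.
shallow = swap_out(1_000, 1_000, 100)       # ~90.9 tokens out
deep = swap_out(100_000, 100_000, 100)      # ~99.9 tokens out
assert deep > shallow
```

This is why topping up pool liquidity reduces delays and bottlenecks: the deeper the reserves, the closer a trade executes to the quoted price.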
Technological developments have resulted in a trend of cryptocurrencies that use a technology called blockchain to create and record all transactions in a digital ledger. Alongside the emergence of cryptocurrencies, this development has also given rise to crimes in the digital world, such as data leakage and cyber espionage. These threats can be mitigated by applying blockchain technology to existing databases. Therefore, a system is needed that facilitates the use of blockchain technology and can migrate data from an existing relational database into a blockchain-based database. The system developed in this study was built with the FastAPI framework, which uses the Python programming language, and the React framework, which uses the JavaScript programming language. The system was tested using the Katalon and Wireshark tools to perform throughput testing and man-in-the-middle attacks. Evaluation is based on the average throughput time and the results of Wireshark packet capture. The system designed in this research is expected to help overcome interoperability problems when using blockchain and to improve relational database integrity. The test results show that the system is safe from man-in-the-middle attacks while sending data through the API and has a throughput time 4.151 seconds faster than the BigchainDB system.
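The integrity idea behind moving relational rows into a blockchain-based store can be sketched as a hash chain; the paper's actual FastAPI/React implementation is not reproduced here, and the function names and record fields below are hypothetical:

```python
import hashlib
import json

# Hedged sketch: each relational row is appended to a hash chain, so tampering
# with any stored record breaks verification. Names and fields are illustrative.

def _digest(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list, record: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev_hash,
                  "hash": _digest(record, prev_hash)})

def verify(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != _digest(block["record"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append(chain, {"id": 1, "name": "alice"})
append(chain, {"id": 2, "name": "bob"})
assert verify(chain)
chain[0]["record"]["name"] = "mallory"   # simulate tampering
assert not verify(chain)
```

A serving layer such as FastAPI would then expose `append` and `verify` as endpoints; the chain itself is what makes silent modification of migrated rows detectable.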
Sandvik and Flanders signed an interoperability memorandum of understanding to develop and deliver a digital interface between the Flanders ARDVARC Autonomous Drill System and Sandvik iSeries rotary blasthole drills. The interface will allow the rigs to integrate the system with little to no modification, making it a plug-and-play solution, and will simplify deployment of Sandvik rigs to mine sites that already use the system. Further, the open-system approach ensures that a customer who adopts the system retains the OEM warranty and aftermarket support. The first deployment of the digital interface is scheduled for Q4 2022, with further deployments scheduled soon after.
The aim of this article is to present a methodology for guiding enterprises in implementing and improving interoperability. This methodology is based on three components: a framework of interoperability, which structures specific interoperability solutions and is composed of abstraction levels, views, and approaches dimensions; a method to measure interoperability, both before a partnership (maturity) and during it (operational performance); and a structured approach defining the steps of the methodology, from the expression of an enterprise's needs to the implementation of solutions. The relationships that consistently link these components form the methodology and enable interoperability to be developed step by step. Each component of the methodology and the way it operates is presented. The overall approach is illustrated with a case study of a process between a given company and its dealers. Conclusions and future perspectives are given at the end of the article.
Over the past decades, tremendous technological advances have emerged in human motion analysis (HMA).
How has technology for analysing human motion evolved over the past decades, and what clinical applications has it enabled?
The literature on HMA has been extensively reviewed, focusing on three main approaches: Fully-Instrumented Gait Analysis (FGA), Wearable Sensor Analysis (WSA), and Deep-Learning Video Analysis (DVA), considering both technical and clinical aspects.
FGA techniques, relying on data collected using stereophotogrammetric systems, force plates, and electromyographic sensors, have improved dramatically, providing highly accurate estimates of the biomechanics of motion. WSA techniques have developed alongside advances in data collection at home and in community settings. DVA techniques have emerged through artificial intelligence, which has marked the last decade. Some authors have considered WSA and DVA techniques as alternatives to “traditional” HMA techniques and have suggested that WSA and DVA are destined to replace FGA.
We argue that FGA, WSA, and DVA complement each other and should hence be regarded as “synergistic” in the context of modern HMA and its clinical applications. We point out that DVA techniques are especially attractive for screening, that WSA methods enable data collection in the home and community over extended periods, and that FGA maintains superior accuracy and should be the preferred technique when complete and highly accurate biomechanical data are required. Accordingly, we envision that future clinical applications of HMA would favour screening patients using DVA in the outpatient setting. If deemed clinically appropriate, WSA would then be used to collect data in the home and community to derive relevant information. If accurate kinetic data are needed, patients should be referred to specialised centres where an FGA system is available, together with medical imaging and thorough clinical assessments.
• Gait analysis by stereophotogrammetry provides highly accurate biomechanics of motion.
• Wearable sensors enable ecological data collection at home and in community settings.
• Deep-learning video analysis can perform motion capture without instruments on subjects.
• Several authors suggest more modern technologies shall replace traditional gait analysis.
• We argue that these three technologies complement each other and shall be “synergistic”.
• RxNorm dose forms can be aligned to EDQM using RxNorm DF codes.
• Key EDQM codes for dose form identification are RCA, ISI, TRA, and AME.
• An alignment obstacle is the lower granularity of RxNorm compared to EDQM.
• Many unique dosage forms are comprised of the same EDQM descriptors.
There is currently no system that aligns pharmaceutically equivalent medicinal products across nations, creating obstacles to transnational medication prescribing and medical research. EDQM has been internationally recognised as the leading system for systematic pharmaceutical product descriptions. RxNorm is a critical terminology maintained in the US and widely used in applications internationally; these applications would benefit from alignment with EDQM-based dosage form descriptions.
Demonstrate a method for alignment of RxNorm dosage forms with EDQM terminologies and with EDQM dosage forms. Describe obstacles and advantages of such an alignment for ultimate application in calculating universal Pharmaceutical Product Identifiers.
A pharmaceutical sciences student and a clinical pharmacology expert in dosage forms used definitions supplied by RxNorm and EDQM technical documentation to align the 120 RxNorm dose forms to EDQM-based dosage form description terms. The alignment of RxNorm to EDQM was then used to fit the RxNorm dose forms into an ontology based on EDQM.
The alignment of RxNorm and EDQM requires further validation but provides a potential method of establishing interoperability between the two terminologies without cumbersome manual reclassification. There remain ambiguities within each dosage form nomenclature that create obstacles to integrating medication databases rooted in EDQM and RxNorm into a single worldwide database.
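The alignment can be pictured as a lookup from each RxNorm dose form to a tuple of EDQM descriptor characteristics; the axis codes follow EDQM's published characteristics, but the descriptor values below are illustrative, not the study's validated mapping:

```python
# Hedged sketch of the alignment idea: each RxNorm dose form maps to EDQM
# descriptor characteristics. Values are illustrative, not the study's mapping.

EDQM_AXES = ("BDF", "RCA", "ISI", "TRA", "AME")  # basic dose form, release
# characteristics, intended site, transformation, administration method

ALIGNMENT = {
    "Extended Release Oral Tablet": {
        "BDF": "Tablet", "RCA": "Prolonged-release", "ISI": "Oral",
        "TRA": "No transformation", "AME": "Swallowing",
    },
    "Oral Solution": {
        "BDF": "Solution", "RCA": "Conventional", "ISI": "Oral",
        "TRA": "No transformation", "AME": "Swallowing",
    },
}

def equivalent(rxnorm_a: str, rxnorm_b: str) -> bool:
    """Two RxNorm forms are pharmaceutically comparable if all EDQM axes match."""
    a, b = ALIGNMENT[rxnorm_a], ALIGNMENT[rxnorm_b]
    return all(a[axis] == b[axis] for axis in EDQM_AXES)

assert not equivalent("Extended Release Oral Tablet", "Oral Solution")
```

The granularity obstacle noted above shows up directly in such a table: several distinct RxNorm dose forms can collapse onto the same EDQM descriptor tuple, so the mapping is not one-to-one.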
Toward a Model for Personal Health Record Interoperability
Roehrs, Alex; da Costa, Cristiano Andre; da Rosa Righi, Rodrigo ...
IEEE Journal of Biomedical and Health Informatics, March 2019, Volume 23, Issue 2
Journal article, peer-reviewed
Health information technology, applied to the electronic health record (EHR), has evolved with the adoption of standards for defining patient health records. However, there are many standards for defining such data, hindering communication between different healthcare providers. Even with adopted standards, patients often need to repeatedly provide their health information when they are cared for at different locations. This problem hinders the adoption of the personal health record (PHR), which places patients' health records under their own control. Therefore, the purpose of this paper is to propose an interoperability model for PHR use. The methodology consisted of prototyping an application model named OmniPHR to evaluate the structuring of semantic interoperability and the integration of different health standards, using a real database of anonymized patients. We evaluated health data from a hospital database with 38,645 adult patients' medical records, processed using different standards represented by the openEHR, HL7 FHIR, and MIMIC-III reference models. OmniPHR demonstrated the feasibility of providing interoperability through a standard ontology and artificial intelligence with natural language processing (NLP). Although the first executions reached a 76.39% F1-score and required retraining of the machine-learning process, the final score was 87.9%, presenting a way to obtain the original data from different standards in a single format. Unlike other models, OmniPHR presents a unified, structurally semantic, and up-to-date view of the PHR for patients and healthcare providers. The results were promising and demonstrated the possibility of supporting the creation of inference rules about possible patient health problems or preventing future problems.
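The mapping step can be caricatured as classifying free-text entries from heterogeneous standards onto target concepts and scoring the result with an F1 metric; OmniPHR uses trained NLP models, so this rule-based stand-in and its sample records are our own illustration:

```python
# Toy sketch of concept mapping plus F1 scoring. OmniPHR trains NLP models;
# this keyword-based classifier and its sample records are hypothetical.

def map_concept(text: str) -> str:
    t = text.lower()
    if "pressure" in t or "bp" in t:
        return "blood_pressure"
    if "glucose" in t:
        return "glucose"
    return "unknown"

def f1(predictions, gold, label):
    tp = sum(p == g == label for p, g in zip(predictions, gold))
    fp = sum(p == label != g for p, g in zip(predictions, gold))
    fn = sum(g == label != p for p, g in zip(predictions, gold))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

texts = ["Systolic BP 120", "Blood glucose 5.1 mmol/L", "Heart rate 70"]
gold = ["blood_pressure", "glucose", "heart_rate"]
preds = [map_concept(t) for t in texts]
assert f1(preds, gold, "blood_pressure") == 1.0
```

In the real system the classifier is retrained when the F1-score is unsatisfactory, which is how the reported score rose from 76.39% to 87.9%.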
• Using metrics for interoperability assessment is applicable and useful.
• Extra metrics are recommended for addition to the framework; for example, human resources and change management are also important for interoperability evaluation.
• Technology alone cannot resolve business-process interoperability problems.
• Human resources are a critical component of interoperability.
In the current socio-economic environment, to face challenges such as the emergence of new technologies, globalisation, and increasing demands from their clients, it is inevitable that enterprises will collaborate with others and progressively shift their boundaries. In this context, interoperability has become a prerequisite in the jigsaw of such collaboration. By definition, it is entities' ability to work together as an organisation. This ability spans a wide range of aspects, embracing both technical and business issues. Over the past decade, both the concept and the context of interoperability have been extended from a largely IT-focused domain to a business-focused domain, and the evaluation of interoperability has become a rising concern. An increasing number of studies have concentrated not just on digital aspects but on business aspects of human behaviour in the social environment. In general, the wider application domain is the assessment of the interoperability of information systems and processes in any organisation (especially medium and large) that needs multiple processes to interact effectively.
To address such concerns and pave the way to more effective collaborative goals in business, the concept of interoperability has been adopted to measure the efficiency and productivity of information systems' integration. More than twenty approaches have so far been adopted to evaluate this interoperability; however, most are unable to assess it at the higher levels, such as the pragmatic, process, and social levels. Hence, we have conducted a three-phase study. Phase 1 reviewed existing interoperability evaluation approaches. To prove the concept, phase 2 proposed the concept of semiotic interoperability and its application to healthcare information systems. This article reports on the third phase of the study: a proposed framework with a group of metrics to measure interoperability from a new perspective, a semiotics perspective. The framework is named the Semiotic Interoperability Evaluation Framework (SIEF) and has the ability to analyse, measure, and assess interoperability among business processes. The metrics derive from a feasibility study investigating several interoperability barriers at a hospital. Next, the SIEF was applied in a case study and a detailed interoperability evaluation was conducted.
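An aggregation of per-level metrics of the kind such a framework proposes might look like the following; the level names echo the semiotic ladder that semiotics-based evaluation draws on, but every metric score and weight below is invented for illustration:

```python
# Hedged sketch of metric aggregation across semiotic levels. The level names
# follow the semiotic ladder; all scores and weights are illustrative.

LEVELS = ("physical", "empiric", "syntactic", "semantic", "pragmatic", "social")

def interoperability_score(scores: dict, weights: dict) -> float:
    """Weighted mean of per-level interoperability scores, each in [0, 1]."""
    total = sum(weights[level] for level in LEVELS)
    return sum(scores[level] * weights[level] for level in LEVELS) / total

# A hospital-like profile: strong technical layers, weak pragmatic/social layers.
scores = {"physical": 0.9, "empiric": 0.9, "syntactic": 0.8,
          "semantic": 0.6, "pragmatic": 0.4, "social": 0.3}
weights = dict.fromkeys(LEVELS, 1.0)  # equal weights for the sketch
assert abs(interoperability_score(scores, weights) - 0.65) < 1e-6
```

A profile like this makes the article's point concrete: an organisation can score well on purely technical layers while the higher pragmatic and social levels, which most existing approaches cannot assess, drag overall interoperability down.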