...of the 21st Century Cures Act (CURES), a final rule has been issued relating to EHR information blocking, interoperability, and the ONC certification program. The penalty has been a mere $156,750 fine for "loss of functionality."2 With the unforeseen delay, the remaining 170 hospitals in the VA system will not go live until 2028.3 The VA EHR modernization executive director announced her resignation this week, and an interim director has been appointed effective February 25, 2023. Additionally, a bill, HR 592, has been introduced in the US House of Representatives that would in effect prevent the VA from implementing the Oracle Cerner EHR platform at any other VA facility until senior clinician approval is met.4 This represents only one hospital, but it mirrors what has happened at many others, and with all vendors, not just the VA's: "First VA Medical Center (finally) Goes Live on Cerner EHR as Part of $16B Project."
The adoption of digital signatures is becoming increasingly popular among Malaysian users due to their many advantages, including increased security, convenience, and cost savings. However, one of the challenges that users face is the lack of cross-Certification Authority (CA) interoperability, which hinders the ability to use digital signatures across different platforms and services. To address this challenge, there is a growing need to promote cross-CA interoperability in Malaysia, which would enable users to use digital signatures seamlessly across various platforms and services. This paper aims to identify CA capacity and digital signature market demand in promoting cross-CA interoperability. This is achieved through qualitative interviews with CAs operating in Malaysia to gather views on interoperability across their platforms and on the value and implications of such a practice, and to establish the potential relationship between interoperability and increased digital signature efficiency and market demand. The interview data are analyzed using Atlas.ti and meta-analysis. Based on the results, the adoption of digital signatures and the promotion of cross-CA interoperability are critical for advancing Malaysia's digital economy and enhancing the country's overall competitiveness. With the right infrastructure and policies in place, Malaysia can become a leader in the use of digital signatures and in cross-CA interoperability, benefiting individuals and businesses alike.
•Presents a systematic literature review of Interoperability Assessment Approaches.
•Elucidates different interoperability frameworks, assessment approaches, and measurement techniques and mechanisms.
•Follows a scientific approach for selecting relevant Interoperability Assessment Approaches.
•Includes a comparative analysis of the selected Approaches.
•Discusses the limitations of the main Interoperability Assessment Approaches.
The development of interoperability is a necessity for organisations to achieve business goals and capture new market opportunities. Indeed, interoperability allows enterprises to exchange information and use it to pursue their shared goals. Therefore, it should be verified and continuously improved. This is the main objective of Interoperability Assessment (INAS): such an assessment aims at determining the strengths and weaknesses of an enterprise in terms of interoperability. Many surveys and reviews have been proposed in the literature to analyse the existing INAS approaches. However, the majority of these reviews focus on specific properties rather than a general view of an INAS. Therefore, this paper proposes a systematic literature review of INAS approaches. The objectives are to identify the relevant INAS approaches and to compare them from a holistic view of their shared and differing properties (e.g. the type of assessment, the measurement mechanism used, and the interoperability barriers addressed). A bibliometric analysis of the selected INAS approaches is also conducted, with a discussion of their advantages and limitations.
Vast databases of billions of contact-based fingerprints have been developed to protect national borders and support e-governance programs. Emerging contactless fingerprint sensors offer better hygiene, security, and accuracy. However, the adoption and success of such contactless fingerprint technologies largely depend on the capability to match contactless 2D fingerprints against legacy contact-based fingerprint databases. This paper investigates this problem and develops a new approach to accurately match such fingerprint images. A robust thin-plate spline (RTPS) is developed to more accurately model elastic fingerprint deformations using splines. To correct such deformations on the contact-based fingerprints, an RTPS-based generalized fingerprint deformation correction model (DCM) is proposed. The use of the DCM results in accurate alignment of the key minutiae features observed in the contactless and contact-based fingerprints. Further improvement in cross-matching performance is investigated by incorporating minutiae-related ridges. We also develop a new database of 1800 contactless 2D fingerprints and the corresponding contact-based fingerprints acquired from 300 clients, which is made publicly accessible for further research. The experimental results presented in this paper, using two publicly available databases, validate our approach and achieve superior results for matching contactless 2D and contact-based fingerprint images.
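The RTPS builds on the classical thin-plate spline interpolant. As a rough, self-contained sketch in plain Python (this is the standard TPS, not the robust variant or the paper's full DCM, and the control points below are invented), fitting a spline that carries one set of minutiae coordinates onto another might look like:

```python
import math

def _u(r2):
    # TPS radial basis: U(r) = r^2 * log(r^2), with U(0) defined as 0
    return r2 * math.log(r2) if r2 > 0 else 0.0

def _solve(A, B):
    """Gauss-Jordan elimination with partial pivoting; B has multiple columns."""
    n = len(A)
    M = [row[:] + Brow[:] for row, Brow in zip(A, B)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c]
                M[r] = [v - f * w for v, w in zip(M[r], M[c])]
    return [row[n:] for row in M]

def fit_tps(src, dst):
    """Fit a 2-D thin-plate spline carrying src control points onto dst."""
    n = len(src)
    L = [[0.0] * (n + 3) for _ in range(n + 3)]
    for i, (xi, yi) in enumerate(src):
        for j, (xj, yj) in enumerate(src):
            L[i][j] = _u((xi - xj) ** 2 + (yi - yj) ** 2)
        L[i][n:] = [1.0, xi, yi]                     # affine columns
        L[n][i], L[n + 1][i], L[n + 2][i] = 1.0, xi, yi  # side conditions
    Y = [[x, y] for x, y in dst] + [[0.0, 0.0]] * 3
    return _solve(L, Y)                              # n spline weights + 3 affine rows

def warp(params, src, pt):
    """Map one point (e.g. a minutia) through the fitted spline."""
    n = len(src)
    out = []
    for d in range(2):
        v = params[n][d] + params[n + 1][d] * pt[0] + params[n + 2][d] * pt[1]
        v += sum(params[i][d] * _u((pt[0] - sx) ** 2 + (pt[1] - sy) ** 2)
                 for i, (sx, sy) in enumerate(src))
        out.append(v)
    return out
```

In the paper's pipeline the control points would come from matched minutiae pairs; the robust variant refines this basic interpolant in ways not detailed in the abstract.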
Summary
Vendor lock-in is a prominent issue in cloud computing. It is caused by cloud providers offering proprietary services, which hinders cloud interoperability. Client-centric interoperability enables the migration of data and applications across clouds; it gives clients control over their workloads and a wider range of service choices. Provider-centric interoperability, in contrast, allows providers to collaborate: providers with spare resources can lend them to providers that lack computational or storage capabilities, overcoming the limitations of their local resources. In this article, we conduct a survey to differentiate between client- and provider-centric interoperability solutions. We aim to provide an up-to-date analysis of the current tendencies and the neglected areas of the cloud interoperability field. To that end, we study the evolution of cloud service interoperability through the years, propose definitions for intra-cloud and inter-cloud interoperability, and propose a taxonomy that classifies cloud interoperability approaches into client-centric and provider-centric categories. For each category, we classify the approaches based on their interoperability environment into single-cloud or interconnected-cloud settings. Finally, we analyze and compare the approaches based on multiple criteria. The study reveals a focus on client-centric solutions and on interoperability in interconnected clouds. We notice more interest in data- and application-level interoperability, mainly in the infrastructure-as-a-service model. We also find that client-centric solutions are mostly semantic technologies and brokers, while provider-centric solutions are middleware, protocols, and standards. We conclude that a generic cloud service interoperability model is needed.
The shop floor, or factory floor, is the area inside a factory where manufacturing production is executed. The digitalisation of this area has been increasing in the last few years, introducing the Digital Twin (DT) and Industry 4.0 concepts. A DT is the digital representation of a real object or an entire system. A DT includes a high diversity of components from different vendors that need to interact with each other efficiently. In most cases, the development of standards and protocols does not consider the need to operate with other standards and protocols, causing interoperability issues. Transducers (sensors and actuators) use the communication layer to exchange information with their digital counterparts, and for this reason the communication layer is one of the most relevant aspects of development. This paper covers DT development from the physical layer to the visualisation layer. The reference architecture models, standards, and protocols are examined with a focus on interoperability, reaching a syntactic level of communication between the IEEE 1451 and IEC 61499 standards. A semantic communication layer connects transducer devices to the digital representation, achieving a semantic level of interoperability. This layer adds semantics to the communication process, allowing the development of an interoperable DT based on the IEEE 1451 standards. The DT presented reaches the syntactic and semantic levels of interoperability, allowing the monitoring and visualisation of a prototype system.
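The jump from syntactic to semantic interoperability can be illustrated with a deliberately tiny sketch: a raw transducer reading is syntactically parseable but meaningless until a semantic layer maps it to a quantity, unit, and asset. All names below are invented for illustration; this is not the IEEE 1451 TEDS format or the paper's data model.

```python
import json

# Hypothetical raw reading as it might arrive over the communication layer:
# syntactically valid, but semantically opaque to the DT.
raw = {"channel": 3, "value": 21.7}

# Semantic layer: a registry mapping channel ids to meaning, units, and the
# observed asset, so the DT can interpret what the number refers to.
SEMANTICS = {3: {"quantity": "temperature", "unit": "degC", "asset": "spindle-motor"}}

def annotate(msg):
    """Attach semantic metadata to a raw transducer message."""
    return {**SEMANTICS[msg["channel"]], "value": msg["value"]}

payload = json.dumps(annotate(raw))  # annotated message the DT layer consumes
```

The design point is that the registry, not the transducer, carries the meaning, so heterogeneous devices can keep their native message formats.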
Introduction
The aim of the current research was to identify the factors of interoperability of academic information systems in Islamic Azad University. This research is characterized by its applied and exploratory nature, aiming to achieve specific goals within the given context. To accomplish these objectives effectively, a mixed-methods approach combining qualitative and quantitative methodologies has been employed.
Literature Review
Information systems and information technologies have become an integral part of organizational processes, systems, and culture, and information technology is a required asset and resource for creating competitive advantage (NooshinFard et al, 1400). In fact, information systems and information and communication technology are processes embedded in business procedures in order to absorb and use knowledge with the aim of improving organizational performance (Lufman and Lewis, 2016). Factors such as the variety of common information systems in Iranian universities, the use and gradual evolution of these systems, the special characteristics of the academic operating environment, and the need to exchange data between different systems and to integrate information require that they be investigated from various dimensions, including interoperability, which was the focus of the present study (Omidian, 1401). In the age of the knowledge explosion, the development of information technology is an essential requirement for the efficiency of the educational system, and the requirement for the effectiveness of new technologies is a transformation in the teaching-learning culture (Manzhuk and Eram, 2015).
For educational institutions, as for other modern institutions, the use of information technology does not only mean supporting management; rather, it is an empowering element that helps to promote and improve the decision-making process at different levels of university management (Indrajit and Jokopranuto, 2006).
Methodology
The qualitative part, based on the Delphi method, allows for the extraction of the essential components and indicators of the research subject and facilitates a deep and comprehensive understanding of the underlying factors. Quantitative methods, on the other hand, have been employed to validate the measurement models and examine the conceptual model: through statistical tests, the researcher can assess the reliability and validity of the measurements used and gain insight into the relationships between the variables outlined in the conceptual model. By employing a mixed-methods approach, this research harnesses the strengths of both methodologies, providing a more robust and well-rounded understanding of the research subject and enhancing the validity and reliability of the findings. First, using the meta-synthesis method, study resources including books, articles, and internet resources were examined in a structured manner in a seven-step process, and interoperability indicators were identified in this way. The indicators and components obtained from the meta-synthesis were then organized into a structured questionnaire.
The questionnaire was submitted to the experts using the Delphi method (qualitative approach) in order to explore their opinions at this stage. Then, based on the data collected in the qualitative stage and on the experts' opinions gathered over three rounds, a final questionnaire was compiled. It was measured and evaluated using the analytical survey method (quantitative approach) and structural equation modeling, a quantitative method, to examine the correspondence of the theoretical model with the real (empirical) data obtained by sampling from the population. The research method in the quantitative part is descriptive-survey. The statistical population of the research in the meta-synthesis section includes printed and online sources and documents (such as the content of websites, databases such as Civilica, articles, and scientific reports of specialized seminars and conferences) consisting of 100 sources in the form of books, articles, and scientific reports. To collect qualitative and quantitative data, a researcher-made questionnaire (50 items) was used, whose items were taken from the results of the meta-synthesis in the first stage and scored on a Likert scale from 1 (very little) to 5 (very much). To validate the meta-synthesis stage, the researcher returned to the previous steps to ensure that quality was maintained. To control quality, the articles were reviewed in several stages against the study parameters, all articles and scientific reports were categorized, and sources that could not be trusted in terms of the accuracy, validity, and importance of their findings were removed. In the end, 24 sources were selected and used: 12 internal sources (for the years 2005 to 1400) and 12 external sources (for the years 2002 to 2022).
In the quantitative section, content validity was used from the beginning to assess the validity of the questionnaire, which proved good, and Cronbach's alpha coefficient was used to measure its reliability, which was 0.91, an acceptable coefficient. After studying the details and features of the documents, such as abstracts and content, and based on the goals of the study, 24 sources (12 foreign and 12 Persian) were selected, and through them the dimensions, components, and interoperability indicators used in the study were derived. The statistical population in the qualitative part of the research consists of key informants and experts in the field of information systems and senior managers of information technology in Iranian universities; at this stage, 25 experts in the field of the research topic were selected. The statistical population in the quantitative section is made up of managers and employees of the information technology and information systems department at Islamic Azad University. A sample of 151 people was selected by simple random sampling, with the size determined by Cochran's formula. To collect qualitative data a library method was used, and for quantitative data a researcher-made questionnaire (50 items) was used, whose items were taken from the results of the meta-synthesis in the first stage.
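The sampling and reliability figures follow standard formulas (Cochran's sample size, Cronbach's alpha, and the univariate t-test used in the analysis). A minimal sketch in plain Python, noting that the abstract does not state the population size behind the 151-person sample, so the value in the example comment is only an assumption:

```python
import math

def cochran_n(N, z=1.96, p=0.5, e=0.05):
    """Cochran's sample-size formula with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / e ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / N))

def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per questionnaire item)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(c) for c in items) / var(totals))

def one_sample_t(xs, mu0=3.0):
    """Univariate (one-sample) t statistic against the Likert midpoint of 3."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return (mean - mu0) / (s / math.sqrt(n))

# e.g. an assumed population of 250 IT staff yields cochran_n(250) == 152,
# close to the reported sample of 151; the true population size is not given.
```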
To analyze the data, the Sandelowski and Barroso method was used in the meta-synthesis part, and exploratory factor analysis, descriptive analysis, and univariate t-tests (using SPSS and LISREL software) were used in the quantitative part.
Results
The results showed that the technical interoperability indicators of information systems are: the ability to interact and exchange data with other information systems, the possibility of connecting to and using decision support systems, storing information in a standard format, central security, central monitoring, integrated processing, easy communication with other systems, and usability through distance education. Overall, the results of this research show that the architecture and structure of university information systems should provide the integrity and comprehensiveness of processes and information at the level of the organization and support a smooth flow of information between its different departments. The use of interoperable information systems that can cover all technical, process, and semantic interoperability indicators, and all activities and tasks in an organization, and provide necessary information to users in a timely manner, is one of the vital tools in today's organizations. Without systems that have these characteristics (technical, semantic, and process), it is impossible to increase the capabilities of the organization, improve performance, make better decisions, and achieve an interactive, integrated, and competitive advantage. The process interoperability indicators are: a notification mechanism for service presentation and updates, a change and flexibility mechanism for service updates, a dynamic and flexible organization, change-oriented performance management, effectiveness measurement and feedback, an expandable architecture that can grow with new requirements, and a service-oriented architecture.
The results also showed that the semantic interoperability indicators are: the ability to code educational signs, the ability to interact with various systems, the use of standard terms and codes, an XML translation service, a mapping service, provision of a common message format for communication among the different systems, a content-based router, and attention to how users and systems understand the vocabulary in use.
Conclusion
Paying attention to the optimal use of information systems can help the university succeed in achieving organizational goals and attaining high effectiveness and efficiency. Therefore, management should always consider the characteristics of these valuable and transformative resources and, with sufficient knowledge and the participation of specialized employees in supplying, using, and deploying them, improve the performance of its strategies. In order to reach more favorable levels in this field, the university should
•The interoperability models either use complex metrics or separate levels.
•The interoperability models concentrate on selective aspects of interoperability.
•The interoperability models focus on structure and content but not solutions.
•Visualization and visual analytics have a potential to assess interoperability.
Cyber-physical systems (CPS) are developed through the cooperation of several engineering disciplines. Powerful software tools are utilized by each individual discipline, but it remains challenging to connect these into tool chains for increased efficiency. To support this endeavour, the literature on interoperability assessment was surveyed to identify concepts valuable to transfer from the interoperability to the tool integration research field.
Implementation options, types of interoperability, and the domains described in interoperability assessment models were identified as directly transferable concepts. To avoid the problems with uptake that plague the models identified, visual analytics is suggested as a vehicle for the transfer. Furthermore, based on the use of non-functional properties as an underlying motivation for these models, cost, performance, and sustainability are suggested as a common base for future research in both discourses.
In this paper, the concept of C4ISR systems interoperability is analyzed and the difference from traditional system interoperability is presented. The factors influencing C4ISR system interoperability are analyzed from the overall framework of the US military Global Information Grid. Six attributes are selected: structure, application, facility, security, operation and maintenance, and data; the selection process for these evaluation attributes is described. Based on the enhanced interoperability maturity model, and combined with the development trend of the C4ISR system technology system, the C4ISR system level-attribute model is given. A grade of maturity and an evaluation index system for C4ISR systems interoperability are built, and the index-level reference model is designed. Combining qualitative with quantitative assessment, an index synthesis criterion based on the mapping model and a corresponding interoperability level evaluation method for C4ISR systems are proposed, providing a concrete method model for measuring the interoperability level of a C4ISR system.
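The abstract does not spell out the synthesis criterion, so the following is only a hypothetical illustration of the general shape of such an index-synthesis step: per-attribute scores are aggregated into a single index and mapped to a maturity level. The weights, thresholds, and level names (the latter borrowed from the common LISI maturity vocabulary) are invented for the sketch.

```python
# Illustrative weights over the paper's six attributes (values are invented).
WEIGHTS = {"structure": 0.20, "application": 0.20, "facility": 0.15,
           "security": 0.15, "operation_maintenance": 0.15, "data": 0.15}

# Illustrative thresholds mapping the synthesized index to a maturity level,
# ordered from highest cut-off to lowest.
LEVELS = [(0.8, "enterprise"), (0.6, "domain"), (0.4, "functional"),
          (0.2, "connected"), (0.0, "isolated")]

def synthesize(scores):
    """Aggregate per-attribute scores in [0, 1] into a maturity level."""
    index = sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)
    return next(name for cut, name in LEVELS if index >= cut)
```

A real criterion of this kind would derive the weights and cut-offs from the paper's evaluation index system and combine qualitative grades with the quantitative scores before the mapping.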