...of the 21st Century Cures Act (CURES), a final rule has been issued relating to EHR information blocking, interoperability, and the ONC certification program. The penalty has been a mere $156,750 fine for "loss of functionality".[2] With the unforeseen delay, the remainder of the 170 hospitals in the VA system won't go live until 2028.[3] The VA EHR modernization executive director announced her resignation this week, and an interim director has been appointed effective February 25, 2023. Additionally, a bill, HR 592, has been introduced in the US House of Representatives that would in effect prevent the VA from implementing the Oracle Cerner EHR platform at any other VA facilities until senior-clinician approval is obtained.[4] This represents only one hospital, but it mirrors what has happened at many others, and with all vendors, not just the VA's. First VA Medical Center (finally) Goes Live on Cerner EHR as Part of $16B Project.
• Presents a systematic literature review of Interoperability Assessment Approaches.
• Elucidates different interoperability frameworks, assessment approaches, and measurement techniques and mechanisms.
• Follows a scientific approach for selecting relevant Interoperability Assessment Approaches.
• Includes a comparative analysis of the selected Approaches.
• Discusses the limitations of the main Interoperability Assessment Approaches.
The development of interoperability is a necessity for organisations to achieve business goals and capture new market opportunities. Indeed, interoperability allows enterprises to exchange information and use it to pursue their shared goals. Therefore, it should be verified and continuously improved. This is the main objective of Interoperability Assessment (INAS): such an assessment aims at determining the strengths and weaknesses of an enterprise in terms of interoperability. Many surveys and reviews have been proposed in the literature to analyse existing INAS approaches. However, the majority of these reviews focus on specific properties rather than offering a general view of INAS. Therefore, this paper proposes a systematic literature review of INAS approaches. The objectives are to identify the relevant INAS approaches and to compare them from a holistic view based on their similar and differing properties (e.g. the type of assessment, the measurement mechanism used, and the interoperability barriers addressed). A bibliometric analysis of the selected INAS approaches is also conducted, with a discussion of their advantages and limitations.
Encrypted databases are an emerging and promising technology able to run SQL operations on encrypted data. However, most existing encrypted databases have no data interoperability, i.e., the output of one operator (e.g., addition) cannot be taken as input to another (e.g., comparison). As a result, these encrypted databases can only support simple queries like addition, multiplication, and comparison, but are unable to support a composition of these simple queries (e.g., SELECT user_id FROM salary WHERE V1 + V2 > 5000). In SIGMOD '14, Wong et al. proposed SDB, which to the best of our knowledge is the only encrypted database that achieves data interoperability. Unfortunately, it has recently been broken (VLDB '21). In this paper, we propose a novel encrypted database named SDB+. It achieves data interoperability based on a suite of sophisticated designs. We formally prove that SDB+ achieves indistinguishability under chosen query attacks (IND-CQA). We provide a full-fledged implementation and run it on three benchmarks. Our experimental results show that SDB+ achieves efficiency comparable to SDB, even though the latter is insecure.
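The data-interoperability property can be illustrated with a deliberately insecure toy scheme: an affine encoding under which the output of ciphertext addition is itself a valid ciphertext that a comparison operator can consume. This is only a sketch of why composability matters for queries like the one above; it is not SDB+'s actual construction, and the key values are arbitrary.

```python
# Toy sketch of "data interoperability": the ciphertext produced by
# one operator (addition) can be fed directly into another (comparison).
# E(x) = A*x + B with secret A > 0 -- illustrative only, NOT secure,
# and NOT the SDB/SDB+ scheme.

A, B = 7, 13                # secret key; A > 0 preserves order

def enc(x):
    return A * x + B        # encrypt a plaintext integer

def enc_add(c1, c2):
    # a*x+b + a*y+b = a*(x+y) + 2b: the sum is a valid ciphertext
    # under the derived key (A, 2B), so other operators can use it.
    return c1 + c2

def enc_gt(c, threshold, b):
    # Compare a ciphertext against a plaintext threshold; since A > 0,
    # the encoding preserves order.
    return c > A * threshold + b

# "SELECT user_id FROM salary WHERE V1 + V2 > 5000", evaluated
# without decrypting V1 or V2:
rows = [(1, 3000, 2500), (2, 1000, 1500)]   # (user_id, V1, V2)
hits = [uid for uid, v1, v2 in rows
        if enc_gt(enc_add(enc(v1), enc(v2)), 5000, b=2 * B)]
print(hits)   # [1] -- user 1: 5500 > 5000; user 2: 2500 <= 5000
```

Without this closure property, the addition operator's output would live in a different "ciphertext space" than the comparison operator's input, which is exactly the limitation the abstract describes.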
Vast databases of billions of contact-based fingerprints have been developed to protect national borders and support e-governance programs. Emerging contactless fingerprint sensors offer better hygiene, security, and accuracy. However, the adoption and success of such contactless fingerprint technologies largely depend on the advanced capability to match contactless 2D fingerprints against legacy contact-based fingerprint databases. This paper investigates this problem and develops a new approach to accurately match such fingerprint images. A robust thin-plate spline (RTPS) is developed to more accurately model elastic fingerprint deformations using splines. In order to correct such deformations on the contact-based fingerprints, an RTPS-based generalized fingerprint deformation correction model (DCM) is proposed. The use of the DCM results in accurate alignment of key minutiae features observed in the contactless and contact-based fingerprints. Further improvement in cross-matching performance is achieved by incorporating minutiae-related ridges. We also develop a new database of 1800 contactless 2D fingerprints and the corresponding contact-based fingerprints acquired from 300 clients, which is made publicly accessible for further research. The experimental results presented in this paper, using two publicly available databases, validate our approach and achieve superior results for matching contactless 2D and contact-based fingerprint images.
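The deformation-correction idea can be illustrated with a plain thin-plate spline fit in NumPy: given corresponding control points (e.g. minutiae) in the contact-based and contactless frames, the spline interpolates a smooth 2D warp. The minutiae coordinates and the toy distortion below are hypothetical, and this is standard TPS, not the paper's robust RTPS variant.

```python
import numpy as np

def tps_kernel(r):
    # Thin-plate spline radial basis U(r) = r^2 * log(r), with U(0) = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        u = r**2 * np.log(r)
    return np.nan_to_num(u)

def fit_tps(src, dst):
    # Fit a 2-D TPS mapping src control points onto dst control points
    # by solving the standard bordered linear system [[K, P], [P.T, 0]].
    n = len(src)
    r = np.linalg.norm(src[:, None, :] - src[None, :, :], axis=-1)
    P = np.hstack([np.ones((n, 1)), src])          # affine part
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = tps_kernel(r), P, P.T
    rhs = np.vstack([dst, np.zeros((3, 2))])
    return np.linalg.solve(L, rhs)                 # RBF weights + affine

def apply_tps(params, src, pts):
    # Warp arbitrary points with the fitted spline.
    r = np.linalg.norm(pts[:, None, :] - src[None, :, :], axis=-1)
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return tps_kernel(r) @ params[:len(src)] + P @ params[len(src):]

# Hypothetical minutiae: contact-based points under a toy elastic
# distortion; the TPS recovers the mapping to the contactless frame.
contact = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [.5, .5]])
contactless = contact + 0.1 * np.sin(contact[:, ::-1])
params = fit_tps(contact, contactless)
mapped = apply_tps(params, contact, contact)
print(np.allclose(mapped, contactless, atol=1e-6))  # True: exact at control pts
```

A deformation correction model in the spirit of the paper would fit such a warp on training pairs and then pre-warp contact-based prints before minutiae matching.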
The shop floor or factory floor is the area inside a factory where manufacturing production is executed. The digitalisation of this area has increased in the last few years, introducing the Digital Twin (DT) and Industry 4.0 concepts. A DT is the digital representation of a real object or an entire system. A DT includes a high diversity of components from different vendors that need to interact with each other efficiently. In most cases, the development of standards and protocols does not consider the need to operate with other standards and protocols, causing interoperability issues. Transducers (sensors and actuators) use the communication layer to exchange information with their digital counterparts, and for this reason, the communication layer is one of the most relevant aspects of development. This paper covers DT development, going from the physical to the visualisation layer. The reference architecture models, standards, and protocols focus on interoperability to reach a syntactic level of communication between the IEEE 1451 and IEC 61499 standards. A semantic communication layer connects transducer devices to the digital representation, achieving a semantic level of interoperability. This communication layer adds semantics to the communication process, allowing the development of an interoperable DT based on the IEEE 1451 standards. The DT presented reaches the syntactic and semantic levels of interoperability, allowing the monitoring and visualisation of a prototype system.
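The semantic-layer idea can be sketched as wrapping a raw transducer reading with explicit semantics (quantity, unit, device identity) so that the digital twin can interpret it without out-of-band agreements. The field names and channel table below are illustrative, in the spirit of an IEEE 1451 transducer electronic data sheet (TEDS), not the standard's actual encoding.

```python
import json

# Hypothetical per-channel semantic metadata, playing the role a TEDS
# plays in IEEE 1451: it travels with (or is resolvable from) the data.
TEDS = {
    "temp1": {"quantity": "temperature", "unit": "degC",
              "device": "sensor-01"},
}

def annotate(raw_value, channel):
    # Syntactic level: a shared message format (JSON here).
    # Semantic level: the value is tagged with what it means.
    return json.dumps({"value": raw_value, **TEDS[channel]})

msg = annotate(21.5, "temp1")
print(msg)
```

With only syntactic interoperability, the twin would receive `21.5` and a channel name; the semantic annotation is what lets heterogeneous components agree on interpretation.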
ABSTRACT
Much biodiversity data is collected worldwide, but it remains challenging to assemble the scattered knowledge for assessing biodiversity status and trends. The concept of Essential Biodiversity Variables (EBVs) was introduced to structure biodiversity monitoring globally, and to harmonize and standardize biodiversity data from disparate sources to capture a minimum set of critical variables required to study, report and manage biodiversity change. Here, we assess the challenges of a ‘Big Data’ approach to building global EBV data products across taxa and spatiotemporal scales, focusing on species distribution and abundance. The majority of currently available data on species distributions derives from incidentally reported observations or from surveys where presence‐only or presence–absence data are sampled repeatedly with standardized protocols. Most abundance data come from opportunistic population counts or from population time series using standardized protocols (e.g. repeated surveys of the same population from single or multiple sites). Enormous complexity exists in integrating these heterogeneous, multi‐source data sets across space, time, taxa and different sampling methods. Integration of such data into global EBV data products requires correcting biases introduced by imperfect detection and varying sampling effort, dealing with different spatial resolution and extents, harmonizing measurement units from different data sources or sampling methods, applying statistical tools and models for spatial inter‐ or extrapolation, and quantifying sources of uncertainty and errors in data and models. To support the development of EBVs by the Group on Earth Observations Biodiversity Observation Network (GEO BON), we identify 11 key workflow steps that will operationalize the process of building EBV data products within and across research infrastructures worldwide.
These workflow steps take multiple sequential activities into account, including identification and aggregation of various raw data sources, data quality control, taxonomic name matching and statistical modelling of integrated data. We illustrate these steps with concrete examples from existing citizen science and professional monitoring projects, including eBird, the Tropical Ecology Assessment and Monitoring network, the Living Planet Index and the Baltic Sea zooplankton monitoring. The identified workflow steps are applicable to both terrestrial and aquatic systems and a broad range of spatial, temporal and taxonomic scales. They depend on clear, findable and accessible metadata, and we provide an overview of current data and metadata standards. Several challenges remain to be solved for building global EBV data products: (i) developing tools and models for combining heterogeneous, multi‐source data sets and filling data gaps in geographic, temporal and taxonomic coverage, (ii) integrating emerging methods and technologies for data collection such as citizen science, sensor networks, DNA‐based techniques and satellite remote sensing, (iii) solving major technical issues related to data product structure, data storage, execution of workflows and the production process/cycle as well as approaching technical interoperability among research infrastructures, (iv) allowing semantic interoperability by developing and adopting standards and tools for capturing consistent data and metadata, and (v) ensuring legal interoperability by endorsing open data or data that are free from restrictions on use, modification and sharing. Addressing these challenges is critical for biodiversity research and for assessing progress towards conservation policy targets and sustainable development goals.
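A few of the workflow steps above (aggregation of raw records, data quality control, and taxonomic name matching) can be sketched as a minimal pipeline. The records, the synonym table, and the 1-degree grid are hypothetical.

```python
# Minimal sketch of three EBV workflow steps: QC, taxonomic name
# matching, and aggregation onto a coarse spatial grid.

SYNONYMS = {"Parus caeruleus": "Cyanistes caeruleus"}  # old -> accepted name

def normalize_name(name):
    # Taxonomic name matching against an accepted-name backbone.
    return SYNONYMS.get(name, name)

def passes_qc(rec):
    # Quality control: drop impossible coordinates and negative counts.
    return (-90 <= rec["lat"] <= 90 and -180 <= rec["lon"] <= 180
            and rec["count"] >= 0)

raw = [  # incidental observations from two hypothetical sources
    {"name": "Parus caeruleus", "lat": 52.1, "lon": 5.3, "count": 3},
    {"name": "Cyanistes caeruleus", "lat": 52.1, "lon": 5.3, "count": 2},
    {"name": "Cyanistes caeruleus", "lat": 95.0, "lon": 5.3, "count": 1},
]

# Aggregate QC-passing counts per accepted name and 1-degree grid cell.
ebv = {}
for rec in filter(passes_qc, raw):
    key = (normalize_name(rec["name"]), round(rec["lat"]), round(rec["lon"]))
    ebv[key] = ebv.get(key, 0) + rec["count"]
print(ebv)  # {('Cyanistes caeruleus', 52, 5): 5}
```

Real EBV workflows add the remaining steps (detection-bias correction, unit harmonization, statistical modelling, uncertainty quantification), but the shape of the pipeline is the same: raw multi-source records in, a standardized data cube out.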
Introduction
The aim of the current research was to identify the factors of interoperability of academic information systems in Islamic Azad University. This research is characterized by its applied and exploratory nature, aiming to achieve specific goals within the given context. To accomplish these objectives effectively, a mixed-methods approach combining qualitative and quantitative methodologies has been employed.

Literature Review
Information systems and information technologies have become an integral part of processes, systems, and organizational culture, and information technology as an asset and resource for creating a competitive advantage is a requirement (NooshinFard et al., 1400). In fact, information systems and information and communication technology are embedded in business procedures in order to absorb and use knowledge with the aim of improving organizational performance (Lufman and Lewis, 2016). Factors such as the variety of common information systems in Iranian universities, the use and gradual evolution of these systems, the special characteristics of the academic operating environment, the need to exchange data between different systems, and the integration of information require their investigation from various dimensions, including interoperability, which was the focus of the present study (Omidian, 1401). In the age of the knowledge explosion, the development of information technology is an essential requirement for the efficiency of the educational system, and the requirement for the effectiveness of new technologies is a transformation in the teaching-learning culture (Manzhuk and Eram, 2015).
For educational institutions, as for other modern institutions, the use of information technology does not only mean supporting management; rather, it is an empowering element that helps promote and improve decision-making at different levels of university management (Indrajit and Jokopranuto, 2006).

Methodology
The use of qualitative methods, specifically the Delphi method, allows for the extraction of the essential components and indicators related to the research subject and provides a deep, comprehensive understanding of the underlying factors. Quantitative methods, on the other hand, were employed to validate the measurement models and examine the conceptual model: through statistical tests and measurements, the researcher can assess the reliability and validity of the measures used and gain insight into the relationships between the variables outlined in the conceptual model. By employing a mixed-methods approach, this research harnesses the strengths of both methodologies, yielding a more robust and well-rounded understanding of the research subject and enhancing the validity and reliability of the findings. First, using the meta-synthesis method, study resources including books, articles, and internet sources were reviewed in a structured manner through a seven-step process, and in this way interoperability indicators were identified. The indicators and components obtained from the meta-synthesis were then compiled into a structured questionnaire.
The questionnaire was submitted to the experts using the Delphi method (qualitative approach) in order to explore their opinions at this stage. Then, based on the data collected in the qualitative stage and the experts' opinions gathered over three rounds, a final questionnaire was compiled and examined using the analytical survey method (quantitative approach) and structural equation modeling, a quantitative method that tests the correspondence of the theoretical model with real (empirical) data obtained by sampling from the population. The research method in the quantitative part is descriptive-survey. The statistical population of the research in the meta-synthesis section includes printed and online sources and documents (such as website content, databases such as Civilica, and articles and scientific reports of specialized seminars and conferences) comprising 100 sources in the form of books, articles, and scientific reports. To collect qualitative and quantitative data, a researcher-made questionnaire (50 items) was used, whose items were taken from the results of the meta-synthesis in the first stage and rated on a Likert scale from 1 (very little) to 5 (very much). To validate the meta-synthesis stage, the researcher returned to the previous steps to ensure that quality was maintained. To control quality, all articles and scientific reports were categorized and reviewed in several stages against the parameters of the study; articles that could not be trusted in terms of the accuracy, validity, or importance of their findings were removed, and 24 sources were ultimately selected: 12 internal sources (for the years 2005 to 1400) and 12 external sources (for the years 2002 to 2022).
In the quantitative section, content validity was used from the beginning to assess the validity of the questionnaire, which proved good, and Cronbach's alpha was used to measure its reliability, yielding 0.91, an acceptable coefficient. After studying the details and features of the documents (abstract, content) against the stated goals, 24 sources (12 foreign and 12 Persian) were selected, and through them the dimensions, components, and interoperability indicators used in the study were derived. The statistical population in the qualitative part of the research consists of key informants and experts in the field of information systems and senior information technology managers in Iranian universities; at this stage, 25 experts in the field of the research topic were selected. The statistical population in the quantitative section comprises managers and employees of the information technology and information systems departments at Islamic Azad University. A sample of 151 people was selected by simple random sampling, with the sample size determined by Cochran's formula. Qualitative data were collected using the library method, and quantitative data with the researcher-made 50-item questionnaire derived from the meta-synthesis in the first stage.
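The two statistics used here are standard and can be sketched directly; the population size passed to Cochran's formula below is hypothetical, since the abstract does not state it, and the two-item dataset for Cronbach's alpha is a toy example.

```python
import math

def cochran_sample_size(N, z=1.96, p=0.5, e=0.05):
    # Cochran's formula n0 = z^2 * p * (1-p) / e^2, then the
    # finite-population correction n = n0 / (1 + (n0 - 1) / N).
    n0 = z**2 * p * (1 - p) / e**2
    return math.ceil(n0 / (1 + (n0 - 1) / N))

def cronbach_alpha(items):
    # items: one list of scores per questionnaire item (same respondents).
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(c) for c in items) / var(totals))

print(cochran_sample_size(250))           # 152 for a hypothetical N = 250
print(cronbach_alpha([[1, 2, 3],
                      [1, 2, 3]]))        # 1.0: perfectly consistent items
```

With the conventional parameters (z = 1.96, p = 0.5, e = 0.05), the uncorrected sample size is about 384, so a reported sample of 151 implies a fairly small finite population.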
To analyze the data, the Sandelowski and Barroso method was used in the meta-synthesis part, and exploratory factor analysis, descriptive analysis, and univariate t-tests (using SPSS and LISREL software) were used in the quantitative part.

Results
The results showed that the technical interoperability indicators of information systems are: the ability to interact and exchange data with other information systems, the possibility of connecting to and using decision support systems, storing information in a standard format, central security, central monitoring, integrated processing, easy communication with other systems, and usability for distance education. Overall, the results of this research show that the architecture and structure of university information systems should provide the integrity and comprehensiveness of processes and information at the organizational level and a smooth flow of information between different departments of the organization. The use of interoperable information systems that cover all technical, process, and semantic interoperability indicators, activities, and tasks in an organization and provide necessary information to their users in a timely manner is one of the vital tools in today's organizations. Without systems having these characteristics (technical, semantic, and process), it is impossible to increase the capabilities of the organization, improve performance, make better decisions, and achieve an interactive, integrated, and competitive advantage. The process interoperability indicators are: notification mechanisms for service presentation and updates, change and flexibility mechanisms for service updates, a dynamic and flexible organization, change-oriented performance management, effectiveness measurement and feedback, an expandable architecture that can grow with new requirements, and a service-oriented architecture.
The results also showed that the semantic interoperability indicators are: the ability to code educational signs, the ability to interact with various systems, the use of standard terms and codes, an XML translation service, a mapping service, provision of a common message format for communication among different systems, a content-based router, and attention to how users and systems understand the vocabulary in use.

Conclusion
Attention to the optimal use of information systems can help the university succeed in achieving organizational goals with high effectiveness and efficiency. Therefore, management should always consider the characteristics of these valuable and transformative resources and, with sufficient knowledge and the participation of specialized employees in their supply, use, and deployment, improve the performance of its strategies. In order to reach more favorable levels in this field, the university should
• The interoperability models either use complex metrics or separate levels.
• The interoperability models concentrate on selective aspects of interoperability.
• The interoperability models focus on structure and content but not solutions.
• Visualization and visual analytics have a potential to assess interoperability.
Cyber-physical systems (CPS) are developed through the cooperation of several engineering disciplines. Powerful software tools are utilized by each individual discipline, but it remains challenging to connect these into tool chains for increased efficiency. To support this endeavour, the literature on interoperability assessment was surveyed to identify concepts valuable to transfer from the interoperability to the tool integration research field.
Implementation options, types of interoperability and domains described in interoperability assessment models were concepts identified as directly transferable. To avoid the problems with uptake that plague the models identified, visual analytics is suggested as a vehicle for the transfer. Furthermore, based on the use of non-functional properties as an underlying motivation for these models, cost, performance and sustainability are suggested as a common base for future research in both discourses.
In this paper, the concept of C4ISR systems interoperability is analyzed and its difference from traditional system interoperability is presented. The influencing factors of C4ISR system interoperability are analyzed from the overall framework of the US military Global Information Grid. Six attributes are selected: structure, application, facility, security, operation and maintenance, and data; the selection process for these evaluation attributes is described. Based on the enhanced interoperability maturity model, and combined with the development trend of C4ISR system technology, a C4ISR system level-attribute model is given. A grade of maturity and an evaluation index system for C4ISR systems interoperability are built, and the index-level reference model is designed. Combining qualitative with quantitative assessment, an index synthesis criterion based on the mapping model and a corresponding interoperability level evaluation method for C4ISR systems are proposed, providing a specific method model for measuring the interoperability level of a C4ISR system.
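An index-synthesis step of this kind, combining attribute scores into a maturity grade, can be sketched as below. The weights, thresholds, and grade names (borrowed from LISI-style maturity levels) are illustrative assumptions, not the paper's actual mapping model.

```python
# Hedged sketch of index synthesis: six attribute scores in [0, 1]
# are combined with weights into one index, then mapped to a grade.

ATTRIBUTES = ["structure", "application", "facility",
              "security", "operation_maintenance", "data"]
WEIGHTS = dict(zip(ATTRIBUTES, [0.20, 0.20, 0.15, 0.15, 0.15, 0.15]))
GRADES = [(0.8, "Enterprise"), (0.6, "Domain"), (0.4, "Functional"),
          (0.2, "Connected"), (0.0, "Isolated")]   # threshold -> grade

def interoperability_level(scores):
    # Weighted-sum index, then the first grade whose threshold is met.
    index = sum(WEIGHTS[a] * scores[a] for a in ATTRIBUTES)
    return next(grade for threshold, grade in GRADES if index >= threshold)

scores = {"structure": 0.9, "application": 0.8, "facility": 0.7,
          "security": 0.6, "operation_maintenance": 0.5, "data": 0.7}
print(interoperability_level(scores))   # "Domain" (index = 0.715)
```

In a qualitative-plus-quantitative scheme like the paper's, the per-attribute scores themselves would come from expert judgement and measured indices rather than being supplied directly.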