Data Quality in Rare Diseases Registries
Kodra, Yllka; Posada de la Paz, Manuel; Coi, Alessio ...
Advances in Experimental Medicine and Biology, 01/2017, Volume 1031
Journal article
Peer-reviewed
In the field of rare diseases, registries are considered a powerful tool for developing clinical research, facilitating the planning of appropriate clinical trials, improving patient care, and supporting healthcare planning. High-quality data are therefore considered one of the most important elements in the establishment and maintenance of a rare disease registry. Data quality can be defined as the totality of features and characteristics of a data set that bear on its ability to satisfy the needs arising from the intended use of the data. In the context of registries, the 'product' is data, and quality refers to data quality, meaning that the data coming into the registry have been validated and are ready for use in analysis and research. The quality of data can be determined by assessing it against a number of dimensions: completeness; validity; coherence and comparability; accessibility; usefulness; timeliness; and prevention of duplicate records. Many other factors may also influence the quality of a registry, such as the development of a standardized Case Report Form and security/safety controls for the informatics infrastructure. With the growing number of rare disease registries being established, there is a need to develop a quality validation process to evaluate the quality of each registry. A clear description of the registry is the first step when assessing data quality or evaluating the registry. Here we report a template to guide registry owners in describing their registry.
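As a minimal illustration, two of the dimensions listed above (completeness and prevention of duplicate records) can be checked programmatically. This is a sketch only: the record schema and field names are hypothetical assumptions, not a real registry model.

```python
# Hypothetical registry records; the schema is an assumption for illustration.
REQUIRED_FIELDS = ["patient_id", "diagnosis_code", "date_of_birth"]

records = [
    {"patient_id": "P1", "diagnosis_code": "ORPHA:558", "date_of_birth": "1990-04-02"},
    {"patient_id": "P2", "diagnosis_code": "", "date_of_birth": "1985-11-19"},
    {"patient_id": "P1", "diagnosis_code": "ORPHA:558", "date_of_birth": "1990-04-02"},
]

def completeness(records, fields):
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for r in records for f in fields if r.get(f))
    return filled / (len(records) * len(fields))

def duplicate_ids(records):
    """Patient IDs appearing more than once (candidate duplicate records)."""
    seen, dupes = set(), set()
    for r in records:
        pid = r["patient_id"]
        (dupes if pid in seen else seen).add(pid)
    return dupes

print(round(completeness(records, REQUIRED_FIELDS), 2))  # 0.89 (one empty field)
print(duplicate_ids(records))                            # {'P1'}
```

A real validation process would cover the remaining dimensions (validity, coherence, timeliness, and so on) against the registry's own Case Report Form definitions.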
Abstract
Objective
This study evaluated the degree to which recommendations for demographic data standardization improve patient matching accuracy using real-world datasets.
Materials and Methods
We used 4 manually reviewed datasets containing a random selection of matches and nonmatches. The matching datasets included health information exchange (HIE) records, public health registry records, Social Security Death Master File records, and newborn screening records. Standardized fields included last name, telephone number, social security number, date of birth, and address. Matching performance was evaluated using 4 metrics: sensitivity, specificity, positive predictive value, and accuracy.
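The four metrics above are standard confusion-matrix quantities over manually reviewed record pairs. As a sketch, with invented counts (not the study's data):

```python
def matching_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, positive predictive value, and accuracy
    from counts of true/false positive and negative match decisions."""
    return {
        "sensitivity": tp / (tp + fn),                 # true matches found
        "specificity": tn / (tn + fp),                 # true nonmatches rejected
        "ppv": tp / (tp + fp),                         # declared matches that are real
        "accuracy": (tp + tn) / (tp + fp + tn + fn),   # all correct decisions
    }

# Illustrative counts only.
m = matching_metrics(tp=90, fp=5, tn=95, fn=10)
print({k: round(v, 3) for k, v in m.items()})
# {'sensitivity': 0.9, 'specificity': 0.95, 'ppv': 0.947, 'accuracy': 0.925}
```

Note how sensitivity and specificity can move in opposite directions, which is exactly the trade-off the Results section reports for address and last-name standardization.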
Results
Standardizing address was independently associated with improved matching sensitivity for both the public health and HIE datasets, of approximately 0.6% and 4.5%, respectively. Overall accuracy was unchanged for both datasets due to reduced match specificity. We observed no similar impact for address standardization in the Death Master File dataset. Standardizing last name yielded improved matching sensitivity of 0.6% for the HIE dataset, while overall accuracy remained the same due to a decrease in match specificity. We noted no similar impact for other datasets. Standardizing other individual fields (telephone, date of birth, or social security number) showed no matching improvements. As standardizing address and last name improved matching sensitivity, we examined the combined effect of address and last name standardization, which showed that standardization improved sensitivity from 81.3% to 91.6% for the HIE dataset.
Conclusions
Data standardization can improve match rates, thus ensuring that patients and clinicians have better data on which to make decisions to enhance care quality and safety.
Purpose of review
The purpose of this review is to provide an overview of updates in data standardization and data privacy in ophthalmology. These topics represent two key aspects of medical information sharing and are important knowledge areas given trends in data-driven healthcare.
Recent findings
Standardization and privacy can be seen as complementary aspects that pertain to data sharing. Standardization promotes the ease and efficacy through which data is shared. Privacy considerations ensure that data sharing is appropriate and sufficiently controlled. There is active development in both areas, including government regulations and common data models to advance standardization, and application of technologies such as blockchain and synthetic data to help tackle privacy issues. These advancements have seen use in ophthalmology, but there are areas where further work is required.
Summary
Information sharing is fundamental to both research and care delivery, and standardization/privacy are key constituent considerations. Therefore, widespread engagement with, and development of, data standardization and privacy ecosystems stand to offer great benefit to ophthalmology.
•We investigate digital interoperability (DI) issues in logistics and SCM.
•We discuss new and important DI challenges under the Physical Internet (PI) paradigm.
•A bibliometric review shows the current solutions are not fully aligned with PI.
•Interdisciplinary approaches and solutions are studied for new research perspectives.
•New research avenues are suggested to advance the research and application of PI.
Interoperability is playing an increasing role in today's logistics and supply chain management (LSCM) because of the trends toward cooperation and coopetition. In particular, digital interoperability concerning data and information exchange is becoming a key enabler for the next evolutions, which will rely massively on digitalization, artificial intelligence, and autonomous systems. The notion of the Physical Internet (PI) is one such evolution: an innovative worldwide logistics paradigm aimed at interconnecting and coordinating logistics networks for efficiency and sustainability. This paper investigates how digital interoperability can help interconnect logistics and supply networks, as well as the operational solutions for sustainable development, and examines the new challenges and research opportunities for digital interoperability under the PI paradigm. To this end, we study the most relevant technologies for digital interoperability in LSCM via a bibliometric analysis based on 208 papers published during 2010−2020. The results reveal that the present state-of-the-art solutions for digital interoperability are not fully aligned with PI requirements, and they expose new challenges, research gaps, and opportunities that need further discussion. Accordingly, several research avenues are suggested to advance research and applications in this area, and to achieve interconnection in logistics and supply networks for sustainability.
•The power quality impact of EV fast charging and standard compliance is investigated.
•THD, TDD, and individual harmonic amplitudes and phase angles were analysed.
•Standard compliance depends on short-circuit values; TDD use is advised over THD.
•Simulation of two EVs shows no cancellation effect, as the phase angle range is within 90°.
•The connection limit for fast chargers is set by harmonic limits, not only by power.
Fast charging is perceived by users as a preferred method for extending the average daily mobility of electric vehicles (EVs). The rated power of fast chargers, their expected operation during peak hours, and their clustering in designated stations raise significant concerns. On one hand, there are concerns about standard requirements for power quality, especially harmonic distortion, due to the use of power electronics connected to large loads, typically ranging from 18 to 24 kWh. On the other hand, infrastructure dimensioning and design limitations need to be considered by those investing in such facilities. Four sets of measurements were performed during the complete charging cycle of an EV, and the amplitude and phase angle behaviour of individual harmonics were analysed. In addition, the voltage and current total harmonic distortion (THD) and total demand distortion (TDD) were calculated, and the results were compared with the IEEE 519 and IEC 61000/EN 50160 standards. Additionally, two vehicles being fast charged while connected to the same feeder were simulated, and an analysis was carried out on how their harmonic phase angles would relate. The study concluded that TDD was a better indicator than THD, since the former uses the maximum demand current (IL) while the latter uses the fundamental current, which can lead to misleading conclusions; it is therefore suggested that TDD be included in IEC/EN standard updates. Voltage THD and TDD for the charger analysed were within the standard limits, at 1.2% and 12% respectively; however, individual harmonics (11th and 13th) failed to comply with the 5.5% limit in IEEE 519 (5% and 3%, respectively, in IEC 61000). Phase angles tended to stay within preferential ranges of difference from the fundamental wave. The average difference between phase angles of the same harmonic order was found to be lower than 90°, meaning that when more than one vehicle is connected to the same feeder the amplitudes will add.
Since the limits depend on the upstream short-circuit current (ISC), if the number of vehicles increases (i.e. IL increases), the standard limits will decrease and eventually be exceeded. The harmonic limits are therefore the primary binding condition, reached before the power limitation: the initial limit on the number of chargers is not the power capacity of the upstream circuit but the harmonic limits on electricity pollution.
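The THD/TDD distinction the study draws can be sketched numerically. The harmonic current amplitudes below are illustrative values, not the measured data; the point is only that THD is referenced to the fundamental current at the moment of measurement, while TDD is referenced to the maximum demand current IL:

```python
import math

def thd(i1, harmonics):
    """Total harmonic distortion: harmonic content relative to the
    fundamental current i1 at the moment of measurement."""
    return math.sqrt(sum(ih**2 for ih in harmonics)) / i1

def tdd(il, harmonics):
    """Total demand distortion: the same harmonic content, but relative
    to the maximum demand current IL, as used by IEEE 519."""
    return math.sqrt(sum(ih**2 for ih in harmonics)) / il

harmonics = [4.0, 2.5, 1.2]   # illustrative harmonic currents, in amperes
i1_now = 20.0                 # fundamental current right now (light load)
il_max = 100.0                # maximum demand current of the installation

print(f"THD = {thd(i1_now, harmonics):.1%}")   # 24.3%
print(f"TDD = {tdd(il_max, harmonics):.1%}")   # 4.9%
```

At light load the THD denominator shrinks while the harmonic currents need not, which inflates THD and can mislead; referencing IL instead, as TDD does, avoids this, which is the basis of the study's recommendation.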
•An ontology for describing and supporting an interoperability assessment is presented.
•The proposed ontology provides a sound description of the relevant concepts, relationships, and logic rules related to interoperability assessment.
•The development of the proposed ontology follows a systems engineering approach.
•The proposed ontology is applied in a real case scenario.
Enterprise interoperability is a requirement for ensuring effective collaboration within a network of enterprises. Interoperability should therefore be continuously assessed and improved to avoid collaboration issues. To do so, the concerned enterprises can perform an interoperability assessment, which provides an overview of the enterprise systems' strengths and weaknesses regarding interoperability. A plethora of assessment approaches have been proposed in the literature, the majority of which focus on a single aspect of interoperability. In general, to obtain a holistic view of the assessed systems, i.e. to consider different aspects, enterprises have to apply different approaches. However, the application of multiple approaches may cause redundancy and confusion when the same system is assessed using different metrics and viewpoints. This article therefore proposes an ontology for interoperability assessment. The main objective of such an ontology is to provide a sound description of all relevant concepts and relationships regarding an interoperability assessment. Inference rules are also provided for reasoning about interoperability problems. A case study based on a real enterprise is presented to evaluate the proposed ontology.
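To give a flavour of what "inference rules for reasoning about interoperability problems" means, here is a heavily simplified sketch in plain Python rather than an ontology language. The concepts (systems, data formats, a data-level barrier) echo the abstract, but the specific rule, class names, and example systems are assumptions for illustration, not the article's actual OWL model.

```python
from dataclasses import dataclass

@dataclass
class System:
    """A hypothetical enterprise system with the data format it exchanges."""
    name: str
    data_format: str

def infer_data_barrier(a, b):
    """Toy inference rule: two collaborating systems that exchange data in
    incompatible formats have a data-level interoperability barrier."""
    if a.data_format != b.data_format:
        return (f"Data barrier between {a.name} and {b.name}: "
                f"{a.data_format} vs {b.data_format}")
    return None

# Hypothetical example systems.
erp = System("ERP", "EDIFACT")
wms = System("WMS", "XML")
print(infer_data_barrier(erp, wms))
# Data barrier between ERP and WMS: EDIFACT vs XML
```

A real ontology would express such rules declaratively (e.g. in OWL/SWRL) so that a reasoner can derive barriers across many concepts and relationships at once, rather than encoding each check by hand.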
eHealth has an enormous potential to improve healthcare cost, effectiveness, and quality of care. However, there seems to be a gap between the foreseen benefits of research and clinical reality.
Our objective was to systematically review the factors influencing the outcome of eHealth interventions in terms of success and failure.
We searched the PubMed database for original peer-reviewed studies on implemented eHealth tools that reported on the factors for the success or failure, or both, of the intervention. We conducted the systematic review by following the patient, intervention, comparison, and outcome framework, with 2 of the authors independently reviewing the abstract and full text of the articles. We collected data using standardized forms that reflected the categorization model used in the qualitative analysis of the outcomes reported in the included articles.
Among the 903 identified articles, a total of 221 studies complied with the inclusion criteria. The studies were heterogeneous by country, type of eHealth intervention, method of implementation, and reporting perspectives. The article frequency analysis did not show a significant discrepancy between the number of reports on failure (392/844, 46.4%) and on success (452/844, 53.6%). The qualitative analysis identified 27 categories that represented the factors for success or failure of eHealth interventions. A quantitative analysis of the results revealed the category quality of healthcare (n=55) as the most mentioned as contributing to the success of eHealth interventions, and the category costs (n=42) as the most mentioned as contributing to failure. For the category with the highest unique article frequency, workflow (n=51), we conducted a full-text review. The analysis of the 23 articles that met the inclusion criteria identified 6 barriers related to workflow: workload (n=12), role definition (n=7), undermining of face-to-face communication (n=6), workflow disruption (n=6), alignment with clinical processes (n=2), and staff turnover (n=1).
The reviewed literature suggested that, to increase the likelihood of success of eHealth interventions, future research must ensure a positive impact on the quality of care, with particular attention given to improved diagnosis, clinical management, and patient-centered care. There is a critical need to perform in-depth studies of the workflow(s) that the intervention will support and to understand the clinical processes involved.