M-Passport is a mobile application developed for Indonesians to request passports online. Applicants independently input all required data using this application, so the quality of the data entered must be considered to ensure the passport’s validity as an official state document. However, input errors increase the time needed for the interview process and make the data verification procedure inefficient. This research aims to assess the data quality of M-Passport so that the organization can take deliberate actions to enhance it. The research applies the Total Data Quality Management (TDQM) method and the Data Management Body of Knowledge (DMBOK). Six data quality dimensions are used: completeness, validity, accuracy, timeliness, uniqueness, and consistency. The measure phase is carried out on 17 entities in the M-Passport database through a query process in the production environment. The analysis phase then examines the problems based on the pre-determined dimensional classification groups. The results indicate that the average values of completeness, validity, accuracy, consistency, timeliness, and uniqueness are 99.20%, 99.41%, 100%, 90.68%, 78.52%, and 99.98%, respectively. According to the findings, timeliness and consistency are the dimensions that least fulfill the business rules, indicating that the organization needs to focus on improving data quality in these dimensions. Based on the DMBOK, the research also generates recommendations for resolving technical and operational issues.
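The measure phase above scores each dimension per database entity via queries. As an illustration only (the paper's actual schema and queries are not given here), the completeness and uniqueness dimensions can be sketched in Python over a toy applicant table with hypothetical field names:

```python
# Illustrative sketch, not the paper's actual queries: measuring two of the
# six data quality dimensions (completeness, uniqueness) on a toy table.
# Field names such as "nik" and "birth_date" are hypothetical.
from collections import Counter

records = [
    {"nik": "3174012345678901", "name": "Andi", "birth_date": "1990-01-15"},
    {"nik": "3174012345678902", "name": "Budi", "birth_date": None},
    {"nik": "3174012345678901", "name": "Citra", "birth_date": "1985-07-30"},
]

def completeness(rows, field):
    """Share of rows whose `field` is populated (non-null, non-empty)."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def uniqueness(rows, key):
    """Share of rows whose `key` value appears exactly once in the table."""
    counts = Counter(r[key] for r in rows)
    unique = sum(1 for r in rows if counts[r[key]] == 1)
    return unique / len(rows)

print(f"completeness(birth_date) = {completeness(records, 'birth_date'):.2%}")
print(f"uniqueness(nik)          = {uniqueness(records, 'nik'):.2%}")
```

In the study itself such checks run as SQL queries in the production environment, and the per-entity percentages are averaged into the dimension scores reported above.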
Data Quality Management in Educational Data
Wilantika, Nori; Wibowo, Wahyu Catur
Jurnal sistem informasi (Journal of Information Systems) (Online), 10/2019, Volume 15, Issue 2
Journal Article
Peer-reviewed
Open access
Every university in Indonesia is responsible for ensuring the completeness, validity, accuracy, and currency of its educational data. The educational data is used for implementing the higher-education quality assurance system and formulating policies related to universities and majors in Indonesia. The data quality assessment result indicates that the educational data at Statistics Polytechnic did not meet the completeness, validity, accuracy, and currency criteria. Data quality management maturity was measured using Loshin’s Data Quality Maturity Model, which places the institution between level 1 and level 2 of maturity. Only the data quality dimensions component has achieved the expected target. Thus, recommendations have been proposed based on the DAMA-DMBOK framework. The activities to be carried out are: developing and promoting awareness of data quality; defining data quality requirements; profiling, analyzing, and evaluating data quality; defining business rules for data quality; establishing and evaluating data quality service levels; managing problems related to data quality; designing and implementing operational procedures for data quality management; and monitoring the operations and performance of data quality management procedures.
•A firm's competence in maintaining the quality (i.e., consistency and completeness) of corporate data positively affects the firm's adoption intention for big data analytics.
•A firm's positive experience (i.e., benefit perceptions) in utilizing external source data could encourage its adoption intention for big data analytics.
•A firm's positive experience (i.e., benefit perceptions) in utilizing internal source data could hamper its adoption intention for big data analytics.
Big data analytics associated with database searching, mining, and analysis can be seen as an innovative IT capability that can improve firm performance. Even though some leading companies are actively adopting big data analytics to strengthen market competition and to open up new business opportunities, many firms are still in the early stage of the adoption curve due to lack of understanding of and experience with big data. Hence, it is interesting and timely to understand issues relevant to big data adoption. In this study, a research model is proposed to explain the acquisition intention of big data analytics mainly from the theoretical perspectives of data quality management and data usage experience. Our empirical investigation reveals that a firm's intention for big data analytics can be positively affected by its competence in maintaining the quality of corporate data. Moreover, a firm's favorable experience (i.e., benefit perceptions) in utilizing external source data could encourage future acquisition of big data analytics. Surprisingly, a firm's favorable experience (i.e., benefit perceptions) in utilizing internal source data could hamper its adoption intention for big data analytics.
Although business analytics (BA) has been increasingly adopted by businesses, there is limited empirical research examining the drivers of each stage of BA adoption in organizations. Drawing upon the technological-organizational-environmental (TOE) framework and the innovation diffusion process, we developed an integrative model to examine BA adoption processes and tested it with 170 Korean firms. The analysis shows that data-related technological characteristics drive all stages of BA adoption: initiation, adoption, and assimilation. While organizational characteristics are associated with the adoption and assimilation stages, only competition intensity among the environmental characteristics is associated with the initiation stage. Our findings help practitioners and researchers understand what factors can enable companies to adopt BA at each stage.
In the realm of smart manufacturing, predictive maintenance plays a pivotal role in ensuring equipment reliability, minimizing downtime, optimizing costs, and reducing the product failure rate by detecting faulty products at an early stage. However, the efficacy of predictive maintenance hinges on the quality of the training data employed for predictive modeling. Inadequate data quality can lead to erroneous predictions and, consequently, ineffective maintenance strategies. This research paper introduces a solution that leverages deep learning and data quality management to enhance predictive maintenance in smart manufacturing. Our proposed methodology encompasses two major components: data quality management and faulty product detection. Data quality management includes data preprocessing, feature reduction, and data balancing, whereas faulty product detection is done using deep learning techniques. By combining these elements, a predictive model capable of accurately forecasting faulty products at early stages to reduce economic losses is developed. The proposed approach effectively addresses data quality issues and is tested on the SECOM dataset, where it surpasses traditional models with an accuracy of 96.4% and perfect recall. Ultimately, this research contributes to the advancement of predictive maintenance and deep learning in the context of smart manufacturing, benefiting both industrial practitioners and researchers alike.
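One data quality management step the abstract names is data balancing, needed because faulty products are rare relative to good ones. A minimal sketch, assuming random oversampling of the minority class (the paper's exact balancing technique and class counts are not stated here):

```python
# Illustrative sketch of data balancing via random oversampling of the
# minority ("faulty") class. The labels and counts are hypothetical,
# not taken from the SECOM dataset.
import random

random.seed(0)
samples = [("ok", i) for i in range(95)] + [("faulty", i) for i in range(5)]

def oversample(data, label_of=lambda s: s[0]):
    """Duplicate minority-class samples (with replacement) until all
    classes reach the size of the largest class."""
    by_class = {}
    for s in data:
        by_class.setdefault(label_of(s), []).append(s)
    target = max(len(v) for v in by_class.values())
    balanced = []
    for cls, items in by_class.items():
        balanced.extend(items)
        balanced.extend(random.choices(items, k=target - len(items)))
    return balanced

balanced = oversample(samples)
print({cls: sum(1 for s in balanced if s[0] == cls) for cls in ("ok", "faulty")})
```

After balancing, both classes contribute equally to training, which is what lets a detector achieve high recall on the rare faulty class.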
Data quality plays a key role in big data management today. With the explosive growth of data from a variety of sources, the quality of data faces multiple problems. Motivated by this, we study in this paper multiple data cleaning for incompleteness and inconsistency, with currency reasoning and determination. We introduce a 4-step framework, named Imp3C, for error detection and quality improvement in incomplete and inconsistent data without timestamps. We provide an integrated currency determination method to compute the currency orders among tuples according to currency constraints. Thus, the inconsistent data and missing values are repaired effectively, taking the temporal impact into account. For both effectiveness and efficiency, we carry out the inconsistency repair ahead of the incompleteness repair. A currency-related consistency distance metric is defined to measure the similarity between dirty tuples and clean ones more accurately. In addition, currency orders are treated as an important feature in the missing-value imputation training process. The solution algorithms are introduced in detail with case studies. A thorough experiment on three real-life datasets verifies that our method Imp3C improves the performance of data repairing under multiple quality problems. Imp3C outperforms the existing advanced methods, especially on datasets with complex currency orders.
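The core idea behind currency determination is that even without timestamps, domain rules ("currency constraints") can order two tuples of the same entity in time. A toy sketch of that reasoning, with a hypothetical job-title ladder standing in for a real currency constraint (this rule is an illustration, not one from the paper):

```python
# Illustrative currency reasoning: job titles only move up a hypothetical
# ladder, so the tuple with the lower title must be the older one.
LADDER = {"intern": 0, "engineer": 1, "senior": 2, "manager": 3}

def currency_order(t1, t2):
    """Return -1 if t1 is older than t2, 1 if newer, 0 if undetermined."""
    r1, r2 = LADDER[t1["title"]], LADDER[t2["title"]]
    return (r1 > r2) - (r1 < r2)

old = {"name": "Lin", "title": "engineer", "city": "Harbin"}
new = {"name": "Lin", "title": "senior", "city": "Beijing"}

print(currency_order(old, new))   # t1 precedes t2
```

Once the order is known, a repair step can prefer the more current tuple's value (here, the city) when resolving inconsistencies or imputing missing values, which is the temporal impact the abstract refers to.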
The most successful organizations in the world are data-driven businesses. Data is at the core of the business of many organizations as one of their most important assets, since the decisions they make cannot be better than the data on which they are based. For this reason, organizations need to be able to trust their data. One important activity that helps to achieve data reliability is the evaluation and certification of the quality level of organizational data repositories. This paper describes the results of applying a data quality evaluation and certification process to the repositories of three European organizations belonging to different sectors. We present findings from the point of view of both the data quality evaluation team and the organizations that underwent the evaluation process. In this respect, several benefits were explicitly recognized by the involved organizations after achieving the data quality certification for their repositories (e.g., long-term organizational sustainability, better internal knowledge of data, and more efficient management of data quality). As a result of this experience, we have also identified a set of best practices aimed at enhancing the data quality evaluation process.
•Evaluation and certification contribute to data reliability and trust.
•ISO/IEC 25012, 25024 and 25040 are a very good base for a data quality assessment framework.
•Data quality evaluation allows organizations to acquire a deeper knowledge of their business processes.
•Risks related to data quality could be mitigated by a data quality evaluation process.
Nowadays, IoT is being used in more and more application areas, and the importance of IoT data quality is widely recognized by practitioners and researchers. The requirements for data and its quality vary from application to application and from organization to organization in different contexts. Many methodologies and frameworks include techniques for defining, assessing, and improving data quality. However, due to the diversity of requirements, it can be a challenge to choose the appropriate technique for an IoT system. This paper surveys data quality frameworks and methodologies for IoT data, and related international standards, comparing them in terms of data types, data quality definitions, dimensions and metrics, and the choice of assessment dimensions. The survey is intended to help narrow down the possible choices of IoT data quality management technique.