91 research outputs found

    DATA QUALITY ASSESSMENT METHODS AS A RECOMMENDATION FOR THE NATIONAL SCIENTIFIC REPOSITORY SYSTEM

    High-quality data and efficient data quality assessment are needed for data standardization in research data repositories. The three most frequently used attributes, namely completeness, accuracy, and timeliness, are the dimensions for data quality assessment. The purpose of this research is to broaden knowledge of, and discuss in depth, the research done to date. To support the research, we used a traditional review method on the Scopus database to identify relevant work. The literature review was limited to documents of the types article, book, proceedings, and review. The search results were filtered using the keywords data quality, data quality assessment, data quality dimensions, quality assessment, data accuracy, and data completeness. The documents found were analyzed for relevance, then compared to identify the differences between the concepts and methods used in data quality metrics. The result of the analysis can be used as a recommendation for implementing data quality assessment in the National Scientific Repository.
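
    The three dimensions this abstract names can be made concrete with a minimal sketch. The records, required-field list, and the linear timeliness decay below are illustrative assumptions, not taken from the paper:

    ```python
    from datetime import date

    # Hypothetical repository metadata records (illustrative only).
    records = [
        {"title": "Dataset A", "author": "Sari", "year": 2021},
        {"title": "Dataset B", "author": None, "year": 2015},
    ]

    REQUIRED = ("title", "author", "year")

    def completeness(rec):
        """Fraction of required fields that are populated."""
        return sum(rec.get(f) is not None for f in REQUIRED) / len(REQUIRED)

    def timeliness(rec, horizon_years=5, today=date(2023, 1, 1)):
        """1.0 for current records, decaying linearly to 0 past the horizon."""
        age = today.year - rec["year"]
        return max(0.0, 1.0 - age / horizon_years)

    scores = [(completeness(r), timeliness(r)) for r in records]
    ```

    Accuracy, the third dimension, typically needs a reference ("gold") source to compare against, so it is omitted here.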

    Collaboratively Assessing Information Quality on the Web

    The Web has become a large repository of information of varying quality. Many users consume information without knowing its quality. Although automatic methods can measure certain aspects of quality, they are not reliable and cannot measure all aspects. Users can detect errors and reliably assess aspects of quality that automatic methods cannot, but there is a lack of technology support for users to record and share their feedback. This research aims to develop technologies that allow users to collaboratively assess information quality on the Web. The solution combines the capabilities of machines and humans to obtain comprehensive, reliable, and scalable measurements of information quality. In this paper, the crucial user interaction component of the solution is presented. It uses a browser plug-in that allows users to rate and annotate any Web page and share those ratings and annotations with other users.
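
    The back end of such a collaborative system would at minimum aggregate per-page ratings. A minimal sketch, assuming a hypothetical flat list of (URL, score) pairs rather than the paper's actual data model:

    ```python
    from collections import defaultdict

    # Hypothetical user ratings collected by the browser plug-in.
    ratings = [
        ("https://example.org/a", 4),
        ("https://example.org/a", 5),
        ("https://example.org/b", 2),
    ]

    # Group scores by URL, then take the mean as the page's quality score.
    by_url = defaultdict(list)
    for url, score in ratings:
        by_url[url].append(score)

    mean_quality = {url: sum(s) / len(s) for url, s in by_url.items()}
    ```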

    Information Resilience in a Digital Built Environment

    Information is the underpinning driver in the Digitised Built Environment and crucial to the Centre for Digital Built Britain’s agenda. Threats to information affect the intrinsic, relational and security dimensions of information quality. Therefore, the DBE requires capabilities of people, and requirements of the process, software and hardware, for threat prevention and reduction. Existing research and protocols seldom outline the capabilities and requirements needed to reduce threats to information. The aim of this report is to develop an information resilience framework which outlines the capabilities and requirements needed to ensure the resilience of information throughout its lifecycle: creation, use, storage, reuse, preservation and destruction. The findings highlight the need for people’s (stakeholder) competencies and behaviours, which are driven by cognitive abilities such as attention, learning, reasoning and perception. Furthermore, process requirements were identified, such as embedding a validation check process, standard requirements for Level of Detail, and digital upskilling, among others. Additionally, identified software requirements include the ability to be customised to meet project needs, detect conflicts and provide context for information. Finally, hardware requirements encompass facilitating backup, having a high-capacity system and being inaccessible to peripherals. This research will be further extended into the development of a decision-making assessment tool to measure capabilities and requirements across the entire lifecycle of built assets.

    Matching data detection for the integration system

    The purpose of data integration is to combine multiple sources of heterogeneous data available on the internet, such as text, images, and video. After this stage, the data becomes large, so it must be analyzed to enable efficient query execution. However, entity resolution remains a problem, and different techniques are needed to analyze and verify data quality in order to achieve good data management. When a single database is involved, this mechanism is called deduplication. To address these problems, this article proposes a method for calculating the similarity between potentially duplicate records. The solution relies on graph technology to narrow the search space for similar features; a composite mechanism then locates the most similar records in the database, improving data quality so that good decisions can be made from heterogeneous sources.
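
    The general pattern this abstract describes, narrowing the comparison space first and then scoring candidate pairs, can be sketched as follows. This is not the paper's algorithm: the blocking key (first token) and the Jaccard token similarity are stand-in choices for illustration:

    ```python
    def jaccard(a, b):
        """Token-set Jaccard similarity between two strings."""
        sa, sb = set(a.lower().split()), set(b.lower().split())
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    def find_duplicates(records, threshold=0.6):
        # Blocking step: only compare records sharing the same first token,
        # which narrows the otherwise quadratic search space.
        blocks = {}
        for i, text in enumerate(records):
            blocks.setdefault(text.lower().split()[0], []).append(i)
        pairs = []
        for ids in blocks.values():
            for x in range(len(ids)):
                for y in range(x + 1, len(ids)):
                    i, j = ids[x], ids[y]
                    if jaccard(records[i], records[j]) >= threshold:
                        pairs.append((i, j))
        return pairs

    names = [
        "Data Quality Assessment",
        "data quality assessment methods",
        "Graph Integration",
    ]
    dupes = find_duplicates(names)
    ```

    In practice the blocking key and similarity measure are the main tuning points: a key that is too coarse loses the speedup, while one that is too fine misses true duplicates.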

    Data quality management and evolution of information systems

    Information systems have been rapidly evolving from monolithic/transactional to network/service-based systems. The issue of data quality is becoming increasingly important, since information in new information systems is ubiquitous, diverse, and uncontrolled. In the paper we examine data quality from the point of view of the dimensions and methodologies proposed for data quality measurement and improvement. Dimensions and methodologies are examined in their relationship with the different types of data, from structured to unstructured, the evolution of information systems, and the diverse application areas.
    The past and the future of information systems: 1976-2006 and beyond. Red de Universidades con Carreras en Informática (RedUNCI).

    Decision Useful Financial Reporting Information Characteristics: An Empirical Validation of the Proposed FASB/IASB International Accounting Model

    As part of a future international accounting standard, the US Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) recently updated their description of the financial reporting information characteristics that determine its decision usefulness for end users. Yet the relationships inherent in the description have not been empirically validated. If invalid, the description may globally misguide future professional information efforts for a multitude of business users and decisions. A causal model of decision-useful financial reporting information characteristics is created from the description, then evaluated using partial least squares and survey data from business information users as defined by the international standard. The model significantly predicted user perceptions of key information constructs (Decision Usefulness [76%], Relevance [62%], and Faithful Representativeness [57%]; R² values, p<0.01). However, theoretically and practically important constructs (Verifiability, Completeness, Faithful Representativeness) did not significantly contribute to the model.

    MODELING THE INFORMATION QUALITY OF OBJECT TRACKING SYSTEMS

    Advances in information and communication technologies, such as Radio Frequency Identification (RFID), mobile and wireless mesh networks, bring us closer to the vision of the “Internet of Things”, a global network of people, products or objects that are easily readable, recognizable, locatable, and manageable over the World Wide Web. Such a network can provide ubiquitous and real-time information on movements of objects, and object tracking systems monitor moving objects and register their on-going location in the context of higher-level applications, such as supply chain management, food traceability and retail, where monitoring of objects is required. This paper investigates the information quality of object tracking systems and proposes an analytical model that measures their degree of information completeness based on the scope and depth of their data capturing capabilities. We demonstrate that the information completeness of object tracking systems is influenced by their configuration. The model may be used for both ex-ante and ex-post evaluations of object tracking systems, under the auspices of their information quality requirements, considering that their use is expected to blossom in the “Internet of Things” era.
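
    One simple way to operationalize "information completeness" for a tracking system is as the fraction of expected read events actually captured. This is an assumed formulation for illustration, not the paper's exact model, and the object/checkpoint names are hypothetical:

    ```python
    def information_completeness(expected_events, captured_events):
        """Fraction of expected (object, checkpoint) read events captured, in [0, 1]."""
        expected = set(expected_events)
        if not expected:
            return 1.0
        return len(set(captured_events) & expected) / len(expected)

    # Scope: which objects are tagged; depth: which checkpoints can read them.
    expected = {
        ("pallet-1", "gate"), ("pallet-1", "dock"),
        ("pallet-2", "gate"), ("pallet-2", "dock"),
    }
    captured = {
        ("pallet-1", "gate"), ("pallet-1", "dock"),
        ("pallet-2", "gate"),  # the dock reader missed pallet-2
    }
    score = information_completeness(expected, captured)
    ```

    Changing the configuration, e.g. adding a reader at the dock, changes the set of capturable events and hence the score, which is the dependency the abstract highlights.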

    INFORMATION QUALITY ASSESSMENT: VALIDATING MEASUREMENT DIMENSIONS AND PROCESSES

    Over the last two decades, information quality has emerged as a critical concern for most organisations. Prior research provides several approaches to measuring information quality, and many case studies illustrate the difficulties in assessing it. In this paper, we tackle the problem of assessing information quality and propose a framework for implementing information quality assessment in practice. Our framework incorporates two major components: a set of valid measurement dimensions and a measurement process. We have tested the validity, reliability and usefulness of the dimensions and applied the measurement process to an example dataset. In addition, our study demonstrates typical information quality problems in the example dataset and their potential impact on organisations.