
    Integrity Coded Databases: Ensuring Correctness and Freshness of Outsourced Databases

    In recent years, cloud storage has become an inexpensive and convenient option for individuals and businesses to store and retrieve information. The cloud frees the data owner from the financial burden of hiring professionals to create, update, and maintain local databases. Advances in networking and the growing need for computing resources for various applications have made cloud computing increasingly popular. These advantages make the cloud an attractive option for data storage, but the service comes at a cost: the data owner must relinquish control of their information to the cloud service provider. There thus remains the possibility of malicious insider attacks on the data, involving the addition, omission, or manipulation of records. This paper presents a novel Integrity Coded Database (ICDB) approach for ensuring data correctness and freshness in the cloud. It offers integrity verification of queried data at different granularities, from coarse-grained protection of the entire returned dataset down to fine-grained protection of each tuple or even each attribute. ICDB allows data owners to insert integrity codes into a database, outsource the database to the cloud, run queries against the cloud database server, and verify that the queried information is both correct and fresh. An ICDB prototype has been developed in order to benchmark several ICDB schemes and evaluate their performance.
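    The abstract does not spell out how the integrity codes are built, but a common construction for this kind of scheme is a keyed MAC per attribute, with a version counter for freshness. The sketch below is a hypothetical illustration of that idea, not the actual ICDB construction; the key, message layout, and version field are assumptions.

    ```python
    import hashlib
    import hmac

    SECRET_KEY = b"owner-private-key"  # held only by the data owner (assumption)

    def attribute_code(table, pk, attr, value, version):
        """Integrity code for one attribute value; `version` provides freshness."""
        msg = f"{table}|{pk}|{attr}|{value}|{version}".encode()
        return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

    def verify(table, pk, attr, value, version, code):
        """Recompute the code over the returned value and compare in constant time."""
        expected = attribute_code(table, pk, attr, value, version)
        return hmac.compare_digest(expected, code)

    # The owner stores each value together with its code, then checks query results.
    code = attribute_code("accounts", 42, "balance", "100.00", version=7)
    assert verify("accounts", 42, "balance", "100.00", 7, code)      # correct and fresh
    assert not verify("accounts", 42, "balance", "999.00", 7, code)  # tampered value
    assert not verify("accounts", 42, "balance", "100.00", 6, code)  # stale version
    ```

    A per-tuple code could be formed the same way over the concatenation of all attributes, giving the coarser granularities the abstract mentions.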

    Extensions to the self protecting object model to facilitate integrity in stationary and mobile hosts

    M.Sc. (Computer Science). In this dissertation we propose extensions to the Self Protecting Object (SPO) model to facilitate the sharing of information in a more effective manner. We see the sharing of information as the sharing of objects that provide services. Sharing objects effectively means allowing the objects to be used in a secure environment, independent of their location, in the manner their usage was intended. The SPO model proposed by Olivier [32] allows objects in a federated database to be moved from one site to another and ensures that the security policy of an object will always be respected and enforced, regardless of its location. Although the SPO model does indeed allow objects (information) to be shared effectively, it fails to address the issue of maintaining integrity within objects. We therefore define the notion of maintaining integrity within the SPO model and propose a model to achieve it. We argue that ensuring an SPO is only used in the way its usage was intended does not suffice to ensure integrity. The model we propose is based on ensuring that modifications to an SPO are only executed if the modification does not violate the constraints defined for the SPO. The model allows an SPO to maintain its unique identity in addition to maintaining its integrity. The SPO model is designed to be used in a federated database on sites that are stationary. Having addressed the issue of maintaining integrity within SPOs on stationary sites in the federated database, we then introduce the notion of a mobile site: a site that will eventually disconnect from the federated database and become unreachable for some time. Introducing the mobile site into the federated database allows us to propose the Mobile Self Protecting Object (MSPO) and its associated architecture. Because of the nature of mobile sites, the original model for maintaining integrity cannot be applied to the MSPO architecture. We therefore propose a mechanism (to be implemented in unison with the original model) to ensure the integrity of MSPOs on mobile sites. We then discuss the JASPO prototype, whose aim was to determine whether the Self Protecting Object model is feasible using current development technologies. We examine the requirements identified for the prototype to be successful and discuss how these were satisfied. Several modifications were made to the original SPO model, including the addition of a new module and the exclusion of others; we discuss these modifications and examine why they were necessary.
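    The core rule described here, that a modification is executed only if it does not violate the object's constraints, can be sketched in a few lines. This is a toy illustration of the principle only; the class name, state representation, and predicate-style constraints are assumptions, not the dissertation's actual design.

    ```python
    class SelfProtectingObject:
        """Toy sketch: apply a modification only if every integrity
        constraint still holds on the proposed new state."""

        def __init__(self, state, constraints):
            self.state = dict(state)
            self.constraints = constraints  # predicates over a candidate state

        def modify(self, changes):
            candidate = {**self.state, **changes}
            if all(check(candidate) for check in self.constraints):
                self.state = candidate
                return True
            return False  # violating modification rejected; integrity preserved

    spo = SelfProtectingObject({"balance": 100}, [lambda s: s["balance"] >= 0])
    assert spo.modify({"balance": 50})       # allowed: constraint holds
    assert not spo.modify({"balance": -10})  # rejected: would violate constraint
    assert spo.state["balance"] == 50        # state unchanged by rejected update
    ```

    On a mobile site, the same check would have to run locally while disconnected, which is why the dissertation needs an additional mechanism for MSPOs.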

    The Lung Image Database Consortium (LIDC): ensuring the integrity of expert-defined "truth"

    RATIONALE AND OBJECTIVES: Computer-aided diagnostic (CAD) systems fundamentally require the opinions of expert human observers to establish “truth” for algorithm development, training, and testing. The integrity of this “truth,” however, must be established before investigators commit to this “gold standard” as the basis for their research. The purpose of this study was to develop a quality assurance (QA) model as an integral component of the “truth” collection process concerning the location and spatial extent of lung nodules observed on computed tomography (CT) scans to be included in the Lung Image Database Consortium (LIDC) public database. MATERIALS AND METHODS: One hundred CT scans were interpreted by four radiologists through a two-phase process. For the first of these reads (the “blinded read phase”), radiologists independently identified and annotated lesions, assigning each to one of three categories: “nodule ≥ 3mm,” “nodule < 3mm,” or “non-nodule ≥ 3mm.” For the second read (the “unblinded read phase”), the same radiologists independently evaluated the same CT scans but with all of the annotations from the previously performed blinded reads presented; each radiologist could add marks, edit or delete their own marks, change the lesion category of their own marks, or leave their marks unchanged. The post-unblinded-read set of marks was grouped into discrete nodules and subjected to the QA process, which consisted of (1) identification of potential errors introduced during the complete image annotation process (such as two marks on what appears to be a single lesion or an incomplete nodule contour) and (2) correction of those errors. Seven categories of potential error were defined; any nodule with a mark that satisfied the criterion for one of these categories was referred to the radiologist who assigned that mark for either correction or confirmation that the mark was intentional. 
RESULTS: A total of 105 QA issues were identified across 45 (45.0%) of the 100 CT scans. Radiologist review resulted in modifications to 101 (96.2%) of these potential errors. Twenty-one lesions erroneously marked as lung nodules after the unblinded reads had this designation removed through the QA process. CONCLUSION: The establishment of "truth" must incorporate a QA process to guarantee the integrity of the datasets that will provide the basis for the development, training, and testing of CAD systems.
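    One of the error categories mentioned above, two marks on what appears to be a single lesion, lends itself to automatic detection. The sketch below is a hypothetical illustration of such a check using a simple distance threshold between marks from the same reader; the data layout, threshold value, and function name are assumptions, not the LIDC's actual QA criteria.

    ```python
    import math

    def flag_duplicate_marks(marks, threshold_mm=3.0):
        """Flag pairs of marks from the same reader that lie closer than
        `threshold_mm`, suggesting two marks on a single lesion."""
        issues = []
        for i in range(len(marks)):
            for j in range(i + 1, len(marks)):
                a, b = marks[i], marks[j]
                if a["reader"] != b["reader"]:
                    continue
                if math.dist(a["xyz"], b["xyz"]) < threshold_mm:
                    issues.append((a["id"], b["id"]))
        return issues

    marks = [
        {"id": 1, "reader": "R1", "xyz": (10.0, 10.0, 5.0)},
        {"id": 2, "reader": "R1", "xyz": (11.0, 10.5, 5.0)},  # ~1.1 mm apart
        {"id": 3, "reader": "R2", "xyz": (50.0, 40.0, 8.0)},
    ]
    assert flag_duplicate_marks(marks) == [(1, 2)]
    ```

    In the study's workflow, each flagged pair would be referred back to the radiologist who placed the marks for correction or confirmation.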

    A transportation security system applying RFID and GPS

    Purpose: This paper describes the development of a centralized, internet-based security tool that uses RFID and GPS technology to identify drivers and track load integrity. Design/methodology/approach: The system accomplishes the security testing in real time using the internet and the U.S. Customs database (ACE). A central database, together with the interfaces and communication between this database and ACE, will be established. After the vehicle is loaded, all openings of the tanker are sealed with disposable RFID tag seals. Findings/value: An RFID reader and GPS tracker wirelessly connected to the databases will serve as a testing ground for the implementation of security measures that can help prevent future terrorist attacks and help ensure that goods and products are not compromised while in transit. The system will also reduce the labor involved in security checks to a minimum.

    Improving data integrity and performance of Cryptographic Log Structured File Systems

    Modern file systems like CLFS (Cryptographic Log Structured File System) aim to provide security and confidentiality. Current deployments of such file systems do not ensure the integrity of the encrypted data stored on disk. Due to kernel bugs, race conditions, and arbitrary deadlocks, CLFS data on the disk can be damaged, and there is always the possibility that system users can modify the encrypted data. Our study aims at ensuring data integrity in CLFS without compromising overall performance. This paper considers the standard methods of file metadata checksumming in CLFS, with the main goal of overcoming one of its major limitations: the low performance of file-system checksumming. CLFS matches our performance expectations, as it performs close to non-cryptographic file systems. To improve the performance of the checksumming process, we study and examine various design choices and propose an in-kernel database for storing checksums and reducing checksum verification to once in N read requests.
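    The "verify once in N read requests" idea can be illustrated in user space. The sketch below is a hypothetical stand-in, with an in-memory dictionary playing the role of the proposed in-kernel database; the class, N value, and SHA-256 checksums are assumptions, not the paper's actual design.

    ```python
    import hashlib

    class ChecksumCache:
        """Sketch: fully verify a block's checksum only on every N-th read,
        caching trusted checksums (stand-in for the in-kernel database)."""

        def __init__(self, n):
            self.n = n
            self.reads = {}  # block_id -> number of reads seen so far
            self.known = {}  # block_id -> last verified checksum

        def read(self, block_id, data, stored_checksum):
            count = self.reads.get(block_id, 0)
            if count % self.n == 0:  # full verification only every N-th read
                digest = hashlib.sha256(data).hexdigest()
                if digest != stored_checksum:
                    raise ValueError(f"integrity violation on block {block_id}")
                self.known[block_id] = digest
            self.reads[block_id] = count + 1
            return data

    cache = ChecksumCache(n=4)
    payload = b"encrypted log segment"
    good = hashlib.sha256(payload).hexdigest()
    for _ in range(8):
        cache.read("blk-0", payload, good)  # verified on reads 1 and 5 only
    ```

    The trade-off is the one the paper studies: fewer hash computations per read at the cost of a bounded window in which a corrupted block could be served unverified.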

    An OCL-Based approach to derive constraint test cases for database applications

    The development of database applications in most CASE tools has been insufficient because most of these tools do not provide the software necessary to validate these applications. Validation means ensuring that a given application fulfils the user requirements. We suggest validating database applications by using the functional testing technique, a fundamental black-box testing technique for checking the software without being concerned about its implementation and structure. Our main contribution in this work is an MDA approach for deriving testing software from the OCL specification of the integrity constraints. This testing software is used to validate the database applications that enforce these constraints. The generated testing software includes three components: validation queries, test cases, and initial data inserted before the testing process. Our approach is implemented as an add-in tool in Rational Rose called OCL2TestSW. This work has been partially supported by the project Thuban: Natural Interaction Platform for Virtual Attending in Real Environments (TIN2008-02711), and also by the Spanish research project MA2VICMR: Improving the access, analysis and visibility of the multilingual and multimedia information in web for the Region of Madrid (S2009/TIC-1542).
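    The validation-query component can be illustrated with a tiny hypothetical translator: an OCL invariant such as `context Employee inv: self.salary > 0` becomes a SQL query that selects the rows violating it (an empty result means the constraint is enforced). This is a sketch of the general idea only; the function and the restriction to simple comparison invariants are assumptions, not OCL2TestSW's actual transformation.

    ```python
    # Negation table for simple comparison operators: the validation query
    # selects rows where the *negated* invariant holds, i.e. the violators.
    NEGATED = {">": "<=", ">=": "<", "<": ">=", "<=": ">", "=": "<>", "<>": "="}

    def derive_validation_query(context, attr, op, literal):
        """Turn 'context C inv: self.attr op literal' into a violator query."""
        return f"SELECT * FROM {context} WHERE {attr} {NEGATED[op]} {literal};"

    # OCL: context Employee inv: self.salary > 0
    query = derive_validation_query("Employee", "salary", ">", "0")
    assert query == "SELECT * FROM Employee WHERE salary <= 0;"
    ```

    The paper's other two components would pair such a query with test cases (tuples that should be accepted or rejected) and the initial data loaded before testing.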

    Failings in the Treatment of Electronic Signatures

    Original article can be found at: http://www.herts.ac.uk/courses/schools-of-study/law/hertfordshire-law-journal/home.cfm

    Ensuring Cyber-Security in Smart Railway Surveillance with SHIELD

    Modern railways feature increasingly complex embedded computing systems for surveillance that are moving towards fully wireless smart sensors. Those systems are aimed at monitoring system status from a physical-security viewpoint, in order to detect intrusions and other environmental anomalies. However, the same systems used for physical-security surveillance are vulnerable to cyber-security threats, since they feature distributed hardware and software architectures often interconnected by 'open networks', like wireless channels and the Internet. In this paper, we show how the integrated approach to Security, Privacy and Dependability (SPD) in embedded systems provided by the SHIELD framework (developed within the EU-funded pSHIELD and nSHIELD research projects) can be applied to railway surveillance systems in order to measure and improve their SPD level. SHIELD implements a layered architecture (node, network, middleware and overlay) and orchestrates SPD mechanisms based on ontology models, appropriate metrics and composability. The results of a prototypical application to a real-world demonstrator show the effectiveness of SHIELD and justify its practical applicability in industrial settings.

    From European integration to European integrity: case of Latvia, Lithuania, Poland

    Globalization creates a number of serious internal and external challenges for academic integrity that cannot be solved within the framework of a single country. Ensuring the effective operation of such a space is not possible without the integration of all its elements, including scientific ones, into a single system. Scientific integration in the European Union is fueled by the functioning of the single scientific space of the European Research Area. This system has already become an effective mechanism for overcoming issues relevant to academic integrity. However, the question is how effective this system is for countries that have recently joined the EU. Among such countries, we have chosen Latvia, Lithuania, and Poland, since they are all former members of the Warsaw Pact and thus have approximately similar problems in the academic community. The analysis was conducted using a dataset from the Scopus scientific database for two periods, 1995-2003 and 2004-2017. The findings show that the number of Scopus-indexed publications in Latvia, Lithuania, and Poland increased after EU integration. Moreover, most of the publications analysed issues connected with academic integrity in research. At the same time, citations of these countries' scientists also increased. The findings show that the impact of scientific integration into a single European scientific space exists, but its significance is negligible.