6,944 research outputs found

    A Robust Data Hiding Process Contributing to the Development of a Semantic Web

    No full text
    In this paper, a novel steganographic scheme based on chaotic iterations is proposed. This research work takes place within the information hiding framework and focuses more specifically on robust steganography. Steganographic algorithms can contribute to the development of a semantic web: media on the Internet can be enriched with information related to their contents, authors, etc., leading to better results for search engines that can process such tags. As media can be modified by users for various reasons, it is preferable that these embedded tags resist changes resulting from classical transformations such as cropping, rotation, image conversion, and so on. This is why a new robust watermarking scheme for semantic search engines is proposed in this document. For the sake of completeness, the robustness of this scheme is finally compared to established existing algorithms.
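The abstract does not detail the embedding procedure, but the general idea of chaos-driven steganography can be illustrated with a minimal sketch: a chaotic map (here the logistic map, a common choice, not necessarily the one used in the paper) generates a pseudo-random but reproducible sequence of pixel positions, and the payload bits are written into the least significant bits at those positions. The function names and parameters below are illustrative assumptions, not the authors' algorithm.

```python
def logistic(x, r=3.99):
    # One iteration of the logistic map; for r near 4 the orbit is chaotic.
    return r * x * (1 - x)

def embed(pixels, bits, seed=0.4):
    # pixels: list of 8-bit grey values; bits: list of 0/1 payload bits.
    out = list(pixels)
    x = seed
    used = set()
    for b in bits:
        # Iterate the chaotic map until it lands on an unused position.
        while True:
            x = logistic(x)
            pos = int(x * len(out)) % len(out)
            if pos not in used:
                used.add(pos)
                break
        out[pos] = (out[pos] & ~1) | b  # overwrite the least significant bit
    return out

def extract(pixels, nbits, seed=0.4):
    # Replaying the same chaotic orbit from the same seed recovers the
    # embedding positions, so only the seed needs to be shared.
    x = seed
    used = set()
    bits = []
    for _ in range(nbits):
        while True:
            x = logistic(x)
            pos = int(x * len(pixels)) % len(pixels)
            if pos not in used:
                used.add(pos)
                break
        bits.append(pixels[pos] & 1)
    return bits
```

    The seed plays the role of a secret key: without it, an observer cannot reconstruct the sequence of embedding positions. Note that plain LSB embedding as sketched here is fragile; the robustness to cropping and rotation claimed in the paper requires a more elaborate embedding domain.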

    Transparent authentication methodology in electronic education

    No full text
    In the context of on-line assessment in e-learning, a problem arises when a student taking an exam wishes to cheat by handing over personal credentials to someone else to take their place. Another problem is that there is no method for signing digital content as it is being produced in a computerized environment. Our proposed solution is to digitally sign the participant’s work by embedding voice samples in the transcript paper at regular intervals. In this investigation, we have demonstrated that a transparent steganographic methodology provides an innovative and practical solution for achieving continuous authentication in an online educational environment through the successful insertion and extraction of audio digital signatures.
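The abstract describes binding voice samples to transcript content at regular intervals. A minimal sketch of the continuous-authentication idea, assuming a hash-chain construction that the abstract does not specify (the pairing of chunks with samples and the chaining are illustrative choices, not the paper's method):

```python
import hashlib

def sign_transcript(chunks, audio_samples):
    # Pair each transcript chunk with a voice sample and chain-hash them,
    # so tampering with any chunk invalidates all subsequent signatures.
    signed = []
    prev = b""
    for text, audio in zip(chunks, audio_samples):
        h = hashlib.sha256(prev + text.encode() + audio).hexdigest()
        signed.append((text, h))
        prev = h.encode()
    return signed

def verify(signed, audio_samples):
    # Recompute the chain; any mismatch means the transcript or the
    # accompanying voice samples were altered after signing.
    prev = b""
    for (text, h), audio in zip(signed, audio_samples):
        if hashlib.sha256(prev + text.encode() + audio).hexdigest() != h:
            return False
        prev = h.encode()
    return True
```

    Because each interval's signature depends on the captured voice sample, verification both authenticates the content and ties it to the speaker present during the exam.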

    User-Friendly Ontology Creation Methodologies - A Survey

    Get PDF
    The convergence of the semantic web and the social web into the social semantic web leads to new challenges in ontology engineering. As of today, ontologies are created by highly specialized ontology engineers. In order to unite the wisdom of crowds with ontologies, new approaches to ontology creation are necessary to bridge the ontology gap and enable novice users to create formalized knowledge. Numerous ontology development methodologies are available; in this paper we briefly present 16 ontology development methodologies and evaluate them against criteria for their user-friendliness and their suitability for use by novice users and domain experts. Our eventual goal is the identification of best practices and required research for user-friendly ontology design.

    A Unified Forensics Analysis Approach to Digital Investigation

    Get PDF
    Digital forensics is now essential in addressing cybercrime and cyber-enabled crime but potentially it can have a role in almost every other type of crime. Given technology's continuous development and prevalence, the widespread adoption of technologies among society and the subsequent digital footprints that exist, the analysis of these technologies can help support investigations. The abundance of interconnected technologies and telecommunication platforms has significantly changed the nature of digital evidence. Subsequently, the nature and characteristics of digital forensic cases involve an enormous volume of data heterogeneity, scattered across multiple evidence sources, technologies, applications, and services. It is indisputable that the outspread and connections between existing technologies have raised the need to integrate, harmonise, unify and correlate evidence across data sources in an automated fashion. Unfortunately, the current state of the art in digital forensics leads to siloed approaches focussed upon specific technologies or support of a particular part of digital investigation. Due to this shortcoming, the digital investigator examines each data source independently, trawls through interconnected data across various sources, and often has to conduct data correlation manually, thus restricting the digital investigator’s ability to answer high-level questions in a timely manner with a low cognitive load. Therefore, this research paper investigates the limitations of the current state of the art in the digital forensics discipline and categorises common investigation crimes with the necessary corresponding digital analyses to define the characteristics of the next-generation approach. 
Based on these observations, it discusses the future capabilities of the next-generation unified forensics analysis tool (U-FAT), with a workflow example that illustrates data unification, correlation and visualisation processes within the proposed method.
