
    Sensor Data Integrity Verification for Real-time and Resource Constrained Systems

    Sensors touch many aspects of our lives and have become an integral part of modern life. They are used to build intelligent control systems across industries such as healthcare, transportation, consumer electronics, and the military. Many mission-critical applications require sensor data to be secure and authentic. Sensor data security can be achieved with traditional solutions such as cryptography and digital signatures, but these techniques are computationally intensive and cannot easily be applied to resource-constrained systems. Low-complexity data hiding techniques, by contrast, are easy to implement and do not need substantial processing power or memory. In this applied research, we adopt and configure established low-complexity data hiding techniques from the multimedia forensics domain to secure sensor data transmissions in resource-constrained, real-time environments such as an autonomous vehicle. We identify the areas in an autonomous vehicle that require sensor data integrity, propose suitable watermarking techniques to verify the integrity of the data, and evaluate the performance of the proposed method against different attack vectors. In our proposed method, sensor data is embedded with application-specific metadata, and this process introduces some distortion. We analyze this embedding-induced distortion and its impact on overall sensor data quality, and conclude that watermarking techniques, when properly configured, can solve sensor data integrity verification problems in an autonomous vehicle. Ph.D. dissertation, College of Engineering & Computer Science, University of Michigan-Dearborn. http://deepblue.lib.umich.edu/bitstream/2027.42/167387/3/Raghavendar Changalvala Final Dissertation.pdf
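
    To make the embedding idea concrete, here is a minimal sketch (Python) of hiding metadata bits in the low-order bits of integer sensor samples and measuring the embedding-induced distortion. The 2-bit payload width, the sample values, and the metadata layout are assumptions for illustration only, not the dissertation's actual configuration or watermarking scheme.

# Illustrative sketch: low-complexity metadata embedding into integer sensor
# samples, plus a measurement of the distortion the embedding introduces.
def embed_metadata(samples, meta_bits, bits_per_sample=2):
    """Overwrite the lowest `bits_per_sample` bits of each sample with metadata bits."""
    mask = (1 << bits_per_sample) - 1
    marked, i = [], 0
    for s in samples:
        payload = 0
        for _ in range(bits_per_sample):
            payload = (payload << 1) | meta_bits[i % len(meta_bits)]
            i += 1
        marked.append((s & ~mask) | payload)
    return marked

def extract_metadata(samples, n_bits, bits_per_sample=2):
    bits = []
    for s in samples:
        for k in reversed(range(bits_per_sample)):
            bits.append((s >> k) & 1)
            if len(bits) == n_bits:
                return bits
    return bits

def mse(original, watermarked):
    return sum((a - b) ** 2 for a, b in zip(original, watermarked)) / len(original)

# Example with made-up 16-bit intensity samples and 8 metadata bits.
raw = [40213, 40556, 39981, 41000]
meta = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_metadata(raw, meta)
assert extract_metadata(marked, len(meta)) == meta
print("embedding MSE:", mse(raw, marked))

    The same distortion measurement (mean squared error, or a signal-to-noise ratio derived from it) is what lets one judge whether a given embedding configuration keeps the sensor data usable for the downstream application.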

    Data Hiding and Its Applications

    Data hiding techniques have been widely used to provide copyright protection, data integrity, covert communication, non-repudiation, and authentication, among other applications. In the context of the increased dissemination and distribution of multimedia content over the internet, data hiding methods, such as digital watermarking and steganography, are becoming increasingly relevant in providing multimedia security. The goal of this book is to focus on the improvement of data hiding algorithms and their different applications (both traditional and emerging), bringing together researchers and practitioners from different research fields, including data hiding, signal processing, cryptography, and information theory, among others.

    Towards Secure Online Distribution of Multimedia Codestreams


    Advanced Cryptographic Techniques for Protecting Log Data

    This thesis examines cryptographic techniques providing security for computer log files. It focuses on ensuring authenticity and integrity, i.e. the properties of having been created by a specific entity and being unmodified. Confidentiality, the property of being unknown to unauthorized entities, will be considered, too, but with less emphasis. Computer log files are recordings of actions performed and events encountered in computer systems. As the complexity of computer systems steadily grows, it becomes increasingly difficult to predict how a given system will behave under certain conditions, or to retrospectively reconstruct and explain which events and conditions led to a specific behavior. Computer log files help to mitigate the problem of retracing a system’s behavior by providing a (usually chronological) view of events and actions encountered in a system. Authenticity and integrity of computer log files are widely recognized security requirements, see e.g. [Latham, ed., "Department of Defense Trusted Computer System Evaluation Criteria", 1985, p. 10], [Kent and Souppaya, "Guide to Computer Security Log Management", NIST Special Publication 800-92, 2006, Section 2.3.2], [Guttman and Roback, "An Introduction to Computer Security: The NIST Handbook", superseded NIST Special Publication 800-12, 1995, Section 18.3.1], [Nieles et al., "An Introduction to Information Security", NIST Special Publication 800-12, 2017, Section 9.3], [Common Criteria Editorial Board, ed., "Common Criteria for Information Technology Security Evaluation", Part 2, Section 8.6]. Two commonly cited ways to ensure integrity of log files are to store log data on so-called write-once-read-many-times (WORM) drives and to immediately print log records on a continuous-feed printer. This guarantees that log data cannot be retroactively modified by an attacker without physical access to the storage medium. However, such special-purpose hardware may not always be a viable option for the application at hand, for example because it may be too costly. In such cases, the integrity and authenticity of log records must be ensured via other means, e.g. with cryptographic techniques. Although these techniques cannot prevent the modification of log data, they can offer strong guarantees that modifications will be detectable, while being implementable in software. Furthermore, cryptography can be used to achieve public verifiability of log files, which may be needed in applications that have strong transparency requirements. Cryptographic techniques can even be used in addition to hardware solutions, providing protection against attackers who do have physical access to the logging hardware, such as insiders. Cryptographic schemes for protecting stored log data need to be resilient against attackers who obtain control over the computer storing the log data. If this computer operates in a standalone fashion, it is an absolute requirement for the cryptographic schemes to offer security even in the event of a key compromise. As this is impossible with standard cryptographic tools, cryptographic solutions for protecting log data typically make use of forward-secure schemes, guaranteeing that changes to log data recorded in the past can be detected. Such schemes use a sequence of authentication keys instead of a single one, where previous keys cannot be computed efficiently from later ones.
This thesis considers the following requirements for, and desirable features of, cryptographic logging schemes: 1) security, i.e. the ability to reliably detect violations of integrity and authenticity, including detection of log truncations, 2) efficiency regarding both computational and storage overhead, 3) robustness, i.e. the ability to verify unmodified log entries even if others have been illicitly changed, and 4) verifiability of excerpts, including checking an excerpt for omissions. The goals of this thesis are to devise new techniques for the construction of cryptographic schemes that provide security for computer log files, to give concrete constructions of such schemes, to develop new models that can accurately capture the security guarantees offered by the new schemes, as well as to examine the security of previously published schemes. This thesis demands that cryptographic schemes for securely storing log data must be able to detect if log entries have been deleted from a log file. A special case of deletion is log truncation, where a continuous subsequence of log records from the end of the log file is deleted. Obtaining truncation resistance, i.e. the ability to detect truncations, is one of the major difficulties when designing cryptographic logging schemes. This thesis alleviates this problem by introducing a novel technique to detect log truncations without the help of third parties or designated logging hardware. Moreover, this work presents new formal security notions capturing truncation resistance. The technique mentioned above is applied to obtain cryptographic logging schemes which can be shown to satisfy these notions under mild assumptions, making them the first schemes with formally proven truncation security. Furthermore, this thesis develops a cryptographic scheme for the protection of log files which can support the creation of excerpts. For this thesis, an excerpt is a (not necessarily contiguous) subsequence of records from a log file. Excerpts created with the scheme presented in this thesis can be publicly checked for integrity and authenticity (as explained above) as well as for completeness, i.e. the property that no relevant log entry has been omitted from the excerpt. Excerpts provide a natural way to preserve the confidentiality of information that is contained in a log file, but not of interest for a specific public analysis of the log file, enabling the owner of the log file to meet confidentiality and transparency requirements at the same time. The scheme demonstrates and exemplifies the technique for obtaining truncation security mentioned above. Since cryptographic techniques to safeguard log files usually require authenticating log entries individually, some researchers [Ma and Tsudik, "A New Approach to Secure Logging", LNCS 5094, 2008; Ma and Tsudik, "A New Approach to Secure Logging", ACM TOS 2009; Yavuz and Peng, "BAF: An Efficient Publicly Verifiable Secure Audit Logging Scheme for Distributed Systems", ACSAC 2009] have proposed using aggregatable signatures [Boneh et al., "Aggregate and Verifiably Encrypted Signatures from Bilinear Maps", EUROCRYPT 2003] in order to reduce the overhead in storage space incurred by using such a cryptographic scheme. Aggregation of signatures refers to some “combination” of any number of signatures (for distinct or equal messages, by distinct or identical signers) into an “aggregate” signature. 
The size of the aggregate signature should be less than the total of the sizes of the original signatures, ideally the size of one of the original signatures. Using aggregation of signatures in applications that require storing or transmitting a large number of signatures (such as the storage of log records) can lead to significant reductions in the use of storage space and bandwidth. However, aggregating the signatures for all log records into a single signature will cause some fragility: the modification of a single log entry will render the aggregate signature invalid, preventing the cryptographic verification of any part of the log file. Yet being able to distinguish manipulated log entries from non-manipulated ones may be of importance for after-the-fact investigations. This thesis addresses this issue by presenting a new technique providing a trade-off between storage overhead and robustness, i.e. the ability to tolerate some modifications to the log file while preserving the cryptographic verifiability of unmodified log entries. This robustness is achieved by the use of a special kind of aggregate signatures (called fault-tolerant aggregate signatures), which contain some redundancy. The construction makes use of combinatorial methods guaranteeing that if the number of errors is below a certain threshold, then there will be enough redundancy to identify and verify the non-modified log entries. Finally, this thesis presents a total of four attacks on three different schemes from the literature intended for securely storing log files [Yavuz et al., "Efficient, Compromise Resilient and Append-Only Cryptographic Schemes for Secure Audit Logging", Financial Cryptography 2012; Ma, "Practical Forward Secure Sequential Aggregate Signatures", ASIACCS 2008]. The attacks allow for virtually arbitrary log file forgeries or even recovery of the secret key used for authenticating the log file, which could then be used for mostly arbitrary log file forgeries, too. All of these attacks exploit weaknesses of the specific schemes. Three of the attacks presented here contradict the security properties of the schemes claimed and supposedly proven by the respective authors. This thesis briefly discusses these proofs and points out their flaws. The fourth attack presented here is outside of the security model considered by the scheme’s authors, but nonetheless presents a realistic threat. In summary, this thesis advances the scientific state of the art with regard to providing security for computer log files in a number of ways: by introducing a new technique for obtaining security against log truncations, by providing the first scheme where excerpts from log files can be verified for completeness, by describing the first scheme that can achieve some notion of robustness while being able to aggregate log record signatures, and by analyzing the security of previously proposed schemes.
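
As a rough illustration of the forward-secure key evolution described above, the sketch below (Python) authenticates each log entry under the current key and then replaces the key with a one-way hash of itself, erasing the old key, so an attacker who later compromises the machine cannot forge or alter earlier entries undetectably. The hash and MAC choices and the record format are assumptions; this is not one of the schemes analyzed or constructed in the thesis, and it deliberately omits truncation resistance, excerpts, and signature aggregation.

# Minimal sketch of forward-secure log authentication with an evolving MAC key.
import hashlib
import hmac

def evolve(key: bytes) -> bytes:
    """One-way key update: the previous key cannot be recovered from the new one."""
    return hashlib.sha256(b"evolve" + key).digest()

class ForwardSecureLogger:
    def __init__(self, initial_key: bytes):
        self._key = initial_key
        self._index = 0
        self.records = []  # list of (index, entry, tag)

    def append(self, entry: bytes):
        msg = self._index.to_bytes(8, "big") + entry
        tag = hmac.new(self._key, msg, hashlib.sha256).digest()
        self.records.append((self._index, entry, tag))
        self._key = evolve(self._key)  # the old key is discarded ("erased")
        self._index += 1

def verify(initial_key: bytes, records) -> bool:
    """A verifier holding the initial key replays the key evolution."""
    key = initial_key
    for expected_index, (index, entry, tag) in enumerate(records):
        msg = index.to_bytes(8, "big") + entry
        expected = hmac.new(key, msg, hashlib.sha256).digest()
        if index != expected_index or not hmac.compare_digest(tag, expected):
            return False
        key = evolve(key)
    return True

Note that a plain chain like this cannot by itself detect deletion of the most recent records (truncation), which is precisely one of the gaps the thesis addresses.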

    Verifying RADAR Data Using Two-Dimensional QIM-based Data Hiding

    Modern vehicles have evolved to support advanced internal networks connecting System Based Chips (SBC), System in a Package (SiP) solutions, or traditional microcontrollers, fostering an electronic ecosystem for high-speed data transfer, precision, and real-time control. The Controller Area Network (CAN) is widely adopted as the backbone of the internal vehicle communication infrastructure. Automotive applications such as ADAS, autonomous driving, battery management systems, powertrain systems, telematics, and infotainment all utilize CAN transmissions directly or through gateway management. These network transmissions lack robust integrity verification mechanisms to validate authentic data payloads, making them vulnerable to packet replay, spoofing, insertion, deletion, and denial-of-service attacks. Other methods, such as traditional cryptography, can secure network data, but they increase computational complexity, processing latency, and overall system cost. This thesis proposes a robust, lightweight, and adaptive solution to validate the authenticity of automotive sensor data carried over the CAN protocol. We propose using a two-dimensional Quantization Index Modulation (QIM) data hiding technique to create a means of verification. The proposed framework is analyzed in a sensor transmission scenario for RADAR sensors in an autonomous vehicle setting. The detection and effects of distortion on the application are tested through the implementation of sensor fusion algorithms, and the results are observed and analyzed. The proposed framework maintains transmission integrity without compromising data quality and with low design complexity. It could also be applied to different network architectures, and its operational scope could be extended to more abstract types of data. M.S.E. thesis, Electrical Engineering, College of Engineering & Computer Science, University of Michigan-Dearborn. http://deepblue.lib.umich.edu/bitstream/2027.42/167354/1/Brandon Fedoruk - Final Thesis.pd
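
    As a hedged sketch of what a two-dimensional QIM step can look like (Python), the code below embeds one verification bit into a (range, velocity) pair by snapping the pair onto one of two dithered quantizer lattices, and recovers the bit by choosing whichever lattice is closer to the received pair. The quantization step DELTA, the pairing of RADAR fields, and the one-bit-per-pair layout are assumptions for illustration, not the configuration used in the thesis.

# Hedged sketch of two-dimensional QIM embedding/extraction for a pair of
# RADAR measurements. DELTA and the (range, velocity) pairing are assumptions.
DELTA = 0.5   # quantization step: larger means more robust but more distortion

def _quantize(value, dither):
    return DELTA * round((value - dither) / DELTA) + dither

def qim_embed(pair, bit):
    """Embed one bit by snapping both coordinates onto the lattice selected by the bit."""
    dither = 0.0 if bit == 0 else DELTA / 2.0
    return tuple(_quantize(v, dither) for v in pair)

def qim_extract(pair):
    """Recover the bit by choosing the lattice closer to the received pair."""
    d0 = sum((v - _quantize(v, 0.0)) ** 2 for v in pair)
    d1 = sum((v - _quantize(v, DELTA / 2.0)) ** 2 for v in pair)
    return 0 if d0 <= d1 else 1

# Example: embed a verification bit into a (range [m], velocity [m/s]) sample.
clean = (37.83, -4.12)
marked = qim_embed(clean, 1)
assert qim_extract(marked) == 1
noisy = (marked[0] + 0.06, marked[1] - 0.09)   # small channel/sensor noise
assert qim_extract(noisy) == 1                  # survives noise below roughly DELTA/4

    Larger values of DELTA let the embedded bit survive more noise at the cost of more embedding distortion, which is the basic quality-versus-robustness trade-off in QIM data hiding.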

    Command & Control: Understanding, Denying and Detecting - A review of malware C2 techniques, detection and defences

    In this survey, we first briefly review the current state of cyber attacks, highlighting significant recent changes in how and why such attacks are performed. We then investigate the mechanics of malware command and control (C2) establishment: we provide a comprehensive review of the techniques used by attackers to set up such a channel and to hide its presence from the attacked parties and the security tools they use. We then switch to the defensive side of the problem and review approaches that have been proposed for the detection and disruption of C2 channels. We also map such techniques to widely adopted security controls, emphasizing gaps or limitations (and success stories) in current best practices. Comment: Work commissioned by CPNI, available at c2report.org. 38 pages. Listing abstract compressed from the version appearing in the report.

    Cryptographic Primitives from Physical Variables

    In this dissertation we explore a new paradigm emerging from the subtleties of cryptographic implementations and relating to theoretical aspects of cryptography. This new paradigm, namely physical variables (PVs), describes properties of physical objects that are designed to be identical but, due to manufacturing variability, are not. In the first part of this dissertation, we focus our attention on scenarios that require the unique identification of physical objects and show how Gaussian PVs can be used to fulfill such a requirement. Using this framework we present and analyze a new technique for fingerprinting compact discs (CDs) using the manufacturing variability found in the length of the CDs' lands and pits. Although the variability measured is on the order of 20 nm, the technique does not require the use of microscopes or any advanced equipment. Instead, the electrical signal produced by the photo-detector inside the CD reader is sufficient to measure the desired variability. We thoroughly investigate the new technique by analyzing data collected from 100 identical CDs and show how to extract a unique fingerprint for each CD. In the second part, we shift our attention to physically parameterized functions (PPFs). Although all the constructions we provide are centered around delay-based physically unclonable functions (PUFs), we stress that the term PUF can be misleading, as most circuits labeled PUFs are in reality clonable at the protocol level. We argue that using a term like PPF to describe functions parameterized by a PV is a more accurate description. Herein, we thoroughly analyze delay-based PUFs and use a mathematical framework to construct two authentication protocols, labeled PUF-HB and HB+PUF. Both protocols merge the known HB authentication family with delay-based PUFs. The new protocols enjoy the security reduction put forth by the HB portion of the protocol and at the same time maintain a level of hardware security provided by the use of PUFs. We present a proof-of-concept implementation of HB+PUF which takes advantage of the PUF circuit to produce the random bits typically needed for an HB-based authentication scheme. The overall circuit is shown to occupy a few thousand gates. Finally, we present a new authentication protocol that uses 2-level PUF circuits and enables a security reduction which, unlike the previous two protocols, stems naturally from the usage of PVs.
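
    As an informal illustration of how a Gaussian physical variable can identify an object, the sketch below (Python) averages several noisy reads of per-location length measurements, thresholds them against a disc-wide reference to obtain a binary fingerprint, and matches fingerprints by fractional Hamming distance. The thresholding rule, the read count, and the 0.25 match threshold are assumptions for this sketch, not the extractor developed in the dissertation.

# Hypothetical sketch of fingerprint extraction from noisy Gaussian physical
# variables (e.g. CD land/pit length deviations). Thresholds are illustrative.
from statistics import mean

def extract_fingerprint(reads):
    """reads: several measurement vectors obtained by re-reading the same disc."""
    averaged = [mean(loc) for loc in zip(*reads)]   # average out read noise per location
    reference = mean(averaged)                      # disc-wide reference level
    return [1 if v > reference else 0 for v in averaged]

def fractional_hamming(fp_a, fp_b):
    return sum(a != b for a, b in zip(fp_a, fp_b)) / len(fp_a)

def same_object(fp_a, fp_b, threshold=0.25):
    """Re-reads of one disc should land well below the threshold; distinct
    discs should sit near 0.5 (assumed threshold for this sketch)."""
    return fractional_hamming(fp_a, fp_b) < threshold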

    Emerging Informatics

    This book on emerging informatics brings together new concepts and applications that help define and outline problem-solving methods and design features for business and human systems. It covers international aspects of information systems design, introducing many relevant technologies for the benefit of human and business systems. This initiative can be viewed as an emergent area of informatics that helps better conceptualise and design new world-class solutions. The book is organized into four flexible sections comprising a total of fourteen chapters, each section addressing learning contexts in emerging fields. Each chapter presents a clear treatment of a problem and its applicable technological solutions. I hope this will encourage further exploration of knowledge in the informatics discipline.