
    Enabling Program Analysis Through Deterministic Replay and Optimistic Hybrid Analysis

    As software continues to evolve, software systems increase in complexity. With systems composed of many distinct but interacting components, today's programmers, users, and administrators find themselves requiring automated ways to find, understand, and handle system misbehavior. Recent incidents such as the Equifax breach of 2017 and the Heartbleed vulnerability of 2014 show the need to understand and debug prior states of computer systems. In this thesis I focus on enabling practical entire-system retroactive analysis, allowing programmers, users, and system administrators to diagnose and understand the impact of such devastating mishaps. I focus primarily on two techniques: first, a novel deterministic record and replay system which enables fast, practical recollection of entire-system state; second, optimistic hybrid analysis, a novel optimization method capable of dramatically accelerating retroactive program analysis.

    Record and replay systems greatly aid in solving a variety of problems, such as fault tolerance, forensic analysis, and information provenance. These solutions, however, assume ubiquitous recording of any application which may have a problem, and current record and replay systems are forced to trade off disk space against replay speed. This trade-off has historically made it impractical to both record and replay large histories of system-level computation. I present Arnold, a novel record and replay system which efficiently records years of computation on a commodity hard drive and can efficiently replay any recorded computation. Arnold combines caching with a unique process-group granularity of recording to produce recordings that are both small and quickly recalled. My experiments show that, under a desktop workload, Arnold could store four years of computation on a commodity 4TB hard drive.

    Dynamic analysis is used to retroactively identify and address many forms of system misbehavior, including programming errors, data races, private information leakage, and memory errors. Unfortunately, the runtime overhead of dynamic analysis has precluded its adoption in many instances. I present a new dynamic analysis methodology called optimistic hybrid analysis (OHA). OHA uses knowledge of the past to predict program behaviors in the future. These predictions, or likely invariants, are speculatively assumed true by a static analysis, yielding a static analysis that can be far more accurate than its traditional counterpart. This predicated static analysis is then speculatively used to optimize a final dynamic analysis, creating a far more efficient dynamic analysis than is otherwise possible. I demonstrate the effectiveness of OHA by creating an optimistic hybrid backward slicer, OptSlice, and an optimistic data-race detector, OptFT. OptSlice and OptFT are just as accurate as their traditional hybrid counterparts, but run on average 8.3x and 1.6x faster, respectively.

    In this thesis I demonstrate that Arnold's ability to record and replay entire computer systems, combined with optimistic hybrid analysis's ability to quickly analyze prior computation, enables a practical and useful entire-system retroactive analysis that was previously unrealized.

    PhD, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    https://deepblue.lib.umich.edu/bitstream/2027.42/144052/1/ddevec_1.pd
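
    The abstract describes deterministic record and replay only at a high level. As a minimal conceptual sketch (not Arnold's actual design; all names here are hypothetical), a recorder can log the result of every nondeterministic input so that a replayer can later feed the same values back and reproduce the execution exactly:

```python
import json
import random
import time


class Recorder:
    """Logs the result of every nondeterministic call for later replay."""

    def __init__(self):
        self.log = []

    def record(self, source, value):
        self.log.append({"source": source, "value": value})
        return value

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.log, f)


class Replayer:
    """Feeds logged values back in order, reproducing the original run."""

    def __init__(self, path):
        with open(path) as f:
            self.log = iter(json.load(f))

    def record(self, source, _live_value=None):
        entry = next(self.log)
        assert entry["source"] == source, "replay diverged from recording"
        return entry["value"]  # ignore the live value, return the logged one


def workload(ctx):
    # All nondeterminism is routed through ctx.record, so a replayed
    # run sees exactly the inputs the recorded run saw.
    seed = ctx.record("time", time.time_ns())
    rng = random.Random(seed)
    return [rng.randint(0, 9) for _ in range(5)]


rec = Recorder()
original = workload(rec)
rec.save("trace.json")

replayed = workload(Replayer("trace.json"))
assert replayed == original  # deterministic replay
```

    A production system like Arnold records at the process-group and system-call level rather than through an explicit context object, but the principle, log nondeterminism once, replay it cheaply many times, is the same.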
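    Similarly, a heavily simplified sketch of the optimistic hybrid analysis idea (hypothetical names; not OptSlice or OptFT code): profile past runs to learn a likely invariant, let a specialized analysis elide checks wherever the invariant is assumed, and fall back to the full, conservative analysis when the assumption is violated at runtime:

```python
def learn_likely_invariant(past_inputs):
    """Profiling phase: guess an invariant from previously recorded runs.
    Here the (hypothetical) invariant is simply 'inputs are non-negative'."""
    return all(x >= 0 for x in past_inputs)


def full_analysis(x):
    """Stand-in for a fully instrumented dynamic analysis (always sound)."""
    checks_run = 100  # imagine heavy per-instruction bookkeeping
    return x * x, checks_run


def optimized_analysis(x):
    """Analysis specialized by a static pass that assumed the invariant,
    so most runtime checks were proven unnecessary and removed."""
    checks_run = 3
    return x * x, checks_run


def optimistic_run(x, invariant_predicted):
    # A cheap runtime guard verifies the speculative assumption.
    if invariant_predicted and x >= 0:
        return optimized_analysis(x)  # common, fast case
    # Misspeculation (or no prediction): fall back to the sound analysis.
    return full_analysis(x)


predicted = learn_likely_invariant([3, 7, 0, 12])  # invariant held in the past
print(optimistic_run(5, predicted))   # fast path, few checks
print(optimistic_run(-2, predicted))  # violation, full analysis
```

    Real misspeculation handling is more involved (the thesis pairs OHA with replay so a violated assumption can be re-analyzed conservatively), but the speculate, verify, and fall back structure above is the core idea.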

    Accomplishing Technical and Investigative Expertise in Everyday Crime Scene Investigation

    This research, situated at the intersection of sociology, science and technology studies, and police studies, provides the first sociological account of Crime Scene Investigator (CSI) training in England and Wales. Focusing on the acquisition and everyday enactment of CSI expertise, this qualitative, ethnographic investigation asks (1) what the roles, practices and expertise of the CSI are, and (2) how the CSI's expertise is developed in training and enacted in everyday work. These questions are explored through participant observation at the main training centre for UK CSIs, observation at crime scenes, interviews with trainees during and after their training, and visual methods. By unpicking the visible and invisible components of CSI work, I analyse how CSIs are trained to document crime scenes and explore the practices of transforming potentially relevant objects from these locations into artefacts that meet the requirements of courtroom scrutiny. I demonstrate how CSIs engage actively and reflexively with the requirements of different conceptions of objectivity and the changing demands placed on them. They continually and performatively negotiate and delimit multiple boundaries, from the very literal work of demarcating a crime scene to claiming their position within the investigative hierarchy in each interaction. Unlike other discussions of boundary work, for the CSI this is iterative, requires constant effort and is embedded in routine practice. Within police environments, the CSI has scope for such boundary work; in the courtroom, however, crime scene investigation is narrowly defined. This thesis develops our understanding of the CSI and crime scene investigation as a practice. It stresses the significance of taking this actor seriously in any account of forensic science and investigative practices. By viewing the CSI as simply an evidence collector, or not considering her work at all, the expertise and pivotal role of this actor in the meaningful and efficient use of science in policing is black-boxed. My detailed qualitative analysis of the CSI's role, work and specialist expertise contributes a necessary account of a key actor in the police and criminal justice system.

    Digital forensics: an integrated approach for the investigation of cyber/computer related crimes

    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy.

    Digital forensics has become a predominant field in recent times, and courts have had to deal with an influx of related cases over the past decade. As computer/cyber-related criminal attacks become more prevalent in today's technologically driven society, the need for, and use of, digital evidence in courts has increased. There is an urgent need to hold the perpetrators of such crimes accountable and to prosecute them successfully. The process used to acquire this digital evidence (to be used in court cases) is digital forensics. The procedures currently used in the digital forensic process were developed with a focus on particular areas of the digital evidence acquisition process. As a result, little regard has been paid to the core components of the digital forensics field, for example the legal and ethical aspects, along with other integral aspects of investigations as a whole. These core facets are important for a number of reasons, including the fact that other forensic sciences have included them; to survive as a true forensic discipline, digital forensics must ensure that they are accounted for. Like other forensic disciplines, digital forensics must ensure that the evidence it produces is able to withstand the rigours of a courtroom. Digital forensics is a new and developing field, still in its infancy when compared to traditional forensic fields such as botany or anthropology. Over the years, development in the field has been tool-centred, driven by the commercial developers of the tools used in the digital investigative process. This, along with the absence of set standards to guide practitioners operating in the field, has led to issues regarding the reliability, verifiability and consistency of digital evidence when presented in court. Additionally, some developers have neglected the fact that the mere mention of the word forensics suggests courts of law, and thus that legal practitioners will be intimately involved. Such omissions have resulted in digital evidence facing major challenges when presented in a number of cases. Mitigating such issues is possible with the development of a standard set of methodologies flexible enough to accommodate the intricacies of all fields to be considered when dealing with digital evidence. This thesis addresses issues regarding digital forensics frameworks, methods, methodologies and standards for acquiring digital evidence, using the grounded theory approach. Data were gathered electronically through literature surveys, questionnaires and interviews; collecting data electronically proved useful given the need to gather data from different jurisdictions worldwide. Initial surveys indicated that there were no existing standards in place and that the terms models/frameworks and methodologies were used interchangeably to refer to methodologies. A framework and methodology have been developed to address the identified issues and represent the major contribution of this research. The dissertation outlines solutions to the identified issues and presents the 2IR Framework of standards, which governs the 2IR Methodology, supported by a mobile application and a curriculum of studies. These designs were developed using an integrated approach incorporating all four core facets of the digital forensics field. This research lays the foundation for a single integrated approach to digital forensics and can be further developed to ensure the robustness of the processes and procedures used by digital forensics practitioners worldwide.

    Human and Black Bear Interactions in Buncombe County, North Carolina, from 1993–2013

    Over the past 20 years, the frequency of interactions between humans and black bears in Buncombe County, North Carolina has been increasing, posing threats to human safety, black bear populations, ecological stability, and support for conservation. During this time, both the human population and the American black bear population increased in southern Appalachia, which, combined with urban expansion and landscape fragmentation, led to an increase in human and black bear interactions. Reducing future interactions is important because such encounters put support for conservation at risk. I performed a landscape analysis to better understand where human and black bear interactions occurred in this county from 1993 to 2013. Statistical analyses indicated that landscape fragmentation and urban characteristics likely played a role in where interactions took place: human population density, the proportion of forested landscape per block group, urban edge density, and the effective forest mesh size per census tract all had statistically significant relationships with the geographic distribution of interactions. This research can assist planning and conservation initiatives that aim to reduce human and wildlife interactions, and it contributes to the growing literature on human and wildlife interactions and the spatial analysis techniques employed to understand them.
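
    The abstract does not name the specific statistical model used. Purely as an illustrative sketch of how such a relationship test might be set up, a logistic regression relating interaction occurrence to the landscape metrics listed above could look like the following (synthetic data and hypothetical variable names throughout):

```python
# Illustrative only: the thesis does not specify this model. Synthetic
# data stand in for the real block-group / census-tract measurements.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # hypothetical census units

# Hypothetical predictors named after those in the abstract.
X = np.column_stack([
    rng.lognormal(5, 1, n),   # human population density
    rng.uniform(0, 1, n),     # proportion of forested landscape
    rng.gamma(2, 10, n),      # urban edge density
    rng.gamma(3, 50, n),      # effective forest mesh size
])

# Synthetic outcome: 1 if a human-bear interaction was recorded.
score = (0.3 * np.log(X[:, 0]) - 2.0 * X[:, 1]
         + 0.01 * X[:, 2] - 0.002 * X[:, 3])
y = rng.binomial(1, 1 / (1 + np.exp(-(score - score.mean()))))

# Fit and inspect coefficient signs and p-values for each predictor.
model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.summary())
```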

    GRS 2023 Program Booklet


    Identity Politics and the New Genetics

    Racial and ethnic categories have appeared in recent scientific work in novel ways and in relation to a variety of disciplines: medicine, forensics, population genetics and also developments in popular genealogy. Once again, biology is foregrounded in the discussion of human identity. Of particular importance are the preoccupation with origins and personal discovery and the increasing use of racial and ethnic categories in social policy. This new genetic knowledge, expressed in technology and practice, has the potential to disrupt how race and ethnicity are debated, managed and lived. The contributors include medical researchers, anthropologists, historians of science and sociologists of race relations; together, they explore the new and challenging landscape where biology becomes the stuff of identity.

    Centennial Symposia Abstracts


    Individual Differences in Speech Production and Perception

    Inter-individual variation in speech is a topic of increasing interest in both the human sciences and speech technology. It can yield important insights into biological, cognitive, communicative, and social aspects of language. Written by specialists in psycholinguistics, phonetics, speech development, speech perception and speech technology, this volume presents experimental and modeling studies that provide the reader with a deep understanding of interspeaker variability and its role in speech processing, speech development, and interspeaker interactions. It discusses how theoretical models take individual behavior into account, explains why interspeaker variability enriches speech communication, and summarizes the limitations of the use of speaker information in forensics.
