
    A Behavioral Analysis of EPA's MOBILE Emission Factor Model

    This paper examines the behavioral and stochastic aspects of modeling emission reductions from vehicle Inspection and Maintenance (I/M) programs. Forecasts of the potential emission reductions from such programs have been made with the Environmental Protection Agency's MOBILE Model, EPA's computer model for estimating emission factors for mobile sources. We examine the structure of this Model and review how the behavior of drivers, mechanics, and state regulatory authorities is incorporated in the current generation of the Model. We focus particularly on assumptions about vehicle repair under I/M, compliance with I/M requirements, and the impact of test measurement error on predicted I/M effectiveness. We also include some preliminary comparisons of the Model's outcomes with results of the I/M program in place in Arizona. Finally, we perform sensitivity analyses to determine the most influential underlying parameters of the Model. We find that many of the assumptions of the I/M component of the Model are based on relatively small data sets of vehicle repairs done in a laboratory setting, and that the output of the Model makes it difficult to compare the results against real-world data from ongoing state programs. In addition, the Model assumes that vehicles will either be repaired or receive a waiver. In the Arizona program there appears to be a third category of vehicles: those which fail the test and never subsequently pass. This share may be as high as a third of all failing vehicles. Vehicles which do not eventually pass the test would be treated in the Model as non-compliant. However, in current programs, states do not seem to be measuring and entering the compliance rate correctly. The paper also examines the evidence about whether emissions deteriorate over the life of a vehicle on a grams-per-mile basis (as assumed by the Model) or a grams-per-gallon basis. It finds support for the argument that emissions deteriorate on a grams-per-gallon basis. Through sensitivity analysis we find that the repair effectiveness the Model assumes for an IM240 test is much greater than for the idle test, and that identification rates and repair effectiveness vary a great deal according to the cutpoint. These results are based on small numbers of vehicle tests in a laboratory setting and should be compared with real-world evidence. Examining the costs and cost-effectiveness of variations in I/M programs is important for determining improvements to them. States may not have incentives to develop cost-effective programs when the current Model forecasts emission reduction "credits" that are overly optimistic.
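    To make the sensitivity analysis concrete, the sketch below perturbs a handful of I/M parameters one at a time and reports how a fleet-average emission-reduction estimate responds. The parameter names, values, and the simple fleet model are illustrative assumptions for this sketch; they are not taken from EPA's MOBILE Model.

```python
# Hypothetical one-at-a-time sensitivity sketch for an I/M emission-reduction
# calculation. Parameter names and the fleet model are illustrative only and
# are NOT taken from EPA's MOBILE Model.

BASE = {
    "failing_share": 0.25,        # fraction of the fleet that fails the I/M test
    "identification_rate": 0.80,  # fraction of high emitters the test identifies
    "repair_effectiveness": 0.55, # fractional emission cut achieved by repair
    "compliance_rate": 0.90,      # fraction of failing vehicles that comply
    "waiver_rate": 0.10,          # fraction of failing vehicles granted waivers
}

def fleet_reduction(p):
    """Fractional fleet-wide emission reduction predicted for an I/M program."""
    repaired = (p["failing_share"] * p["identification_rate"]
                * p["compliance_rate"] * (1.0 - p["waiver_rate"]))
    return repaired * p["repair_effectiveness"]

def sensitivity(base, delta=0.10):
    """Perturb each parameter by +/-delta (relative) and report the output swing."""
    base_out = fleet_reduction(base)
    results = {}
    for name in base:
        lo, hi = dict(base), dict(base)
        lo[name] *= (1.0 - delta)
        hi[name] *= (1.0 + delta)
        results[name] = (fleet_reduction(hi) - fleet_reduction(lo)) / base_out
    return base_out, results

if __name__ == "__main__":
    base_out, swings = sensitivity(BASE)
    print(f"baseline reduction: {base_out:.3f}")
    for name, swing in sorted(swings.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:22s} relative swing {swing:+.3f}")
```

    Ranking parameters by the size of the output swing is one simple way to identify the "most influential underlying parameters" the paper refers to; any real comparison would use the Model's own parameterization rather than this toy one.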

    Resiliency Assessment and Enhancement of Intrinsic Fingerprinting

    Intrinsic fingerprinting is a class of digital forensic technology that can detect traces left in digital multimedia data in order to reveal data processing history and determine data integrity. Many existing intrinsic fingerprinting schemes have implicitly assumed favorable operating conditions whose validity may become uncertain in reality. In order to establish intrinsic fingerprinting as a credible approach to digital multimedia authentication, it is important to understand and enhance its resiliency under unfavorable scenarios. This dissertation addresses various resiliency aspects that can appear in a broad range of intrinsic fingerprints. The first aspect concerns intrinsic fingerprints that are designed to identify a particular component in the processing chain. Such fingerprints are potentially subject to changes due to input content variations and/or post-processing, and it is desirable to ensure their identifiability in such situations. Taking an image-based intrinsic fingerprinting technique for source camera model identification as a representative example, our investigations reveal that the fingerprints have a substantial dependency on image content. Such dependency limits the achievable identification accuracy, which is penalized by a mismatch between training and testing image content. To mitigate such a mismatch, we propose schemes to incorporate image content into training image selection and significantly improve the identification performance. We also consider the effect of post-processing against intrinsic fingerprinting, and study source camera identification based on imaging noise extracted from low-bit-rate compressed videos. While such compression reduces the fingerprint quality, we exploit different compression levels within the same video to achieve more efficient and accurate identification. The second aspect of resiliency addresses anti-forensics, namely, adversarial actions that intentionally manipulate intrinsic fingerprints. We investigate the cost-effectiveness of anti-forensic operations that counteract color interpolation identification. Our analysis pinpoints the inherent vulnerabilities of color interpolation identification, and motivates countermeasures and refined anti-forensic strategies. We also study the anti-forensics of an emerging space-time localization technique for digital recordings based on electrical network frequency analysis. Detection schemes against anti-forensic operations are devised under a mathematical framework. For both problems, game-theoretic approaches are employed to characterize the interplay between forensic analysts and adversaries and to derive optimal strategies. The third aspect regards the resilient and robust representation of intrinsic fingerprints for multiple forensic identification tasks. We propose to use the empirical frequency response as a generic type of intrinsic fingerprint that can facilitate the identification of various linear shift-invariant (LSI) and non-LSI operations.
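    As a rough illustration of the empirical-frequency-response fingerprint mentioned above, the sketch below estimates the response of an unknown LSI operation by averaging the ratio of output to input spectra over many images. The synthetic textures and the mean filter standing in for the unknown operation are assumptions of this sketch, not the dissertation's estimator.

```python
# Minimal sketch: estimate the empirical frequency response of an unknown
# LSI operation from input/output image pairs. The averaging filter here is
# a stand-in for the unknown operation; this is not the dissertation's method.
import numpy as np

rng = np.random.default_rng(0)

def unknown_lsi_operation(img):
    """Stand-in LSI operation: 3x3 mean filter applied via FFT (circular)."""
    kernel = np.zeros_like(img)
    kernel[:3, :3] = 1.0 / 9.0
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kernel)))

def empirical_frequency_response(inputs, outputs, eps=1e-8):
    """Average the ratio of output to input spectra over many image pairs."""
    acc = np.zeros_like(np.fft.fft2(inputs[0]))
    for x, y in zip(inputs, outputs):
        X, Y = np.fft.fft2(x), np.fft.fft2(y)
        acc += Y / (X + eps)
    return acc / len(inputs)

# Synthetic "content": random textures passed through the unknown operation.
inputs = [rng.normal(size=(64, 64)) for _ in range(20)]
outputs = [unknown_lsi_operation(x) for x in inputs]

H = empirical_frequency_response(inputs, outputs)
print("estimated |H| at DC:     ", abs(H[0, 0]))    # close to 1.0 for a mean filter
print("estimated |H| at Nyquist:", abs(H[32, 32]))  # attenuated relative to DC
```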

    Reviewing the Drivers and Challenges in RFID Implementation in the Pharmaceutical Supply Chain

    Counterfeiting is a global phenomenon that poses a serious financial threat to the pharmaceutical industry and, more importantly, jeopardizes public safety and security. Different measures, including new laws and regulations, have been put in place to mitigate the threat and tighten control in the pharmaceutical supply chain. However, the most promising countermeasure appears to be track-and-trace technology such as the electronic pedigree (E-pedigree) combined with Radio Frequency Identification (RFID) technology. In this study we present a framework exploring the antecedents and consequences of RFID applications in the pharmaceutical supply chain. The framework proposes that counterfeiting and E-pedigree regulation will drive the implementation of RFID in the pharmaceutical supply chain, which in turn provides strategic and operational benefits that enable competitive advantage. Meanwhile, the implementation of RFID requires overcoming many operational, technical, and financial challenges. The framework provides a springboard that future studies can explore using empirical data.

    Consumers' Perception of Food-System Vulnerability to an Agroterrorist Attack

    This paper uses results from a 2004 survey (N=1,010) on consumer attitudes toward agroterrorism and food-system security to investigate heterogeneous attributes affecting perceived vulnerability, including risk perceptions and fear. Using 15 separate multinomial probit regressions, we distinguish perceived vulnerability across a number of aspects of food-system security, including food type and position in the food-supply chain. Vulnerability is not found to be common across food groups or respondents, and a variety of distinguishing characteristics can be used to investigate how individuals might perceive vulnerability.
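    A minimal sketch of the discrete-choice modeling approach appears below. Because statsmodels does not ship a multinomial probit estimator, a binary probit on simulated survey-style data stands in for one of the paper's 15 regressions; the covariate names (fear_index, risk_perception, rural) are hypothetical and are not the survey's items.

```python
# Hedged sketch of a discrete-choice vulnerability model. The paper estimates
# 15 multinomial probit regressions; a binary Probit on simulated data stands
# in here. Covariate names are hypothetical, not the survey's actual items.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1010  # matches the survey's N; the data itself is simulated

# Hypothetical respondent attributes.
fear_index = rng.normal(size=n)
risk_perception = rng.normal(size=n)
rural = rng.integers(0, 2, size=n)

# Simulated latent propensity to rate a food-chain stage as vulnerable.
latent = 0.8 * fear_index + 0.5 * risk_perception - 0.3 * rural + rng.normal(size=n)
perceives_vulnerable = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([fear_index, risk_perception, rural]))
result = sm.Probit(perceives_vulnerable, X).fit(disp=False)
print(result.summary(xname=["const", "fear_index", "risk_perception", "rural"]))
```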

    FPGA based remote code integrity verification of programs in distributed embedded systems

    The explosive growth of networked embedded systems has made ubiquitous and pervasive computing a reality. However, there are still a number of challenges to its widespread adoption, including scalability, availability, and, especially, software security. Among the different challenges in software security, the problem of remote code integrity verification still lacks efficient solutions. This paper proposes the use of reconfigurable computing to build a consistent architecture for generating attestations (proofs) of code integrity for an executing program and for delivering them to the designated verification entity. Remote dynamic update of reconfigurable devices is also exploited to increase the complexity of mounting attacks in a real-world environment. The proposed solution fits well with embedded devices, which nowadays are commonly equipped with reconfigurable hardware components that can be exploited to solve different computational problems.
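    The attestation idea can be sketched as a simple challenge-response exchange. The HMAC construction, shared key, and stand-in program image below are assumptions of this illustration; the paper's architecture implements the prover in reconfigurable hardware rather than in software.

```python
# Minimal challenge-response attestation sketch. In the paper this role is
# played by reconfigurable hardware; here a software prover/verifier pair
# illustrates only the protocol shape. The shared key, stand-in program image,
# and use of HMAC-SHA256 are illustrative assumptions, not the paper's design.
import hmac
import hashlib
import os

SHARED_KEY = os.urandom(32)       # provisioned out of band in a real system
PROGRAM_MEMORY = b"\x90" * 4096   # stand-in for the executing program image

def prover_attest(challenge: bytes, memory: bytes, key: bytes) -> bytes:
    """Device side: bind the fresh challenge to the current code image."""
    return hmac.new(key, challenge + memory, hashlib.sha256).digest()

def verifier_check(challenge: bytes, response: bytes,
                   expected_memory: bytes, key: bytes) -> bool:
    """Verifier side: recompute the proof over the known-good image."""
    expected = hmac.new(key, challenge + expected_memory, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)        # fresh nonce prevents replaying old proofs
proof = prover_attest(challenge, PROGRAM_MEMORY, SHARED_KEY)
print("attestation accepted:", verifier_check(challenge, proof,
                                              PROGRAM_MEMORY, SHARED_KEY))

# A tampered image fails verification.
tampered = PROGRAM_MEMORY[:-1] + b"\x00"
bad_proof = prover_attest(challenge, tampered, SHARED_KEY)
print("tampered image accepted:", verifier_check(challenge, bad_proof,
                                                 PROGRAM_MEMORY, SHARED_KEY))
```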

    Criminal poisoning and product tampering: Toward an operational definition of malicious contamination

    ‘Malicious contamination’ encompasses multiple crimes which have received little previous academic attention, including poisoning and product tampering. While these acts may seem easy to distinguish, there are many areas of overlap, and so before these crimes and those who commit them can be understood, clear definitions must be introduced. The presence or absence of 14 behavioural variables is proposed as a way of distinguishing product tamperings from poisonings, with the empirical definition then tested on 384 malicious contamination incidents. The operational definition successfully distinguishes 92.7% of the cases and allows for a comparison of the differences between poisoning and tampering.
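    The sketch below illustrates how presence/absence indicators might separate tampering from poisoning. Since the abstract does not name the 14 behavioural variables, the indicators, weights, and the logistic model here are hypothetical stand-ins for the paper's operational definition.

```python
# Hedged sketch: separating product tampering from poisoning using the
# presence/absence of behavioural indicators. The abstract does not name the
# 14 variables, so the simulated indicators and weights below are hypothetical;
# the logistic model is an illustrative stand-in for the paper's definition.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n_incidents, n_indicators = 384, 14  # counts match the abstract; data is simulated

# Simulated binary presence/absence matrix for 14 behavioural indicators.
X = rng.integers(0, 2, size=(n_incidents, n_indicators))

# Simulated ground truth: 1 = tampering, 0 = poisoning, loosely tied to a few
# of the hypothetical indicators.
logits = 1.5 * X[:, 0] + 1.0 * X[:, 3] - 1.2 * X[:, 7] - 0.5
y = (logits + rng.normal(scale=0.8, size=n_incidents) > 0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(f"in-sample classification rate: {clf.score(X, y):.1%}")
```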

    Wide-Area Measurement-Based Applications for Power System Monitoring and Dynamic Modeling

    Due to the increasingly complex behavior exhibited by large-scale power systems as more uncertain renewables are introduced to the grid, the wide-area measurement system (WAMS) has been utilized to complement the traditional supervisory control and data acquisition (SCADA) system and improve operators’ situational awareness. By providing wide-area, GPS-time-synchronized measurements of grid status at high time resolution, it is able to reveal power system dynamics that could not be captured before and has become an essential tool for dealing with current and future power grid challenges. According to their time requirements, power system applications can be roughly divided into online applications (e.g., data visualization, fast disturbance and oscillation detection, and system response prediction and reduction) and offline applications (e.g., measurement-driven dynamic modeling and validation, post-event analysis, and statistical analysis of historical data). In this dissertation, various wide-area measurement-based applications are presented. Firstly, a pioneering WAMS deployed at the distribution level, the frequency monitoring network (FNET/GridEye), is introduced. For conventional large-scale power grid dynamic simulation, two major challenges are 1) the accuracy of detailed dynamic models, and 2) the computational burden of online dynamic assessment. To overcome the restrictions of the traditional approach, a measurement-based system response prediction tool using a Multivariate AutoRegressive (MAR) model is developed. It is followed by a measurement-based power system dynamic reduction tool that uses an autoregressive model to represent the external system. In addition, phasor measurement unit (PMU) data are employed to perform a generator dynamic model validation study, which utilizes both simulation data and measurement data to explore the potentials and limitations of the proposed approach. As an innovative application of wide-area power system measurements, digital recordings can be authenticated by comparing the frequency and phase angle extracted from recordings with a power system measurement database. This work includes four research studies: oscillator error removal, ENF phenomenology, tampering detection, and frequency localization. Finally, several preliminary data analytics studies, including inertia estimation and analysis, fault-induced delayed voltage recovery (FIDVR) detection, and statistical analysis of an oscillation database, are presented.
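    The ENF-based authentication step can be sketched as a sliding correlation between a frequency trace extracted from a recording and a grid-frequency reference: the alignment with the highest normalized correlation dates the recording. The synthetic traces, one-sample-per-second rate, and window length below are assumptions of this sketch, not FNET/GridEye data or the dissertation's matcher.

```python
# Hedged sketch of ENF matching: slide a frequency trace extracted from a
# recording along a grid-frequency reference and pick the alignment with the
# highest normalized correlation. Traces here are synthetic, not FNET/GridEye.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic reference: one hour of grid frequency near 60 Hz at 1 sample/s.
reference = 60.0 + 0.02 * np.cumsum(rng.normal(scale=0.05, size=3600))

# "Recording" ENF: a 5-minute slice of the reference plus measurement noise.
true_offset = 1234
recording = (reference[true_offset:true_offset + 300]
             + rng.normal(scale=0.002, size=300))

def best_alignment(ref, rec):
    """Return the reference offset that maximizes normalized correlation."""
    rec_z = (rec - rec.mean()) / rec.std()
    best, best_score = -1, -np.inf
    for start in range(len(ref) - len(rec) + 1):
        window = ref[start:start + len(rec)]
        win_z = (window - window.mean()) / window.std()
        score = float(np.dot(rec_z, win_z)) / len(rec)
        if score > best_score:
            best, best_score = start, score
    return best, best_score

offset, score = best_alignment(reference, recording)
print(f"estimated offset: {offset} s (true {true_offset} s), correlation {score:.3f}")
```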

    State of Alaska Election Security Project Phase 2 Report

    Alaska’s election system is among the most secure in the country, and it has a number of safeguards other states are now adopting. But the technology Alaska uses to record and count votes could be improved, and the state’s huge size, limited road system, and scattered communities also create special challenges for ensuring the integrity of the vote. In this second phase of an ongoing study of Alaska’s election security, we recommend ways of strengthening the system, covering not only the technology but also the election procedures. The lieutenant governor and the Division of Elections asked the University of Alaska Anchorage to do this evaluation, which began in September 2007.