8 research outputs found

    PinOS: A Programmable Framework for Whole-System Dynamic Instrumentation

    PinOS is an extension of the Pin dynamic instrumentation framework for whole-system instrumentation, i.e., for instrumenting both kernel and user-level code. It achieves this by interposing between the subject system and the hardware using virtualization techniques. Specifically, PinOS is built on top of the Xen virtual machine monitor with Intel VT technology, which allows it to instrument unmodified operating systems. Because PinOS is based on software dynamic translation, it can perform pervasive, fine-grain instrumentation. By inheriting the powerful instrumentation API from Pin and adding new APIs for system-level instrumentation, PinOS can be used to write system-wide instrumentation tools for tasks such as program analysis and architectural studies. As of today, PinOS can boot Linux on IA-32 in uniprocessor mode and can instrument complex applications such as database and web servers.
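    As context for the API the abstract refers to: PinOS inherits the user-level Pin instrumentation API, whose canonical example is an instruction-counting tool. The sketch below is that standard user-space Pintool, built against the public Pin API rather than against PinOS; the additional system-level APIs that PinOS introduces are not shown here.

```cpp
// icount.cpp -- the canonical Pin "instruction count" tool, shown only to
// illustrate the user-level Pin API that PinOS inherits. It does not use
// any PinOS-specific system-level API.
#include "pin.H"
#include <iostream>

static UINT64 icount = 0;

// Analysis routine: called before every executed instruction.
static VOID docount() { icount++; }

// Instrumentation routine: called once per instruction as it is translated.
static VOID Instruction(INS ins, VOID *v) {
    INS_InsertCall(ins, IPOINT_BEFORE, (AFUNPTR)docount, IARG_END);
}

// Called when the instrumented program exits.
static VOID Fini(INT32 code, VOID *v) {
    std::cerr << "Executed instructions: " << icount << std::endl;
}

int main(int argc, char *argv[]) {
    if (PIN_Init(argc, argv)) return 1;        // parse Pin command-line options
    INS_AddInstrumentFunction(Instruction, 0); // register instrumentation callback
    PIN_AddFiniFunction(Fini, 0);              // register exit callback
    PIN_StartProgram();                        // start the program; never returns
    return 0;
}
```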

    A Poll-Free, Low-Latency Approach to Process State Capture/Recovery in Heterogeneous Computing Systems

    A central difficulty of process state capture in heterogeneous computing systems is that it cannot simply be initiated instantaneously once a request for capture has been received. This is because capture can be initiated only at certain points -- points that have equivalent points in the other instances of the computation on different architectures -- so that the process can be restarted at exactly the point at which it was paused. To ensure minimum latency, state capture should be initiated at the very next point of equivalence encountered once it has been requested. At the same time, the performance overhead incurred during normal execution must be kept at acceptable levels. This paper proposes a fundamentally new approach to process state capture and recovery that achieves both objectives.
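    The abstract states the constraints (capture only at equivalence points, taken at the first such point reached after a request, with acceptable overhead during normal execution) but does not describe the paper's mechanism. Purely as a point of reference, the sketch below shows the conventional flag-checking baseline that a poll-free design is meant to improve on; every name in it is illustrative and none comes from the paper.

```cpp
// Baseline (NOT the paper's poll-free approach): a capture request sets an
// atomic flag, and the process checks that flag at each equivalence point --
// a point that has an equivalent point in every architecture-specific
// instance of the computation -- taking a checkpoint at the first one reached.
#include <atomic>
#include <cstdio>

static std::atomic<bool> capture_requested{false};

// Invoked asynchronously (e.g., by a control thread) when capture is requested.
void request_capture() {
    capture_requested.store(true, std::memory_order_release);
}

// Hypothetical checkpoint writer: serializes an architecture-neutral view of
// the process state so an instance on another architecture can resume from it.
void write_checkpoint(int equivalence_point_id) {
    std::printf("checkpoint taken at equivalence point %d\n", equivalence_point_id);
}

// Inserted at every equivalence point; the cost of this check on each pass is
// exactly the normal-execution overhead a poll-free scheme tries to avoid.
inline void equivalence_point(int id) {
    if (capture_requested.exchange(false, std::memory_order_acq_rel))
        write_checkpoint(id);
}

int main() {
    for (int i = 0; i < 3; ++i) {
        // ... portable unit of work ...
        if (i == 1) request_capture();   // simulate an external capture request
        equivalence_point(i);            // capture occurs at the next point reached
    }
    return 0;
}
```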

    Electronic Voting - A Survey

    As the world watched the electoral drama unfold in Florida at the end of 2000, people started wondering, “Wouldn’t all our problems be solved if they just used Internet Voting?” People all over the world soon started taking a hard look at their voting equipment and procedures, trying to figure out how to improve them [1]. There is a strong inclination towards moving to Remote Internet Voting – at least among politicians – in order to enhance voter convenience and increase voter confidence and turnout. However, as will be seen later in this paper, there are serious technological and social issues that make Remote Internet Voting infeasible in the foreseeable future. Many technologists have therefore suggested that remote poll-site electronic voting, where the voter can vote at any poll-site (not only the home-county poll-site), is the best step forward: it improves voter convenience without compromising security. This paper presents a survey of the state of the art in Electronic Voting, covering the work done on Internet Voting (and the arguments against its use) as well as on electronic poll-site voting. Electronic voting refers to the use of computers or computerized voting equipment to cast ballots in an election; sometimes the term is used more specifically for voting that takes place over the Internet.

    Conservation vs. Consensus in Peer-to-Peer Preservation Systems

    The problem of digital preservation is widely acknowledged, but the assumptions implicit in the design of systems that address it have not been analyzed explicitly. We identify two basic approaches to digital preservation using peer-to-peer systems: conservation and consensus. We highlight the design tradeoffs involved in the two general approaches, and we provide a framework for analyzing the characteristics of peer-to-peer preservation systems in general. In addition, we propose a novel conservation-based protocol for achieving preservation and analyze its effectiveness with respect to our framework.

    HDTrans: A Low-Overhead Dynamic Translator

    Dynamic translation is a general purpose tool used for instrumenting programs at run time. Many current translators perform substantial rewriting during translation in an attempt to reduce execution time. When dynamic translation is used as a ubiquitous policy enforcement mechanism, the majority of program executions have no dominating inner loop that can be used to amortize the cost of translation. Even under more favorable usage assumptions, our measurements show that such optimizations offer no significant benefit in most cases. A simpler, more maintainable, adaptable, and smaller translator may be preferable to more complicated designs. In this paper, we present HDTrans, a light-weight IA-32 to IA-32 binary translation system that uses some simple and effective translation techniques in combination with established trace linearization and code caching optimizations. We also present an evaluation of translation overhead under non-ideal conditions, showing that conventional benchmarks do not provide a good prediction of translation overhead when used pervasively. A further contribution of this paper is an analysis of the effectiveness of post-compile static pre-translation techniques for overhead reduction. Our results indicate that static pre-translation is effective only when expensive instrumentation or optimization is performed, and that efficient reload of pre-translated code incurs a substantial execution-time penalty.
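    To make the core technique concrete: a code-caching dynamic translator of this kind executes guest code by translating one basic block at a time into a code cache and reusing cached translations on later visits. The sketch below is a generic lookup-or-translate dispatch loop illustrating that idea; its names and structure are illustrative and are not taken from the HDTrans sources.

```cpp
// Generic sketch of the dispatch loop at the heart of a code-caching dynamic
// binary translator (illustrative only, not HDTrans code).
#include <cstdint>
#include <unordered_map>

using GuestPC  = std::uint64_t;   // address of an untranslated basic block
using HostCode = void (*)();      // entry point of its translation in the code cache

static std::unordered_map<GuestPC, HostCode> code_cache;

// Stub translator: a real translator would decode the guest block at `pc`,
// optionally insert instrumentation, and emit host code into the cache.
static HostCode translate_block(GuestPC pc) {
    (void)pc;
    return +[] { /* translated block body would run here */ };
}

// Reuse a cached translation if one exists, otherwise translate on the spot.
// Blocks ending in direct branches can later be chained to their successors
// ("trace linearization"), so hot paths bypass this lookup entirely.
static void dispatch(GuestPC pc) {
    auto it = code_cache.find(pc);
    HostCode entry = (it != code_cache.end())
                         ? it->second
                         : (code_cache[pc] = translate_block(pc));
    entry();
}

int main() {
    dispatch(0x400000);   // first visit: translate, cache, and run
    dispatch(0x400000);   // second visit: served directly from the code cache
    return 0;
}
```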

    HDTrans
