5 research outputs found

    An Instrumenting Compiler for Enforcing Confidentiality in Low-Level Code

    We present an instrumenting compiler for enforcing data confidentiality in low-level applications (e.g. those written in C) in the presence of an active adversary. In our approach, the programmer marks secret data by writing lightweight annotations on top-level definitions in the source code. The compiler then uses a static flow analysis coupled with efficient runtime instrumentation, a custom memory layout, and custom control-flow integrity checks to prevent data leaks even in the presence of low-level attacks. We have implemented our scheme as part of the LLVM compiler. We evaluate it on the SPEC micro-benchmarks for performance, and on larger, real-world applications (including OpenLDAP, which is around 300 KLoC) for the programmer overhead required to restructure the application to protect sensitive data such as passwords. We find that the performance overheads introduced by our instrumentation are moderate (12% on average on SPEC), and that the programmer effort to port OpenLDAP is only about 160 LoC.
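    The core idea above — mark secrets at their definition, propagate that mark through computation, and block flows to public sinks — can be sketched in a few lines. This is an illustrative model only, not the paper's LLVM instrumentation; the `Secret` wrapper and `public_output` sink are assumed names for the demo.

```python
class Secret:
    """Wrap a value and propagate a 'secret' mark through arithmetic,
    mimicking a lightweight annotation on a definition."""
    def __init__(self, value, secret=True):
        self.value = value
        self.secret = secret

    def __add__(self, other):
        ov = other.value if isinstance(other, Secret) else other
        os = other.secret if isinstance(other, Secret) else False
        # The result is secret if either operand was secret.
        return Secret(self.value + ov, self.secret or os)

def public_output(x):
    """Public sink: refuse to emit secret-derived data."""
    if isinstance(x, Secret) and x.secret:
        raise PermissionError("secret data may not reach a public sink")
    return x.value if isinstance(x, Secret) else x

password_len = Secret(12)               # annotated as secret by the programmer
derived = password_len + 4              # the mark propagates to derived data
print(public_output(Secret(4, secret=False)))  # non-secret data passes: prints 4
# public_output(derived)                # would raise PermissionError
```

    In the paper's setting this check is compiled into the binary (with a custom memory layout and CFI checks) rather than interpreted, so the mark cannot be bypassed by low-level attacks; the sketch only shows the flow rule itself.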

    Efficient and extensible security enforcement using dynamic data flow analysis


    Automatically identifying critical behaviors in programs

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 60-63).

    The large size of modern software systems has led to an increase in the complexity of the interaction between a system's code, its input, and its output. I propose the following classifications for the regions of a system's input:

    * Critical control: data that influences both the internal operation and the output of the system.
    * Critical payload: data that heavily contributes to the output of the program but does not substantially influence its internal operation.
    * Benign control: data that influences the internal operation of the system but does not contribute to its output.
    * Benign payload: data that neither contributes to the output nor substantially influences the internal operation of the program.

    In this thesis, I present Chaos, a system designed to automatically infer these classifications for a program's inputs and code. Chaos monitors the execution trace and dynamic taint trace of an application over a suite of inputs to determine how regions of the program's code and input influence its behavior and output. This thesis demonstrates the accuracy of Chaos's classifications for a set of imaging applications and their support libraries. These automatically inferred classifications are relevant to a variety of software engineering tasks, including program understanding, maintenance, debugging, testing, and defect correction triage.

    by Michael Carbin. S.M.
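    The four classifications form a 2x2 grid over two observable effects: does a region influence control flow, and does it influence output? A minimal sketch of that decision, assuming a toy perturbation-based classifier (Chaos itself uses execution and dynamic taint traces, not the input mutation shown here, and the toy `program` is invented for the demo):

```python
def program(data):
    """Toy subject: byte 0 selects a branch (control), byte 1 flows to the
    output (payload), byte 2 is read but never used (benign payload)."""
    trace, out = [], []
    if data[0] % 2 == 0:
        trace.append("even-branch")
        out.append(data[1])
    else:
        trace.append("odd-branch")
        out.append(data[1] ^ 0xFF)
    _ = data[2]
    return trace, out

def classify(run, data):
    """Label each input byte by whether perturbing it changes the
    control trace and/or the output."""
    base_trace, base_out = run(data)
    labels = []
    for i in range(len(data)):
        mutated = bytearray(data)
        mutated[i] ^= 0x01                 # perturb one input byte
        trace, out = run(bytes(mutated))
        control = trace != base_trace
        payload = out != base_out
        if control and payload:
            labels.append("critical control")
        elif payload:
            labels.append("critical payload")
        elif control:
            labels.append("benign control")
        else:
            labels.append("benign payload")
    return labels

print(classify(program, bytes([2, 7, 9])))
# prints ['critical control', 'critical payload', 'benign payload']
```

    Byte 0 changes both the branch taken and the output, byte 1 changes only the output, and byte 2 changes neither, matching the definitions above.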

    A General Dynamic Information Flow Tracking Framework for Security Applications

    Many software security solutions require accurate tracking of control/data dependencies among information objects in network applications. This paper presents a general dynamic information flow tracking framework (called GIFT) for C programs. GIFT allows an application developer to associate application-specific tags with input data, instruments the application to propagate these tags to all other data that are control/data-dependent on them, and invokes application-specific processing on output data according to their tag values. To use GIFT, an application developer only needs to implement input and output proxy functions to tag input data and to perform tag-dependent processing on output data, respectively. To demonstrate the usefulness of GIFT, we implement a complete GIFT application called Aussum, which allows selective sandboxing of network client applications based on whether their inputs are "tainted" or not. For a set of computation-intensive test applications, the measured elapsed-time overhead of GIFT is less than 35%.
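    The programming model described above — input proxies attach tags, tags propagate as unions through computation, output proxies dispatch on them — can be sketched as follows. All names here (`Tagged`, `input_proxy`, `output_proxy`, the `"tainted"` tag) are assumptions standing in for GIFT's actual C API.

```python
TAINTED = "tainted"

class Tagged:
    """A value carrying a set of application-specific tags."""
    def __init__(self, value, tags=frozenset()):
        self.value = value
        self.tags = frozenset(tags)

def combine(a, b, op):
    """Propagate tags: the result carries the union of the operands' tags."""
    return Tagged(op(a.value, b.value), a.tags | b.tags)

def input_proxy(data, from_network):
    """Tag network input as tainted; local input carries no tags."""
    return Tagged(data, {TAINTED} if from_network else frozenset())

def output_proxy(x, handlers):
    """Invoke application-specific processing based on the tags present."""
    for tag in x.tags:
        if tag in handlers:
            return handlers[tag](x.value)
    return ("allow", x.value)

net = input_proxy(10, from_network=True)
local = input_proxy(32, from_network=False)
total = combine(net, local, lambda a, b: a + b)   # tainted + clean -> tainted
decision = output_proxy(total, {TAINTED: lambda v: ("sandbox", v)})
print(decision)   # prints ('sandbox', 42)
```

    An Aussum-style client would register a sandboxing handler for the tainted tag, as in the last two lines, so any output derived from network input triggers sandboxing while purely local data passes through.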