
    Strongly Secure and Efficient Data Shuffle On Hardware Enclaves

    Mitigating memory-access attacks on the Intel SGX architecture is an important and open research problem. A natural notion of mitigation is cache-miss obliviousness, which requires that the cache misses emitted during an enclave execution be oblivious to sensitive data. This work realizes cache-miss obliviousness for the computation of data shuffling. The proposed approach is to software-engineer the oblivious Melbourne shuffle algorithm on the Intel SGX/TSX architecture, where Transactional Synchronization Extensions (TSX) are (ab)used to detect the occurrence of cache misses. In building the system, we propose software techniques that prefetch memory data prior to the TSX transaction to defend against physical bus-tapping attacks. Our evaluation, based on a real implementation, shows that our system achieves superior performance and a lower transaction abort rate than related work in the existing literature. Comment: Systex'1
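
    The TSX-based detection pattern the abstract describes can be pictured with a short C sketch. This is not the paper's implementation; it is a minimal illustration, assuming an RTM-capable CPU and the <immintrin.h> intrinsics, of prefetching the working set and then running a shuffle step inside a hardware transaction so that eviction of a tracked cache line aborts the transaction and the step can be retried.

    #include <immintrin.h>   /* RTM intrinsics: _xbegin, _xend; _mm_prefetch */
    #include <stdint.h>
    #include <stddef.h>

    /* Sketch: run one (hypothetical) shuffle step inside an RTM transaction.
     * If a tracked cache line is evicted (i.e., a cache miss occurs), the
     * transaction aborts and the step is retried after re-prefetching. */
    static int shuffle_step_protected(uint8_t *buf, size_t len, int max_retries)
    {
        for (int attempt = 0; attempt < max_retries; attempt++) {
            /* Prefetch the working set so that accesses inside the
             * transaction should all hit in cache. */
            for (size_t i = 0; i < len; i += 64)
                _mm_prefetch((const char *)&buf[i], _MM_HINT_T0);

            unsigned status = _xbegin();
            if (status == _XBEGIN_STARTED) {
                /* ... data-independent shuffle accesses over buf go here ... */
                _xend();
                return 0;            /* completed without an observed miss */
            }
            /* Abort path: a tracked line was evicted, or another abort
             * cause fired; loop to re-prefetch and retry. */
        }
        return -1;                   /* give up after max_retries attempts */
    }

    Compiling such a sketch requires RTM support (e.g., -mrtm with GCC or Clang) and hardware with TSX enabled; the paper's actual defenses against bus tapping are more involved than this prefetch loop.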

    Contour: A Practical System for Binary Transparency

    Transparency is crucial in security-critical applications that rely on authoritative information, as it provides a robust mechanism for holding these authorities accountable for their actions. A number of solutions have emerged in recent years that provide transparency in the setting of certificate issuance, and Bitcoin provides an example of how to enforce transparency in a financial setting. In this work we shift to a new setting, the distribution of software package binaries, and present a system for so-called "binary transparency." Our solution, Contour, uses proactive methods for providing transparency, privacy, and availability, even in the face of persistent man-in-the-middle attacks. We also demonstrate, via benchmarks and a test deployment for the Debian software repository, that Contour is the only system for binary transparency that satisfies the efficiency and coordination requirements that would make it possible to deploy today. Comment: International Workshop on Cryptocurrencies and Blockchain Technology (CBT), 201
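
    The abstract does not spell out Contour's log structure, but binary-transparency systems generally rest on verifying that a package's hash is included in an append-only Merkle tree. The following C sketch assumes a perfect binary tree and OpenSSL's SHA256, and omits the domain-separation bytes that RFC 6962-style logs add; it shows the kind of inclusion-proof check a client could run for a binary before installing it.

    #include <openssl/sha.h>   /* SHA256(), SHA256_DIGEST_LENGTH */
    #include <string.h>
    #include <stddef.h>

    #define HASH_LEN SHA256_DIGEST_LENGTH

    /* Combine two child hashes into a parent hash (simplified: no
     * leaf/node domain-separation byte, unlike RFC 6962). */
    static void hash_pair(const unsigned char *left, const unsigned char *right,
                          unsigned char *out)
    {
        unsigned char buf[2 * HASH_LEN];
        memcpy(buf, left, HASH_LEN);
        memcpy(buf + HASH_LEN, right, HASH_LEN);
        SHA256(buf, sizeof buf, out);
    }

    /* Verify that leaf_hash sits at position `index` of a perfect binary
     * Merkle tree with the given root, using the sibling hashes along the
     * path from the leaf to the root. */
    static int verify_inclusion(const unsigned char *leaf_hash, size_t index,
                                const unsigned char path[][HASH_LEN],
                                size_t path_len, const unsigned char *root)
    {
        unsigned char h[HASH_LEN];
        memcpy(h, leaf_hash, HASH_LEN);
        for (size_t i = 0; i < path_len; i++) {
            if (index & 1)
                hash_pair(path[i], h, h);   /* current node is a right child */
            else
                hash_pair(h, path[i], h);   /* current node is a left child  */
            index >>= 1;
        }
        return memcmp(h, root, HASH_LEN) == 0;
    }

    In a real deployment the leaf hash would be the digest of the package as shipped, and the root would come from a signed tree head published by the log.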

    Building Responsive Systems from Physically-correct Specifications

    Predictability - the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements - is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically-unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems - possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing - cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems - not to mention the elimination of potential hazards that would have gone, otherwise, unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easier to incorporate in applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes. Harvard University; DARPA (N00039-88-C-0163
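
    To make the flavor of the model concrete, here is a hypothetical C fragment (it does not reproduce the TRA formalism or CLEOPATRA syntax from the paper) illustrating the kind of constraint such a model enforces: a reaction to an event must occur after a strictly positive delay, ruling out clairvoyant or perfectly timed behavior, and before a stated deadline, enforcing timeliness.

    #include <stdio.h>

    /* Hypothetical sketch (not the TRA formalism itself): an event-driven
     * transition that must fire within a window [t_min, t_max] after its
     * triggering event. t_min > 0 rules out zero-delay or acausal
     * reactions; t_max bounds timeliness. */
    typedef struct {
        const char *trigger;    /* name of the input event  */
        const char *action;     /* name of the output event */
        double      t_min;      /* earliest reaction delay  */
        double      t_max;      /* latest reaction delay    */
    } transition;

    /* Check whether an observed (trigger at t_in, action at t_out) pair
     * respects the transition's time constraint. */
    static int respects_constraint(const transition *tr, double t_in, double t_out)
    {
        double delay = t_out - t_in;
        return delay >= tr->t_min && delay <= tr->t_max;
    }

    int main(void)
    {
        transition brake = { "obstacle_detected", "apply_brakes", 0.001, 0.050 };
        printf("%d\n", respects_constraint(&brake, 0.0, 0.020));  /* 1: timely   */
        printf("%d\n", respects_constraint(&brake, 0.0, 0.200));  /* 0: too late */
        printf("%d\n", respects_constraint(&brake, 0.0, 0.000));  /* 0: acausal  */
        return 0;
    }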

    Politics, transaction costs, and the design of regulatory institutions

    Providing a more complete framework for assessing the efficiency of government intervention requires moving away from the idealistic perspective typically found in the normative approach to traditional public economics, contend the authors. Such a move requires viewing the government not as a monolithic entity but as many different government bodies, each with its own constituency and regulatory tools. Not only is the "multitiered" government limited in its ability to commit, but interest groups influence the regulatory process and impose significant transaction costs on government interventions and on their outcome. The authors discuss the nature of those transaction costs and argue that the overall design of the government is the result of their minimization. Among the points they make in their conclusions: 1) Safeguards built into regulatory contracts sometimes reflect and sometimes imply transaction costs, which influence, or should influence, the optimal tradeoff between rent and efficiency in ways practitioners sometimes ignore. 2) Most of the literature on transaction costs arising from government failures would agree that to be sustainable, regulatory institutions should be independent, autonomous, and accountable. How these criteria are met is determined by the way transaction costs are minimized, which in turn drives the design of the regulatory framework. In practice, for example, if there are commitment problems, short-term institutional contracts between players are more likely to ensure autonomy and independence. This affects the length of regulators' appointments. Short-term contracts may be best, but contracts for regulators typically last four to eight years and are often renewable. The empirical debate about the design of regulators' jobs is a possible source of tension. Practitioners typically recommend choosing regulators based on professional rather than political criteria, but that may not be the best way to minimize regulatory capture. Professional experts are likely to come from the sector they are supposed to regulate and are likely to return to it sooner or later (as typically happens in developing countries). On the other hand, elected regulators are unlikely to be much more independent than professional regulators; they will simply represent different interests. Practitioners and theorists alike emphasize different sources of capture and agree that one way to deal with its risk is to make sure the selection process involves both executive and legislative branches. Environmental Economics & Policies, Economic Theory & Research, Labor Policies, Decentralization, Banks & Banking Reform, National Governance, Administrative & Regulatory Law