
    Verifying constant-time implementations

    The constant-time programming discipline is an effective countermeasure against timing attacks, which can lead to complete breaks of otherwise secure systems. However, adhering to constant-time programming is hard on its own, and extremely hard under additional efficiency and legacy constraints. This makes automated verification of constant-time code an essential component for building secure software. We propose a novel approach for verifying constant-time security of real-world code. Our approach is able to validate implementations that locally and intentionally violate the constant-time policy, when such violations are benign and leak no more information than the public outputs of the computation. Such implementations, which are used in cryptographic libraries to obtain important speedups or to comply with legacy APIs, would be declared insecure by all prior solutions. We implement our approach in a publicly available, cross-platform, and fully automated prototype, ct-verif, that leverages the SMACK and Boogie tools and verifies optimized LLVM implementations. We present verification results obtained over a wide range of constant-time components from the NaCl, OpenSSL, FourQ and other off-the-shelf libraries. The diversity and scale of our examples, as well as the fact that we deal with top-level APIs rather than being limited to low-level leaf functions, distinguish ct-verif from prior tools. Our approach is based on a simple reduction of constant-time security of a program P to safety of a product program Q that simulates two executions of P. We formalize and verify the reduction for a core high-level language using the Coq proof assistant.

    The first two authors were funded by Project “TEC4Growth - Pervasive Intelligence, Enhancers and Proofs of Concept with Industrial Impact/NORTE-01-0145-FEDER-000020”, which is financed by the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 Partnership Agreement, and through the European Regional Development Fund (ERDF). The third and fourth authors were supported by projects S2013/ICE2731 N-GREENS Software-CM and ONR Grants N000141210914 (AutoCrypt) and N000141512750 (SynCrypt). The fourth author was also supported by FP7 Marie Curie Actions-COFUND 291803 (Amarout II). We thank Peter Schwabe for providing us with a collection of negative examples. We thank Hovav Shacham, Craig Costello and Patrick Longa for helpful observations on our verification results.
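    The reduction described above is, in essence, self-composition: constant-time security of a program P becomes a plain safety property of a product program Q that simulates two executions of P on the same public inputs but arbitrary secret inputs, asserting that their control-flow traces coincide. Below is a minimal, hypothetical sketch of that idea in Python; ct-verif itself operates on optimized LLVM bitcode via SMACK and Boogie, and the helper names here are illustrative only.

```python
# A minimal sketch of the product-program reduction: run the program twice on
# the same public input with different secrets and assert that the recorded
# control-flow traces agree. Helper names are illustrative, not ct-verif's API.

def unsafe_compare(secret, guess):
    """Early-exit comparison: the branch taken depends on the secret."""
    trace = []                       # records each branch decision
    for i in range(len(guess)):
        taken = secret[i] != guess[i]
        trace.append(taken)          # leak: this decision depends on secret[i]
        if taken:
            return False, trace
    return True, trace

def ct_compare(secret, guess):
    """Branchless comparison: accumulates differences, no early exit."""
    trace, diff = [], 0
    for i in range(len(guess)):
        diff |= secret[i] ^ guess[i]
        trace.append(True)           # only the public loop bound is branched on
    return diff == 0, trace

def product_check(prog, public, secret_a, secret_b):
    """Safety assertion of the 'product program' Q simulating two runs of P."""
    _, trace_a = prog(secret_a, public)
    _, trace_b = prog(secret_b, public)
    return trace_a == trace_b        # constant-time iff traces always coincide

guess = b"attack"
print(product_check(unsafe_compare, guess, b"secret", b"attacz"))  # False: leaks
print(product_check(ct_compare, guess, b"secret", b"attacz"))      # True
```

    The early-exit version fails the check because the two runs branch differently on their secrets; the branchless version passes, which is exactly the safety assertion the product-program reduction discharges.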

    Towards returns management strategies in internet retailing

    The digital transformation of the retailing industry in recent years has had a profound effect on consumers’ behaviour on a global scale. When shopping and browsing online, consumers are not able to “touch and feel”, which means that product returns are inevitable. The fashion industry has particularly suffered from high return rates, which fluctuate between 30% and 50%. The industry has been struggling to strike a balance between competitive customer service, profitability and company sustainability targets.

    Against this backdrop, the purpose of this thesis is to contribute to the development of returns management strategies in internet retailing. Returns management in an online environment encapsulates both the return policy and the return process. The former has an impact on consumers, whilst the latter refers to the company itself. Four studies provide evidence to serve the purpose of the thesis. First, the author investigates how the return policy affects purchase decisions (Study I and Study II) and second, how internet retailers manage their return processes (Study III). Finally, the author sheds light on the way effective strategies for returns management can be established (Study IV).

    Two quantitative studies and two qualitative studies were conducted. More specifically, in Study I and Study II, data were collected from consumers through an online survey. Study III followed an exploratory multiple case study design, while in Study IV, data were collected through a confirmatory multiple case study.

    The findings of this thesis have significant implications for theory and practice. This research extends the returns management literature by uncovering mediating and moderating mechanisms of interest. The notion of fit between returns management and business intent can prove to be a valuable tool with extensive applicability to a wide range of returns-related decisions. This research also presents an array of identified misalignments that can assist supply chain managers in designing effective and robust returns management strategies in the internet retailing domain.

    Detecting Dissimilar Classes of Source Code Defects

    Software maintenance accounts for most of the cost and effort of software development, with its major activities focused on the detection, location, analysis and removal of defects present in the software. Although software defects can originate, and be present, at any phase of the software development life-cycle, implementation (i.e., source code) contains more than three-fourths of the total defects. Due to the diverse nature of the defects, their detection and analysis activities have to be carried out by equally diverse tools, often necessitating the application of multiple tools for reasonable defect coverage, which directly increases maintenance overhead. Unified detection tools are known to combine different specialized techniques into a single, massive core, resulting in operational difficulty and increased maintenance cost. The objective of this research was to find a technique that can detect dissimilar defects using a simplified model and a single methodology, both of which should contribute to an easy-to-acquire solution. Following this goal, a ‘Supervised Automation Framework’ named FlexTax was developed for semi-automatic defect mapping and taxonomy generation, which was then applied to a large-scale real-world defect dataset to generate a comprehensive Defect Taxonomy that was validated using machine learning classifiers and manual verification. This Taxonomy, along with an extensive literature survey, was used to understand the properties of different classes of defects, and to develop Defect Similarity Metrics. The Taxonomy and the Similarity Metrics were then used to develop a defect detection model and associated techniques, collectively named Symbolic Range Tuple Analysis, or SRTA. SRTA relies on Symbolic Analysis, Path Summarization and Range Propagation to detect dissimilar classes of defects using a simplified set of operations. To verify the effectiveness of the technique, SRTA was evaluated by processing multiple real-world open-source systems, by direct comparison with three state-of-the-art tools, by a controlled experiment, by using an established Benchmark, by comparison with other tools through secondary data, and by a large-scale fault-injection experiment conducted using a Mutation-Injection Framework, which relied on the taxonomy developed earlier for the definition of mutation rules. Experimental results confirmed SRTA’s practicality, generality, scalability and accuracy, and proved SRTA’s applicability as a new Defect Detection Technique.
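    As an illustration of one of SRTA's named ingredients, range propagation pushes value intervals through a program's operations so that a single, simple check can flag out-of-bounds accesses. The abstract does not spell out SRTA's range-tuple representation, so the interval class and the checked program below are hypothetical, a toy sketch of the general technique rather than SRTA itself.

```python
# A toy illustration of range propagation: track an integer interval for each
# variable, propagate it through arithmetic, and compare the propagated index
# range against buffer bounds. The Interval class is hypothetical, not SRTA's.

class Interval:
    """Closed integer interval [lo, hi] tracked for a variable."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def add(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def check_index(idx: Interval, buf_len: int) -> str:
    """Classify a buffer access by comparing the index range to the bounds."""
    if 0 <= idx.lo and idx.hi < buf_len:
        return "safe"
    if idx.lo >= buf_len or idx.hi < 0:
        return "definite out-of-bounds defect"
    return "possible out-of-bounds defect"

# Pseudo-program: i = input in [0, 10]; j = i + 5; access buf[j], len(buf) == 12
i = Interval(0, 10)
j = i.add(Interval(5, 5))          # propagate the range through the addition
print(j, check_index(j, 12))       # [5, 15] -> possible out-of-bounds defect
```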

    First Steps Towards an Ethics of Robots and Artificial Intelligence

    This article offers an overview of the main first-order ethical questions raised by robots and Artificial Intelligence (RAIs) under five broad rubrics: functionality, inherent significance, rights and responsibilities, side-effects, and threats. The first letter of each rubric taken together conveniently generates the acronym FIRST. Special attention is given to the rubrics of functionality and inherent significance, given the centrality of the former and the tendency to neglect the latter in virtue of its somewhat nebulous and contested character. In addition to exploring some illustrative issues arising under each rubric, the article also emphasizes a number of more general themes. These include: the multiplicity of interacting levels on which ethical questions about RAIs arise, the need to recognise that RAIs potentially implicate the full gamut of human values (rather than exclusively or primarily some readily identifiable sub-set of ethical or legal principles), and the need for practically salient ethical reflection on RAIs to be informed by a realistic appreciation of their existing and foreseeable capacities.

    Globalized Corporate Prosecutions

    In the past, domestic prosecutions of foreign corporations were not noteworthy. Federal prosecutors now advertise a muscular approach targeting major foreign firms and even entire industries. High-profile prosecutions of foreign firms have shaken the international business community. Not only is the approach federal prosecutors have taken novel, but corporate criminal liability is itself a form of American Exceptionalism, and few other countries hold corporations broadly criminally accountable. To study U.S. prosecutions of foreign firms, I assembled a database of publicly reported corporate guilty plea agreements from the past decade. I analyzed U.S. Sentencing Commission data archives on federal corporate prosecutions and also data concerning federal deferred and non-prosecution agreements with corporations. Not only are large foreign firms prosecuted with some frequency, but they typically plead guilty, are convicted, and then receive far higher fines than otherwise comparable domestic firms. In this Article, I develop how foreign corporate convictions have become common in distinct substantive criminal areas, and how they share important features. The prosecutions are concentrated in crimes prosecuted by Main Justice, and international treaties and cooperation agreements have facilitated extraterritorial prosecutions. Larger and public foreign firms are prosecuted, and the typical resolution involves not only higher fines, but also a guilty plea rather than pre-indictment leniency. I argue that, due to their new prominence, we should consider foreign corporation prosecutions as a group so that we can better evaluate and define the emerging prosecution approach.

    Personal Data Security: Divergent Standards in the European Union and the United States

    This Note argues that the U.S. Government should discontinue all attempts to establish the Escrowed Encryption Standard (EES) as the de facto encryption standard in the United States because the economic disadvantages associated with widespread implementation of EES outweigh the advantages this advanced data security system provides. Part I discusses the EU's legislative efforts to ensure personal data security and analyzes the evolution of encryption technology in the United States. Part II examines the methods employed by the U.S. Government to establish EES as the de facto U.S. encryption standard. Part III argues that the U.S. Government should terminate its effort to establish EES as the de facto U.S. encryption standard and institute an alternative standard that ensures continued U.S. participation in the international marketplace.

    Wireless sensor data security

    A Wireless Sensor Network (WSN) is a network of sensors deployed in places unsuitable for human beings and where constant monitoring is required. These networks are built from low-power, low-cost smart devices with limited computing resources, and they play a crucial role in battlefield surveillance, border control and infrastructure protection. Given the precious data they transmit, securing them against active and passive attacks is crucial. We found that the LOCK model, which implements a novel Exclusion Basis System (EBS) for distributed key management, is very efficient at providing network security. Keeping in view the importance of data security, we chose to secure WSN data through public-key encryption methods such as RSA. We also discuss and implement Elliptic Curve Cryptography (ECC) and its advantages over RSA. Finally, our novel Spiral Encryption Technique, implemented alongside the ECC algorithm, makes the transmitted message more secure and less informative to an eavesdropper.
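    The Spiral Encryption Technique itself is not described in the abstract, so it is not reproduced here. The sketch below only illustrates the ECC advantage cited over RSA: an ECDH key agreement over the P-256 curve offers security roughly comparable to 3072-bit RSA while using 256-bit keys, which matters on low-power sensor nodes. It assumes the pyca/cryptography package is available; the node names are illustrative.

```python
# A minimal sketch of why ECC suits constrained sensor nodes: an ECDH key
# agreement over P-256 gives security comparable to ~3072-bit RSA with far
# smaller keys and cheaper operations. Uses the pyca/cryptography package.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each node generates a small (256-bit) key pair.
node_a = ec.generate_private_key(ec.SECP256R1())
node_b = ec.generate_private_key(ec.SECP256R1())

# Both sides derive the same shared secret from their own private key and
# the peer's public key (Diffie-Hellman over the elliptic curve).
shared_a = node_a.exchange(ec.ECDH(), node_b.public_key())
shared_b = node_b.exchange(ec.ECDH(), node_a.public_key())
assert shared_a == shared_b

# Stretch the raw shared secret into a symmetric key for encrypting readings.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"wsn-session").derive(shared_a)
print(f"derived {len(key) * 8}-bit session key")
```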

    Progress on probabilistic encryption schemes

    The purpose of this master's project is to study different probabilistic cryptography schemes. The older probabilistic schemes, Goldwasser-Micali and Blum-Goldwasser, are covered only briefly, for historical perspective. Several new and promising schemes have appeared in the last seven years, generating interest; I examine the Paillier and Damgard-Jurik schemes in depth. This report explains the mathematics behind the schemes along with their inherent benefits, while also suggesting some potential uses. Details are given on how I optimized the algorithms, with special emphasis on using the Chinese Remainder Theorem (CRT) in the Damgard-Jurik algorithm as well as the other algorithms. One of the main benefits these schemes possess is the additively homomorphic property. I explain the homomorphic properties in the description of the schemes and give an overview of these properties in Appendix A. I created software based on the Java Cryptography Extension (JCE) to carry out a comparative study, including a simple message-passing program for encrypted text. I implemented Paillier, Damgard-Jurik, and a variation of Paillier's scheme as a Provider using the JCE. These implementations use the CRT, along with other methods, to increase performance and create optimized algorithms. The implementations were plugged into the message-passing program together with an implementation of RSA from another Provider. A comparative study of the timings of these three schemes shows which one performs better in different circumstances. Conclusions are drawn based on the results of the tests, and my final opinions are stated.
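    The additively homomorphic property the report highlights can be shown in a few lines. The following is a toy Paillier implementation (tiny primes, no padding, no side-channel hardening), not the project's JCE Provider code: multiplying two ciphertexts modulo n² yields an encryption of the sum of the plaintexts.

```python
# A toy Paillier implementation to show the additively homomorphic property.
# Illustrative only: real deployments use ~1024-bit or larger primes.

import math, random

p, q = 1789, 1861                 # toy primes
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)      # Carmichael function of n = p*q
g = n + 1                         # standard simple choice of generator

def L(x):                         # the "L function": L(x) = (x - 1) / n
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # precomputed decryption constant

def encrypt(m):
    r = random.randrange(1, n)        # fresh randomness -> probabilistic
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(123), encrypt(456)
# Additive homomorphism: multiplying ciphertexts adds the plaintexts mod n.
assert decrypt((c1 * c2) % n2) == (123 + 456) % n
print("Dec(Enc(123) * Enc(456)) =", decrypt((c1 * c2) % n2))
```

    Decryption can also be accelerated by computing c^lam separately modulo p² and q² and recombining with the CRT, which is the style of optimization the report applies to the Damgard-Jurik algorithm as well.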