
    Compulsory Licensing, Innovation and Welfare

    This paper develops a three-stage model of innovation, fixed-fee licensing and production to evaluate the welfare effects of compulsory licensing, taking into account both static (information sharing) and dynamic (innovation incentive) effects. Compulsory licensing is shown to have an unambiguously positive impact on consumer surplus. Compulsory licensing has an ambiguous effect on total welfare, but it is more likely to increase total welfare in industries which are naturally less competitive. Furthermore, compulsory licensing can be an effective policy to safeguard the competitive process per se. These welfare results hold independently of whether R&D incentives in the absence of licensing favour the leading firm ('persistent dominance') or predict that the follower will overtake the incumbent ('action-reaction').

    Spatial competition and social welfare in the presence of non-monotonic network effects

    We study a spatial duopoly and extend the literature by giving joint consideration to non-monotonic network effects and endogenous firm location decisions. We show that the presence of network effects (capturing, for example, in-store rather than online sales) improves welfare whenever the total market size is not too large. This effect is lost if network effects are specified in a monotonic fashion, in which case isolating consumers from one another always reduces welfare. We also provide a new rationale for a duopoly to be welfare-preferred to monopoly: in large markets, splitting demand between two firms can reduce utility losses due to crowding.
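    As a rough illustration of the mechanism (this specification is illustrative, not taken from the paper), consider a standard Hotelling line where a consumer at location x who buys from firm i located at x_i receives

```latex
u_i(x) = v - t\,\lvert x - x_i \rvert - p_i + g(n_i),
\qquad g'(n) > 0 \ \text{for } n < \bar{n}, \quad g'(n) < 0 \ \text{for } n > \bar{n},
```

    where n_i is the mass of consumers buying from firm i. The inverted-U network term g captures in-store crowding: co-location with other consumers is valuable up to a threshold and harmful beyond it, which is why splitting demand between two firms can raise welfare in large markets.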

    Essays in competition policy, innovation and banking regulation

    This thesis investigates the optimal enforcement of competition policy in innovative industries and in the banking sector. Chapter 2 analyses the welfare impact of compulsory licensing in the context of unilateral refusals to license intellectual property. Compulsory licensing is shown unambiguously to increase consumer surplus. Compulsory licensing has an ambiguous effect on total welfare, but is more likely to increase total welfare in industries that are naturally less competitive. Compulsory licensing is also shown to be an effective policy to protect competition per se. The chapter also demonstrates the robustness of these results to alternative settings of R&D competition. Chapter 3 develops a much more general framework for the study of optimal competition policy enforcement in innovative industries. A major contribution of this chapter is to separate carefully a firm's decision to innovate from its decision to take some generic anti-competitive action. This allows us to differentiate between firms' counterfactual behaviour, according to whether or not they would have innovated in the absence of any potentially anti-competitive conduct. In contrast to the existing literature, it is shown that optimal policy will be harsher towards firms that have innovated in addition to taking a given anti-competitive action. Chapter 4 develops a framework for competition policy in the banking sector, which takes explicit account of capital regulation. In particular, conditions are derived under which increases in the capital requirement increase the incentives of banks to engage in a generic abuse of dominance in the loan market, and to exploit depositors through the sale of ancillary financial products. Thus the central contribution of this chapter is to clarify the conditions under which stability-focused capital regulation conflicts with competition and consumer protection policy in the banking sector.

    Optimizing illumination for precise multi-parameter estimations in coherent diffractive imaging

    Coherent diffractive imaging (CDI) is widely used to characterize structured samples from measurements of diffracted intensity patterns. We introduce a numerical framework to quantify the precision that can be achieved when estimating any given set of parameters characterizing the sample from measured data. The approach, based on the calculation of the Fisher information matrix, provides a clear benchmark to assess the performance of CDI methods. Moreover, by optimizing the Fisher information metric using deep learning optimization libraries, we demonstrate how to identify the optimal illumination scheme that minimizes the estimation error under specified experimental constraints. This work paves the way for an efficient characterization of structured samples at the sub-wavelength scale.
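    As a concrete illustration of the benchmark described in the abstract, the following is a minimal sketch, not the authors' code: it computes the Fisher information matrix and Cramer-Rao bounds for a toy 1-D far-field model with two unknown sample parameters under Poisson noise. The forward model, the Gaussian probe, and the use of PyTorch are all assumptions made for the example.

```python
import torch

def intensity(theta):
    # Hypothetical 1-D forward model: a Gaussian feature with unknown
    # amplitude and phase, illuminated by a Gaussian probe and propagated
    # to the far field; returns expected photon counts per detector pixel.
    amp, phi = theta[0], theta[1]
    x = torch.linspace(-1.0, 1.0, 256, dtype=torch.float64)
    probe = torch.exp(-((x / 0.5) ** 2)).to(torch.complex128)
    feature = torch.exp(-((x / 0.1) ** 2)).to(torch.complex128)
    obj = 1.0 + torch.polar(amp, phi) * feature
    return torch.fft.fft(obj * probe).abs() ** 2 + 1e-12

theta = torch.tensor([0.5, 0.3], dtype=torch.float64)
jac = torch.autograd.functional.jacobian(intensity, theta)  # (pixels, 2)
lam = intensity(theta)
fisher = jac.T @ (jac / lam[:, None])   # Poisson noise: I = J^T diag(1/lam) J
crlb = torch.linalg.inv(fisher).diagonal()  # Cramer-Rao variance lower bounds
print(crlb)
```

    Maximizing a scalar functional of this matrix (for example, its log-determinant) over the probe parameters, with the same autodiff machinery, is the kind of illumination optimization the abstract describes.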

    Noise-robust latent vector reconstruction in ptychography using deep generative models

    Computational imaging is increasingly vital for a broad spectrum of applications, ranging from biological to material sciences. This includes applications where the object is known and sufficiently sparse, allowing it to be described with a reduced number of parameters. When no explicit parameterization is available, a deep generative model can be trained to represent an object in a low-dimensional latent space. In this paper, we harness this dimensionality reduction capability of autoencoders to search for the object solution within the latent space rather than the object space. We demonstrate a novel approach to ptychographic image reconstruction by integrating a deep generative model obtained from a pre-trained autoencoder within an Automatic Differentiation Ptychography (ADP) framework. This approach enables the retrieval of objects from highly ill-posed diffraction patterns, offering an effective method for noise-robust latent vector reconstruction in ptychography. Moreover, the mapping into a low-dimensional latent space allows us to visualize the optimization landscape, which provides insight into the convexity and convergence behavior of the inverse problem. With this work, we aim to facilitate new applications for sparse computational imaging, such as when low radiation doses or rapid reconstructions are essential.
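    A minimal sketch of the idea, assuming a PyTorch setting; the decoder here is an untrained stand-in for the paper's pre-trained autoencoder, and the shapes, scan positions, and loss are illustrative only:

```python
import torch

torch.manual_seed(0)

# Stand-in decoder: in the paper's setting this would be the decoder of a
# pre-trained autoencoder; an untrained MLP is used here only to fix shapes.
decoder = torch.nn.Sequential(
    torch.nn.Linear(8, 128), torch.nn.Tanh(), torch.nn.Linear(128, 32 * 32)
)
decoder.requires_grad_(False)  # only the latent vector is optimized

probe = torch.ones(32, 32, dtype=torch.complex64)
shifts = [(0, 0), (8, 0), (0, 8), (8, 8)]  # hypothetical scan positions

def diffract(z, shift):
    # Object from the latent vector, translated (a circular shift stands in
    # for probe movement), multiplied by the probe, propagated to far field.
    obj = decoder(z).reshape(32, 32).to(torch.complex64)
    exit_wave = torch.roll(obj, shifts=shift, dims=(0, 1)) * probe
    return torch.fft.fft2(exit_wave).abs()

# Synthetic "measurements" generated from a hidden ground-truth latent vector.
z_true = torch.randn(8)
data = [diffract(z_true, s).detach() for s in shifts]

# Search in the 8-dimensional latent space instead of the object space.
z = torch.zeros(8, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(500):
    loss = sum(((diffract(z, s) - d) ** 2).mean() for s, d in zip(shifts, data))
    opt.zero_grad(); loss.backward(); opt.step()
```

    Because the search space has only a handful of dimensions, the loss landscape over pairs of latent coordinates can be plotted directly, which is what makes the optimization-landscape visualization mentioned above possible.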

    Efficient and flexible approach to ptychography using an optimization framework based on automatic differentiation

    Ptychography is a lensless imaging method that allows for wavefront sensing and phase-sensitive microscopy from a set of diffraction patterns. Recently, it has been shown that the optimization task in ptychography can be achieved via automatic differentiation (AD). Here, we propose an open-access AD-based framework implemented with TensorFlow, a popular machine learning library. Using simulations, we show that our AD-based framework performs comparably to a state-of-the-art implementation of the momentum-accelerated ptychographic iterative engine (mPIE) in terms of reconstruction speed and quality. AD-based approaches provide great flexibility, as we demonstrate by setting the reconstruction distance as a trainable parameter. Lastly, we experimentally demonstrate that our framework faithfully reconstructs a biological specimen.
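    The paper's open-access framework is TensorFlow-based; purely as an illustration of the trainable-distance idea, here is a minimal PyTorch sketch in which an angular-spectrum propagation distance is a leaf tensor optimized jointly with the object (grid size, wavelength, pixel pitch, and the placeholder data are assumptions):

```python
import torch

n, wl, dx = 64, 633e-9, 5e-6  # grid size, wavelength (m), pixel pitch (m)
fx = torch.fft.fftfreq(n, d=dx)
kx, ky = torch.meshgrid(fx, fx, indexing="ij")
kz = 2 * torch.pi * torch.sqrt(torch.clamp(1 / wl**2 - kx**2 - ky**2, min=0.0))

def propagate(field, z):
    # Angular-spectrum propagation over a distance z; differentiable in z.
    kernel = torch.polar(torch.ones_like(kz), kz * z)
    return torch.fft.ifft2(torch.fft.fft2(field) * kernel)

measured = torch.rand(n, n)  # placeholder for measured diffraction amplitudes
obj = torch.randn(n, n, dtype=torch.complex64, requires_grad=True)
z = torch.tensor(1.0e-3, requires_grad=True)  # reconstruction distance (m)
opt = torch.optim.Adam([obj, z], lr=1e-3)

for _ in range(200):
    loss = ((propagate(obj, z).abs() - measured) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

    Any other forward-model parameter can be promoted to a trainable leaf in the same way, which is the flexibility argument the abstract makes.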

    faulTPM: Exposing AMD fTPMs' Deepest Secrets

    Trusted Platform Modules constitute an integral building block of modern security features. Moreover, as Windows 11 made TPM 2.0 mandatory, they are subject to ever-increasing academic scrutiny. While discrete TPMs - as found in higher-end systems - have been susceptible to attacks on their exposed communication interface, more common firmware TPMs (fTPMs) are immune to this attack vector as they do not communicate with the CPU via an exposed bus. In this paper, we analyze a new class of attacks against fTPMs: attacking their Trusted Execution Environment can lead to a full TPM state compromise. We experimentally verify this attack by compromising the AMD Secure Processor, which constitutes the TEE for AMD's fTPMs. In contrast to previous dTPM sniffing attacks, this vulnerability exposes the complete internal TPM state of the fTPM. It allows us to extract any cryptographic material stored or sealed by the fTPM, regardless of authentication mechanisms such as Platform Configuration Register validation or passphrases with anti-hammering protection. First, we demonstrate the impact of our findings by - to the best of our knowledge - enabling the first attack against Full Disk Encryption solutions backed by an fTPM. Furthermore, we lay out how any application relying solely on the security properties of the TPM - like BitLocker's TPM-only protector - can be defeated by an attacker with 2-3 hours of physical access to the target device. Lastly, we analyze the impact of our attack on FDE solutions protected by a TPM and PIN strategy. While a naive implementation also leaves the disk completely unprotected, we find that BitLocker's FDE implementation retains some protection depending on the complexity of the used PIN. Our results show that when an fTPM's internal state is compromised, a TPM and PIN strategy for FDE is less secure than TPM-less protection with a reasonable passphrase.
    Comment: Both authors contributed equally. We publish all code necessary to mount the attack under https://github.com/PSPReverse/ftpm_attack. The repository further includes several intermediate results, e.g., flash memory dumps, to retrace the attack process without possessing the target boards and required hardware tools.

    EM-Fault It Yourself: Building a Replicable EMFI Setup for Desktop and Server Hardware

    Electromagnetic fault injection (EMFI) has become a popular fault injection (FI) technique due to its ability to inject faults precisely in both timing and location. Recently, ARM, RISC-V, and even x86 processing units in different packages were shown to be vulnerable to EMFI attacks. However, past publications lack a detailed description of the entire attack setup, hindering researchers and companies from easily replicating the presented attacks on their devices. In this work, we first show how to build an automated EMFI setup with high scanning resolution and good repeatability that is large enough to attack modern desktop and server CPUs. We lay out all details on mechanics, hardware, and software along with this paper. Second, we use our setup to attack a deeply embedded security co-processor in modern AMD systems on a chip (SoCs), the AMD Secure Processor (AMD-SP). Using a previously published code execution exploit, we run two custom payloads on the AMD-SP that utilize the SoC to different degrees. We then visualize these fault locations on SoC photographs, allowing us to reason about the SoC's components under attack. Finally, we show that the signature verification process of one of the first executed firmware parts is susceptible to EMFI attacks, undermining the security architecture of the entire SoC. To the best of our knowledge, this is the first reported EMFI attack against an AMD desktop CPU.
    Comment: This is the authors' version of the article accepted for publication at the IEEE International Conference on Physical Assurance and Inspection of Electronics (PAINE 2022).

    Wavelength-multiplexed Multi-mode EUV Reflection Ptychography based on Automatic Differentiation

    Ptychographic extreme ultraviolet (EUV) diffractive imaging has emerged as a promising candidate for next-generation metrology solutions in the semiconductor industry, as it can image wafer samples in reflection geometry at the nanoscale. The technique has recently attracted growing attention, owing to significant progress in high-harmonic generation (HHG) EUV sources and advancements in both hardware and software for computation. In this study, a novel algorithm is introduced and tested which enables wavelength-multiplexed reconstruction, enhancing measurement throughput and introducing data diversity that allows the accurate characterisation of sample structures. To tackle the inherent instabilities of the HHG source, a modal approach was adopted, which represents the cross-spectral density function of the illumination by a series of mutually incoherent and independent spatial modes. The proposed algorithm was implemented on a mainstream machine learning platform, which leverages automatic differentiation to manage the drastic growth in model complexity and expedites the computation using GPU acceleration. By optimising over 200 million parameters, we demonstrate the algorithm's capacity to accommodate experimental uncertainties and achieve a resolution approaching the diffraction limit in reflection geometry. The reconstruction of wafer samples with 20-nm-high patterned gold structures on a silicon substrate highlights our ability to handle complex physical interrelations involving a multitude of parameters. These results establish ptychography as an efficient and accurate metrology tool.
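    A minimal sketch of the modal forward model described above, assuming a PyTorch setting rather than the authors' implementation; the shapes, mode counts, and single placeholder pattern are illustrative:

```python
import torch

n, n_wl, n_modes = 64, 3, 4  # grid size, multiplexed harmonics, probe modes

# Mutually incoherent probe modes per wavelength (the modal representation
# of the partially coherent illumination); all are trainable leaf tensors.
probes = torch.randn(n_wl, n_modes, n, n, dtype=torch.complex64,
                     requires_grad=True)
objects = torch.randn(n_wl, n, n, dtype=torch.complex64, requires_grad=True)

def model_intensity(objects, probes):
    # Coherent propagation per mode and wavelength, then an incoherent sum:
    # modes and wavelengths add in intensity, not in amplitude.
    exit_waves = objects[:, None, :, :] * probes      # (wl, mode, n, n)
    farfield = torch.fft.fft2(exit_waves)
    return (farfield.abs() ** 2).sum(dim=(0, 1))

measured = torch.rand(n, n)  # placeholder for one diffraction pattern
opt = torch.optim.Adam([probes, objects], lr=1e-3)
for _ in range(200):
    loss = ((model_intensity(objects, probes) - measured) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

    Scaling this graph to many scan positions, wavelengths, and modes is what drives the parameter count into the hundreds of millions and makes GPU-accelerated automatic differentiation the natural tool.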