285 research outputs found

    Learning a Static Analyzer from Data

    To be practically useful, modern static analyzers must precisely model the effects of both the statements of the programming language and the frameworks used by the program under analysis. While important, manually addressing these challenges is difficult for at least two reasons: (i) the effects on the overall analysis can be non-trivial, and (ii) as the size and complexity of modern libraries increase, so does the number of cases the analysis must handle. In this paper we present a new, automated approach for creating static analyzers: instead of manually providing the various inference rules of the analyzer, the key idea is to learn these rules from a dataset of programs. Our method consists of two ingredients: (i) a synthesis algorithm capable of learning a candidate analyzer from a given dataset, and (ii) a counter-example-guided learning procedure that generates new programs beyond those in the initial dataset, which is critical for discovering corner cases and ensuring that the learned analysis generalizes to unseen programs. We implemented our approach and instantiated it for the task of learning JavaScript static analysis rules for a subset of points-to analysis and for allocation-site analysis. These are challenging yet important problems that have received significant research attention. We show that our approach is effective: our system automatically discovered practical and useful inference rules for many cases that are tricky to identify manually and are missed by state-of-the-art, manually tuned analyzers.
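    As a toy illustration of the counter-example-guided loop sketched above (the dataset items, rule synthesizer, program generator, and oracle below are hypothetical stand-ins, not the system described in the paper), the learner alternates between fitting a candidate rule to labeled programs and adding back any freshly generated programs on which the candidate disagrees with the oracle:

```python
# Hypothetical sketch of a counter-example-guided learning loop.
# The toy "programs" are integers and the "analysis fact" is a divisibility
# property; these stand in for real programs and analysis rules.
import random

def oracle(program):
    """Ground-truth analysis fact for a program (stand-in for a precise analysis)."""
    return program % 3 == 0

def synthesize(dataset):
    """Pick the simplest candidate rule consistent with every labeled example."""
    candidates = [lambda p, k=k: p % k == 0 for k in range(2, 10)]
    for rule in candidates:
        if all(rule(p) == label for p, label in dataset):
            return rule
    return None

def generate_programs(n=200):
    """Stand-in for a program generator used to hunt for corner cases."""
    return [random.randrange(1000) for _ in range(n)]

def learn_analyzer(initial_dataset, max_iters=20):
    dataset = list(initial_dataset)
    for _ in range(max_iters):
        rule = synthesize(dataset)
        if rule is None:
            raise RuntimeError("no candidate rule fits the dataset")
        # Look for counter-examples on freshly generated programs.
        counter = [p for p in generate_programs() if rule(p) != oracle(p)]
        if not counter:
            return rule                                   # generalizes on sampled programs
        dataset += [(p, oracle(p)) for p in counter[:10]]  # grow the dataset and retry
    return rule

if __name__ == "__main__":
    seed = [(p, oracle(p)) for p in (1, 3, 6, 7)]
    learned = learn_analyzer(seed)
    print(learned(9), learned(10))  # True False
```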

    Suggestions for improving the design of clinical trials in multiple sclerosis - results of a systematic analysis of completed phase III trials

    This manuscript reviews the primary and secondary endpoints of pivotal phase III trials with immunomodulatory drugs in multiple sclerosis (MS). Considering the limitations of previous trial designs, we propose new standards for the planning of clinical trials, taking into account the latest insights into MS pathophysiology and patient-relevant aspects. Using a systematic overview of published phase III (pivotal) trials performed as part of applications for drug market approval, we evaluate the following characteristics: trial duration, number of trial participants, comparators, and endpoints (primary, secondary, magnetic resonance imaging outcomes, and patient-reported outcomes). From a patient perspective, the primary and secondary endpoints of clinical trials are only partially relevant. High-quality trial data on efficacy and safety that extend beyond the time frame of pivotal trials are almost non-existent, and understanding of the long-term benefits and risks of disease-modifying MS therapy is largely lacking. Concrete proposals for the trial designs of relapsing (remitting) multiple sclerosis/clinically isolated syndrome, primary progressive multiple sclerosis, and secondary progressive multiple sclerosis (e.g., study duration, mechanism of action, and choice of endpoints) are presented based on the results of the systematic overview. Given the increasing number of available immunotherapies, the therapeutic strategy in MS has shifted from a mere "relapse-prevention" approach to the personalized provision of medical care regarding the choice of appropriate drugs and their sequential application over the course of the disease. This personalized provision takes into consideration patient preferences as well as disease-related factors, such as objective clinical and radiographic findings, but also highly burdensome symptoms such as fatigue, depression, and cognitive impairment. Future trial designs in MS will have to assign higher relevance to these patient-reported outcomes and will also have to implement surrogate measures that can serve as predictive markers of individual treatment response to new and investigational immunotherapies. This is an indispensable prerequisite for maximizing the benefit to individual patients participating in clinical trials. Moreover, appropriate trial designs and suitable enrolment criteria that correspond to the mode of action of the study drug will facilitate the targeted prevention of adverse events, thus mitigating risks for individual study participants.

    Formal Verification of Neural Network Controlled Autonomous Systems

    In this paper, we consider the problem of formally verifying the safety of an autonomous robot equipped with a Neural Network (NN) controller that processes LiDAR images to produce control actions. Given a workspace that is characterized by a set of polytopic obstacles, our objective is to compute the set of safe initial conditions such that a robot trajectory starting from these initial conditions is guaranteed to avoid the obstacles. Our approach is to construct a finite-state abstraction of the system and use standard reachability analysis over this abstraction to compute the set of safe initial states. The first technical problem in computing the finite-state abstraction is to mathematically model the imaging function that maps the robot position to the LiDAR image. To that end, we introduce the notion of imaging-adapted sets as partitions of the workspace in which the imaging function is guaranteed to be affine. We develop a polynomial-time algorithm that partitions the workspace into imaging-adapted sets and computes the corresponding affine imaging functions. Given this workspace partitioning, discrete-time linear dynamics of the robot, and a pre-trained NN controller with Rectified Linear Unit (ReLU) nonlinearities, the second technical challenge is to analyze the behavior of the neural network. To that end, we utilize a Satisfiability Modulo Convex (SMC) encoding to enumerate all possible segments of the different ReLUs. SMC solvers combine a Boolean satisfiability solver with a convex programming solver to decompose the problem into smaller subproblems. To accelerate this process, we develop a pre-processing algorithm that rapidly prunes the space of feasible ReLU segments. Finally, we demonstrate the efficiency of the proposed algorithms using numerical simulations with neural network controllers of increasing complexity.
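    The case analysis underlying the ReLU-enumeration step can be illustrated with a minimal sketch (a toy one-hidden-layer network in NumPy, not the paper's SMC encoding): fixing which ReLUs are active turns the network into an affine map, so a verifier can reason about one affine piece at a time:

```python
# Illustrative only: every assignment of "active"/"inactive" to the hidden ReLUs
# of a one-hidden-layer network fixes an affine input-output map, which is what
# makes exhaustive case analysis over ReLU phases possible.
import itertools
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 2)), rng.standard_normal(4)   # hidden layer
W2, b2 = rng.standard_normal((1, 4)), rng.standard_normal(1)   # output layer

def affine_piece(pattern):
    """Effective affine map y = A x + c valid while `pattern` (0/1 per ReLU) holds."""
    D = np.diag(pattern)                 # keeps active units, zeroes the rest
    return W2 @ D @ W1, W2 @ D @ b1 + b2

# Enumerate all 2^4 ReLU phase combinations, the cases a solver would split on.
pieces = {p: affine_piece(np.array(p)) for p in itertools.product((0, 1), repeat=4)}

# Sanity check: the affine piece selected by a point's own activation pattern
# reproduces the exact network output at that point.
x = np.array([0.3, -0.7])
h = W1 @ x + b1
pattern = tuple(int(v) for v in (h > 0))
A, c = pieces[pattern]
assert np.allclose(A @ x + c, W2 @ np.maximum(h, 0) + b2)
print("pattern", pattern, "output", A @ x + c)
```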

    Quantum interference from remotely trapped ions

    We observe quantum interference of photons emitted by two continuously laser-excited single ions, independently trapped in distinct vacuum vessels. High-contrast two-photon interference is observed in two experiments with different ion species, calcium and barium. Our experimental findings are quantitatively reproduced by Bloch equation calculations. In particular, we show that the coherence of the individual resonance fluorescence light field is determined from the observed interference.
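    A minimal numerical sketch of the kind of Bloch-equation calculation mentioned above (a generic driven two-level emitter with illustrative parameters, not the authors' model of the calcium and barium ions): the steady-state solution yields the coherently scattered fraction of the resonance fluorescence, the quantity that governs the achievable two-photon interference contrast:

```python
# Steady state of the optical Bloch equations for a driven two-level emitter.
# The ratio |rho_ge|^2 / rho_ee is the coherent fraction of the fluorescence.
import numpy as np

def steady_state(omega, gamma, delta=0.0):
    """Stationary Bloch vector (u, v, w) for Rabi frequency omega, decay gamma."""
    M = np.array([[-gamma / 2,  delta,      0.0  ],
                  [-delta,     -gamma / 2,  omega],
                  [ 0.0,       -omega,     -gamma]])
    # d(u, v, w)/dt = M @ (u, v, w) + (0, 0, -gamma); set the derivative to zero.
    u, v, w = np.linalg.solve(M, np.array([0.0, 0.0, gamma]))
    rho_ee = (1 + w) / 2                             # excited-state population
    coherent_fraction = (u**2 + v**2) / 4 / rho_ee   # |rho_ge|^2 / rho_ee
    return rho_ee, coherent_fraction

if __name__ == "__main__":
    gamma = 1.0
    for omega in (0.2, 1.0, 5.0):
        pop, frac = steady_state(omega, gamma)
        print(f"Omega/Gamma = {omega:4.1f}  rho_ee = {pop:.3f}  coherent fraction = {frac:.3f}")
```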

    On the lease rate, convenience yield and speculative effects in the gold futures market

    By examining data on the gold forward offered rate (GOFO) and lease rates over the period 1996-2009, we conclude that the convenience yield of gold is better approximated by the lease rate than by the interest-adjusted spread of Fama & French (1983). Using the latter quantity, we study the relationship between gold leasing and the level of COMEX discretionary inventory and show that lease rates are negatively related to inventories. We also show that futures prices have increasingly exceeded forward prices over the period, and that this effect increases with speculative pressure and the maturity of the contracts.
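    For orientation, a small worked example with made-up numbers (simplified continuous-compounding conventions, not the data or day-count conventions used in the study): the lease rate is the gap between the funding rate and GOFO, and the Fama-French style interest-adjusted spread is the annualized basis net of the interest rate:

```python
# Illustrative calculation with hypothetical market quotes, not data from the study.
import math

def lease_rate(libor, gofo):
    """Gold lease rate as the gap between the funding rate and GOFO."""
    return libor - gofo

def interest_adjusted_spread(futures, spot, risk_free, maturity_years):
    """Annualized log basis net of the interest rate: cost of carry minus
    convenience yield under a simple continuous-compounding model."""
    return math.log(futures / spot) / maturity_years - risk_free

if __name__ == "__main__":
    libor, gofo = 0.030, 0.025            # 3.0% funding rate, 2.5% GOFO
    spot, futures, T = 900.0, 910.0, 0.5  # USD/oz spot and 6-month contract
    print("lease rate            %.2f%%" % (100 * lease_rate(libor, gofo)))
    print("interest-adj. spread  %.2f%%" % (100 * interest_adjusted_spread(futures, spot, libor, T)))
```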

    Optomagnetic composite medium with conducting nanoelements

    A new type of metal-dielectric composite is proposed that is characterised by a resonance-like behaviour of the effective permeability in the infrared and visible spectral ranges. This material can be referred to as an optomagnetic medium. The analytical formalism developed is based on solving the scattering problem for the considered inclusions with an impedance boundary condition, which yields the current and charge distributions within the inclusions. The presence of the effective magnetic permeability and its resonant properties lead to novel optical effects and open up new possible applications.
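    As a qualitative illustration only (the generic Lorentz-type resonance form with invented parameters, not the impedance-boundary-condition formalism developed in the paper), a resonant effective permeability of this kind can be evaluated across the infrared range as follows:

```python
# Generic resonant-permeability model: reproduces the qualitative resonance-like
# behaviour of mu_eff near an optical-range magnetic resonance.
import numpy as np

def mu_eff(omega, omega0, gamma, filling=0.3):
    """Effective permeability with a magnetic resonance at angular frequency omega0."""
    return 1.0 + filling * omega**2 / (omega0**2 - omega**2 - 1j * gamma * omega)

if __name__ == "__main__":
    omega0 = 2 * np.pi * 300e12           # resonance near 300 THz (infrared), illustrative
    gamma = 0.05 * omega0                 # illustrative damping
    for f_thz in (200, 290, 300, 310, 400):
        omega = 2 * np.pi * f_thz * 1e12
        mu = mu_eff(omega, omega0, gamma)
        print(f"{f_thz:4d} THz  mu_eff = {mu.real:+.2f} {mu.imag:+.2f}i")
```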

    Observation of squeezed light from one atom excited with two photons

    Single quantum emitters such as atoms are well known as non-classical light sources that can produce photons one by one at given times, with reduced intensity noise. However, the light field emitted by a single atom can exhibit much richer dynamics. A prominent example is the predicted ability of a single atom to produce quadrature-squeezed light, with sub-shot-noise amplitude or phase fluctuations. It has long been foreseen, though, that such squeezing would be "at least an order of magnitude more difficult" to observe than the emission of single photons. Squeezed beams have been generated using macroscopic and mesoscopic media down to a few tens of atoms, but despite experimental efforts, single-atom squeezing has so far escaped observation. Here we generate squeezed light with a single atom in a high-finesse optical resonator. The strong coupling of the atom to the cavity field induces a genuine quantum-mechanical nonlinearity, several orders of magnitude larger than for usual macroscopic media. This produces observable quadrature squeezing with an excitation beam containing, on average, only two photons per system lifetime. In sharp contrast to the emission of single photons, the squeezed light stems from the quantum coherence of photon pairs emitted from the system. The ability of a single atom to induce strong coherent interactions between propagating photons opens up new perspectives for photonic quantum logic with single emitters.
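    A hedged sketch of how quadrature squeezing can be computed for a driven atom-cavity system (this assumes the QuTiP library and uses illustrative parameters, not those of the experiment): the steady-state variances of the intracavity field quadratures are compared with the vacuum value of 1/4:

```python
# Steady-state quadrature variances of the cavity field for a weakly driven,
# strongly coupled atom-cavity system (illustrative parameters).
import numpy as np
import qutip as qt

N = 15                                     # Fock-space truncation for the cavity
a = qt.tensor(qt.destroy(N), qt.qeye(2))   # cavity annihilation operator
sm = qt.tensor(qt.qeye(N), qt.sigmam())    # atomic lowering operator

g, kappa, gamma, eta = 10.0, 1.0, 1.0, 0.2      # coupling, cavity/atom decay, drive
H = g * (a.dag() * sm + a * sm.dag()) + eta * (a + a.dag())   # resonant drive

rho = qt.steadystate(H, [np.sqrt(2 * kappa) * a, np.sqrt(gamma) * sm])

X = (a + a.dag()) / 2                      # amplitude quadrature
Y = (a - a.dag()) / (2j)                   # phase quadrature
for name, q in (("X", X), ("Y", Y)):
    var = qt.expect(q * q, rho) - qt.expect(q, rho) ** 2
    print(f"Var({name}) = {var.real:.4f}   (vacuum value: 0.2500)")
```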

    Fast cavity-enhanced atom detection with low noise and high fidelity

    Cavity quantum electrodynamics describes the fundamental interactions between light and matter, and how they can be controlled by shaping the local environment. For example, optical microcavities allow high-efficiency detection and manipulation of single atoms. In this regime, fluctuations in atom number are of the order of the mean number, which can lead to signal fluctuations in excess of the noise on the incident probe field. Conversely, we demonstrate that nonlinearities and multi-atom statistics can together serve to suppress the effects of atomic fluctuations when making local density measurements on clouds of cold atoms. We measure atom densities below 1 per cavity mode volume near the photon shot-noise limit. This is in direct contrast to previous experiments, where fluctuations in atom number contribute significantly to the noise. Atom detection is shown to be fast and efficient, reaching fidelities in excess of 97% after 10 µs and 99.9% after 30 µs.
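    A simple shot-noise-limited counting model (with hypothetical count rates, not the experiment's) illustrates how detection fidelity grows with integration time: photons transmitted through the cavity are counted for a time tau, and thresholding the count decides whether an atom is present:

```python
# Poisson-counting model of thresholded atom detection (illustrative rates only).
from math import exp, factorial

def poisson_cdf(k, mean):
    """P(counts <= k) for a Poisson distribution with the given mean."""
    return sum(mean**i * exp(-mean) / factorial(i) for i in range(k + 1))

def detection_fidelity(rate_empty, rate_atom, tau, threshold):
    """Average probability of correctly classifying 'atom' vs 'no atom'."""
    p_correct_empty = 1 - poisson_cdf(threshold, rate_empty * tau)  # counts stay high
    p_correct_atom = poisson_cdf(threshold, rate_atom * tau)        # counts stay low
    return 0.5 * (p_correct_empty + p_correct_atom)

if __name__ == "__main__":
    rate_empty, rate_atom = 2.0, 0.2     # counts per microsecond (hypothetical)
    for tau in (5, 10, 30):              # integration time in microseconds
        thr = int((rate_empty + rate_atom) / 2 * tau)  # mid-point count threshold
        fid = detection_fidelity(rate_empty, rate_atom, tau, thr)
        print(f"tau = {tau:2d} us  fidelity = {fid:.4f}")
```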