
    Innovation and diffusion in risky industries under liability law: the case of “double-impact” innovations.

    We propose a model of innovation and diffusion of a new technology in which two firms, one innovative and one non-innovative, undertake risky activities regulated by liability rules. A distinctive feature of this study is the presence of a “double-impact” innovation, which affects both the cost of risk prevention and the probability of accident. We compare strict liability and negligence in terms of incentives to innovate, to adopt the new technology, and to prevent the risk. We find that the type of innovation and the behavior of the Regulator play key roles: when the Regulator acts as a “leader”, a negligence rule is socially preferable if the innovation mainly affects the cost of risk prevention. In other cases (Regulator as a “follower” and/or innovation with a sufficiently high impact on the probability of accident), strict liability is preferable.
    Keywords: innovation, technological risk, strict liability, negligence.

    Large-scale risks and technological change: What about limited liability?

    We consider a firm that must choose a technology to produce a given good. This technology drives a multiplicative large-scale risk of incident for society: the total potential level of damage increases with the level of activity. Contrary to what is often argued in the literature, we show that limited liability can provide stronger incentives for technical change than an unlimited liability rule, depending on the magnitude of the technological change and on the firm's size. In the second part of the paper, taxes are introduced. We show how adjusting the tax rate to the technological choice made by the firm further enlarges the set of parameters that lead to technological change under a limited liability rule. Our normative results provide some arguments in favor of the limited liability rule, often considered the main explanation of partial large-scale risk internalization by firms.
    Keywords: technological risk, limited liability, incentives, technical choice, taxes.

    Load management strategy for Particle-In-Cell simulations in high energy particle acceleration

    In the wake of the intense effort made for the experimental CILEX project, numerical simulation campaigns have been carried out in order to finalize the design of the facility and to identify optimal laser and plasma parameters. These simulations bring, of course, important insight into the fundamental physics at play. As a by-product, they also characterize the quality of our theoretical and numerical models. In this paper, we compare the results given by different codes and point out algorithmic limitations both in terms of physical accuracy and computational performance. These limitations are illustrated in the context of electron laser wakefield acceleration (LWFA). The main limitation we identify in state-of-the-art Particle-In-Cell (PIC) codes is computational load imbalance. We propose an innovative algorithm to deal with this specific issue, as well as milestones towards a modern, accurate, high-performance PIC code for high energy particle acceleration.
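    The load-imbalance problem the abstract identifies can be illustrated with a toy example: in a PIC simulation, particles cluster in a few cells, so splitting the domain into equal numbers of cells gives some workers far more particles than others. The sketch below greedily assigns contiguous cells to workers by particle count instead. This is a hypothetical 1D illustration, not the paper's algorithm.

    ```python
    # Toy illustration of load balancing for a Particle-In-Cell domain:
    # assign contiguous cells to workers by particle load, not cell count.
    def balanced_partition(particles_per_cell, n_workers):
        """Greedily close a worker's chunk once it reaches the average load."""
        target = sum(particles_per_cell) / n_workers
        chunks, current, load = [], [], 0
        for cell, p in enumerate(particles_per_cell):
            current.append(cell)
            load += p
            if load >= target and len(chunks) < n_workers - 1:
                chunks.append(current)
                current, load = [], 0
        chunks.append(current)
        return chunks

    # A strongly peaked particle distribution, as around a wakefield bubble:
    cells = [1, 1, 2, 50, 60, 55, 2, 1, 1, 1]
    parts = balanced_partition(cells, 4)
    loads = [sum(cells[c] for c in chunk) for chunk in parts]
    print(parts, loads)
    ```

    With this distribution, an equal-cells split would give one worker over 100 particles; the load-aware split caps the heaviest worker at the single busiest cell.
    
    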

    Economic Analysis of Liability Apportionment Among Multiple Tortfeasors: A Survey, and Perspectives in Large-Scale Risks Management

    The economic analysis of civil liability aims to show how the civil liability system can be designed to give potential injurers optimal incentives to regulate the level of risk they generate. However, despite a wide range of applications, there are few studies on the apportionment of liability among several tortfeasors. In this article, we focus on an industrial activity involving a firm, whose activity is potentially harmful to society, and one of its input providers. Both have an impact on the level of risk through their effort in care and quality. After highlighting the originality of our contribution within this literature, we propose an efficient sharing rule. We show that this rule of apportionment depends on the relative solvency of the agents and, more importantly, crucially depends on the market relationship that links the two contributors, thus calling for collaboration between the competition agency, the legislatures, and the courts.

    The recreational value of forests in France: a travel-cost approach

    Forests are an important element in most countries, and their multifunctional character explains their great value. This study aims at giving a monetary value to one of these functions, which occupies a special place in our urban societies: recreational use. We use the individual approach of the travel-cost method (TCM), based on a 2001 telephone survey of more than 4,500 households across the whole French territory. This type of survey, together with the zero-inflated count-data models used in the econometric analysis, makes it possible to take non-visitors into account. To reflect the heterogeneity of forests in France, the sample is segmented into nine inter-regions forming coherent forest sets. The individual surplus per visit is calculated for each region, showing variations in the recreational value of forests in France (from 0 to 47 €).
    Keywords: forest, national survey, recreational value, travel-cost method, zero-inflated count-data models.
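    The zero-inflated count-data model mentioned above mixes two processes: a share of respondents who never visit a forest (structural zeros) and a count process for those who do. A minimal sketch of the zero-inflated Poisson probability mass function, with illustrative parameter values rather than estimates from the survey:

    ```python
    # Zero-inflated Poisson: pi is the probability of being a non-visitor,
    # lam is the Poisson rate of visits among potential visitors.
    from math import exp, factorial

    def zip_pmf(k, pi, lam):
        """P(visits = k) under a zero-inflated Poisson."""
        poisson = exp(-lam) * lam**k / factorial(k)
        if k == 0:
            return pi + (1 - pi) * poisson   # structural + sampling zeros
        return (1 - pi) * poisson

    pi, lam = 0.4, 2.0   # hypothetical: 40% never visit; visitors average 2 trips
    p0 = zip_pmf(0, pi, lam)
    print(round(p0, 3))   # probability of observing zero visits
    ```

    The point is that the observed zeros (here, about 48% of respondents) exceed what a plain Poisson with the same mean would predict, which is exactly why the study needs the zero-inflated variant to accommodate non-visitors.
    
    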

    Convolutional Kernel Networks for Graph-Structured Data

    We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods. Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps, where each node carries information about local graph substructures. On the one hand, the kernel point of view offers an unsupervised, expressive, and easy-to-regularize data representation, which is useful when limited samples are available. On the other hand, our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks. We show that our method achieves competitive performance on several graph classification benchmarks, while offering simple model interpretation. Our code is freely available at https://github.com/claying/GCKN
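    The core idea of representing each node by a kernel feature map over its local substructure can be illustrated very loosely: aggregate a node's 1-hop neighborhood features, then embed the result by Gaussian-kernel comparisons against a small set of anchor points (a Nyström-style approximation). This is a simplified toy, not the GCKN model itself; all names and shapes below are hypothetical.

    ```python
    # Toy node-level kernel feature map: neighborhood averaging followed by
    # Gaussian-kernel comparisons to anchor points (Nystrom-style sketch).
    import numpy as np

    def node_feature_map(X, adj, anchors, sigma=1.0):
        """X: (n, d) node features; adj: (n, n) adjacency matrix;
        anchors: (m, d) reference points. Returns (n, m) embeddings."""
        deg = adj.sum(axis=1, keepdims=True) + 1.0
        agg = (X + adj @ X) / deg                 # average over 1-hop neighborhood
        d2 = ((agg[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))       # Gaussian kernel values in (0, 1]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))                   # 5 nodes, 3 features each
    adj = np.zeros((5, 5))
    adj[0, 1] = adj[1, 0] = adj[1, 2] = adj[2, 1] = 1.0
    anchors = rng.normal(size=(4, 3))             # 4 anchor points
    Z = node_feature_map(X, adj, anchors)
    print(Z.shape)                                # one 4-dim embedding per node
    ```

    Stacking such maps layer by layer is what gives the multilayer kernels their resemblance to graph convolutional networks.
    
    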

    How Histone Deacetylases Control Myelination

    Myelinated axons are a beautiful example of symbiotic interactions between two cell types: Myelinating glial cells organize axonal membranes and build their myelin sheaths to allow fast action potential conduction, while axons regulate myelination and enhance the survival of myelinating cells. Axonal demyelination, occurring in neurodegenerative diseases or after a nerve injury, results in severe motor and/or mental disabilities. Thus, understanding how the myelination process is induced, regulated, and maintained is crucial to develop new therapeutic strategies for regeneration in the nervous system. Epigenetic regulation has recently been recognized as a fundamental contributing player. In this review, we focus on the central mechanisms of gene regulation mediated by histone deacetylation and other key functions of histone deacetylases in Schwann cells and oligodendrocytes, the myelinating glia of the peripheral and central nervous systems.

    Efficient RNA Isoform Identification and Quantification from RNA-Seq Data with Network Flows

    Several state-of-the-art methods for isoform identification and quantification are based on l1-regularized regression, such as the Lasso. However, explicitly listing the (possibly exponentially) large set of candidate transcripts is intractable for genes with many exons. For this reason, existing approaches using the l1-penalty are either restricted to genes with few exons, or only run the regression algorithm on a small set of pre-selected isoforms. We introduce a new technique called FlipFlop, which can efficiently tackle the sparse estimation problem on the full set of candidate isoforms by using network flow optimization. Our technique removes the need for a preselection step, leading to better isoform identification while keeping a low computational cost. Experiments with synthetic and real RNA-Seq data confirm that our approach is more accurate than alternative methods and one of the fastest available. Source code is freely available as an R package from the Bioconductor web site (http://www.bioconductor.org/) and more information is available at http://cbio.ensmp.fr/flipflop
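    The l1-regularized regression at the heart of such methods can be sketched on synthetic data: read counts are modeled as a sparse nonnegative combination of candidate-isoform signatures, and the Lasso selects the few isoforms actually expressed. This is an illustration of the general approach, not FlipFlop itself, which replaces the explicit enumeration of candidates with a network-flow solver; the matrix and abundances below are made up.

    ```python
    # Sparse isoform selection via the Lasso on synthetic data.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(42)
    n_positions, n_isoforms = 200, 20
    # Binary "signatures": which genomic positions each candidate isoform covers.
    A = rng.integers(0, 2, size=(n_positions, n_isoforms)).astype(float)
    beta_true = np.zeros(n_isoforms)
    beta_true[[2, 7]] = [5.0, 3.0]            # only two isoforms truly expressed
    y = A @ beta_true + 0.1 * rng.normal(size=n_positions)   # noisy read counts

    model = Lasso(alpha=0.05, positive=True)  # abundances must be nonnegative
    model.fit(A, y)
    selected = np.flatnonzero(model.coef_ > 0.1)
    print(selected)                           # indices of selected isoforms
    ```

    The difficulty the abstract points to is that the number of columns of `A` grows exponentially with the number of exons, which is why enumerating them, as done here, does not scale.
    
    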