
    Sickness and injury leave in France: moral hazard or strain?

    From 1997 to 2001, total payments compensating for sickness and injury leave increased dramatically in France. Since this change coincided with a decrease in the unemployment rate, three hypotheses can be proposed as possible explanations, consistent with the literature: moral hazard (workers are less afraid of losing their jobs and therefore take sickness leave more readily); strain (workers work longer hours or under more stringent rules); and a labour-force composition effect (less healthy individuals are incorporated into the labour force). We investigate the first two explanations using a household survey (ESPS) enriched with claims data on sickness leave from compulsory health insurance funds (EPAS). We model separately the number of leaves per individual (cumulative logit) and the duration of leaves (random-effect model). According to our findings, in France the individual propensity to take sickness leave is mainly influenced by strain in the workplace and by a labour-force composition effect. The conditional duration of spells is not well explained at the individual level: the only significant factor is usual weekly work duration. The influence of moral hazard is not clearly ascertained: it has little impact on the occurrence of leave and no impact on duration.
    Keywords: Sickness, Labour Force

    A refutation of the practice style hypothesis: the case of antibiotics prescription by French general practitioners for acute rhinopharyngitis

    Many studies in France and abroad have highlighted the medical practice variation (MPV) phenomenon, or even the inappropriateness of certain medical decisions. There is no consensus on the origin of this MPV between preference-centred versus opportunities-and-constraints approaches. The principal purpose of this study is to refute the hypothesis that physicians adopt a uniform practice style for their patients, applied to each similar clinical decision over time. More specifically, multilevel models are estimated: first, to measure the variability of antibiotics prescription by French general practitioners for acute rhinopharyngitis, a clinical decision-making context with weak uncertainty, and to test its significance; second, to prioritise its determinants, especially those relating to the GP or his or her practice-setting environment, while controlling for visit- or patient-level confounders. The study was based on the 2001 activity data, supplemented by an ad hoc questionnaire, of a sample of 778 GPs drawn from a panel of 1,006 computerised French GPs. We observe that a great part of the total variation was due to intra-physician variability (70%). Hence, in the French general practice context, we find empirical support for rejecting the ‘practice style’, ‘enthusiasm’ and ‘surgical signature’ hypotheses. It is thus patients' characteristics that largely explain prescription, even if physicians' characteristics (area of practice, level of activity, network participation, participation in ongoing medical training) and environmental factors (recent visits from pharmaceutical sales representatives) also exert considerable influence. The latter suggests that MPV is partly caused by differences in the type of dissemination or diffusion of information.
Such findings may help us to develop and identify facilitators for promoting better use of antibiotics in France and, more generally, for influencing GPs' practice when it is of interest.
Keywords: Medical practice variation, Multilevel analysis, Upper respiratory tract infections, Rhinopharyngitis, Antibiotics, General practitioners, Panel, France

    Interfacing the Network: An Embedded Approach to Network Instrument Creation

    This paper discusses the design, construction, and development of a multi-site collaborative instrument, The Loop, developed by the JacksOn4 collective during 2009-10 and formally presented in Oslo at the arts.on.wires and NIME conferences in 2011. The development of this instrument is primarily a reaction to historical network performance that either attempts to present traditional acoustic practice in a distributed format or utilises the network as a conduit to shuttle acoustic and performance data among participant nodes. In both scenarios the network is an integral and indispensable part of the performance; however, the network is not perceived as an instrument per se. The Loop is an attempt to create a single, distributed hybrid instrument retaining traditionally acoustic interfaces and resonant bodies that are mediated by the network. The embedding of the network into the body of the instrument raises many practical and theoretical questions, which are explored in this paper through a reflection upon the notion of the distributed instrument and the way in which its design impacts the behaviour of the participants (performers and audiences); the mediation of musical expression across networks; the bi-directional relationship between instrument and design; as well as how the instrument assists in the realisation of the creators’ compositional and artistic goals.

    Twins and their boundaries during homoepitaxy on Ir(111)

    Full text link
    The growth and annealing behavior of strongly twinned homoepitaxial films on Ir(111) has been investigated by scanning tunneling microscopy, low energy electron diffraction and surface X-ray diffraction. In situ surface X-ray diffraction during and after film growth turned out to be an efficient tool for determining twin fractions in multilayer films and for uncovering the nature of side twin boundaries. The annealing of the twin structures is shown to take place in a two-step process, first reducing the length of the boundaries between differently stacked areas and only then the twins themselves. A model for the structure of the side twin boundaries is proposed which is consistent with both the scanning tunneling microscopy and surface X-ray diffraction data.
    Comment: 13 pages, 11 figures

    Permutation Inference for Canonical Correlation Analysis

    Canonical correlation analysis (CCA) has become a key tool for population neuroimaging, allowing investigation of associations between many imaging and non-imaging measurements. As other variables are often a source of variability not of direct interest, previous work has used CCA on residuals from a model that removes these effects, then proceeded directly to permutation inference. We show that such a simple permutation test leads to inflated error rates. The reason is that residualisation introduces dependencies among the observations that violate the exchangeability assumption. Even in the absence of nuisance variables, however, a simple permutation test for CCA also leads to excess error rates for all canonical correlations other than the first. The reason is that a simple permutation scheme fails to account for the variability already explained by previous canonical variables. Here we propose solutions to both problems: in the case of nuisance variables, we show that transforming the residuals to a lower-dimensional basis where exchangeability holds yields a valid permutation test; for more general cases, with or without nuisance variables, we propose estimating the canonical correlations in a stepwise manner, removing at each iteration the variance already explained, while dealing with different numbers of variables on the two sides. We also discuss how to address the multiplicity of tests, proposing an admissible test that is not conservative, and provide a complete algorithm for permutation inference for CCA.
    Comment: 49 pages, 2 figures, 10 tables, 3 algorithms, 119 references
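    The simple permutation scheme that the abstract critiques can be sketched in a few lines. This is an illustrative sketch only, not the authors' corrected algorithm: it tests just the first canonical correlation (the one case where, absent nuisance variables, naive permutation remains valid), using the standard identity that the canonical correlations are the singular values of Qx'Qy, where Qx and Qy are orthonormal bases of the two centred data blocks.

    ```python
    import numpy as np

    def first_cc(X, Y):
        """Largest canonical correlation between data blocks X (n x p) and Y (n x q)."""
        Xc = X - X.mean(axis=0)
        Yc = Y - Y.mean(axis=0)
        # Orthonormal bases of the column spaces; the singular values of
        # Qx' Qy are exactly the canonical correlations.
        Qx, _ = np.linalg.qr(Xc)
        Qy, _ = np.linalg.qr(Yc)
        s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
        return s[0]

    def perm_test_first_cc(X, Y, n_perm=999, seed=0):
        """Naive permutation p-value for the first canonical correlation:
        shuffle the rows of Y, recompute, count exceedances."""
        rng = np.random.default_rng(seed)
        r_obs = first_cc(X, Y)
        exceed = sum(
            first_cc(X, Y[rng.permutation(len(Y))]) >= r_obs
            for _ in range(n_perm)
        )
        # Include the unpermuted statistic in the null distribution.
        return r_obs, (exceed + 1) / (n_perm + 1)
    ```

    For higher-order canonical correlations, or after residualising nuisance variables, this naive scheme is precisely what the paper shows to be invalid; the proposed remedy is stepwise estimation with the already-explained variance removed at each iteration.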

    Bauschinger effect in thin metallic films by fem simulations

    Unpassivated free-standing gold and aluminum thin films (thickness ~ 200-400 nm, mean grain size dm,Au ≈ 70-80 nm, dm,Al ≈ 120-200 nm) subjected to tensile tests show a Bauschinger effect (BE) during unloading [1, 2]. The focus of this work is to investigate the effect of microstructural heterogeneity, such as grain size, on the BE and on the macroscopic deformation behavior of thin metallic films. The finite element code LAGAMINE is used to model the response of films involving sets of grains with different strengths. The numerical results are compared with experimental results from tensile tests on aluminum thin films from the work of Rajagopalan et al. [2].

    How well do DRGs for appendectomy explain variations in resource use? : An analysis of patient-level data from 10 European countries

    Appendectomy is a common and relatively simple procedure to remove an inflamed appendix, but the rate of appendectomy varies widely across Europe. This paper investigates factors that explain differences in resource use for appendectomy. We analysed 106,929 appendectomy patients treated in 939 hospitals in ten European countries. In stage one, we tested the performance of three models in explaining variation in the (log of) cost of the inpatient stay (seven countries) or length of stay (three countries). The first model used only the Diagnosis Related Groups (DRGs) to which patients were coded; the second used a core set of general patient-level and appendectomy-specific variables; and the third model combined both sets of variables. In stage two, we investigated hospital-level variation. In classifying appendectomy patients, most DRG systems take account of complex diagnoses and comorbidities, but use different numbers of DRGs (range: 2 to 8). The capacity of DRGs and patient-level variables to explain patient-level cost variation ranges from 34% in Spain to over 60% in England and France. All DRG systems could make better use of administrative data such as the patient’s age, diagnoses and procedures, and all countries have outlying hospitals that could improve their management of resources for appendectomy.
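    The first-stage comparison rests on a standard identity: the R² of a regression of log cost on DRG indicators alone equals the between-DRG share of total variance, so the DRG-only model's explanatory power can be computed with a simple group decomposition. A minimal sketch of that identity (illustrative only; the paper's second and third models add patient-level covariates not shown here):

    ```python
    import numpy as np

    def drg_r2(log_cost, drg):
        """R^2 of a regression of log cost on DRG dummies, computed as the
        between-group share of total variance (one-way ANOVA identity)."""
        log_cost = np.asarray(log_cost, dtype=float)
        drg = np.asarray(drg)
        overall = log_cost.mean()
        ss_tot = ((log_cost - overall) ** 2).sum()
        ss_between = 0.0
        for g in np.unique(drg):
            grp = log_cost[drg == g]
            # Each group's contribution: size times squared mean deviation.
            ss_between += len(grp) * (grp.mean() - overall) ** 2
        return ss_between / ss_tot
    ```

    A value near 1 means the DRG classification alone captures nearly all patient-level cost variation; the 34-60% range reported above indicates substantial residual variation for patient-level variables to explain.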

    Activity based payment in hospitals: Principles and issues drawn from the economic literature and country experiences

    In 2005, France joined the ranks of most other developed countries when it introduced an activity-based payment system to finance all acute care hospitals. Despite some basic principles in common, the design of these systems can vary significantly across countries. To better understand the issues raised by the new system in France, this paper examines the economic rationale for such a system, the key implementation decisions to be made and the challenges involved. The principle of paying hospitals according to their activity, in relation to homogeneous groups of patients, has some obvious advantages for improving efficiency and transparency in health care financing. However, the literature and the experience of the other countries presented in this paper show that this payment mechanism presents a certain number of risks and requires regular and careful adjustments to obtain the benefits expected of such a system. Ensuring both the clinical and economic coherence of the classification used to define hospital activity, and establishing the corresponding level of tariffs, constitute two major challenges. The principle of paying a fixed price that is directly indexed on observed average costs and common to all types of hospitals has been increasingly subject to criticism. Furthermore, activity-based payment, by its nature, can induce perverse effects that require complementary regulatory mechanisms to guarantee the quality of care and equitable access.
From the point of view of controlling health expenditure, it is equally important to follow closely the evolution of health care activity in different hospital settings, as well as in ambulatory care, since activity-based payment may encourage hospitals to increase their activity by inducing greater demand for profitable services while shifting part of their costs towards medium/long-term care settings or to home-based or informal care.
Keywords: Activity based payment, hospital, regulation, international comparison
