
    Who Cares About Inequality of Opportunity and Why?

    Inequality is one of the most pressing economic issues in today’s world. High levels of economic inequality are associated with a range of negative economic side effects and with correspondingly high levels of political inequality, so that the voices of the few are heard more loudly than the voices of the many, to the detriment of democracy. In the United States, however, there appears to be little political will to act on the problem of inequality. Many scholars agree that the defining aspect of a person’s opinion on the subject is their own self-interest, but the perception of self-interest is shaped by a number of factors, including political affiliation, age, income level, and level of education. All of these affect the extent to which an individual feels that inequality of wealth and power is a problem. Using 2012 ANES election data, I attempt to determine what impact, if any, income and other variables have on perceptions of inequality. I hypothesized that worries about the dangers of inequality decline as income increases. While the hypothesis was confirmed to a limited extent, other factors, political affiliation in particular, proved to be much stronger determinants than income, some by an order of magnitude.
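    The abstract does not specify the model used to compare determinants. As a purely hypothetical sketch of the kind of analysis described (variable names and data are illustrative, not taken from the ANES codebook), one could regress concern about inequality on income, party identification, and the other covariates and compare coefficient magnitudes:

        # Hypothetical sketch only: synthetic data standing in for ANES 2012 variables.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 1000
        df = pd.DataFrame({
            "income": rng.integers(1, 29, n),     # ordinal income bracket (assumed scale)
            "party_id": rng.integers(1, 8, n),    # 7-point party identification scale
            "age": rng.integers(18, 90, n),
            "education": rng.integers(1, 6, n),
        })
        # Synthetic outcome: concern about inequality, driven more by party than income.
        df["concern"] = (4 - 0.05 * df["income"] - 0.5 * df["party_id"]
                         + rng.normal(0, 1, n))

        model = smf.ols("concern ~ income + party_id + age + education", data=df).fit()
        print(model.summary())  # compare coefficient magnitudes across predictors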

    Studying innovative concepts by coupling simplified simulation and multizone airflow model

    Available at: http://leso.epfl.ch/files/content/sites/leso/files/download/publications/cisbat_proceedings_final_download.pdf
    In order to respond to the challenges of global warming and natural resource depletion, industry players in the building sector need to propose an adequate offer, and energy simulation tools can support this process. To reach a high performance level, e.g. primary energy consumption below 50 kWh/m² per year (including heating, cooling, domestic hot water, lighting and ventilation), various studies and real cases show that appropriate architecture, high insulation, free cooling, and a heat recovery exchanger for ventilation are needed. This last technology is particularly affected by airflows across the building envelope caused by low airtightness. Moreover, the free cooling ventilation rate depends strongly on the temperature difference between outside and inside. Thermal modelling tools therefore need to treat these two issues precisely. A multizone model has been developed to compute building airflows with a higher degree of precision within a simplified simulation tool that can be used in the early phases of a project. The model is based on well-mixed zones and mass conservation principles: the airflow rate between two zones is expressed as a function of the pressure drop between them, with wind pressure and buoyancy effects as the causes of the pressure drops. Several types of connection are implemented (cracks, ventilation inlets, large openings), and more will be added. The model has been implemented in the thermal building simulation tool COMFIE [1]. The airflow model takes the zone temperatures as an input, and the thermal model in turn takes the airflows as an input; both models run at each time step until convergence is reached, using a synchronous coupling method, with an algorithm developed to ensure convergence at each time step (from 1/10 hour to 1 hour). Two case studies are presented. The first is a residential building, a Vinci Construction France project, in which the influence of airtightness on heating loads is studied. The second is a concept building, Effibat, developed by Vinci Construction France and MINES ParisTech: an urban residential building with an atrium, where natural ventilation is used to cool the building on summer nights and the model aims at evaluating the resulting comfort level.
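    As a minimal sketch of the principle the abstract describes (not the COMFIE implementation), a power-law crack model gives the mass flow through each envelope connection as a function of the pressure drop, and the zone pressure is the one for which inflows and outflows balance. All coefficients and pressures below are illustrative assumptions:

        # Power-law crack model: m = C * sign(dP) * |dP|**n; the well-mixed zone
        # pressure is found from mass conservation (net flow into the zone = 0).
        from scipy.optimize import brentq

        def crack_flow(dp, C=0.01, n=0.65):
            """Mass flow rate (kg/s) through a crack for pressure drop dp (Pa)."""
            return C * (abs(dp) ** n) * (1 if dp >= 0 else -1)

        # Facade pressures (Pa) seen by the zone, e.g. windward and leeward sides.
        p_windward, p_leeward = 10.0, -4.0

        def net_mass_flow(p_zone):
            # Positive flow enters the zone; conservation requires the sum to vanish.
            return crack_flow(p_windward - p_zone) + crack_flow(p_leeward - p_zone)

        p_zone = brentq(net_mass_flow, -100.0, 100.0)  # solve net_mass_flow(p_zone) = 0
        print(f"zone pressure: {p_zone:.2f} Pa, "
              f"inflow: {crack_flow(p_windward - p_zone):.4f} kg/s")

    In the synchronous coupling the abstract mentions, a solve like this would be repeated against the thermal model at each time step until both converge.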

    Performance of the electromagnetic calorimeter and search for new gauge bosons in the dielectron channel with the ATLAS detector

    The Standard Model of particle physics rose to prominence during the twentieth century. Built up from the early 1930s to the 1970s, this theory of elementary particles and their interactions (electromagnetic, weak, and strong) has since been intensively tested at colliders such as LEP and the Tevatron. Despite this success, some open questions remain and have led to new theories attempting to go beyond the Standard Model, many of which predict the existence of a new gauge boson Z' at the TeV scale. Data recorded by the LHC since its start-up in autumn 2008 offer a new opportunity to confront the Standard Model with its predictions and to search for signatures of new physics at unprecedented energies. Work within the ATLAS experiment during these first four years has focused on understanding the detector and analysing the first collected data; this thesis covers both aspects. The first part describes the characterisation of a pathology of the readout electronics of the ATLAS liquid argon calorimeter, as well as the study of large coherent noise deviations observed since the start of its operation. The strategy deployed to preserve the quality of the collected data is also detailed. The second part focuses on the search for a new Z' gauge boson. If such a particle exists, its decay into an electron and a positron would produce a new massive resonance in the dielectron invariant mass spectrum. Electron reconstruction and identification performance is therefore studied closely, especially at high transverse momentum. The analysis of the 4.9 fb-1 of data collected in 2011 is reported. As no significant deviation from Standard Model predictions is observed, the dielectron invariant mass spectrum is reinterpreted to derive limits on the existence of new bosons arising in grand unification theories (E6) and of a Sequential Standard Model (SSM) boson. These limits, together with those derived by the CMS collaboration, are currently the most stringent on the existence of such new bosons.
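    The search rests on the dielectron invariant mass spectrum. As a minimal illustration (not the ATLAS reconstruction code), the invariant mass follows from the two electron four-momenta via m² = (E1 + E2)² − |p1 + p2|² in natural units, with the electron mass safely neglected at high transverse momentum:

        # Sketch: dielectron invariant mass from (pT, eta, phi) of each electron,
        # assuming massless electrons so that E = |p|. Values are illustrative.
        import math

        def invariant_mass(pt1, eta1, phi1, pt2, eta2, phi2):
            """Dielectron invariant mass in GeV, electron mass neglected."""
            px = pt1 * math.cos(phi1) + pt2 * math.cos(phi2)
            py = pt1 * math.sin(phi1) + pt2 * math.sin(phi2)
            pz = pt1 * math.sinh(eta1) + pt2 * math.sinh(eta2)
            e  = pt1 * math.cosh(eta1) + pt2 * math.cosh(eta2)  # E = |p| per electron
            return math.sqrt(max(e * e - (px * px + py * py + pz * pz), 0.0))

        # Two back-to-back electrons of 500 GeV pT give m_ee ~ 1 TeV.
        print(invariant_mass(500.0, 0.0, 0.0, 500.0, 0.0, math.pi))  # ~1000 GeV

    A heavy Z' decaying to e+e- would appear as a localized excess in the distribution of this quantity over the collected events.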

    The Efficacy of an Acrylic Intraocular Lens Surface Modified with Polyethylene Glycol in Posterior Capsular Opacification

    To investigate whether surface modification of an intraocular lens (IOL) is effective in preventing posterior capsular opacification (PCO), the acrylic surface of an intraocular lens (Acrysof®) was polymerized with polyethylene glycol (PEG-IOL). Human lens epithelial cells (1×10⁴ cells/mL) were inoculated on PEG-grafted acrylic lenses and on unmodified acrylic lenses serving as controls, and the adherent cells on each IOL surface were trypsinized and counted. PEG-IOLs were implanted in 20 New Zealand rabbits after removal of the crystalline lens. PCO formation was checked serially by retroilluminated digital photography, and severity scores were calculated using POCOman®. Cell adherence patterns on each IOL were examined by scanning electron microscopy. The mean number of adherent cells on the PEG-IOL (3.2±1.1×10³) tended to be smaller than that on the acrylic controls (3.6±1.9×10³), without statistical significance (p=0.73). However, the mean severity of PCO formation with the PEG-IOL was significantly lower than that with the control from the third to the sixth week after surgery. Scanning electron microscopy revealed more patch-like cells firmly attached to the IOL surface in the controls than on the PEG-IOL. In conclusion, PEG polymerization of the acrylic IOL may lessen PCO formation after cataract removal.
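    The adherent-cell comparison is a simple two-sample test. As a hypothetical sketch, the means and standard deviations are taken from the abstract (in units of 10³ cells), while the group sizes are an assumption, since the abstract does not report the number of lenses per group:

        # Hypothetical re-computation of the adherent-cell comparison; n = 10
        # lenses per group is assumed, not reported in the abstract.
        from scipy.stats import ttest_ind_from_stats

        t, p = ttest_ind_from_stats(
            mean1=3.2, std1=1.1, nobs1=10,   # PEG-IOL
            mean2=3.6, std2=1.9, nobs2=10,   # unmodified acrylic control
            equal_var=False,                 # Welch's t-test
        )
        print(f"t = {t:.2f}, p = {p:.2f}")   # p >> 0.05: no significant difference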

    Cost-effectiveness of non-invasive methods for assessment and monitoring of liver fibrosis and cirrhosis in patients with chronic liver disease: systematic review and economic evaluation

    BACKGROUND: Liver biopsy is the reference standard for diagnosing the extent of fibrosis in chronic liver disease; however, it is invasive, with the potential for serious complications. Alternatives to biopsy include non-invasive liver tests (NILTs); however, their cost-effectiveness needs to be established. OBJECTIVE: To assess the diagnostic accuracy and cost-effectiveness of NILTs in patients with chronic liver disease. DATA SOURCES: We searched various databases from 1998 to April 2012, recent conference proceedings and reference lists. METHODS: We included studies that assessed the diagnostic accuracy of NILTs using liver biopsy as the reference standard. Diagnostic studies were assessed using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) tool. Meta-analysis was conducted using the bivariate random-effects model with correlation between sensitivity and specificity (whenever possible). Decision models were used to evaluate the cost-effectiveness of the NILTs. Expected costs were estimated from a NHS perspective and health outcomes were measured as quality-adjusted life-years (QALYs). Markov models were developed to estimate long-term costs and QALYs following testing, and antiviral treatment where indicated, for chronic hepatitis B (HBV) and chronic hepatitis C (HCV). NILTs were compared with each other, sequential testing strategies, biopsy and strategies including no testing. For alcoholic liver disease (ALD), we assessed the cost-effectiveness of NILTs in the context of potentially increasing abstinence from alcohol. Owing to a lack of data and treatments specifically for fibrosis in patients with non-alcoholic fatty liver disease (NAFLD), the analysis was limited to an incremental cost per correct diagnosis. An analysis of NILTs to identify patients with cirrhosis for increased monitoring was also conducted. RESULTS: Given a cost-effectiveness threshold of £20,000 per QALY, treating everyone with HCV without prior testing was cost-effective, with an incremental cost-effectiveness ratio (ICER) of £9,204. This was robust in most sensitivity analyses but sensitive to the extent of treatment benefit for patients with mild fibrosis. For HBV [hepatitis B e antigen (HBeAg) negative], this strategy had an ICER of £28,137, which was cost-effective only if the upper bound of the standard UK cost-effectiveness threshold range (£30,000) is acceptable. For HBeAg-positive disease, two NILTs applied sequentially (hyaluronic acid and magnetic resonance elastography) were cost-effective at a £20,000 threshold (ICER: £19,612); however, the results were highly uncertain, with several test strategies having similar expected outcomes and costs. For patients with ALD, liver biopsy was the cost-effective strategy, with an ICER of £822. LIMITATIONS: A substantial number of tests had only one study from which diagnostic accuracy was derived; therefore, there is a high risk of bias. Most NILTs did not have validated cut-offs for diagnosis of specific fibrosis stages. The findings of the ALD model were dependent on assumptions about abstinence rates, and the modelling approach for NAFLD was hindered by the lack of evidence on clinically effective treatments. CONCLUSIONS: Treating everyone without NILTs is cost-effective for patients with HCV, but for HBeAg-negative disease only if the higher cost-effectiveness threshold is appropriate. For HBeAg-positive disease, two NILTs applied sequentially were cost-effective but highly uncertain. Further evidence of treatment effectiveness is required for ALD and NAFLD. STUDY REGISTRATION: This study is registered as PROSPERO CRD42011001561. FUNDING: The National Institute for Health Research Health Technology Assessment programme.
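    The decision rule used throughout the abstract is the incremental cost-effectiveness ratio: the extra cost per extra QALY of a strategy versus its comparator, judged against a willingness-to-pay threshold. A minimal sketch, with illustrative numbers rather than the study's model outputs:

        # ICER = (cost_new - cost_old) / (QALY_new - QALY_old); a strategy is
        # cost-effective when the ICER falls below the threshold.
        def icer(cost_new, qaly_new, cost_old, qaly_old):
            """Incremental cost (GBP) per incremental QALY gained."""
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        THRESHOLD = 20_000  # GBP per QALY (standard UK lower bound; upper is 30,000)

        # Illustrative numbers only: treat-all vs. test-then-treat for HCV.
        ratio = icer(cost_new=12_000, qaly_new=9.5, cost_old=8_000, qaly_old=9.1)
        print(f"ICER = GBP {ratio:,.0f}/QALY ->",
              "cost-effective" if ratio < THRESHOLD else "not cost-effective")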

    Evidence for the h_b(1P) meson in the decay Upsilon(3S) --> pi0 h_b(1P)

    Using a sample of 122 million Upsilon(3S) events recorded with the BaBar detector at the PEP-II asymmetric-energy e+e- collider at SLAC, we search for the h_b(1P) spin-singlet partner of the P-wave chi_{bJ}(1P) states in the sequential decay Upsilon(3S) --> pi0 h_b(1P), h_b(1P) --> gamma eta_b(1S). We observe an excess of events above background in the distribution of the mass recoiling against the pi0, at a mass of 9902 +/- 4(stat.) +/- 2(syst.) MeV/c^2. The width of the observed signal is consistent with the experimental resolution, and its significance is 3.1 sigma, including systematic uncertainties. We obtain the value (4.3 +/- 1.1(stat.) +/- 0.9(syst.)) x 10^{-4} for the product branching fraction BF(Upsilon(3S) --> pi0 h_b) x BF(h_b --> gamma eta_b).
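    As a minimal sketch of the recoil-mass variable the search uses (values illustrative, natural units): in the Upsilon(3S) rest frame, m_recoil² = (√s − E_pi0)² − |p_pi0|², so a pi0 from the two-body decay Upsilon(3S) --> pi0 h_b peaks at the h_b mass.

        # Recoil mass against the pi0 in the Upsilon(3S) rest frame (GeV, c = 1).
        import math

        M_UPS3S = 10.3552   # Upsilon(3S) mass
        M_PI0   = 0.1350    # pi0 mass
        M_HB    = 9.902     # signal mass reported above

        def recoil_mass(e_pi0, p_pi0, sqrt_s=M_UPS3S):
            """Mass of the system recoiling against the pi0."""
            return math.sqrt((sqrt_s - e_pi0) ** 2 - p_pi0 ** 2)

        # pi0 energy and momentum for a two-body decay Upsilon(3S) -> pi0 h_b:
        e_pi0 = (M_UPS3S**2 + M_PI0**2 - M_HB**2) / (2 * M_UPS3S)
        p_pi0 = math.sqrt(e_pi0**2 - M_PI0**2)
        print(f"recoil mass: {recoil_mass(e_pi0, p_pi0):.4f} GeV")  # recovers ~9.902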