12,411 research outputs found

    A WIMP Detector with Two-Phase Liquid Xenon

    We describe the liquid-xenon dark-matter detector program of the UCLA-Torino team. A two-phase detector, ZEPLIN II, for the Boulby Mine is a good match for the current search for WIMP dark matter.
    Comment: 3 pages with 4 figures; for Proceedings, Sixth Int'l Wksp. on Topics in Astroparticle and Underground Physics, TAUP99 (College de France, Paris, Sept. 6-10, 1999), to be published in Nucl. Phys. B (Proc. Suppl.)

    The comparative advantage of government: a review

    In theory, market failures are necessary but not sufficient conditions for justifying government intervention in the production of goods and services. Even without market failures, there might be a case for government intervention on the grounds of poverty reduction or merit goods (for example, mandatory elementary education and mandatory use of seatbelts in cars and of helmets on motorbikes). In every case, contends the author, a case for government intervention must first identify the particular market failure that prevents the private sector from producing the socially optimal quantity of the good or service. Second, it must select the intervention that will most improve welfare. Third, it must show that society will be better off as a result of government involvement, that is, that the benefits will outweigh the costs.

    It is impossible to judge a priori whether or what type of government intervention is appropriate to a particular circumstance or even to a class of situations. Such judgments are both country- and situation-specific and must be made on a case-by-case basis. To be sure, it is easier to make such judgments about market failures based on externalities, public goods, and so on, than about market failures based on imperfect information. Market failures rooted in incomplete markets and imperfect information are pervasive: markets are almost always incomplete, and information is always imperfect. This does not mean that there is always a case for government intervention or that further analysis is unnecessary. On the contrary, there is a keener need for analysis. The welfare consequences of the "new market failures" are more difficult to measure, so government intervention's contribution to welfare is likely to be more difficult to assess, and the case for intervention (especially the provision of goods and services) is more difficult to make.

    One must also keep in mind that government interventions are often poorly designed and overly costly. Poorly designed interventions may create market failures of their own. Governments concerned about low private investment in high-risk projects, for example, may guarantee them against risk but in the process create problems of moral hazard and induce investors to take no action to mitigate such risks. And some interventions may turn out to be too costly relative to the posited benefits. In seeking to provide extension services, for example, governments may incur costs that are higher than the benefits farmers receive.
    Topics: Decentralization; Environmental Economics & Policies; Economic Theory & Research; Health Economics & Finance; Labor Policies; Banks & Banking Reform; Knowledge Economy

    Is economic analysis of projects still useful?

    The author argues for a shift in the focus of economic analysis of projects. First, project analysts need to make full use of project information, especially identifying the source of the divergence between market prices and economic costs, the source of the divergence between economic and private flows, and the group that pays the costs or enjoys the benefits. This information identifies gainers and losers, likely project supporters and detractors, and the fiscal impact. Second, project analysts need to look at the project from the perspective of the main stakeholders, principally the implementing agency, the government, and the country. Third, they should also assess whether all of the main actors have the economic and financial incentives to implement the project as designed. Fourth, they should take advantage of advances in technology and attempt to identify and measure any external effects of projects, as well as the benefits of education and health projects. Finally, they should take advantage of the advances in personal computing to provide a more systematic assessment of risk.
    Topics: Economic Theory & Research; Decentralization; Public Health Promotion; Environmental Economics & Policies; Health Economics & Finance; Banks & Banking Reform; Health Monitoring & Evaluation
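
    The final point, on using personal computing for a more systematic assessment of risk, is straightforward to illustrate. Below is a minimal Monte Carlo sketch of project risk analysis in Python; the distributions, horizon, and discount rate are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(seed=0)
    n = 100_000                      # number of Monte Carlo draws
    rate = 0.10                      # assumed discount rate
    years = np.arange(1, 11)         # assumed 10-year horizon

    # Assumed uncertainty: capital cost and annual benefits vary across draws.
    capital_cost = rng.normal(100.0, 15.0, size=n)              # year-0 outlay
    annual_benefit = rng.triangular(10.0, 18.0, 25.0, size=n)   # per year

    # Net present value of each draw: discounted benefits minus the outlay.
    npv = annual_benefit * ((1.0 + rate) ** -years).sum() - capital_cost

    print(f"mean NPV: {npv.mean():.1f}")
    print(f"P(NPV < 0): {(npv < 0).mean():.2%}")
    print(f"5th-95th percentile: {np.percentile(npv, [5, 95]).round(1)}")
    ```

    Reporting the full NPV distribution and the probability of a negative outcome, rather than a single point estimate, is the kind of systematic risk assessment the author calls for.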

    How adverse selection affects the health insurance market

    Adverse selection can be defined as strategic behavior by the more informed partner in a contract against the interest of the less informed partner(s). In the health insurance field, this manifests itself through healthy people choosing managed care and less healthy people choosing more generous plans. Drawing on the theoretical literature on the problem of adverse selection in the health insurance market, the author synthesizes concepts developed piecemeal over more than 20 years, using two examples and revisiting the classical contribution of Rothschild and Stiglitz. He highlights key insights, especially from the literature on "equilibrium refinements" and on the theory of the "second best." The government can correct spontaneous market dynamics in the health insurance market by directly subsidizing insurance or through regulation; the two forms of intervention produce different results. Providing partial public insurance, even supplemented by the possibility of opting out, can lead to second-best equilibria. The same result holds as long as the government can subsidize contracts with higher-than-average premium-benefit ratios and can tax contracts with lower-than-average premium-benefit ratios. The author analyzes the following policy options relating to the public provision of insurance: a) full public insurance; and b) partial public insurance, with or without the possibility of acquiring supplementary insurance and with or without the possibility of opting out. In recent plans implemented in Germany and the Netherlands, where competition among several health funds and insurance companies was promoted, a public fund was created to discourage risk-screening practices by providing the necessary compensation across risk groups. But only "objective" risk adjusters (such as age, gender, and region) were used to decide which contracts to subsidize. Those criteria alone cannot correct the effects of adverse selection. Regulation can exacerbate the problem of adverse selection and lead to chronic market instability, so certain steps must be taken to prevent risk screening and preserve competition for the market. The author considers three policy options for regulating the private insurance market: 1) a standard contract with full coverage; 2) imposition of a minimum insurance requirement; and 3) imposition of premium rate restrictions.
    Topics: Health Economics & Finance; Environmental Economics & Policies; Insurance & Risk Mitigation; Insurance Law; Financial Intermediation
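
    As a toy illustration of the dynamics described above, the sketch below simulates the classic adverse-selection premium spiral in a two-risk-type market: a pooled premium is repriced each round to the average cost of whoever still buys, driving out the low risks. All numbers and the willingness-to-pay rule are invented for illustration, not taken from the paper.

    ```python
    # Toy adverse-selection spiral: the pooled premium is repriced each round
    # to the average expected loss of whoever is still willing to buy.
    loss = 100.0
    types = {"low": {"p": 0.1, "share": 0.8}, "high": {"p": 0.5, "share": 0.2}}
    margin = 1.3  # assumed: buyers pay at most 130% of their own expected loss

    insured = set(types)
    premium = loss * sum(t["p"] * t["share"] for t in types.values())
    for step in range(5):
        # each type stays only while the premium is below its willingness to pay
        insured = {k for k in insured if premium <= margin * types[k]["p"] * loss}
        if not insured:
            print(f"step {step}: market unravels completely")
            break
        share = sum(types[k]["share"] for k in insured)
        premium = loss * sum(types[k]["p"] * types[k]["share"] for k in insured) / share
        print(f"step {step}: insured={sorted(insured)}, premium={premium:.1f}")
    ```

    A cross-subsidy financed by taxing contracts with low premium-benefit ratios, as in the second-best schemes discussed by the author, works by holding the pooled premium below the low-risk willingness to pay so that both types remain insured.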

    Star formation quenching in massive galaxies

    Understanding how and why star formation turns off in massive galaxies is a major challenge for studies of galaxy evolution. Many theoretical explanations have been proposed, but a definitive consensus is yet to be reached.
    Comment: Comment published in Nature Astronomy on 3rd September 2018. The full text is publicly available at this link: https://rdcu.be/5KbA. Authors' version, 4 pages and 1 figure

    WIMPs search by scintillators: possible strategy for annual modulation search with large-mass highly-radiopure NaI(Tl)

    The DAMA experiments are running deep underground in the Gran Sasso National Laboratory. Several interesting results have been achieved so far. Here a maximum-likelihood method to search for the WIMP annual modulation signature is discussed and applied to a set of preliminary test data collected with large-mass, highly radiopure NaI(Tl) detectors. Various related technical arguments are briefly addressed.
    Comment: 6 pages, 4 figures, LaTeX. Contributed paper to TAUP97; to appear in the Proceedings
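
    The annual-modulation signature being fit is conventionally modeled as a rate S(t) = S0 + Sm cos(omega (t - t0)), with omega = 2 pi / yr and phase t0 near June 2 for a standard galactic halo. The sketch below shows a maximum-likelihood fit of that model to simulated daily counts; the rates are invented, and the real analysis also handles energy bins, detector response, and backgrounds.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    days = np.arange(365)
    omega = 2 * np.pi / 365.25
    t0 = 152.5  # days from Jan 1; phase expected near June 2 for a standard halo

    # Simulated daily counts: constant rate plus a small annual modulation.
    rng = np.random.default_rng(1)
    true_S0, true_Sm = 50.0, 2.0    # illustrative counts/day, not DAMA values
    counts = rng.poisson(true_S0 + true_Sm * np.cos(omega * (days - t0)))

    def neg_log_like(params):
        S0, Sm = params
        mu = S0 + Sm * np.cos(omega * (days - t0))
        if np.any(mu <= 0):
            return np.inf
        # Poisson log-likelihood, up to a data-dependent constant
        return np.sum(mu - counts * np.log(mu))

    fit = minimize(neg_log_like, x0=[40.0, 0.0], method="Nelder-Mead")
    print("fitted S0, Sm:", fit.x)
    ```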

    Using Data Mining to Predict the Occurrence of Respondent Retrieval Strategies in Calendar Interviewing: The Quality of Retrospective Reports

    Determining which verbal behaviors of interviewers and respondents are dependent on one another is a complex problem that can be facilitated via data-mining approaches. Data are derived from the interviews of 153 respondents of the Panel Study of Income Dynamics (PSID) who were interviewed about their life-course histories. We examined the behavioral sequences of interviewer-respondent interactions that were most predictive of respondents spontaneously using parallel, timing, duration, and sequential retrieval strategies in generating their answers. We also examined which behavioral sequences were predictive of retrospective-report data quality, as shown by the correspondence between calendar responses and responses collected in prior waves of the PSID. The verbal behaviors of immediately preceding interviewer and respondent turns of speech were assessed in terms of their co-occurrence with each respondent retrieval strategy. Interviewers' use of parallel probes is associated with poorer data quality, whereas interviewers' use of timing and duration probes, especially in tandem, is associated with better data quality. Respondents' use of timing and duration strategies is also associated with better data quality, and both strategies are facilitated by interviewer timing probes. Data mining alongside regression techniques is valuable for examining which interviewer-respondent interactions will benefit data quality.
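
    A minimal sketch of the prediction task described here might encode each pair of preceding turns as categorical features and train a tree-based classifier on whether the next respondent turn uses a given retrieval strategy. The behavior codes and toy data below are invented placeholders, not the actual PSID coding scheme.

    ```python
    from sklearn.preprocessing import OneHotEncoder
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical coded turn pairs: (preceding interviewer behavior,
    # preceding respondent behavior) -> did the next respondent turn use
    # a timing retrieval strategy (1) or not (0)?
    X_codes = [
        ["timing_probe", "answer"],
        ["parallel_probe", "clarification"],
        ["timing_probe", "timing_strategy"],
        ["duration_probe", "answer"],
        ["parallel_probe", "answer"],
        ["duration_probe", "timing_strategy"],
    ]
    y = [1, 0, 1, 1, 0, 1]

    enc = OneHotEncoder(handle_unknown="ignore")
    X = enc.fit_transform(X_codes)

    clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    print(clf.predict(enc.transform([["timing_probe", "answer"]])))
    ```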

    Ascertainment of occupational histories in the working population: The occupational history calendar approach

    Background: Self-reported occupational histories are an important means of collecting historical data in epidemiological studies. An occupational history calendar (OHC) has been developed for use alongside a national occupational hazard surveillance tool. This study presents the systematic development of the OHC and compares work histories collected via this calendar with those collected via a traditional questionnaire.
    Methods: The paper describes the systematic development of an OHC for use in the general working population. A comparison of data quality and recall was undertaken with 51 participants, to whom both tools were administered.
    Results: The OHC enhanced job recall compared with the traditional questionnaire. Good agreement in the data captured by both tools was observed, with the exception of hazard exposures.
    Conclusions: A calendar approach is suitable for collecting occupational histories from the general working population. Despite enhancing job recall, however, the OHC approach has shortcomings that outweigh this advantage in large-scale population surveillance.
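
    Agreement between the two instruments can be quantified with a chance-corrected statistic such as Cohen's kappa; the sketch below uses invented paired codings for illustration, since the paper's data are not reproduced here.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical paired codings: for each reported job, whether the OHC
    # and the traditional questionnaire recorded a given hazard exposure
    # (1) or not (0). These values are invented for illustration.
    ohc           = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
    questionnaire = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]

    kappa = cohen_kappa_score(ohc, questionnaire)
    print(f"Cohen's kappa: {kappa:.2f}")  # chance-corrected agreement
    ```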

    Flame: A Flexible Data Reduction Pipeline for Near-Infrared and Optical Spectroscopy

    We present flame, a pipeline for reducing spectroscopic observations obtained with multi-slit near-infrared and optical instruments. Because of its flexible design, flame can be easily applied to data obtained with a wide variety of spectrographs. The flexibility is due to a modular architecture, which allows changes and customizations to the pipeline and relegates the instrument-specific parts to a single module. At the core of the data reduction is the transformation from observed pixel coordinates (x, y) to rectified coordinates (lambda, gamma). This transformation consists of the polynomial functions lambda(x, y) and gamma(x, y), which are derived from arc or sky emission lines and slit-edge tracing, respectively. The use of 2D transformations allows one to wavelength-calibrate and rectify the data in just one interpolation step. Furthermore, the gamma(x, y) transformation also includes the spatial misalignment between frames, which can be measured from a reference star observed simultaneously with the science targets. The misalignment can then be fully corrected during the rectification, without having to further resample the data. Sky subtraction can be performed via nodding and/or modeling of the sky spectrum; the combination of the two methods typically yields the best results. We illustrate the pipeline by showing examples of data reduction for a near-infrared instrument (LUCI at the Large Binocular Telescope) and an optical one (LRIS at the Keck telescope).
    Comment: 17 pages, 10 figures, published in MNRAS. The pipeline is available at https://github.com/siriobelli/flame
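
    The (x, y) -> (lambda, gamma) rectification described here can be sketched with generic tools: fit low-order 2D polynomials to reference points (arc/sky lines of known wavelength, slit-edge traces of known slit position), then resample the frame once. The code below is a schematic numpy/scipy stand-in, not code from the flame pipeline, and all calibration "measurements" in it are synthetic.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def fit_poly2d(x, y, z, deg=2):
        """Least-squares fit of z(x, y) as a polynomial of total degree <= deg."""
        terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1 - i)]
        A = np.column_stack([x**i * y**j for i, j in terms])
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)
        return lambda xx, yy: sum(c * xx**i * yy**j for c, (i, j) in zip(coef, terms))

    rng = np.random.default_rng(0)
    n = 256  # small synthetic frame so the sketch runs quickly

    # Synthetic calibration points: arc/sky lines of known wavelength and
    # slit-edge traces of known slit coordinate, located at pixel (x, y).
    x_ref, y_ref = rng.uniform(0, n, (2, 300))
    lam_ref = 1.50 + 1.5e-3 * x_ref + 5e-6 * y_ref   # invented "truth", microns
    gam_ref = 0.5 * y_ref - 2e-4 * x_ref * y_ref     # invented slit coordinate

    lam_of = fit_poly2d(x_ref, y_ref, lam_ref)   # lambda(x, y)
    gam_of = fit_poly2d(x_ref, y_ref, gam_ref)   # gamma(x, y)

    # Rectify with a single interpolation: evaluate (lambda, gamma) at every
    # pixel, then resample the frame onto a regular rectified grid.
    frame = rng.poisson(100.0, (n, n)).astype(float)
    yy, xx = np.mgrid[0:n, 0:n]
    pts = np.column_stack([lam_of(xx, yy).ravel(), gam_of(xx, yy).ravel()])
    lam_g, gam_g = np.meshgrid(np.linspace(1.50, 1.88, 200),
                               np.linspace(0.0, 120.0, 100))
    rectified = griddata(pts, frame.ravel(), (lam_g, gam_g), method="linear")
    ```

    Because lambda and gamma are evaluated analytically at every pixel, including any measured inter-frame offset folded into gamma(x, y), the data are wavelength-calibrated, rectified, and aligned in this single interpolation.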

    Infrared attosecond field transients and UV to IR few-femtosecond pulses generated by high-energy soliton self-compression

    Infrared femtosecond laser pulses are important tools both in strong-field physics, driving X-ray high-harmonic generation, and as the basis for widely tuneable, if inefficient, ultrafast sources in the visible and ultraviolet. Although anomalous material dispersion simplifies compression to few-cycle pulses, attosecond pulses in the infrared have remained out of reach. We demonstrate soliton self-compression of 1800 nm laser pulses in hollow capillary fibers to sub-cycle envelope duration (2 fs) with 27 GW peak power, corresponding to attosecond field transients. In the same system, we generate wavelength-tuneable few-femtosecond pulses from the ultraviolet (300 nm) to the infrared (740 nm) with energy up to 25 μJ and efficiency up to 12%, and experimentally characterize the generation dynamics in the time-frequency domain. A compact second stage generates multi-μJ pulses from 210 nm to 700 nm using less than 200 μJ of input energy. Our results significantly expand the toolkit available to ultrafast science.
    Comment: 8 pages, 5 figures
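
    For orientation, the soliton order N that governs self-compression follows the standard fiber-optics scaling N^2 = gamma P0 T0^2 / |beta2|, and the associated dispersion and fission lengths set where maximum compression occurs. The sketch below evaluates these quantities; every parameter value is an illustrative assumption, not a value from the paper.

    ```python
    import numpy as np

    # Standard estimates (see e.g. Agrawal, Nonlinear Fiber Optics); all
    # numbers below are assumed for illustration only.
    T_fwhm = 40e-15                  # input pulse duration, s
    T0 = T_fwhm / 1.763              # sech-pulse width parameter
    P0 = 1.0e9                       # peak power, W
    beta2 = -5e-28                   # anomalous GVD, s^2/m (gas-filled capillary)
    gamma = 1e-7                     # nonlinear parameter, 1/(W m)

    N = np.sqrt(gamma * P0 * T0**2 / abs(beta2))   # soliton order
    L_D = T0**2 / abs(beta2)                       # dispersion length
    L_fiss = L_D / N                               # fission length, roughly the
                                                   # point of maximum compression
    print(f"soliton order N = {N:.1f}")
    print(f"dispersion length = {L_D:.2f} m, fission length = {L_fiss:.2f} m")
    ```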