
    Simple and accurate modelling of the gravitational potential produced by thick and thin exponential discs

    We present accurate models of the gravitational potential produced by a radially exponential disc mass distribution. The models are produced by combining three separate Miyamoto–Nagai discs. Such models have been used previously to model the disc of the Milky Way, but here we extend this framework to allow its application to discs of any mass, scalelength, and a wide range of thicknesses, from infinitely thin to near spherical (ellipticities from 0 to 0.9). The models have the advantage of simplicity of implementation, and we expect faster run speeds than a double exponential disc treatment. The potentials are fully analytical and differentiable at all points. The mass distribution of our models deviates from the radial mass distribution of a pure exponential disc by <0.4 per cent out to 4 disc scalelengths, and by <1.9 per cent out to 10 disc scalelengths. We tabulate fitting parameters which facilitate construction of exponential discs for any scalelength and a wide range of disc thicknesses (a user-friendly, web-based interface is also available). Our recipe is well suited for numerical modelling of the tidal effects of a giant disc galaxy on star clusters or dwarf galaxies. We consider three worked examples: the Milky Way thin and thick discs, and a discy dwarf galaxy.
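The building block of this recipe is the standard Miyamoto–Nagai potential, Phi(R, z) = -GM / sqrt(R^2 + (a + sqrt(z^2 + b^2))^2), and the disc model is the sum of three such components. A minimal sketch of that sum follows; the parameter values used here are placeholders for illustration, not the paper's tabulated fits:

```python
import numpy as np

G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / Msun

def miyamoto_nagai_potential(R, z, M, a, b):
    """Potential of a single Miyamoto-Nagai disc.

    R, z in kpc; M in Msun; a, b are the radial and vertical scale
    parameters in kpc. Returns the potential in (km/s)^2.
    """
    return -G * M / np.sqrt(R**2 + (a + np.sqrt(z**2 + b**2))**2)

def three_mn_disc_potential(R, z, params):
    """Sum of three MN components; params is a list of (M, a, b) tuples.

    The paper fits three such components to reproduce an exponential
    disc; the fitted (M, a, b) values must be taken from its tables.
    """
    return sum(miyamoto_nagai_potential(R, z, M, a, b) for M, a, b in params)
```

Because each term is a closed-form function of R and z, the summed potential stays fully analytical and differentiable, which is what makes the recipe cheap to evaluate in N-body or orbit-integration codes.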

    Is There a Role for Benefit-Cost Analysis in Environmental, Health, and Safety Regulation?

    Benefit-cost analysis has a potentially important role to play in helping inform regulatory decision-making, although it should not be the sole basis for such decision-making. This paper offers eight principles on the appropriate use of benefit-cost analysis.

    Benefit-Cost Analysis in Environmental, Health, and Safety Regulation: A Statement of Principles

    Benefit-cost analysis can play a very important role in legislative and regulatory policy debates on improving the environment, health, and safety. It can help illustrate the tradeoffs that are inherent in public policymaking as well as make those tradeoffs more transparent. It can also help agencies set regulatory priorities. Benefit-cost analysis should be used to help decisionmakers reach a decision. Contrary to the views of some, benefit-cost analysis is neither necessary nor sufficient for designing sensible public policy. If properly done, it can be very helpful to agencies in the decisionmaking process. Decisionmakers should not be precluded from considering the economic benefits and costs of different policies in the development of regulations. Laws that prohibit costs or other factors from being considered in administrative decisionmaking are inimical to good public policy. Currently, several of the most important regulatory statutes have been interpreted to imply such prohibitions. Benefit-cost analysis should be required for all major regulatory decisions, but agency heads should not be bound by a strict benefit-cost test. Instead, they should be required to consider available benefit-cost analyses and to justify the reasons for their decision in the event that the expected costs of a regulation far exceed the expected benefits. Agencies should be encouraged to use economic analysis to help set regulatory priorities. Economic analyses prepared in support of particularly important decisions should be subjected to peer review both inside and outside government. Benefits and costs of proposed major regulations should be quantified wherever possible. Best estimates should be presented along with a description of the uncertainties. Not all benefits or costs can be easily quantified, much less translated into dollar terms. Nevertheless, even qualitative descriptions of the pros and cons associated with a contemplated action can be helpful. 
Care should be taken to ensure that quantitative factors do not dominate important qualitative factors in decisionmaking. The Office of Management and Budget, or some other coordinating agency, should establish guidelines that agencies should follow in conducting benefit-cost analyses. Those guidelines should specify default values for the discount rate and certain types of benefits and costs, such as the value of a small reduction in mortality risk. In addition, agencies should present their results using a standard format, which summarizes the key results and highlights major uncertainties.

    Andean Land Use And Biodiversity: Humanized Landscapes In A Time Of Change

    Some landscapes cannot be understood without reference to the kinds, degrees, and history of human-caused modifications to the Earth's surface. The tropical latitudes of the Andes represent one such place, with agricultural land-use systems appearing in the Early Holocene. Current land use includes both intensive and extensive grazing and crop- or tree-based agricultural systems found across virtually the entire range of possible elevations and humidity regimes. Biodiversity found in or adjacent to such humanized landscapes will have been altered in abundance, composition, and distribution in relation to the resiliency of the native species to harvest, land cover modifications, and other deliberate or inadvertent human land uses. In addition, the geometries of land cover, resulting from differences among the shapes, sizes, connectivities, and physical structures of the patches, corridors, and matrices that compose landscape mosaics, will constrain biodiversity, often in predictable ways. This article proposes a conceptual model that affirms that the continued persistence of native species may depend as much on the shifting of Andean landscape mosaics as on species characteristics themselves. Furthermore, mountains such as the Andes display long gradients of environmental conditions that alter in relation to latitude, soil moisture, aspect, and elevation. Global environmental change will shift these, especially temperature and humidity regimes along elevational gradients, causing changes outside the historical range of variation for some species. Both land-use systems and conservation efforts will need to respond spatially to these shifts in the future, at both landscape and regional scales.

    Sharpening the predictions of big-bang nucleosynthesis

    Motivated by the recent measurement of the primeval abundance of deuterium, we re-examine the nuclear inputs to big-bang nucleosynthesis (BBN). Using Monte Carlo realizations of the nuclear cross-section data to directly estimate the theoretical uncertainties for the yields of D, 3-He, and 7-Li, we show that previous estimates were a factor of 2 too large. We sharpen the BBN determination of the baryon density based upon deuterium, rho_B = (3.6 +/- 0.4) * 10^{-31} g/cm^3 (Omega_B h^2 = 0.019 +/- 0.0024), which leads to a predicted 4-He abundance, Y_P = 0.246 +/- 0.0014, and a stringent limit on the equivalent number of light neutrino species: N_nu < 3.20 (all at 95% CL). The predicted 7-Li abundance, 7-Li/H = (3.5 +1.1 -0.9) * 10^{-10}, is higher than that observed in Pop II stars, (1.7 +/- 0.3) * 10^{-10} (both at 95% CL). We identify key reactions and the energies where further work is needed. Comment: 5 pages, 4 figures (epsfig), REVTeX; submitted to Phys. Rev. Lett.
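The Monte Carlo error propagation described above can be sketched as follows. The "network" here is a hypothetical one-line stand-in for a real BBN reaction-network integration; the point is only the method of drawing many realizations of the input rates and reading the yield spread as the theoretical uncertainty:

```python
import numpy as np

rng = np.random.default_rng(42)

def yield_from_rates(rates):
    # Hypothetical stand-in for a BBN network integration: maps a
    # vector of three reaction rates to a single light-element yield.
    return rates[0] * np.sqrt(rates[1]) / rates[2]

# Illustrative central rates and fractional 1-sigma uncertainties
# (not real nuclear data).
central = np.array([1.0, 2.0, 0.5])
frac_err = np.array([0.05, 0.08, 0.03])

# Monte Carlo: draw many realizations of the rates within their
# measured uncertainties, recompute the yield for each realization.
samples = rng.normal(central, central * frac_err, size=(10000, 3))
yields = samples[:, 0] * np.sqrt(samples[:, 1]) / samples[:, 2]

# The spread of the yields is the propagated theoretical uncertainty.
print(yields.mean(), yields.std())
```

Propagating the actual cross-section data through a full network code in this way, rather than adding worst-case errors, is what lets the paper shrink the previously quoted uncertainties.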

    Towards More Precise Survey Photometry for PanSTARRS and LSST: Measuring Directly the Optical Transmission Spectrum of the Atmosphere

    Motivated by the recognition that variation in the optical transmission of the atmosphere is probably the main limitation to the precision of ground-based CCD measurements of celestial fluxes, we review the physical processes that attenuate the passage of light through the Earth's atmosphere. The next generation of astronomical surveys, such as PanSTARRS and LSST, will greatly benefit from dedicated apparatus to obtain atmospheric transmission data that can be associated with each survey image. We review and compare various approaches to this measurement problem, including photometry, spectroscopy, and LIDAR. In conjunction with careful measurements of instrumental throughput, atmospheric transmission measurements should allow next-generation imaging surveys to produce photometry of unprecedented precision. Our primary concerns are the real-time determination of aerosol scattering and absorption by water along the line of sight, both of which can vary over the course of a night's observations. Comment: 41 pages, 14 figures. Accepted PASP.

    Lack of Effect of Induction of Hypothermia after Acute Brain Injury

    Background Induction of hypothermia in patients with brain injury was shown to improve outcomes in small clinical studies, but the results were not definitive. To study this issue, we conducted a multicenter trial comparing the effects of hypothermia with those of normothermia in patients with acute brain injury. Methods The study subjects were 392 patients, 16 to 65 years of age, with coma after sustaining closed head injuries, who were randomly assigned to be treated with hypothermia (body temperature, 33°C), which was initiated within 6 hours after injury and maintained for 48 hours by means of surface cooling, or normothermia. All patients otherwise received standard treatment. The primary outcome measure was functional status six months after the injury. Results The mean age of the patients and the type and severity of injury in the two treatment groups were similar. The mean (±SD) time from injury to randomization was 4.3±1.1 hours in the hypothermia group and 4.1±1.2 hours in the normothermia group, and the mean time from injury to the achievement of the target temperature of 33°C in the hypothermia group was 8.4±3.0 hours. The outcome was poor (defined as severe disability, a vegetative state, or death) in 57 percent of the patients in both groups. Mortality was 28 percent in the hypothermia group and 27 percent in the normothermia group (P=0.79). The patients in the hypothermia group had more hospital days with complications than the patients in the normothermia group. Fewer patients in the hypothermia group had high intracranial pressure than in the normothermia group. Conclusions Treatment with hypothermia, with the body temperature reaching 33°C within eight hours after injury, is not effective in improving outcomes in patients with severe brain injury. (N Engl J Med 2001; 344:556-63.)

    Exploring the radio-loudness of SDSS quasars with spectral stacking

    © 2024 The Author(s). Published by Oxford University Press on behalf of the Royal Astronomical Society. This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. We use new 144 MHz observations over 5634 deg² from the LOFAR (Low Frequency Array) Two-metre Sky Survey (LoTSS) to compile the largest sample of uniformly selected, spectroscopically confirmed quasars from the 14th data release of the Sloan Digital Sky Survey (SDSS-DR14). Using the classical definition of radio loudness, R = log(L_1.4GHz / L_i), we identify 3697 radio-loud (RL) and 111 132 radio-quiet (RQ) sources at 0.6 < z < 3.4. To study their properties, we develop a new rest-frame spectral stacking algorithm, designed with forthcoming massively multiplexed spectroscopic surveys in mind, and use it to create high signal-to-noise composite spectra of each class, matched in redshift and absolute i-band magnitude. We show that RL quasars have redder continua and enhanced [O II] emission compared with their RQ counterparts. These results persist when additionally matching in black hole mass, suggesting that this parameter is not the defining factor in making a quasi-stellar object (QSO) RL. We find that these features do not vary gradually as a function of radio loudness, but are maintained even when probing deeper into the RQ population, indicating that a clear-cut division in radio loudness is not apparent. Upon examining the star formation rates (SFRs) inferred from the [O II] emission line, with the contribution from the active galactic nucleus removed using the [Ne V] line, we find that RL quasars have a significant excess of star formation relative to RQ quasars out to z = 1.9 at least. Given our findings, we suggest that RL sources either preferentially reside in gas-rich systems with rapidly spinning black holes, or represent an earlier obscured phase of QSO evolution.
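The radio-loudness definition quoted above, R = log(L_1.4GHz / L_i), can be written as a small helper. The dividing threshold used below (R > 1, i.e. a radio-to-optical luminosity ratio above 10) is a common convention assumed for illustration, not necessarily the exact cut used in this paper:

```python
import numpy as np

def radio_loudness(L_radio_1p4GHz, L_i):
    """Classical radio-loudness parameter R = log10(L_1.4GHz / L_i).

    Both luminosities must be in the same units so the ratio is
    dimensionless.
    """
    return np.log10(L_radio_1p4GHz / L_i)

def is_radio_loud(R, threshold=1.0):
    # threshold=1.0 (ratio > 10) is a common convention; an assumption
    # here, not the paper's stated cut.
    return R > threshold
```

Because R is a continuous quantity, any such threshold is a choice imposed on the population, which is consistent with the paper's finding that no clear-cut division in radio loudness is apparent.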