
    Hydrographic data from the OPTOMA program OPTOMA21 7 - 20 July 1986

    The OPTOMA (Ocean Prediction Through Observation, Modeling and Analysis) program, a joint NPS/Harvard program sponsored by ONR, seeks to understand the mesoscale (fronts, eddies, and jets) variability and dynamics of the California Current System and to determine the scientific limits to practical mesoscale ocean forecasting. To help carry out the aims of this project, a series of cruises has been planned in two subdomains, NOCAL and CENCAL. OPTOMA21 was a multidisciplinary study which took place from 7 to 20 July 1986 aboard the R/V Point Sur in the NOCAL domain. In addition to conducting a quasi-synoptic CTD/XBT mapping of a cool anomaly, meandering jet, and eddy system, measurements were made to determine: 1) the fine-scale variability of the upper ocean mass and velocity fields; 2) the upper ocean nutrient, optical, and phytoplankton fields; and 3) the structure of the zooplankton population. In this report, the CTD/XBT data are presented.
    Research project "Ocean Prediction Through Observation, Modeling and Analysis" sponsored by the Physical Oceanography Program of the Office of Naval Research under Program Element 61153N. http://archive.org/details/hydrographicda20jul86witt N000146WR24027. Approved for public release; distribution is unlimited.

    Hydrographic data from the OPTOMA program OPTOMA20: Leg P, 16 March 1986; Leg MI, 24 March - 3 April 1986; Leg MII, 7 - 15 April 1986; Leg D, 25 April - 6 May 1986

    The OPTOMA (Ocean Prediction Through Observation, Modeling, and Analysis) program, a joint NPS/Harvard program sponsored by ONR, seeks to understand the mesoscale (fronts, eddies, and jets) variability and dynamics of the California Current System and to determine the scientific limits to practical mesoscale ocean forecasting. To help carry out the aims of this project, a series of cruises has been planned in two subdomains, NOCAL and CENCAL. Three cruises were undertaken during March, April, and May 1986: two (Legs MI and MII) on the NOAA ship McARTHUR and one (Leg D) on the USNS DE STEIGUER. In addition, one P-3 overflight (Leg P) was made one week before the first cruise. Leg P, on 16 March, sampled a domain approximately 240 km square centered about 280 km off the coast between Pt. Arena and Cape Mendocino, with additional transects from and to San Francisco. Leg MI was carried out from 24 March to 3 April (Figure 8), Leg MII from 7 to 15 April (Figure 20), and Leg D from 25 April to 6 May (Figure 32). Each cruise sampled the same domain as Leg P. On these cruises, oceanographic stations were occupied at approximately 18 km intervals along each track.
    Prepared for: Office of Naval Research, Environmental Sciences Directorate. http://archive.org/details/hydrographicdata008cian N0001486WR24027. Approved for public release; distribution is unlimited.

    Cluster Lenses

    Clusters of galaxies are the most recently assembled, massive, bound structures in the Universe. As predicted by General Relativity, given their masses, clusters strongly deform space-time in their vicinity and act as some of the most powerful gravitational lenses in the Universe. Light rays traversing clusters from distant sources are deflected, and the resulting images of these distant objects appear distorted and magnified. Lensing by clusters occurs in two regimes, each with unique observational signatures. The strong lensing regime is characterized by effects readily seen by eye, namely the production of giant arcs, multiple images, and arclets. The weak lensing regime is characterized by small deformations in the shapes of background galaxies, detectable only statistically. Cluster lenses have been exploited successfully to address several important current questions in cosmology: (i) the study of the lens(es) - understanding cluster mass distributions and issues pertaining to cluster formation and evolution, as well as constraining the nature of dark matter; (ii) the study of the lensed objects - probing the properties of the background lensed galaxy population, which is statistically at higher redshifts and of lower intrinsic luminosity, thus enabling the probing of galaxy formation at the earliest times, right up to the Dark Ages; and (iii) the study of the geometry of the Universe - as the strength of lensing depends on the ratios of angular diameter distances between the lens, source, and observer, lens deflections are sensitive to the values of cosmological parameters and offer a powerful geometric tool to probe Dark Energy. In this review, we present the basics of cluster lensing and provide a current status report of the field.
    Comment: About 120 pages. Published in Open Access at: http://www.springerlink.com/content/j183018170485723/ . arXiv admin note: text overlap with arXiv:astro-ph/0504478 and arXiv:1003.3674 by other authors.
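
    The geometric sensitivity in point (iii) can be made concrete with the standard thin-lens expressions (a textbook sketch; these symbols do not appear in the abstract itself). The lensing strength, the convergence κ, is the projected surface mass density Σ in units of a critical value set entirely by the angular diameter distances to the lens (D_l), to the source (D_s), and from lens to source (D_ls):

        \kappa(\vec{\theta}) = \frac{\Sigma(\vec{\theta})}{\Sigma_{\rm cr}},
        \qquad
        \Sigma_{\rm cr} = \frac{c^{2}}{4\pi G}\,\frac{D_{s}}{D_{l}\,D_{ls}}

    Because the ratio D_ls/D_s depends on the cosmological model, a single cluster lenses sources at different redshifts with different strengths, which is the geometric handle on Dark Energy described above.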

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with a field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen-second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects at both low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring the properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy.
    Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo
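
    Two quantities follow directly from the numbers quoted above. This is an illustrative back-of-envelope sketch in Python; the variable names and calculations are ours, not the Science Book's.

        import math

        # Figures quoted in the abstract above.
        effective_aperture_m = 6.7    # effective aperture (m)
        field_of_view_deg2 = 9.6      # camera field of view (deg^2)
        visits_per_pointing = 2000    # exposures per pointing over the survey
        exposure_s = 15.0             # single exposure length (s)

        # Etendue (collecting area x field of view), the usual figure of
        # merit for survey speed.
        area_m2 = math.pi * (effective_aperture_m / 2) ** 2
        print(f"etendue ~ {area_m2 * field_of_view_deg2:.0f} m^2 deg^2")  # ~338

        # Total integration accumulated on each pointing over the survey.
        hours = visits_per_pointing * exposure_s / 3600
        print(f"integration per pointing ~ {hours:.1f} h")  # ~8.3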

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg^2 field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ~24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg^2 with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg^2 region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r~27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world.
    Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
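
    The quoted coadded depth can be roughly reproduced from the single-visit depth under the idealized assumption that point-source flux noise averages down as 1/sqrt(N). A minimal Python sketch (the even split of visits across bands is our crude assumption, not a statement of the actual cadence):

        import math

        def coadded_depth(single_visit_depth, n_visits):
            # Flux noise scales as 1/sqrt(N), so the magnitude limit deepens
            # by 2.5*log10(sqrt(N)) = 1.25*log10(N).
            return single_visit_depth + 1.25 * math.log10(n_visits)

        # From the abstract: single-visit depth r ~ 24.5 (AB), ~800 visits per
        # pointing summed over all six bands. Splitting evenly gives ~133 in r.
        print(f"{coadded_depth(24.5, 800 // 6):.1f}")  # ~27.2, vs the r~27.5 quoted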

    Use of SMS texts for facilitating access to online alcohol interventions: a feasibility study

    A41 Use of SMS texts for facilitating access to online alcohol interventions: a feasibility study In: Addiction Science & Clinical Practice 2017, 12(Suppl 1): A4

    Lawson criterion for ignition exceeded in an inertial fusion experiment

    For more than half a century, researchers around the world have been engaged in attempts to achieve fusion ignition as a proof of principle of various fusion concepts. Following the Lawson criterion, an ignited plasma is one where the fusion heating power is high enough to overcome all the physical processes that cool the fusion plasma, creating a positive thermodynamic feedback loop with rapidly increasing temperature. In inertially confined fusion, ignition is a state where the fusion plasma can begin "burn propagation" into surrounding cold fuel, enabling the possibility of high energy gain. While "scientific breakeven" (i.e., unity target gain) has not yet been achieved (here the target gain is 0.72: 1.37 MJ of fusion yield for 1.92 MJ of laser energy), this Letter reports the first controlled fusion experiment, using laser indirect drive on the National Ignition Facility, to produce capsule gain (here 5.8) and to reach ignition by nine different formulations of the Lawson criterion.
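
    The two gain figures quoted above use different denominators, which the arithmetic makes explicit. A minimal sketch using only the numbers in the abstract (the capsule-absorbed energy is not given there, so capsule gain cannot be recomputed):

        # Target gain compares fusion yield to the laser energy delivered to
        # the target; capsule gain divides by the much smaller energy actually
        # coupled into the capsule, which is how it can exceed unity (5.8)
        # while target gain stays below it.
        fusion_yield_MJ = 1.37
        laser_energy_MJ = 1.92
        print(f"target gain = {fusion_yield_MJ / laser_energy_MJ:.2f}")  # 0.71 (~0.72 as quoted)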

    Controlling Product Risks When Consumers are Heterogeneously Overconfident: Producer Liability vs. Minimum Quality Standard Regulation

    Contributing to the literature on the consequences of behavioral biases for market outcomes and institutional design, we contrast producer liability and minimum quality standard regulation as alternative means of social control of product-related torts when consumers are heterogeneously overconfident about the risk of harm. We elucidate the factors shaping the relative desirability of strict liability vis-à-vis minimum quality standard regulation from a social welfare standpoint. We also clarify when and why the joint use of strict liability and minimum quality standard regulation welfare-dominates the exclusive use of either mode of social control of torts.