9,961 research outputs found

    LANDSAT/coastal processes

    The author has identified the following significant results. Correlations between the satellite radiance values and water color, Secchi disk visibility, turbidity, and attenuation coefficients were generally good. The residual scatter was due to several factors, including systematic errors in the remotely sensed data, small time and space variations in the water quality measurements, and errors caused by the experimental design. Satellite radiance values were closely correlated with the optical properties of the water.

    Physiology and natural distribution of the bacterium Caryophanon latum in fresh waters of Missouri

    Students supported: 2 B.S. The objective of this research program was to study the physiology of Caryophanon latum and to develop a selective medium for growing the organism to the exclusion of most, if not all, other organisms found in stream water. After development of this selective medium, the procedure was to be evaluated on sample stream waters with and without inoculation with the organism. The primary goal of this project was the development of a technique for the detection and possible enumeration of organisms of the genus Caryophanon in natural waters. Demonstration of the presence of these organisms would prove fecal pollution by ruminants, and enumeration methods would make it possible to locate the point of pollution. Project # A-048-MO, Agreement # 14-31-0001-352

    Results from the LSND Neutrino Oscillation Search

    The Liquid Scintillator Neutrino Detector (LSND) at the Los Alamos Meson Physics Facility sets bounds on neutrino oscillations in the appearance channel nu_mu_bar --> nu_e_bar by searching for the signature of the reaction nu_e_bar p --> e^+ n: an e^+ followed by a 2.2 MeV gamma ray from neutron capture. Five e^{+/-} -- gamma coincidences are observed in time with the LAMPF beam, with an estimated background of 6.2 events. The 90% confidence limits obtained are: Delta(m^2) < 0.07 eV^2 for sin^2(2theta) = 1, and sin^2(2theta) < 6 x 10^{-3} for Delta(m^2) > 20 eV^2. Comment: 10 pages, uses REVTeX and epsf macro

    Estimating Peak Demand for Beach Parking Spaces

    The United States Army Corps of Engineers planning guidance stipulates that in order for local beach communities to qualify for Federal cost share funds for Hurricane and Storm Damage Reduction beach renourishment projects, the community must provide public beach access and parking to satisfy peak demand. This study presents a method for estimating peak demand for beach parking spaces in the presence of parking constraints. A Tobit regression model is developed to estimate the number of parking spaces that would be necessary to meet unconstrained demand on a given percentage of peak demand days. For example, the model can be used to estimate the number of parking spaces that would be adequate to meet peak demand on 90% of peak parking days. The Tobit model provides a promising framework for estimating peak parking demand under constrained parking conditions, a situation that characterizes most beach communities.
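The censored-regression idea behind the Tobit model can be sketched with a small maximum-likelihood example. Everything below (the weather covariate, the capacity limit of 2.5, the simulated coefficients) is hypothetical and is not taken from the study; it only illustrates how right-censoring at a parking capacity enters the likelihood.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical data: latent parking demand depends on a weather index,
# but the observed count is right-censored at the lot's capacity.
n, capacity = 2000, 2.5
x = rng.uniform(0, 1, n)
latent = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)   # unconstrained demand
y = np.minimum(latent, capacity)                  # observed (censored) demand
censored = latent >= capacity

def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * x
    # Uncensored observations contribute a normal density; censored
    # observations contribute the probability mass above the capacity.
    ll_unc = norm.logpdf(y[~censored], mu[~censored], sigma)
    ll_cen = norm.logsf(capacity, mu[censored], sigma)
    return -(ll_unc.sum() + ll_cen.sum())

fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="BFGS")
b0_hat, b1_hat, sigma_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
```

In the study's setting, the censoring limit would be the existing parking supply, and a high quantile of the fitted latent-demand distribution (e.g., the 90th percentile across peak days) would translate into the recommended number of spaces.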

    CORBA-JS: An Open-Standards Framework for Distributed Object Computing over the Web

    Poster abstract. Distributed object computing (DOC) is a well-established software engineering paradigm for implementing distributed real-time and embedded (DRE) systems, such as real-time monitoring systems. Likewise, CORBA is a well-established DOC open standard used in DRE systems. Due to many technological limitations, DOC was traditionally unavailable in Web-based applications (i.e., stateful applications that communicate over HTTP and are accessible via a Web browser) without the use of proprietary, custom technologies. The problem with using proprietary, custom technology is that it creates fragmentation in the solution space, where some solutions are not available to all end-users (e.g., Web sites that only work within a certain Web browser because of the technology used). With the advent of HTML5 and WebSockets, an open standard for enabling two-way communication over HTTP, DOC now has the necessary technological foundations to be realized within Web applications without the use of proprietary, custom technologies. To date, however, no researchers have attempted to apply DOC over HTTP using well-established DOC open standards, such as CORBA. This research therefore is an initial investigation into implementing CORBA atop HTML5 and WebSockets. As part of this research, we are investigating the challenges in realizing the solution, and proposing ways to improve the target programming languages and the CORBA specification. Doing so will enable developers to create feature-rich real-time Web applications that improve upon current state-of-the-art approaches, e.g., Asynchronous JavaScript and XML (AJAX), which are resource intensive (e.g., use a lot of CPU, network bandwidth, and memory) and hard to program.
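The stub/skeleton pattern at the heart of CORBA-style DOC can be shown in miniature. The JSON "wire format" and the class names below are invented for illustration; they are not GIOP, not the authors' implementation, and the transport is a direct function call standing in for a WebSocket.

```python
import json

# Servant: the real object living on the "server" side.
class Thermometer:
    def read(self, scale):
        return 21.5 if scale == "C" else 70.7

# Skeleton: unmarshals a request and dispatches it to the servant.
class Skeleton:
    def __init__(self, servant):
        self.servant = servant
    def handle(self, raw):
        req = json.loads(raw)
        result = getattr(self.servant, req["op"])(*req["args"])
        return json.dumps({"result": result})

# Stub: presents the remote object as a local one by marshalling each
# method call into a message. A real system would push `raw` through a
# WebSocket instead of invoking the skeleton directly.
class Stub:
    def __init__(self, transport):
        self.transport = transport
    def __getattr__(self, op):
        def call(*args):
            raw = json.dumps({"op": op, "args": list(args)})
            return json.loads(self.transport(raw))["result"]
        return call

remote = Stub(Skeleton(Thermometer()).handle)
temperature = remote.read("C")   # looks like a local call, travels as JSON
```

The point of the pattern is that the client code never sees the marshalling; swapping the in-process transport for a WebSocket connection is what HTML5 newly makes possible without proprietary plugins.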

    The Lennard-Jones-Devonshire cell model revisited

    We reanalyse the cell theory of Lennard-Jones and Devonshire and find that, in addition to the critical point originally reported for the 12-6 potential (and widely quoted in standard textbooks), the model exhibits a further critical point. We show that the latter is actually a more appropriate candidate for liquid-gas criticality than the original critical point. Comment: 5 pages, 3 figures, submitted to Mol. Phy
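For reference, the 12-6 pair potential underlying the cell model, U(r) = 4ε[(σ/r)^12 − (σ/r)^6], has its minimum at r = 2^(1/6) σ with depth −ε; this generic check is not the cell-theory free energy itself, just the potential the model is built on.

```python
import math

def lj(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones 12-6 pair potential in reduced units."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

r_min = 2.0 ** (1.0 / 6.0)   # analytic minimum, ~1.1225 sigma
well_depth = lj(r_min)       # equals -epsilon
```

In the cell theory, each particle moves in the field of its neighbours smeared over a spherical cell, so the free energy (and hence any critical points) follows from integrating this potential over the cell volume.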

    A Digital Neuromorphic Architecture Efficiently Facilitating Complex Synaptic Response Functions Applied to Liquid State Machines

    Information in neural networks is represented as weighted connections, or synapses, between neurons. This poses a problem, as the primary computational bottleneck for neural networks is the vector-matrix multiply when inputs are multiplied by the neural network weights. Conventional processing architectures are not well suited for simulating neural networks, often requiring large amounts of energy and time. Additionally, synapses in biological neural networks are not binary connections, but exhibit a nonlinear response function as neurotransmitters are emitted and diffuse between neurons. Inspired by neuroscience principles, we present a digital neuromorphic architecture, the Spiking Temporal Processing Unit (STPU), capable of modeling arbitrary complex synaptic response functions without requiring additional hardware components. We consider the paradigm of spiking neurons with temporally coded information, as opposed to the non-spiking, rate-coded neurons used in most neural networks. In this paradigm we examine liquid state machines applied to speech recognition and show how a liquid state machine with temporal dynamics maps onto the STPU, demonstrating the flexibility and efficiency of the STPU for instantiating neural algorithms. Comment: 8 pages, 4 Figures, Preprint of 2017 IJCN
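The idea of a non-binary synaptic response can be sketched with an alpha-function kernel driving a leaky integrator neuron. All parameters below (time constant, weight, threshold, leak) are illustrative placeholders, not the STPU's actual design.

```python
import math

def alpha_kernel(t, tau=5.0):
    """Alpha-function synaptic response: rises then decays, peaking at t = tau
    with value 1, rather than acting as an instantaneous binary connection."""
    if t < 0:
        return 0.0
    return (t / tau) * math.exp(1.0 - t / tau)

def simulate(spike_times, weight=0.6, tau=5.0, threshold=1.0,
             leak=0.95, dt=1.0, steps=100):
    """Leaky integrator driven by input spikes filtered through the kernel.
    Returns the times at which the output neuron fires."""
    out, v = [], 0.0
    for step in range(steps):
        t = step * dt
        # Synaptic current: each input spike contributes a time-shifted kernel.
        i_syn = sum(weight * alpha_kernel(t - ts, tau) for ts in spike_times)
        v = leak * v + i_syn * dt
        if v >= threshold:
            out.append(t)
            v = 0.0   # reset the membrane after an output spike
    return out

fires = simulate(spike_times=[10.0, 12.0, 14.0])
```

Because the kernel spreads each input spike over time, closely spaced inputs summate and drive the neuron over threshold, which is the kind of temporal dynamics a liquid state machine exploits.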

    Lambda-Cold Dark Matter, Stellar Feedback, and the Galactic Halo Abundance Pattern

    (Abridged) The hierarchical formation scenario for the stellar halo requires the accretion and disruption of dwarf galaxies, yet low-metallicity halo stars are enriched in alpha-elements compared to similar, low-metallicity stars in dwarf spheroidal (dSph) galaxies. We address this primary challenge for the hierarchical formation scenario for the stellar halo by combining chemical evolution modelling with cosmologically-motivated mass accretion histories for the Milky Way dark halo and its satellites. We demonstrate that stellar halo and dwarf galaxy abundance patterns can be explained naturally within the LCDM framework. Our solution relies fundamentally on the LCDM model prediction that the majority of the stars in the stellar halo were formed within a few relatively massive, ~5 x 10^10 Msun, dwarf irregular (dIrr)-size dark matter halos, which were accreted and destroyed ~10 Gyr in the past. These systems necessarily have short-lived, rapid star formation histories, are enriched primarily by Type II supernovae, and host stars with enhanced [a/Fe] abundances. In contrast, dwarf spheroidal galaxies exist within low-mass dark matter hosts of ~10^9 Msun, where supernova winds are important in setting the intermediate [a/Fe] ratios observed. Our model includes enrichment from Type Ia and Type II supernovae as well as stellar winds, and includes a physically-motivated supernova feedback prescription calibrated to reproduce the local dwarf galaxy stellar mass-metallicity relation. We use representative examples of the type of dark matter halos we expect to host a destroyed "stellar halo progenitor" dwarf, a surviving dIrr, and a surviving dSph galaxy, and show that their derived abundance patterns, stellar masses, and gas masses are consistent with those observed for each type of system. Comment: 10 pages, 3 figures, version accepted by Ap

    Outlook for tuberculosis elimination in California: An individual-based stochastic model.

    Rationale: As part of the End TB Strategy, the World Health Organization calls for low-tuberculosis (TB) incidence settings to achieve pre-elimination (<10 cases per million) and elimination (<1 case per million) by 2035 and 2050, respectively. These targets require testing and treatment for latent tuberculosis infection (LTBI). Objectives: To estimate the ability and costs of testing and treatment for LTBI to reach pre-elimination and elimination targets in California. Methods: We created an individual-based epidemic model of TB, calibrated to historical cases. We evaluated the effects of increased testing (QuantiFERON-TB Gold) and treatment (three months of isoniazid and rifapentine). We analyzed four test and treat targeting strategies: (1) individuals with medical risk factors (MRF), (2) non-USB, (3) both non-USB and MRF, and (4) all Californians. For each strategy, we estimated the effects of increasing test and treat by a factor of 2, 4, or 10 from the base case. We estimated the number of TB cases occurring and prevented, and net and incremental costs from 2017 to 2065 in 2015 U.S. dollars. Efficacy, costs, adverse events, and treatment dropout were estimated from published data. We estimated the cost per case averted and per quality-adjusted life year (QALY) gained. Measurements and main results: In the base case, 106,000 TB cases are predicted to 2065. Pre-elimination was achieved by 2065 in three scenarios: a 10-fold increase in the non-USB and persons with MRF (by 2052), and a 4- or 10-fold increase in all Californians (by 2058 and 2035, respectively). TB elimination was not achieved in any intervention scenario. The most aggressive strategy, a 10-fold increase in all Californians, achieved a case rate of 8 (95% UI 4-16) per million by 2050. Of the scenarios that reached pre-elimination, the incremental net cost was $20 billion (non-USB and MRF) to $48 billion. These had an incremental cost per QALY of $657,000 to $3.1 million. A more efficient but somewhat less effective single-lifetime test strategy reached as low as $80,000 per QALY. Conclusions: Substantial gains can be made in TB control in coming years by scaling up current testing and treatment in non-USB and those with medical risks.
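The logic of an individual-based stochastic model of this kind can be caricatured in a few lines. Every number below (population size, LTBI prevalence, activation risk, coverage, efficacy) is invented for illustration and is not calibrated to California data, and real models add transmission, demographics, costs, and QALY weights.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cohort; none of these parameters come from the study.
n_people = 100_000
ltbi_prev = 0.04            # baseline LTBI prevalence
annual_activation = 0.001   # yearly risk of progressing to active TB
years = 30

has_ltbi = rng.random(n_people) < ltbi_prev
coverage_draws = rng.random(n_people)       # who gets tested and treated
year_draws = rng.random((n_people, years))  # per person-year activation draws

def tb_cases(coverage, efficacy=0.9):
    """Active-TB cases over the horizon under a test-and-treat scenario.

    Scenarios share the pre-drawn random numbers (common random numbers),
    so scaling up coverage can only remove cases, never add them.
    """
    cured = has_ltbi & (coverage_draws < coverage * efficacy)
    activates = (year_draws < annual_activation).any(axis=1)
    return int(np.count_nonzero(has_ltbi & ~cured & activates))

base = tb_cases(coverage=0.0)     # status quo: no expanded testing
scaled = tb_cases(coverage=0.4)   # hypothetical scaled-up coverage
```

Comparing scenarios on common random numbers, as here, is a standard variance-reduction device when reporting differences between intervention arms of a stochastic simulation.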

    Biological Records Centre Annual Report 2005-2006

    The period covered by this report is the first year of a new six-year partnership between CEH and JNCC. For this period, there is increased emphasis on targeted survey, on analysis and interpretation and on communications and outreach. These activities were always part of BRC’s work, but they have been given greater prominence as a result of rapid developments in information technology. Data are increasingly reaching BRC in electronic form, so that the effort of data entry and collation is reduced. The data, collected by many volunteers and then collated and analysed at BRC, document the changing status and distribution of plants and animals in Britain. Distribution maps are published in atlases and are available via the internet through the NBN Gateway. The effects of change or loss of habitats, the influence of climate change and the consequences of changing water quality are all examples of the environmental factors that affect our biodiversity and which BRC aims to document and understand. The results are vital for developing environmental policies, to support conservation, and for fundamental ecological research. BRC is funded jointly by JNCC and NERC through a partnership based on a Memorandum of Agreement (MoA). The partnership started in 1973 when the Nature Conservancy was divided to form the successor bodies Nature Conservancy Council (NCC) and Institute of Terrestrial Ecology (ITE). NCC was in turn divided further to form JNCC and three Country Agencies, while ITE was merged with other NERC units to form CEH. Through all these changes, the partnership has been maintained. A six-year memorandum of agreement ended on 31 January 2005 (Hill et al. 2005). The present report covers the first full year, 2005-6, of the new agreement for 2005-2010. 
Rapid progress in information technology continues to be highly beneficial for BRC, whose data are increasingly used by the UK country conservation agencies, environmental consultants, NGOs, research workers, policy makers and volunteers. It is gratifying to know that, through our ability to display data on the National Biodiversity Network (NBN) Gateway, some of our data suppliers now have immediate access to their own data in a convenient form. The year 2005-6 has been one of steady progress, with new datasets added to BRC, substantial additions to existing data, and improved communication with the NBN Gateway. The most high profile activity of the year has been the Harlequin Ladybird Survey, which has enabled us to observe the early stages of colonization by a mobile insect in greater detail than has been possible in any previous case
