
    On the dialog between experimentalist and modeler in catchment hydrology

    The dialog between experimentalist and modeler in catchment hydrology has been minimal to date. The experimentalist often has a highly detailed yet largely qualitative understanding of dominant runoff processes, so there is often much more information content on the catchment than is used to calibrate a model. While modelers often appreciate the need for 'hard data' in the model calibration process, little thought has been given to how modelers might access this 'soft' process knowledge. We present a new method in which soft data (i.e., qualitative knowledge from the experimentalist that cannot be used directly as exact numbers) are made useful through fuzzy measures of model-simulation and parameter-value acceptability. We developed a three-box lumped conceptual model for the Maimai catchment in New Zealand, a particularly well-studied process-hydrological research catchment. The boxes represent the key hydrological reservoirs, which are known to have distinct groundwater dynamics, isotopic composition, and solute chemistry. The model was calibrated against hard data (runoff and groundwater levels) as well as a number of criteria derived from the soft data (e.g., percent new water and reservoir volume). We achieved very good fits for the three-box model when optimizing the parameter values against runoff alone (Reff = 0.93). However, parameter sets obtained in this way generally showed a poor goodness-of-fit for other criteria, such as the simulated new-water contributions to peak runoff. Including soft-data criteria in the model calibration resulted in lower Reff values (around 0.84 when including all criteria) but better overall performance, as judged by the experimentalist’s view of catchment runoff dynamics. The model performance with respect to soft data (for instance, the new-water ratio) increased significantly, and parameter uncertainty was reduced by 60% on average with the introduction of the soft-data multi-criteria calibration. We argue that accepting lower model efficiencies for runoff is 'worth it' if one can develop a more 'real' model of catchment behavior. The use of soft data is an approach to formalize this exchange between experimentalist and modeler and to more fully utilize the information content from experimental catchments.
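    To make the fuzzy-measure idea concrete, here is a minimal sketch, not the paper's implementation: a trapezoidal membership function turns an experimentalist's soft statement (for example, that the new-water fraction at peak flow should lie roughly within some plausible band) into an acceptability score between 0 and 1, which is then blended with the hard Nash-Sutcliffe runoff efficiency (Reff). All function names, bounds, and weights below are illustrative assumptions.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a and above d,
    1 between b and c, linear ramps on [a, b] and [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x > c:
        return (d - x) / (d - c)
    return 1.0

def nash_sutcliffe(obs, sim):
    """Hard criterion: Nash-Sutcliffe efficiency (Reff) of simulated runoff."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def combined_score(obs_q, sim_q, sim_new_water_fraction):
    """Blend the hard runoff efficiency with a soft-data acceptability for
    the simulated new-water fraction at peak flow.  The trapezoid corners
    (0.1, 0.2, 0.4, 0.5) and the equal weighting are made-up values an
    experimentalist might supply, not numbers from the paper."""
    reff = nash_sutcliffe(obs_q, sim_q)
    soft = trapezoid(sim_new_water_fraction, 0.1, 0.2, 0.4, 0.5)
    return 0.5 * max(reff, 0.0) + 0.5 * soft
```

    A calibration routine could then retain only parameter sets whose combined score exceeds a chosen acceptability threshold, which is how soft criteria narrow the parameter space.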

    Hypervelocity impact microfoil perforations in the LEO space environment (LDEF, MAP AO-023 experiment)

    The Microabrasion Foil Experiment comprises arrays of frames, each supporting two layers of closely spaced metallic foils and a back-stop plate. The arrays, deploying aluminum and brass foils ranging in thickness from 1.5 to some 30 microns, were exposed for 5.78 years on NASA's LDEF at a mean altitude of 458 km. They were deployed on the North, South, East, West, and Space-pointing faces; the results presented comprise the perforation rates for each location as a function of foil thickness. Initial results refer primarily to aluminum of 5 microns thickness or greater. This penetration distribution, comprising 2,342 perforations in total, shows significantly differing characteristics for each detector face. When the dynamics of particulate orbital mechanics are incorporated, the anisotropy confirms the dominance of extraterrestrial particulates among those penetrating thicknesses greater than 20 microns of Al foil, yielding fluxes compatible with hyperbolic geocentric velocities. For thinner foils, a disproportionate increase in the flux of particles on the East, North, and South faces shows the presence of orbital particulates, whose perforation rate at 5-micron foil thickness exceeds that of the extraterrestrial component by a factor of approximately 4.
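    The basic reduction from perforation counts to a flux is simple enough to sketch. The snippet below uses the 5.78-year exposure quoted above, while the counts, foil areas, and thicknesses are placeholder values rather than the MAP AO-023 data.

```python
# Minimal sketch of turning raw perforation counts into a flux
# (perforations per square metre per second).  Only the 5.78-year
# exposure comes from the abstract; everything else is a placeholder.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
EXPOSURE_S = 5.78 * SECONDS_PER_YEAR   # mission exposure quoted in the abstract

# Hypothetical (face, foil thickness in microns, perforation count, foil area in m^2).
observations = [
    ("East",  5.0, 310, 0.12),
    ("West",  5.0,  80, 0.12),
    ("East", 20.0,  25, 0.12),
]

for face, thickness_um, count, area_m2 in observations:
    flux = count / (area_m2 * EXPOSURE_S)
    print(f"{face:5s} {thickness_um:5.1f} um: {flux:.3e} perforations m^-2 s^-1")
```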

    Microscopic Description of Nuclear Fission: Fission Barrier Heights of Even-Even Actinides

    We evaluate the performance of modern nuclear energy density functionals for predicting inner and outer fission barrier heights and energies of fission isomers of even-even actinides. For isomer energies and outer barrier heights, we find that the self-consistent theory at the HFB level is capable of providing quantitative agreement with empirical data.
    Comment: 8 pages, 6 figures, 1 table; Proceedings of the 5th International Conference on "Fission and properties of neutron-rich nuclei" (ICFN5), Sanibel Island, Nov. 4-10, 201

    Intercomparison of soil pore water extraction methods for stable isotope analysis

    Funded by NSERC Discovery Grant, U.S. Forest Service, and the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy, Bioenergy Technologies Office. Peer reviewed. Postprint.

    The in-situ cometary particulate size distribution measured for one comet: P/Halley

    The close approach of Giotto to comet Halley during its 1986 apparition offered an opportunity to study the particulate mass distribution up to masses of one gram. Data acquired by the front-end channels of the highly sensitive mass spectrometer PIA and the dust shield detector system, DIDSY, define the detected distribution as close as 1000 km to the nucleus. Dynamic motion of the particulates after emission leads to a spatial differentiation affecting the size distribution in several ways: (1) ejecta velocity dispersion; (2) radiation pressure; (3) varying heliocentric distance; and (4) anisotropic nucleus emission. Transformation of the in-situ distribution from PIA and DIDSY, weighted heavily by the near-nucleus fluxes, leads to a presumed nucleus distribution. The data reveal a puzzling feature at large masses that is not readily explained within an otherwise monotonic power-law distribution. Although temporal changes in nucleus activity could and do modify the in-situ size distribution, such an explanation is not wholly satisfactory, because the same form is observed at differing locations in the coma where the time of flight from the nucleus varies greatly. Thus neither a general change in comet activity nor spatial variations provide a satisfactory explanation.
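    The power-law form mentioned above is usually characterized through the cumulative number of particles above a given mass, N(>m) ∝ m^-α. The sketch below fits such a law to synthetic masses; the exponent, mass range, and sample size are illustrative assumptions, since the PIA/DIDSY fluxes themselves are not reproduced here.

```python
import numpy as np

# Minimal sketch: fit a cumulative power law N(>m) ∝ m^(-alpha) to a set of
# particle masses.  The synthetic masses stand in for the PIA/DIDSY
# measurements; the chosen exponent (0.8) and minimum mass are
# illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)
alpha_true = 0.8
m_min = 1e-12                                    # grams
masses = m_min * (rng.pareto(alpha_true, size=5000) + 1.0)

# Cumulative distribution: sort descending so N(>m) is simply the rank.
m_sorted = np.sort(masses)[::-1]
n_greater = np.arange(1, m_sorted.size + 1)

# Least-squares fit of log N(>m) = const - alpha * log m.
slope, intercept = np.polyfit(np.log10(m_sorted), np.log10(n_greater), 1)
print(f"fitted cumulative exponent alpha ≈ {-slope:.2f}")
```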

    Local Newspaper Agenda-Setting as Reflected in Letters to the Editor

    In an effort to test basic agenda-setting theory, a content analysis of three small to mid-size Illinois newspapers was conducted to determine what correlation, if any, existed between the content of the front page and the issues addressed in the letters-to-the-editor section. Pearson product-moment correlations were calculated for all issues addressed by the papers and the public. No support was found for basic agenda-setting in this study. There was partial support for the hypothesis that local newspapers would be more effective in setting the local issue agenda.
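    For reference, a Pearson product-moment correlation of the kind used in the study can be computed as below; the issue labels and frequency counts are invented for illustration and are not data from the three newspapers.

```python
import numpy as np

# Hypothetical issue-frequency counts: how often each issue appeared on the
# front page versus in letters to the editor.  These numbers are invented
# for illustration; they are not data from the study.
issues     = ["schools", "crime", "taxes", "roads", "zoning"]
front_page = np.array([12,  9,  7,  4,  2], dtype=float)
letters    = np.array([ 5, 11,  6,  3,  8], dtype=float)

# Pearson product-moment correlation: covariance normalized by the
# product of the standard deviations.
r = np.corrcoef(front_page, letters)[0, 1]
print(f"Pearson r = {r:.2f}")
```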