Matching Long-Term Fire Effects Research to Pressing Questions Facing Tallgrass Prairie Managers across the Upper Midwest
The goal of this paper is to explore how a network of coordinated prescribed fire experiments could be developed and applied to tallgrass prairie management. In a 2011 survey conducted by the Tallgrass Prairie and Oak Savanna Fire Science Consortium in its region, 61% of 207 land managers indicated that their greatest need with respect to fire regimes was information on the outcomes of variations in fire frequency and season, with available information on these variables ranging from limited to completely lacking. The need for this kind of information was echoed during a breakout discussion session at the 2016 North American Prairie Conference, where researchers and land managers shared their opinions on how the potential costs and benefits of developing a research network with experimental treatments could be relevant to management needs. The discussion was encouraging, although researchers noted funding as an important barrier. An example of the informative nature of long-term fire studies is ongoing at the University of Nebraska at Omaha, where an experiment established in 1978 has shown strong differences in vegetation and soils among plots burned in different seasons and at different frequencies. A network of sites replicating this type of experiment across the region would inform land-management decisions at a broad array of sites representing a variety of soils, weather, climate, and plant species, including invasive plants. All of these variables have been hypothesized to be important predictors of fire effects at some location, but their relative importance across the region has not been quantified through monitoring or research. In this paper, we outline potential steps for a sustained effort to investigate the benefits and risks of engaging in and funding a regional fire research network.
Investigating the parametric dependence of the impact of two-way coupling on inertial particle settling in turbulence
Tom et al. (J. Fluid Mech. 947, A7, 2022) investigated the impact of
two-way coupling (2WC) on particle settling in turbulence. For the limited
parameter choices explored, it was found that 2WC substantially enhances
particle settling compared to the one-way coupled (1WC) case, even at low mass
loading. Moreover, contrary to previous claims, it was demonstrated
that preferential sweeping remains the mechanism responsible for the particles
settling faster than the Stokes settling velocity in 2WC flows. However,
crucial questions remain: 1) how small must the mass loading be for the effects
of 2WC on particle settling to be negligible? 2) does the preferential sweeping
mechanism remain relevant in 2WC flows as the mass loading is increased? To
answer these, we explore a much broader portion of the parameter space, and our
simulations cover cases where the impact of 2WC on the global fluid statistics
ranges from negligible to strong. We find that even for low mass loadings, 2WC
can noticeably increase the settling for some choices of the
Stokes and Froude numbers. We also demonstrate that even when the mass loading
is large enough for the global fluid statistics to be strongly affected by the
particles, preferential sweeping is still the mechanism responsible for the
enhanced particle settling. The difference between the 1WC and 2WC cases is
that, in the latter, the particles are not merely swept around the
downward-moving side of vortices, but they also drag the fluid with them as
they move down.
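The dimensionless groups named in this abstract (Stokes number, Froude number, mass loading) can be sketched with their standard definitions from the particle-laden turbulence literature; the specific formulas and the illustrative input values below are assumptions, not taken from the paper:

```python
import math

def settling_parameters(rho_p, rho_f, d, nu, epsilon, g=9.81, phi_v=1e-4):
    """Standard non-dimensional groups for inertial particle settling in
    turbulence (textbook definitions; not the paper's exact conventions)."""
    tau_p = rho_p * d**2 / (18.0 * rho_f * nu)   # Stokes particle response time
    tau_eta = math.sqrt(nu / epsilon)            # Kolmogorov time scale
    a_eta = epsilon**0.75 * nu**-0.25            # Kolmogorov acceleration
    St = tau_p / tau_eta                         # Stokes number
    Fr = a_eta / g                               # Froude number
    v_s = tau_p * g                              # Stokes settling velocity
    phi_m = (rho_p / rho_f) * phi_v              # mass loading from volume fraction
    return {"St": St, "Fr": Fr, "v_s": v_s, "phi_m": phi_m}

# Illustrative values: 50-micron water droplets in air-like turbulence
params = settling_parameters(rho_p=1000.0, rho_f=1.2, d=50e-6,
                             nu=1.5e-5, epsilon=0.1)
```

The mass loading phi_m is the ratio of particle mass to fluid mass per unit volume; when it is small, the particles' back-reaction on the fluid (the 2WC effect) is often assumed negligible, which is exactly the assumption the paper interrogates.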
ARIES: A Corpus of Scientific Paper Edits Made in Response to Peer Reviews
Revising scientific papers based on peer feedback is a challenging task that
requires not only deep scientific knowledge and reasoning, but also the ability
to recognize the implicit requests in high-level feedback and to choose the
best of many possible ways to update the manuscript in response. We introduce
this task for large language models and release ARIES, a dataset of review
comments and their corresponding paper edits, to enable training and evaluating
models. We study two versions of the task: comment-edit alignment and edit
generation, and evaluate several baselines, including GPT-4. We find that
models struggle even to identify the edits that correspond to a comment,
especially in cases where the comment is phrased in an indirect way or where
the edit addresses the spirit of a comment but not the precise request. When
tasked with generating edits, GPT-4 often succeeds in addressing comments on a
surface level, but it rigidly follows the wording of the feedback rather than
the underlying intent, and includes fewer technical details than human-written
edits. We hope that our formalization, dataset, and analysis will form a
foundation for future work in this area.
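As an illustration of why the comment-edit alignment task is hard for surface-level approaches, a naive lexical-overlap baseline (entirely hypothetical; not the paper's method, baselines, or the ARIES data format) might look like this:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def align(comment, edits, threshold=0.2):
    """Return indices of candidate edits whose wording overlaps the review
    comment. A surface-level baseline like this misses exactly the cases the
    paper highlights: indirect comments and edits that address the spirit
    rather than the wording of the feedback."""
    c = Counter(comment.lower().split())
    scores = [cosine(c, Counter(e.lower().split())) for e in edits]
    return [i for i, s in enumerate(scores) if s >= threshold]

comment = "Please report the variance across random seeds."
edits = ["We now report the variance across five random seeds in Table 2.",
         "Fixed a typo in Section 3."]
print(align(comment, edits))  # → [0]
```

An edit that satisfies the same request with different wording (e.g. "error bars over repeated runs") would score near zero here, which is the failure mode the abstract describes for models as well.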
The Presence of Persistent Bovine Viral Diarrhea Virus Infection and a Novel Bosavirus in a Bison Herd
Objective: Bovine viral diarrhea virus (BVDV) is a significant pathogen of cattle, leading to losses due to reproductive failure, respiratory disease, and immune dysregulation. An investigation was conducted in an American bison (Bison bison) herd experiencing reproductive issues during the 2018–2019 calving season to determine the likely cause of the losses.
Options for Affordable Fission Surface Power Systems
Fission surface power systems could provide abundant power anywhere on the surface of the moon or Mars. Locations could include permanently shaded regions on the moon and high latitudes on Mars. To be fully utilized, however, fission surface power systems must be safe, have adequate performance, and be affordable. This paper discusses options for the design and development of such systems.
Options for Affordable Planetary Fission Surface Power Systems
Nuclear fission systems could serve as "workhorse" power plants for the Vision for Space Exploration. In this context, the "workhorse" power plant is defined as a system that could provide power anywhere on the surface of the moon or Mars, land on the moon using a Robotic Lunar Exploration Program (RLEP)-developed lander, and would be a viable, affordable option once power requirements exceed that which can be provided by existing energy systems
Testing in Support of Fission Surface Power System Qualification
The strategy for qualifying a fission surface power (FSP) system could have a significant programmatic impact. The US has not qualified a space fission power system since the launch of the SNAP-10A in 1965. This paper explores cost-effective options for obtaining data that would be needed for flight qualification of a fission system. Qualification data could be obtained from both nuclear and non-nuclear testing. The ability to perform highly realistic non-nuclear testing has advanced significantly over the past four decades. Instrumented thermal simulators were developed during the 1970s and 1980s to assist in the development, operation, and assessment of terrestrial fission systems. Instrumented thermal simulators optimized for assisting in the development, operation, and assessment of modern FSP systems have been under development (and utilized) since 1998. These thermal simulators enable heat from fission to be closely mimicked (axial power profile, radial power profile, temperature, heat flux, etc.) and extensive data to be taken from the core region. For transient testing, pin power during a transient is calculated based on the reactivity feedback that would occur given measured values of test-article temperature and/or dimensional changes. The reactivity feedback coefficients needed for the test are either calculated or measured using cold/warm zero-power criticals. In this way, non-nuclear testing can provide very realistic information related to nuclear operation. Non-nuclear testing can be used at all levels, including component, subsystem, and integrated system testing. FSP fuels and materials are typically chosen to ensure very high confidence in operation at design burnups, fluences, and temperatures. However, facilities exist (e.g. ATR, HFIR) for affordably performing in-pile fuel and materials irradiations, if such testing is desired. Ex-core materials and components (such as alternator materials, control drum drives, etc.)
could be irradiated in university or DOE reactors to ensure adequate radiation resistance. Facilities also exist for performing warm and cold zero-power criticals
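The transient-testing approach described above (computing pin power from reactivity feedback given measured test-article temperatures) can be sketched with a one-group point-kinetics model. All coefficients below are illustrative assumptions, not values from any FSP program:

```python
# Hypothetical sketch: drive an electrically heated test article with the
# power a reactor would produce, using point kinetics with temperature feedback.
BETA = 0.0065     # delayed-neutron fraction (illustrative)
LAMBDA = 1e-4     # neutron generation time, s (illustrative)
LAM_D = 0.08      # one-group delayed-precursor decay constant, 1/s (illustrative)
ALPHA_T = -2e-5   # temperature reactivity coefficient, dk/k per K (assumed)
T_REF = 900.0     # reference core temperature, K (assumed)

def step_power(P, C, T_measured, rho_ext, dt=1e-3):
    """One explicit-Euler step of one-group point kinetics.
    T_measured would come from thermocouples on the non-nuclear test article;
    the feedback coefficient would come from calculations or zero-power
    criticals, as described in the text."""
    rho = rho_ext + ALPHA_T * (T_measured - T_REF)   # net reactivity
    dP = ((rho - BETA) / LAMBDA) * P + LAM_D * C     # prompt + delayed source
    dC = (BETA / LAMBDA) * P - LAM_D * C             # precursor balance
    return P + dP * dt, C + dC * dt

# Start at equilibrium (rho = 0), then hold the article 50 K above reference;
# negative feedback pulls the simulated power below its initial value.
P, C = 1.0, (BETA / (LAMBDA * LAM_D)) * 1.0
for _ in range(1000):
    P, C = step_power(P, C, T_measured=950.0, rho_ext=0.0)
```

In an actual facility the computed power would be fed back to the thermal simulators each time step, closing the loop between measured temperatures and commanded heater power.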
Global mean surface temperature and climate sensitivity of the early Eocene Climatic Optimum (EECO), Paleocene–Eocene Thermal Maximum (PETM), and latest Paleocene
Accurate estimates of past global mean surface temperature (GMST) help to contextualise future climate change and are required to estimate the sensitivity of the climate system to CO2 forcing through Earth's history. Previous GMST estimates for the latest Paleocene and early Eocene (∼57 to 48 million years ago) span a wide range (∼9 to 23 °C higher than pre-industrial) and prevent an accurate assessment of climate sensitivity during this extreme greenhouse climate interval. Using the most recent data compilations, we employ a multi-method experimental framework to calculate GMST during the three DeepMIP target intervals: (1) the latest Paleocene (∼57 Ma), (2) the Paleocene–Eocene Thermal Maximum (PETM; 56 Ma), and (3) the early Eocene Climatic Optimum (EECO; 53.3 to 49.1 Ma). Using six different methodologies, we find that the average GMST estimate (66 % confidence) during the latest Paleocene, PETM, and EECO was 26.3 °C (22.3 to 28.3 °C), 31.6 °C (27.2 to 34.5 °C), and 27.0 °C (23.2 to 29.7 °C), respectively. GMST estimates for the EECO are ∼10 to 16 °C warmer than pre-industrial, higher than the estimate given by the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (9 to 14 °C higher than pre-industrial). Leveraging the large "signal" associated with these extreme warm climates, we combine estimates of GMST and CO2 from the latest Paleocene, PETM, and EECO to calculate gross estimates of the average climate sensitivity between the early Paleogene and today. We demonstrate that "bulk" equilibrium climate sensitivity (ECS; 66 % confidence) during the latest Paleocene, PETM, and EECO was 4.5 °C (2.4 to 6.8 °C), 3.6 °C (2.3 to 4.7 °C), and 3.1 °C (1.8 to 4.4 °C) per doubling of CO2, respectively. These values are generally similar to those assessed by the IPCC (1.5 to 4.5 °C per doubling of CO2) but appear incompatible with low ECS values (<1.5 °C per doubling of CO2).
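The "bulk" sensitivity calculation described above can be illustrated with a back-of-the-envelope version; the log2 scaling and all input values below are simplifying assumptions for illustration, not the paper's data or multi-method framework:

```python
import math

def bulk_ecs(delta_gmst, co2_ppm, co2_preindustrial=280.0):
    """Gross equilibrium climate sensitivity in degrees C per CO2 doubling,
    assuming the warming scales with the number of CO2 doublings relative
    to pre-industrial (a simplification of the paper's approach, which
    also accounts for forcing and proxy uncertainties)."""
    doublings = math.log2(co2_ppm / co2_preindustrial)
    return delta_gmst / doublings

# Illustrative only: 12 degrees C above pre-industrial at 2240 ppm CO2
# (three doublings of 280 ppm) implies a bulk sensitivity of 4 per doubling.
print(bulk_ecs(delta_gmst=12.0, co2_ppm=2240.0))  # → 4.0
```

The large early-Paleogene "signal" helps here: with a dozen degrees of warming and multiple CO2 doublings, proxy uncertainties of a degree or two perturb the ratio far less than they would for a small forcing.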
CfA3: 185 Type Ia Supernova Light Curves from the CfA
We present multi-band photometry of 185 type-Ia supernovae (SN Ia), with over
11500 observations. These were acquired between 2001 and 2008 at the F. L.
Whipple Observatory of the Harvard-Smithsonian Center for Astrophysics (CfA).
This sample contains the largest number of homogeneously-observed and reduced
nearby SN Ia (z < 0.08) published to date. It more than doubles the nearby
sample, bringing SN Ia cosmology to the point where systematic uncertainties
dominate. Our natural system photometry has a precision of 0.02 mag or better
in BVRIr'i' and roughly 0.04 mag in U for points brighter than 17.5 mag. We
also estimate a systematic uncertainty of 0.03 mag in our SN Ia standard system
BVRIr'i' photometry and 0.07 mag for U. Comparisons of our standard system
photometry with published SN Ia light curves and comparison stars, where
available for the same SN, reveal agreement at the level of a few hundredths
mag in most cases. We find that 1991bg-like SN Ia are sufficiently distinct
from other SN Ia in their color and light-curve-shape/luminosity relation that
they should be treated separately in light-curve/distance fitter training
samples. The CfA3 sample will contribute to the development of better
light-curve/distance fitters, particularly in the few dozen cases where
near-infrared photometry has been obtained and, together, can help disentangle
host-galaxy reddening from intrinsic supernova color, reducing the systematic
uncertainty in SN Ia distances due to dust.
Comment: Accepted to the Astrophysical Journal. Minor changes from the last version. Light curves, comparison star photometry, and passband tables are available at http://www.cfa.harvard.edu/supernova/CfA3
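The quoted per-point precision and systematic uncertainty can be combined in quadrature to gauge the total per-band magnitude uncertainty; treating the two as independent (and pairing the natural-system precision with the standard-system systematic) is an assumption made here for illustration, not a step taken in the abstract:

```python
import math

def total_mag_uncertainty(stat, sys):
    """Combine independent statistical and systematic magnitude
    uncertainties in quadrature (standard practice; independence assumed)."""
    return math.sqrt(stat**2 + sys**2)

# Values quoted in the abstract: 0.02 mag precision and 0.03 mag systematic
# for BVRIr'i'; 0.04 mag and 0.07 mag for U.
print(round(total_mag_uncertainty(0.02, 0.03), 3))  # → 0.036
print(round(total_mag_uncertainty(0.04, 0.07), 3))  # → 0.081
```

The asymmetry between the U band and the others is why the abstract flags U separately: its combined uncertainty is more than double that of the redder bands.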