Abiotic O2 Levels on Planets around F, G, K, and M Stars: Possible False Positives for Life?
In the search for life on Earth-like planets around other stars, the first (and likely only) information will come from the spectroscopic characterization of the planet's atmosphere. Of the countless chemical species terrestrial life produces, only a few have the distinct spectral features and the necessary atmospheric abundance to be detectable. The easiest of these species to observe in Earth's atmosphere is O2 (and its photochemical byproduct, O3). But O2 can also be produced abiotically by photolysis of CO2, followed by recombination of the freed O atoms with each other; CO is produced in stoichiometric proportion. Whether O2 and CO can accumulate to appreciable concentrations depends on the ratio of far-UV to near-UV radiation coming from the planet's parent star and on what happens to these gases when they dissolve in a planet's oceans. Using a one-dimensional photochemical model, we demonstrate that O2 derived from CO2 photolysis should not accumulate to measurable concentrations on planets around F- and G-type stars. Planets around K stars, and especially M stars, however, may build up O2 because of the low near-UV flux from their parent stars, in agreement with some previous studies. On such planets, a 'false positive' for life is possible if recombination of dissolved CO and O2 in the oceans is slow and if other O2 sinks (e.g., reduced volcanic gases or dissolved ferrous iron) are small. O3, on the other hand, could be detectable at UV wavelengths (λ < 300 nm) for a much broader range of boundary conditions and stellar types.
Comment: 20 pages of text, 9 figures
Does Long-Term Macrophyte Management in Lakes Affect Biotic Richness and Diversity?
We hypothesize that the richness and diversity of the biota
in Lake Moraine (42°50’47”N, 75°31’39”W) in New York have
been negatively impacted by 60 years of macrophyte and algae
management to control Eurasian watermilfoil (Myriophyllum
spicatum L.) and associated noxious plants. To test this
hypothesis we compare water quality characteristics, richness
and selected indicators of plant diversity, zooplankton, benthic
macroinvertebrates and fish in Lake Moraine with those in
nearby Hatch Lake (42°50’06”N, 75°40’67”W). The latter is
of similar size and would be expected to have similar biota,
but has not been subjected to management. Measurements of
temperature, pH, oxygen, conductivity, Secchi transparency,
calcium, total phosphorus and nitrites + nitrates are comparable.
Taxa richness and the diversity indices applied to the
aquatic macrophytes are similar in both lakes.
mesas.py v1.0: a flexible Python package for modeling solute transport and transit times using StorAge Selection functions
StorAge Selection (SAS) transport theory has recently emerged as a framework for representing material transport through a control volume. It can be seen as a generalization of transit time theories and lumped-parameter models to allow for arbitrary temporal variability in the rate of material flow in and out of the control volume, and in the transport dynamics. SAS is currently the state-of-the-art approach to interpreting tracer transport. Here, we present mesas.py, a Python package implementing the SAS framework. mesas.py allows SAS functions to be specified using several built-in common distributions, as a piecewise linear cumulative distribution function (CDF), or as a weighted sum of any number of such distributions. The distribution parameters and weights used to combine them can be allowed to vary in time, permitting SAS functions of arbitrary complexity to be specified. mesas.py simulates tracer transport using a novel mass-tracking scheme and can account for first-order reactions and fractionation. We present a number of analytical solutions to the governing equations and use these to validate the code. For a benchmark problem, the time-step-averaging approach of the mesas.py implementation reduces mass balance errors by up to a factor of 15 compared with a previous implementation of SAS.
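The idea of a piecewise-linear SAS CDF can be illustrated with a toy steady-state example. This is a minimal sketch with hypothetical names and values, not the mesas.py API: a uniform selection over the youngest part of storage is composed with the age-ranked storage of a steady system to give a transit-time distribution, which is then convolved with a constant tracer input.

```python
import numpy as np

def piecewise_linear_sas_cdf(st, st_knots, omega_knots):
    """Evaluate a piecewise-linear SAS CDF Omega(S_T) by interpolation.

    st_knots / omega_knots are the CDF breakpoints (hypothetical names;
    a toy illustration, not the mesas.py interface).
    """
    return np.interp(st, st_knots, omega_knots)

# Example: outflow selects uniformly from the youngest 100 mm of storage
st_knots = np.array([0.0, 100.0])
omega_knots = np.array([0.0, 1.0])

# Age-ranked storage for a steady system: S_T = Q * T (Q in mm/day)
Q = 2.0
ages = np.arange(0.0, 200.0, 1.0)        # transit times in days
st = Q * ages

# Composing Omega with S_T(T) gives the transit-time CDF;
# differencing yields the discrete transit-time distribution.
ttd_cdf = piecewise_linear_sas_cdf(st, st_knots, omega_knots)
ttd = np.diff(ttd_cdf, prepend=0.0)

# Steady-state convolution: outflow concentration from the input history
c_in = np.ones_like(ages)                # constant input concentration
c_out = np.sum(ttd * c_in)               # equals 1 since ttd sums to 1
```

In the time-varying case handled by the actual package, both the SAS function parameters and the age-ranked storage evolve each time step, so this stationary convolution is only the simplest special case.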
Resonance fluorescence in ultrafast and intense x-ray free-electron-laser pulses
The spectrum of resonance fluorescence is calculated for a two-level system excited by an intense, ultrashort x-ray pulse made available for instance by free-electron lasers such as the Linac Coherent Light Source. We allow for inner-shell hole decay widths and destruction of the system by further photoionization. This two-level description is employed to model neon cations strongly driven by x rays tuned to the 1s 2p^-1 → 1s^-1 2p transition at 848 eV; the x rays induce Rabi oscillations which are so fast that they compete with Ne 1s-hole decay. We predict resonance fluorescence spectra for two different scenarios: first, chaotic pulses based on the self-amplified spontaneous emission principle, like those presently generated at x-ray free-electron-laser facilities and, second, Gaussian pulses which will become available in the foreseeable future with self-seeding techniques. As an example of the exciting opportunities derived from the use of seeding methods, we predict, in spite of the above obstacles, the possibility of distinguishing at x-ray frequencies a clear signature of Rabi flopping in the spectrum of resonance fluorescence.
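The competition between Rabi flopping and hole decay can be sketched with the resonant two-level amplitude equations, where the excited-state amplitude decays at rate Γ/2. This is a generic textbook illustration with hypothetical parameter values, not the paper's model (which also includes photoionization and pulse statistics):

```python
import numpy as np

# Damped Rabi oscillations of a resonantly driven two-level system whose
# excited state decays (e.g., through an inner-shell hole). Omega and Gamma
# are hypothetical illustrative values in arbitrary units.
Omega = 2.0 * np.pi      # Rabi frequency (rad per unit time)
Gamma = 0.5              # excited-state decay rate

def rhs(c):
    """Right-hand side of the amplitude equations (rotating-wave, resonant)."""
    cg, ce = c
    dcg = -0.5j * Omega * ce
    dce = -0.5j * Omega * cg - 0.5 * Gamma * ce
    return np.array([dcg, dce])

# Fourth-order Runge-Kutta integration
dt, nsteps = 1e-3, 5000
c = np.array([1.0 + 0j, 0.0 + 0j])   # start in the ground state
pop_e = []
for _ in range(nsteps):
    k1 = rhs(c)
    k2 = rhs(c + 0.5 * dt * k1)
    k3 = rhs(c + 0.5 * dt * k2)
    k4 = rhs(c + dt * k3)
    c = c + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    pop_e.append(abs(c[1]) ** 2)

pop_e = np.array(pop_e)   # excited-state population: oscillates and decays
```

When Γ is comparable to Ω, the oscillation amplitude decays within a few Rabi periods, which is the regime the abstract describes for Ne 1s-hole decay.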
Improved Thermoelectric Cooling Based on the Thomson Effect
Traditional thermoelectric Peltier coolers exhibit a cooling limit which is
primarily determined by the figure of merit, zT. Rather than a fundamental
thermodynamic limit, this bound can be traced to the difficulty of maintaining
thermoelectric compatibility. Self-compatibility locally maximizes the cooler's
coefficient of performance for a given zT and can be achieved by adjusting the
relative ratio of the thermoelectric transport properties that make up zT. In
this study, we investigate the theoretical performance of thermoelectric
coolers that maintain self-compatibility across the device. We find such a
device behaves very differently from a Peltier cooler, and term self-compatible
coolers "Thomson coolers" when the Fourier heat divergence is dominated by the
Thomson, as opposed to the Joule, term. A Thomson cooler requires an
exponentially rising Seebeck coefficient with increasing temperature, while
traditional Peltier coolers, such as those used commercially, have
comparatively minimal change in Seebeck coefficient with temperature. When
reasonable material property bounds are placed on the thermoelectric leg, the
Thomson cooler is predicted to achieve approximately twice the maximum
temperature drop of a traditional Peltier cooler with equivalent figure of
merit (zT). We anticipate the development of Thomson coolers will ultimately
lead to solid-state cooling to cryogenic temperatures.
Comment: The manuscript has been revised for publication in PR
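The distinction between the two regimes can be illustrated with the textbook thermoelectric relations Π = S·T (Peltier) and τ = T·dS/dT (Thomson). This is a generic numerical sketch with hypothetical values, not the paper's optimization: for a nearly constant Seebeck coefficient the Thomson coefficient vanishes, while for an exponentially rising S(T) it grows with temperature.

```python
import numpy as np

# Textbook thermoelectric coefficients, not the paper's model:
#   Peltier:  Pi  = S * T
#   Thomson:  tau = T * dS/dT
# S0, T0, and the temperature range are hypothetical illustrative values.
S0, T0 = 50e-6, 150.0                    # V/K and K
T = np.linspace(200.0, 400.0, 401)       # temperature grid (K)

S_peltier = np.full_like(T, 200e-6)      # near-constant Seebeck coefficient
S_thomson = S0 * np.exp(T / T0)          # exponentially rising Seebeck

# Thomson coefficient tau = T * dS/dT via a numerical derivative
tau_peltier = T * np.gradient(S_peltier, T)
tau_thomson = T * np.gradient(S_thomson, T)

# For the exponential profile tau = T*S/T0 grows with temperature, so the
# Thomson term dominates the heat divergence; for constant S it vanishes.
```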
Evaluating implicit feedback models using searcher simulations
In this article we describe an evaluation of relevance feedback (RF) algorithms using searcher simulations. Since these algorithms select additional terms for query modification based on inferences made from searcher interaction, not on relevance information searchers explicitly provide (as in traditional RF), we refer to them as implicit feedback models. We introduce six different models that base their decisions on the interactions of searchers and use different approaches to rank query modification terms. The aim of this article is to determine which of these models should be used to assist searchers in the systems we develop. To evaluate these models we used searcher simulations that afforded us more control over the experimental conditions than experiments with human subjects and allowed complex interaction to be modeled without the need for costly human experimentation. The simulation-based evaluation methodology measures how well the models learn the distribution of terms across relevant documents (i.e., learn what information is relevant) and how well they improve search effectiveness (i.e., create effective search queries). Our findings show that an implicit feedback model based on Jeffrey's rule of conditioning outperformed the other models under investigation.
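Jeffrey's rule of conditioning, which underlies the best-performing model, revises belief in a hypothesis when the probabilities over an evidence partition shift without the evidence becoming certain: P_new(A) = Σᵢ P(A | Eᵢ)·P_new(Eᵢ). A minimal sketch with hypothetical numbers (not the article's actual term-ranking model):

```python
# Jeffrey's rule of conditioning: revise P(A) when the probabilities of an
# evidence partition {E_i} shift to new values. All numbers are hypothetical.
def jeffrey_update(p_a_given_e, p_e_new):
    """P_new(A) = sum_i P(A | E_i) * P_new(E_i)."""
    assert abs(sum(p_e_new) - 1.0) < 1e-9, "evidence probabilities must sum to 1"
    return sum(pa * pe for pa, pe in zip(p_a_given_e, p_e_new))

# Illustrative reading for implicit feedback: A = "term is a good query
# modification term", partition = {document clicked, document skipped}.
p_new = jeffrey_update([0.8, 0.1], [0.6, 0.4])   # interaction shifts P(clicked) to 0.6
# p_new = 0.8*0.6 + 0.1*0.4 = 0.52
```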
Parametrizing horizontally-averaged wind and temperature profiles in the urban roughness sublayer
Tower-based measurements from within and above the urban canopy in two cities are used to evaluate several existing approaches that parametrize the vertical profiles of wind speed and temperature within the urban roughness sublayer (RSL). It is shown that current use of Monin–Obukhov similarity theory (MOST) in numerical weather prediction models can be improved upon by applying RSL corrections when modelling the vertical profiles of wind speed and friction velocity in the urban RSL. Using anisotropic building morphological information improves the agreement between observed and parametrized profiles of wind speed and momentum fluxes for selected methods. The largest improvement is found when using dynamically-varying aerodynamic roughness length and displacement height. Adding a RSL correction to MOST, however, does not improve the parametrization of the vertical profiles of temperature and heat fluxes. This is expected since sources and sinks of heat are assumed uniformly distributed through a simple flux boundary condition in all RSL formulations, yet are highly patchy and anisotropic in a real urban context. Our results can be used to inform the choice of surface-layer representations for air quality, dispersion, and numerical weather prediction applications in the urban environment.
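The baseline being corrected is the MOST surface-layer profile, which under neutral stability reduces to the displaced log law u(z) = (u*/κ)·ln((z − d)/z0). A minimal sketch of that textbook form, with hypothetical values for the friction velocity, roughness length, and displacement height (this is the uncorrected baseline, not the RSL parametrizations evaluated in the study):

```python
import numpy as np

# Neutral-stability MOST wind profile: u(z) = (u*/kappa) * ln((z - d) / z0).
# u_star, z0, and d below are hypothetical illustrative values.
KAPPA = 0.4          # von Karman constant

def most_wind_profile(z, u_star, z0, d):
    """Mean wind speed at heights z (all z must exceed d + z0)."""
    z = np.asarray(z, dtype=float)
    return (u_star / KAPPA) * np.log((z - d) / z0)

# Urban-ish values: displacement height is often ~0.7 * mean building height
u_star, z0, d = 0.5, 1.0, 14.0            # m/s, m, m
z = np.array([20.0, 30.0, 50.0, 100.0])   # heights above ground (m)
u = most_wind_profile(z, u_star, z0, d)   # monotonically increasing with z
```

RSL corrections modify this profile below roughly two to five building heights, where the log law alone is known to misrepresent the flow.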
Probabilistic models of information retrieval based on measuring the divergence from randomness
We introduce and create a framework for deriving probabilistic models of Information Retrieval. The models are nonparametric models of IR obtained in the language model approach. We derive term-weighting models by measuring the divergence of the actual term distribution from that obtained under a random process. Among the random processes we study the binomial distribution and Bose–Einstein statistics. We define two types of term frequency normalization for tuning term weights in the document–query matching process. The first normalization assumes that documents have the same length and measures the information gain with the observed term once it has been accepted as a good descriptor of the observed document. The second normalization is related to the document length and to other statistics. These two normalization methods are applied to the basic models in succession to obtain weighting formulae. Results show that our framework produces different nonparametric models forming baseline alternatives to the standard tf-idf model.
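The core divergence-from-randomness idea can be sketched with a Poisson basic model (a standard approximation to the binomial) and a Laplace-style first normalization: the weight is the information content −log₂ P(tf | random), scaled by 1/(tf + 1). This is a generic illustration with hypothetical counts, not the paper's full derivation, which also applies a document-length normalization:

```python
import math

# DFR-style term weight with a Poisson basic model:
#   Inf1(tf) = -log2 P(tf | Poisson(lambda)),  lambda = F / N
#   weight   = Inf1(tf) / (tf + 1)             (Laplace first normalization)
# The counts below are hypothetical illustrative values.
def dfr_poisson_weight(tf, term_freq_collection, n_docs):
    lam = term_freq_collection / n_docs              # expected tf under randomness
    # log2 of the Poisson pmf: e^{-lam} lam^tf / tf!  (lgamma(tf+1) = ln tf!)
    log2_p = (-lam + tf * math.log(lam) - math.lgamma(tf + 1)) / math.log(2)
    inf1 = -log2_p                                    # information content (bits)
    return inf1 / (tf + 1)

# A rare term occurring 5 times in a document is far more informative
# than a common term with the same within-document frequency.
w_rare = dfr_poisson_weight(tf=5, term_freq_collection=100, n_docs=100_000)
w_common = dfr_poisson_weight(tf=5, term_freq_collection=50_000, n_docs=100_000)
```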