30,424 research outputs found
Animal emergence during Snowball Earths by thermosynthesis in submarine hydrothermal vents
Darwin already commented on the lateness in the fossil record of the emergence of the animals, calling it a valid argument against his theory of evolution^1^. This emergence of the animals (metazoans: multicellular animals) has therefore attracted much attention^2-5^. Two decades ago it was reported that extensive global glaciations (Snowball Earths) preceded the emergence^6-7^. Here we causally relate the emergence and the glaciations by invoking benthic sessile^8-11^ thermosynthesizing^12-13^ protists that gained free energy as ATP while oscillating in the thermal gradient between a submarine hydrothermal vent^14^ and the ice-covered ocean. During a global glaciation their size increased from microscopic to macroscopic because a larger span of the thermal gradient conferred a selective advantage. At the glaciation's end the ATP-generating mechanisms reversed and used ATP to sustain movement. Lastly, by functioning as animal organs, these protists then through symbiogenesis^15-17^ brought forth the first animals. This simple and straightforward scenario for the emergence of animals accounts for their large organ and organism size and their use of ATP, embryos and epigenetic control of development. The scenario is extended to a general model for the emergence of biological movement^18^. The presented hypothesis is testable by collecting organisms near today's submarine hydrothermal vents and studying their behaviour in the laboratory in easily constructed thermal gradients.
Self-synchronizing, bi-orthogonal coded PCM telemetry system
A communications and data handling system improves the signal-to-noise ratio when the transmission channel is perturbed by noise. The telemetry system consists of an airborne source, a Gaussian additive noise channel, and a ground receiver unit. Advantages of the system are given.
Classical Optimizers for Noisy Intermediate-Scale Quantum Devices
We present a collection of optimizers tuned for usage on Noisy Intermediate-Scale Quantum (NISQ) devices. Optimizers have a range of applications in quantum computing, including the Variational Quantum Eigensolver (VQE) and Quantum Approximate Optimization (QAOA) algorithms. They are also used for calibration tasks, hyperparameter tuning, machine learning, etc. We analyze the efficiency and effectiveness of different optimizers in a VQE case study. VQE is a hybrid algorithm, with a classical minimizer step driving the next evaluation on the quantum processor. While most results to date have concentrated on tuning the quantum VQE circuit, we show that, in the presence of quantum noise, the classical minimizer step needs to be carefully chosen to obtain correct results. We explore state-of-the-art gradient-free optimizers capable of handling noisy, black-box cost functions and stress-test them using a quantum circuit simulation environment with noise injection capabilities on individual gates. Our results indicate that specifically tuned optimizers are crucial to obtaining valid science results on NISQ hardware, and will likely remain necessary even for future fault-tolerant circuits.
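The kind of gradient-free, noise-tolerant minimizer the abstract describes can be illustrated with SPSA (simultaneous perturbation stochastic approximation), a standard choice for noisy VQE cost functions. This is a minimal sketch, not the authors' implementation; the quadratic `noisy_energy` stands in for a real circuit evaluation, and all gain parameters are illustrative.

```python
import random

def spsa_minimize(cost, x0, steps=200, a=0.2, c=0.1, seed=0):
    """Minimize a noisy black-box cost function with SPSA.

    SPSA estimates the gradient from only two cost evaluations per
    step, regardless of dimension, which keeps the number of noisy,
    expensive evaluations (e.g. quantum circuit runs) low.
    """
    rng = random.Random(seed)
    x = list(x0)
    for k in range(1, steps + 1):
        ak = a / k ** 0.602          # standard SPSA gain decay
        ck = c / k ** 0.101          # perturbation size decay
        delta = [rng.choice((-1.0, 1.0)) for _ in x]
        plus = cost([xi + ck * di for xi, di in zip(x, delta)])
        minus = cost([xi - ck * di for xi, di in zip(x, delta)])
        ghat = [(plus - minus) / (2.0 * ck * di) for di in delta]
        x = [xi - ak * gi for xi, gi in zip(x, ghat)]
    return x

# Stand-in for a noisy VQE energy evaluation: a quadratic bowl with
# minimum at (0.5, 0.5) plus additive Gaussian "shot" noise.
def noisy_energy(theta, _rng=random.Random(1)):
    clean = sum((t - 0.5) ** 2 for t in theta)
    return clean + _rng.gauss(0.0, 0.05)

x_opt = spsa_minimize(noisy_energy, [2.0, -1.5])
```

Because each iteration needs only two cost evaluations regardless of the number of parameters, SPSA tolerates the stochastic readout noise that defeats finite-difference gradients on NISQ hardware.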
A Mathematical Model for Estimating Biological Damage Caused by Radiation
We propose a mathematical model for estimating biological damage caused by low-dose irradiation. We understand that the Linear No-Threshold (LNT) hypothesis is realized only in the case of no recovery effects. To treat realistic living objects, our model takes into account various types of recovery as well as a proliferation mechanism, which may change the resultant damage, especially in the case of lower-dose-rate irradiation. It turns out that the lower the radiation dose rate, the better the chance that the irradiated living system (symbolically called "tissue" hereafter) survives, which can reproduce the so-called dose and dose-rate effectiveness factor (DDREF).
Comment: 22 pages, 6 figures; accepted in Journal of the Physical Society of Japan
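The qualitative DDREF behaviour the abstract describes can be reproduced with a toy two-compartment model: repairable lesions are created in proportion to the dose rate and removed by recovery, while pairs of unrepaired lesions interact to form irreparable damage. This is a minimal sketch under assumed dynamics, not the paper's model; all rate constants are illustrative, not fitted.

```python
def lethal_lesions(total_dose, dose_rate, mu=1.0, eta=0.1, dt=1e-3):
    """Integrate a toy damage model with recovery.

    Repairable lesions D are produced at the dose rate and repaired
    at rate mu; pairs of unrepaired lesions interact to form
    irreparable ("lethal") lesions L at rate eta * D^2.
    """
    T = total_dose / dose_rate       # irradiation time
    t, D, L = 0.0, 0.0, 0.0
    # forward-Euler integration during and well after the exposure,
    # so that repair of the remaining lesions runs to completion
    while t < T + 10.0 / mu:
        r = dose_rate if t < T else 0.0
        dD = r - mu * D - 2.0 * eta * D * D
        dL = eta * D * D
        D += dD * dt
        L += dL * dt
        t += dt
    return L

# Same total dose, delivered acutely vs. chronically: the slower
# delivery lets repair keep D low, so fewer lethal lesions form.
acute = lethal_lesions(total_dose=2.0, dose_rate=10.0)
chronic = lethal_lesions(total_dose=2.0, dose_rate=0.1)
```

Because the irreparable-damage term is quadratic in the standing lesion density, spreading the same dose over a longer time lowers the peak density and hence the final damage, which is the dose-rate effect the DDREF quantifies.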
Analysis of distortion data from TF30-P-3 mixed compression inlet test
A program was conducted to reduce and analyze inlet and engine data obtained during testing of a TF30-P-3 engine operating behind a mixed compression inlet. Previously developed distortion analysis techniques were applied to the data to assist in the development of a new distortion methodology. Instantaneous distortion techniques were refined as part of the distortion methodology development. A technique for estimating maximum levels of instantaneous distortion from steady-state and average turbulence data was also developed as part of the program.
Two-Dimensional Hydrodynamics of Pre-Core Collapse: Oxygen Shell Burning
By direct hydrodynamic simulation, using the Piecewise Parabolic Method (PPM) code PROMETHEUS, we study the properties of a convective oxygen-burning shell in a SN 1987A progenitor star prior to collapse. The convection is too heterogeneous and dynamic to be well approximated by the one-dimensional diffusion-like algorithms which have previously been used for this epoch. Qualitatively new phenomena are seen.
The simulations are two-dimensional, with good resolution in radius and angle, and use a large (90-degree) slice centered at the equator. The microphysics and the initial model were carefully treated. Many of the qualitative features of previous multi-dimensional simulations of convection are seen, including large kinetic and acoustic energy fluxes, which are not accounted for by mixing-length theory. Small but significant amounts of carbon-12 are mixed non-uniformly into the oxygen-burning convection zone, resulting in hot spots of nuclear energy production which are more than an order of magnitude more energetic than the oxygen flame itself. Density perturbations (up to 8%) occur at the `edges' of the convective zone and are the result of gravity waves generated by the interaction of penetrating flows with the stable region. Perturbations of temperature and electron fraction at the base of the convective zone are of sufficient magnitude to create angular inhomogeneities in explosive nucleosynthesis products, and need to be included in quantitative estimates of yields. Combined with the plume-like velocity structure arising from convection, the perturbations will contribute to the mixing of nickel-56 throughout supernova envelopes. Runs of different resolution and angular extent were performed to test the robustness of these results.
Comment: For mpeg movies of these simulations, see http://www.astrophysics.arizona.edu/movies.html. Submitted to the Astrophysical Journal