82 research outputs found
Neutral theory of chemical reaction networks
To what extent do the characteristic features of a chemical reaction network
reflect its purpose and function? In general, one argues that correlations
between specific features and specific functions are key to understanding a
complex structure. However, specific features may sometimes be neutral and
uncorrelated with any system-specific purpose, function or causal chain. Such
neutral features are caused by chance and randomness. Here we compare two
classes of chemical networks: one that has been subjected to biological
evolution (the chemical reaction network of metabolism in living cells) and one
that has not (the planetary atmospheric chemical reaction networks). Their
degree distributions are shown to share the very same neutral
system-independent features. The shape of the broad distributions is to a large
extent controlled by a single parameter, the network size. From this
perspective, there is little difference between atmospheric and metabolic
networks; they are just differently sized instances of the same random
assembly network.
In other words, the shape of the degree distribution is a neutral
characteristic feature and has no functional or evolutionary implications in
itself; it is not a matter of life and death. Comment: 13 pages, 8 figures
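Since the abstract attributes the distribution's shape to network size alone, a minimal sketch can make the idea concrete. The assembly rule below (the hypothetical random_reaction_network helper, its two-species linkage, and the 3:1 reaction-to-species ratio) is an illustrative assumption, not the paper's actual model:

```python
# Toy sketch: assemble random reaction networks of different sizes and
# compare the heads of their species degree distributions. All modeling
# choices here are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def random_reaction_network(n_species, n_reactions, k=2):
    """Each reaction links k randomly chosen species; return species degrees."""
    degree = np.zeros(n_species, dtype=int)
    for _ in range(n_reactions):
        degree[rng.choice(n_species, size=k, replace=False)] += 1
    return degree

for n in (50, 200, 1000):                # vary only the network size
    deg = random_reaction_network(n_species=n, n_reactions=3 * n)
    print(n, np.bincount(deg)[:8])       # head of the degree distribution
```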
Species Abundance Patterns in Complex Evolutionary Dynamics
An analytic theory of species abundance patterns (SAPs) in biological
networks is presented. The theory is based on multispecies replicator dynamics
equivalent to the Lotka-Volterra equation, with diverse interspecies
interactions. Various SAPs observed in nature are derived from a single
parameter. The abundance distribution takes the form of the widely observed
left-skewed lognormal distribution. As the model has a general form, the result
can be applied to similar patterns in other complex biological networks, e.g.,
gene expression. Comment: 4 pages, 3 figures. Physical Review Letters, in press
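As a rough numerical companion to the abstract, here is a toy sketch of generalized Lotka-Volterra dynamics with random, diverse interspecies interactions, after which the skew of the log-abundances can be inspected. Every parameter value (S, mu, sigma, the Euler step) is an assumption for illustration, not the paper's setting:

```python
# Toy sketch: generalized Lotka-Volterra dynamics dx/dt = x * (r + A x)
# with random interspecies interactions; stationary abundances are then
# inspected for lognormal-like skew. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
S = 200                               # number of species (assumed)
mu, sigma = -0.5, 0.2                 # interaction mean/spread (assumed)
A = mu / S + sigma / np.sqrt(S) * rng.standard_normal((S, S))
np.fill_diagonal(A, -1.0)             # self-limitation
r = np.ones(S)                        # intrinsic growth rates

x = rng.random(S)
dt = 0.01
for _ in range(50_000):               # crude Euler integration
    x += dt * x * (r + A @ x)
    x = np.clip(x, 1e-12, None)       # keep abundances positive

dev = np.log(x) - np.log(x).mean()
print("skewness of log-abundance:", (dev**3).mean() / (dev**2).mean()**1.5)
```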
Electromagnetic Cascades and Cascade Nucleosynthesis in the Early Universe
We describe a calculation of electromagnetic cascading in radiation and
matter in the early universe initiated by the decay of massive particles or by
some other process. We have used a combination of Monte Carlo and numerical
techniques which enables us to use exact cross sections, where known, for all
the relevant processes. In cascades initiated after the epoch of big bang
nucleosynthesis, γ-rays in the cascades will photodisintegrate ⁴He,
producing ³He and deuterium. Using the observed ³He and deuterium
abundances we are able to place constraints on the cascade energy deposition as
a function of cosmic time. In the case of the decay of massive primordial
particles, we place limits on the density of massive primordial particles as a
function of their mean decay time, and on the expected intensity of decay
neutrinos. Comment: compressed and uuencoded postscript. We now include a comparison with
previous work on the photon spectrum in the cascade and the limits we
calculate for the density of massive particles. The method of calculation of
photon spectra at low energies has been improved. Most figures are revised.
Our conclusions are substantially unchanged
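To give a flavor of the Monte Carlo side of such a calculation, the deliberately schematic sketch below replaces the exact pair-production and inverse-Compton cross sections with random energy splitting, and simply counts particles landing in a rough ⁴He photodisintegration window; the window limits, cutoff, and injection energy are arbitrary illustrative choices:

```python
# Very schematic cascade Monte Carlo (illustrative only; the paper uses
# exact cross sections, which are not reproduced here). Each particle's
# energy is split at a random fraction until it drops below e_min; we count
# particles passing through a crude 4He photodisintegration window
# (the 4He(gamma,p) threshold is near 19.8 MeV).
import random

def cascade(e0_mev, e_min=1.0):
    queue, in_window = [e0_mev], 0
    while queue:
        e = queue.pop()
        if e < e_min:
            continue
        if 19.8 <= e <= 100.0:              # crude window (assumed limits)
            in_window += 1
        f = random.random()
        queue.extend([f * e, (1 - f) * e])  # schematic splitting step
    return in_window

random.seed(2)
print(cascade(1e4))   # particles in the window from a 10 GeV injection
```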
Electron-based crystalline undulator
We discuss the features of a crystalline undulator of a novel type, based on
the planar channeling of ultra-relativistic electrons in periodically bent
crystals. It is demonstrated that an electron-based undulator is feasible for
beam energies in the tens-of-GeV range, which is noticeably higher than the
energy interval allowed in a positron-based undulator. Numerical analysis of
the main parameters of the undulator, as well as of the characteristics of the
emitted undulator radiation, is carried out for 20 and 50 GeV electrons
channeling in diamond and silicon crystals along the (111) crystallographic
planes. Comment: 16 pages, 8 figures, LaTeX, IOP style
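For orientation, the on-axis fundamental photon energy of a planar undulator follows the standard relation E1 = 2γ²hc / [λu (1 + K²/2)]; the sketch below evaluates it at the abstract's beam energies with an assumed period λu = 100 μm and undulator parameter K = 2, values chosen only for illustration and not taken from the paper:

```python
# Back-of-envelope check with the standard undulator formula (not the
# paper's numerical analysis): on-axis fundamental photon energy.
H_C = 1.2398e-6        # h*c in eV*m
M_E = 0.511e6          # electron rest energy in eV

def fundamental_energy_ev(e_beam_ev, lambda_u_m, K):
    gamma = e_beam_ev / M_E
    return 2 * gamma**2 * H_C / (lambda_u_m * (1 + K**2 / 2))

for e_gev in (20, 50):                       # beam energies from the abstract
    e1 = fundamental_energy_ev(e_gev * 1e9, lambda_u_m=100e-6, K=2.0)
    print(f"{e_gev} GeV beam -> E1 ~ {e1 / 1e6:.1f} MeV")  # assumed lambda_u, K
```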
A robust measure of correlation between two genes on a microarray
BACKGROUND: The underlying goal of microarray experiments is to identify gene expression patterns across different experimental conditions. Genes that are contained in a particular pathway or that respond similarly to experimental conditions could be co-expressed and show similar patterns of expression on a microarray. Using any of a variety of clustering methods or gene network analyses, we can partition genes of interest into groups, clusters, or modules based on measures of similarity. Typically, Pearson correlation is used to measure distance (or similarity) before implementing a clustering algorithm. Pearson correlation, however, is quite susceptible to outliers, an unfortunate characteristic when dealing with microarray data, which are well known to be quite noisy. RESULTS: We propose a resistant similarity metric based on Tukey's biweight estimate of multivariate scale and location. The resistant metric is simply the correlation obtained from a resistant covariance matrix of scale. We give results demonstrating that our correlation metric is much more resistant than the Pearson correlation while being more efficient than other nonparametric measures of correlation (e.g., Spearman correlation). Additionally, our method gives a systematic gene-flagging procedure, which is useful when dealing with large amounts of noisy data. CONCLUSION: When dealing with microarray data, which are known to be quite noisy, robust methods should be used. Specifically, robust distances, including the biweight correlation, should be used in clustering and gene network analysis.
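The paper's metric is derived from a multivariate Tukey biweight estimate of scale and location, which is not reproduced here; as a simpler illustration of the same outlier-downweighting idea, the sketch below implements the univariate biweight midcorrelation (a related but distinct resistant statistic) and shows a single gross outlier degrading Pearson correlation but not the resistant one:

```python
# Sketch of a resistant correlation in the same spirit as the paper's
# metric: the univariate biweight midcorrelation. Points far from the
# median (|u| >= 1) get zero weight, so single outliers cannot dominate.
import numpy as np

def bicor(x, y, c=9.0):
    def weighted_dev(v):
        med = np.median(v)
        mad = np.median(np.abs(v - med))          # robust scale
        u = (v - med) / (c * mad + 1e-12)
        w = (1 - u**2) ** 2 * (np.abs(u) < 1)     # Tukey biweight
        return (v - med) * w
    a = weighted_dev(np.asarray(x, float))
    b = weighted_dev(np.asarray(y, float))
    return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

rng = np.random.default_rng(3)
x = rng.normal(size=50)
y = x + 0.1 * rng.normal(size=50)
y[0] = 100.0                                      # one gross outlier
print(np.corrcoef(x, y)[0, 1], bicor(x, y))       # Pearson degrades; bicor holds
```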
Ranking differentially expressed genes from Affymetrix gene expression data: methods with reproducibility, sensitivity, and specificity
BACKGROUND: To identify differentially expressed genes (DEGs) from microarray data, users of the Affymetrix GeneChip system need to select both a preprocessing algorithm to obtain expression-level measurements and a way of ranking genes to obtain the most plausible candidates. We recently recommended suitable combinations of a preprocessing algorithm and gene ranking method that can be used to identify DEGs with a higher level of sensitivity and specificity. However, in addition to these recommendations, researchers also want to know which combinations enhance reproducibility. RESULTS: We compared eight conventional methods for ranking genes: weighted average difference (WAD), average difference (AD), fold change (FC), rank products (RP), moderated t statistic (modT), significance analysis of microarrays (samT), shrinkage t statistic (shrinkT), and intensity-based moderated t statistic (ibmT), with six preprocessing algorithms (PLIER, VSN, FARMS, multi-mgMOS (mmgMOS), MBEI, and GCRMA). A total of 36 real experimental datasets were evaluated on the basis of the area under the receiver operating characteristic curve (AUC) as a measure of both sensitivity and specificity. We found that the RP method performed well for VSN-, FARMS-, MBEI-, and GCRMA-preprocessed data, and the WAD method performed well for mmgMOS-preprocessed data. Our analysis of the MicroArray Quality Control (MAQC) project's datasets showed that the FC-based gene ranking methods (WAD, AD, FC, and RP) had a higher level of reproducibility: the percentages of overlapping genes (POGs) across different sites were higher overall for the FC-based methods than for the t-statistic-based methods (modT, samT, shrinkT, and ibmT). In particular, POG values for WAD were the highest overall among the FC-based methods, irrespective of the choice of preprocessing algorithm. CONCLUSION: Our results demonstrate that to increase sensitivity, specificity, and reproducibility in microarray analyses, we need to select suitable combinations of preprocessing algorithms and gene ranking methods. We recommend the use of FC-based methods, in particular RP or WAD.
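As an illustration of the recommended WAD statistic (the average log2 difference between groups, weighted by relative average signal intensity so that strongly expressed genes rank higher), here is a sketch on a hypothetical toy expression matrix; the data and group labels are invented for illustration:

```python
# Sketch of the WAD (weighted average difference) gene-ranking statistic.
# Assumes `expr` holds log2-scale expression values, genes x samples.
import numpy as np

def wad(expr, treated):                # treated: boolean mask over samples
    expr = np.asarray(expr, float)
    ad = expr[:, treated].mean(axis=1) - expr[:, ~treated].mean(axis=1)
    avg = expr.mean(axis=1)            # average log2 signal per gene
    w = (avg - avg.min()) / (avg.max() - avg.min())  # relative intensity weight
    return ad * w                      # WAD = AD * weight

rng = np.random.default_rng(5)
expr = rng.normal(8.0, 2.0, size=(1000, 6))       # toy log2 expression matrix
expr[:50, 3:] += 2.0                              # 50 spiked-in DEGs
scores = wad(expr, np.array([False] * 3 + [True] * 3))
print(np.argsort(-np.abs(scores))[:10])           # top-ranked candidates
```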
Study of photo-proton reactions driven by bremsstrahlung radiation of high-intensity laser generated electrons
Photo-nuclear reactions were investigated using a high-power table-top laser. The laser system at the University of Jena (I ~ 3–5 × 10¹⁹ W cm⁻²) produced hard bremsstrahlung photons (kT ~ 2.9 MeV) via a laser–gas interaction, which served to induce (γ,p) and (γ,n) reactions in Mg, Ti, Zn and Mo isotopes. Several (γ,p) decay channels were identified using nuclear activation analysis to determine their integral reaction yields
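The activation-analysis step typically inverts the standard decay-counting relation C = ε·Iγ·N₀·(e^(−λt₁) − e^(−λt₂)) to recover the number of reaction products N₀ from a gamma-line count measured between times t₁ and t₂ after the shot; the sketch below uses that generic textbook relation with hypothetical numbers, not values from this experiment:

```python
# Generic activation-analysis inversion (illustrative numbers, not the
# paper's): recover the number of reaction products N0 from net gamma-line
# counts C recorded between t1 and t2 after irradiation.
import math

def reaction_yield(counts, eff, i_gamma, half_life_s, t1_s, t2_s):
    lam = math.log(2) / half_life_s                   # decay constant
    return counts / (eff * i_gamma *
                     (math.exp(-lam * t1_s) - math.exp(-lam * t2_s)))

# Hypothetical example: 500 net counts, 2% detection efficiency, 80% gamma
# emission probability, 10 min half-life, counting from 60 s to 660 s.
print(f"N0 ~ {reaction_yield(500, 0.02, 0.8, 600, 60, 660):.3e}")
```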
Real-time phase-contrast x-ray imaging: a new technique for the study of animal form and function
BACKGROUND: Despite advances in imaging techniques, real-time visualization of the structure and dynamics of tissues and organs inside small living animals has remained elusive. Recently, we have been using synchrotron x-rays to visualize the internal anatomy of millimeter-sized opaque, living animals. This technique takes advantage of partially coherent x-rays and diffraction to enable clear visualization of internal soft tissue not viewable via conventional absorption radiography. However, because higher quality images require greater x-ray fluxes, there exists an inherent tradeoff between image quality and tissue damage. RESULTS: We evaluated the tradeoff between image quality and harm to the animal by determining the impact of targeted synchrotron x-rays on insect physiology, behavior, and survival. Using 25 keV x-rays at a flux density of 80 μW/mm², high quality video-rate images can be obtained without major detrimental effects on the insects for multiple minutes, a duration sufficient for many physiological studies. At this setting, insects do not heat up. Additionally, we demonstrate the range of uses of synchrotron phase-contrast imaging by showing high-resolution images of internal anatomy and observations of labeled food movement during ingestion and digestion. CONCLUSION: Synchrotron x-ray phase-contrast imaging has the potential to revolutionize the study of physiology and internal biomechanics in small animals. This is the only generally applicable technique that has the necessary spatial and temporal resolutions, penetrating power, and sensitivity to soft tissue required to visualize the internal physiology of living animals on the scale from millimeters to microns
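The quality-versus-damage tradeoff can be made concrete with back-of-envelope arithmetic on the stated flux density; the sketch below only computes the energy delivered per unit area over illustrative exposure times (absorbed dose would additionally require the fraction of 25 keV photons absorbed in the tissue, which the abstract does not give):

```python
# Energy delivered per unit area at the abstract's flux density of
# 80 uW/mm^2; exposure times are illustrative choices.
FLUX_W_PER_MM2 = 80e-6

for t_s in (10, 60, 300):
    print(f"{t_s:4d} s -> {FLUX_W_PER_MM2 * t_s * 1e3:.1f} mJ/mm^2 delivered")
```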