Frequent lineage-specific substitution rate changes support an episodic model for protein evolution
Since the inception of the molecular clock model for sequence evolution, the investigation of protein divergence has revolved around the question of a more or less constant change of amino acid sequences, with specific overall rates for each family. Although anomalies in clock-like divergence are well known, the assumption of a constant decay rate for a given protein family is usually taken as the null model for protein evolution. However, systematic tests of this null model at a genome-wide scale have lagged behind, despite the databases’ enormous growth. We focus here on divergence rate comparisons between very closely related lineages since this allows clear orthology assignments by synteny and reliable alignments, which are crucial for determining substitution rate changes. We generated a high-confidence dataset of syntenic orthologs from four ape species, including humans. We find that despite the appearance of an overall clock-like substitution pattern, several hundred protein families show lineage-specific acceleration and deceleration in divergence rates, or combinations of both in different lineages. Hence, our analysis uncovers a rather dynamic history of substitution rate changes, even between these closely related lineages, implying that one should expect that a large fraction of proteins will have had a history of episodic rate changes in deeper phylogenies. Furthermore, each of the lineages has a separate set of particularly fast diverging proteins. The genes with the highest percentage of branch-specific substitutions are ADCYAP1 in the human lineage (9.7%), CALU in chimpanzees (7.1%), SLC39A14 in the internal branch leading to humans and chimpanzees (4.1%), RNF128 in gorillas (9%), and S100Z in gibbons (15.2%). The mutational pattern in ADCYAP1 suggests a biased mutation process, possibly through asymmetric gene conversion effects. 
We conclude that a null model of constant change can be problematic for predicting the evolutionary trajectories of individual proteins
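The branch-specific substitution percentages above invite a simple back-of-the-envelope check. The sketch below is my illustration, not the authors' pipeline; the counts and relative branch lengths are invented. It scores each lineage's substitution count against a strict-clock expectation using a normal approximation to the binomial:

```python
# Hedged sketch (not the study's method): under a strict molecular clock,
# substitutions on each branch should be proportional to that branch's
# time span. A large |z| flags a lineage-specific rate change.
from math import sqrt

def clock_deviation_z(subs_per_branch, branch_times):
    """Return a z-score per branch comparing the observed substitution
    count with its clock-like expectation (normal approximation to the
    binomial distribution of substitutions across branches)."""
    total_subs = sum(subs_per_branch.values())
    total_time = sum(branch_times.values())
    zscores = {}
    for branch, k in subs_per_branch.items():
        p = branch_times[branch] / total_time   # expected share of substitutions
        mean = total_subs * p
        sd = sqrt(total_subs * p * (1 - p))
        zscores[branch] = (k - mean) / sd
    return zscores

# Illustrative counts for one protein family across four lineages
# (values invented for demonstration, not taken from the paper)
subs = {"human": 30, "chimp": 9, "gorilla": 11, "gibbon": 22}
times = {"human": 1.0, "chimp": 1.0, "gorilla": 1.6, "gibbon": 3.2}
z = clock_deviation_z(subs, times)
```

A |z| well beyond 2 would flag the kind of lineage-specific acceleration or deceleration the study reports, while values near 0 are consistent with clock-like divergence.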
Selective sweeps versus introgression - population genetic dynamics of the murine leukemia virus receptor Xpr1 in wild populations of the house mouse (Mus musculus)
Background: The interaction between viruses and their receptors in the host can be expected to lead to an evolutionary arms race resulting in cycles of rapid adaptations. We focus here on the receptor gene Xpr1 (xenotropic and polytropic retrovirus receptor 1) for murine leukemia viruses (MLVs). In a previous screen for selective sweeps in mouse populations we discovered that a population from Germany was almost monomorphic for Xpr1 haplotypes, while a population from France was polymorphic. Results: Here we analyze Xpr1 sequences and haplotypes from a broad sample of wild mouse populations of two subspecies, M. m. domesticus and M. m. musculus, to trace the origins of this distinctive polymorphism pattern. We show that the high polymorphism in the population in France is caused by a relatively recent invasion of a haplotype from a population in Iran, rather than a selective sweep in Germany. The invading haplotype codes for a novel receptor variant, which has itself undergone a recent selective sweep in the Iranian population. Conclusions: Our data support a scenario in which Xpr1 is frequently subject to positive selection, possibly as a response to resistance development against recurrently emerging infectious viruses. During such an infection cycle, receptor variants that may convey viral resistance can be captured from another population and quickly introgress into populations actively dealing with the infectious virus. © 2015 Hasenkamp et al
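As a hedged illustration of why a near-monomorphic population signals a sweep (the haplotype labels and sample sizes below are invented, not the paper's data), haplotype homozygosity H = Σ p_i² separates a swept sample from a polymorphic one cleanly:

```python
# Illustrative sketch (not the authors' analysis): a selective sweep leaves a
# population nearly monomorphic for one haplotype, which shows up as high
# haplotype homozygosity H = sum of squared haplotype frequencies.
from collections import Counter

def haplotype_homozygosity(haplotypes):
    """Probability that two haplotypes drawn at random are identical."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    return sum((c / n) ** 2 for c in counts.values())

# Invented samples mimicking the two patterns described in the abstract:
swept = ["H1"] * 19 + ["H2"]                         # near-monomorphic
polymorphic = ["H1"] * 7 + ["H2"] * 7 + ["H3"] * 6   # several haplotypes

h_swept = haplotype_homozygosity(swept)
h_poly = haplotype_homozygosity(polymorphic)
```

In this toy example h_swept ≈ 0.91 versus h_poly ≈ 0.34; real sweep scans add haplotype-length and outgroup information, but the core contrast is this one.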
Surfactant-Mediated Epitaxial Growth of Single-Layer Graphene in an Unconventional Orientation on SiC
We report the use of a surfactant molecule during the epitaxy of graphene on
SiC(0001) that leads to growth in an unconventional orientation, namely a
rotation with respect to the SiC lattice. It yields a very
high-quality single-layer graphene with a uniform orientation with respect to
the substrate, on the wafer scale. We find an increased quality and homogeneity
compared to the approach based on the use of a pre-oriented template to induce
the unconventional orientation. Using spot profile analysis low energy electron
diffraction, angle-resolved photoelectron spectroscopy, and the normal
incidence x-ray standing wave technique, we assess the crystalline quality and
coverage of the graphene layer. Combined with the presence of a
covalently-bound graphene layer in the conventional orientation underneath, our
surfactant-mediated growth offers an ideal platform to prepare epitaxial
twisted bilayer graphene via intercalation.
A de novo evolved gene in the house mouse regulates female pregnancy cycles
The de novo emergence of new genes has been well documented through genomic analyses. However, a functional analysis, especially of very young protein-coding genes, is still largely lacking. Here, we identify a set of house mouse-specific protein-coding genes and assess their translation by ribosome profiling and mass spectrometry data. We functionally analyze one of them, Gm13030, which is specifically expressed in females in the oviduct. The interruption of the reading frame affects the transcriptional network in the oviducts at a specific stage of the estrous cycle. This includes the upregulation of Dcpp genes, which are known to stimulate the growth of preimplantation embryos. As a consequence, knockout females have their second litters after shorter times and have a higher infanticide rate. Given that Gm13030 shows no signs of positive selection, our findings support the hypothesis that a de novo evolved gene can directly adopt a function without much sequence adaptation
Testing Hardy's nonlocality proof with genuine energy-time entanglement
We show two experimental realizations of Hardy's ladder test of quantum
nonlocality using energy-time correlated photons, following the scheme proposed
by A. Cabello \emph{et al.} [Phys. Rev. Lett. \textbf{102}, 040401 (2009)].
Unlike previous energy-time Bell experiments, these tests require precisely
tailored nonmaximally entangled states. One of them is equivalent to the
two-setting two-outcome Bell test requiring a minimum detection efficiency. The
reported experiments are still affected by the locality and detection
loopholes, but are free of the post-selection loophole of previous energy-time
and time-bin Bell tests.
Dedicated transcriptomics combined with power analysis lead to functional understanding of genes with weak phenotypic changes in knockout lines
Author summary: Knockout mice are a key tool for understanding gene functions in mammals. However, for many genes it has proven difficult to identify clear phenotypes, likely due to a lack of sufficient assays. As Lewis Wolpert famously put it, "But did you take them to the opera?", metaphorically alluding to the need to extend phenotyping efforts. This insight led to the establishment of phenotyping pipelines that are nowadays routinely used to characterize knockout lines. However, transcriptomic approaches based on RNA-Seq have been much less explored for such deep-level studies. We conducted both a theoretical power analysis and practical RNA-Seq experiments on two knockout lines with small phenotypic effects, investigating parameters including sample size, sequencing depth, fold change, and dispersion. Our dedicated RNA-Seq studies discovered thousands of genes with small transcriptional changes, enriched in specific functions in both knockout lines. We find that it is more important to increase the number of samples than to increase the sequencing depth. Our work shows that a deep RNA-Seq study on knockouts is powerful for understanding gene functions in cases of weak phenotypic effects, and provides a guideline for the experimental design of such studies
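The samples-versus-depth conclusion has a compact intuition that can be sketched numerically. In the standard negative binomial model for RNA-Seq counts, var = μ + φμ², so a single sample's squared coefficient of variation is 1/μ + φ: deeper sequencing shrinks only the shot-noise term 1/μ, while the biological dispersion φ sets a floor that only more replicates can average away. The parameter values below are illustrative, not taken from the paper:

```python
# Hedged sketch of the samples-vs-depth trade-off under a negative binomial
# count model (delta-method approximation; all numbers are illustrative).
import math

def lfc_standard_error(mu, phi, n):
    """Approximate standard error of a log2 fold change between two groups
    of n replicates each: per-sample squared CV is 1/mu + phi, group means
    divide that by n, the two-group difference doubles it, and dividing by
    ln(2) converts natural-log to log2 units."""
    cv2 = 1.0 / mu + phi
    return math.sqrt(2.0 * cv2 / n) / math.log(2)

# Doubling depth (mu 100 -> 200) vs doubling replicates (n 3 -> 6), phi = 0.05
se_base = lfc_standard_error(mu=100, phi=0.05, n=3)
se_deeper = lfc_standard_error(mu=200, phi=0.05, n=3)
se_more_reps = lfc_standard_error(mu=100, phi=0.05, n=6)
```

With these illustrative numbers, doubling replicates cuts the standard error far more than doubling depth does, mirroring the abstract's recommendation.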
Quantum transport through STM-lifted single PTCDA molecules
Using a scanning tunneling microscope we have measured the quantum
conductance through a PTCDA molecule for different configurations of the
tip-molecule-surface junction. A peculiar conductance resonance arises at the
Fermi level for certain tip to surface distances. We have relaxed the molecular
junction coordinates and calculated transport by means of the Landauer/Keldysh
approach. The zero-bias transmission calculated for fixed lateral tip positions
but different tip-substrate distances shows a clear shift and sharpening of the
molecular chemisorption level with increasing STM-surface distance, in
agreement with experiment. Comment: accepted for publication in Applied Physics
A Revised Design for Microarray Experiments to Account for Experimental Noise and Uncertainty of Probe Response
Background
Although microarrays are widely used analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance.
Results
Two custom arrays were used to evaluate the revised design: one based on 25 mer probes from an Affymetrix design and the other based on 60 mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal with the added option of obtaining absolute rather than relative measurements.
Conclusion
The assessment of technical variance within the experiments, combined with the calibration of probes, allows poorly responding probes to be removed and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations
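The calibration step described above can be sketched concretely. The snippet below is an illustration under my own assumptions, not the authors' code: it fits a Langmuir-type adsorption model S(c) = S_max·c/(K + c) to a dilution series via the double-reciprocal linearization 1/S = 1/S_max + (K/S_max)·(1/c). A probe whose fitted S_max sits near background, or whose fit is poor, would be dropped as nonresponsive:

```python
# Hedged sketch: calibrate one probe against a dilution series by fitting a
# Langmuir adsorption model through its double-reciprocal linear form.
def fit_langmuir(concs, signals):
    """Least-squares line through (1/c, 1/S); returns (S_max, K)."""
    xs = [1.0 / c for c in concs]
    ys = [1.0 / s for s in signals]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx       # = 1 / S_max
    s_max = 1.0 / intercept
    k = slope * s_max                 # slope = K / S_max
    return s_max, k

# Synthetic noise-free dilution series generated from S_max = 1000, K = 2
concs = [0.5, 1, 2, 4, 8, 16]
signals = [1000 * c / (2 + c) for c in concs]
s_max, k = fit_langmuir(concs, signals)
```

Once (S_max, K) are fitted per probe, inverting the model as c = K·S/(S_max − S) gives the absolute target concentration from a measured signal, which is the route to the absolute quantification the abstract mentions.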