
    Double quantum dot with integrated charge sensor based on Ge/Si heterostructure nanowires

    Coupled electron spins in semiconductor double quantum dots hold promise as the basis for solid-state qubits. To date, most experiments have used III-V materials, in which coherence is limited by hyperfine interactions. Ge/Si heterostructure nanowires seem ideally suited to overcome this limitation: the predominance of spin-zero nuclei suppresses the hyperfine interaction, and chemical synthesis creates a clean, defect-free system with highly controllable properties. Here we present a top-gate-defined double quantum dot based on Ge/Si heterostructure nanowires with fully tunable coupling between the dots and to the leads. We also demonstrate a novel approach to charge sensing in a one-dimensional nanostructure by capacitively coupling the double dot to a single dot on an adjacent nanowire. The double quantum dot and integrated charge sensor serve as an essential building block required to form a solid-state spin qubit free of nuclear spins.

    Estimating Surface Area in Early Hominins

    Height- and weight-based methods of estimating surface area have played an important role in the development of the current consensus regarding the role of thermoregulation in human evolution. However, such methods may not be reliable when applied to early hominins, because their limb proportions differ markedly from those of humans. Here, we report a study in which this possibility was evaluated by comparing surface area estimates generated with the best-known height- and weight-based method to estimates generated with a method that is sensitive to proportional differences. We found that the two methods yield indistinguishable estimates when applied to taxa whose limb proportions are similar to those of humans, but significantly different results when applied to taxa whose proportions differ from those of humans. We also found that the discrepancy between the estimates generated by the two methods is almost entirely attributable to inter-taxa differences in limb proportions. One corollary of these findings is that we need to reassess hypotheses about the role of thermoregulation in human evolution that have been developed with the aid of height- and weight-based methods of estimating body surface area. Another is that we need to use other methods in future work on fossil hominin body surface areas.
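    As a concrete illustration of what a height- and weight-based method looks like, the classic Du Bois formula estimates body surface area from weight and height alone. A minimal sketch follows; the formula itself is standard, but the example values are hypothetical, and this is not necessarily the specific method evaluated in the study:

```python
def dubois_bsa(weight_kg: float, height_cm: float) -> float:
    """Du Bois & Du Bois (1916) body surface area estimate, in m^2.

    BSA = 0.007184 * W^0.425 * H^0.725 -- note that only height and
    weight enter the formula; limb proportions never appear.
    """
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

# Illustrative values only: a 65 kg, 170 cm individual.
print(round(dubois_bsa(65.0, 170.0), 2))
```

    Because limb proportions never enter the formula, two taxa of equal height and weight but very different limb lengths receive identical estimates, which is exactly the failure mode the study targets.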

    High resolution structural evidence suggests the Sarcoplasmic Reticulum forms microdomains with acidic stores (lysosomes) in the heart

    Nicotinic Acid Adenine Dinucleotide Phosphate (NAADP) stimulates calcium release from acidic stores such as lysosomes and is a highly potent calcium-mobilising second messenger. NAADP plays an important role in calcium signalling in the heart under basal conditions and following β-adrenergic stress. Nevertheless, the spatial interaction of acidic stores with other parts of the calcium signalling apparatus in cardiac myocytes is unknown. We present evidence that lysosomes are intimately associated with the sarcoplasmic reticulum (SR) in ventricular myocytes; a median separation of 20 nm in 2D electron microscopy and 3.3 nm in 3D electron tomography indicates a genuine signalling microdomain between these organelles. Fourier analysis of immunolabelled lysosomes suggests a sarcomeric pattern (dominant wavelength 1.80 μm). Furthermore, we show that lysosomes form close associations with mitochondria (median separation 6.2 nm in 3D studies), which may provide a basis for the recently discovered role of NAADP in reperfusion-induced cell death. The trigger hypothesis for NAADP action proposes that calcium release from acidic stores subsequently acts to enhance calcium release from the SR. This work provides structural evidence in cardiac myocytes to indicate the formation of microdomains between acidic and SR calcium stores, supporting emerging interpretations of NAADP physiology and pharmacology in the heart.
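    The sarcomeric periodicity reported here can be recovered by the kind of Fourier analysis described: take the power spectrum of a labelling-intensity profile along the cell and read off the dominant wavelength. A minimal sketch on simulated data follows; the 1.8 μm period and all profile values are illustrative stand-ins, not the paper's measurements:

```python
import numpy as np

# Hypothetical 1-D labelling-intensity profile along a myocyte,
# simulated with a periodic component every 1.8 um.
dx = 0.05                       # sampling interval, um
x = np.arange(0.0, 50.0, dx)    # 50 um scan line
signal = 1.0 + np.cos(2 * np.pi * x / 1.8)

# Power spectrum of the mean-subtracted profile; the zero-frequency
# (mean) component is removed so it cannot dominate the peak search.
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(signal.size, d=dx)
dominant_wavelength = 1.0 / freqs[np.argmax(power)]
print(round(dominant_wavelength, 2))  # close to the 1.8 um input period
```

    The frequency resolution is limited by the scan length (here 1/50 μm⁻¹), so the recovered wavelength lands on the nearest frequency bin rather than exactly on 1.8 μm.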

    Generalized shrinkage F-like statistics for testing an interaction term in gene expression analysis in the presence of heteroscedasticity

    BACKGROUND: Many analyses of gene expression data involve hypothesis tests of an interaction term between two fixed effects, typically tested using a residual variance. In expression studies, the issue of variance heteroscedasticity has received much attention, and previous work has focused on either between-gene or within-gene heteroscedasticity. However, in a single experiment, heteroscedasticity may exist both within and between genes. Here we develop flexible shrinkage error estimators that account for both between-gene and within-gene heteroscedasticity, and use them to construct F-like test statistics for testing interactions, with cutoff values obtained by permutation. These permutation tests are complicated, and several permutation schemes are investigated here. RESULTS: Our proposed test statistics are compared with other existing shrinkage-type test statistics through extensive simulation studies and a real data example. The results show that the choice of permutation procedure has dramatically more influence on detection power than the choice of F or F-like test statistic. When both types of gene heteroscedasticity exist, our proposed test statistics control preselected type-I error rates and are more powerful. Raw data permutation is not valid in this setting. Whether unrestricted or restricted residual permutation should be used depends on the specific type of test statistic. CONCLUSIONS: The F-like test statistic that uses the proposed flexible shrinkage error estimator, accounting for both types of gene heteroscedasticity, together with unrestricted residual permutation, provides a statistically valid and powerful test. We therefore recommend that it always be applied when testing an interaction term in the analysis of real gene expression data.
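    To make the residual-permutation logic concrete, here is a minimal permutation test of a 2×2 interaction using an ordinary (non-shrunken) balanced-ANOVA F statistic. The design, sample sizes, and simulated values are all hypothetical, and the paper's shrinkage error estimators are not implemented; only the "permute null-model residuals, then recompute F" machinery is shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 design (factors A and B), n replicates per cell,
# simulated with additive effects only (no true interaction).
n = 6
data = {(a, b): rng.normal(loc=a + 2 * b, scale=1.0, size=n)
        for a in (0, 1) for b in (0, 1)}
keys = sorted(data)

def additive_fit(d):
    # Fitted cell values under the no-interaction (null) model.
    grand = np.mean([v for cell in d.values() for v in cell])
    am = {a: np.concatenate([d[a, 0], d[a, 1]]).mean() for a in (0, 1)}
    bm = {b: np.concatenate([d[0, b], d[1, b]]).mean() for b in (0, 1)}
    return {(a, b): am[a] + bm[b] - grand for (a, b) in d}

def f_interaction(d):
    # Classical balanced two-way ANOVA F statistic for the A:B interaction.
    fit = additive_fit(d)
    m = len(d[0, 0])
    ss_int = sum(m * (d[k].mean() - fit[k]) ** 2 for k in d)      # df = 1
    ss_err = sum(((d[k] - d[k].mean()) ** 2).sum() for k in d)    # df = 4(m-1)
    return (ss_int / 1.0) / (ss_err / (4 * (m - 1)))

# Unrestricted residual permutation: shuffle null-model residuals across
# all cells, add them back onto the null fits, and recompute F.
null_fit = additive_fit(data)
resid = np.concatenate([data[k] - null_fit[k] for k in keys])
f_obs = f_interaction(data)
n_perm = 999
hits = 0
for _ in range(n_perm):
    chunks = np.split(rng.permutation(resid), 4)
    perm = {k: null_fit[k] + c for k, c in zip(keys, chunks)}
    if f_interaction(perm) >= f_obs:
        hits += 1
p_value = (hits + 1) / (n_perm + 1)
print(p_value)
```

    Since the simulated data contain no true interaction, the p-value here should be unremarkable; the point is the mechanics, not the verdict.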

    Cadaveric and three-dimensional computed tomography study of the morphology of the scapula with reference to reversed shoulder prosthesis

    PURPOSE: The purpose of this study is to analyze the morphology of the scapula with reference to glenoid component implantation in reversed shoulder prosthesis, in order to improve primary fixation of the component. METHODS: Seventy-three three-dimensional computed tomography scans of the scapula and 108 scapular dry specimens were analyzed to determine the anterior and posterior length of the glenoid neck, the angle between the glenoid surface and the upper posterior column of the scapula, and the angles between the major cranio-caudal glenoid axis and both the base of the coracoid process and the upper posterior column. RESULTS: The anterior and posterior length of the glenoid neck fell into two groups, "short-neck" and "long-neck", with significant differences between them. The angle between the glenoid surface and the upper posterior column of the scapula was also classified into two types: type I (mean 50°–52°) and type II (mean 62.50°–64°), with significant differences between them (p < 0.001). The angle between the major cranio-caudal glenoid axis and the base of the coracoid process averaged 18.25°, while the angle with the upper posterior column of the scapula averaged 8°. CONCLUSION: Scapular morphological variability argues for individual adjustment of glenoid component implantation in reversed total shoulder prosthesis. Three-dimensional computed tomography of the scapula constitutes an important tool when planning reversed prosthesis implantation.

    Optimizing Preprocessing and Analysis Pipelines for Single-Subject fMRI: 2. Interactions with ICA, PCA, Task Contrast and Inter-Subject Heterogeneity

    A variety of preprocessing techniques are available to correct subject-dependent artifacts in fMRI caused by head motion and physiological noise. Although it has been established that the chosen preprocessing steps (or "pipeline") may significantly affect fMRI results, it is not well understood how preprocessing choices interact with other parts of the fMRI experimental design. In this study, we examine how two experimental factors interact with preprocessing: between-subject heterogeneity and strength of task contrast. Two levels of cognitive contrast were examined in an fMRI adaptation of the Trail-Making Test, with data from young, healthy adults. The importance of standard preprocessing with motion correction, physiological noise correction, motion parameter regression and temporal detrending was examined for the two task contrasts. We also tested subspace estimation using Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Results were obtained for Penalized Discriminant Analysis, and model performance was quantified with reproducibility (R) and prediction (P) metrics. Simulation methods were also used to test for potential biases from individual-subject optimization. Our results demonstrate that (1) individual pipeline optimization is not significantly more biased than fixed preprocessing. In addition, (2) when applying a fixed pipeline across all subjects, the task contrast significantly affects pipeline performance; in particular, the effects of PCA and ICA models vary with contrast, and these models are not by themselves optimal preprocessing steps. Also, (3) selecting the optimal pipeline for each subject improves within-subject (P, R) metrics and between-subject overlap, with the weaker cognitive contrast being more sensitive to pipeline optimization. These results demonstrate that the sensitivity of fMRI results is influenced not only by preprocessing choices, but also by their interactions with other experimental design factors.
    This paper outlines a quantitative procedure to denoise data that would otherwise be discarded due to artifact; this is particularly relevant for weak signal contrasts in single-subject, small-sample and clinical datasets.
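    As a sketch of what PCA subspace estimation does in such a pipeline, the example below projects noisy time × voxel data onto its top principal components via the SVD. The dimensions, rank, and noise level are invented for illustration and bear no relation to the study's actual data or pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fMRI-like data: T time points x V voxels, built as a low-rank
# "signal" subspace plus additive noise (purely illustrative).
T, V, k = 200, 500, 5
signal = rng.normal(size=(T, k)) @ rng.normal(size=(k, V))
noisy = signal + 0.5 * rng.normal(size=(T, V))

# PCA subspace estimation via the SVD of the centered data:
# keep only the top-k components and reconstruct.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
denoised = U[:, :k] @ np.diag(s[:k]) @ Vt[:k] + mean

err_before = np.linalg.norm(noisy - signal)
err_after = np.linalg.norm(denoised - signal)
print(err_after < err_before)  # rank-k projection discards most of the noise
```

    The caveat in the abstract applies here too: choosing k (the subspace dimension) is itself a pipeline decision, and the optimal choice can vary with task contrast and subject.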

    Haplotype-based quantitative trait mapping using a clustering algorithm

    BACKGROUND: With the availability of large-scale, high-density single-nucleotide polymorphism (SNP) markers, substantial effort has been made in identifying disease-causing genes using linkage disequilibrium (LD) mapping by haplotype analysis of unrelated individuals. In addition to complex diseases, many continuously distributed quantitative traits are of primary clinical and health significance. However, the development of association mapping methods using unrelated individuals for quantitative traits has received relatively less attention. RESULTS: We recently developed an association mapping method for complex diseases by mining the sharing of haplotype segments (i.e., phased genotype pairs) in affected individuals that are rarely present in normal individuals. In this paper, we extend our previous work to address the problem of quantitative trait mapping from unrelated individuals. The method is non-parametric in nature, and statistical significance can be obtained by a permutation test. It can also be incorporated into the one-way ANCOVA (analysis of covariance) framework so that other factors and covariates can be easily accommodated. The effectiveness of the approach is demonstrated by extensive experimental studies using both simulated and real data sets. The results show that our haplotype-based approach is more robust than two statistical methods based on single markers: a single-SNP association test (SSA) and the Mann-Whitney U-test (MWU). The algorithm has been incorporated into our existing software package, HapMiner, which is available from our website. CONCLUSION: For QTL (quantitative trait loci) fine mapping, to identify QTNs (quantitative trait nucleotides) with realistic effects (each QTN contributing less than 10% of the total variance of the trait), large sample sizes (≥ 500) are needed for all the methods. The overall performance of HapMiner is better than that of the other two methods.
    Its effectiveness further depends on other factors such as recombination rates and the density of typed SNPs. Haplotype-based methods might provide higher power than methods based on a single SNP when using tag SNPs selected from a small number of samples or other sources (such as HapMap data). Rank-based statistics usually have much lower power, as shown in our study.
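    For comparison, one of the single-marker baselines (an SSA-style test) can be sketched as a permutation test of genotype-class means at a single SNP. The genotypes, effect size, and sample size below are simulated for illustration only and do not implement HapMiner's haplotype clustering:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated single-SNP data: genotype = copies of the minor allele (0/1/2)
# and a quantitative phenotype with a hypothetical additive QTN effect.
n = 300
genotype = rng.integers(0, 3, size=n)
phenotype = 0.4 * genotype + rng.normal(size=n)

def between_group_ss(geno, pheno):
    # Between-group sum of squares of genotype-class means,
    # weighted by class size (large when classes differ in mean).
    return sum(np.sum(geno == g) * (pheno[geno == g].mean() - pheno.mean()) ** 2
               for g in np.unique(geno))

obs = between_group_ss(genotype, phenotype)
perms = [between_group_ss(genotype, rng.permutation(phenotype))
         for _ in range(999)]
p_value = (1 + sum(p >= obs for p in perms)) / 1000
print(p_value)  # small p-value: the simulated association is detected
```

    A haplotype-based method would instead group individuals by shared haplotype segments spanning several SNPs before comparing phenotype distributions, which is where the abstract's robustness claims come from.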

    Small-Bodied Humans from Palau, Micronesia

    UNLABELLED: Newly discovered fossil assemblages of small-bodied Homo sapiens from Palau, Micronesia, possess characters thought to be taxonomically primitive for the genus Homo. BACKGROUND: Recent surface collection and test excavation in limestone caves in the rock islands of Palau, Micronesia, has produced a sizeable sample of human skeletal remains dating roughly between 940 and 2890 cal ybp. PRINCIPAL FINDINGS: Preliminary analysis indicates that this material is important for two reasons. First, individuals from the older time horizons are small in body size even relative to "pygmoid" populations from Southeast Asia and Indonesia, and thus may represent a marked case of human insular dwarfism. Second, while possessing a number of derived features that align them with Homo sapiens, the human remains from Palau also exhibit several skeletal traits that are considered to be primitive for the genus Homo. SIGNIFICANCE: These features may be previously unrecognized developmental correlates of small body size and, if so, they may have important implications for interpreting the taxonomic affinities of fossil specimens of Homo.