
    Disruption of mesoderm formation during cardiac differentiation due to developmental exposure to 13-cis-retinoic acid.

    13-cis-retinoic acid (isotretinoin, INN) is an oral pharmaceutical drug used for the treatment of skin acne and is also a known teratogen. In this study, the molecular mechanisms underlying INN-induced developmental toxicity during early cardiac differentiation were investigated using both human induced pluripotent stem cells (hiPSCs) and human embryonic stem cells (hESCs). Pre-exposure of hiPSCs and hESCs to a sublethal concentration of INN did not affect cell proliferation or pluripotency. However, mesodermal differentiation was disrupted when INN was included in the medium during differentiation. Transcriptomic profiling by RNA-seq revealed that INN exposure leads to aberrant expression of genes involved in several signaling pathways that control early mesoderm differentiation, such as TGF-beta signaling. In addition, genome-wide chromatin accessibility profiling by ATAC-seq suggested that INN exposure leads to enhanced DNA binding of specific transcription factors (TFs), including HNF1B, SOX10 and NFIC, often in close spatial proximity to genes that are dysregulated in response to INN treatment. Altogether, these results identify potential molecular mechanisms underlying INN-induced perturbation of mesodermal differentiation in the context of cardiac development. This study further highlights the utility of human stem cells as an alternative system for investigating congenital diseases of newborns that arise as a result of maternal drug exposure during pregnancy.
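    The ATAC-seq observation above amounts to linking motif-bearing accessible peaks to nearby dysregulated genes. A minimal sketch of that kind of proximity analysis follows, assuming hypothetical input files and column names (inn_diff_peaks.csv, inn_deg_tss.csv) and an arbitrary 50 kb window; this illustrates the idea, not the study's pipeline.

```python
# Hedged sketch: associate differentially accessible ATAC-seq peaks carrying a
# TF motif (e.g., HNF1B) with nearby INN-dysregulated genes. File names, column
# names, and the 50 kb window are illustrative assumptions.
import pandas as pd

WINDOW = 50_000  # bp; assumed proximity cutoff

peaks = pd.read_csv("inn_diff_peaks.csv")   # columns: chrom, start, end, motif
genes = pd.read_csv("inn_deg_tss.csv")      # columns: gene, chrom, tss

hits = []
for _, g in genes.iterrows():
    near = peaks[(peaks.chrom == g.chrom)
                 & (peaks.start < g.tss + WINDOW)
                 & (peaks.end > g.tss - WINDOW)]
    for motif in near.motif.unique():
        hits.append({"gene": g.gene, "motif": motif})

links = pd.DataFrame(hits)
# Rank motifs by how many dysregulated genes they sit near.
print(links.groupby("motif").gene.nunique().sort_values(ascending=False))
```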

    Spectral Classification and Luminosity Function of Galaxies in the Las Campanas Redshift Survey

    We construct a spectral classification scheme for the galaxies of the Las Campanas Redshift Survey (LCRS) based on a principal component analysis of the measured galaxy spectra. We interpret the physical significance of our six spectral types and conclude that they are sensitive to morphological type and the amount of active star formation. In this first analysis of the LCRS to include spectral classification, we estimate the general luminosity function, expressed as a weighted sum of the type-specific luminosity functions. In the R-band magnitude range of -23 < M <= -16.5, this function exhibits a broad shoulder centered near M = -20, and an increasing faint-end slope which formally converges on an alpha value of about -1.8 in the faint limit. The Schechter parameterization does not provide a good representation in this case, a fact which may partly explain the reported discrepancy between the luminosity functions of the LCRS and other redshift catalogs such as the Century Survey (Geller et al. 1997). The discrepancy may also arise from environmental effects such as the density-morphology relationship, for which we see strong evidence in the LCRS galaxies. However, the Schechter parameterization is more effective for the luminosity functions of the individual spectral types. The data show a significant, progressive steepening of the faint-end slope, from alpha = +0.5 for early-type objects to alpha = -1.8 for the extreme late-type galaxies. The extreme late-type population has a sufficiently high space density that its contribution to the general luminosity function is expected to dominate fainter than M = -16. We conclude that an evaluation of type-dependence is essential to any assessment of the general luminosity function. Comment: 21 pages (LaTeX), 7 figures (Postscript). To appear in the Astrophysical Journal. The discussion of environmental dependence of luminosity functions has been shortened; the material from the earlier version now appears in a separate manuscript (astro-ph/9805197).
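    Two ingredients above lend themselves to a compact illustration: classifying spectra by principal components and evaluating a Schechter luminosity function. The sketch below assumes a placeholder spectra array and placeholder parameter values (M*, alpha, phi*), not the LCRS measurements.

```python
# Hedged sketch: (1) PCA-based spectral classification via SVD and
# (2) the Schechter luminosity function in magnitudes.
import numpy as np
from numpy.linalg import svd

def pca_types(spectra, n_components=6):
    """Project mean-subtracted spectra (rows = galaxies) onto the leading
    eigenspectra; the resulting coefficients define the spectral types."""
    X = spectra - spectra.mean(axis=0)
    _, _, Vt = svd(X, full_matrices=False)   # rows of Vt are eigenspectra
    return X @ Vt[:n_components].T

def schechter(M, M_star=-20.3, alpha=-1.8, phi_star=1e-2):
    """Schechter LF phi(M): parameter values here are placeholders."""
    x = 10 ** (-0.4 * (M - M_star))
    return 0.4 * np.log(10) * phi_star * x ** (alpha + 1) * np.exp(-x)
```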

    The Tully-Fisher Relation as a Measure of Luminosity Evolution: A Low Redshift Baseline for Evolving Galaxies

    We use optical rotation curves to investigate the R-band Tully-Fisher properties of a sample of 90 spiral galaxies in close pairs. The galaxies follow the Tully-Fisher relation remarkably well, with the exception of eight distinct 3-sigma outliers. Although most of the outliers show signs of recent star formation, gasdynamical effects are probably the dominant cause of their anomalous Tully-Fisher properties. Four outliers with small emission line widths have very centrally concentrated line emission and truncated rotation curves; the central emission indicates recent gas infall after a close galaxy-galaxy pass. These four galaxies may be local counterparts to compact, blue galaxies at intermediate redshift. The remaining galaxies have a negligible offset from the reference Tully-Fisher relation, but a shallower slope (2.6-sigma significance) and a 25% larger scatter. We characterize the non-outlier sample with measures of distortion and star formation to search for third-parameter dependence in the residuals of the TF relation. Severe kinematic distortion is the only significant predictor of TF residuals; this distortion is not, however, responsible for the slope difference from the reference distribution. Because the outliers are easily removed by sigma clipping, we conclude that even in the presence of some tidal distortion, detection of moderate luminosity evolution should be possible with high-redshift samples the size of this 90-galaxy study. (Abridged.) Comment: LaTeX document, 55 pages including 18 figures, to appear in A
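    The sigma-clipping step mentioned above is straightforward to make concrete. The following sketch fits a linear Tully-Fisher relation M = a + b log10(W) and iteratively rejects 3-sigma outliers; the input arrays stand in for measured magnitudes and line widths.

```python
# Hedged sketch of iterative 3-sigma clipping around a linear TF fit.
import numpy as np

def tf_fit_clip(logW, M, nsigma=3.0, max_iter=10):
    """Fit M = a + b*logW, rejecting points beyond nsigma residual sigma."""
    keep = np.ones_like(M, dtype=bool)
    for _ in range(max_iter):
        b, a = np.polyfit(logW[keep], M[keep], 1)   # slope, intercept
        resid = M - (a + b * logW)
        sigma = resid[keep].std()
        new_keep = np.abs(resid) < nsigma * sigma
        if np.array_equal(new_keep, keep):          # converged
            break
        keep = new_keep
    return a, b, sigma, keep                        # outliers are ~keep
```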

    Simulating open quantum systems: from many-body interactions to stabilizer pumping

    In a recent experiment, Barreiro et al. demonstrated the fundamental building blocks of an open-system quantum simulator with trapped ions [Nature 470, 486 (2011)]. Using up to five ions, single- and multi-qubit entangling gate operations were combined with optical pumping in stroboscopic sequences. This enabled the implementation of both coherent many-body dynamics and dissipative processes by controlling the coupling of the system to an artificial, suitably tailored environment. This engineering was illustrated by the dissipative preparation of entangled two- and four-qubit states, the simulation of coherent four-body spin interactions, and the quantum non-demolition measurement of a multi-qubit stabilizer operator. In the present paper, we develop the theoretical framework of this gate-based ("digital") simulation approach for open-system dynamics with trapped ions. In addition, we discuss how minimal instances of spin models of interest in topological quantum computing and condensed matter physics can be realized within this simulation approach in state-of-the-art linear ion-trap quantum computing architectures. We outline concrete simulation schemes for Kitaev's toric code Hamiltonian and a recently suggested color code model. The presented simulation protocols can be adapted to scalable and two-dimensional ion-trap architectures, which are currently under development. Comment: 27 pages, 9 figures, submitted to NJP Focus on Topological Quantum Computation
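    A minimal toy version of the dissipative element described above is a Kraus map that pumps two qubits into the +1 eigenspace of a ZZ stabilizer. The dense-matrix sketch below illustrates the idea only; it does not reproduce the trapped-ion gate sequence.

```python
# Hedged sketch: one stabilizer-pumping step as a Kraus map driving two qubits
# into the +1 eigenspace of ZZ (a toy model, not the experimental protocol).
import numpy as np

I2 = np.eye(2); X = np.array([[0, 1], [1, 0]]); Z = np.diag([1, -1])
ZZ = np.kron(Z, Z)
P_plus = (np.eye(4) + ZZ) / 2          # projector onto +1 eigenspace
P_minus = (np.eye(4) - ZZ) / 2
K1 = P_plus                            # keep already-stabilized components
K2 = np.kron(X, I2) @ P_minus          # flip qubit 1 to repair -1 components

def pump(rho):
    """Apply the completely positive map rho -> sum_k K_k rho K_k^dagger."""
    return K1 @ rho @ K1.conj().T + K2 @ rho @ K2.conj().T

rho = np.eye(4) / 4                    # maximally mixed start
for _ in range(3):
    rho = pump(rho)
print(np.trace(ZZ @ rho).real)         # -> 1.0: state pumped into +1 space
```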

    Hypervelocity Stars: From the Galactic Center to the Halo

    Hypervelocity stars (HVS) traverse the Galaxy from the central black hole to the outer halo. We show that the Galactic potential within 200 pc acts as a high-pass filter, preventing low-velocity HVS from reaching the halo. To trace the orbits of HVS throughout the Galaxy, we construct two forms of the potential which reasonably represent the observations in the range 5--100,000 pc: a simple spherically symmetric model and a bulge-disk-halo model. We use the Hills mechanism (disruption of binaries by the tidal field of the central black hole) to inject HVS into the Galaxy and compute the observable spatial and velocity distributions of HVS with masses in the range 0.6--4 Msun. These distributions reflect the mass function in the Galactic Center, the properties of binaries in the Galactic Center, and aspects of stellar evolution and the injection mechanism. For 0.6--4 Msun main sequence stars, the fraction of unbound HVS and the asymmetry of the velocity distribution for their bound counterparts increase with stellar mass. The density profiles for unbound HVS decline with distance from the Galactic Center approximately as r^{-2} (but are steeper for the most massive stars, which evolve off the main sequence during their travel time from the Galactic Center); the density profiles for the bound ejecta decline with distance approximately as r^{-3}. In a survey with a limiting visual magnitude V of 23, the detectability of HVS (unbound or bound) increases with stellar mass. Comment: 32 pages of text, 5 tables, 12 figures, ApJ, accepted; revisions: corrected typos, added references, clarified some aspects of the potential model and physical processes affecting the relative frequency of HVS.
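    The high-pass-filter effect of the inner potential can be illustrated with a toy integration. The sketch below traces a purely radial trajectory in a singular isothermal sphere with Vc = 220 km/s, an assumed stand-in for the paper's more elaborate 5--100,000 pc models: in this toy model a 400 km/s ejection turns around within ~100 pc of the center, while an 800 km/s ejection climbs to several kpc.

```python
# Hedged sketch: radial HVS trajectory in a toy spherical potential
# phi(r) = VC^2 ln(r). Units: r in kpc, v in km/s, time in kpc/(km/s)
# (~0.98 Gyr per unit). All parameter values are illustrative.
VC = 220.0                                   # km/s, assumed circular velocity

def accel(r):
    """Inward radial acceleration, (km/s)^2 per kpc."""
    return -VC**2 / r

def trace(r0=0.01, v0=800.0, dt=1e-5, t_max=0.05):
    """Leapfrog-integrate a radial orbit launched at r0 = 10 pc."""
    r, v = r0, v0
    for _ in range(int(t_max / dt)):
        v += 0.5 * dt * accel(r)
        r += dt * v
        if r <= 0.0:                         # star fell back through the center
            break
        v += 0.5 * dt * accel(r)
    return r, v

for v0 in (400.0, 800.0):                    # ejection speeds at 10 pc
    print(v0, trace(v0=v0))                  # slower ejecta never reach the halo
```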

    Can One Trust Quantum Simulators?

    Various fundamental phenomena of strongly correlated quantum systems, such as high-T_c superconductivity, the fractional quantum Hall effect, and quark confinement, are still awaiting a universally accepted explanation. The main obstacle is the computational complexity of solving even the most simplified theoretical models that are designed to capture the relevant quantum correlations of the many-body system of interest. In his seminal 1982 paper [Int. J. Theor. Phys. 21, 467], Richard Feynman suggested that such models might be solved by "simulation" with a new type of computer whose constituent parts are effectively governed by a desired quantum many-body dynamics. Measurements on this engineered machine, now known as a "quantum simulator," would reveal some unknown or difficult-to-compute properties of a model of interest. We argue that a useful quantum simulator must satisfy four conditions: relevance, controllability, reliability, and efficiency. We review the current state of the art of digital and analog quantum simulators. Whereas so far the majority of the focus, both theoretical and experimental, has been on the controllability of relevant models, we emphasize here the need for a careful analysis of reliability and efficiency in the presence of imperfections. We discuss how disorder and noise can impact these conditions, and illustrate our concerns with novel numerical simulations of a paradigmatic example: a disordered quantum spin chain governed by the Ising model in a transverse magnetic field. We find that disorder can decrease the reliability of an analog quantum simulator of this model, although large errors in local observables are introduced only for strong levels of disorder. We conclude that the answer to the question "Can we trust quantum simulators?" is... to some extent. Comment: 20 pages. Minor changes with respect to version 2 (some additional explanations, added references...)
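    The paradigmatic example above, a disordered transverse-field Ising chain, is small enough to treat by exact diagonalization. The sketch below builds H = -sum_i J_i Z_i Z_{i+1} - sum_i h_i X_i for eight spins with uniformly distributed disorder; the chain length, coupling scale, and disorder distribution are illustrative choices, not those of the paper.

```python
# Hedged sketch: dense-matrix ground state of a small disordered
# transverse-field Ising chain.
import numpy as np

def tfim_ground_state(J, h):
    """H = -sum_i J[i] Z_i Z_{i+1} - sum_i h[i] X_i on len(h) spins."""
    n = len(h)
    Z = np.diag([1.0, -1.0]); X = np.array([[0.0, 1.0], [1.0, 0.0]])
    def op(site_ops):
        """Kronecker product with the given operators on the given sites."""
        mats = dict(site_ops)
        out = np.array([[1.0]])
        for i in range(n):
            out = np.kron(out, mats.get(i, np.eye(2)))
        return out
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= J[i] * op([(i, Z), (i + 1, Z)])
    for i in range(n):
        H -= h[i] * op([(i, X)])
    vals, vecs = np.linalg.eigh(H)
    return vals[0], vecs[:, 0]

rng = np.random.default_rng(0)
n, W = 8, 0.5                          # 8 spins, disorder strength W (assumed)
J = 1.0 + W * rng.uniform(-1, 1, n - 1)
h = 1.0 + W * rng.uniform(-1, 1, n)
E0, psi = tfim_ground_state(J, h)
print("ground energy per spin:", E0 / n)
```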

    Synergistic drug-cytokine induction of hepatocellular death as an in vitro approach for the study of inflammation-associated idiosyncratic drug hepatotoxicity

    Idiosyncratic drug hepatotoxicity represents a major problem in drug development due to the inadequacy of current preclinical screening assays, but recently established rodent models utilizing bacterial LPS co-administration to induce an inflammatory background have successfully reproduced idiosyncratic hepatotoxicity signatures for certain drugs. However, the low-throughput nature of these models renders them problematic for employment as preclinical screening assays. Here, we present an analogous, but high-throughput, in vitro approach in which drugs are administered to a variety of cell types (primary human and rat hepatocytes and the human HepG2 cell line) across a landscape of inflammatory contexts containing LPS and the cytokines TNF, IFNγ, IL-1α, and IL-6. Using this assay, we observed drug–cytokine hepatotoxicity synergies for multiple idiosyncratic hepatotoxicants (ranitidine, trovafloxacin, nefazodone, nimesulide, clarithromycin, and telithromycin) but not for their corresponding non-toxic control compounds (famotidine, levofloxacin, buspirone, and aspirin). A larger compendium of drug–cytokine mix hepatotoxicity data demonstrated that hepatotoxicity synergies were largely potentiated by TNF, IL-1α, and LPS within the context of multi-cytokine mixes. We then screened 90 drugs for cytokine synergy in human hepatocytes and found that a significantly larger fraction of the idiosyncratic hepatotoxicants (19%) synergized with a single cytokine mix than did the non-hepatotoxic drugs (3%). Finally, we used an information-theoretic approach to identify especially informative subsets of cytokine treatments for constructing effective regression models of drug- and cytokine-mix-induced hepatotoxicities across these cell systems. Our results suggest that this drug–cytokine co-treatment approach could provide a useful preclinical tool for investigating inflammation-associated idiosyncratic drug hepatotoxicity. Funding: Pfizer Inc.; Institute for Collaborative Biotechnologies; MIT Center for Cell Decision Processes; National Institute of Mental Health (U.S.) (grants P50-GM68762, T32-GM008334, and U19ES011399); Massachusetts Institute of Technology, Biotechnology Process Engineering Center; Massachusetts Institute of Technology, Center for Environmental Health Sciences; Whitaker Foundation.
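    One plausible way to score a drug–cytokine synergy of the kind reported above is the Bliss independence excess. The sketch below uses that metric with made-up death fractions; it is not the study's actual scoring scheme.

```python
# Hedged sketch: Bliss independence excess as a synergy score.
def bliss_excess(f_drug, f_cyt, f_combo):
    """Observed minus expected death fraction under Bliss independence."""
    expected = f_drug + f_cyt - f_drug * f_cyt
    return f_combo - expected

# Hypothetical death fractions: drug alone, cytokine mix alone, combination.
print(bliss_excess(0.05, 0.10, 0.60))   # large positive excess -> synergy
```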

    Is Persistent Motor or Vocal Tic Disorder a Milder Form of Tourette Syndrome?

    BACKGROUND: Persistent motor or vocal tic disorder (PMVT) has been hypothesized to be a forme fruste of Tourette syndrome (TS). Although the primary diagnostic criterion for PMVT (presence of motor or vocal tics, but not both) is clear, less is known about its clinical presentation. OBJECTIVE: The goals of this study were to compare the prevalence and number of comorbid psychiatric disorders, tic severity, age at tic onset, and family history for TS and PMVT. METHODS: We analyzed data from two independent cohorts using generalized linear equations and confirmed our findings using meta‐analyses, incorporating data from previously published literature. RESULTS: Rates of obsessive–compulsive disorder (OCD) and attention deficit hyperactivity disorder (ADHD) were lower in PMVT than in TS in all analyses. Other psychiatric comorbidities occurred with similar frequencies in PMVT and TS in both cohorts, although meta‐analyses suggested lower rates of most psychiatric disorders in PMVT compared with TS. ADHD and OCD increased the odds of comorbid mood, anxiety, substance use, and disruptive behaviors, and accounted for observed differences between PMVT and TS. Age at tic onset was approximately 2 years later, and tic severity was lower, in PMVT than in TS. First‐degree relatives had elevated rates of TS, PMVT, OCD, and ADHD compared with population prevalences, with rates of TS equal to or greater than rates of PMVT. CONCLUSIONS: Our findings support the hypothesis that PMVT and TS occur along a clinical spectrum in which TS is a more severe, and PMVT a less severe, manifestation of a continuous neurodevelopmental tic spectrum disorder. © 2021 The Authors. Movement Disorders published by Wiley Periodicals LLC on behalf of the International Parkinson and Movement Disorder Society.
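    The adjustment described above, in which ADHD and OCD account for group differences in other comorbidities, maps naturally onto a logistic regression. The sketch below assumes a hypothetical cohort file tic_cohort.csv with binary indicator columns; it illustrates the style of generalized linear model named in the abstract, not the study's exact specification.

```python
# Hedged sketch: does a comorbidity (e.g., mood disorder) differ between PMVT
# and TS after adjusting for ADHD and OCD? Column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tic_cohort.csv")   # columns: mood, dx_pmvt, adhd, ocd (0/1)
model = smf.logit("mood ~ dx_pmvt + adhd + ocd", data=df).fit()
print(model.summary())               # a dx_pmvt OR near 1 would match the finding
```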

    Statistical design of personalized medicine interventions: The Clarification of Optimal Anticoagulation through Genetics (COAG) trial

    Background: There is currently much interest in pharmacogenetics: determining variation in genes that regulate drug effects, with a particular emphasis on improving drug safety and efficacy. The ability to determine such variation motivates the application of personalized drug therapies that use a patient's genetic makeup to determine a safe and effective drug at the correct dose. To ascertain whether a genotype-guided drug therapy improves patient care, a personalized medicine intervention may be evaluated within the framework of a randomized controlled trial. The statistical design of this type of intervention requires special considerations: the distribution of relevant allelic variants in the study population, and whether the pharmacogenetic intervention is equally effective across subpopulations defined by those variants.

    Methods: The statistical design of the Clarification of Optimal Anticoagulation through Genetics (COAG) trial serves as an illustrative example of a personalized medicine intervention that uses each subject's genotype information. The COAG trial is a multicenter, double-blind, randomized clinical trial that will compare two approaches to the initiation of warfarin therapy: genotype-guided dosing, which initiates warfarin based on algorithms using clinical information and genotypes for polymorphisms in CYP2C9 and VKORC1; and clinical-guided dosing, which initiates warfarin based on algorithms using clinical information only.

    Results: We determine an absolute minimum detectable difference of 5.49% based on an assumed 60% population prevalence of zero or multiple genetic variants in either CYP2C9 or VKORC1 and an assumed 15% relative effectiveness of genotype-guided warfarin initiation for those with zero or multiple genetic variants. We thus calculate a sample size of 1238 to achieve 80% power for the primary outcome. We show that reasonable departures from these assumptions may decrease statistical power to 65%.

    Conclusions: In a personalized medicine intervention, the minimum detectable difference used in sample size calculations is not a known quantity but rather an unknown quantity that depends on the genetic makeup of the subjects enrolled. Given the possible sensitivity of sample size and power calculations to these key assumptions, we recommend that they be monitored during the conduct of a personalized medicine intervention.

    Trial Registration: clinicaltrials.gov NCT00839657
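    The dependence of power on design assumptions can be illustrated with the standard two-sample formula n per arm = 2 (sigma/delta)^2 (z_{1-alpha/2} + z_{1-beta})^2. The sketch below plugs in the 5.49% minimum detectable difference with an assumed outcome standard deviation of 25 percentage points; the trial's published sample size of 1238 rests on its own variance and dilution assumptions and is not reproduced here.

```python
# Hedged sketch: generic two-sample sample-size calculation. The SD is an
# assumed placeholder, not the COAG trial's outcome variance.
from scipy.stats import norm

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sided test."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (sigma / delta) ** 2 * z ** 2

# e.g. detect a 5.49-point difference in percent time-in-range, assumed SD 25:
print(n_per_arm(5.49, 25.0))   # ~326 per arm under these assumptions
```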