
    Prediction of adverse perinatal outcome by fetal biometry: comparison of customized and population-based standards

    Objective: To compare the predictive performance of estimated fetal weight (EFW) percentiles, according to eight growth standards, to detect fetuses at risk for adverse perinatal outcome. Methods: This was a retrospective cohort study of 3437 African-American women. Population-based (Hadlock, INTERGROWTH-21st, World Health Organization (WHO), Fetal Medicine Foundation (FMF)), ethnicity-specific (Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD)), customized (Gestation-Related Optimal Weight (GROW)) and African-American customized (Perinatology Research Branch (PRB)/NICHD) growth standards were used to calculate EFW percentiles from the last available scan prior to delivery. Prediction performance indices and relative risk (RR) were calculated for EFW < 10th and > 90th percentiles, according to each standard, for individual and composite adverse perinatal outcomes. Sensitivity at a fixed (10%) false-positive rate (FPR) and partial (FPR < 10%) and full areas under the receiver-operating-characteristics curve (AUC) were also calculated. Results: […] Although fetuses with EFW > 90th percentile were also at risk for any adverse perinatal outcome according to the INTERGROWTH-21st (RR = 1.4; 95% CI, 1.0–1.9) and Hadlock (RR = 1.7; 95% CI, 1.1–2.6) standards, many times fewer cases (2–5-fold lower sensitivity) were detected by using EFW > 90th percentile rather than EFW < 10th percentile. […] Conclusions: Fetuses with EFW < 10th percentile and those with EFW > 90th percentile were at increased risk of adverse perinatal outcomes according to all or some of the eight growth standards, respectively. The RR of a composite adverse perinatal outcome in pregnancies with EFW < 10th percentile was higher for the most-stringent (NICHD) than for the least-stringent (FMF) standard. The results of the complementary analysis of AUC suggest slightly improved detection of adverse perinatal outcome by the more recent population-based (INTERGROWTH-21st) and customized (PRB/NICHD) standards compared with the Hadlock and FMF standards. Published 2019. This article is a U.S. Government work and is in the public domain in the USA.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/153734/1/uog20299.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/153734/2/uog20299_am.pd
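    The screening metrics used here, relative risk for a percentile cut-off and sensitivity at a fixed 10% false-positive rate, are straightforward to compute. Below is a minimal sketch in Python; the data arrays, the assumed risk model, and the function names are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def relative_risk(flagged, outcome):
    """RR of adverse outcome in flagged vs. non-flagged pregnancies."""
    p_flagged = outcome[flagged].mean()   # risk when the EFW cut-off is crossed
    p_rest = outcome[~flagged].mean()     # risk in the remaining pregnancies
    return p_flagged / p_rest

def sensitivity_at_fpr(score, outcome, fpr=0.10):
    """Sensitivity when the score threshold is set to yield a fixed FPR.

    `score` is higher-is-riskier (e.g., 100 - EFW percentile when screening
    for small-for-gestational-age fetuses).
    """
    # Threshold chosen so that `fpr` of unaffected pregnancies screen positive.
    cutoff = np.quantile(score[outcome == 0], 1 - fpr)
    return (score[outcome == 1] > cutoff).mean()

# Illustrative data only: EFW percentiles under one standard + outcome labels.
rng = np.random.default_rng(0)
efw_pct = rng.uniform(0, 100, 3437)
outcome = rng.binomial(1, np.where(efw_pct < 10, 0.25, 0.10))  # assumed risk model

sga = efw_pct < 10  # EFW < 10th percentile flag
print("RR:", relative_risk(sga, outcome))
print("Sensitivity @ 10% FPR:", sensitivity_at_fpr(100 - efw_pct, outcome))
```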

    Synthetic lethal screening in the mammalian central nervous system identifies Gpx6 as a modulator of Huntington’s disease

    Huntington’s disease, the most common inherited neurodegenerative disease, is characterized by a dramatic loss of deep-layer cortical and striatal neurons, as well as morbidity in midlife. Human genetic studies led to the identification of the causative gene, huntingtin. Recent genomic advances have also led to the identification of hundreds of potential interacting partners for huntingtin protein and many hypotheses as to the molecular mechanisms whereby mutant huntingtin leads to cellular dysfunction and death. However, the multitude of possible interacting partners and cellular pathways affected by mutant huntingtin has complicated efforts to understand the etiology of this disease, and to date no curative therapeutic exists. To address the general problem of identifying the disease-phenotype contributing genes from a large number of correlative studies, here we develop a synthetic lethal screening methodology for the mammalian central nervous system, called SLIC, for synthetic lethal in the central nervous system. Applying SLIC to the study of Huntington’s disease, we identify the age-regulated glutathione peroxidase 6 (Gpx6) gene as a modulator of mutant huntingtin toxicity and show that overexpression of Gpx6 can dramatically alleviate both behavioral and molecular phenotypes associated with a mouse model of Huntington’s disease. SLIC can, in principle, be used in the study of any neurodegenerative disease for which a mouse model exists, promising to reveal modulators of neurodegenerative disease in an unbiased fashion, akin to screens in simpler model organisms.
    Funding: National Institute of Neurological Disorders and Stroke (U.S.) (Award R01NS085880); William N. and Bernice E. Bumpus Foundation (Early Career Investigator Innovation Award); JPB Foundation; European Molecular Biology Organization (Long-term Fellowship)
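    A pooled synthetic lethal screen of this kind reads out differential depletion: a knockdown that is tolerated in wild-type tissue but selectively lost in the disease background flags its target gene as a candidate modulator. The sketch below illustrates only that logic; the counts, pseudocount, normalization, and cut-off are assumptions, not the published SLIC analysis.

```python
import numpy as np

def synthetic_lethal_hits(counts_wt, counts_mut, genes, min_depletion=1.0):
    """Rank shRNAs by selective depletion in the disease model.

    counts_wt / counts_mut: barcode read counts recovered from wild-type and
    mutant (e.g., HD-model) CNS, in the same shRNA order.
    """
    pseudo = 1.0  # avoid log(0) for barcodes that drop out entirely
    # Normalize to library size so sequencing depth doesn't masquerade as depletion.
    f_wt = (counts_wt + pseudo) / (counts_wt + pseudo).sum()
    f_mut = (counts_mut + pseudo) / (counts_mut + pseudo).sum()
    log2_fc = np.log2(f_mut / f_wt)  # negative = depleted in the mutant background
    hits = [(g, lfc) for g, lfc in zip(genes, log2_fc) if lfc <= -min_depletion]
    return sorted(hits, key=lambda x: x[1])

# Hypothetical numbers for illustration only.
genes = ["Gpx6", "ControlA", "ControlB"]
wt = np.array([900.0, 1000.0, 950.0])   # knockdowns tolerated in wild-type CNS
mut = np.array([120.0, 980.0, 1010.0])  # Gpx6 knockdown selectively lost
print(synthetic_lethal_hits(wt, mut, genes))
```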

    Polyharmonic Smoothing Splines and the Multidimensional Wiener Filtering of Fractal-Like Signals

    Motivated by the fractal-like behavior of natural images, we develop a smoothing technique that uses a regularization functional which is a fractional iterate of the Laplacian. This type of functional was initially introduced by Duchon for the approximation of nonuniformly sampled, multidimensional data. He proved that the general solution is a smoothing spline that is represented by a linear combination of radial basis functions (RBFs). Unfortunately, this is tedious to implement for images because of the poor conditioning of RBFs and their lack of decay. Here, we present a much more efficient method for the special case of a uniform grid. The key idea is to express Duchon's solution in a fractional polyharmonic B-spline basis that spans the same space as the RBFs. This allows us to derive an algorithm where the smoothing is performed by filtering in the Fourier domain. Next, we prove that the above smoothing spline can be optimally tuned to provide the MMSE estimation of a fractional Brownian field corrupted by white noise. This is a strong result that not only yields the best linear filter (Wiener solution), but also the optimal interpolation space, which is not bandlimited. It also suggests a way of using the noisy data to identify the optimal parameters (order of the spline and smoothing strength), which yields a fully automatic smoothing procedure. We evaluate the performance of our algorithm by comparing it against an oracle Wiener filter, which requires the knowledge of the true noiseless power spectrum of the signal. We find that our approach performs almost as well as the oracle solution over a wide range of conditions.
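    On a uniform grid, smoothing with a fractional-Laplacian penalty reduces to a pointwise Fourier-domain filter: minimizing ||f - g||^2 + lam * ||(-Delta)^(gamma/2) f||^2 gives F(w) = G(w) / (1 + lam * |w|^(2*gamma)). Here is a minimal sketch of that filter for a 2D image, with lam and gamma fixed by hand rather than identified from the noisy data as in the paper's automatic procedure.

```python
import numpy as np

def polyharmonic_smooth(image, lam=0.5, gamma=1.5):
    """Smooth by penalizing the fractional Laplacian of order `gamma`.

    Solves min_f ||f - image||^2 + lam * ||(-Delta)^(gamma/2) f||^2, which in
    the Fourier domain is the pointwise filter 1 / (1 + lam * |w|^(2*gamma)).
    """
    ny, nx = image.shape
    wy = 2 * np.pi * np.fft.fftfreq(ny)       # angular frequencies per axis
    wx = 2 * np.pi * np.fft.fftfreq(nx)
    w2 = wy[:, None] ** 2 + wx[None, :] ** 2  # |w|^2 on the frequency grid
    filt = 1.0 / (1.0 + lam * w2 ** gamma)    # Wiener-like smoothing filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * filt))

# Illustrative use: denoise a field corrupted by white noise.
rng = np.random.default_rng(1)
noisy = rng.standard_normal((256, 256))
smoothed = polyharmonic_smooth(noisy, lam=1.0, gamma=1.5)
```

    Matching lam and gamma to the spectral decay of the fractional Brownian field and to the noise variance is what yields the MMSE (Wiener) interpretation described in the abstract.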

    Using resource graphs to represent conceptual change

    We introduce resource graphs, a representation of linked ideas used when reasoning about specific contexts in physics. Our model is consistent with previous descriptions of resources and coordination classes. It can represent mesoscopic scales that are neither knowledge-in-pieces nor large-scale concepts. We use resource graphs to describe several forms of conceptual change: incremental, cascade, wholesale, and dual construction. For each form, we give evidence from the physics education research literature to show examples of that form of conceptual change. Where possible, we compare our representation to models used by other researchers. Building on our representation, we introduce a new form of conceptual change, differentiation, and suggest several experimental studies that would help us understand the differences between reform-based curricula.
    Comment: 27 pages, 14 figures, no tables. Submitted for publication to the Physical Review Special Topics Physics Education Research on March 8, 200
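    To make the representation concrete, a resource graph can be sketched as a small directed graph whose nodes are fine-grained reasoning resources and whose edges are the links activated together in a context; forms of conceptual change then become graph edits. The node labels and operations below are our own illustrative assumptions, not the authors' notation.

```python
# A resource graph: nodes are fine-grained reasoning resources,
# edges are the links activated together in a given context.
resource_graph = {
    "closer means stronger": {"force as a push"},
    "force as a push": {"motion needs a cause"},
    "motion needs a cause": set(),
}

def incremental_change(graph, new_resource, linked_to):
    """Incremental conceptual change: add one resource and its links."""
    graph.setdefault(new_resource, set()).update(linked_to)

def differentiation(graph, resource, variants):
    """Differentiation: split one resource into context-specific variants
    that inherit its incoming and outgoing links."""
    out_links = graph.pop(resource)
    for targets in graph.values():
        if resource in targets:          # rewire incoming links to the variants
            targets.discard(resource)
            targets.update(variants)
    for v in variants:
        graph[v] = set(out_links)

incremental_change(resource_graph, "friction opposes motion", ["motion needs a cause"])
differentiation(resource_graph, "force as a push", ["push by contact", "push at a distance"])
```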

    Development of intuitive rules: Evaluating the application of the dual-system framework to understanding children's intuitive reasoning

    This is an author-created version of this article. The original source of publication is Psychon Bull Rev. 2006 Dec;13(6):935-53. The final publication is available at www.springerlink.com. Published version: http://dx.doi.org/10.3758/BF0321390