
    Anomalous morphology in left hemisphere motor and premotor cortex of children who stutter

    Stuttering is a neurodevelopmental disorder that affects the smooth flow of speech production. Stuttering onset occurs during a dynamic period of development when children first start learning to formulate sentences. Although most children grow out of stuttering naturally, ∼1% of all children develop persistent stuttering that can lead to significant psychosocial consequences throughout one’s life. To date, few studies have examined the neural bases of stuttering in children who stutter, and even fewer have examined the basis for natural recovery versus persistence of stuttering. Here we report the first study to conduct surface-based analysis of brain morphometric measures in children who stutter. We used FreeSurfer to extract cortical size and shape measures from structural MRI scans collected during the initial year of a longitudinal study involving 70 children (36 stuttering, 34 controls) in the 3–10-year range. The stuttering group was further divided into persistent and recovered groups, based on later longitudinal visits that allowed determination of their eventual clinical outcome. A region of interest analysis that focused on the left hemisphere speech network and a whole-brain exploratory analysis were conducted to examine group differences and group × age interaction effects. We found that the persistent group could be differentiated from the control and recovered groups by reduced cortical thickness in left motor and lateral premotor cortical regions. The recovered group showed an age-related decrease in local gyrification in the left medial premotor cortex (supplementary motor area and pre-supplementary motor area). These results provide strong evidence of a primary deficit in the left hemisphere speech network, specifically involving lateral premotor cortex and primary motor cortex, in persistent developmental stuttering. Results further point to a possible compensatory mechanism involving left medial premotor cortex in those who recover from childhood stuttering. This study was supported by Award Numbers R01DC011277 (SC) and R01DC007683 (FG) from the National Institute on Deafness and Other Communication Disorders (NIDCD). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIDCD or the National Institutes of Health. (R01DC011277 - National Institute on Deafness and Other Communication Disorders (NIDCD); R01DC007683 - National Institute on Deafness and Other Communication Disorders (NIDCD)) Accepted manuscript
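    The group × age interaction analysis described above is, in statistical terms, a linear model with group, age, and their interaction as predictors of each morphometric measure. The following is a minimal sketch of that kind of model, assuming FreeSurfer thickness values have already been exported to a per-child table; the column names and numbers are illustrative placeholders, not data from the study.

        # Sketch of a group x age interaction test on a cortical thickness
        # measure; the table layout and values are illustrative only.
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical table: one row per child, with group membership
        # (control / persistent / recovered), age in years, and mean
        # thickness (mm) of one left-hemisphere ROI.
        df = pd.DataFrame({
            "group": ["control"] * 4 + ["persistent"] * 4 + ["recovered"] * 4,
            "age": [4, 6, 8, 10] * 3,
            "thickness": [2.9, 2.8, 2.7, 2.6,
                          2.7, 2.6, 2.5, 2.4,
                          2.9, 2.8, 2.6, 2.5],
        })

        # Ordinary least squares with main effects and a group x age
        # interaction term; the interaction coefficients test whether the
        # age trajectories differ between groups.
        model = smf.ols("thickness ~ C(group) * age", data=df).fit()
        print(model.summary())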

    Asymptotic Neutronic Solutions for Fast Burst Reactor Design

    Deterministic numerical methodologies for solving time-eigenvalue problems are valuable in characterizing the inherent rapid transient neutron behavior of a Fast Burst Reactor (FBR). New nonlinear solution techniques for eigenvalue problems show great promise in modeling reactor neutronics. This research uses nonlinear solution techniques to solve for the dominant time-eigenvalue associated with the asymptotic (exponential) solution to the neutron diffusion equation and to the even-parity form of the neutron transport equation, and lays the foundation for coupling with other physics phenomena associated with FBRs. High security costs and proliferation risks associated with Highly Enriched Uranium (HEU) fueled FBRs are the motivation for this research. Use of Low Enriched Uranium (LEU) as fuel reduces these risks to acceptable levels; however, LEU fuel introduces complexities such as increased volume and longer neutron lifetimes. Numerical techniques are sought to explore these complexities and determine the limitations and potential of an LEU fueled FBR. A combination of deterministic and stochastic computational modeling techniques is used to investigate the effects these complexities have on reactor design and performance. The Monte Carlo N-Particle (MCNP) code is used to determine criticality and to calculate reactor kinetics parameters of current and proposed designs. New deterministic methods are developed to directly calculate the fundamental time-eigenvalue in a way that supports multi-physics coupling. The methods incorporate Jacobian-Free Newton-Krylov solution techniques to address the nonlinear nature of the neutronics equations. These new deterministic models produce data to identify LEU designs that may meet the performance requirements of proven HEU FBRs in terms of neutron burst yield and burst duration (pulse width), based on the Nordheim-Fuchs model. These computational data and the measured performance characteristics of historical LEU FBRs show that LEU designs can generate pulses suitable for meeting Research and Development (R&D) requirements. These modern computational neutronic results indicate that an LEU fueled FBR is a plausible alternative to current HEU fueled reactors.
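    The burst yield and pulse width figures of merit mentioned above are commonly estimated with the Nordheim-Fuchs point-kinetics model. The sketch below implements only that textbook model, not the Jacobian-Free Newton-Krylov eigenvalue solver developed in this work, and every parameter value is an illustrative placeholder.

        # Fuchs-Nordheim estimate of burst characteristics for a prompt
        # supercritical excursion with energy (temperature) feedback.
        def fuchs_nordheim(rho_prompt, gen_time, gamma):
            """rho_prompt: reactivity above prompt critical (dk/k)
            gen_time:   prompt neutron generation time (s)
            gamma:      energy feedback coefficient (dk/k per MJ)"""
            alpha0 = rho_prompt / gen_time              # initial inverse period (1/s)
            yield_mj = 2.0 * rho_prompt / gamma         # total burst energy (MJ)
            peak_mw = rho_prompt**2 / (2.0 * gen_time * gamma)  # peak power (MW)
            fwhm_s = 3.52 / alpha0                      # pulse width at half maximum (s)
            return alpha0, yield_mj, peak_mw, fwhm_s

        # Illustrative numbers: roughly 0.1 $ above prompt critical for a
        # fast metal assembly with a ~10 ns generation time.
        print(fuchs_nordheim(rho_prompt=6.5e-4, gen_time=1.0e-8, gamma=3.0e-6))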

    Doctor of Philosophy

    The method of moments, in conjunction with the maximum entropy method of reconstructing density distributions, is applied to the energy dependent neutron diffusion equation to solve for the neutron flux within a critical assembly. The energy dependent neutron diffusion equation (EDNDE) is converted into a moment equation, which is solved analytically in the radial direction for a bare spherical critical assembly of pure 235U. The normalized energy dependent neutron diffusion moments (NEDNDM) generated analytically are verified against the NEDNDM calculated by Monte Carlo N-Particle 5 version 1.40 (MCNP5) and Attila-7.1.0-beta (Attila), and validated against the bare spherical critical assembly experiment GODIVA. The NEDNDM are then used in the maximum entropy method to solve for the neutron flux within the two critical assemblies (100% 235U and GODIVA); the resulting flux is verified with MCNP5 and Attila and validated with GODIVA. The analytic NEDNDM values fall between the NEDNDM from MCNP5 (lower bound) and Attila (upper bound). With the error taken relative to the Monte Carlo simulation, the error range is 0% to 14%. The error range of the NEDNDM compared to the NEDNDM from GODIVA is 0% to 24%. The verification and validation error of the maximum entropy method is 12% to 25%, with MCNP5 taken as the comparison standard. The error range of the reconstructed flux validated with GODIVA is 0% to 10%. The error range of the neutron flux spectrum from MCNP5 compared to GODIVA is 0% to 20%, and the Attila error range compared to GODIVA is 0% to 35%. The method of moments coupled with the maximum entropy method for reconstructing flux is shown to be a fast, reliable method compared with either Monte Carlo methods (MCNP5) or 30-energy-group methods (Attila), and with GODIVA, the bare sphere critical assembly experiment.
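    The maximum entropy step described here amounts to finding a density of exponential-polynomial form whose low-order moments match the given ones. The following is a minimal sketch of that reconstruction on a normalized radial coordinate, using a generic least-squares moment match; the target profile, number of moments, and setup are illustrative and are not the dissertation's EDNDE-specific formulation.

        # Reconstruct a density exp(lam0 + lam1*x + ...) whose moments
        # match a few prescribed radial moments (maximum entropy form).
        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.optimize import least_squares

        # Work in the normalized radius x = r/R so the moment problem is
        # well conditioned.
        x = np.linspace(0.0, 1.0, 2001)

        # Illustrative "true" fundamental-mode shape sin(pi x)/(pi x),
        # normalized to unit integral so the zeroth moment is 1.
        true = np.sinc(x)
        true /= trapezoid(true, x)

        n_mom = 4                     # match moments 0..3
        mu = np.array([trapezoid(x**k * true, x) for k in range(n_mom)])

        def residuals(lam):
            # Candidate maximum-entropy density exp(polynomial in x).
            f = np.exp(np.polynomial.polynomial.polyval(x, lam))
            return np.array([trapezoid(x**k * f, x) for k in range(n_mom)]) - mu

        sol = least_squares(residuals, x0=np.zeros(n_mom))
        recon = np.exp(np.polynomial.polynomial.polyval(x, sol.x))
        print("max pointwise reconstruction error:", np.max(np.abs(recon - true)))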

    Predictive Modeling of Partitioned Systems: Implementation and Applications

    A general mathematical methodology for predictive modeling of coupled multi-physics systems is implemented and applied, without change, to an illustrative heat conduction example and to reactor physics benchmarks.
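    The abstract does not spell out the formulas, but predictive modeling methodologies of this kind generally combine computed responses and measurements through their covariances to obtain best-estimate parameters with reduced uncertainty. The sketch below shows only that generic linear-Gaussian update; the sensitivity matrix, covariances, and data are placeholders, not taken from the paper.

        # Generic best-estimate (Kalman-type) update combining computed and
        # measured responses; all numbers are illustrative.
        import numpy as np

        S = np.array([[1.0, 0.5],        # response sensitivities dR/dalpha
                      [0.2, 1.3]])
        C_a = np.diag([0.04, 0.09])      # prior parameter covariance
        C_m = np.diag([0.01, 0.01])      # measurement covariance
        alpha0 = np.array([1.0, 2.0])    # nominal parameters
        r_comp = np.array([2.0, 2.8])    # computed responses at alpha0
        r_meas = np.array([2.1, 2.7])    # measured responses

        # Gain, adjusted parameters, and reduced parameter covariance.
        K = C_a @ S.T @ np.linalg.inv(S @ C_a @ S.T + C_m)
        alpha_best = alpha0 + K @ (r_meas - r_comp)
        C_best = C_a - K @ S @ C_a
        print("best-estimate parameters:", alpha_best)
        print("reduced parameter covariance:\n", C_best)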

    Biomedical Research Group, Health Division annual report 1954


    Alcayota Gum Films: Experimental Reviews

    Polysaccharides obtained from plants have been investigated for the development of edible/biodegradable, non-petrochemical-based packaging materials. Alcayota (Cucurbita ficifolia) is the fruit of a creeping plant, like watermelon and melon. After separating the pulp from the husk and seeds, it is dried and ground to obtain a flour. The flour is then hydrolyzed; alkaline hydrolysis releases the alcayota gum, which is purified with hydroalcoholic solutions. Films were prepared from water solutions of the hydrolyzed alcayota gum (AlcOH). The properties of the AlcOH films are due mainly to their strong hydrophilicity. To improve water resistance, the films were modified with glutaraldehyde (Glu) in order to make them water insoluble. The crosslinked films provide low water vapor permeability (WVP) and high mechanical properties, expressed as elastic modulus. X-ray diffraction showed an amorphous structure and a shift to lower d-spacing, i.e., shorter distances between the polysaccharide chains. These crosslinked membranes exhibit excellent water resistance and low O2 permeation, which make them very useful as biodegradable polymers and films. Fil: Zanon, Marisa. Universidad Nacional de San Luis; Argentina. Fil: Masuelli, Martin Alberto. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico Conicet - San Luis. Instituto de Física Aplicada "Dr. Jorge Andrés Zgrablich". Universidad Nacional de San Luis. Facultad de Ciencias Físico Matemáticas y Naturales. Instituto de Física Aplicada "Dr. Jorge Andrés Zgrablich"; Argentina.
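    As a small illustration of the water vapor permeability figure of merit reported above: WVP from a gravimetric cup test is commonly computed as (dm/dt)·L/(A·Δp). The numbers below are placeholders for illustration only, not data from the paper.

        # Illustrative WVP calculation from a gravimetric cup test.
        slope = 2.5e-7        # mass gain rate dm/dt (g/s), from a linear fit
        thickness = 80e-6     # film thickness L (m)
        area = 1.0e-3         # exposed film area A (m^2)
        dp = 1753.0           # water vapor partial pressure difference (Pa)

        wvp = slope * thickness / (area * dp)   # g / (m s Pa)
        print(f"WVP = {wvp:.3e} g/(m s Pa)")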

    Gone in Sixty Milliseconds: Trademark Law and Cognitive Science

    Trademark dilution is a cause of action for interfering with the uniqueness of a trademark. For example, consumers would probably not think that Kodak soap was produced by the makers of Kodak cameras, but its presence in the market would diminish the uniqueness of the original Kodak mark. Trademark owners think dilution is harmful but have had difficulty explaining why. Many courts have therefore been reluctant to enforce dilution laws, even while legislatures have enacted more of them over the past half century. Courts and commentators have now begun to use psychological theories, drawing on associationist models of cognition, to explain how a trademark can be harmed by the existence of similar marks even when consumers can readily distinguish the marks from one another and thus are not confused. Though the cognitive theory of dilution is internally consistent and appeals to the authority of science, it does not rest on sufficient empirical evidence to justify its adoption. Moreover, the harms it identifies do not generally come from commercial competitors but from free speech about trademarked products. As a result, even a limited dilution law should be held unconstitutional under current First Amendment commercial-speech doctrine. In the absence of constitutional invalidation, the cognitive explanation of dilution is likely to change the law for the worse. Rather than working like fingerprint evidence, which ideally produces more evidence about already-defined crimes, psychological explanations of dilution are more like economic theories in antitrust, which changed the definition of actionable restraints of trade. Given the empirical and normative flaws in the cognitive theory, using it to fill dilution's theoretical vacuum would be a mistake.