
    The development of sentence-interpretation strategies in monolingual German-learning children with and without specific language impairment

    Previous research on sentence comprehension in German-learning children has concentrated on the role of case marking and word order in typically developing children. This paper compares the performance of German-learning children with language impairment (aged 4-6 years) and without language impairment (aged 2-6 and 8-9 years) in two experiments that systematically vary the cues animacy, case marking, word order, and subject-verb agreement. The two experiments differ with regard to the choice of case marking: in the first it is distinct, but in the second it is neutralized. The theoretical framework is the competition model developed by Bates and MacWhinney and their collaborators, a variant of the parallel distributed processing models. It is hypothesized that children in both populations first appreciate the cue animacy, which can be processed locally, that is, "on the spot," before they turn to more distributed cues, leading ultimately to subject-verb agreement, which presupposes the comparison of various constituents before an interpretation can be established. Thus agreement is more "costly" in processing than animacy or the (more) local cue initial NP. Experiment I, with unambiguous case markers, shows that the typically developing children proceed from animacy to the nominative (predominantly in coalition with the initial NP) to agreement, while in the second experiment, with ambiguous case markers, these children turn from animacy to the initial NP and then to agreement. The impaired children also progress from local to distributed cues. Yet, in contrast to the control group, they do not acknowledge the nominative in coalition with the initial NP in the first experiment but only in support of agreement. However, although they do not seem to appreciate distinct case markers to any large extent in the first experiment, they are irritated if such distinctions are lacking: in Experiment II all impaired children turn to animacy (some in coalition with the initial NP and/or particular word orders). In the discussion, the relationship between short-term memory and processing, as well as the relationship between production and comprehension of case markers and agreement, are addressed. Further research is needed to explore in more detail the "cue costs" of sentence comprehension.

    Three LINERs Under the Chandra X-Ray Microscope

    We use Chandra X-ray observations of three galaxies hosting LINERs (NGC 404, NGC 4736, NGC 4579) to study their power sources. We find very diverse properties within this small group: NGC 404 has an X-ray-faint nucleus with a soft, thermal spectrum; NGC 4736 harbors a plethora of discrete X-ray sources in and around its nucleus; and NGC 4579 has a dominant nuclear point source embedded in a very extended, diffuse nebulosity. From their multiwavelength properties we conclude the following: the nucleus of NGC 404 is the site of a weak, compact starburst whose X-ray emission is due to gas heated by stellar winds and supernovae; NGC 4736 is in a recent or aging starburst phase, where the X-ray emission is dominated by a dense cluster of X-ray binaries; and NGC 4579 is powered by accretion onto a supermassive black hole. We detect 39 discrete sources in NGC 4736 and 21 in NGC 4579, most with L_X > 10^37 erg/s. One source in the disk of NGC 4579 could be an ultraluminous X-ray binary with L_X (2-10 keV) = 9×10^39 erg/s, but it could also be a background quasar. The most luminous discrete sources have simple power-law spectra, which along with their luminosities suggest that these are X-ray binaries accreting near or above the Eddington rate for a neutron star. By comparing the luminosity functions of discrete X-ray sources in these and other galaxies, we find a potential connection between the age of the stellar population and the slope of the cumulative X-ray source luminosity function: galaxies with primarily old stellar populations have steeper luminosity functions than starburst galaxies. We suggest that this difference results from the contribution of high-mass X-ray binaries from the young stellar population to the upper end of the luminosity function.
    Comment: To appear in the Astrophysical Journal. Enlarged views of images available at http://www.astro.psu.edu/users/mce/preprints/preprint_index.htm
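The statement that the brightest sources accrete "near or above the Eddington rate for a neutron star" rests on the standard Eddington luminosity, L_Edd = 4πGM m_p c / σ_T ≈ 1.26×10^38 (M/M_sun) erg/s. A minimal sketch of the comparison in CGS units (the function name and variable names are ours, not from the paper):

```python
import math

# CGS physical constants
G = 6.674e-8        # gravitational constant, cm^3 g^-1 s^-2
c = 2.998e10        # speed of light, cm/s
m_p = 1.673e-24     # proton mass, g
sigma_T = 6.652e-25 # Thomson cross section, cm^2
M_sun = 1.989e33    # solar mass, g

def eddington_luminosity(mass_msun):
    """Eddington luminosity (erg/s) for electron-scattering opacity:
    L_Edd = 4*pi*G*M*m_p*c / sigma_T."""
    return 4 * math.pi * G * (mass_msun * M_sun) * m_p * c / sigma_T

L_edd_ns = eddington_luminosity(1.4)   # canonical neutron-star mass
L_x = 9e39                             # the NGC 4579 disk source, 2-10 keV
print(f"L_Edd(1.4 Msun) = {L_edd_ns:.2e} erg/s")   # ~1.76e38 erg/s
print(f"L_X / L_Edd     = {L_x / L_edd_ns:.0f}x")  # ~51x
```

A source at roughly 50 times the neutron-star Eddington limit is exactly why the abstract flags the L_X = 9×10^39 erg/s object as either an ultraluminous X-ray binary or a background quasar.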

    Expanding the clinical and genetic spectrum of ALPK3 variants: Phenotypes identified in pediatric cardiomyopathy patients and adults with heterozygous variants

    Introduction: Biallelic damaging variants in ALPK3, encoding alpha-protein kinase 3, cause pediatric-onset cardiomyopathy with manifestations that are incompletely defined. Methods and Results: We analyzed clinical manifestations of damaging biallelic ALPK3 variants in 19 pediatric patients, including nine previously published cases. Among these, 11 loss-of-function (LoF) variants, seven compound LoF and deleterious missense variants, and one homozygous deleterious missense variant were identified. Among 18 live-born patients, 8 exhibited neonatal dilated cardiomyopathy (DCM; 44.4%; 95% CI: 21.5%-69.2%) that subsequently transitioned into ventricular hypertrophy. The majority of patients had extracardiac phenotypes, including contractures, scoliosis, cleft palate, and facial dysmorphisms. We observed no association between variant type or location, disease severity, and/or extracardiac manifestations. Myocardial histopathology showed focal cardiomyocyte hypertrophy, subendocardial fibroelastosis in patients under 4 years of age, and myofibrillar disarray in adults. Rare heterozygous ALPK3 variants were also assessed in adult-onset cardiomyopathy patients. Among 1548 Dutch patients referred for initial genetic analyses, we identified 39 individuals with rare heterozygous ALPK3 variants (2.5%; 95% CI: 1.8%-3.4%), including 26 missense and 10 LoF variants. Among 149 U.S. patients without pathogenic variants in 83 cardiomyopathy-related genes, we identified six missense and nine LoF ALPK3 variants (10.1%; 95% CI: 5.7%-16.1%). LoF ALPK3 variants were enriched in comparison with matched controls (Dutch cohort, P = 1.6×10^-5; U.S. cohort, P = 2.2×10^-13). Conclusion: Biallelic damaging ALPK3 variants cause pediatric cardiomyopathy manifested by DCM transitioning to hypertrophy, often with poor contractile function. Additional extracardiac features occur in most patients, including musculoskeletal abnormalities and cleft palate. Heterozygous LoF ALPK3 variants are enriched in adults with cardiomyopathy and may contribute to their disease. Adults with ALPK3 LoF variants therefore warrant evaluation for cardiomyopathy.
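The quoted percentages with 95% CIs (e.g. 8/18 = 44.4%, CI 21.5%-69.2%; 39/1548 = 2.5%, CI 1.8%-3.4%) are standard binomial proportion intervals. A self-contained sketch using the closed-form Wilson score interval; note that for small counts such as 8/18 the paper's bounds match an exact (Clopper-Pearson) interval, which is somewhat wider than Wilson, while for the larger cohort the two agree to the quoted precision:

```python
import math
from statistics import NormalDist

def wilson_ci(k, n, conf=0.95):
    """Wilson score 95% interval for a binomial proportion k/n."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)   # ~1.96 for conf=0.95
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_ci(8, 18)       # neonatal DCM among live-born patients
print(f"8/18 = {8/18:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
lo2, hi2 = wilson_ci(39, 1548)  # rare heterozygous variants, Dutch cohort
print(f"39/1548 = {39/1548:.1%}, 95% CI ({lo2:.1%}, {hi2:.1%})")  # (1.8%, 3.4%)
```

For the Dutch cohort the sketch reproduces the abstract's 1.8%-3.4% bounds exactly, illustrating that at n = 1548 the choice of interval method barely matters.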

    Second Language Processing Shows Increased Native-Like Neural Responses after Months of No Exposure

    Although learning a second language (L2) as an adult is notoriously difficult, research has shown that adults can indeed attain native language-like brain processing and high proficiency levels. However, it is important to then retain what has been attained, even in the absence of continued exposure to the L2—particularly since periods of minimal or no L2 exposure are common. This event-related potential (ERP) study of an artificial language tested performance and neural processing following a substantial period of no exposure. Adults learned to speak and comprehend the artificial language to high proficiency with either explicit, classroom-like, or implicit, immersion-like training, and then underwent several months of no exposure to the language. Surprisingly, proficiency did not decrease during this delay. Instead, it remained unchanged, and there was an increase in native-like neural processing of syntax, as evidenced by several ERP changes—including earlier, more reliable, and more left-lateralized anterior negativities, and more robust P600s, in response to word-order violations. Moreover, both the explicitly and implicitly trained groups showed increased native-like ERP patterns over the delay, indicating that such changes can hold independently of L2 training type. The results demonstrate that substantial periods with no L2 exposure are not necessarily detrimental. Rather, benefits may ensue from such periods of time even when there is no L2 exposure. Interestingly, both before and after the delay the implicitly trained group showed more native-like processing than the explicitly trained group, indicating that type of training also affects the attainment of native-like processing in the brain. Overall, the findings may be largely explained by a combination of forgetting and consolidation in declarative and procedural memory, on which L2 grammar learning appears to depend. 
The study has a range of implications and suggests a research program with potentially important consequences for second language acquisition and related fields.

    Linking Hydrodynamic Complexity to Delta Smelt (Hypomesus transpacificus) Distribution in the San Francisco Estuary, USA

    doi: http://dx.doi.org/10.15447/sfews.2016v14iss1art3
    Long-term fish sampling data from the San Francisco Estuary were combined with detailed three-dimensional hydrodynamic modeling to investigate the relationship between historical fish catch and hydrodynamic complexity. Delta Smelt catch data at 45 stations from the Fall Midwater Trawl (FMWT) survey in the vicinity of Suisun Bay were used to develop a quantitative catch-based station index. This index was used to rank stations based on historical Delta Smelt catch. The correlations between historical Delta Smelt catch and 35 quantitative metrics of environmental complexity were evaluated at each station. Eight metrics of environmental conditions were derived from FMWT data and 27 metrics were derived from model predictions at each FMWT station. To relate the station index to conceptual models of Delta Smelt habitat, the metrics were used to predict the station ranking based on the quantified environmental conditions. Salinity, current speed, and turbidity metrics were used to predict the relative ranking of each station for Delta Smelt catch. Including a measure of the current speed at each station improved predictions of the historical ranking for Delta Smelt catch relative to similar predictions made using only salinity and turbidity. Current speed was also found to be a better predictor of historical Delta Smelt catch than water depth. The quantitative approach developed using the FMWT data was validated using the Delta Smelt catch data from the San Francisco Bay Study. Complexity metrics in Suisun Bay were evaluated during 2010 and 2011. This analysis indicated that a key to historical Delta Smelt catch is the overlap of low salinity, low maximum velocity, and low Secchi depth regions. This overlap occurred in Suisun Bay during 2011, and may have contributed to higher Delta Smelt abundance in 2011 than in 2010, when the favorable ranges of the metrics did not overlap in Suisun Bay.
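The study's core step, correlating a catch-based station ranking against ranked environmental metrics, amounts to a rank correlation. A minimal sketch with invented station values (these numbers are illustrative, not FMWT data):

```python
def ranks(xs):
    """Rank values 1..n, averaging ranks within tie groups."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of equal values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average 1-based rank for the group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical station index vs. a maximum-current-speed metric at 5 stations:
# catch rank decreases monotonically as maximum speed increases.
catch_index = [0.9, 0.4, 0.7, 0.1, 0.3]
max_speed   = [0.2, 0.6, 0.3, 0.9, 0.8]
print(f"rho = {spearman(catch_index, max_speed):+.2f}")  # rho = -1.00
```

A strongly negative rho in this toy setup mirrors the paper's finding that low maximum velocity is associated with high historical Delta Smelt catch.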

    The Apoe-/- Mouse PhysioLab® Platform: A Validated Physiologically-based Mathematical Model of Atherosclerotic Plaque Progression in the Apoe-/- Mouse

    Motivation: Atherosclerosis is a complex multi-pathway inflammatory disease where accumulation of oxidatively modified lipids and leukocytes in the arterial intima leads to plaque formation over time. Translating Apoe-/- mouse results to the clinical setting is complicated by uncertainty around (a) mechanisms underlying disease etiology, (b) the relative importance of these mechanisms as drivers of progression, and (c) how these roles change in response to perturbation by therapeutic intervention or lifestyle changes.
    Results: We describe a large-scale mechanistic, mathematical model of atherosclerosis in the Apoe-/- mouse and its validation with in vivo Apoe-/- data. Major physiological components include cholesterol/macrophage trafficking, inflammation, endothelial function, oxidative stress, and thrombosis. Heterogeneity in disease progression, observed despite genetic uniformity and experimentally controlled conditions, was captured through "virtual mice". This model may be used to optimize in vivo experiments and paves the way for a similar modeling approach for human disease.
    Availability: The model is available by remote desktop client at Apoe.entelos.com
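The "virtual mice" idea, capturing cohort heterogeneity by sampling model parameters rather than fixing them, can be illustrated with a deliberately toy growth model. This is emphatically not the PhysioLab platform's model; the equation, parameter ranges, and names below are invented for illustration only:

```python
import random

def plaque_trajectory(growth_rate, capacity, t_end=30.0, dt=0.1):
    """Euler-integrate a toy logistic plaque-burden ODE:
    dP/dt = r * P * (1 - P/K), starting from a small seed burden."""
    p = 0.01                        # initial plaque burden (arbitrary units)
    for _ in range(int(t_end / dt)):
        p += dt * growth_rate * p * (1 - p / capacity)
    return p

random.seed(7)
# Each "virtual mouse" draws its own parameters from plausible ranges,
# producing between-animal heterogeneity despite identical equations.
cohort = [
    plaque_trajectory(random.uniform(0.1, 0.4),   # growth rate r
                      random.uniform(0.5, 1.0))   # carrying capacity K
    for _ in range(5)
]
print([round(p, 3) for p in cohort])
```

The spread of endpoints across the cohort is the toy analogue of the heterogeneity the platform reproduces in genetically uniform Apoe-/- mice.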

    Advancing estuarine ecological forecasts: seasonal hypoxia in Chesapeake Bay

    Ecological forecasts are quantitative tools that can guide ecosystem management. The coemergence of extensive environmental monitoring and quantitative frameworks allows for widespread development and continued improvement of ecological forecasting systems. We use a relatively simple estuarine hypoxia model to demonstrate advances in addressing some of the most critical challenges and opportunities of contemporary ecological forecasting, including predictive accuracy, uncertainty characterization, and management relevance. We explore the impacts of different combinations of forecast metrics, drivers, and driver time windows on predictive performance. We also incorporate multiple sets of state‐variable observations from different sources and separately quantify model prediction error and measurement uncertainty through a flexible Bayesian hierarchical framework. Results illustrate the benefits of (1) adopting forecast metrics and drivers that strike an optimal balance between predictability and relevance to management, (2) incorporating multiple data sources in the calibration data set to separate and propagate different sources of uncertainty, and (3) using the model in scenario mode to probabilistically evaluate the effects of alternative management decisions on future ecosystem state. In the Chesapeake Bay, the subject of this case study, we find that average summer or total annual hypoxia metrics are more predictable than monthly metrics and that measurement error represents an important source of uncertainty. Application of the model in scenario mode suggests that absent watershed management actions over the past decades, long‐term average hypoxia would have increased by 7% compared to 1985. 
Conversely, the model projects that if management goals currently in place to restore the Bay are met, long-term average hypoxia would eventually decrease by 32% with respect to the mid-1980s.