19 research outputs found

    From Input to Intake: Researching Learner Cognition

    The distinction between input, what teachers say to their learners, and intake, what the learners hear, has been recognised in research into second language acquisition at least since Corder (1967). The distinction is important because language development results not from the input to which learners are exposed but from what the learners take in. If a teacher in a class focussing on the past simple says “Harry and Megan cooked a curry last weekend,” the input the learners are exposed to is “Harry and Megan cooked a curry last weekend.” However, if a learner hears “Harry and Megan cook a curry last weekend,” then this sentence, in all its non-standard grammaticality, is the intake, and the episode is unlikely to contribute to the learner’s command of the past tense. If, for example, researchers wanted to identify the number of instances of the past tense learners need in order to acquire it, research that counted the number of times the teacher used the past tense would only illuminate the acquisition process if there were a systematic relationship between what the teacher says (the input) and what the learner hears (the intake). However, the conceptual distinction between input and intake in second language acquisition (SLA) has been poorly operationalised, and much research treats input as a straightforward proxy for intake. This article explores the relationship between input and intake in order to identify strategies for researching language development that rest on a more solid understanding of the connections between the two.

    Allomorphy in Linguistic Theory: Strong Verbs and Derived Nouns in German

    This thesis investigates properties of German strong verbs and their noun counterparts within a restrictive theory in which related words are derived from a uniform base representation in the lexicon. I compare this approach to a morpholexical approach (Lieber, 1980) in which all stem variants are listed in the lexicon. I assume the general theory of word formation proposed by Allen (1978), in which word formation rules are level-ordered, and a theory of phonology (Michaels, 1980a) in which three components of rules are distinguished: (i) allomorphy rules, (ii) derivational rules, and (iii) phonetic rules. In Chapter 2, preliminary discussions of issues such as level-ordering, zero-derivation, conversion, allomorphy and morpholexical rules are presented. Chapter 3 is an analysis of German strong verbs and noun counterparts in which class designations are lexical primitives. Allomorphy rules sensitive to class membership derive the stem vowels of the principal parts of the verb from a base stem vowel. I argue that the noun counterparts in Class I and Class II are derived from the verbs, based on the fact that a single nominalization rule can predict the properties of all derived nouns in a class. Noun counterparts in the remaining classes are not derived from verbs and have distinct lexical entries. Chapter 4 is an alternative analysis of German strong verbs and noun counterparts. Here, no primitive notion of strong verb or class membership is assumed: a strong verb is one which is marked for allomorphy rules, and all verbs marked for the same allomorphy rules constitute a class. I present data from verbs with separable prefixes which support this alternative analysis. In Chapter 5, I consider data from Old English strong verbs and noun counterparts. My analysis here closely parallels the analysis of the German data in Chapter 4. The Old English data provide further support for my claim that the generalizations are best captured by the more restrictive theory.

    Increase in Ruptured Cerebral Arteriovenous Malformations and Mortality in the United States: Unintended Consequences of the ARUBA Trial?

    Background The findings of the ARUBA (A Randomized Trial of Unruptured Brain Arteriovenous Malformation) trial, which determined that medical management was superior to prophylactic interventional therapy for the treatment of unruptured cerebral arteriovenous malformations (cAVMs), remain polarizing and controversial. Methods Adult cAVM patient admissions were identified in the National Inpatient Sample from 2009 to 2019. The incidence of cAVM rupture and in‐hospital mortality were compared between the pre‐ARUBA (2009–2013) and post‐ARUBA (2014–2019) trial eras using complex samples‐weighted estimates and multivariable logistic regression analyses. A control cohort composed of an alternate pathology (ruptured and unruptured cerebral aneurysms) was also assessed during the study period to evaluate potential bias. Results Among 121 415 hospitalizations for cAVM during the study period, 31 389 (25.9%) were admissions for ruptured malformations. The incidence of ruptured cAVM increased in the post‐ARUBA trial era (13.3% versus 34.4%; P<0.001), as did rates of in‐hospital mortality (2.0% versus 7.6%; P<0.001). Following multivariable regression analysis adjusting for age, illness severity, and acute neurological condition, the post‐ARUBA trial era was independently associated with both cAVM rupture (adjusted odds ratio [OR], 1.99 [95% CI, 1.72–2.29]; P<0.001) and in‐hospital mortality (adjusted OR, 1.94 [95% CI, 1.37–2.75]; P<0.001). Control cohort comparative analysis revealed that rates of hospitalizations for ruptured cerebral aneurysms relative to all aneurysm admissions did not differ before and after 2014 (84.5% versus 84.3%; P=0.185). Conclusion The incidence of ruptured cAVM increased after 2014, potentially reflecting a paradigm shift toward conservative, noninterventional management strategies in patients with unruptured cAVM. Further studies may be necessary to exclude other confounders contributing to this rise.

    Impact of Pre-ictal Antiplatelet Therapy Use in Aneurysmal Subarachnoid Hemorrhage

    OBJECTIVE: There is limited evidence on the use of antiplatelet therapy (APT) to reduce the risk and morbidity of cerebral aneurysm rupture. This analysis retrospectively assessed APT use in patients presenting to our institution with aneurysmal subarachnoid hemorrhage (aSAH). METHODS: We evaluated the records of 186 patients over 7 years of retrospective data from our tertiary care center and an existing database of patients with aSAH. A total of 18 patients on APT and 168 patients not on APT (controls) were identified. Primary outcomes measured were clinical grade (Hunt and Hess score), radiographic grade (Fisher score), and presence of delayed cerebral ischemia (DCI). Secondary outcomes were modified Rankin Scale (mRS) score at discharge and at 3 months. DCI from cerebral vasospasm was defined as the occurrence of focal neurological impairment or a decrease of at least 2 points on the Glasgow Coma Scale. Logistic regression models were generated. RESULTS: APT use did not lead to statistically significant differences in initial presentation, including Hunt and Hess score and Fisher grade (2.91 vs 3.06, p = 0.66, and 3.23 vs 3.22, p = 0.96, respectively). In addition, APT use was not associated with increased rates of DCI (OR 0.27, p = 0.12). Our analysis showed that increased Hunt and Hess score and the presence of DCI were both associated with increased mRS at 90 days (OR 2.32, p < 0.001; OR 2.91, p = 0.002). CONCLUSION: The patients in this retrospective observational study did not demonstrate worse outcomes from their aSAH despite APT use. Larger prospective studies should be performed to see whether this relationship holds and whether decreased rates of DCI can be observed.

    Relation Between Brain Natriuretic Peptide and Delayed Cerebral Ischemia in Patients with Aneurysmal Subarachnoid Hemorrhage

    BACKGROUND: Brain natriuretic peptide (BNP), often used to evaluate the degree of heart failure, has been implicated in fluid dysregulation and inflammation in critically ill patients. Twenty to thirty percent of patients with aneurysmal subarachnoid hemorrhage (aSAH) will develop some degree of neurogenic stress cardiomyopathy (NSC) and, in turn, elevated BNP levels. We sought to explore the association between BNP levels and the development of delayed cerebral ischemia (DCI) in patients with aSAH. METHODS: We retrospectively evaluated the records of 149 patients admitted to the Neurological Intensive Care Unit between 2006 and 2015 and enrolled in an existing prospectively maintained aSAH database. Demographic data, treatment and outcomes, and BNP levels at admission and throughout the hospital stay were noted. RESULTS: Of the 149 patients included in the analysis, 79 developed DCI during their hospital course. We found a statistically significant association between DCI and the highest recorded BNP (OR 1.001, 95% CI 1.001–1.002, p = 0.002). ROC curve analysis for DCI based on BNP showed that the highest BNP level during the hospital stay (AUC 0.78) was a stronger predictor of DCI than the change in BNP over time (AUC 0.776) or the admission BNP (AUC 0.632). CONCLUSION: Our study shows that DCI is associated not only with higher baseline BNP values (admission BNP) but also with the highest BNP level attained during the hospital course and the rapidity of the increase in BNP over time. Prospective studies are needed to evaluate whether routine measurement of BNP may help identify SAH patients at high risk of DCI.
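    The predictor comparison described in this abstract rests on the ROC AUC, which for a binary outcome reduces to the probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch of that comparison follows; the `auc` helper and the toy BNP values are illustrative assumptions, not the study's data or code.

```python
# Illustrative sketch only: rank-based AUC of competing predictors of a
# binary outcome (here, DCI). All values below are hypothetical.

def auc(labels, scores):
    """Probability that a randomly chosen positive case outscores a
    randomly chosen negative case (ties count as half a win)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy cohort: 1 = developed DCI. A predictor that separates the groups
# perfectly scores AUC 1.0; an uninformative one hovers near 0.5.
dci = [0, 0, 0, 1, 1, 1]
peak_bnp = [120, 180, 240, 310, 450, 520]       # tracks DCI closely
admission_bnp = [120, 310, 240, 180, 450, 200]  # weaker separation

print(auc(dci, peak_bnp))       # 1.0
print(auc(dci, admission_bnp))  # lower, reflecting weaker separation
```

    Comparing AUCs this way, as the abstract does (0.78 vs. 0.776 vs. 0.632), ranks predictors by discrimination alone; it says nothing about calibration or clinical thresholds.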

    Acute Respiratory Distress Syndrome in Patients With Subarachnoid Hemorrhage: Incidence, Predictive Factors, and Impact on Mortality

    INTRODUCTION: Acute respiratory distress syndrome (ARDS) is a known predictor of poor outcomes in critically ill patients. We sought to examine the role ARDS plays in outcomes of subarachnoid hemorrhage (SAH) patients. Prior studies investigating the incidence of ARDS in SAH patients did not control for SAH severity. Hence, we sought to determine the incidence of ARDS in patients diagnosed with aneurysmal SAH and to investigate the predisposing risk factors and impact on outcomes. METHODS: A retrospective cohort study was conducted using the National Inpatient Sample (NIS) database for the years 2008 to 2014. Multivariate stepwise regression analysis was performed to identify the risk factors and outcomes associated with developing ARDS in the setting of SAH. RESULTS: We identified 170,869 patients with non-traumatic subarachnoid hemorrhage, of whom 6962 were diagnosed with ARDS; of those, 4829 required mechanical ventilation. ARDS developed more frequently in high-grade SAH patients (1.97 ± 0.05 vs. 1.15 ± 0.01; p < 0.0001). Neurologic predictors of ARDS included cerebral edema (OR 1.892, CI 1.180–3.034, p = 0.0035); medical predictors included cardiac arrest (OR 4.642, CI 2.273–9.482, p < 0.0001) and cardiogenic shock (OR 2.984, CI 1.157–7.696, p = 0.0239). ARDS was associated with significantly worse outcomes (15.5% vs. 52.9% discharged home, 63.0% vs. 40.8% discharged to a rehabilitation facility, and 21.5% vs. 6.3% in-hospital mortality). CONCLUSION: Patients with SAH who developed ARDS were less likely to be discharged home, more likely to need rehabilitation, and had a significantly higher risk of mortality. The identification of risk factors contributing to ARDS is helpful for improving outcomes and resource utilization.

    Demographics and Outcomes of Interhospital Neurosurgical Transfer Patients Undergoing Spine Surgery

    OBJECTIVE: Interhospital patient transfer (IHT) is common and accounts for a significant portion of health care costs, yet the variables driving neurosurgical IHT have not been systematically described. We analyzed variables that distinguished spine surgery patients who underwent IHT from those who did not, to report on the effect of frailty on IHT. METHODS: A retrospective chart review was performed to collect data on consecutive patients undergoing spinal procedures during 2015–2017. IHT patients were identified and compared with non-interhospital transfer (n-IHT) patients to identify factors distinguishing the 2 groups using multivariate regression analysis. Studied variables included case complexity, frailty (modified frailty index), age, insurance status, and baseline demographic variables. Postoperative outcomes affected by transfer status were identified in binary regression analysis. RESULTS: During 2015–2017, there were 595 n-IHT and 76 IHT spine surgery patients (N = 671). Increased frailty (modified frailty index ≥3; odds ratio = 2.4, P = 0.01) and increased spine surgery complexity (spine surgery complexity score ≥2; odds ratio = 2.57, P = 0.002) were independent risk factors associated with IHT. IHT was an independent risk factor for increased hospital length of stay and increased postoperative complications (Clavien-Dindo scale; P < 0.001). CONCLUSIONS: IHT patients comprise a more frail and surgically complex spine population compared with n-IHT patients. IHT was also an independent risk factor for increased complications and length of stay after spine surgery. Patients' insurance status and age did not distinguish between IHT and n-IHT groups. This is the first report in any specialty to demonstrate that increasing frailty is associated with IHT.