    Deepwater Horizon oil spill exposure and child health: a longitudinal analysis

    The BP Deepwater Horizon oil spill (DHOS) created widespread concern about threats to health among residents of the Louisiana Gulf Coast. This study uses data from the Resilient Children, Youth, and Communities study—a longitudinal cohort survey of households with children in DHOS-affected areas of South Louisiana—to consider the effect of DHOS exposure on the health trajectories of children, an especially vulnerable population subgroup. Results from latent linear growth curve models show that family DHOS exposure via physical contact and via job/income loss both negatively influenced initial child health. However, the effects of physical exposure dissipated over time, while the effects of job/income loss persisted. This pattern holds for both general child health and the number of recent physical health problems children had experienced. These findings help to bridge the literature on disaster impacts and resilience/vulnerability with the literature on socioeconomic status as a fundamental cause of health outcomes over the life course.
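    The growth-curve analysis above lends itself to a brief illustration. The sketch below is a hedged approximation only: it fits a mixed-effects growth model (a close relative of the latent linear growth curve models the study actually uses) with statsmodels, and the file name and column names (child_health, wave, physical_exposure, job_income_loss, child_id) are invented for illustration.

        # Hypothetical sketch; not the study's actual model or data.
        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per child per survey wave (invented file and columns).
        df = pd.read_csv("dhos_child_health.csv")

        # Random intercept and slope per child; the exposure-by-wave
        # interactions let each exposure shift the whole health trajectory,
        # not just its starting point.
        model = smf.mixedlm(
            "child_health ~ wave * physical_exposure + wave * job_income_loss",
            data=df,
            groups="child_id",
            re_formula="~wave",
        )
        print(model.fit().summary())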

    Repeated Vowel Production Affects Features of Neural Activity in Sensorimotor Cortex

    The sensorimotor cortex is responsible for the generation of movements, and interest in using this area for decoding speech with brain-computer interfaces has recently increased. Speech decoding is challenging, however, since the relationship between neural activity and motor actions is not completely understood. Non-linearity between neural activity and movement has been found, for instance, for simple finger movements: despite equal motor output, neural activity amplitudes are affected by preceding movements and by the time between movements. It is unknown whether neural activity is also affected by preceding motor actions during speech. We addressed this issue using electrocorticographic high frequency band (HFB; 75–135 Hz) power changes in the sensorimotor cortex during discrete vowel generation. Three subjects with temporarily implanted electrode grids produced the /i/ vowel at repetition rates of 1, 1.33 and 1.66 Hz. For every repetition, the HFB power amplitude was determined. During the first utterance, most electrodes showed a large HFB power peak, which decreased for subsequent utterances. This result could not be explained by differences in performance. With increasing duration between utterances, more electrodes showed an equal response to all repetitions, suggesting that the duration between vowel productions influences the effect of previous productions on sensorimotor cortex activity. Our findings correspond with previous studies of finger movements and bear relevance for the development of brain-computer interfaces that employ speech decoding based on brain signals, in that past utterances will need to be taken into account for these systems to work accurately.
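    The feature-extraction step described here (HFB power per utterance) is simple to sketch. The code below is an illustration under stated assumptions, not the authors' pipeline: band-pass the signal at 75–135 Hz, take the Hilbert envelope, and average the power in a window after each vowel onset. The sampling rate, window length, and array shapes are assumptions.

        # Hypothetical sketch of per-utterance HFB power extraction.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        FS = 512  # assumed sampling rate (Hz)

        def hfb_power(ecog, onsets, win=0.5):
            """Mean 75-135 Hz power in a `win`-second window after each onset.

            ecog   : (n_samples,) signal from one electrode
            onsets : utterance onset times in seconds
            """
            b, a = butter(4, [75, 135], btype="bandpass", fs=FS)
            envelope = np.abs(hilbert(filtfilt(b, a, ecog)))  # analytic amplitude
            power = envelope ** 2
            n = int(win * FS)
            return np.array([power[int(t * FS):int(t * FS) + n].mean()
                             for t in onsets])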

    Classification of mouth movements using 7 T fMRI

    Objective. A brain-computer interface (BCI) is an interface that uses signals from the brain to control a computer. BCIs will likely become important tools for severely paralyzed patients to restore interaction with the environment. The sensorimotor cortex is a promising target brain region for a BCI due to its detailed topography and minimal functional interference with other important brain processes. Previous studies have shown that attempted movements in paralyzed people generate neural activity that strongly resembles actual movements; hence, decodability for BCI applications can be studied in able-bodied volunteers with actual movements. Approach. In this study we tested whether mouth movements provide adequate signals in the sensorimotor cortex for a BCI. The study was executed using fMRI at 7 T to ensure relevance for BCI with cortical electrodes, as 7 T measurements have been shown to correlate well with electrocortical measurements. Twelve healthy volunteers executed four mouth movements (lip protrusion, tongue movement, teeth clenching, and the production of a larynx-activating sound) while in the scanner. Subjects performed a training and a test run. Single trials were classified based on the Pearson correlation values between the activation patterns per trial type in the training run and single trials in the test run in a 'winner-takes-all' design. Main results. Single-trial mouth movements could be classified with 90% accuracy. The classification was based on an area with a volume of about 0.5 cm³, located on the sensorimotor cortex. If voxels were limited to the cortical surface, which is accessible to electrode grids, classification accuracy was still very high (82%). Voxels located on the precentral cortex performed better (87%) than those on the postcentral cortex (72%). Significance. The high reliability of decoding mouth movements suggests that attempted mouth movements are a promising candidate for BCI in paralyzed people.
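    The 'winner-takes-all' correlation classifier described in this abstract is compact enough to sketch directly. The code below is a minimal illustration under assumed array shapes: average the training activation patterns per movement type, then assign each test trial to the class whose template it correlates with best (Pearson correlation).

        # Minimal sketch of a winner-takes-all correlation classifier.
        import numpy as np

        def fit_templates(train_trials, labels):
            """Mean activation pattern per movement type.

            train_trials : (n_trials, n_voxels) activation patterns
            labels       : (n_trials,) movement type per trial
            """
            labels = np.asarray(labels)
            return {c: train_trials[labels == c].mean(axis=0)
                    for c in np.unique(labels)}

        def classify(trial, templates):
            """Pick the class whose template best correlates with the trial."""
            r = {c: np.corrcoef(trial, tmpl)[0, 1]
                 for c, tmpl in templates.items()}
            return max(r, key=r.get)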

    Classification of Articulator Movements and Movement Direction from Sensorimotor Cortex Activity

    For people suffering from severe paralysis, communication can be difficult or nearly impossible. Brain-computer interfaces (BCIs) are being developed to assist these people with communication by using their brain activity to control a computer without any muscle activity. To benefit the development of BCIs that employ neural activity related to speech, we investigated whether neural activity patterns related to different articulator movements can be distinguished from each other. Using electrocorticography (ECoG), we recorded the neural activity related to different articulator movements in 4 epilepsy patients and classified which articulator participants moved based on sensorimotor cortex activity patterns. The same was done for different movement directions of a single articulator, the tongue. In both experiments highly accurate classification was obtained: on average 92% for different articulators and 85% for different tongue directions. Furthermore, the data show that only a small part of the sensorimotor cortex is needed for classification (ca. 1 cm²). We show that recordings from small parts of the sensorimotor cortex contain information about different articulator movements that might be used for BCI control. Our results are of interest for BCI systems that aim to decode neural activity related to (actual or attempted) movements from a contained cortical area.
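    Accuracies like the 92% and 85% reported here are typically estimated with cross-validation. As a hedged illustration (the abstract does not state the authors' exact evaluation scheme), the sketch below runs leave-one-trial-out validation of the nearest-template correlation classifier on ECoG activity patterns; array shapes and names are assumptions.

        # Hypothetical leave-one-trial-out accuracy estimate.
        import numpy as np

        def loo_accuracy(trials, labels):
            """trials : (n_trials, n_electrodes) activity patterns
               labels : (n_trials,) articulator (or direction) per trial"""
            labels = np.asarray(labels)
            idx = np.arange(len(trials))
            correct = 0
            for i in idx:
                mask = idx != i  # hold out trial i
                templates = {c: trials[mask & (labels == c)].mean(axis=0)
                             for c in np.unique(labels)}
                r = {c: np.corrcoef(trials[i], t)[0, 1]
                     for c, t in templates.items()}
                if max(r, key=r.get) == labels[i]:
                    correct += 1
            return correct / len(trials)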

    Give me a sign: decoding four complex hand gestures based on high-density ECoG

    The increasing understanding of human brain functions makes it possible to interact directly with the brain for therapeutic purposes. Implantable brain-computer interfaces promise to replace or restore motor functions in patients with partial or complete paralysis. We postulate that neuronal states associated with gestures, as used in the finger-spelling alphabet of sign languages, provide an excellent signal for implantable brain-computer interfaces to restore communication. To test this, we evaluated the decodability of four gestures using high-density electrocorticography in two participants. The electrode grids were located subdurally on the hand-knob area of the sensorimotor cortex, covering a surface of 2.5–5.2 cm². Using a pattern-matching classification approach, four types of hand gestures were classified based on their patterns of neuronal activity. In the two participants the gestures were classified with 97% and 74% accuracy, respectively. The high frequencies (>65 Hz) allowed for the best classification results. This proof-of-principle study indicates that the four gestures are associated with a reliable and discriminable spatial representation on a confined area of the sensorimotor cortex. This robust representation on a small area makes hand gestures an interesting control feature for an implantable BCI to restore communication in severely paralyzed people.
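    The observation that frequencies above 65 Hz classify best suggests a simple feature pipeline. The sketch below is an illustration under assumptions (sampling rate, epoch shape, and window length are invented): estimate each electrode's power spectrum with Welch's method and keep the mean power above 65 Hz as the feature vector for a gesture epoch, which could then feed a pattern-matching classifier such as the correlation templates sketched earlier.

        # Hypothetical high-frequency (>65 Hz) feature extraction.
        import numpy as np
        from scipy.signal import welch

        FS = 512  # assumed sampling rate (Hz)

        def high_gamma_features(epoch):
            """Per-electrode mean power above 65 Hz for one gesture epoch.

            epoch : (n_electrodes, n_samples) high-density ECoG segment
            """
            freqs, psd = welch(epoch, fs=FS, nperseg=FS // 2, axis=-1)
            return psd[:, freqs > 65].mean(axis=-1)  # (n_electrodes,)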