4 research outputs found

    Brain Vital Signs: Expanding From the Auditory to Visual Modality

    The critical need for rapid, objective, physiological evaluation of brain function at point-of-care has led to the emergence of brain vital signs—a framework encompassing portable electroencephalography (EEG) and an automated, quick test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., aging, seniors, dementia). Consequently, it has become important to translate brain vital signs into a visual sensory modality. Therefore, the objectives of this study were to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Visual brain vital signs were elicited successfully for all three responses across the group (N100: F = 29.8380, p < 0.001; P300: F = 138.8442, p < 0.0001; N400: F = 6.8476, p = 0.01). Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes (rho = 0.7, p = 0.0001) across individuals. Auditory P300 latencies were shorter than visual (p < 0.0001), but normalization and correlation (r = 0.5, p = 0.0033) implied a potential systematic difference across modalities. Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (r = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading and listening language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.
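    The cross-modality comparison above rests on two standard steps: correlating per-participant peak amplitudes across modalities, and normalizing each modality's responses onto a common scale before comparison. A minimal sketch of those steps is below; the data are synthetic and the z-score normalization is one plausible choice, not necessarily the study's method.

    ```python
    import numpy as np
    from scipy.stats import spearmanr, pearsonr, zscore

    # Hypothetical per-participant P300 peak amplitudes (µV) for each
    # modality; all names and values here are illustrative, not study data.
    rng = np.random.default_rng(0)
    auditory_p300 = rng.normal(5.0, 1.5, size=34)
    visual_p300 = 0.8 * auditory_p300 + rng.normal(0.0, 1.0, size=34)

    # Rank correlation of peak amplitudes across individuals.
    rho, p_rho = spearmanr(auditory_p300, visual_p300)

    # z-score within each modality to put responses on a common scale
    # before cross-modality comparison.
    auditory_z = zscore(auditory_p300)
    visual_z = zscore(visual_p300)
    r, p_r = pearsonr(auditory_z, visual_z)
    ```

    A correlated, systematically offset pair of measures (as reported for P300 latency and N400 amplitude) would show a high r here alongside a significant paired difference on the raw values.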

    Effects of a New Conjugate Drug in a Rat Model of Postmenopausal Osteoporosis

    No full text
    Postmenopausal osteoporosis is a disease characterized by bone loss and increased risk of fracture, and represents a significant burden on the Canadian health care system. Current treatments lack the ability to simultaneously address the therapeutic needs for promoting bone formation and inhibiting resorption. Our approach employs a novel conjugate drug in which an anabolic agent (EP4 receptor agonist) is reversibly joined with an anti-resorptive agent (alendronate) through a linker. This allows the bone-targeting ability of alendronate to deliver the EP4 agonist to bone sites, thereby mitigating the side effects associated with systemic administration of the EP4 agonist. This study investigated the in vivo efficacy of this drug in a curative experiment to treat postmenopausal osteoporosis using an ovariectomized rat model. Results showed that conjugate treatment dose-dependently stimulated bone formation and restored ovariectomy-induced bone loss, and conjugation between alendronate and the EP4 agonist was crucial to the drug’s anabolic effect.

    Blink-Related Oscillations Provide Naturalistic Assessments of Brain Function and Cognitive Workload within Complex Real-World Multitasking Environments

    No full text
    Background: There is a significant need to monitor human cognitive performance in complex environments, with one example being pilot performance. However, existing assessments largely focus on subjective experiences (e.g., questionnaires) and the evaluation of behavior (e.g., aircraft handling) as surrogates for cognition, or utilize brainwave measures which require artificial setups (e.g., simultaneous auditory stimuli) that intrude on the primary tasks. Blink-related oscillations (BROs) are a recently discovered neural phenomenon associated with spontaneous blinking that can be captured without artificial setups and are also modulated by cognitive loading and the external sensory environment—making them ideal for brain function assessment within complex operational settings. Methods: Electroencephalography (EEG) data were recorded from eight adult participants (five female, M = 21.1 years) while they completed the Multi-Attribute Task Battery under three different cognitive loading conditions. BRO responses in time and frequency domains were derived from the EEG data, and comparisons of BRO responses across cognitive loading conditions were undertaken. Simultaneously, assessments of blink behavior were also undertaken. Results: Blink behavior assessments revealed decreasing blink rate with increasing cognitive load (p < 0.05). Conclusion: This study confirms the ability of BRO responses to capture cognitive loading effects as well as preparatory pre-blink cognitive processes in anticipation of the upcoming blink during a complex multitasking situation. These successful results suggest that blink-related neural processing could be a potential avenue for cognitive state evaluation in operational settings—both specialized environments such as cockpits, space exploration, military units, etc. and everyday situations such as driving, athletics, human-machine interactions, etc.—where human cognition needs to be seamlessly monitored and optimized.
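    Deriving a time-domain BRO response amounts to epoching the continuous EEG around each spontaneous blink, baseline-correcting, and averaging across blinks. The sketch below illustrates that idea on synthetic data; the sampling rate, epoch window, and baseline choices are assumptions for illustration, not the study's actual pipeline.

    ```python
    import numpy as np

    fs = 250  # sampling rate in Hz (assumed)
    eeg = np.random.randn(64, fs * 60)                  # 64 channels, 60 s of synthetic EEG
    blink_samples = np.arange(fs * 2, fs * 58, fs * 3)  # synthetic blink onset times

    # Epoch from -500 ms to +1000 ms around each blink onset.
    pre, post = int(0.5 * fs), int(1.0 * fs)
    epochs = np.stack([eeg[:, b - pre:b + post] for b in blink_samples])

    # Baseline-correct each epoch to its pre-blink mean, then average
    # across blinks to obtain the BRO waveform per channel.
    epochs -= epochs[:, :, :pre].mean(axis=2, keepdims=True)
    bro = epochs.mean(axis=0)  # shape: (channels, samples) = (64, 375)
    ```

    Comparing such averages between loading conditions (e.g., amplitude or spectral power differences) is the kind of contrast the study reports.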

    Data_Sheet_1_Blink-related EEG oscillations are neurophysiological indicators of subconcussive head impacts in female soccer players: a preliminary study.docx

    No full text
    Introduction: Repetitive subconcussive head impacts can lead to subtle neural changes and functional consequences on brain health. However, the objective assessment of these changes remains limited. Resting state blink-related oscillations (BROs), recently discovered neurological responses following spontaneous blinking, are explored in this study to evaluate changes in BRO responses in subconcussive head impacts. Methods: We collected 5-min resting-state electroencephalography (EEG) data from two cohorts of collegiate athletes who were engaged in contact sports (SC) or non-contact sports (HC). Video recordings of all on-field activities were conducted to determine the number of head impacts during games and practices in the SC group. Results: In both groups, we were able to detect a BRO response. Following one season of games and practice, we found a strong association between the number of head impacts sustained by the SC group and increases in delta and beta spectral power post-blink. There was also a significant difference between the two groups in the morphology of BRO responses, including decreased peak-to-peak amplitude of response over left parietal channels and differences in spectral power in delta and alpha frequency range post-blink. Discussion: Our preliminary results suggest that the BRO response may be a useful biomarker for detecting subtle neural changes resulting from repetitive head impacts. The clinical utility of this biomarker will need to be validated through further research with larger sample sizes, involving both male and female participants, using a longitudinal design.
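    The delta- and beta-band effects above come from estimating spectral power in the post-blink window. A common way to do this is to compute a power spectral density and integrate it over each frequency band, sketched below on synthetic data; the window length, Welch parameters, and band edges are conventional assumptions, not the study's exact settings.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 250  # sampling rate in Hz (assumed)
    post_blink = np.random.randn(500)  # 2 s of post-blink EEG, one channel (synthetic)

    # Welch power spectral density of the post-blink segment.
    freqs, psd = welch(post_blink, fs=fs, nperseg=256)

    def band_power(freqs, psd, lo, hi):
        # Sum PSD bins within [lo, hi) Hz, scaled by bin width.
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].sum() * (freqs[1] - freqs[0])

    delta = band_power(freqs, psd, 1, 4)    # delta band (assumed 1–4 Hz)
    alpha = band_power(freqs, psd, 8, 13)   # alpha band (assumed 8–13 Hz)
    beta = band_power(freqs, psd, 13, 30)   # beta band (assumed 13–30 Hz)
    ```

    Band powers computed this way per blink and per channel can then be compared between cohorts or regressed against head-impact counts.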