    An anatomy-based lumped parameter model of cerebrospinal venous circulation: can an extracranial anatomical change impact intracranial hemodynamics?

    Background: The relationship between extracranial venous system abnormalities and central nervous system disorders has recently been theorized. In this paper we investigate this hypothesis by modeling the venous drainage of the brain and spinal column and simulating the intracranial flow changes caused by extracranial morphological stenoses. Methods: A lumped parameter model of the cerebrospinal venous drainage was created based on anatomical knowledge and on vessel diameters and lengths taken from the literature. Each vein was modeled as a hydraulic resistance, calculated through Poiseuille's law. The inputs of the model were the arterial flow rates of the intracranial, vertebral and lumbar districts. The effects of obstructing the main venous outflows were simulated. A database comprising 112 Multiple Sclerosis patients (Male/Female = 42/70; median age ± standard deviation = 43.7 ± 10.5 years) was retrospectively analyzed. Results: The flow rates of the main veins estimated with the model were similar to the measurements obtained with a 1.5 T Magnetic Resonance scanner in 21 healthy controls (Male/Female = 10/11; mean age ± standard deviation = 31 ± 11 years). The intracranial reflux topography predicted by the model in cases of internal jugular vein diameter reduction was similar to that observed in patients with internal jugular vein obstacles. Conclusions: The proposed model can predict physiological and pathological behaviors with good fidelity. Despite the simplifications introduced in modeling the cerebrospinal venous circulation, the anatomy-based structure of the lumped parameter model allowed a detailed analysis of the consequences of extracranial venous impairments on intracranial pressure and hemodynamics.
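
    A minimal sketch of the kind of calculation the abstract describes (each vein treated as a Poiseuille resistance in a lumped network) is shown below. It is an illustration only, not the authors' code; the blood viscosity and vessel geometry are placeholder values.

        # Sketch: hydraulic resistance of a vein segment from Poiseuille's law,
        # R = 8*mu*L / (pi*r^4), plus simple series/parallel combination rules
        # for a lumped venous network. All numbers are illustrative.
        import math

        MU_BLOOD = 3.5e-3  # nominal dynamic viscosity of blood [Pa*s]

        def poiseuille_resistance(length_m, diameter_m, mu=MU_BLOOD):
            """Hydraulic resistance [Pa*s/m^3] of a cylindrical vessel segment."""
            radius = diameter_m / 2.0
            return 8.0 * mu * length_m / (math.pi * radius**4)

        def series(*resistances):
            return sum(resistances)

        def parallel(*resistances):
            return 1.0 / sum(1.0 / r for r in resistances)

        # A stenosis halving a segment's diameter multiplies its resistance by 2^4 = 16.
        r_ijv = poiseuille_resistance(length_m=0.20, diameter_m=0.010)
        r_ijv_stenosed = poiseuille_resistance(length_m=0.20, diameter_m=0.005)
        print(r_ijv_stenosed / r_ijv)  # -> 16.0

    Because resistance grows with the inverse fourth power of the radius, even a moderate diameter reduction in an extracranial outflow vein can dominate the drainage pathway it sits on, which is consistent with the abstract's focus on extracranial stenoses affecting intracranial hemodynamics.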

    Leaf Morphology, Taxonomy and Geometric Morphometrics: A Simplified Protocol for Beginners

    Taxonomy relies greatly on morphology to discriminate groups. Computerized geometric morphometric methods for quantitative shape analysis measure, test and visualize differences in form in a highly effective, reproducible, accurate and statistically powerful way. Plant leaves are commonly used in taxonomic analyses and are particularly suitable for landmark-based geometric morphometrics. However, botanists do not yet seem to have taken advantage of this set of methods in their studies as much as zoologists have. Using free software and an example dataset from two geographical populations of sessile oak leaves, we describe in detailed but simple terms how to: a) compute size and shape variables using Procrustes methods; b) test measurement error and the main levels of variation (population and trees) using a hierarchical design; c) estimate the accuracy of group discrimination; d) repeat this estimate after controlling for the effect of size differences on shape (i.e., allometry). Measurement error was completely negligible; individual variation in leaf morphology was large, and differences between trees were generally bigger than those within trees; differences between the two geographic populations were small in both size and shape; despite a weak allometric trend, controlling for the effect of size on shape slightly increased discrimination accuracy. Procrustes-based methods for the analysis of landmarks were highly efficient in measuring the hierarchical structure of differences in leaves and in revealing very small-scale variation. In taxonomy and many other fields of botany and biology, the application of geometric morphometrics helps increase scientific rigour in the description of important aspects of the phenotypic dimension of biodiversity. Easy-to-follow but detailed step-by-step example studies can promote a more extensive use of these numerical methods, as they provide an introduction to the discipline which, for many biologists, is less intimidating than the often inaccessible specialist literature.
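
    The Procrustes step the protocol builds on can be sketched in a few lines; the snippet below is a simplified illustration with synthetic landmark coordinates, not the article's workflow or the software it recommends.

        # Sketch: ordinary Procrustes superimposition of 2D landmark
        # configurations and centroid size, the basic quantities behind
        # landmark-based geometric morphometrics. Landmarks are synthetic.
        import numpy as np

        def centroid_size(landmarks):
            """Square root of summed squared distances of landmarks to their centroid."""
            centered = landmarks - landmarks.mean(axis=0)
            return np.sqrt((centered**2).sum())

        def procrustes_align(reference, target):
            """Translate, scale and rotate `target` onto `reference`.

            Reflections are not constrained in this simplified version."""
            ref = reference - reference.mean(axis=0)
            tgt = target - target.mean(axis=0)
            ref = ref / np.linalg.norm(ref)
            tgt = tgt / np.linalg.norm(tgt)
            u, _, vt = np.linalg.svd(tgt.T @ ref)  # optimal rotation via SVD
            return tgt @ (u @ vt)

        # Two synthetic 5-landmark leaf outlines (x, y); leaf_b is a scaled,
        # translated copy of leaf_a, so their shapes are identical.
        leaf_a = np.array([[0, 0], [1, 0], [1.5, 1], [1, 2], [0, 2]], float)
        leaf_b = leaf_a * 1.3 + 0.5

        print(centroid_size(leaf_a), centroid_size(leaf_b))  # sizes differ
        aligned_b = procrustes_align(leaf_a, leaf_b)
        print(np.linalg.norm(procrustes_align(leaf_a, leaf_a) - aligned_b))  # ~0: same shape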

    Evaluation of a joint Bioinformatics and Medical Informatics international course in Peru

    Background: New technologies that emerge at the interface of computational and biomedical science could drive new advances in global health; therefore, more training in technology is needed among health care workers. To assess the potential for informatics training using an approach designed to foster interaction at this interface, the University of Washington and the Universidad Peruana Cayetano Heredia developed and assessed a one-week course in Peru that included a new Bioinformatics (BIO) track along with an established Medical/Public Health Informatics (MI) track. Methods: We assessed the background of the participants and measured the knowledge gained with track-specific (MI or BIO) 30-minute pre- and post-tests. Participants' attitudes were evaluated both by daily evaluations and by an end-of-course evaluation. Results: Forty-three participants enrolled in the course: 20 in the MI track and 23 in the BIO track. Out of 20 questions, the mean percentage score for the MI track increased from 49.7 on the pre-test (standard deviation, SD = 17.0) to 59.7 (SD = 15.2) on the post-test (P = 0.002, n = 18). The BIO track mean score increased from 33.6 on the pre-test to 51.2 on the post-test (P < 0.001, n = 21). Most comments (76%) about any aspect of the course were positive. The main perceived strength of the course was the quality of the speakers, and the main perceived weakness was its short duration. Overall, course acceptability was very good to excellent, with a rating of 4.1 (scale 1-5), and the usefulness of the course was rated as very good. Most participants (62.9%) expressed a positive opinion about having the BIO and MI tracks come together for some of the lectures. Conclusion: The pre- and post-test results and the positive evaluations by the participants indicate that this first joint Bioinformatics and Medical/Public Health Informatics (MI and BIO) course was a success. Funding: the University of Washington AMAUTA Global Training in Health Informatics, a Fogarty International Center/NIH-funded grant (5D43TW007551), and the AMAUTA Research Practica Program, a Puget Sound Partners for Global Health-funded grant.
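
    The abstract does not state which statistical test was used for the pre/post comparison; a paired t-test is one common choice for such matched scores, sketched below with synthetic placeholder data rather than the course data.

        # Sketch: paired comparison of pre- and post-test scores, mirroring the
        # kind of track-specific knowledge assessment described. Scores are
        # randomly generated placeholders, not the study data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        pre = rng.normal(50, 17, size=18)        # hypothetical pre-test % scores
        post = pre + rng.normal(10, 8, size=18)  # hypothetical paired post-test % scores

        t_stat, p_value = stats.ttest_rel(post, pre)
        print(f"mean pre = {pre.mean():.1f}, mean post = {post.mean():.1f}, P = {p_value:.4f}")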

    The Vigilance Decrement in Executive Function Is Attenuated When Individual Chronotypes Perform at Their Optimal Time of Day

    Time of day modulates our cognitive functions, especially those related to executive control, such as the ability to inhibit inappropriate responses. However, the impact of individual differences in time-of-day preferences (i.e., morning vs. evening chronotype) had not been considered by most studies. It was also unclear whether the vigilance decrement (impaired performance with time on task) depends on both time of day and chronotype. In this study, morning-type and evening-type participants performed a task measuring vigilance and response inhibition (the Sustained Attention to Response Task, SART) in morning and evening sessions. The results showed that the vigilance decrement in inhibitory performance was accentuated at non-optimal as compared to optimal times of day. In the morning-type group, inhibition performance decreased linearly with time on task only in the evening session, whereas in the morning session it remained more accurate and stable over time. In contrast, inhibition performance in the evening-type group showed a linear vigilance decrement in the morning session, whereas in the evening session the vigilance decrement was attenuated, following a quadratic trend. Our findings imply that the negative effects of time on task on executive control can be prevented by scheduling cognitive tasks at the optimal time of day according to individuals' specific circadian profiles. Therefore, time-of-day and chronotype influences should be considered in research and clinical studies, as well as in real-world situations demanding executive control for response inhibition. This work was supported by the Spanish Ministerio de Ciencia e Innovación (Ramón y Cajal programme: RYC-2007-00296 and PLAN NACIONAL de I+D+i: PSI2010-15399) and the Junta de Andalucía (SEJ-3054).
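
    The linear versus quadratic time-on-task trends described above can be characterised with simple polynomial fits to block-wise accuracy; the sketch below uses synthetic values, not the SART data, and is only meant to illustrate the contrast.

        # Sketch: fit linear and quadratic time-on-task trends to block-wise
        # inhibition accuracy. Values are synthetic placeholders.
        import numpy as np

        blocks = np.arange(1, 9)  # successive time-on-task blocks
        accuracy = np.array([0.92, 0.90, 0.87, 0.85, 0.84, 0.84, 0.85, 0.86])

        linear = np.polyfit(blocks, accuracy, deg=1)
        quadratic = np.polyfit(blocks, accuracy, deg=2)
        print("linear slope per block:", linear[0])
        print("quadratic coefficient:", quadratic[0])  # positive -> U-shaped (attenuated) decrement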

    Alpha-Toxin Induces Programmed Cell Death of Human T cells, B cells, and Monocytes during USA300 Infection

    This investigation examines the influence of alpha-toxin (Hla) during USA300 infection of human leukocytes. Survival of a USA300 isogenic deletion mutant of hla (USA300Δhla) in human blood was comparable to that of the parental wild-type strain, and polymorphonuclear leukocyte (PMN) plasma membrane permeability caused by USA300 did not require Hla. Flow cytometry analysis of peripheral blood mononuclear cells (PBMCs) following infection by USA300, USA300Δhla, and USA300Δhla transformed with a plasmid over-expressing Hla (USA300Δhla Comp) demonstrated that this toxin plays a significant role in inducing plasma membrane permeability of CD14+, CD3+, and CD19+ PBMCs. Rapid, Hla-independent plasma membrane permeability was observed for PMNs and for CD14+ and CD19+ PBMCs following intoxication with USA300 supernatant, while the majority of CD3+ PBMC plasma membrane permeability induced by USA300 required Hla. Addition of recombinant Hla to USA300Δhla supernatant rescued the CD3+ and CD19+ PBMC plasma membrane permeability generated by USA300 supernatant. An observed delay in plasma membrane permeability caused by Hla, in conjunction with Annexin V binding and ApoBrdU TUNEL assays examining PBMCs intoxicated with recombinant Hla or infected with USA300, USA300Δhla, USA300Δhla Comp, and USA300ΔsaeR/S, suggests that Hla induces programmed cell death of monocytes, B cells, and T cells that results in plasma membrane permeability. Together, these findings underscore the importance of Hla during S. aureus infection of human tissue and specifically demonstrate that Hla activity during USA300 infection triggers programmed cell death of human monocytes, T cells and B cells that leads to plasma membrane permeability.

    Suppression of charged particle production at large transverse momentum in central Pb-Pb collisions at $\sqrt{s_{\rm NN}} = 2.76$ TeV

    Inclusive transverse momentum spectra of primary charged particles in Pb-Pb collisions at $\sqrt{s_{\rm NN}} = 2.76$ TeV have been measured by the ALICE Collaboration at the LHC. The data are presented for central and peripheral collisions, corresponding to 0-5% and 70-80% of the hadronic Pb-Pb cross section. The measured charged particle spectra in $|\eta| < 0.8$ and $0.3 < p_{\rm T} < 20$ GeV/$c$ are compared to the expectation in pp collisions at the same $\sqrt{s_{\rm NN}}$, scaled by the number of underlying nucleon-nucleon collisions. The comparison is expressed in terms of the nuclear modification factor $R_{\rm AA}$. The result indicates only weak medium effects ($R_{\rm AA} \approx 0.7$) in peripheral collisions. In central collisions, $R_{\rm AA}$ reaches a minimum of about 0.14 at $p_{\rm T} = 6$-$7$ GeV/$c$ and increases significantly at larger $p_{\rm T}$. The measured suppression of high-$p_{\rm T}$ particles is stronger than that observed at lower collision energies, indicating that a very dense medium is formed in central Pb-Pb collisions at the LHC. Comment: 15 pages, 5 captioned figures, 3 tables, authors from page 10, published version, figures at http://aliceinfo.cern.ch/ArtSubmission/node/98
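
    For reference, the nuclear modification factor compared in the abstract is conventionally defined as the Pb-Pb yield divided by the binary-collision-scaled pp reference; the abstract itself does not spell out the formula.

        R_{\rm AA}(p_{\rm T}) = \frac{\mathrm{d}^2 N_{\rm AA}/\mathrm{d}p_{\rm T}\,\mathrm{d}\eta}{\langle N_{\rm coll}\rangle\, \mathrm{d}^2 N_{pp}/\mathrm{d}p_{\rm T}\,\mathrm{d}\eta}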