
    Isolating endogenous visuo-spatial attentional effects using the novel visual-evoked spread spectrum analysis (VESPA) technique

    In natural visual environments, we use attention to select between relevant and irrelevant stimuli that are presented simultaneously. Our attention to objects in our visual field is largely controlled endogenously, but is also affected exogenously through the influence of novel stimuli and events. The study of endogenous and exogenous attention as separate mechanisms has been possible in behavioral and functional imaging studies, where multiple stimuli can be presented continuously and simultaneously. It has also been possible in electroencephalogram studies using the steady-state visual-evoked potential (SSVEP); however, it has not been possible in conventional event-related potential (ERP) studies, which are hampered by the need to present suddenly onsetting stimuli in isolation. This is unfortunate, as the ERP technique allows for the analysis of human physiology with much greater temporal resolution than functional magnetic resonance imaging or the SSVEP. While ERP studies of endogenous attention have been widely reported, these experiments have a serious limitation in that the suddenly onsetting stimuli used to elicit the ERP inevitably have an exogenous, attention-grabbing effect. Recently we have shown that it is possible to derive separate event-related responses to concurrent, continuously presented stimuli using the VESPA (visual-evoked spread spectrum analysis) technique. In this study we employed an experimental paradigm based on this method, in which two pairs of diagonally opposite, non-contiguous disc-segment stimuli were presented, one pair to be ignored and the other to be attended. VESPA responses derived for each pair showed a strong modulation at 90–100 ms (during the visual P1 component), demonstrating the utility of the method for isolating endogenous visuospatial attention effects.
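At its core, the VESPA technique models the EEG as the linear convolution of a stochastically modulated stimulus signal with an unknown impulse response, which is then recovered by least-squares regression. A minimal sketch of that idea on synthetic data (all signal parameters here are hypothetical illustrations, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 60                        # stimulus frame rate in Hz (hypothetical)
n = 60 * fs                    # 60 s of continuous stimulation
stim = rng.standard_normal(n)  # stochastic contrast-modulation signal

# Hypothetical "true" impulse response (0-400 ms post-stimulus)
lags = 24                      # 24 frames = 400 ms at 60 Hz
t = np.arange(lags) / fs
true_kernel = np.sin(2 * np.pi * 3 * t) * np.exp(-t / 0.1)

# Simulated EEG: stimulus convolved with the kernel, plus noise
eeg = np.convolve(stim, true_kernel)[:n] + 0.5 * rng.standard_normal(n)

# Lagged design matrix: column k holds the stimulus delayed by k frames
X = np.column_stack([np.roll(stim, k) for k in range(lags)])
X[:lags] = 0                   # discard wrap-around samples at the start
vespa, *_ = np.linalg.lstsq(X, eeg, rcond=None)  # recovered response
```

With two concurrently presented stimuli, each driven by an independent modulation signal, the same regression yields a separate response per stimulus; that is what allows attended and ignored stimuli to be compared within one continuous display.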

    Persistence of the immune response induced by BCG vaccination.

    BACKGROUND: Although BCG vaccination is recommended in most countries of the world, little is known about the persistence of BCG-induced immune responses. As novel TB vaccines may be given to boost the immunity induced by neonatal BCG vaccination, evidence concerning the persistence of the BCG vaccine-induced response would help inform decisions about when such boosting would be most effective. METHODS: A randomised controlled study of UK adolescents was carried out to investigate the persistence of BCG immune responses. Adolescents were tested for interferon-gamma (IFN-gamma) response to Mycobacterium tuberculosis purified protein derivative (M.tb PPD) in a whole blood assay before, 3 months and 12 months (n = 148), and 3 years (n = 19) after receiving teenage BCG vaccination, or 14 years after receiving infant BCG vaccination (n = 16). RESULTS: A gradual reduction in the magnitude of response was evident from 3 months to 1 year and from 1 year to 3 years following teenage vaccination, but responses 3 years after vaccination were still, on average, 6 times higher than pre-vaccination levels. Some individuals (11/86; 13%) failed to make a detectable antigen-specific response three months after vaccination, or lost the response after 1 (11/86; 13%) or 3 (3/19; 16%) years. IFN-gamma response to Ag85 was measured in a subgroup of adolescents and appeared to be better maintained, with no decline from 3 to 12 months. A smaller group of adolescents was tested 14 years after receiving infant BCG vaccination: 13/16 (81%) made a detectable IFN-gamma response to M.tb PPD, compared to 6/16 (38%) matched unvaccinated controls (p = 0.012); teenagers vaccinated in infancy were 19 times more likely to make an IFN-gamma response of > 500 pg/ml than unvaccinated teenagers.
CONCLUSION: BCG vaccination in infancy and adolescence induces immunological memory to mycobacterial antigens that is still present and measurable for at least 14 years in the majority of vaccinees, although the magnitude of the peripheral blood response wanes from 3 months to 12 months and from 12 months to 3 years post-vaccination. This waning suggests there may be scope for boosting anti-tuberculous immunity in BCG-vaccinated children at any time from 3 months post-vaccination, which supports the prime-boost strategies being employed for some new TB vaccines currently under development.

    Management, regulation and environmental impacts of nitrogen fertilization in northwestern Europe under the Nitrates Directive; a benchmark study

    Implementation of the Nitrates Directive (NiD) and its environmental impacts were compared for member states in the northwest of the European Union (Ireland, the United Kingdom, Denmark, the Netherlands, Belgium, northern France and Germany). The main sources of data were national reports for the third NiD reporting period (2004–2007) and results of the MITERRA-EUROPE model. Implementation of the NiD in the considered member states is fairly comparable regarding restrictions on where and when to apply fertilizer and manure, but very different regarding application limits for N fertilization. Issues of concern for improving implementation of the NiD include accounting for the fertilizer value of nitrogen in manure, and relating application limits for total nitrogen (N) to potential crop yield and N removal. The most significant environmental effect of implementing the NiD since 1995 is a major contribution to the decrease of the soil N balance (N surplus), particularly in Belgium, Denmark, Ireland, the Netherlands and the United Kingdom. This decrease has been accompanied by a modest decline of nitrate concentrations in fresh surface waters in most countries since 2000. The decline is less prominent for groundwater, given the delayed response of nitrate in deep aquifers. In spite of improved fertilization practices, the southeast of the Netherlands, the Flemish Region and Brittany remain regions of major concern, combining a high nitrogen surplus, high leaching fractions to groundwater and persistent exceedance of water quality standards. On average, the gross N balance in 2008 for the seven member states was about 20 kg N ha<sup>−1</sup> yr<sup>−1</sup> lower in EUROSTAT and in national reports than in MITERRA. The major cause is higher estimates of N removal in national reports, which can amount to more than 50 kg N ha<sup>−1</sup> yr<sup>−1</sup>.
Differences between member states' procedures for assessing nitrogen balances and water quality, and the lack of cross-boundary policy evaluations, are handicaps when benchmarking the effectiveness of the NiD. This is a challenge for the European Commission and its member states, as the NiD remains an important piece of legislation for protecting drinking water quality in regions with many private or small public production facilities and for controlling aquatic eutrophication from agricultural sources.
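The gross nitrogen balance (N surplus) compared above is, in essence, total N inputs to the soil minus N removed in harvested crops, per hectare per year. A minimal sketch with hypothetical input values (which terms are included can differ between national procedures, one source of the discrepancies noted):

```python
def gross_n_balance(fertilizer_n: float, manure_n: float,
                    deposition_n: float, fixation_n: float,
                    crop_removal_n: float) -> float:
    """Gross N balance (surplus) in kg N per ha per year:
    the sum of N inputs minus N removed in harvested crops.
    A positive value is a surplus available for loss to water or air."""
    inputs = fertilizer_n + manure_n + deposition_n + fixation_n
    return inputs - crop_removal_n

# Hypothetical intensive grassland field, all values in kg N ha^-1 yr^-1
print(gross_n_balance(fertilizer_n=120, manure_n=170,
                      deposition_n=25, fixation_n=5,
                      crop_removal_n=220))  # → 100
```

Because the surplus is a small difference between two large flows, a modest over- or underestimate of crop N removal shifts the balance substantially, which is why the >50 kg N ha<sup>−1</sup> yr<sup>−1</sup> differences in removal estimates matter for benchmarking.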

    Exclusively cephalic venous access for cardiac resynchronisation: A prospective multi-centre evaluation.

    BACKGROUND: Small series have shown that cardiac resynchronisation therapy (CRT) can be achieved in a majority of patients using exclusively cephalic venous access. We sought to determine whether this method is suitable for widespread use. METHODS: A group of 19 operators, including 11 trainees, in three pacing centres attempted to use cephalic access alone for all CRT device implants over a period of 8 years. The access route for each lead, the procedure outcome, duration, and complications were collected prospectively. Data were also collected for 105 consecutive CRT device implants performed by experienced operators not using the exclusively cephalic method. RESULTS: A new implantation of a CRT device using exclusively cephalic venous access was attempted in 1091 patients (73.6% male, aged 73 ± 12 years). Implantation was achieved using cephalic venous access alone in 801 cases (73.4%) and using a combination of cephalic and other access in a further 180 (16.5%). Cephalic access was used for 2468 of 3132 leads implanted (78.8%). Complications occurred less frequently in the cephalic group than in the non-cephalic reference group (69/1091 [6.3%] vs 12/105 [11.4%]; P = .0468), and there were no pneumothoraces with cephalic implants. Procedure and fluoroscopy durations were also shorter (procedure duration 118 ± 45 vs 144 ± 39 minutes, P < .0001; fluoroscopy duration 15.7 ± 12.9 vs 22.8 ± 12.2 minutes, P < .0001). CONCLUSIONS: CRT devices can be implanted using cephalic access alone in a substantial majority of cases. This approach is safe and efficient.

    The Changes of Nutrition Labeling of Packaged Food in Hangzhou in China during 2008∼2010

    OBJECTIVE: To assess how the nutrition labeling of packaged food in China changed in the two years after the promulgation of the Regulation for Food Nutrition Labeling, which encourages food manufacturers to provide nutrition labeling. METHODS: In 2008 and 2010, investigators used a prepared questionnaire to record the nutrition information panel, nutrition claims and nutrient function claims on packaged foods in a supermarket, and judged their compliance with the regulation. RESULTS: 4693 and 5526 kinds of packaged food were surveyed in 2008 and 2010, respectively. A nutrition information panel, nutrition claim and nutrient function claim appeared on 27.6%, 13.0% and 1.9% of packaged foods in 2008, versus 35.1%, 7.7% and 2.3% in 2010. The number of nutrition information panels listing energy, protein, fat, carbohydrate and sodium rose from 597 (43.8%) in 2008 to 1661 (85.9%) in 2010, but only 134 (9.8%) and 985 (51.0%) panels, respectively, were fully compliant. Nutrition claims and nutrient function claims focused on vitamins, minerals and dietary fiber. The proportion of compliant nutrition claims increased significantly for most nutrients, with the exception of cholesterol. Six (6.4%) and five (3.9%) nutrient function claims, respectively, hinted at therapeutic effects on diseases. CONCLUSION: Although the voluntary regulation markedly improved the compliance of nutrition labeling, its effect on the prevalence of nutrition claims was negative. Mandatory nutrition labeling is needed, in China as in other countries, and health education on nutrition labeling should be initiated to support the policy.

    Adipose Tissue Dysfunction Signals Progression of Hepatic Steatosis Towards Nonalcoholic Steatohepatitis in C57Bl/6 Mice

    OBJECTIVE - Nonalcoholic fatty liver disease (NAFLD) is linked to obesity and diabetes, suggesting an important role of adipose tissue in the pathogenesis of NAFLD. Here, we aimed to investigate the interaction between adipose tissue and liver in NAFLD and identify potential early plasma markers that predict nonalcoholic steatohepatitis (NASH). RESEARCH DESIGN AND METHODS - C57Bl/6 mice were chronically fed a high-fat diet to induce NAFLD and compared with mice fed a low-fat diet. Extensive histological and phenotypical analyses coupled with a time course study of plasma proteins using a multiplex assay were performed. RESULTS - Mice exhibited pronounced heterogeneity in liver histological scoring, leading to classification into four subgroups: low-fat low (LFL) responders displaying normal liver morphology, low-fat high (LFH) responders showing benign hepatic steatosis, high-fat low (HFL) responders displaying pre-NASH with macrovesicular lipid droplets, and high-fat high (HFH) responders exhibiting overt NASH characterized by ballooning of hepatocytes, presence of Mallory bodies, and activated inflammatory cells. Compared with HFL responders, HFH mice gained weight more rapidly and exhibited adipose tissue dysfunction characterized by decreased final fat mass, enhanced macrophage infiltration and inflammation, and adipose tissue remodeling. Plasma haptoglobin, IL-1β, TIMP-1, adiponectin, and leptin were significantly changed in HFH mice. Multivariate analysis indicated that, in addition to leptin, plasma CRP, haptoglobin, eotaxin, and MIP-1α early in the intervention were positively associated with liver triglycerides. Intermediate prognostic markers of liver triglycerides included IL-18, IL-1β, MIP-1γ, and MIP-2, whereas insulin, TIMP-1, granulocyte chemotactic protein 2, and myeloperoxidase emerged as late markers.
CONCLUSIONS - Our data support the existence of a tight relationship between adipose tissue dysfunction and NASH pathogenesis and point to several novel potential predictive biomarkers for NASH.

    A Genetic Lesion that Arrests Plasma Cell Homing to the Bone Marrow

    The coordinated regulation of chemokine responsiveness plays a critical role in the development of humoral immunity. After antigen challenge and B cell activation, the emerging plasma cells (PCs) undergo CXCL12-induced chemotaxis to the bone marrow, where they produce Ab and persist. Here we show that PCs, but not B cells or T cells, from lupus-prone NZM mice are deficient in CXCL12-induced migration. PC unresponsiveness to CXCL12 results in a marked accumulation of PCs in the spleen and a concordant decrease in bone marrow PCs. Unlike in normal mice, a majority of the splenic PCs in NZM mice are long-lived. This deficiency is a consequence of the genetic interactions of multiple systemic lupus erythematosus susceptibility loci.

    Change in brain activity through virtual reality-based brain-machine communication in a chronic tetraplegic subject with muscular dystrophy

    <p>Abstract</p> <p>Background</p> <p>For severely paralyzed people, a brain-computer interface (BCI) provides a way of re-establishing communication. Although subjects with muscular dystrophy (MD) appear to be potential BCI users, the actual long-term effects of BCI use on brain activity in MD subjects have yet to be clarified. To investigate these effects, we followed BCI use by a chronic tetraplegic subject with MD over 5 months. The topographic changes in the electroencephalogram (EEG) after long-term use of the virtual reality (VR)-based BCI were also assessed. Our custom-developed BCI system was used to classify EEG recorded over the sensorimotor cortex in real time and estimate the user's motor intention (MI) for 3 different limb movements: feet, left hand, and right hand. An avatar in the internet-based VR was controlled in accordance with the results of the EEG classification by the BCI. The subject was trained to control his avatar via the BCI by strolling in the VR for 1 hour a day, and then continued the same training twice a month at his home.</p> <p>Results</p> <p>After the training, the error rate of the EEG classification decreased from 40% to 28%. The subject successfully walked around in the VR using only his MI and chatted with other users through a voice-chat function embedded in the internet-based VR. With this improvement in BCI control, event-related desynchronization (ERD) following MI was significantly enhanced (<it>p </it>< 0.01) for feet MI (from -29% to -55%), left-hand MI (from -23% to -42%), and right-hand MI (from -22% to -51%).</p> <p>Conclusions</p> <p>These results show that our subject with severe MD was able to learn to control his EEG signal and communicate with other users through VR navigation, and suggest that an internet-based VR has the potential to provide paralyzed people with an opportunity for easy communication.</p>
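ERD percentages like those reported above are conventionally computed as the relative change in band power during motor imagery with respect to a rest (reference) interval, so a negative value indicates desynchronization. A minimal sketch (the band-power samples are hypothetical, not data from the study):

```python
import numpy as np

def erd_percent(power_task: np.ndarray, power_rest: np.ndarray) -> float:
    """ERD% = (A - R) / R * 100, where A is mean band power during
    motor imagery and R is mean band power in the reference (rest)
    interval. Negative values indicate a power decrease (ERD)."""
    a = float(np.mean(power_task))
    r = float(np.mean(power_rest))
    return (a - r) / r * 100.0

# Hypothetical mu-band power samples: power drops during imagery
rest = np.array([10.0, 11.0, 9.0, 10.0])   # reference interval, mean 10
task = np.array([5.0, 4.5, 5.5, 5.0])      # imagery interval, mean 5
print(erd_percent(task, rest))             # → -50.0
```

On this convention, the study's change in feet-MI ERD from -29% to -55% corresponds to a deeper power drop during imagery after training.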