
    Episodic memory encoding and retrieval in face-name paired paradigm: An fNIRS study

    Background: Episodic memory (EM) is particularly sensitive to pathological conditions and aging. In a neurocognitive context, the paired-associate learning (PAL) paradigm, which requires participants to learn and recall associations between stimuli, has been used to measure EM. The present study aimed to explore whether functional near-infrared spectroscopy (fNIRS) can be employed to determine the cortical activity underlying encoding and retrieval. Moreover, we examined whether and how different aspects of the task (i.e., novelty and difficulty) affect those cortical activities. Methods: Twenty-two male college students (age: M = 20.55, SD = 1.62) underwent a face-name PAL paradigm under 40-channel fNIRS covering fronto-parietal and middle occipital regions. Results: Decreased activity in a broad network encompassing the bilateral frontal cortex (Brodmann areas 9, 11, 45, and 46) was observed during encoding, while increased activity in the left orbitofrontal cortex (Brodmann area 11) was observed during retrieval. Increased HbO concentration in the superior parietal cortices and decreased HbO concentration in the inferior parietal cortices were observed during encoding, while dominant activation of the left PFC was found during retrieval only. Higher task difficulty was associated with greater neural activity in the bilateral prefrontal cortex, and higher task novelty was associated with greater activation in occipital regions. Conclusion: Combining the PAL paradigm with fNIRS provided the means to differentiate the neural activity characterising encoding and retrieval. Therefore, fNIRS may have the potential to complement EM assessments in clinical settings.

    Affective Responses to Increasing- and Decreasing-Intensity Resistance Training Protocols.

    This study compared the effects of an increasing-intensity (UP) and a decreasing-intensity (DOWN) resistance training protocol on affective responses across six training sessions. Novice participants (Mage = 43.5 ± 13.7 years) were randomly assigned to the UP (n = 18) or DOWN (n = 17) resistance training group. Linear mixed-effects models showed that the evolution of affective valence within each training session was significantly moderated by group (b = -0.45, p ≤ .001), with participants in the UP group reporting a decline in pleasure during each session (b = -0.82) and the DOWN group reporting an improvement (b = 0.97; ps < .001). Remembered pleasure was significantly higher in the DOWN group than in the UP group (b = 0.57, p = .004). These findings indicate that a pattern of decreasing intensity throughout a resistance exercise session can elicit more positive affective responses and retrospective affective evaluations of resistance training.
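
    The within-session trend comparison can be sketched in a simplified form: instead of the full linear mixed-effects model used in the study, fit a valence-over-time slope per participant and compare the group means. Everything below (noise level, number of measurement points, simulated data) is hypothetical and only loosely mirrors the reported coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
time = np.arange(6)  # six within-session measurement points (hypothetical)

def participant_slopes(group_trend, n):
    """Fit a valence-over-time slope for each simulated participant."""
    slopes = []
    for _ in range(n):
        # simulated valence ratings around a common group trend
        valence = 2.0 + group_trend * time + rng.normal(0.0, 0.3, time.size)
        slopes.append(np.polyfit(time, valence, 1)[0])
    return np.array(slopes)

# Group-level trends loosely mirror the reported coefficients.
up_slopes = participant_slopes(-0.82, n=18)    # UP: pleasure declines
down_slopes = participant_slopes(0.97, n=17)   # DOWN: pleasure improves
```

    A mixed model additionally pools information across participants and handles unbalanced data; the per-participant-slope shortcut only conveys the direction of the interaction.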

    A Multiset Rewriting Model for Specifying and Verifying Timing Aspects of Security Protocols

    Catherine Meadows has played an important role in the advancement of formal methods for protocol security verification. Her insights on the use of, for example, narrowing and rewriting logic have made possible the automated discovery of new attacks and the shaping of new protocols. Meadows has also investigated other security aspects, such as distance-bounding protocols and denial-of-service attacks. We have been greatly inspired by her work. This paper describes the use of Multiset Rewriting for the specification and verification of timing aspects of protocols, such as network delays, timeouts, timed intruder models and distance-bounding properties. We detail these timed features with a number of examples and describe decidable fragments of the related verification problems.
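
    A minimal sketch of the core idea, timed multiset rewriting, can be given in Python: a configuration is a multiset of facts, some carrying timestamps, and a rule fires only when its left-hand side is contained in the configuration and its time guard holds. The timeout rule shown is a hypothetical, distance-bounding-style illustration, not a rule taken from the paper.

```python
from collections import Counter

def apply_rule(config, lhs, rhs, guard, now):
    """Fire lhs -> rhs when every lhs fact is present in config and the
    time guard holds at the current time; otherwise the rule does not
    apply and None is returned."""
    if guard(now) and all(config[f] >= n for f, n in lhs.items()):
        new = config - Counter(lhs)   # consume the left-hand side facts
        new.update(rhs)               # produce the right-hand side facts
        return new
    return None

# Hypothetical timeout rule: a challenge sent at time 0.0 that is still
# pending after the round-trip bound D is rewritten to a timeout fact.
D = 2.0
config = Counter({("pending", 0.0): 1})
timeout_rule = dict(
    lhs={("pending", 0.0): 1},
    rhs={("timeout",): 1},
    guard=lambda now: now - 0.0 > D,  # time guard: deadline exceeded
)
fired = apply_rule(config, now=3.5, **timeout_rule)    # rule applies
not_yet = apply_rule(config, now=1.0, **timeout_rule)  # guard fails
```

    Real multiset rewriting specifications also quantify over fact arguments and model the intruder; this sketch only shows how a time guard gates the applicability of a rewrite step.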

    Effects of hyperoxia and cardiovascular risk factors on myocardial ischemia reperfusion injury: a randomized, sham and placebo controlled parallel study

    Full text link
    Recent studies on O2 supplementation in acute coronary syndrome patients are equivocal. We tested the hypothesis that oxidative stress (OS) is increased in rodents with cardiovascular risk factors and enhances ischemia-reperfusion injury in the presence of hyperoxia. Forty-three Wistar rats (WR), 30 spontaneously hypertensive rats (SHR) and 33 obese Zucker rats (ZR) were randomized to a sham procedure (one third) or a left anterior descending ligation for 60 minutes (two thirds). This was followed by 3 hours of reperfusion, during which animals were randomized to either a hyperoxic (HR) or a normoxic (NR) reperfusion group. Baseline troponin (cTnT) was higher in SHR and ZR than in WR (both p < 0.001). HR was associated with a smaller troponin rise in SHR and ZR than NR (both p < 0.001), while the reverse occurred in WR (p < 0.001). In SHR, HR limited the total MPO (myeloperoxidase) increase compared to NR (p = 0.0056), whereas the opposite held in WR (p = 0.013). NR was associated with a drastic reduction of total thiols compared to HR in both SHR and ZR (both p < 0.001). Despite heightened baseline OS, HR restrained myocardial necrosis and the anti/pro-oxidant imbalance in SHR and ZR, in contrast to healthy WR.

    Trace Equivalence and Epistemic Logic to Express Security Properties

    In process algebras, security properties are expressed as equivalences between processes, but it is not clear which equivalence is suitable. This means there is a gap between an intuitive security notion and its formulation. Appropriate formalization is essential for verification, and our purpose is to bridge this gap. By chasing scope extrusions, we prove that trace equivalence is a congruence. Moreover, we construct an epistemic logic for the applied pi calculus and show that its logical equivalence agrees with trace equivalence. We use the epistemic logic to show that trace equivalence is pertinent in the presence of a non-adaptive attacker.

    Benchmarking homogenization algorithms for monthly data

    The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies. The algorithms were validated against a realistic benchmark dataset. Participants provided 25 separate homogenized contributions as part of the blind study, as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics, including i) the centered root mean square error relative to the true homogeneous values at various averaging scales, ii) the error in linear trend estimates, and iii) traditional contingency skill scores. The metrics were computed using both the individual station series and the network-average regional series. The performance of the contributions depends significantly on the error metric considered. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data. Moreover, state-of-the-art relative homogenization algorithms developed to work with an inhomogeneous reference are shown to perform best. The study showed that automatic algorithms can now perform as well as manual ones.
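
    The first two performance metrics are simple to state concretely. A sketch (not the HOME benchmark code) of the centered root mean square error and the linear-trend error:

```python
import numpy as np

def centered_rmse(series, truth):
    """Centered RMSE: RMSE after removing each series' own mean, so a
    constant offset (irrelevant for climate signals) is not penalized."""
    s = series - series.mean()
    t = truth - truth.mean()
    return np.sqrt(np.mean((s - t) ** 2))

def trend_error(series, truth):
    """Difference in fitted linear trend (slope per time step) between
    a homogenized series and the true homogeneous series."""
    x = np.arange(len(truth))
    return np.polyfit(x, series, 1)[0] - np.polyfit(x, truth, 1)[0]

# A series differing from the truth only by a constant offset scores
# (near) zero on both metrics; a spurious added trend shows up directly
# in the trend error.
x = np.arange(120)                 # e.g. 120 monthly values
truth = 0.01 * x                   # synthetic homogeneous series
crmse_offset = centered_rmse(truth + 0.5, truth)        # ~0
terr_spurious = trend_error(truth + 0.005 * x, truth)   # ~0.005
```

    These definitions make clear why the two metrics can rank algorithms differently: an algorithm can reduce scatter (low CRMSE) while still biasing the long-term trend.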

    Tissue Compartment Analysis for Biomarker Discovery by Gene Expression Profiling

    BACKGROUND: Although high-throughput technologies for gene profiling are reliable tools, sample/tissue heterogeneity limits their outcomes when they are applied to identify molecular markers. Indeed, inter-sample differences in cell composition scatter the data, preventing detection of small but relevant changes in gene expression level. To date, attempts to circumvent this difficulty were based on isolation of the different cell structures constituting biological samples. As an alternative approach, we developed a tissue compartment analysis (TCA) method to assess the cell composition of tissue samples, and applied it to standardize data and to identify biomarkers. METHODOLOGY/PRINCIPAL FINDINGS: TCA is based on comparing the mRNA expression levels of markers specific to the different constitutive structures, measured in pure isolated structures on the one hand and in the whole sample on the other. The TCA method was developed here with human kidney samples, as an example of a highly heterogeneous organ. It was validated by comparing its results with those obtained by histomorphometry. TCA demonstrated the extreme variety of composition of kidney samples, with the abundance of specific structures varying from 5 to 95% of the whole sample. TCA made it possible to accurately standardize gene expression levels amongst >100 kidney biopsies, and to identify otherwise imperceptible molecular disease markers. CONCLUSIONS/SIGNIFICANCE: Because TCA does not require specific preparation of the sample, it can be applied to all existing tissue or cDNA libraries or to published data sets, provided that specific compartment markers are available. In humans, where the small size of tissue samples collected in clinical practice accounts for high structural diversity, TCA is well suited for the identification of molecular markers of diseases, and for the follow-up of identified markers in single patients for diagnosis/prognosis and evaluation of therapy efficiency. In laboratory animals, TCA can also usefully be applied to the central nervous system, where tissue heterogeneity is a limiting factor.
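
    The core of TCA, inferring compartment abundances by comparing marker expression in pure structures with the whole sample, can be sketched as a small linear estimation problem. The marker values and compartment names below are hypothetical, chosen only to illustrate the idea.

```python
import numpy as np

# Rows: marker genes; columns: tissue compartments. Entries are marker
# expression levels measured in pure isolated structures (hypothetical
# values for illustration).
pure = np.array([
    [9.0, 0.5, 0.2],   # marker enriched in compartment 1 (e.g. glomeruli)
    [0.3, 8.0, 0.4],   # marker enriched in compartment 2 (e.g. tubules)
    [0.2, 0.6, 7.0],   # marker enriched in compartment 3 (e.g. vessels)
])

def estimate_fractions(whole_sample):
    """Least-squares estimate of compartment fractions from whole-sample
    marker expression, clipped to be non-negative and renormalized so
    the fractions sum to 1."""
    frac, *_ = np.linalg.lstsq(pure, whole_sample, rcond=None)
    frac = np.clip(frac, 0.0, None)
    return frac / frac.sum()

# A sample composed of 70% / 20% / 10% of the three compartments: the
# whole-sample signal is the fraction-weighted mix of the pure profiles.
mix = pure @ np.array([0.7, 0.2, 0.1])
est = estimate_fractions(mix)
```

    Once the fractions are estimated, per-compartment expression of any gene of interest can be standardized across biopsies of very different composition, which is what makes small disease-related changes detectable.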