32 research outputs found

    Functional Changes in Brain Activity Using Hypnosis: A Systematic Review.

    Hypnosis has proven to be a powerful method for indications such as pain control and anxiety reduction. As recently discussed, it has been receiving increased attention from medical and dental perspectives. This systematic review (PROSPERO registration ID CRD42021259187) aimed to critically evaluate and discuss functional changes in brain activity under hypnosis as measured by different imaging techniques. Randomized controlled trials, cohort, comparative, cross-sectional, evaluation and validation studies from three databases (Cochrane, Embase and Medline via PubMed), covering January 1979 to August 2021, were reviewed using an ad hoc prepared search string and following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A total of 10,404 articles were identified, 1194 duplicates were removed and 9190 papers were discarded after screening article titles and abstracts. Ultimately, 20 papers were assessed for eligibility, and a further 20 papers were included after a hand search (n_total = 40). Despite the broad heterogeneity of the included studies, evidence of functional changes in brain activity under hypnosis was identified. Electromyography (EMG) startle amplitudes indicated greater activity in the frontal brain area; amplitudes of Somatosensory Event-Related Potentials (SERPs) showed similar results. Electroencephalography (EEG) oscillations of θ activity are positively associated with response to hypnosis. EEG results showed greater amplitudes for highly hypnotizable subjects over the left hemisphere. Less activity during hypnosis was observed in the insula and anterior cingulate cortex (ACC).

    Impact of an in-hospital endocarditis team and a state-wide endocarditis network on perioperative outcomes

    Background: Infective endocarditis (IE) requires multidisciplinary management. We established an endocarditis team within our hospital in 2011 and a state-wide endocarditis network with referring hospitals in 2015. We aimed to investigate their impact on perioperative outcomes. Methods: We retrospectively analyzed data from patients operated on for IE in our center between 01/2007 and 03/2018. To investigate the impact of the endocarditis network on referral latency and pre-operative complications, we divided patients into two eras: before (n = 409) and after (n = 221) 01/2015. To investigate the impact of the endocarditis team on post-operative outcomes, we conducted multivariate binary logistic regression analyses for the whole population. Kaplan–Meier estimates of 5-year survival are reported. Results: In the second era, after establishing the endocarditis network, the median time from symptoms to referral was halved (7 days (interquartile range: 2–19) vs. 15 days (interquartile range: 6–35)), and pre-operative endocarditis-related complications were reduced, i.e., stroke (14% vs. 27%, p < 0.001), heart failure (45% vs. 69%, p < 0.001), cardiac abscesses (24% vs. 34%, p = 0.018), and acute requirement of hemodialysis (8% vs. 14%, p = 0.026). In both eras, a lack of endocarditis team recommendations was an independent predictor of in-hospital mortality (adjusted odds ratio: 2.12, 95% CI: 1.27–3.53, p = 0.004) and post-operative stroke (adjusted odds ratio: 2.23, 95% CI: 1.12–4.39, p = 0.02), and was associated with worse 5-year survival (59% vs. 40%, log-rank p < 0.001). Conclusion: The establishment of an endocarditis network led to earlier referral of patients with fewer pre-operative endocarditis-related complications. Adherence to endocarditis team recommendations was an independent predictor of lower post-operative stroke and in-hospital mortality, and was associated with better 5-year survival.
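
    A minimal sketch (not the authors' code) of how adjusted odds ratios with 95% confidence intervals, as reported above, are typically obtained from a multivariate binary logistic regression, here using Python with statsmodels; the file name, predictor set and column names are hypothetical.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical per-patient data set with a binary in-hospital mortality outcome.
        df = pd.read_csv("ie_surgery_cohort.csv")
        predictors = ["no_team_recommendation", "age", "heart_failure", "prosthetic_valve"]
        X = sm.add_constant(df[predictors])        # add intercept term
        y = df["in_hospital_death"]                # 0/1 outcome

        model = sm.Logit(y, X).fit()
        odds_ratios = np.exp(model.params)         # adjusted odds ratios
        conf_int = np.exp(model.conf_int())        # 95% CIs on the odds-ratio scale
        print(pd.DataFrame({"OR": odds_ratios,
                            "CI low": conf_int[0],
                            "CI high": conf_int[1]}))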

    Visibility in Information Spaces and in Geographic Environments. Post-Proceedings of the KI'11 Workshop (October 4th, 2011, TU Berlin, Germany)

    In these post-proceedings of the workshop "Visibility in Information Spaces and in Geographic Environments", a selection of research papers is presented in which the topic of visibility is addressed in different contexts. Visibility governs information selection in geographic environments as well as in information spaces and in cognition. Users of social media navigate information spaces and, at the same time, as embodied agents, move through geographic environments. Both activities follow a similar type of information economy in which decisions by individuals or groups require highly selective filtering to avoid information overload. In this context, visibility refers to the fact that in social processes some actors, topics or places are more salient than others. Formal notions of visibility include the centrality measures from social network analysis or the plethora of web page ranking methods. Recently, comparable approaches have been proposed to analyse activities in geographic environments: Place Rank, for instance, describes the social visibility of urban places based on the temporal sequence of tourist visit patterns. The workshop aimed to bring together researchers from AI, Geographic Information Science, Cognitive Science, and other disciplines who are interested in understanding how the different forms of visibility in information spaces and geographic environments relate to one another and how the results from basic research can be used to improve spatial search engines, geo-recommender systems or location-based social networks.
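
    A minimal sketch (not from the proceedings) of the formal visibility measures mentioned above, computing PageRank-style and centrality scores over a hypothetical graph of places connected by consecutive tourist visits, using Python with networkx; place names and edge weights are made up for illustration.

        import networkx as nx

        # Hypothetical directed place-transition graph; weights count observed visits.
        transitions = [("museum", "cafe", 120), ("cafe", "park", 80),
                       ("park", "museum", 60), ("cafe", "museum", 90)]
        G = nx.DiGraph()
        G.add_weighted_edges_from(transitions)

        visibility = nx.pagerank(G, weight="weight")     # PageRank-style place visibility
        centrality = nx.in_degree_centrality(G)          # simple centrality alternative
        for place in G.nodes:
            print(f"{place}: pagerank={visibility[place]:.3f}, in-degree={centrality[place]:.3f}")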

    DAPHNE: An Open and Extensible System Infrastructure for Integrated Data Analysis Pipelines

    Integrated data analysis (IDA) pipelines, which combine data management (DM) and query processing, high-performance computing (HPC), and machine learning (ML) training and scoring, are becoming increasingly common in practice. Interestingly, systems from these areas share many compilation and runtime techniques, and the underlying, increasingly heterogeneous hardware infrastructure is converging as well. Yet, the programming paradigms, cluster resource management, data formats and representations, as well as execution strategies differ substantially. DAPHNE is an open and extensible system infrastructure for such IDA pipelines, including language abstractions, compilation and runtime techniques, multi-level scheduling, hardware (HW) accelerators, and computational storage for increasing productivity and eliminating unnecessary overheads. In this paper, we make a case for IDA pipelines, describe the overall DAPHNE system architecture, its key components, and the design of a vectorized execution engine for computational storage, HW accelerators, as well as local and distributed operations. Preliminary experiments comparing DAPHNE with MonetDB, Pandas, DuckDB, and TensorFlow show promising results.
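
    A minimal sketch of the kind of integrated data analysis pipeline DAPHNE targets, mixing a data-management step with ML scoring. It is written against the pandas/NumPy baselines named in the paper's comparison, not against DAPHNE's own DSL or APIs; the input file, columns and model weights are hypothetical.

        import numpy as np
        import pandas as pd

        # Data management / query processing: filter raw records and aggregate features.
        events = pd.read_csv("sensor_events.csv")
        features = (events[events["quality"] > 0.9]
                    .groupby("device_id")[["temp", "vibration", "load"]]
                    .mean())

        # ML scoring: apply a pre-trained linear model to the aggregated features.
        weights = np.array([0.4, 1.3, -0.7])
        features["anomaly_score"] = features.to_numpy() @ weights
        print(features.sort_values("anomaly_score", ascending=False).head())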

    The Mere Exposure Effect in the Domain of Haptics

    Background: Zajonc showed that attitudes towards stimuli to which one has previously been exposed are more positive than attitudes towards novel stimuli. This mere exposure effect (MEE) has been tested extensively using various visual stimuli. Research on the MEE is sparse, however, for other sensory modalities. Methodology/Principal Findings: We used objects of two material categories (stone and wood) and two complexity levels (simple and complex) to test the influence of exposure frequency (F0 = novel stimuli, F2 = stimuli exposed twice, F10 = stimuli exposed ten times) under two sensory modalities (haptics only and haptics & vision). Effects of exposure frequency were found for highly complex stimuli, with liking increasing significantly from F0 to F2 and F10, but only for the stone category. Analysis of "Need for Touch" data showed the MEE in participants with a high need for touch, which suggests different sensitivity or saturation levels of the MEE. Conclusions/Significance: These different sensitivity or saturation levels might also reflect the effects of expertise on the haptic evaluation of objects. It seems that haptic and cross-modal MEEs are influenced by factors similar to those in the visual domain, indicating a common cognitive basis.

    Extending the Implicit Association Test (IAT): Assessing Consumer Attitudes Based on Multi-Dimensional Implicit Associations

    Background: The authors present a procedural extension of the popular Implicit Association Test (IAT; [1]) that allows for indirect measurement of attitudes on multiple dimensions (e.g., safe–unsafe; young–old; innovative–conventional, etc.) rather than on a single evaluative dimension only (e.g., good–bad). Methodology/Principal Findings: In two within-subjects studies, attitudes toward three automobile brands were measured on six attribute dimensions. Emphasis was placed on evaluating the methodological appropriateness of the new procedure, providing strong evidence for its reliability, validity, and sensitivity. Conclusions/Significance: This new procedure yields detailed information on the multifaceted nature of brand associations that can add up to a more abstract overall attitude. Like the IAT, its multi-dimensional extension (dubbed md-IAT) is suited to reliably measuring attitudes consumers may not be consciously aware of, able to express, or willing to share with the researcher [2,3].
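
    A minimal sketch (not the authors' md-IAT scoring procedure) of the generic IAT-style D measure computed separately per attribute dimension: the difference in mean response latencies between the two pairing conditions divided by their pooled standard deviation. The data layout and trimming rule are assumptions for illustration.

        import pandas as pd

        # Hypothetical trial-level data: one row per response with its attribute
        # dimension, pairing condition, and latency in milliseconds.
        trials = pd.read_csv("iat_trials.csv")
        trials = trials[trials["latency_ms"].between(300, 10000)]   # drop extreme latencies

        def d_score(group: pd.DataFrame) -> float:
            compatible = group.loc[group["condition"] == "compatible", "latency_ms"]
            incompatible = group.loc[group["condition"] == "incompatible", "latency_ms"]
            pooled_sd = pd.concat([compatible, incompatible]).std()
            return (incompatible.mean() - compatible.mean()) / pooled_sd

        print(trials.groupby("dimension").apply(d_score))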

    Restricting Glycolysis Preserves T Cell Effector Functions and Augments Checkpoint Therapy

    Tumor-derived lactic acid inhibits T and natural killer (NK) cell function and, thereby, tumor immunosurveillance. Here, we report that melanoma patients with high expression of glycolysis-related genes show worse progression-free survival upon anti-PD1 treatment. The non-steroidal anti-inflammatory drug (NSAID) diclofenac lowers the lactate secretion of tumor cells and improves anti-PD1-induced T cell killing in vitro. Surprisingly, diclofenac, but not other NSAIDs, turns out to be a potent inhibitor of the lactate transporters monocarboxylate transporter 1 and 4 and diminishes lactate efflux. Notably, T cell activation, viability, and effector functions are preserved under diclofenac treatment and in a low-glucose environment in vitro. Diclofenac, but not aspirin, delays tumor growth and improves the efficacy of checkpoint therapy in vivo. Moreover, genetic suppression of glycolysis in tumor cells strongly improves checkpoint therapy. These findings support the rationale for targeting glycolysis in patients with highly glycolytic tumors, together with checkpoint inhibitors, in clinical trials.

    Radiation damage at LHCb, results and expectations

    The LHCb detector is a single-arm spectrometer at the LHC designed to detect new physics by measuring CP violation and rare decays of heavy-flavor mesons. The detector consists of a vertex detector, a tracking system, a dipole magnet, two RICH detectors, an electromagnetic calorimeter, a hadron calorimeter, and a muon detector, which all use different technologies and suffer differently from radiation damage. The radiation damage results and the investigation methods are presented. The delivered luminosity up to July 2011 was about 450 pb⁻¹. The vertex detector receives the highest particle flux at LHCb. The currents drawn by its silicon sensors are, as expected, increasing in proportion to the integrated luminosity. The most highly irradiated regions of the n-bulk silicon sensors are observed to have recently undergone space-charge sign inversion. The Silicon Trackers show increasing leakage currents consistent with earlier predictions. The electromagnetic and hadron calorimeters suffer a percent-level signal decrease, which is monitored to maintain a 1% precision in the energy calibration. The Outer Tracker has observed no beam-induced radiation damage so far. The RICH detectors see no significant irradiation effects, as expected. To investigate irradiation effects in the muon system, additional monitoring tools are in development; with the current tools, no radiation damage has been detected.
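
    The proportionality between sensor currents and integrated luminosity noted above follows the standard parameterization of bulk damage in silicon (a textbook relation, not taken from the talk): the leakage-current increase scales with the 1 MeV neutron-equivalent fluence, which is itself proportional to the delivered integrated luminosity,

        \Delta I = \alpha \, \Phi_{\mathrm{eq}} \, V , \qquad \Phi_{\mathrm{eq}} \propto \int \mathcal{L} \, dt ,

    where \alpha is the current-related damage constant (quoted at a reference temperature and annealing state) and V is the depleted sensor volume.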

    FPGA Compute Acceleration for High-Throughput Data Processing in High-Energy Physics Experiments

    The upgrades of the four large LHC experiments at CERN in the coming years will result in a huge increase in data bandwidth for each experiment, which needs to be processed very efficiently. For example, the LHCb experiment will upgrade its detector in 2019/2020 to a 'triggerless' readout scheme, in which all of the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the event filter farm to 40 TBit/s, which must be processed to select the interesting proton-proton collisions for later storage. Designing the architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging task, and several compute accelerator technologies are being considered. In the high-performance computing sector, more and more FPGA compute accelerators are being used to improve compute performance and reduce power consumption (e.g. in the Microsoft Catapult project, the Bing search engine and Amazon EC2 F1 instances). Therefore, different types of FPGA compute accelerators with compute performance of up to 10 TFlops are being developed by different companies. Many of these accelerators are already available on the market or will become available very soon, like the Intel Xeon+FPGAs. In addition, simpler programming models like OpenCL are becoming more popular for application development, allowing easier programming and maintenance of the FPGA firmware in a high-level language. FPGAs are especially interesting in terms of performance per joule, which is very important for larger data centers, where power and cooling costs drive the total operational budget. In this talk, different performance studies with HEP algorithms are shown and discussed, which were performed for the LHCb upgrade in the High-Throughput-Compute Collaboration with Intel. Different types of FPGA compute accelerators are compared, including studies of their programming model with OpenCL and Verilog. In addition, the performance per joule is discussed. The results show that the new Intel Xeon+FPGA platforms, which are built in general for high-performance computing, are very interesting for the High Energy Physics community. At the end of the talk, an outlook is given on the near-future development of FPGA compute acceleration.
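
    A minimal sketch of the OpenCL programming model discussed above, using a toy vector kernel driven from Python with PyOpenCL (not one of the LHCb HEP algorithms). On FPGA accelerators the same kernel source would typically be compiled offline with the vendor's FPGA OpenCL toolchain rather than just-in-time as here.

        import numpy as np
        import pyopencl as cl

        kernel_src = """
        __kernel void scale_and_add(__global const float *a,
                                    __global const float *b,
                                    __global float *out,
                                    const float factor)
        {
            int gid = get_global_id(0);
            out[gid] = factor * a[gid] + b[gid];
        }
        """

        a = np.random.rand(1 << 16).astype(np.float32)
        b = np.random.rand(1 << 16).astype(np.float32)
        out = np.empty_like(a)

        ctx = cl.create_some_context()                 # pick any available OpenCL device
        queue = cl.CommandQueue(ctx)
        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

        prog = cl.Program(ctx, kernel_src).build()
        prog.scale_and_add(queue, a.shape, None, a_buf, b_buf, out_buf, np.float32(2.0))
        cl.enqueue_copy(queue, out, out_buf)
        print(np.allclose(out, 2.0 * a + b))           # verify against the host computation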