Cognitive changes in patients with epilepsy identified through the MoCA test during neurology outpatient consultation
Introduction
Epilepsy is a chronic neurological disorder that may occur alongside cognitive changes, with effects on multiple cognitive domains.
Objective
To compare the cognitive performance of patients with epilepsy and healthy controls using the Montreal Cognitive Assessment (MoCA) during outpatient consultation at a reference diagnostic center in Colombia, and to analyze the influencing factors.
Materials and methods
One hundred and four patients attending neurology outpatient consultation in the city of Cartagena, Colombia, were assessed with the MoCA test: 54 people who consulted for headache and had not been diagnosed with epilepsy (NEP) and 50 with a diagnosis of epilepsy (EP) according to the diagnostic criteria of the International League Against Epilepsy (ILAE).
Results
Significant differences were found in the total mean MoCA scores between the EP and NEP groups (t = 4.72; p < 0.01), particularly in the attention (t = 3.22; p < 0.02) and memory (t = 5.04; p < 0.01) domains. Additionally, a significant association was observed between years of schooling and MoCA scores (p = 0.019), but not between MoCA scores and socioeconomic level (p = 0.510), age (p = 0.452), or seizure frequency (p = 0.471).
Discussion
Patients with epilepsy showed lower scores in several cognitive domains compared with the control group. The MoCA proved appropriate for cognitive screening in the context of clinical neurology outpatient consultation.
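The group comparisons above rely on independent two-sample t-tests. A minimal sketch of that kind of analysis, using `scipy.stats.ttest_ind` on synthetic scores (the numbers below are invented for illustration and are not the study's data):

```python
# Illustrative two-sample t-test of the kind used to compare MoCA scores
# between an epilepsy group and a control group. All values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
moca_ep = rng.normal(21, 4, size=50)   # hypothetical epilepsy-group scores
moca_nep = rng.normal(25, 3, size=54)  # hypothetical control-group scores

t_stat, p_value = stats.ttest_ind(moca_ep, moca_nep)
print(t_stat, p_value)  # negative t: epilepsy group scores lower on average
```

A real analysis would also check the equal-variance assumption (e.g. via `equal_var=False` for Welch's t-test) before reporting t and p values.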
Measurement of jet suppression in central Pb-Pb collisions at √s_NN = 2.76 TeV
The transverse momentum (p_T) spectrum and nuclear modification factor (R_AA) of reconstructed jets in 0-10% and 10-30% central Pb-Pb collisions at √s_NN = 2.76 TeV were measured. Jets were reconstructed from charged and neutral particles using the anti-k_T jet algorithm with a resolution parameter of R = 0.2, utilizing the ALICE tracking detectors and Electromagnetic Calorimeter (EMCal). The jet p_T spectra are reported in the pseudorapidity interval |η_jet| < 0.5, and reconstructed jets were required to contain a leading charged particle with p_T > 5 GeV/c to suppress jets constructed from the combinatorial background in Pb-Pb collisions. The leading-charged-particle requirement applied to jet spectra in both pp and Pb-Pb collisions had a negligible effect on R_AA. The nuclear modification factor R_AA was found to be 0.28 ± 0.04 in 0-10% and 0.35 ± 0.04 in 10-30% collisions, independent of p_T,jet within the uncertainties of the measurement. The observed suppression is in fair agreement with expectations from two model calculations with different approaches to jet quenching. © 2015 CERN for the benefit of the ALICE Collaboration. Published by Elsevier B.V.
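The quoted R_AA values follow the standard definition R_AA = (1/⟨N_coll⟩) · (dN_AA/dp_T) / (dN_pp/dp_T). A hedged sketch of that per-bin ratio; the spectra and ⟨N_coll⟩ below are made-up illustrative numbers, not ALICE data:

```python
# Nuclear modification factor from binned per-event jet yields.
# R_AA < 1 indicates suppression relative to N_coll-scaled pp collisions.
import numpy as np

def nuclear_modification_factor(yield_aa, yield_pp, n_coll):
    """Per-pT-bin R_AA: AA yield over N_coll-scaled pp yield."""
    return yield_aa / (n_coll * yield_pp)

n_coll = 1500.0                                   # illustrative <N_coll>
yield_aa = np.array([4.5e-5, 1.5e-5, 6.0e-6])     # per-event yield, Pb-Pb
yield_pp = np.array([1.0e-7, 3.5e-8, 1.5e-8])     # per-event yield, pp
r_aa = nuclear_modification_factor(yield_aa, yield_pp, n_coll)
print(r_aa)  # values well below 1, i.e. suppressed
```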
Highly-parallelized simulation of a pixelated LArTPC on a GPU
The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared with data from a pixel-readout LArTPC prototype.
Squeezing the angular momentum of an ensemble of complex multilevel atoms
Squeezing of collective atomic spins has been shown to improve the sensitivity of atomic clocks and magnetometers to levels significantly below the standard quantum limit. In most cases the requisite atom-atom entanglement has been generated by dispersive interaction with a quantized probe field or by state-dependent collisions in a quantum gas. Such experiments typically use complex multilevel atoms like Rb or Cs, with the relevant interactions designed so that atoms behave like pseudospin-1/2 particles. We demonstrate the viability of spin squeezing for collective spins composed of the physical angular momenta of 10^6 Cs atoms, each in an internal spin-4 hyperfine state. A peak metrological squeezing of at least 5 dB is generated by quantum backaction from a dispersive quantum nondemolition (QND) measurement, implemented using a two-color optical probe that minimizes tensor light shifts without sacrificing measurement strength. Other significant developments include the successful application of composite pulse techniques for accurate dynamical control of the collective spin, enabled by broadband suppression of background magnetic fields inside a state-of-the-art magnetic shield. The absence of classical noise allows us to compare the observed quantum projection noise and squeezing to a theoretical model that properly accounts for both the relevant atomic physics and the spatial mode of the collective spin, finding good quantitative agreement and thereby validating its use in other contexts. Our work sets the stage for experiments on quantum feedback, deterministic squeezing, and closed-loop magnetometry. The implementation of real-time feedback may also create an opportunity for new types of quantum simulation, wherein the evolution of a quantum system is conditioned on the outcome of a time-continuous QND measurement.
Such a scheme has the potential to access new regimes near the quantum-classical boundary, with opportunities to study long-standing issues related to quantum-classical correspondence in chaotic systems. © 2021 American Physical Society.
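The "metrological squeezing in dB" quoted above is conventionally the Wineland parameter ξ² = N (ΔJ_z)² / |⟨J_x⟩|², with squeezing in dB given by −10·log₁₀(ξ²). A sketch with illustrative numbers only (not the experiment's data), evaluated for a simple spin-1/2 coherent-state baseline rather than the spin-4 atoms used in the work:

```python
# Wineland metrological squeezing parameter, evaluated on made-up numbers.
# xi^2 = N * var(Jz) / <Jx>^2 ; squeezing in dB is -10*log10(xi^2).
import numpy as np

def metrological_squeezing_db(n_atoms, var_jz, mean_jx):
    xi_sq = n_atoms * var_jz / mean_jx**2
    return -10.0 * np.log10(xi_sq)

N = 1_000_000
# Coherent spin state of N spin-1/2 atoms: var(Jz) = N/4, <Jx> = N/2,
# giving xi^2 = 1, i.e. 0 dB (the standard quantum limit).
print(metrological_squeezing_db(N, N / 4, N / 2))
# Backaction reducing var(Jz) by ~3.2x with a small contrast loss:
print(metrological_squeezing_db(N, N / 4 / 3.2, 0.98 * N / 2))
```

Positive dB values indicate sensitivity below the standard quantum limit; contrast loss (reduced ⟨J_x⟩) eats into the gain from the reduced variance.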
Reading the "Statistics" volume of the Complete Works of Economics (經濟學全集「統計學」を讀む)
39 pages, 11 captioned figures, 8 tables (5 of them in Appendix A), authors from page 33, submitted to JHEP; figures at http://aliceinfo.cern.ch/ArtSubmission/node/2359; see paper for the full list of authors.
The measurement of prompt D-meson production as a function of multiplicity in p-Pb collisions at √s_NN = 5.02 TeV with the ALICE detector at the LHC is reported. D0, D+ and D*+ mesons are reconstructed via their hadronic decay channels in the centre-of-mass rapidity and transverse momentum intervals covered by the analysis. The multiplicity dependence of D-meson production is examined either by comparing yields in p-Pb collisions in different event classes, selected based on the multiplicity of produced particles or on the zero-degree energy, with those in pp collisions scaled by the number of binary nucleon-nucleon collisions (nuclear modification factor), or by evaluating the per-event yields in p-Pb collisions in different multiplicity intervals normalised to the multiplicity-integrated ones (relative yields). The nuclear modification factors for D0, D+ and D*+ are consistent with one another. The D-meson nuclear modification factors as a function of the zero-degree energy are consistent with unity within uncertainties in the measured regions and event classes. The relative D-meson yields, calculated in various intervals, increase as a function of the charged-particle multiplicity. The results are compared with the equivalent pp measurements as well as with EPOS 3 calculations.