
    Evaporation-induced self-assembly of gold nanorods on a hydrophobic substrate for surface enhanced Raman spectroscopy applications

    The controllable assembly of plasmonic nanoparticles has emerged as one of the most significant approaches for surface-enhanced Raman spectroscopy (SERS) applications. This study developed a simple approach to produce a large-scale ordered assembly of gold nanorods (GNRs) by controlling the droplet evaporation mode on hydrophobic substrates. The hydrophobic substrate was efficiently produced by spin-coating silicone oil onto glass slides and annealing them. The analyte molecule rhodamine 6G (R6G) was employed as a SERS probe to demonstrate the potential of the synthesized arrays. This hydrophobic platform concentrates and delivers analyte molecules into the SERS-sensitive site while suppressing the coffee-ring effect, owing to the smooth contraction of the droplet's base contact radius without any pinning. Thus, the limit of detection (LOD) for R6G was lowered to 10⁻¹⁰ M, and the homogeneous distribution of SERS hotspots within the self-assembled array yielded a reproducible SERS signal. This method also enables a broad range of packing patterns and mechanisms by changing the host nanoparticles in the dispersion.

    Specification of Complex Logical Expressions for Task Automation: An EUD Approach

    The growing availability of smart objects is stimulating researchers to investigate the IoT phenomenon from different perspectives. In the HCI area, and in particular from the EUD perspective, one prominent goal is to enable non-technical users to be directly involved in configuring smart object behavior. In this respect, this paper discusses three visual composition techniques for specifying logical expressions in the Event-Condition-Action rules used to synchronize the behavior of smart objects.
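    The Event-Condition-Action rules mentioned above can be sketched in code. This is a minimal illustrative model, not the paper's implementation: the rule structure, the event dictionary, and the AND/OR combination mode are all assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ECARule:
    """A hypothetical Event-Condition-Action rule for a smart object."""
    event: str                                # triggering event type
    conditions: List[Callable[[dict], bool]]  # logical expression as predicates
    action: Callable[[dict], None]            # behavior to execute when the rule fires
    mode: str = "AND"                         # how the predicates are combined

    def fire(self, evt: dict) -> bool:
        # The rule only reacts to its own event type.
        if evt.get("type") != self.event:
            return False
        results = [cond(evt) for cond in self.conditions]
        ok = all(results) if self.mode == "AND" else any(results)
        if ok:
            self.action(evt)
        return ok

# Example: turn the lights on when motion is detected, it is dark, and it is evening.
log = []
rule = ECARule(
    event="motion_detected",
    conditions=[lambda e: e["lux"] < 50, lambda e: e["hour"] >= 18],
    action=lambda e: log.append("lights_on"),
)
rule.fire({"type": "motion_detected", "lux": 20, "hour": 21})
```

    Visual composition techniques such as those discussed in the paper would let an end user assemble the `conditions` list and choose the combination mode without writing such code by hand.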

    The CrowdHEALTH project and the Holistic Health Records: Collective Wisdom Driving Public Health Policies.

    Introduction: With the expansion of available Information and Communication Technology (ICT) services, a plethora of data sources provide structured and unstructured data used to detect certain health conditions or indicators of disease. These data are spread across various settings and are stored and managed in different systems. Due to the lack of technology interoperability and the large amounts of health-related data, data exploitation has not yet reached its full potential. Aim: The aim of the CrowdHEALTH approach is to introduce a new paradigm of Holistic Health Records (HHRs) that include all health determinants defining health status, using big data management mechanisms. Methods: HHRs are transformed into HHR clusters capturing the clinical, social and human context, with the aim of benefiting from the collective knowledge. The presented approach integrates big data technologies, providing Data as a Service (DaaS) to healthcare professionals and policy makers towards a "health in all policies" approach. A toolkit on top of the DaaS, providing mechanisms for causal and risk analysis and for the compilation of predictions, is being developed. Results: The CrowdHEALTH platform is based on three main pillars: data and structures, health analytics, and policies. Conclusions: We presented a holistic approach for capturing all health determinants in the proposed HHRs, clustering them to exploit collective knowledge and provide insight for different population segments according to different factors (e.g. location, occupation, medication status, emerging risks). The approach is under evaluation through different scenarios with heterogeneous data from multiple sources.
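    The idea of grouping records into clusters by a health determinant and aggregating an indicator per cluster can be sketched as follows. This is an illustrative toy, not the CrowdHEALTH implementation: the record fields, the BMI indicator, and the grouping key are all invented for the example.

```python
from collections import defaultdict
from statistics import mean

# Toy "holistic health records": each record carries several determinants.
records = [
    {"location": "Athens", "occupation": "nurse",  "bmi": 24.1},
    {"location": "Athens", "occupation": "driver", "bmi": 28.3},
    {"location": "Madrid", "occupation": "nurse",  "bmi": 22.7},
]

def cluster_by(recs, key):
    """Group records into clusters by the value of one determinant."""
    clusters = defaultdict(list)
    for r in recs:
        clusters[r[key]].append(r)
    return clusters

# Cluster by location and derive a per-cluster indicator (mean BMI).
clusters = cluster_by(records, "location")
avg_bmi = {loc: round(mean(r["bmi"] for r in rs), 1) for loc, rs in clusters.items()}
```

    Swapping the grouping key (e.g. `"occupation"`) yields a different population segmentation over the same records, which is the sense in which clusters expose collective knowledge for different factors.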

    CrowdHEALTH: Holistic Health Records and Big Data Analytics for Health Policy Making and Personalized Health.

    Today's rich digital information environment is characterized by a multitude of data sources providing information that has not yet reached its full potential in eHealth. The aim of the presented approach, namely CrowdHEALTH, is to introduce a new paradigm of Holistic Health Records (HHRs) that include all health determinants. HHRs are transformed into HHR clusters capturing the clinical, social and human context of population segments and, as a result, collective knowledge for different factors. The proposed approach also seamlessly integrates big data technologies across the complete data path, providing Data as a Service (DaaS) to the health ecosystem stakeholders, as well as to policy makers towards a "health in all policies" approach. Cross-domain co-creation of policies is made feasible through a rich toolkit, provided on top of the DaaS, incorporating mechanisms for causal and risk analysis and for the compilation of predictions.

    Remote heart rate monitoring - Assessment of the Facereader rPPg by Noldus

    Remote photoplethysmography (rPPG) allows contactless monitoring of human cardiac activity through a video camera. In this study, we assessed the accuracy and precision of heart rate measurements by the only consumer product available on the market, namely the Facereader™ rPPG by Noldus, with respect to a gold-standard electrocardiograph (ECG). Twenty-four healthy participants were asked to sit in front of a computer screen and alternate two periods of rest with two stress tests (i.e. a Go/No-Go task), while their heart rate was simultaneously acquired for 20 minutes using the ECG criterion measure and the Facereader™ rPPG. Results show that the Facereader™ rPPG tends to overestimate lower heart rates and underestimate higher heart rates compared to the ECG. The Facereader™ rPPG revealed a mean bias of 9.8 bpm, with 95% limits of agreement (LoA) ranging from almost -30 up to +50 bpm. These results suggest that while the Facereader™ rPPG technology has potential for contactless heart rate monitoring, its predictions are inaccurate for higher heart rates, with unacceptable precision across the entire range, rendering its estimates unreliable for monitoring individuals.
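    The mean bias and 95% limits of agreement reported above are the standard Bland-Altman agreement statistics for paired measurements: bias is the mean of the device-minus-reference differences, and the LoA are bias ± 1.96 times their standard deviation. A minimal sketch, using synthetic heart-rate pairs rather than the study's data:

```python
from statistics import mean, stdev

# Synthetic paired heart-rate readings (bpm); not the study's data.
rppg = [72, 80, 95, 110, 65, 88]   # device under test
ecg  = [70, 76, 84, 96, 66, 80]    # gold-standard reference

# Bland-Altman statistics on the per-pair differences.
diffs = [a - b for a, b in zip(rppg, ecg)]
bias = mean(diffs)                       # systematic over/underestimation
sd = stdev(diffs)                        # sample SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
```

    A wide LoA interval, as found in the study (roughly -30 to +50 bpm), means individual readings can deviate substantially from the ECG even when the mean bias looks moderate.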

    Triple-GEM discharge probability studies at CHARM: Simulations and experimental results

    The CMS muon system in the region 2.03 < |η| < 2.82 is characterized by a very harsh radiation environment, which can generate hit rates up to 144 kHz/cm² and an integrated charge of 8 C/cm² over ten years of operation. In order to increase the detector performance and acceptance for physics events including muons, a new muon station (ME0) has been proposed for installation in that region. The proposed technology is the Triple-Gas Electron Multiplier (Triple-GEM), which has already been qualified for operation in the CMS muon system. However, an additional set of studies focused on the discharge probability is necessary for the ME0 station, because of the harsh radiation environment mentioned above. A test was carried out in 2017 at the CERN High energy AcceleRator Mixed (CHARM) facility, with the aim of estimating the discharge probability of Triple-GEM detectors in a very intense radiation field environment, similar to that of the CMS muon system. A dedicated standalone Geant4 simulation was performed in parallel, to evaluate the expected behavior of the detector exposed to the CHARM field. The geometry of the detector was carefully reproduced, as well as the background field present in the facility. This paper presents the results obtained from the Geant4 simulation, in terms of the sensitivity of the detector to the CHARM environment, together with an analysis of the energy deposited in the gaps and of the processes developing inside the detector. The discharge probability test performed at CHARM is also presented, with a complete discussion of the results, which turn out to be consistent with measurements performed by other groups.
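    A discharge probability of the kind estimated in such tests is, at its simplest, the ratio of observed discharges to incident particles, with a binomial uncertainty. The sketch below uses entirely hypothetical counts, not the CHARM test's data:

```python
from math import sqrt

# Hypothetical counts for illustration only (not from the CHARM test).
n_particles = 5_000_000   # particles traversing the detector
n_discharges = 3          # discharges observed during the exposure

# Per-particle discharge probability and its simple binomial uncertainty.
p = n_discharges / n_particles
err = sqrt(p * (1 - p) / n_particles)
```

    With rare events like discharges, asymmetric (e.g. Poisson-based) intervals are usually preferred over this symmetric binomial error, but the point estimate is the same ratio.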

    Detector Control System for the GE1/1 slice test

    Gas Electron Multiplier (GEM) technology, in particular the triple-GEM, was selected for the upgrade of the CMS endcap muon system following several years of intense R&D effort. The triple-GEM chambers (GE1/1) are being installed at station 1 during the second long shutdown, with the goal of reducing the Level-1 muon trigger rate and improving the tracking performance in the harsh radiation environment foreseen in future LHC operation [1]. A first installation of a demonstrator system started at the beginning of 2017: 10 triple-GEM detectors were installed in the CMS muon system with the aim of gaining operational experience and demonstrating the integration of the GE1/1 system into the trigger. In this context, a dedicated Detector Control System (DCS) has been developed to control and monitor the installed detectors and integrate them into CMS operation. This paper presents the slice test DCS, describing in detail the different parts of the system and their implementation.

    Impact of magnetic field on the stability of the CMS GE1/1 GEM detector operation

    The Gas Electron Multiplier (GEM) detectors of the GE1/1 station of the CMS experiment were operated in the CMS magnetic field for the first time on the 7th of October 2021. During the magnetic field ramps, several discharge phenomena were observed, leading to instability in the GEM High Voltage (HV) power system. In order to reproduce this behavior, a dedicated test was conducted at the CERN North Area with the Goliath magnet, using four GE1/1 spare chambers. The test consisted of studying the characteristics of discharge events occurring in different detector configurations and external conditions. Multiple magnetic field ramps were performed in sequence, and patterns in the evolution of the discharge rates were observed in these data. The goal of this test is to understand the experimental conditions inducing discharges and short circuits in a GEM foil. The results led to the development of a procedure for the optimal operation and performance of GEM detectors in the CMS experiment during magnet ramps. Another important result is the estimation of the probability of short circuit generation at 68% confidence level: p_short(HV OFF) = 0.42 (+0.94 / -0.35)% with the detector HV off, and p_short(HV ON) < 0.49% with the HV on. These numbers are specific to the detectors used during this test, but they provide a first quantitative indication of the phenomenon and a point of comparison for future studies adopting the same procedure.
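    An upper limit such as the HV-on bound quoted above typically arises when zero events are observed: for a Poisson process, seeing zero events bounds the expected count at N_up = -ln(1 - CL), which divided by the number of trials gives a per-trial probability limit. A sketch with a hypothetical trial count (the actual number of ramps is not stated in the abstract):

```python
from math import log

def upper_limit_zero_events(n_trials: int, cl: float = 0.68) -> float:
    """Upper limit on a per-trial probability when 0 events are observed,
    from the Poisson relation N_up = -ln(1 - CL)."""
    return -log(1.0 - cl) / n_trials

# Hypothetical: 0 short circuits over 233 HV-on magnet ramps.
p_up = upper_limit_zero_events(233)   # per-ramp probability upper limit
```

    At 68% CL the Poisson factor is -ln(0.32) ≈ 1.14, so the limit scales inversely with the number of trials; more ramps without a short circuit tighten the bound proportionally.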

    Benchmarking LHC background particle simulation with the CMS triple-GEM detector

    In 2018, a system of large-size triple-GEM demonstrator chambers was installed in the CMS experiment at CERN's Large Hadron Collider (LHC). The demonstrator's design mimics that of the final detector, installed for Run 3. A successful Monte Carlo (MC) simulation of the collision-induced background hit rate in this system in proton-proton collisions at 13 TeV is presented. The MC predictions are compared to CMS measurements recorded at an instantaneous luminosity of 1.5×10³⁴ cm⁻² s⁻¹. The simulation framework uses a combination of the FLUKA and GEANT4 packages. FLUKA simulates the radiation environment around the GE1/1 chambers. The particle flux provided by FLUKA covers energy spectra ranging from 10⁻¹¹ to 10⁴ MeV for neutrons, 10⁻³ to 10⁴ MeV for γ's, 10⁻² to 10⁴ MeV for e±, and 10⁻¹ to 10⁴ MeV for charged hadrons. GEANT4 provides an estimate of the detector response (sensitivity) based on an accurate description of the detector geometry, the material composition, and the interaction of particles with the detector layers. The detector hit rate, as obtained from the simulation using FLUKA and GEANT4, is estimated as a function of the perpendicular distance from the beam line and agrees with data within the assigned uncertainties in the range 13.7-14.5%. This simulation framework can be used to obtain a reliable estimate of the background rates expected at the High Luminosity LHC.
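    The FLUKA-plus-GEANT4 scheme described above amounts to folding a particle flux spectrum with an energy-dependent detector sensitivity: the hit rate is the sum over energy bins of flux times hit probability. The sketch below shows this folding with made-up bin values; the real spectra and sensitivities come from FLUKA and GEANT4, not from these numbers.

```python
# Coarse energy bins (MeV) with made-up values for one particle species.
flux = {          # particles / (cm^2 s) per bin, from the radiation simulation
    1e-3: 5.0e4,
    1e-1: 2.0e4,
    1e+1: 3.0e3,
}
sensitivity = {   # probability a particle in this bin produces a detector hit
    1e-3: 1.0e-3,
    1e-1: 5.0e-3,
    1e+1: 2.0e-2,
}

# Hit rate per unit area = sum over bins of (flux x sensitivity).
hit_rate = sum(flux[e] * sensitivity[e] for e in flux)
```

    In the actual framework this folding is repeated per particle species (neutrons, γ's, e±, charged hadrons) and per radial position, which is how the rate-versus-distance-from-beam-line curve is built.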

    Modeling the triple-GEM detector response to background particles for the CMS Experiment

    An estimate of the environmental background hit rate on triple-GEM chambers is performed using Monte Carlo (MC) simulation and compared to data taken by test chambers installed in the CMS experiment (GE1/1) during Run 2 at the Large Hadron Collider (LHC). The hit rate is measured using data collected in proton-proton collisions at 13 TeV and a luminosity of 1.5×10³⁴ cm⁻² s⁻¹. The simulation framework uses a combination of the FLUKA and Geant4 packages to obtain the hit rate. FLUKA provides the radiation environment around the GE1/1 chambers, which comprises the particle flux with momentum direction and energy spectra ranging from 10⁻¹¹ to 10⁴ MeV for neutrons, 10⁻³ to 10⁴ MeV for γ's, 10⁻² to 10⁴ MeV for e±, and 10⁻¹ to 10⁴ MeV for charged hadrons. Geant4 provides an estimate of the detector response (sensitivity) based on an accurate description of the detector geometry, the material composition, and the interaction of particles with the various detector layers. The MC-simulated hit rate is estimated as a function of the perpendicular distance from the beam line and agrees with data within the assigned uncertainties of 10-14.5%. This simulation framework can be used to obtain a reliable estimate of the background rates expected at the High Luminosity LHC.