
    First in-beam studies of a Resistive-Plate WELL gaseous multiplier

    We present the results of the first in-beam studies of a medium-size (10×10 cm²) Resistive-Plate WELL (RPWELL): a single-sided THGEM coupled to a pad anode through a resistive layer of high bulk resistivity (~10⁹ Ω·cm). The 6.2 mm thick (excluding readout electronics) single-stage detector was studied with 150 GeV muons and pions. Signals were recorded from 1×1 cm² square copper pads with APV25-SRS readout electronics. The single-element detector was operated in Ne/(5% CH₄) at a gas gain of a few times 10⁴, reaching 99% detection efficiency at an average pad multiplicity of ~1.2. Operation at particle fluxes up to ~10⁴ Hz/cm² resulted in a ~23% gain drop, leading to a ~5% efficiency loss. The striking feature was the discharge-free operation, also in intense pion beams. These results pave the way towards robust, efficient large-scale detectors for applications requiring economic solutions at moderate spatial and energy resolutions.
    Comment: Accepted by JINS

    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of those changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz), resulting in reduced motion blur. Hence, event cameras have large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as those demanding low latency, high speed, or high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to exploit the outstanding properties of event cameras. We present event cameras starting from their working principle, the sensors that are currently available, and the tasks they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
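    The working principle described above, in which each pixel independently emits a timestamped, signed event whenever its log-brightness changes by more than a contrast threshold, can be sketched as a simulation driven by a stack of ordinary frames. This is a minimal illustrative model, not the behaviour of any specific sensor; the threshold value and the per-pixel reference scheme are assumptions:

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.2):
    """Simulate an ideal event camera from a (T, H, W) stack of
    intensity frames in [0, 1].

    An event (t, x, y, polarity) is emitted whenever the log-intensity
    at a pixel has changed by at least `threshold` since the last event
    fired at that pixel.
    """
    log_frames = np.log(frames + 1e-6)
    reference = log_frames[0].copy()  # log-intensity at each pixel's last event
    events = []
    for t in range(1, len(frames)):
        diff = log_frames[t] - reference
        fired = np.abs(diff) >= threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((timestamps[t], x, y, polarity))
            reference[y, x] = log_frames[t, y, x]  # reset this pixel's reference
    return events
```

    A brightening pixel thus produces a positive event and a darkening pixel a negative one, while static pixels stay silent, which is why the output data rate scales with scene motion rather than with frame rate.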

    DART: Distribution Aware Retinal Transform for Event-based Cameras

    We introduce a generic visual descriptor, termed the distribution-aware retinal transform (DART), that encodes the structural context of events using log-polar grids for event cameras. The DART descriptor is applied to four different problems, namely object classification, tracking, detection, and feature matching: (1) The DART features are directly employed as local descriptors in a bag-of-features classification framework, and testing is carried out on four standard event-based object datasets (N-MNIST, MNIST-DVS, CIFAR10-DVS, N-Caltech101). (2) Extending the classification system, tracking is demonstrated using two key novelties: (i) to overcome the low-sample problem in the one-shot learning of a binary classifier, statistical bootstrapping is leveraged with online learning; (ii) to achieve tracker robustness, the scale and rotation equivariance of the DART descriptors is exploited for the one-shot learning. (3) To solve the long-term object tracking problem, an object detector is designed using the principle of cluster majority voting. The detection scheme is then combined with the tracker, yielding a high intersection-over-union score against augmented ground-truth annotations on the publicly available event camera dataset. (4) Finally, the event context encoded by DART greatly simplifies the feature correspondence problem, especially for spatio-temporal slices far apart in time, which has not been explicitly tackled in the event-based vision domain.
    Comment: 12 pages, revision submitted to TPAMI in Nov 201
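    The core idea of binning events on a log-polar grid around a point of interest can be illustrated with a short sketch. This is a generic log-polar event histogram under assumed parameters (ring count, wedge count, maximum radius), not the actual DART construction, whose details are given in the paper:

```python
import numpy as np

def log_polar_histogram(events_xy, center, n_rings=4, n_wedges=8, r_max=32.0):
    """Bin event coordinates into a log-polar grid around `center`.

    `events_xy` is an (N, 2) array of pixel coordinates. Returns an
    (n_rings, n_wedges) count histogram: rings are log-spaced in
    radius up to r_max, wedges are uniform in angle.
    """
    dx = events_xy[:, 0] - center[0]
    dy = events_xy[:, 1] - center[1]
    r = np.hypot(dx, dy)
    theta = np.arctan2(dy, dx)  # angle in (-pi, pi]

    keep = (r > 0) & (r <= r_max)
    # log-spaced ring index: log(r) / log(r_max) maps (0, r_max] to (-inf, 1]
    ring = np.floor(np.log(r[keep]) / np.log(r_max) * n_rings).astype(int)
    ring = np.clip(ring, 0, n_rings - 1)
    wedge = np.floor((theta[keep] + np.pi) / (2 * np.pi) * n_wedges).astype(int)
    wedge = np.clip(wedge, 0, n_wedges - 1)

    hist = np.zeros((n_rings, n_wedges), dtype=int)
    np.add.at(hist, (ring, wedge), 1)  # unbuffered accumulation per bin
    return hist
```

    Because the ring spacing is logarithmic, structure near the center is sampled finely while far structure is pooled coarsely, which is what makes such descriptors comparatively tolerant to scale change.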

    Prospects in MPGDs development for neutron detection

    The aim of this document is to summarise the discussion and the contributions from the 2nd Academia-Industry Matching Event on Detecting Neutrons with MPGDs, which took place at CERN on the 16th and 17th of March 2015. These events provide a platform for discussing the prospects of Micro-Pattern Gaseous Detectors (MPGDs) for thermal and fast neutron detection, commercial constraints, and possible solutions. The aim is to foster collaboration between the particle physics community, neutron detector users, instrument scientists, and manufacturers.

    leave a trace - A People Tracking System Meets Anomaly Detection

    Video surveillance has always had a negative connotation, among other reasons because of the loss of privacy and because it does not automatically increase public safety. If it were able to detect atypical (i.e. dangerous) situations in real time, autonomously and anonymously, this could change. A prerequisite is the reliable automatic detection of possibly dangerous situations from video data. This is done classically by object extraction and tracking. From the derived trajectories, we then want to identify dangerous situations by detecting atypical trajectories. However, for ethical reasons it is better to develop such a system on data in which no people are threatened or harmed, and in which they know that such a tracking system is installed. Another important point is that these situations occur rarely in real, public CCTV areas and are captured properly even less often. In the artistic project leave a trace, the tracked objects, people in an atrium of an institutional building, become actors and thus part of the installation. Real-time visualisation allows interaction by these actors, which in turn creates many atypical interaction situations on which we can develop our situation detection. The data set has evolved over three years and is therefore huge. In this article we describe the tracking system and several approaches for the detection of atypical trajectories.
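    One simple way to flag atypical trajectories, as the abstract envisions, is to score each trajectory by its distance to its nearest neighbours in the data set, so that trajectories unlike all others stand out. The distance measure and the nearest-neighbour scoring below are illustrative assumptions, not the approaches developed in the article:

```python
import numpy as np

def trajectory_distance(a, b):
    """Symmetric mean closest-point distance between two trajectories,
    each an (N, 2) array of x/y positions."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

def atypicality_scores(trajectories, k=1):
    """Score each trajectory by the mean distance to its k nearest
    neighbours; large scores flag atypical trajectories."""
    n = len(trajectories)
    scores = np.zeros(n)
    for i in range(n):
        dists = sorted(trajectory_distance(trajectories[i], trajectories[j])
                       for j in range(n) if j != i)
        scores[i] = np.mean(dists[:k])
    return scores
```

    A threshold on these scores then separates "typical" paths through the atrium from outliers; in practice one would normalise for trajectory length and sampling rate before comparing.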

    The HPS electromagnetic calorimeter

    The Heavy Photon Search experiment (HPS) is searching for a new gauge boson, the so-called “heavy photon.” Through its kinetic mixing with the Standard Model photon, this particle could decay into an electron-positron pair. It would then be detectable as a narrow peak in the invariant mass spectrum of such pairs or, depending on its lifetime, by a decay downstream of the production target. The HPS experiment is installed in Hall-B of Jefferson Lab. This article presents the design and performance of one of the two detectors of the experiment, the electromagnetic calorimeter, during the runs performed in 2015–2016. The calorimeter's main purpose is to provide a fast trigger and to reduce the copious background from electromagnetic processes through matching with a tracking detector. The detector is a homogeneous calorimeter made of 442 lead-tungstate (PbWO₄) scintillating crystals, each read out by an avalanche photodiode coupled to a custom trans-impedance amplifier.