
    LoCuSS: Hydrostatic Mass Measurements of the High-L_X Cluster Sample -- Cross-calibration of Chandra and XMM-Newton

    We present a consistent analysis of Chandra and XMM-Newton observations of an approximately mass-selected sample of 50 galaxy clusters at 0.15 < z < 0.3 -- the "LoCuSS High-L_X Sample". We apply the same analysis methods to data from both satellites, including newly developed analytic background models that predict the spatial variation of the Chandra and XMM-Newton backgrounds to <2% and <5% precision, respectively. To verify the cross-calibration of Chandra- and XMM-Newton-based cluster mass measurements, we derive the mass profiles of the 21 clusters that have been observed with both satellites, extracting surface brightness and temperature profiles from identical regions of the respective datasets. We obtain consistent results for the gas and total hydrostatic cluster masses: the average ratios of Chandra- to XMM-Newton-based measurements of M_gas and M_X at r_500 are 0.99 ± 0.02 and 1.02 ± 0.05, respectively, with an intrinsic scatter of ~3% for gas masses and ~8% for hydrostatic masses. Comparison of our hydrostatic mass measurements at r_500 with the latest LoCuSS weak-lensing results indicates that the data are consistent with non-thermal pressure support of ~7% at this radius. We also investigate the scaling relation between our hydrostatic cluster masses and published integrated Compton parameter Y_sph measurements from the Sunyaev-Zel'dovich Array. We measure a scatter in mass at fixed Y_sph of ~16% at Δ=500, which is consistent with theoretical predictions of ~10-15% scatter.
    Comment: 21 pages, 11 figures
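    For context, the hydrostatic masses referred to above are conventionally derived from the equation of hydrostatic equilibrium applied to the intracluster gas; the standard form (not stated in the abstract, shown here for reference) is:

    ```latex
    M_{\rm HSE}(<r) \;=\; -\,\frac{k_B\, T(r)\, r}{G\, \mu m_p}
    \left( \frac{d\ln n_e}{d\ln r} + \frac{d\ln T}{d\ln r} \right)
    ```

    where n_e(r) and T(r) are the electron density and temperature profiles measured from the X-ray data, μ is the mean molecular weight, and m_p is the proton mass. Evaluating this at r_500 gives the M_X values whose Chandra/XMM-Newton ratio is quoted above.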

    Three-Dimensional Time-Resolved Trajectories from Laboratory Insect Swarms

    Aggregations of animals display complex and dynamic behaviour, both at the individual level and on the level of the group as a whole. Often, this behaviour is collective, so that the group exhibits properties that are distinct from those of the individuals. In insect swarms, the motion of individuals is typically convoluted, and swarms display neither net polarization nor correlation. The swarms themselves, however, remain nearly stationary and maintain their cohesion even in noisy natural environments. This behaviour stands in contrast with other forms of collective animal behaviour, such as flocking, schooling, or herding, where the motion of individuals is more coordinated, and thus swarms provide a powerful way to study the underpinnings of collective behaviour as distinct from global order. Here, we provide a data set of three-dimensional, time-resolved trajectories, including positions, velocities, and accelerations, of individual insects in laboratory insect swarms. The data can be used to study the collective as a whole as well as the dynamics and behaviour of individuals within the swarm.
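    A minimal sketch of one quantity the abstract mentions: the swarm's net polarization, which is near zero for uncoordinated insect swarms but near one for aligned flocks. The data layout here (an (N, 3) NumPy array of velocities) is a hypothetical assumption about how the trajectory data might be loaded, not the dataset's actual format.

    ```python
    import numpy as np

    def polarization(velocities):
        """Polarization order parameter: magnitude of the mean unit-velocity
        (heading) vector. Near 0 for uncoordinated motion, near 1 for a
        flock-like aligned group.

        velocities: (N, 3) array of 3D velocity vectors, one row per insect.
        """
        speeds = np.linalg.norm(velocities, axis=1, keepdims=True)
        headings = velocities / speeds
        return float(np.linalg.norm(headings.mean(axis=0)))

    # Randomly oriented velocities, as in a swarm, give low polarization;
    # identical velocities, as in an idealized flock, give polarization 1.
    rng = np.random.default_rng(0)
    swarm = rng.normal(size=(1000, 3))
    flock = np.tile([1.0, 0.0, 0.0], (1000, 1))
    print(polarization(swarm), polarization(flock))
    ```

    The same statistic computed per time step over the full trajectories would trace how (dis)ordered the swarm remains over time.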

    Unobtrusive and pervasive video-based eye-gaze tracking

    Eye-gaze tracking has long been considered a desktop technology that finds its use inside the traditional office setting, where the operating conditions may be controlled. Nonetheless, recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking eye movements within unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. This critical review focuses on emerging passive and unobtrusive video-based eye-gaze tracking methods in recent literature, with the aim of identifying the different research avenues being followed in response to the challenges of pervasive eye-gaze tracking. Different eye-gaze tracking approaches are discussed in order to bring out their strengths and weaknesses, and to identify any limitations, within the context of pervasive eye-gaze tracking, that have yet to be considered by the computer vision community.

    CED: Color Event Camera Dataset

    Event cameras are novel, bio-inspired visual sensors whose pixels asynchronously and independently output timestamped spikes at local intensity changes, called 'events'. Event cameras offer advantages over conventional frame-based cameras in terms of latency, high dynamic range (HDR), and temporal resolution. Until recently, event cameras have been limited to outputting events in the intensity channel; however, recent advances have resulted in the development of color event cameras, such as the Color-DAVIS346. In this work, we present and release the first Color Event Camera Dataset (CED), containing 50 minutes of footage with both color frames and events. CED features a wide variety of indoor and outdoor scenes, which we hope will help drive forward event-based vision research. We also present an extension of the event camera simulator ESIM that enables simulation of color events. Finally, we present an evaluation of three state-of-the-art image reconstruction methods that can be used to convert the Color-DAVIS346 into a continuous-time, HDR, color video camera to visualise the event stream, and for use in downstream vision applications.
    Comment: Conference on Computer Vision and Pattern Recognition Workshop
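    The event-generation principle described above (a spike whenever the local log intensity changes by a fixed contrast) can be sketched for a single pixel as follows. The threshold value, function name, and sampled-input interface are illustrative assumptions, not the DAVIS hardware or the ESIM implementation, which operate asynchronously and per pixel.

    ```python
    def generate_events(log_intensity_samples, threshold=0.2):
        """Emit (timestamp, polarity) events for one pixel.

        An event fires each time the log intensity moves a full `threshold`
        away from the reference level set by the previous event; large jumps
        therefore produce bursts of same-polarity events.

        log_intensity_samples: list of (t, log_intensity) pairs, time-ordered.
        """
        events = []
        _, ref = log_intensity_samples[0]  # reference level at start
        for t, log_i in log_intensity_samples[1:]:
            while abs(log_i - ref) >= threshold:
                polarity = 1 if log_i > ref else -1  # ON (+1) or OFF (-1)
                ref += polarity * threshold          # step reference toward signal
                events.append((t, polarity))
        return events

    # A brightness increase of 0.65 in log units with a 0.2 threshold
    # yields a burst of three ON events:
    print(generate_events([(0.0, 0.0), (1.0, 0.65)]))
    ```

    Running the same rule per pixel over a rendered image sequence is, in essence, how frame-based simulators synthesize event streams.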

    On-sky speckle nulling demonstration at small angular separation with SCExAO

    This paper presents the first on-sky demonstration of speckle nulling, achieved at the Subaru Telescope in the context of the Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) Project. Despite the absence of a high-order, high-bandwidth closed-loop AO system, observations conducted with SCExAO show that even in poor-to-moderate observing conditions, speckle nulling can suppress static and slow speckles in the presence of a brighter dynamic speckle halo, suggesting that more advanced high-contrast imaging algorithms developed in the laboratory can be applied to ground-based systems.
    Comment: 5 figures, accepted for publication by PAS
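    The sensing step that underlies speckle nulling, estimating a static speckle's complex amplitude by adding known probe speckles and differencing the resulting intensities, can be illustrated with a toy model. The function names and four-phase probing scheme are my own illustration of the general lock-in idea, not the SCExAO code.

    ```python
    import cmath

    def estimate_speckle(measure, probe_amp=0.1):
        """Estimate the complex amplitude s of a static speckle.

        `measure(probe)` returns the focal-plane intensity |s + probe|^2.
        Probing at four phases and differencing opposite pairs isolates
        Re(s) and Im(s), since I(phi) = |s|^2 + A^2 + 2A*Re(s*conj(e^{i phi})).
        Toy model for illustration only.
        """
        I = [measure(probe_amp * cmath.exp(1j * k * cmath.pi / 2))
             for k in range(4)]
        re = (I[0] - I[2]) / (4 * probe_amp)  # I(0)    - I(pi)     = 4A*Re(s)
        im = (I[1] - I[3]) / (4 * probe_amp)  # I(pi/2) - I(3pi/2)  = 4A*Im(s)
        return complex(re, im)

    # A static speckle of unknown complex amplitude:
    s_true = 0.3 - 0.2j
    s_est = estimate_speckle(lambda p: abs(s_true + p) ** 2)
    # Applying the anti-speckle -s_est on the deformable mirror then
    # cancels the static speckle, leaving a tiny residual intensity:
    print(abs(s_true - s_est) ** 2)
    ```

    On sky, the measurement is repeated iteratively, which is what allows the loop to keep suppressing slowly varying speckles beneath the dynamic halo.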