
    European ALMA operations: the interaction with and support to the users

    The Atacama Large Millimetre/submillimetre Array (ALMA) is one of the largest and most complicated observatories ever built. Constructing and operating an observatory at high altitude (5000 m) in a cost-effective and safe manner, with minimal effect on the environment, creates interesting challenges. Since the array will have to adapt quickly to prevailing weather conditions, ALMA will be operated exclusively in service mode. By the time of full science operations, the fundamental ALMA data products will be calibrated, deconvolved data cubes and images, but raw data and data reduction software will be made available to users as well. User support is provided by the ALMA Regional Centres (ARCs) located in Europe, North America and Japan. These ARCs constitute the interface between the user community and the ALMA observatory in Chile. For European users, the European ARC is being set up as a cluster of nodes located throughout Europe, with the main centre at the ESO Headquarters in Garching. The main centre serves as the access portal and, in synergy with the distributed network of ARC nodes, its main aim is to optimize the ALMA science output and to fully exploit this unique and powerful facility. This article introduces the process of proposing for observing time, the subsequent execution of the observations, and the retrieval and processing of the data in the ALMA epoch. The complete end-to-end ALMA data flow, from proposal submission to data delivery, is described.
    Comment: 7 pages, three figures
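    As an illustration of the archive end of this data flow, the sketch below shows how a user might query the ALMA science archive programmatically. It assumes the astroquery package's ALMA module is available; the target name is a hypothetical example, not taken from the article.

```python
# Minimal sketch of an ALMA science archive query, assuming astroquery's
# ALMA module is installed; the target source is an illustrative choice.
from astroquery.alma import Alma

# Search the archive for observations of a named source.
results = Alma.query_object("NGC 253")
print(len(results), "matching observations")
print(results.colnames[:10])  # inspect the available metadata columns
```

    This is only a sketch of the final step; the article describes the full proposal-to-delivery flow of which archive access is the end point.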

    The correct estimate of the probability of false detection of the matched filter in the detection of weak signals. II. (Further results with application to a set of ALMA and ATCA data)

    The matched filter (MF) is one of the most popular and reliable techniques for detecting signals of known structure whose amplitude is smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, the MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies on a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that, when applied in its standard form, the MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to the noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we develop this method further. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it cannot quantify the actual reliability of a specific detection. For this reason, a new quantity called the specific probability of false alarm (SPFA) is introduced, which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
    Comment: 28 pages, 20 figures, Astronomy & Astrophysics. Minor changes and some typos corrected
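    To make these quantities concrete, the following one-dimensional sketch contrasts the standard single-position PFA with a Monte Carlo estimate of the false-alarm rate when the maximum is searched over a whole noise map, which is the regime where the naive PFA becomes too optimistic. All sizes, thresholds, and the template shape are illustrative choices, not values from the paper.

```python
# Matched filtering in stationary white Gaussian noise: single-position
# PFA versus the false-alarm rate of the map-wide maximum (all
# parameters are illustrative, not taken from the paper).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Gaussian template of known shape, normalized so that the filtered
# noise has unit variance (the MF output is then in signal-to-noise units).
x = np.arange(-10, 11)
t = np.exp(-0.5 * (x / 2.0) ** 2)
t /= np.sqrt(np.sum(t**2))

n, sigma = 4096, 1.0  # map length and noise level

def mf_snr(data):
    """Matched-filter output in SNR units for white noise."""
    return np.convolve(data, t[::-1], mode="same") / sigma

# PFA at a single *known* position: the Gaussian tail beyond threshold z.
z = 4.0
print("single-position PFA:", norm.sf(z))

# With an *unknown* position the relevant statistic is the map maximum,
# whose false-alarm rate is much larger than the single-position PFA.
trials = 2000
maxima = [mf_snr(sigma * rng.standard_normal(n)).max() for _ in range(trials)]
print("P(map maximum > z):", np.mean(np.array(maxima) > z))
```

    The second number exceeds the first by orders of magnitude, which is the underestimation effect the paper addresses; the SPFA goes one step further and assigns a reliability to an individual detection rather than to the detection procedure as a whole.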

    Predicting the clustering properties of galaxy clusters detectable for the Planck satellite

    We study the clustering properties of the galaxy clusters detectable by the Planck satellite through their thermal Sunyaev-Zel'dovich effect. We take into account the past light-cone effect and the redshift evolution of both the underlying dark matter correlation function and the cluster bias factor. A theoretical mass-temperature relation allows us to convert the sensitivity limit of a catalogue into a minimum mass for the dark matter haloes hosting the clusters. We confirm that the correlation length is an increasing function of the sensitivity limit defining the survey. Using the expected characteristics of the Planck cluster catalogue, which will be a quite large and unbiased sample, we predict the two-point correlation function and power spectrum for different cosmological models. We show that the wide redshift distribution of the Planck survey will make it possible to constrain the cluster clustering properties up to z=1. The dependence of our results on the main cosmological parameters (the matter density parameter, the cosmological constant and the normalisation of the density power spectrum) is extensively discussed. We find that the future Planck clustering data place only mild constraints on the cosmological parameters, because the results depend on the physical characteristics of the intracluster medium, such as the baryon fraction and the mass-temperature relation. Once the cosmological model and the Hubble constant are determined, the clustering data will allow a determination of the baryon fraction with an accuracy of a few per cent.
    Comment: 11 pages, MNRAS in press. Minor changes to match the accepted version
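    As a sketch of one ingredient in such predictions, the snippet below obtains the two-point correlation function xi(r) from a model power spectrum P(k) by direct numerical integration of the standard Fourier relation xi(r) = 1/(2 pi^2) * int dk k^2 P(k) sin(kr)/(kr). The power-law P(k) with a Gaussian cutoff is a toy stand-in, not the spectrum used in the paper.

```python
# Two-point correlation function from a power spectrum via the standard
# Fourier integral; the toy P(k) below is illustrative only.
import numpy as np

def xi_from_pk(r, k, pk):
    """xi(r) = 1/(2 pi^2) * int dk k^2 P(k) sin(kr)/(kr)."""
    kr = np.outer(r, k)
    # np.sinc(x) = sin(pi x)/(pi x), hence sinc(kr/pi) = sin(kr)/(kr).
    integrand = k**2 * pk * np.sinc(kr / np.pi)
    # Trapezoidal quadrature over k (crude, but adequate for a smooth
    # toy spectrum on a dense grid).
    avg = 0.5 * (integrand[:, 1:] + integrand[:, :-1])
    return (avg * np.diff(k)).sum(axis=1) / (2.0 * np.pi**2)

# Dense linear k grid so the oscillatory integrand is well sampled.
k = np.linspace(1e-4, 20.0, 20000)   # h / Mpc
pk = (k / 0.02) ** -1.5 * np.exp(-((k / 5.0) ** 2))  # toy spectrum

r = np.linspace(1.0, 100.0, 50)      # Mpc / h
xi = xi_from_pk(r, k, pk)
print(xi[:5])
```

    In the paper's setting the analogous computation uses the evolved dark matter power spectrum together with the bias of the haloes above the Planck minimum mass, so the predicted cluster correlation function inherits the dependence on the cosmological parameters discussed above.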