62 research outputs found

    Technical considerations on using the large Nancay radio telescope for SETI

    The Nancay decimetric Radio Telescope (NRT) in Nancay, France, is described, and its potential use for Search for Extraterrestrial Intelligence (SETI) observations is discussed. The conclusion is that the NRT is well suited for SETI observations because of its large collecting area, its large sky coverage, and its wideband frequency capability. However, a number of improvements are necessary to take full advantage of the system for an efficient SETI program. In particular, system sensitivity should be increased. This can be achieved through a series of improvements, including lowering the ground-pickup noise with ground reflectors and a more efficient feed design, and using low-noise amplifier front ends.
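
    The sensitivity argument above can be made concrete with the idealized radiometer equation, in which the minimum detectable flux density scales with the system temperature and inversely with the effective collecting area and the square root of the bandwidth-time product. The sketch below uses placeholder numbers (assumptions for illustration, not NRT specifications) to show why lowering ground pickup and front-end noise pays off directly in sensitivity.

```python
# Illustrative only: the idealized radiometer equation, showing how the
# minimum detectable flux density scales with system temperature and
# effective collecting area.  The numbers below are placeholder
# assumptions, not NRT specifications.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def min_detectable_flux(t_sys_k, a_eff_m2, bandwidth_hz, t_int_s, snr=5.0):
    """Minimum detectable flux density (W m^-2 Hz^-1) for a total-power radiometer."""
    sefd = 2.0 * K_B * t_sys_k / a_eff_m2            # system equivalent flux density
    return snr * sefd / math.sqrt(bandwidth_hz * t_int_s)

# Example: halving T_sys (e.g. via lower ground pickup and better LNAs)
# halves the minimum detectable flux for the same integration time.
for t_sys in (70.0, 35.0):                           # assumed system temperatures, K
    s_min = min_detectable_flux(t_sys, a_eff_m2=4000.0, bandwidth_hz=1.0, t_int_s=100.0)
    print(f"T_sys = {t_sys:5.1f} K  ->  S_min ~ {s_min / 1e-26:.1f} Jy")
```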

    A Computational Comparison of Optimization Methods for the Golomb Ruler Problem

    The Golomb ruler problem is defined as follows: Given a positive integer n, locate n marks on a ruler such that the distances between all distinct pairs of marks differ from one another and the total length of the ruler is minimized. The Golomb ruler problem has applications in information theory, astronomy, and communications, and it can be seen as a challenge for combinatorial optimization algorithms. Although constructing high-quality rulers is well studied, proving optimality is a far more challenging task. In this paper, we provide a computational comparison of different optimization paradigms, each using a different model (linear integer, constraint programming, and quadratic integer) to certify that a given Golomb ruler is optimal. We propose several enhancements to improve the computational performance of each method by exploring bound tightening, valid inequalities, cutting planes, and branching strategies. We conclude that a certain quadratic integer programming model, solved through a Benders decomposition and strengthened by two types of valid inequalities, performs best in terms of solution time for small Golomb ruler instances. On the other hand, a constraint programming model improved by range reduction and a particular branching strategy may have more potential to solve larger instances due to its promising parallelization features.
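
    As a point of reference for the definition above, the short sketch below simply verifies the Golomb property of a candidate ruler (all pairwise distances distinct); it illustrates the definition only, not the integer, constraint, or quadratic programming models compared in the paper.

```python
# Minimal sketch: check the defining Golomb property (all pairwise mark
# distances are distinct).  This verifies a candidate ruler; it does not
# reproduce the paper's optimization models.
from itertools import combinations

def is_golomb_ruler(marks):
    """Return True if every pairwise distance between marks is distinct."""
    distances = [abs(a - b) for a, b in combinations(marks, 2)]
    return len(distances) == len(set(distances))

# The known optimal 4-mark ruler of length 6, and a non-Golomb counterexample.
print(is_golomb_ruler([0, 1, 4, 6]))   # True  (distances 1, 4, 6, 3, 5, 2 all differ)
print(is_golomb_ruler([0, 1, 2, 4]))   # False (distance 2 appears twice: 2-0 and 4-2)
```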

    Image reconstruction in optical interferometry: Benchmarking the regularization

    With the advent of infrared long-baseline interferometers with more than two telescopes, both the size and the completeness of interferometric data sets have significantly increased, allowing images to be reconstructed without a priori model assumptions. Our main objective is to analyze the multiple parameters of the image reconstruction process, with particular attention to the regularization term, and to study their behavior in different situations. The secondary goal is to derive practical rules for users. Using the Multi-aperture image Reconstruction Algorithm (MiRA), we performed multiple systematic tests, analyzing 11 commonly used regularization terms. The tests cover different astrophysical objects, different (u,v) plane coverages, and several signal-to-noise ratios to determine the minimal configuration needed to reconstruct an image. We establish a methodology and introduce the mean-square error (MSE) to discuss the results. From the ~24000 simulations performed for the benchmarking of image reconstruction with MiRA, we are able to classify the different regularizations in the context of the observations, and we find typical values of the regularization weight. A minimal (u,v) coverage is required to reconstruct an acceptable image, whereas no limits are found for the studied values of the signal-to-noise ratio. We also show that super-resolution can be achieved, with performance increasing as the (u,v) coverage fills. Image reconstruction with sufficient (u,v) coverage is shown to be reliable, and the choice of the main reconstruction parameters is tightly constrained. We recommend that efforts to develop interferometric infrastructures concentrate first on the number of telescopes to combine, and second on improving the accuracy and sensitivity of the arrays. Comment: 15 pages, 16 figures; accepted in A&A
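
    For readers unfamiliar with the benchmarking metric mentioned above, the sketch below computes a generic pixel-wise mean-square error between a reconstructed image and a reference, normalized by the reference power; the exact normalization used with MiRA may differ.

```python
# Sketch of a pixel-wise mean-square-error metric of the kind used to
# benchmark reconstructed images against a reference.  This is a generic
# definition, not necessarily the exact normalization used with MiRA.
import numpy as np

def mse(reconstruction, reference):
    """Mean-square error of a reconstruction, normalized by the reference power."""
    rec = np.asarray(reconstruction, dtype=float)
    ref = np.asarray(reference, dtype=float)
    return np.mean((rec - ref) ** 2) / np.mean(ref ** 2)

# Toy example: a noisy copy of a synthetic reference image.
rng = np.random.default_rng(0)
reference = rng.random((64, 64))
reconstruction = reference + 0.05 * rng.normal(size=reference.shape)
print(f"normalized MSE ~ {mse(reconstruction, reference):.4f}")
```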

    Absence of "Ghost Images" Excludes Large Values of the Cosmological Constant

    We used the 1.4 GHz NRAO NVSS survey to search for ghost images of radio sources, which are expected in cosmologies with a positive cosmological constant and positive space curvature. No statistically significant evidence for ghost images was found, placing constraints on the value of Λ, the space curvature, or the duration of the radio-luminous phase of extragalactic radio sources. Comment: 11 pages, 2 figures

    Representativeness of Eddy-Covariance flux footprints for areas surrounding AmeriFlux sites

    Large datasets of greenhouse gas and energy surface-atmosphere fluxes measured with the eddy-covariance technique (e.g., FLUXNET2015, AmeriFlux BASE) are widely used to benchmark models and remote-sensing products. This study addresses one of the major challenges facing model-data integration: To what spatial extent do flux measurements taken at individual eddy-covariance sites reflect model- or satellite-based grid cells? We evaluate flux footprints (the temporally dynamic source areas that contribute to measured fluxes) and the representativeness of these footprints for target areas (e.g., within 250–3000 m radii around flux towers) that are often used in flux-data synthesis and modeling studies. We examine the land-cover composition and vegetation characteristics, represented here by the Enhanced Vegetation Index (EVI), in the flux footprints and target areas across 214 AmeriFlux sites, and evaluate potential biases arising from the footprint-to-target-area mismatch. Monthly 80% footprint climatologies vary across sites and through time, ranging over four orders of magnitude (10³ to 10⁷ m²) due to measurement heights, underlying vegetation and ground-surface characteristics, wind directions, and the turbulent state of the atmosphere. Few eddy-covariance sites are located in a truly homogeneous landscape. Thus, the common model-data integration approaches that use a fixed-extent target area across sites introduce biases on the order of 4%–20% for EVI and 6%–20% for the dominant land-cover percentage. These biases are site-specific functions of measurement heights, target-area extents, and land-surface characteristics. We advocate that flux datasets be used with footprint awareness, especially in research and applications that benchmark against models and data products with explicit spatial information. We propose a simple representativeness index based on our evaluations that can be used as a guide to identify site-periods suitable for specific applications and to provide general guidance for data use.
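
    As a purely hypothetical illustration of the footprint-to-target-area mismatch described above, the sketch below compares a footprint-weighted EVI mean with the mean over a fixed-radius target area on a synthetic grid; the field, the Gaussian footprint weights, and the resulting "bias" are toy constructions, and the paper's representativeness index is not reproduced here.

```python
# Hypothetical illustration of the footprint-vs-target-area comparison:
# a footprint-weighted EVI mean versus a simple mean over a fixed-radius
# target area.  Grid, weights, and the resulting bias are toy
# constructions; the paper's representativeness index is not reproduced.
import numpy as np

rng = np.random.default_rng(1)
n = 201                                        # 201 x 201 grid of 30 m EVI pixels (toy)
evi = 0.4 + 0.1 * rng.normal(size=(n, n))      # synthetic EVI field

# Toy footprint weights: an elongated Gaussian displaced downwind of the tower.
y, x = np.mgrid[:n, :n]
cy = cx = n // 2
footprint = np.exp(-(((x - cx - 20) / 15.0) ** 2 + ((y - cy) / 8.0) ** 2))
footprint /= footprint.sum()

# Fixed-radius target area (e.g. ~1 km, about 33 pixels at 30 m resolution).
target = (x - cx) ** 2 + (y - cy) ** 2 <= 33 ** 2

evi_footprint = np.sum(footprint * evi)        # footprint-weighted mean EVI
evi_target = evi[target].mean()                # fixed-radius target-area mean EVI
bias_pct = 100.0 * (evi_target - evi_footprint) / evi_footprint
print(f"footprint EVI {evi_footprint:.3f}, target EVI {evi_target:.3f}, bias {bias_pct:+.1f}%")
```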

    ECOSTRESS: NASA's next generation mission to measure evapotranspiration from the International Space Station

    The ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station (ECOSTRESS) was launched to the International Space Station on June 29, 2018. The primary science focus of ECOSTRESS is evapotranspiration (ET), which is produced as level-3 (L3) latent heat flux (LE) data products. These data are generated from the level-2 land surface temperature and emissivity product (L2_LSTE), in conjunction with ancillary surface and atmospheric data. Here, we provide the first validation (Stage 1, preliminary) of the global ECOSTRESS clear-sky ET product (L3_ET_PT-JPL, version 6.0) against LE measurements at 82 eddy covariance sites around the world. Overall, the ECOSTRESS ET product performs well against the site measurements (clear-sky instantaneous/time of overpass: r² = 0.88; overall bias = 8%; normalized RMSE = 6%). ET uncertainty was generally consistent across climate zones, biome types, and times of day (ECOSTRESS samples the diurnal cycle), though temperate sites are over-represented. The high 70 m spatial resolution of ECOSTRESS improved correlations by 85% and RMSE by 62% relative to 1 km pixels. This paper serves as a reference for the ECOSTRESS L3 ET accuracy and Stage 1 validation status for the science that follows using these data.
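
    The validation statistics quoted above (r², percent bias, normalized RMSE) are standard; the sketch below shows one common way to compute them from paired retrievals and tower measurements, noting that the exact normalizations used in the ECOSTRESS validation may differ.

```python
# Minimal sketch of standard validation statistics (r^2, percent bias,
# normalized RMSE) between satellite ET retrievals and tower latent-heat
# flux.  The exact normalizations used in the ECOSTRESS validation may
# differ from these generic definitions.
import numpy as np

def validation_stats(retrieved, observed):
    retrieved = np.asarray(retrieved, dtype=float)
    observed = np.asarray(observed, dtype=float)
    r2 = np.corrcoef(retrieved, observed)[0, 1] ** 2
    bias_pct = 100.0 * (retrieved - observed).mean() / observed.mean()
    rmse = np.sqrt(np.mean((retrieved - observed) ** 2))
    nrmse_pct = 100.0 * rmse / (observed.max() - observed.min())  # range-normalized
    return r2, bias_pct, nrmse_pct

# Toy example with synthetic tower LE (W m^-2) and synthetic retrievals.
rng = np.random.default_rng(2)
le_tower = rng.uniform(50.0, 450.0, size=200)
le_retrieved = le_tower * 1.05 + rng.normal(0.0, 30.0, size=200)
r2, bias, nrmse = validation_stats(le_retrieved, le_tower)
print(f"r^2 = {r2:.2f}, bias = {bias:+.1f}%, normalized RMSE = {nrmse:.1f}%")
```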

    The FLUXNET2015 dataset and the ONEFlux processing pipeline for eddy covariance data

    The FLUXNET2015 dataset provides ecosystem-scale data on CO₂, water, and energy exchange between the biosphere and the atmosphere, along with other meteorological and biological measurements, from 212 sites around the globe (over 1500 site-years, up to and including year 2014). These sites, independently managed and operated, voluntarily contributed their data to create global datasets. Data were quality controlled and processed using uniform methods to improve consistency and intercomparability across sites. The dataset is already being used in a number of applications, including ecophysiology studies, remote sensing studies, and development of ecosystem and Earth system models. FLUXNET2015 includes derived data products, such as gap-filled time series, ecosystem respiration and photosynthetic uptake estimates, uncertainty estimates, and metadata about the measurements, presented for the first time in this paper. In addition, 206 of these sites are for the first time distributed under a Creative Commons (CC-BY 4.0) license. This paper details this enhanced dataset and the processing methods, now made available as open-source code, making the dataset more accessible, transparent, and reproducible.
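
    As a usage illustration only, the sketch below reads a daily FLUXNET2015 FULLSET file with pandas and sums the gap-filled net ecosystem exchange by year; the file name is a placeholder and the column names (TIMESTAMP, NEE_VUT_REF) follow commonly documented FLUXNET2015 conventions but should be treated as assumptions and checked against the product actually downloaded.

```python
# Sketch of reading a FLUXNET2015 FULLSET daily file and summing the
# gap-filled net ecosystem exchange by calendar year.  The file name is a
# placeholder, and the column names (TIMESTAMP, NEE_VUT_REF) follow the
# commonly documented FLUXNET2015 conventions; adjust them to the product
# actually downloaded.
import pandas as pd

path = "FLX_XX-Xxx_FLUXNET2015_FULLSET_DD_2000-2014.csv"   # placeholder file name

df = pd.read_csv(path, na_values=[-9999], dtype={"TIMESTAMP": str})
df["TIMESTAMP"] = pd.to_datetime(df["TIMESTAMP"], format="%Y%m%d")

annual_nee = (
    df.set_index("TIMESTAMP")["NEE_VUT_REF"]   # gap-filled daily NEE (gC m-2 d-1)
      .resample("YE")                          # calendar-year bins ("Y" in older pandas)
      .sum(min_count=300)                      # require most days of the year present
)
print(annual_nee)
```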

    Author Correction: The FLUXNET2015 dataset and the ONEFlux processing pipeline for eddy covariance data

    The following authors were omitted from the original version of this Data Descriptor: Markus Reichstein and Nicolas Vuichard. Both contributed to the code development, and N. Vuichard contributed to the processing of the ERA-Interim data downscaling. Furthermore, the contribution of the co-author Frank Tiedemann was re-evaluated relative to that of the colleague Corinna Rebmann, both working at the same sites, and based on this re-evaluation a substitution in the co-author list was implemented (with Rebmann replacing Tiedemann). Finally, two affiliations were listed incorrectly and are corrected here (entries 190 and 193). The author list and affiliations have been amended to address these omissions and corrections in both the HTML and PDF versions.

    Spectral detector for interference time blanking using quantized correlator

    Given the large flow of data to be processed, quantized correlators are widely used in radio astronomy. Unfortunately, the occurrence of non-Gaussian interference combined with coarse quantization can strongly alter the shape of the estimated spectra. The final spectral estimate can be preserved by blanking the correlator in real time. A new interference detection criterion is proposed within this framework. It uses the real-time capabilities of correlators and compares contaminated and non-contaminated correlation functions. No a priori information on the interfering signals is required. Simulations using synthetic and actual data are presented. This new real-time detection technique can significantly improve the quality of spectral-line observations.
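
    The abstract describes a criterion that compares contaminated and non-contaminated correlation functions and blanks the correlator in real time. The toy sketch below illustrates that blank-on-deviation idea with unquantized, short-term autocorrelation estimates compared against a slowly tracked reference; the threshold and update rule are assumptions, and this is not the paper's specific quantized-correlator criterion.

```python
# Toy illustration of blank-on-detection: short-term autocorrelation
# estimates are compared against a running clean reference, and blocks
# whose deviation exceeds a threshold are blanked before the long-term
# spectral average.  This is a simplified, unquantized stand-in for the
# paper's quantized-correlator criterion.
import numpy as np

rng = np.random.default_rng(3)
n_blocks, block_len, n_lags = 200, 1024, 32
threshold = 5.0                                    # deviation threshold (assumed)

def short_term_acf(x, n_lags):
    """Biased autocorrelation estimate for lags 0..n_lags-1."""
    return np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(n_lags)])

reference = None
kept = []
for b in range(n_blocks):
    x = rng.normal(size=block_len)                 # Gaussian noise as the 'clean' signal
    if 80 <= b < 90:                               # inject impulsive interference
        x[::16] += 8.0
    acf = short_term_acf(x, n_lags)
    if reference is None:
        reference = acf.copy()                     # first clean block seeds the reference
        continue
    deviation = np.max(np.abs(acf - reference)) / (np.std(reference) + 1e-12)
    if deviation < threshold:
        kept.append(acf)                           # accumulate into the long-term estimate
        reference = 0.95 * reference + 0.05 * acf  # slowly track the clean state
print(f"kept {len(kept)} of {n_blocks} blocks after blanking")
```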