HPC, grid and data infrastructures for astrophysics: An integrated view
As in other fields, the capability of performing “Big Science” in astrophysics requires the availability of large HPC facilities. But computational resources alone are far from enough for the community: the whole set of e-infrastructures (network, computing nodes, data repositories, applications) needs to work in an interoperable way. This implies the development of common (or at least compatible) user interfaces to computing resources, transparent access to observations and numerical simulations through the Virtual Observatory, integrated data processing pipelines, and data mining and semantic web applications. Achieving this interoperability goal is a must to build a real “Knowledge Infrastructure” in the astrophysical domain.
Data Streams from the Low Frequency Instrument On-Board the Planck Satellite: Statistical Analysis and Compression Efficiency
The expected data rate produced by the Low Frequency Instrument (LFI), planned
to fly on the ESA Planck mission in 2007, is more than a factor of 8 larger than the
bandwidth allowed by the spacecraft transmission system to download the LFI
data. We discuss the application of lossless compression to Planck/LFI data
streams in order to reduce the overall data flow. We perform both theoretical
analysis and experimental tests using realistically simulated data streams in
order to determine the statistical properties of the signal and the maximal
compression rate allowed by several lossless compression algorithms. We studied
the influence of signal composition and of acquisition parameters on the
compression rate Cr and developed a semi-empirical formalism to account for it.
The best performing compressor tested up to now is the arithmetic compression
of order 1, designed for optimizing the compression of white noise like
signals, which allows an overall compression rate Cr = 2.65 +/- 0.02. We find
that this result is not improved by other lossless compressors, since the
signal is almost entirely dominated by white noise. Lossless compression
algorithms alone will not solve the bandwidth problem and need to be combined
with other techniques.
Comment: May 3, 2000 release, 61 pages, 6 figures coded as eps, 9 tables (4
included as eps), LaTeX 2.09 + assms4.sty, style file included, submitted for
publication in PASP, May 3, 2000
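As a rough illustration of how a compression rate can be estimated on simulated streams, the sketch below compresses a quantized, white-noise-dominated signal with a general-purpose lossless compressor (zlib, not the order-1 arithmetic coder discussed in the paper) and reports the achieved Cr. All signal and quantization parameters are arbitrary assumptions, not the LFI acquisition settings.

    # Minimal sketch: estimate the lossless compression rate Cr of a
    # simulated, white-noise-dominated data stream. Noise level, quantization
    # step and stream length are illustrative assumptions, not LFI parameters.
    import zlib
    import numpy as np

    rng = np.random.default_rng(42)

    n_samples = 2**20
    sky = 0.1 * np.sin(np.linspace(0.0, 20.0 * np.pi, n_samples))  # slow sky signal
    noise = rng.normal(0.0, 1.0, n_samples)                        # dominant white noise
    signal = sky + noise

    # On-board-like quantization to 16-bit integers (step chosen arbitrarily).
    step = 0.05
    quantized = np.round(signal / step).astype(np.int16)

    raw = quantized.tobytes()
    compressed = zlib.compress(raw, 9)
    print(f"Compression rate Cr = {len(raw) / len(compressed):.2f}")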
On the loss of telemetry data in full-sky surveys from space
In this paper we discuss the issue of losing telemetry (TM) data for various
reasons (e.g. during spacecraft-ground transmissions) while performing a
full-sky survey with space-borne instrumentation. This is a particularly
important issue considering the current and future space missions (like Planck
from ESA and WMAP from NASA) operating from an orbit far from Earth with short
periods of visibility from ground stations. We consider, as a working case, the
Low Frequency Instrument (LFI) on-board the Planck satellite, although the
approach developed here can be easily applied to any kind of experiment that
makes use of an observing (scanning) strategy which assumes repeated pointings
of the same region of the sky on different time scales. The issue is addressed
by means of a Monte Carlo approach. Our analysis clearly shows that, under
quite general conditions, it is better to cover the sky more times with a lower
fraction of TM retained than fewer times with a higher guaranteed TM fraction.
In the case of Planck, an extension of mission time to allow a third sky
coverage with 95% of the total TM guaranteed provides a significant reduction
of the probability of losing scientific information with respect to an increase
of the total guaranteed TM to 98% with the two nominal sky coverages.
Comment: 17 pages, 6 figures, accepted for publication in New Astronomy
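A toy version of the comparison above can be sketched as follows: the sky is reduced to independent pixels, each coverage observes every pixel once, and each observation survives with the guaranteed TM fraction. This is a deliberately simplified, assumption-laden illustration, not the Monte Carlo machinery of the paper; the pixel and trial counts are arbitrary.

    # Toy Monte Carlo: fraction of sky pixels never observed when each of
    # n_cov coverages retains a fraction tm_fraction of the telemetry.
    # The independent-pixel model and all counts are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def fraction_lost(n_cov, tm_fraction, n_pixels=10_000, n_trials=200):
        """Average fraction of pixels with no retained observation at all."""
        lost = 0.0
        for _ in range(n_trials):
            kept = rng.random((n_cov, n_pixels)) < tm_fraction
            lost += np.mean(~kept.any(axis=0))
        return lost / n_trials

    # Two coverages with 98% guaranteed TM versus three coverages with 95%.
    print("2 coverages @ 98% TM:", fraction_lost(2, 0.98))
    print("3 coverages @ 95% TM:", fraction_lost(3, 0.95))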
Organization of the Euclid Data Processing: Dealing with Complexity
The data processing development and operations for the Euclid mission (part of the ESA Cosmic Vision 2015-2025 Plan) are distributed within a Consortium composed of 14 countries and 1300+ people: this imposes a high degree of complexity on the design and implementation of the data processing facilities. The focus of this paper is on the efforts to define an organisational structure capable of handling such complexity in manageable terms.
Imaging the first light: experimental challenges and future perspectives in the observation of the Cosmic Microwave Background Anisotropy
Measurements of the cosmic microwave background (CMB) allow high precision
observation of the Last Scattering Surface at redshift 1100. After the
success of the NASA satellite COBE, which in 1992 provided the first detection
of the CMB anisotropy, results from many ground-based and balloon-borne
experiments have shown a remarkable consistency between different results and
provided quantitative estimates of fundamental cosmological properties. During
2003 the team of the NASA WMAP satellite released the first improved
full-sky maps of the CMB since COBE, leading to a deeper insight into the
origin and evolution of the Universe. The ESA satellite Planck, scheduled for
launch in 2007, is designed to provide the ultimate measurement of the CMB
temperature anisotropy over the full sky, with an accuracy that will be limited
only by astrophysical foregrounds, and robust detection of polarisation
anisotropy. In this paper we review the experimental challenges in high
precision CMB experiments and discuss the future perspectives opened by second
and third generation space missions like WMAP and Planck.
Comment: To be published in "Recent Research Developments in Astronomy &
Astrophysics" - Vol. I
Cloud Computing for Astronomers on Top of EGI Federated Cloud
EGI Federated Cloud offers a general-purpose academic cloud infrastructure. We exploit EGI functionalities to address the needs of representative Astronomy and Astrophysics communities through clouds and gateways while respecting commonly used standards. The vision is to offer a novel environment empowering scientists to focus more on experimenting with and developing new ideas in service of scientific discovery.
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the
handling of the scientific and housekeeping telemetry. It is a critical
component of the Planck ground segment, which must strictly adhere to the
project schedule to be ready for launch and flight operations. In order to
guarantee the quality necessary to achieve the objectives of the Planck
mission, the design and development of the Level 1 software has followed the
ESA Software Engineering Standards. A fundamental step in the software life
cycle is the Verification and Validation of the software. The purpose of this
work is to show an example of procedures, test development and analysis
successfully applied to a key software project of an ESA mission. We present
the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by
detailing the methods used and the results obtained. Different approaches have
been used to test the scientific and housekeeping data processing. Scientific
data processing has been tested by injecting signals with known properties
directly into the acquisition electronics, in order to generate a test dataset
of real telemetry data and reproduce as much as possible nominal conditions.
For the HK telemetry processing, validation software has been developed to
inject known parameter values into a set of real housekeeping packets and
perform a comparison with the corresponding timelines generated by the Level 1.
With the proposed validation and verification procedure, where the on-board and
ground processing are viewed as a single pipeline, we demonstrated that the
scientific and housekeeping processing of the Planck-LFI raw data is correct
and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI
papers published in JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/jins
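As a schematic illustration of the housekeeping validation idea described above, the sketch below injects known parameter values into a trivial stand-in for the Level 1 HK processing and checks the reconstructed timeline against them. The packet encoding, calibration factor and tolerance are hypothetical assumptions, not the actual LFI telemetry format or DPC code.

    # Schematic sketch of the HK validation step: inject known values,
    # process them with a stand-in pipeline, and verify that the output
    # timeline matches the injected values within the encoding resolution.
    # The encoding, calibration and tolerance here are hypothetical.
    import numpy as np

    def inject_hk_packets(values, calib=0.01):
        """Encode known engineering values as raw counts (hypothetical format)."""
        return np.round(np.asarray(values) / calib).astype(np.int32)

    def level1_hk_pipeline(raw_counts, calib=0.01):
        """Stand-in for Level 1 HK processing: raw counts -> calibrated timeline."""
        return raw_counts.astype(float) * calib

    injected = np.linspace(20.0, 22.0, 100)     # e.g. a temperature ramp in kelvin
    timeline = level1_hk_pipeline(inject_hk_packets(injected))

    # The comparison tolerance matches the quantization step of the encoding.
    assert np.allclose(timeline, injected, atol=0.01), "HK validation failed"
    print("HK validation passed, max deviation:",
          float(np.max(np.abs(timeline - injected))))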
Applicability of the Langley method for non-geostationary in-orbit satellite effective isotropic radiated power estimation
The Effective Isotropic Radiated Power (EIRP) is a crucial parameter characterizing the transmitting antennas of a radiofrequency satellite link. During the satellite commissioning phase, the compliance of the communication subsystems with their requirements is tested. One of the required tests concerns the EIRP of the satellite transmitting antenna. Ground-based power measurements of the satellite-emitted signal are collected to measure EIRP, provided that an estimate of the atmospheric losses is available from independent ancillary measurements or model data. This paper demonstrates the applicability of the so-called Langley method to infer EIRP and atmospheric attenuation simultaneously from ground-based power measurements, with no need for ancillary measurements. It is shown that the proposed method gives results similar to those of more traditional methods, without prior information on atmospheric attenuation. Thus, the proposed method can be applied to monitor EIRP throughout the satellite lifetime from ground-based power measurements alone.
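The core of a Langley-type retrieval can be sketched as a straight-line fit of received power (in dB) against airmass: the slope gives the zenith attenuation and the zero-airmass intercept gives the exo-atmospheric power level, from which EIRP follows once free-space loss and the ground antenna gain are accounted for. The sketch below uses synthetic numbers and a plane-parallel airmass, and assumes the range-dependent free-space loss has already been removed; none of the values come from the paper.

    # Minimal sketch of a Langley-type fit: received power (dB) vs airmass.
    # Slope -> zenith attenuation; intercept -> zero-airmass power level.
    # Elevation range, power level, attenuation and noise are synthetic
    # assumptions; free-space loss is assumed already compensated.
    import numpy as np

    rng = np.random.default_rng(1)

    elevation_deg = np.linspace(10, 80, 60)              # one satellite pass
    airmass = 1.0 / np.sin(np.radians(elevation_deg))    # plane-parallel approximation

    p0_true_dbm = -90.0      # exo-atmospheric received power (assumed)
    tau_zenith_db = 0.35     # zenith attenuation in dB (assumed)
    measured_dbm = (p0_true_dbm - tau_zenith_db * airmass
                    + rng.normal(0.0, 0.05, airmass.size))  # measurement noise

    slope, intercept = np.polyfit(airmass, measured_dbm, 1)
    print(f"Estimated zenith attenuation: {-slope:.3f} dB")
    print(f"Extrapolated zero-airmass power: {intercept:.2f} dBm")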
How do socio-health services think? Reproductive health services and migrant women: the missing “cultural calibration”
The results of an ethnographic study conducted in 2018, aimed at investigating the relationship between migrant women and the socio-health services dedicated to reproductive health in Verona, serve as a lever for examining and bringing into focus a series of elements (conventions, stereotyping, prejudices) that underlie what is labelled the “culture” of those services. Paraphrasing the work of Mary Douglas, we ask: how do socio-health services think? The aim is to make visible the action of institutions and socio-health services in the processes of taking charge of the sexual and reproductive health needs of migrant women, the “gaps” and “frictions” that arise in the relationship between services and migrant users and, finally, the agency of institutional actors.
The Blue Straggler population in the globular cluster M53 (NGC5024): a combined HST, LBT, CFHT study
We used a proper combination of multiband high-resolution and wide field
multi-wavelength observations collected at three different telescopes (HST, LBT
and CFHT) to probe Blue Straggler Star (BSS) populations in the globular
cluster M53. Almost 200 BSS have been identified over the entire cluster
extension. The radial distribution of these stars has been found to be bimodal
(similarly to that of several other clusters) with a prominent dip at ~60'' (~2
r_c) from the cluster center. This value turns out to be a factor of two
smaller than the radius of avoidance (r_avoid, the radius within which all the
stars of ~1.2 M_sun have sunk to the core because of dynamical friction effects
in a Hubble time). While in most of the clusters with a bimodal BSS radial
distribution, r_avoid has been found to be located in the region of the
observed minimum, this is the second case (after NGC6388) where this
discrepancy is noted. This evidence suggests that in a few clusters the
dynamical friction seems to be somehow less efficient than expected.
We have also used this database to construct the radial star density profile
of the cluster: this is the most extended and accurate radial profile ever
published for this cluster, including detailed star counts in the very inner
region. The star density profile is reproduced by a standard King Model with an
extended core (~25'') and a modest value of the concentration parameter
(c=1.58). A deviation from the model is noted in the most external region of
the cluster (at r>6.5' from the center). This feature needs to be further
investigated in order to address the possible presence of a tidal tail in this
cluster.
Comment: 25 pages, 9 figures, accepted for publication in Ap
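For reference, the empirical King (1962) surface density profile mentioned above can be sketched with the quoted core radius (~25 arcsec) and concentration (c = 1.58); the normalization and the sampled radii below are arbitrary assumptions made only to illustrate the profile shape.

    # Sketch of the empirical King (1962) surface density profile using the
    # core radius and concentration quoted above (r_c ~ 25", c = 1.58).
    # The normalization and radii are arbitrary; only the shape matters here.
    import numpy as np

    def king_profile(r, r_c, conc, k=1.0):
        """King (1962) surface density; r and r_c in the same angular units."""
        r_t = r_c * 10.0**conc                  # tidal radius from c = log10(r_t/r_c)
        term = (1.0 / np.sqrt(1.0 + (r / r_c) ** 2)
                - 1.0 / np.sqrt(1.0 + (r_t / r_c) ** 2))
        return np.where(r < r_t, k * term**2, 0.0)

    r_arcsec = np.logspace(0, 3, 7)             # from 1" to 1000"
    for r, s in zip(r_arcsec, king_profile(r_arcsec, r_c=25.0, conc=1.58)):
        print(f"r = {r:8.1f} arcsec   density = {s:.4f}")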