
    Planck Intermediate Results. IX. Detection of the Galactic haze with Planck

    Using precise full-sky observations from Planck, and applying several methods of component separation, we identify and characterize the emission from the Galactic "haze" at microwave wavelengths. The haze is a distinct component of diffuse Galactic emission, roughly centred on the Galactic centre, extending to |b| ~ 35 deg in Galactic latitude and |l| ~ 15 deg in longitude. By combining the Planck data with observations from WMAP, we are able to determine the spectrum of this emission to high accuracy, unhindered by the large systematic biases present in previous analyses. The derived spectrum is consistent with power-law emission with a spectral index of -2.55 +/- 0.05, thus excluding free-free emission as the source and instead favouring hard-spectrum synchrotron radiation from an electron population with a spectrum (number density per energy) dN/dE ~ E^-2.1. At Galactic latitudes |b| < 30 deg, the microwave haze morphology is consistent with that of the Fermi gamma-ray "haze" or "bubbles", indicating that we have a multi-wavelength view of a distinct component of our Galaxy. Given both the very hard spectrum and the extended nature of the emission, it is highly unlikely that the haze electrons result from supernova shocks in the Galactic disk. Instead, a new mechanism for cosmic-ray acceleration in the centre of our Galaxy is implied.
    Comment: 15 pages, 9 figures, submitted to Astronomy and Astrophysics
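    The quoted electron spectrum follows from the measured spectral index via the standard synchrotron relation; a minimal consistency check (our arithmetic, assuming the index is quoted in antenna temperature, T_A ~ nu^beta):

        % Standard relation between the antenna-temperature spectral index
        % beta and the electron spectral index p for synchrotron emission:
        \begin{align}
          \beta &= -\frac{p+3}{2}, \\
          p &= -2\beta - 3 = -2(-2.55) - 3 = 2.1 .
        \end{align}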

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must adhere strictly to the project schedule in order to be ready for launch and flight operations. To guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software have followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development, and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping telemetry processing, validation software was developed to inject known parameter values into a set of real housekeeping packets and compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
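    A minimal sketch of the inject-and-compare idea behind the housekeeping validation described above. The helper names encode_hk_packet and level1_decode are hypothetical stand-ins for the actual LFI-DPC tools; only the comparison logic is illustrated:

        # Sketch of injecting known housekeeping values and checking that the
        # decoded Level 1 timeline reproduces them (hypothetical helper names).
        import numpy as np

        def validate_hk_pipeline(encode_hk_packet, level1_decode, n_samples=1000):
            """Inject known HK values, run them through the decoder, and
            check that the output timeline matches the injected input."""
            injected = np.random.uniform(-10.0, 10.0, size=n_samples)  # known values
            packets = [encode_hk_packet(v) for v in injected]          # encoded telemetry
            timeline = np.array([level1_decode(p) for p in packets])   # Level 1 output
            # Pass if decoded values match the injected ones to within
            # the quantisation error of the packet encoding.
            assert np.allclose(timeline, injected, atol=1e-3), "HK processing mismatch"
            return np.max(np.abs(timeline - injected))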

    Intellectual capital and value co-creation: an empirical analysis from a marketing perspective

    The aim of this study is to investigate the intellectual capital (IC) drivers that may influence Italian consumers' decision to participate in value co-creation (VCC) activities with firms. Given the exploratory nature of the research, after a review of the relevant literature we conducted a survey among Italian consumers to see whether the principal IC sub-dimensions (i.e. relational capital, human capital and structural capital) played a role in triggering VCC processes. Using a Principal Component Analysis (PCA), we analyzed 270 usable questionnaires, finding that IC sub-dimensions do play a critical role in the decision to co-create value with firms. Our findings showed that the motivations (i.e., IC components) that influence Italian consumers' decision to participate in value co-creation activities with firms are quite homogeneous, and similar both for those who had already participated in these activities in the past and for those who had never participated. The study has several managerial implications as well as limitations. The survey was conducted only among Italian consumers, so the research should be extended geographically. Moreover, the research analyzed only the demand side, while it would certainly be useful to know the point of view of companies, also adopting other research methods (e.g., in-depth interviews). This study provides practitioners with important suggestions and warnings about the importance of developing IC sub-dimensions to (co-)create value with external actors, and consequently suggests adopting an "open" approach towards consumers in order to establish an effective and interactive relationship with them. The study fills a gap in the literature, since few works offer a deep understanding of the concrete relationship between IC and VCC. In addition, to the best of our knowledge this paper is the first to explore IC-related issues from a marketing perspective.
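    As an illustration of the analysis step described above, a minimal PCA sketch in Python with scikit-learn; the questionnaire items and data below are synthetic placeholders, not the study's dataset:

        # Illustrative PCA on survey-style data: 270 respondents x 9 Likert
        # items (three per assumed IC sub-dimension). Synthetic data only.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        responses = rng.integers(1, 6, size=(270, 9)).astype(float)  # 5-point scale

        scaled = StandardScaler().fit_transform(responses)  # PCA assumes centred data
        pca = PCA(n_components=3)                           # one component per sub-dimension
        scores = pca.fit_transform(scaled)

        print("explained variance ratio:", pca.explained_variance_ratio_)
        print("loadings (components x items):\n", pca.components_.round(2))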

    Do subjects with obesity underestimate their body size? A narrative review of estimation methods and explanatory theories

    The prevalence of overweight and obesity in developed countries is a real societal issue; nevertheless, a considerable number of subjects with obesity do not recognize their condition. Researchers have used different methods to assess body size perception by obese subjects, and the results show that while some subjects with obesity estimate their body size accurately or overestimate it, others underestimate their weight and their body size measures. A failure to identify overweight or obesity has serious consequences for the subject's health, as it is widely recognised that self-awareness is the first step towards engaging in a rehabilitation program. The spread of obesity underestimation and its implications make the case for a new hypothetical body image disorder, which has been called Fatorexia (TM). It consists of the significant underestimation of body size by subjects with obesity, as they are unable or unwilling to acknowledge their condition. Some researchers have proposed a social explanation for the underestimation phenomenon, but here an alternative hypothesis, the Allocentric Lock Theory (ALT), is outlined to describe the mechanisms behind the underestimation of body size by subjects with obesity.

    Simultaneous Planck, Swift, and Fermi observations of X-ray and gamma-ray selected blazars

    We present simultaneous Planck, Swift, Fermi, and ground-based data for 105 blazars belonging to three samples with flux limits in the soft X-ray, hard X-ray, and gamma-ray bands. Our unique data set has allowed us to demonstrate that the selection method strongly influences the results, producing biases that cannot be ignored. Almost all the BL Lac objects have been detected by Fermi-LAT, whereas ~40% of the flat-spectrum radio quasars (FSRQs) in the radio, soft X-ray, and hard X-ray selected samples are still below the gamma-ray detection limit even after integrating 27 months of Fermi-LAT data. The radio to sub-mm spectral slope of blazars is quite flat up to ~70 GHz, above which it steepens to ~-0.65. BL Lacs have significantly flatter spectra than FSRQs at higher frequencies. The distribution of the rest-frame synchrotron peak frequency (nu_p^S) in the SED of FSRQs is the same in all the blazar samples, with <nu_p^S> = 10^13.1 Hz, while the mean inverse-Compton peak frequency, <nu_p^IC>, ranges from 10^21 to 10^22 Hz. The distributions of nu_p^S and of nu_p^IC of BL Lacs are much broader, are shifted to higher energies than those of FSRQs, and strongly depend on the selection method. The Compton dominance of blazars ranges from ~0.2 to ~100, with only FSRQs reaching values >3. Its distribution is broad and depends strongly on the selection method, with gamma-ray selected blazars peaking at ~7 or more, and radio-selected blazars at values ~1, thus implying that the assumption that the blazar power is dominated by high-energy emission is a selection effect. Simple SSC models cannot explain the SEDs of most of the gamma-ray detected blazars in all samples. The SEDs of the blazars that were not detected by Fermi-LAT may instead be consistent with SSC emission. Our data challenge the correlation between bolometric luminosity and nu_p^S predicted by the blazar sequence.
    Comment: Version accepted by A&A. Joint Planck, Swift, and Fermi collaborations paper
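    For context, the Compton dominance referred to above is conventionally defined as the ratio of the inverse-Compton to synchrotron peak luminosities in the nu L_nu representation of the SED (a standard definition, not restated in the abstract):

        \begin{equation}
          \mathrm{CD} \;=\;
            \frac{(\nu L_\nu)^{\mathrm{IC}}_{\mathrm{peak}}}
                 {(\nu L_\nu)^{\mathrm{sync}}_{\mathrm{peak}}},
        \end{equation}

    so CD > 1 means the high-energy hump dominates the bolometric output, while CD < 1 means the synchrotron hump dominates.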

    Assessment of Natural Resources Use for Sustainable Development - DPSIR Framework for Case Studies in Portsmouth and Thames Gateway, U.K.

    This chapter reports on the use of the DPSIR framework to assess the sustainability of the intertidal environments within the two UK case study areas, Portsmouth and Thames Gateway. It focuses on statutory conservation areas dominated by intertidal habitats. Two are located in Portsmouth (Portsmouth and Langstone Harbours) and four in the Thames Gateway (Benfleet Marshes, South Thames Estuary, Medway Estuary, and the Swale). Based on the reduction of a number of pressures and impacts observed in recent decades and the improvement of overall environmental quality, all six SSSIs (Sites of Special Scientific Interest) are considered to be sustainable in the short and medium term. In the future, it is possible that the impacts of climate change, especially sea-level rise, might result in further reduction in the area and/or quality of intertidal habitats. Further integration between conservation and planning objectives (both for urban development and management of flood risk) at the local level is needed to support the long-term sustainability of intertidal habitats.

    Planck-LFI radiometers' spectral response

    The Low Frequency Instrument (LFI) is an array of pseudo-correlation radiometers on board the Planck satellite, the ESA mission dedicated to precision measurements of the Cosmic Microwave Background. The LFI covers three bands centred at 30, 44 and 70 GHz, with a goal bandwidth of 20% of the central frequency. The characterization of the broadband frequency response of each radiometer is necessary to understand and correct for systematic effects, particularly those related to foreground residuals and polarization measurements. In this paper we present the measured band shape of all the LFI channels and discuss the methods adopted for their estimation. The spectral characterization of each radiometer was obtained by combining the measured spectral responses of its individual units through a dedicated RF model of the LFI receiver scheme. As a consistency check, we also attempted end-to-end spectral measurements of the integrated radiometer chain in a cryogenic chamber. However, due to systematic effects in the measurement setup, only qualitative results were obtained from these tests. The measured LFI bandpasses exhibit a moderate level of ripple, compatible with the instrument's scientific requirements.
    Comment: 16 pages, 9 figures, this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
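    A minimal sketch of the unit-combination idea described above: to first order (ignoring mismatch and standing waves), the end-to-end power response of cascaded RF stages at each frequency is the product of the stage gains. This is an assumption about what the combination looks like, not the LFI RF model itself, and the stage profiles below are synthetic placeholders:

        # Toy cascaded-bandpass combination; synthetic Gaussian stage gains,
        # not measured LFI responses.
        import numpy as np

        freq = np.linspace(63.0, 77.0, 141)  # GHz, hypothetical 70 GHz channel grid

        def stage_gain(centre, width):
            """Toy Gaussian gain profile for one RF stage (linear power units)."""
            return np.exp(-0.5 * ((freq - centre) / width) ** 2)

        stages = [stage_gain(70.0, 6.0),   # feed/OMT
                  stage_gain(69.5, 5.0),   # front-end amplifier
                  stage_gain(70.5, 4.5)]   # back-end filter + detector

        bandpass = np.prod(stages, axis=0)  # cascaded response
        bandpass /= bandpass.max()          # normalise to unit peak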

    Planck Intermediate Results. IV. The XMM-Newton validation programme for new Planck galaxy clusters

    We present the final results from the XMM-Newton validation follow-up of new Planck galaxy cluster candidates. We observed 15 new candidates, detected with signal-to-noise ratios between 4.0 and 6.1 in the 15.5-month nominal Planck survey. The candidates were selected using ancillary data flags derived from the ROSAT All Sky Survey (RASS) and Digitized Sky Survey all-sky maps, with the aim of pushing into the low SZ flux, high-z regime and testing RASS flags as indicators of candidate reliability. 14 new clusters were detected by XMM-Newton, including 2 double systems. Redshifts lie in the range 0.2 to 0.9, with 6 clusters at z > 0.5. Estimated M500 values range from 2.5 x 10^14 to 8 x 10^14 Msun. We discuss our results in the context of the full XMM-Newton validation programme, in which 51 new clusters have been detected. This includes 4 double and 2 triple systems, some of which are chance projections on the sky of clusters at different z. We find that association with a RASS-BSC source is a robust indicator of the reliability of a candidate, whereas association with a RASS-FSC source does not guarantee that the SZ candidate is a bona fide cluster. Nevertheless, most Planck clusters appear in RASS maps, with a significance greater than 2 sigma being a good indication that the candidate is a real cluster. The full sample gives a Planck sensitivity threshold of Y500 ~ 4 x 10^-4 arcmin^2, with an indication of Malmquist bias in the YX-Y500 relation below this level. The corresponding mass threshold depends on z. Systems with M500 > 5 x 10^14 Msun at z > 0.5 are easily detectable with Planck. The newly-detected clusters follow the YX-Y500 relation derived from X-ray selected samples. Compared to X-ray selected clusters, the new SZ clusters have, on average, a lower X-ray luminosity for their mass. There is no indication of departure from standard self-similar evolution in the X-ray versus SZ scaling properties. (abridged)
    Comment: accepted by A&A
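    For background, the YX-Y500 relation mentioned above compares the X-ray and SZ proxies for the same thermal energy of the intracluster gas; the standard definitions (context, not restated in the abstract) are:

        % X-ray mass proxy and integrated Compton parameter within R_500:
        \begin{align}
          Y_X &= M_{g,500}\, T_X
              && \text{(gas mass times X-ray temperature),} \\
          D_A^2\, Y_{500} &= \frac{\sigma_T}{m_e c^2}
              \int_{R_{500}} P_e \,\mathrm{d}V
              && \text{(SZ signal integrated over the cluster volume).}
        \end{align}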

    Mathematical modelling of operation modes and performance evaluation of an innovative small-scale concentrated solar organic Rankine cycle plant

    In this paper, an innovative small-scale concentrated solar 2 kWe organic Rankine cycle plant, coupled with a phase change material storage tank equipped with reversible heat pipes, is investigated using a simulation analysis. The plant, intended for residential applications, is going to be built and tested under the EU-funded H2020 Innova MicroSolar project, executed by a consortium of several universities and industrial organizations led by Northumbria University. The authors of this work used the design of the integrated system, developed by the consortium, to make a preliminary estimate of the overall performance of the system and provide useful information for its forthcoming real operation. In particular, the influence of the different operation modes of the prototype plant under varying ambient conditions is evaluated. The dynamic simulation analysis has shown promising performance of the system in terms of annual operating hours, power production, and conversion efficiencies. More precisely, the organic Rankine cycle unit is able to operate for more than 3100 h/year, achieving the design performance when solar power is sufficiently high and producing about 5100 kWhe/year. For the considered operating set-point temperatures of the thermal energy storage, the plant is able to reach high conversion efficiency even when the organic Rankine cycle unit is supplied by discharging the energy stored in the storage tank, for about 800 h/year. Hence, the work has provided some useful insights into the best working conditions of such a micro combined heat and power system to be integrated in residential buildings. Moreover, the analysis could serve as a general guide for the design and optimization of the mutual interactions of the different subsystems in small-scale concentrated solar organic Rankine cycle plants.
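    A back-of-the-envelope check of the quoted figures (our arithmetic, not taken from the paper) helps put them in perspective:

        \begin{align}
          \bar{P} &= \frac{5100\ \mathrm{kWh_e/yr}}{3100\ \mathrm{h/yr}}
                   \approx 1.65\ \mathrm{kW_e}
            && \text{(about 82\% of the 2 kW$_e$ rating while running),} \\
          \mathrm{CF} &= \frac{5100\ \mathrm{kWh_e}}
                              {2\ \mathrm{kW_e} \times 8760\ \mathrm{h}}
                   \approx 0.29
            && \text{(annual capacity factor).}
        \end{align}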

    Effect of Fourier filters in removing periodic systematic effects from CMB data

    We consider the application of high-pass Fourier filters to remove periodic systematic fluctuations from full-sky survey CMB datasets. We compare the filter performance with destriping codes commonly used to remove the effect of residual 1/f noise from timelines. As a realistic working case, we use simulations of the typical Planck scanning strategy and Planck Low Frequency Instrument noise performance, with spurious periodic fluctuations that mimic a typical thermal disturbance. We show that the application of Fourier high-pass filters in chunks always requires subsequent normalisation of the induced offsets by means of destriping. For a complex signal containing all the astrophysical and instrumental components, the result obtained by applying filtering and destriping in series is comparable to the result obtained by destriping alone, which makes the usefulness of Fourier filters questionable for removing this kind of effect.
    Comment: 10 pages, 8 figures, published in Astronomy & Astrophysics
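    A minimal sketch of the chunk-wise Fourier high-pass filtering discussed above, applied to a simulated time-ordered data (TOD) stream; the cutoff frequency, chunk length, and sampling rate are illustrative choices, not the values used in the paper:

        # Chunk-wise high-pass FFT filter on a toy TOD: white noise plus a
        # slow "thermal" sinusoid. Zeroing low frequencies independently per
        # chunk leaves each chunk with zero mean, i.e. it induces per-chunk
        # offsets that must afterwards be reconciled by destriping, as the
        # abstract points out.
        import numpy as np

        def highpass_chunks(tod, fsamp, fcut, chunk_len):
            """High-pass each chunk of `tod` in the Fourier domain."""
            out = np.empty_like(tod)
            for start in range(0, len(tod), chunk_len):
                chunk = tod[start:start + chunk_len]
                spec = np.fft.rfft(chunk)
                freqs = np.fft.rfftfreq(len(chunk), d=1.0 / fsamp)
                spec[freqs < fcut] = 0.0  # remove slow periodic fluctuations
                out[start:start + chunk_len] = np.fft.irfft(spec, n=len(chunk))
            return out

        fsamp = 32.5                            # Hz, an LFI-like sampling rate
        t = np.arange(0, int(3600 * fsamp)) / fsamp
        tod = np.random.normal(0, 1, t.size) + 0.5 * np.sin(2 * np.pi * t / 60.0)
        clean = highpass_chunks(tod, fsamp, fcut=1.0 / 40.0, chunk_len=60 * int(fsamp))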