
    Data Streams from the Low Frequency Instrument On-Board the Planck Satellite: Statistical Analysis and Compression Efficiency

    The expected data rate produced by the Low Frequency Instrument (LFI), planned to fly on the ESA Planck mission in 2007, is more than a factor of 8 larger than the bandwidth allowed by the spacecraft transmission system for downloading the LFI data. We discuss the application of lossless compression to Planck/LFI data streams in order to reduce the overall data flow. We perform both a theoretical analysis and experimental tests on realistically simulated data streams in order to determine the statistical properties of the signal and the maximal compression rate achievable by several lossless compression algorithms. We study the influence of the signal composition and of the acquisition parameters on the compression rate Cr and develop a semiempirical formalism to account for them. The best-performing compressor tested so far is arithmetic compression of order 1, designed to optimize the compression of white-noise-like signals, which yields an overall compression rate Cr = 2.65 +/- 0.02. We find that this result is not improved by other lossless compressors, since the signal is almost entirely white-noise dominated. Lossless compression algorithms alone will not solve the bandwidth problem and need to be combined with other techniques. Comment: May 3, 2000 release; 61 pages, 6 figures coded as eps, 9 tables (4 included as eps), LaTeX 2.09 + aasms4.sty (style file included); submitted for publication in PASP, May 3, 2000.
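    The entropy argument in the abstract above can be illustrated with a minimal sketch (not the actual LFI pipeline: the noise rms, the 16-bit quantization, and the use of zlib instead of an order-1 arithmetic coder are all hypothetical choices for illustration). A quantized Gaussian white-noise stream is compressed with a general-purpose lossless codec and the achieved rate is compared against the entropy bound for a quantized Gaussian, which no lossless compressor can beat.

```python
import zlib
import numpy as np

# Hypothetical parameters: a white-noise-dominated signal quantized to
# 16-bit integers (values chosen for illustration only).
rng = np.random.default_rng(0)
sigma_adu = 4.0                              # assumed noise rms in ADU
signal = rng.normal(0.0, sigma_adu, 100_000)
quantized = np.round(signal).astype(np.int16)

raw = quantized.tobytes()
compressed = zlib.compress(raw, level=9)
cr = len(raw) / len(compressed)
print(f"achieved compression rate Cr = {cr:.2f}")

# Entropy of a unit-step-quantized Gaussian (bits per sample),
# H ~ 0.5*log2(2*pi*e) + log2(sigma), bounds any lossless compressor.
h_bits = 0.5 * np.log2(2 * np.pi * np.e) + np.log2(sigma_adu)
print(f"entropy-bound compression rate = {16 / h_bits:.2f}")
```

    For noise-dominated data the achieved rate sits close to, and below, the entropy bound: this is why the abstract finds that switching lossless compressors cannot lift Cr much further.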

    Hip-Hopping Over the Great Firewall of China: Authenticity, Language and Race in the Global Hip Hop Nation

    This paper explores how Chinese youth interact with and relate to Hip Hop music and culture, and what this adaptation reveals about authenticity, class, race and regionalization in the age of digitized communication. I ethnographically observe how participants experience Chinese Hip Hop as part of the global spread of Hip Hop, a cultural phenomenon that ties together cosmopolitan marginalized youth identity and digital censorship, shedding light on relations of race, class, nationality and globalization among college-aged international Chinese students studying at Bard College in Annandale-on-Hudson, New York.

    CMB signal in WMAP 3yr data with FastICA

    We present an application of fast Independent Component Analysis (FastICA) to the WMAP 3yr data with the goal of extracting the CMB signal. We evaluate the confidence of our results by means of Monte Carlo simulations including CMB, foreground contamination and instrumental noise specific to each WMAP frequency band. We perform a complete analysis involving all WMAP channels, or subsets of them, in order to select the optimal combination for CMB extraction, using the frequency scaling of the reconstructed component as a figure of merit. We find that the combination KQVW provides the best CMB frequency scaling, indicating that the low-frequency foreground contamination in the Q, V and W bands is better traced by the emission in the K band. The CMB angular power spectrum is recovered up to the degree scale; it is consistent within errors for all WMAP channel combinations considered, and in close agreement with the WMAP 3yr results. We perform a statistical analysis of the recovered CMB pattern and confirm, with independent techniques, the sky asymmetry reported in several previous works. Comment: 10 pages, 7 figures, submitted to MNRAS.
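    The blind component-separation idea behind the abstract above can be sketched with a toy example (this is not the WMAP pipeline: the two source templates, the three-channel mixing matrix, and all numbers are invented for illustration). Two independent "emissions" are mixed with channel-dependent weights, and FastICA recovers them from the multi-channel data alone, much as the paper separates CMB from foregrounds across frequency bands.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy multi-channel data: two independent sources mixed into three channels.
rng = np.random.default_rng(1)
n = 5000
cmb = rng.standard_normal(n)        # stand-in for a Gaussian CMB pattern
foreground = rng.laplace(size=n)    # stand-in for non-Gaussian Galactic emission

mixing = np.array([[1.0, 2.5],      # hypothetical per-channel frequency scalings
                   [1.0, 1.2],
                   [1.0, 0.4]])
channels = np.column_stack([cmb, foreground]) @ mixing.T

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(channels)    # blind source estimates

# Match recovered components to the inputs by absolute correlation.
sources = np.column_stack([cmb, foreground])
corr = np.abs(np.corrcoef(recovered.T, sources.T))[:2, 2:]
print(corr)
```

    Each input source correlates strongly with exactly one recovered component (up to sign and scale, which ICA cannot fix); in the paper, the frequency scaling of the recovered component is then used to identify which one is the CMB.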

    Processing of Irradiated 241Am Targets by Ion Exchange and Extraction. EUR 4409.

    Background: Evidence supporting the effectiveness of care management programs for complex patients has been inconclusive. However, past reviews have not focused on complexity primarily defined by multimorbidity and healthcare utilization. We conducted a systematic review of care management interventions targeting the following three patient groups: adults with two or more chronic medical conditions, adults with at least one chronic medical condition and concurrent depression, and adults identified based solely on high past or predicted healthcare utilization.
    Methods: Eligible studies were identified from PubMed, published between 06/01/2005 and 05/31/2015, and reported findings from a randomized intervention that tested a comprehensive care management intervention. Identified interventions were grouped based on the three "complex" categories of interest (described above). Two investigators extracted data using a structured abstraction form and assessed RCT quality.
    Results: We screened 989 article titles for eligibility, of which 847 were excluded. After reviewing the remaining 142 abstracts, 83 articles were excluded. We then reviewed the full text of the remaining 59 articles and identified 15 unique RCTs for the final analysis. Of these 15 studies, two focused on patients with two or more chronic medical conditions, seven on patients with at least one chronic medical condition and depression, and six on patients with high past or predicted healthcare utilization. Measured outcomes included utilization, chronic disease measures, and patient-reported outcomes. The seven studies targeting patients with at least one chronic medical condition and depression demonstrated significant improvement in depression symptoms (ranging from 9.2% to 48.7% improvement). Of the six studies that focused on high utilizers, two showed small reductions in utilization. The quality of the research methodology in most of the studies (12/15) was rated fair or poor.
    Conclusions: Interventions were more likely to be successful when patients were selected based on having at least one chronic medical condition and concurrent depression, and when patient-reported outcomes were assessed. Future research should focus on the role of mental health in complex care management, on finding better methods for identifying patients who would benefit most from care management, and on determining which intervention components are needed for which patients.

    Trade-off between angular resolution and straylight contamination in CMB anisotropy experiments. II. Straylight evaluation

    Satellite CMB anisotropy missions and the new generation of balloon-borne and ground-based experiments make use of complex multi-frequency instruments at the focus of a meter-class telescope. Between 70 GHz and 300 GHz, where foreground contamination is minimal, it is extremely important to reach the best trade-off between improving the angular resolution and minimizing the straylight contamination, which is mainly due to Galactic emission. We focus here, as a working case, on the 30 and 100 GHz channels of the Planck Low Frequency Instrument (LFI). We evaluate the Galactic straylight contamination (GSC) introduced by the most relevant Galactic foreground components for a reference set of optical configurations. We show that it is possible to improve the angular resolution by 5-7% while keeping the overall GSC below the level of a few microkelvin. A comparison between the levels of straylight introduced by the different Galactic components for different beam regions is presented. Simple approximate relations giving the rms and peak-to-peak levels of the GSC are provided. We compare the results obtained at 100 GHz with those at 30 GHz, where the GSC is more critical. Finally, we compare the results based on Galactic foreground templates derived from radio and IR surveys with those based on WMAP maps including CMB and extragalactic source fluctuations. Comment: Submitted to A&A. The quality of the figures was degraded for size-related reasons.

    On the loss of telemetry data in full-sky surveys from space

    In this paper we discuss the issue of losing telemetry (TM) data for various reasons (e.g. spacecraft-to-ground transmissions) while performing a full-sky survey with space-borne instrumentation. This is a particularly important issue for current and future space missions (like Planck from ESA and WMAP from NASA) operating from an orbit far from Earth, with short periods of visibility from ground stations. We consider, as a working case, the Low Frequency Instrument (LFI) on board the Planck satellite, although the approach developed here can easily be applied to any experiment whose observing (scanning) strategy assumes repeated pointings of the same region of the sky on different time scales. The issue is addressed by means of a Monte Carlo approach. Our analysis clearly shows that, under quite general conditions, it is better to cover the sky more times with a lower fraction of TM retained than fewer times with a higher guaranteed TM fraction. In the case of Planck, an extension of the mission time to allow a third sky coverage with 95% of the total TM guaranteed provides a significant reduction of the probability of losing scientific information with respect to an increase of the total guaranteed TM to 98% with the two nominal sky coverages. Comment: 17 pages, 6 figures, accepted for publication in New Astronomy.
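    The trade-off in the abstract above can be sketched with a minimal Monte Carlo toy model (this is not the paper's actual simulation: the independent per-chunk survival model, the chunk count, and the trial count are simplifying assumptions). Each sky chunk survives a coverage with probability equal to the guaranteed TM fraction, and a chunk is lost only if it is missed in every coverage.

```python
import numpy as np

rng = np.random.default_rng(2)

def lost_fraction(n_coverages, tm_fraction, n_chunks=10_000, n_trials=200):
    """Average fraction of sky chunks never observed in any coverage,
    assuming each chunk independently survives each coverage with
    probability tm_fraction (an illustrative simplification)."""
    hits = rng.random((n_trials, n_coverages, n_chunks)) < tm_fraction
    lost = ~hits.any(axis=1)          # missed in every coverage
    return lost.mean()

two_cov = lost_fraction(2, 0.98)      # two coverages, 98% TM guaranteed
three_cov = lost_fraction(3, 0.95)    # three coverages, 95% TM guaranteed
print(f"2 coverages @ 98% TM: lost fraction {two_cov:.2e}")
print(f"3 coverages @ 95% TM: lost fraction {three_cov:.2e}")
```

    Under this independence assumption the expected loss is simply (1 - p)^n, i.e. 0.02^2 = 4e-4 versus 0.05^3 = 1.25e-4, so the third coverage at a lower guaranteed TM fraction wins, in line with the paper's conclusion.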