
    Optimization of Planck/LFI on-board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on the ground by a reversing step. This paper illustrates the LFI scientific onboard processing used to fit the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided by simulations, by pre-launch tests, or by data taken from LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which ends with optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at 3.8% of the white noise rms, well within the requirements.
    Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0, 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub. 29 Dec 09; This is a preprint, not the final version
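    The averaging-and-quantization core of such a chain can be sketched in a few lines. This is a minimal illustration, not the actual LFI pipeline: the function names and parameter values are assumptions, and the r1/r2 sky-load mixing step is omitted; only the Naver, q, O parameters and the EpsilonQ definition come from the abstract.

    ```python
    import numpy as np

    def onboard_process(samples, naver=16, q=0.1, offset=0.0):
        """Illustrative sketch of an LFI-style lossy chain: average Naver
        consecutive samples, then quantize with step q around an offset O.
        (The real pipeline also mixes sky/reference signals via r1, r2.)"""
        n = len(samples) // naver * naver
        averaged = samples[:n].reshape(-1, naver).mean(axis=1)
        quantized = np.round((averaged - offset) / q)   # integers sent to ground
        reconstructed = quantized * q + offset          # on-ground reversing step
        return averaged, reconstructed

    def epsilon_q(averaged, reconstructed, white_noise_rms):
        """Processing distortion as a fraction of the white-noise rms."""
        return np.std(averaged - reconstructed) / white_noise_rms

    rng = np.random.default_rng(0)
    noise_rms = 1.0
    raw = rng.normal(0.0, noise_rms, 160_000)           # simulated white noise
    avg, rec = onboard_process(raw, naver=16, q=0.1)
    eq = epsilon_q(avg, rec, noise_rms)                 # ~ q/sqrt(12) for fine quantization
    ```

    With a quantization step much smaller than the averaged-signal spread, the distortion settles near the classical uniform-quantization value q/sqrt(12), a few percent of the white-noise rms, comparable in spirit to the 3.8% achieved figure quoted above.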

    New scenario for transition to slow 3D turbulence

    An analytical, non-perturbative study of the three-dimensional nonlinear stochastic partial differential equation with additive thermal noise, analogous to that proposed by V.N. Nikolaevskii [1]-[5] to describe longitudinal seismic waves, is presented. The equation has a threshold of short-wave instability and a symmetry that provides long-wave dynamics. A new mechanism for the generation of quantum chaos in nonlinear dynamical systems with an infinite number of degrees of freedom is proposed. We hypothesize that physical turbulence can be identified with quantum chaos of the considered type. It is shown that the additive thermal noise dramatically destabilizes the ground state of the Nikolaevskii system, causing it to make a direct transition from a spatially uniform to a turbulent state.
    Comment: 23 pages

    Efficient use of bit planes in the generation of motion stimuli

    The production of animated motion sequences on computer-controlled display systems presents a technical problem because large images cannot be transferred from disk storage to image memory at conventional frame rates. A technique is described in which a single base image can be used to generate a broad class of motion stimuli without the need for such memory transfers. This technique was applied to the generation of drifting sine-wave gratings (and by extension, sine-wave plaids). For each drifting grating, sine and cosine spatial phase components are first reduced to 1 bit/pixel using a digital halftoning technique. The resulting pairs of 1-bit images are then loaded into pairs of bit planes of the display memory. To animate the patterns, the display hardware's color lookup table is modified on a frame-by-frame basis; for each frame the lookup table is set to display a weighted sum of the spatial sine and cosine phase components. Because the contrasts and temporal frequencies of the various components are mutually independent in each frame, the sine and cosine components can be counterphase modulated in temporal quadrature, yielding a single drifting grating. Using additional bit planes, multiple drifting gratings can be combined to form sine-wave plaid patterns. A large number of resultant plaid motions can be produced from a single image file because the temporal frequencies of all the components can be varied independently. For a graphics device having 8 bits/pixel, up to four drifting gratings may be combined, each having independently variable contrast and speed.
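    The identity behind the technique is cos(kx - wt) = cos(wt)cos(kx) + sin(wt)sin(kx): store halftoned cos(kx) and sin(kx) once in two bit planes, then animate purely by rewriting the 4-entry lookup table each frame. The sketch below assumes a crude random-threshold dither in place of the paper's (unspecified) halftoning algorithm; all function names are invented for illustration.

    ```python
    import numpy as np

    def halftone(image, rng):
        """Reduce an image in [-1, 1] to 1 bit/pixel by random-threshold
        dithering (a stand-in for the paper's digital halftoning technique)."""
        return (image > rng.uniform(-1, 1, image.shape)).astype(np.uint8)

    def make_bitplane_index(size=256, cycles=4, seed=0):
        """Pack halftoned cosine- and sine-phase gratings into a 2-bit index image."""
        rng = np.random.default_rng(seed)
        x = np.linspace(0, 2 * np.pi * cycles, size)
        cos_plane = halftone(np.cos(x)[None, :].repeat(size, 0), rng)
        sin_plane = halftone(np.sin(x)[None, :].repeat(size, 0), rng)
        return cos_plane | (sin_plane << 1)

    def frame_lut(t, temporal_freq, contrast=1.0):
        """Per-frame colour lookup table: each 2-bit index maps to a weighted
        sum of the cosine- and sine-phase components, so successive LUTs
        drift the grating without touching image memory."""
        c = contrast * np.cos(2 * np.pi * temporal_freq * t)
        s = contrast * np.sin(2 * np.pi * temporal_freq * t)
        lut = np.empty(4)
        for idx in range(4):
            w_cos = 1.0 if idx & 1 else -1.0   # cosine-phase bit plane
            w_sin = 1.0 if idx & 2 else -1.0   # sine-phase bit plane
            lut[idx] = 0.5 + 0.25 * (c * w_cos + s * w_sin)
        return lut

    index = make_bitplane_index()
    frame0 = frame_lut(0.0, temporal_freq=2.0)[index]   # one animation frame
    ```

    Extending to plaids only means more bit planes and a larger lookup table; the per-frame cost is still just the LUT rewrite.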

    Quantization of Prior Probabilities for Hypothesis Testing

    Bayesian hypothesis testing is investigated when the prior probabilities of the hypotheses, taken as a random vector, are quantized. Nearest neighbor and centroid conditions are derived using mean Bayes risk error as a distortion measure for quantization. A high-resolution approximation to the distortion-rate function is also obtained. Human decision making in segregated populations is studied assuming Bayesian hypothesis testing with quantized priors.
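    The mean Bayes risk error can be made concrete with a toy binary test. The sketch below assumes unit-variance Gaussian hypotheses, a uniformly distributed prior, and hand-picked representation points; these are illustrative choices, not the paper's model. Only the ideas of nearest-neighbor encoding and mean Bayes risk error come from the abstract.

    ```python
    import math

    def bayes_risk(p, p_used, d=1.0):
        """Error probability for deciding between N(0,1) and N(d,1) when the
        true prior on H1 is p but the likelihood-ratio threshold is set
        using a (possibly quantized) prior p_used."""
        tau = d / 2 + math.log((1 - p_used) / p_used) / d
        Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
        p_fa = 1 - Phi(tau)        # false alarm under H0 = N(0,1)
        p_miss = Phi(tau - d)      # miss under H1 = N(d,1)
        return (1 - p) * p_fa + p * p_miss

    def mean_bayes_risk_error(points, n_grid=1000):
        """Mean excess risk when a uniform prior is quantized to its nearest
        representation point (nearest-neighbour encoding)."""
        total = 0.0
        for i in range(1, n_grid + 1):
            p = (i - 0.5) / n_grid
            p_hat = min(points, key=lambda c: abs(c - p))
            total += bayes_risk(p, p_hat) - bayes_risk(p, p)
        return total / n_grid

    coarse = mean_bayes_risk_error([0.5])                        # 1 point
    fine = mean_bayes_risk_error([0.125, 0.375, 0.625, 0.875])   # 4 points
    ```

    Because the matched threshold is Bayes-optimal, the excess risk is non-negative everywhere and shrinks as representation points are added, which is the distortion-rate trade-off the paper characterizes.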

    Nonparametric estimation of the dynamic range of music signals

    The dynamic range is an important parameter which measures the spread of sound power, and for music signals it is a measure of recording quality. There are various descriptive measures of sound power, none of which has strong statistical foundations. We start from a nonparametric model for sound waves in which an additive stochastic term serves to capture transient energy. This component is recovered by a simple rate-optimal kernel estimator that requires a single data-driven tuning. The distribution of its variance is approximated by a consistent random subsampling method that can cope with the massive size of the typical dataset. Based on the latter, we propose a statistic and an estimation method that represent the dynamic-range concept consistently. The behavior of the statistic is assessed in a large numerical experiment where we simulate dynamic compression on a selection of real music signals. Application of the method to real data also shows how the proposed method can predict subjective experts' opinions about the hi-fi quality of a recording.
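    The two-stage idea, smooth to isolate the transient component, then subsample to approximate the distribution of its variance, can be sketched as follows. A boxcar moving average stands in for the paper's rate-optimal kernel estimator, and the synthetic signal, block size, and function names are all assumptions for illustration.

    ```python
    import numpy as np

    def kernel_residual(y, bandwidth=25):
        """Separate a slowly varying level from transient energy with a
        simple boxcar kernel smoother (a stand-in for the paper's
        rate-optimal estimator); the residual carries the stochastic term."""
        kernel = np.ones(bandwidth) / bandwidth
        trend = np.convolve(y, kernel, mode="same")
        residual = y - trend
        return residual, residual.var()

    def subsample_variance_dist(residual, block=2048, n_sub=200, seed=0):
        """Approximate the sampling distribution of the residual variance by
        random subsampling of blocks, which scales to very long recordings."""
        rng = np.random.default_rng(seed)
        starts = rng.integers(0, len(residual) - block, n_sub)
        return np.array([residual[s:s + block].var() for s in starts])

    rng = np.random.default_rng(1)
    # slow "musical" component plus noise standing in for transient energy
    signal = np.sin(np.linspace(0, 20 * np.pi, 100_000)) + rng.normal(0, 0.3, 100_000)
    res, v = kernel_residual(signal)
    dist = subsample_variance_dist(res)
    ```

    The spread of `dist` is what licenses statistical statements (e.g. confidence intervals) about a dynamic-range statistic, rather than the purely descriptive measures the abstract criticizes.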

    Optimal Image Reconstruction in Radio Interferometry

    We introduce a method for analyzing radio interferometry data that produces maps which are optimal in the Bayesian sense of maximum posterior probability density, given certain prior assumptions. It is similar to maximum entropy techniques, but with an exact accounting of the multiplicity instead of the usual approximation involving Stirling's formula. It also incorporates an Occam factor, automatically limiting the effective amount of detail in the map to that justified by the data. We use Gibbs sampling to determine, to any desired degree of accuracy, the multi-dimensional posterior density distribution. From this we can construct a mean posterior map and other measures of the posterior density, including confidence limits on any well-defined function of the posterior map.
    Comment: 41 pages, 11 figures. High resolution figures 8 and 9 available at http://www.astro.uiuc.edu/~bwandelt/SuttonWandelt200
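    The Gibbs-sampling step can be illustrated on a deliberately tiny stand-in problem: two "map pixels" with Gaussian priors observed only through their noisy sum (a crude analogue of an interferometric measurement, not the paper's actual model). Alternately drawing each pixel from its conditional posterior and averaging the draws yields the posterior-mean map and credible limits, exactly the kind of summaries the abstract describes.

    ```python
    import numpy as np

    def gibbs_posterior(y, noise_var=1.0, prior_var=4.0, n_iter=5000, seed=0):
        """Toy Gibbs sampler for two pixels a, b with N(0, prior_var) priors,
        observed through y = a + b + noise.  Each sweep draws a | b, y and
        b | a, y from their (conjugate Gaussian) conditional posteriors."""
        rng = np.random.default_rng(seed)
        a, b = 0.0, 0.0
        samples = np.empty((n_iter, 2))
        cond_var = 1.0 / (1.0 / noise_var + 1.0 / prior_var)
        for i in range(n_iter):
            a = rng.normal(cond_var * (y - b) / noise_var, np.sqrt(cond_var))
            b = rng.normal(cond_var * (y - a) / noise_var, np.sqrt(cond_var))
            samples[i] = (a, b)
        kept = samples[n_iter // 5:]                  # discard burn-in
        mean_map = kept.mean(axis=0)                  # posterior-mean "map"
        ci = np.quantile(kept, [0.025, 0.975], axis=0)  # 95% credible limits
        return mean_map, ci

    mean_map, ci = gibbs_posterior(y=3.0)
    ```

    For this linear-Gaussian toy the answer is known in closed form (each pixel's posterior mean is y * prior_var / (2 * prior_var + noise_var) = 4/3), which makes it a convenient check that the sampler has converged.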