
    The Future of Primordial Features with 21 cm Tomography

    Detecting a deviation from a featureless primordial power spectrum of fluctuations would give profound insight into the physics of the primordial Universe. Depending on their nature, primordial features can either provide direct evidence for the inflation scenario or pin down details of the inflation model. Thus far, using the cosmic microwave background (CMB) we have only been able to put stringent constraints on the amplitude of features, but no significant evidence has been found for such signals. Here we explore the limit of the experimental reach in constraining such features using 21 cm tomography at high redshift. A measurement of the 21 cm power spectrum from the Dark Ages is generally considered the ideal experiment for early Universe physics, with potential access to a large number of modes. We consider three different categories of theoretically motivated models: sharp feature models, resonance models, and standard clock models. We study the improvement of bounds on features as a function of the total number of observed modes and identify parameter degeneracies. The detectability depends critically on the amplitude, frequency and scale-location of the features, as well as on the angular and redshift resolution of the experiment. We quantify these effects by considering different fiducial models. Our forecast shows that a cosmic-variance-limited 21 cm experiment measuring fluctuations in the redshift range 30 ≀ z ≀ 100 with a 0.01-MHz bandwidth and sub-arcminute angular resolution could potentially improve bounds by several orders of magnitude for most features compared to current Planck bounds. At the same time, 21 cm tomography also opens up a unique window into features that are located on very small scales. Comment: Matches version accepted for publication. Changes made to forecasting, using k space instead of \ell space; forecasted constraints significantly improved for some features.
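The connection between the number of observed modes and the attainable bounds can be illustrated with a toy one-parameter Fisher forecast. Everything in this sketch (the fiducial spectrum, the oscillatory feature template, the band and mode counts) is an illustrative assumption, not the paper's actual forecasting setup.

```python
import numpy as np

# Toy Fisher forecast: error on the amplitude A of an oscillatory feature
# P(k) = P0(k) * (1 + A*sin(omega*log k)), measured with cosmic-variance
# band errors sigma_P = sqrt(2/N_modes) * P(k). All values are illustrative.
def sigma_A(n_modes_per_band, n_bands=200, omega=10.0):
    k = np.logspace(-2, 0, n_bands)
    P0 = k**-1.5                               # smooth fiducial spectrum (arbitrary)
    dP_dA = P0 * np.sin(omega * np.log(k))     # derivative at the fiducial A = 0
    var_P = 2.0 / n_modes_per_band * P0**2     # cosmic-variance error per band
    F = np.sum(dP_dA**2 / var_P)               # 1x1 Fisher matrix
    return 1.0 / np.sqrt(F)

# Quadrupling the mode count halves the forecast error:
print(sigma_A(1e4) / sigma_A(4e4))  # ≈ 2.0
```

This is why access to the vastly larger number of modes in the Dark Ages 21 cm field translates directly into tighter feature bounds: the forecast error scales as one over the square root of the mode count.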

    Detection of Mines in Acoustic Images using Higher Order Spectral Features

    A new pattern-recognition algorithm detects approximately 90% of the mines hidden in the Coastal Systems Station Sonar 0, 1, and 3 databases of cluttered acoustic images, with about 10% false alarms. Similar to other approaches, the algorithm presented here processes the images with an adaptive Wiener filter (the degree of smoothing depends on the signal strength in a local neighborhood) to remove noise without destroying the structural information in the mine shapes, followed by a two-dimensional FIR filter designed to suppress noise and clutter while enhancing the target signature. A double-peak pattern is produced as the FIR filter passes over mine highlight and shadow regions. Although the location, size, and orientation of this pattern within a region of the image can vary, features derived from higher order spectra (HOS) are invariant to translation, rotation, and scaling, while capturing the spatial correlations of mine-like objects. Classification accuracy is improved by combining features based on geometrical properties of the filter output with features based on HOS. The highest accuracy is obtained by fusing classification based on bispectral features with classification based on trispectral features.
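The shift invariance of the HOS features mentioned above can be checked directly: a translation multiplies the Fourier transform by a linear phase, which cancels in the bispectrum B(f1, f2) = X(f1) X(f2) X*(f1 + f2). A minimal 1-D sketch (the random signal and sizes are arbitrary, not taken from the paper's databases):

```python
import numpy as np

# The bispectrum cancels the linear phase a circular shift adds to the FFT,
# so its value is identical for a signal and any translated copy of it.
def bispectrum(x):
    X = np.fft.fft(x)
    n = len(x)
    f = np.arange(n)
    # B(f1, f2) = X(f1) * X(f2) * conj(X(f1 + f2)), indices taken mod n
    return X[f, None] * X[None, f] * np.conj(X[(f[:, None] + f[None, :]) % n])

rng = np.random.default_rng(0)
x = rng.standard_normal(32)
x_shifted = np.roll(x, 5)            # circular translation by 5 samples
B1, B2 = bispectrum(x), bispectrum(x_shifted)
print(np.max(np.abs(B1 - B2)))       # ~0: equal up to floating-point round-off
```

Rotation and scale invariance require additional processing of the HOS values, but translation invariance is exact, which is what lets the double-peak signature be detected anywhere in an image region.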

    FASTLens (FAst STatistics for weak Lensing) : Fast method for Weak Lensing Statistics and map making

    With increasingly large data sets, weak lensing measurements are able to measure cosmological parameters with ever greater precision. However, this increased accuracy also places greater demands on the statistical tools used to extract the available information. To date, the majority of lensing analyses use the two-point statistics of the cosmic shear field. These can either be studied directly using the two-point correlation function, or in Fourier space using the power spectrum. But analyzing weak lensing data inevitably involves masking out regions, for example to remove bright stars from the field. Masking out the stars is common practice, but the gaps in the data need proper handling. In this paper, we show how an inpainting technique allows us to properly fill in these gaps with only N log N operations, leading to a new image from which we can compute both the power spectrum and the bispectrum straightforwardly and with very good accuracy. We then propose a new method to compute the bispectrum with a polar FFT algorithm, which has the main advantage of avoiding any interpolation in the Fourier domain. Finally, we propose a new method for dark matter mass map reconstruction from shear observations which integrates this new inpainting concept. A range of examples based on 3D N-body simulations illustrates the results. Comment: Final version accepted by MNRAS. The FASTLens software is available from the following link: http://irfu.cea.fr/Ast/fastlens.software.ph
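The gap-filling idea above can be sketched with a toy sparsity-based inpainting loop. FASTLens works on 2-D shear maps with a DCT dictionary; the 1-D Fourier-domain version below, including the two-sinusoid test signal and the decreasing threshold schedule, is an illustrative simplification and not the published algorithm.

```python
import numpy as np

# Iterative-thresholding inpainting: assume the signal is sparse in Fourier
# space, alternately threshold the coefficients and re-impose the observed
# samples. Each iteration costs O(N log N) via the FFT.
rng = np.random.default_rng(1)
n = 256
t = np.arange(n)
signal = np.sin(2*np.pi*5*t/n) + 0.5*np.sin(2*np.pi*12*t/n)  # Fourier-sparse
mask = rng.random(n) > 0.2               # True = observed; ~20% of samples masked
observed = signal * mask                 # gaps are filled with zeros

x = observed.copy()
for lam in np.linspace(0.9, 0.0, 50):    # decreasing relative threshold
    c = np.fft.fft(x)
    c[np.abs(c) < lam * np.abs(c).max()] = 0   # keep only strong coefficients
    x = np.fft.ifft(c).real
    x[mask] = signal[mask]                     # re-impose the observed data

print(np.max(np.abs(x - signal)))        # small residual inside the gaps
```

The point of inpainting the gaps first, rather than working with the masked field, is that the mask's window function would otherwise leak into both the power spectrum and the bispectrum.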

    Detection of emotions in Parkinson's disease using higher order spectral features from brain's electrical activity

    Non-motor symptoms in Parkinson's disease (PD) involving cognition and emotion have been receiving progressively more attention in recent times. Electroencephalogram (EEG) signals, as an activity of the central nervous system, can reflect the underlying true emotional state of a person. This paper presents a computational framework for classifying PD patients against healthy controls (HC) using emotional information from the brain's electrical activity.

    Bispectral reconstruction of speckle-degraded images

    The bispectrum of a signal has useful properties: it is zero for a Gaussian random process, it retains both the phase and magnitude information of the signal's Fourier transform, and it is insensitive to linear motion. It has found applications in a wide variety of fields. The use of these properties for reducing speckle in coherent imaging systems was investigated, and it was found that the bispectrum could be used to restore speckle-degraded images. Coherent speckle noise is modeled as a multiplicative noise process. By applying a logarithmic transformation, this speckle noise is converted to a signal-independent, additive process which is close to Gaussian when an integrating aperture is used. Bispectral reconstruction of speckle-degraded images is performed on such logarithmically transformed images when multiple independent snapshots are available.
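The multiplicative-to-additive conversion described above can be demonstrated numerically. The unit-mean exponential speckle model, the snapshot count, and the toy 1-D "image" below are illustrative assumptions, not the paper's actual imaging setup.

```python
import numpy as np

# Fully developed speckle is modeled as unit-mean multiplicative noise.
# Taking logs turns it into additive noise that averages away over
# independent snapshots; exponentiating recovers the underlying image.
rng = np.random.default_rng(2)
image = np.linspace(1.0, 2.0, 1000)                    # toy 1-D "image"
snapshots = image * rng.exponential(1.0, (64, 1000))   # 64 independent looks

log_avg = np.mean(np.log(snapshots), axis=0)           # additive noise averages out
# log of unit-mean exponential speckle has mean -EulerGamma; remove that bias.
restored = np.exp(log_avg + np.euler_gamma)

err_multi = np.mean(np.abs(restored - image))
err_single = np.mean(np.abs(snapshots[0] - image))
print(err_multi < err_single)    # averaging 64 log-domain snapshots helps
```

In the paper's setting the averaging happens through the bispectrum of the log-transformed snapshots rather than a plain mean, but the enabling step is the same logarithmic conversion shown here.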

    Parametrized modified gravity constraints after Planck

    We constrain f(R) and chameleon-type modified gravity in the framework of the Bertschinger-Zukin parametrization using the recently released Planck data, including both the CMB temperature power spectrum and the lensing potential power spectrum. Some other external data sets are included, such as BAO measurements from the 6dFGS, SDSS DR7 and BOSS DR9 surveys, the HST H_0 measurement and supernovae from the Union2.1 compilation. We also use WMAP9yr data for consistency checks and comparison. For f(R) gravity, WMAP9yr results can only give quite a loose constraint on the modified gravity parameter B_0, which is related to the present value of the Compton wavelength of the extra scalar degree of freedom: B_0 < 3.37 at 95% C.L. We demonstrate that this constraint mainly comes from the late ISW effect. With only the Planck CMB temperature power-spectrum data, we can improve the WMAP9yr result by a factor of 3.7 (B_0 < 0.91 at 95% C.L.). If the Planck lensing potential power-spectrum data are also taken into account, the constraint can be further strengthened by a factor of 5.1 (B_0 < 0.18 at 95% C.L.). This major improvement mainly comes from the small-scale lensing signal. Furthermore, BAO, HST and supernovae data can slightly improve the B_0 bound (B_0 < 0.12 at 95% C.L.). For the chameleon-type model, we find that the data sets we used cannot constrain the Compton wavelength B_0 and the potential index s of the chameleon field, but can give a tight constraint on the parameter ÎČ_1 = 1.043^{+0.163}_{-0.104} at 95% C.L. (ÎČ_1 = 1 in general relativity), which accounts for the non-minimal coupling between the chameleon field and the matter component. In addition, we find that both modified gravity models we considered favor a relatively higher Hubble parameter than the concordance LCDM model in general relativity. Comment: Matches the published version. Several numerical bugs in the modified gravity parameters were removed; the updated results are based on the analysis of new chains. The B_0 constraint becomes looser; other parameter bounds do not change. One more figure was added to explain the degeneracy between ÎČ_1 and B_0 in the chameleon-type model.

    Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

    This reprint is dedicated to the study of brain activity related to emotional and attentional involvement as measured by Brain–computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electric or hemodynamic brain activity indicators) into a command to execute an action in the BCI application (e.g., a wheelchair, the cursor on a screen, a spelling device or a game). These tools have the advantage of real-time access to the ongoing brain activity of the individual, which can provide insight into the user’s emotional and attentional states by training a classification algorithm to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to “think outside the lab”. The integration of technological solutions, artificial intelligence and cognitive science has allowed, and will continue to allow, researchers to envision more and more applications. The clinical and everyday uses are described with the aim of inviting readers to open their minds and imagine potential further developments.
