
    Microencapsulation of Phenolic Extracts from Cocoa Shells to Enrich Chocolate Bars

    Cocoa bean shells were subjected to green extraction technologies, which avoid toxic organic solvents, to recover polyphenols; the extract was then encapsulated by spray drying with maltodextrin as the coating agent. The best conditions observed in the spray-drying tests (core-to-coating ratio 1:5; inlet temperature 150 °C; flow rate 6 mL min^-1) were applied to produce the microcapsules, which were used to enrich cocoa mass from the same beans as the shells and were then processed into chocolate bars. Sensory analysis showed no significant differences between the enriched chocolate bar and the unenriched reference, except for appearance. Both samples were then subjected to accelerated storage tests, at the end of which the polyphenol content of the control chocolate bar (0.85 g 100 g^-1) had decreased by about 50% (to 0.42 g 100 g^-1), whereas that of the enriched chocolate (1.17 g 100 g^-1) had decreased by only 22% (to 0.97 g 100 g^-1). The proposed process significantly enriched the chocolate bars with phenolic antioxidants recovered from cocoa waste without increasing the sensations of bitterness and astringency.

    Spherical 3D Isotropic Wavelets

    Future cosmological surveys will provide 3D large-scale structure maps with large sky coverage, for which a 3D Spherical Fourier-Bessel (SFB) analysis in spherical coordinates is natural. Wavelets are particularly well suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist for analysing spherical 3D data. The aim of this paper is to present a new formalism for a spherical 3D isotropic wavelet, i.e. one based on the SFB decomposition of a 3D field, and to accompany the formalism with a public code to perform wavelet transforms. We describe a new 3D isotropic spherical wavelet decomposition based on the undecimated wavelet transform (UWT) described in Starck et al. 2006. We also present a new fast Discrete Spherical Fourier-Bessel Transform (DSFBT) based on both a discrete Bessel Transform and the HEALPIX angular pixelisation scheme. We test the 3D wavelet transform and, as a toy application, apply a denoising algorithm in wavelet space to the Virgo large box cosmological simulations, finding that we can successfully remove noise without much loss to the large-scale structure. We have described a new spherical 3D isotropic wavelet transform, ideally suited to analysing and denoising future 3D spherical cosmological surveys, which uses a novel Discrete Spherical Fourier-Bessel Transform. We illustrate its potential use for denoising with a toy model. All the algorithms presented in this paper are available for download as a public code called MRS3D at http://jstarck.free.fr/mrs3d.html. Comment: 9 pages + appendices. Public code can be downloaded at http://jstarck.free.fr/mrs3d.html. Corrected typos and updated references. Accepted for publication in Astronomy and Astrophysics.
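
    The 3D transform itself is implemented in the authors' public MRS3D code. Purely as an illustration of the underlying à trous undecimated wavelet idea (in 1D and in Python; the function names and thresholding choices below are ours, not part of MRS3D), a minimal sketch of UWT-based denoising could look like this:

    ```python
    import numpy as np

    def atrous_uwt(signal, n_scales=4):
        """Undecimated (a trous) B3-spline wavelet transform of a 1D signal.

        Returns the wavelet (detail) coefficients per scale and the final
        coarse approximation; summing them all recovers the input exactly.
        """
        h = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0   # B3-spline kernel
        c = signal.astype(float)
        details = []
        for j in range(n_scales):
            # At scale j, insert 2**j - 1 zeros ("holes") between kernel taps
            kernel = np.zeros(4 * 2**j + 1)
            kernel[::2**j] = h
            c_next = np.convolve(c, kernel, mode="same")
            details.append(c - c_next)        # wavelet coefficients at scale j
            c = c_next
        return details, c

    def uwt_denoise(signal, n_scales=4, k_sigma=3.0):
        """Hard-threshold the coefficients at k_sigma times a robust
        per-scale noise estimate, then reconstruct."""
        details, coarse = atrous_uwt(signal, n_scales)
        out = coarse.copy()
        for w in details:
            sigma = 1.4826 * np.median(np.abs(w))   # MAD noise estimate
            out += np.where(np.abs(w) > k_sigma * sigma, w, 0.0)
        return out

    # Toy usage: noise is removed while the large-scale shape survives
    x = np.linspace(0, 2 * np.pi, 1024)
    noisy = np.sin(3 * x) + 0.3 * np.random.default_rng(0).normal(size=x.size)
    clean = uwt_denoise(noisy)
    ```

    The same split, threshold, and recombine pattern is the generic form of the wavelet-space denoising referred to above, with the coefficients living in spherical Fourier-Bessel space in the 3D case.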

    Observational biases in Lagrangian reconstructions of cosmic velocity fields

    Lagrangian reconstruction of large-scale peculiar velocity fields can be strongly affected by observational biases. We develop a thorough analysis of these systematic effects by relying on specially selected mock catalogues. For the purposes of this paper we use the MAK reconstruction method, although any other Lagrangian reconstruction method should be sensitive to the same problems. We extensively study the uncertainty in the mass-to-light assignment due to luminosity incompleteness and the poorly determined relation between mass and luminosity. The impact of redshift-distortion corrections is analyzed in the context of MAK, and we check the importance of edge and finite-volume effects on the reconstructed velocities. Using three mock catalogues with different average densities, we also study the effect of cosmic variance. In particular, one of them presents the same global features as found in observational catalogues that extend to 80 Mpc/h scales. We give recipes, checked using the aforementioned mock catalogues, to handle these observational effects, after having introduced them into the mock catalogues so as to quantitatively mimic the most densely sampled galaxy catalogue of the nearby universe currently available. Once biases have been taken care of, the resulting error in reconstructed velocities is typically about a quarter of the overall velocity dispersion, without significant bias. We finally model our reconstruction errors to propose an improved Bayesian approach to measure Omega_m in an unbiased way by comparing the reconstructed velocities to the measured ones in distance space, even though they may be plagued by large errors. We show that, in the context of observational data, a nearly unbiased estimator of Omega_m may be built using MAK reconstruction. Comment: 29 pages, 21 figures, 6 tables. Accepted by MNRAS on 2007 October 2. Received 2007 September 30; in original form 2007 July 2.
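
    For readers unfamiliar with it, MAK (Monge-Ampere-Kantorovich) reconstruction amounts to solving an assignment problem between the observed particle positions and a uniform Lagrangian grid, minimising the total squared displacement. The toy 2D sketch below, in Python with SciPy's exact assignment solver, illustrates the idea only; it is not the authors' implementation, which relies on dedicated solvers to cope with catalogue-sized problems:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)

    # Toy setup: a small uniform Lagrangian grid and "observed" positions
    n = 8                                   # 2D n x n grid, for brevity
    grid = np.stack(np.meshgrid(np.arange(n), np.arange(n)), axis=-1)
    grid = grid.reshape(-1, 2).astype(float)
    positions = grid + 0.3 * rng.normal(size=grid.shape)   # displaced particles

    # MAK step: the permutation minimising the total squared displacement
    cost = cdist(positions, grid, metric="sqeuclidean")
    row, col = linear_sum_assignment(cost)  # exact Hungarian-type solver

    # Reconstructed displacements; in the Zel'dovich approximation the
    # peculiar velocity is proportional to this displacement
    displacement = positions[row] - grid[col]
    ```

    Observational biases of the kind studied in the paper enter through how the positions and mass weights are built from a real, incomplete galaxy catalogue.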

    Bayesian reconstruction of the cosmological large-scale structure: methodology, inverse algorithms and numerical optimization

    We address the inverse problem of cosmic large-scale structure reconstruction from a Bayesian perspective. For a linear data model, a number of known and novel reconstruction schemes, which differ in terms of the underlying signal prior, data likelihood, and numerical inverse extra-regularization schemes, are derived and classified. The Bayesian methodology presented in this paper seeks to unify and extend the following methods: Wiener filtering, Tikhonov regularization, Ridge regression, Maximum Entropy, and inverse regularization techniques. The inverse techniques considered here are asymptotic regularization, the Jacobi, Steepest Descent, Newton-Raphson and Landweber-Fridman schemes, and both linear and non-linear Krylov methods based on Fletcher-Reeves, Polak-Ribiere, and Hestenes-Stiefel Conjugate Gradients. The structures of the highest-performing up-to-date algorithms are presented, based on an operator scheme which permits one to exploit the power of fast Fourier transforms. Using such an implementation of the generalized Wiener filter in the novel ARGO software package, the different numerical schemes are benchmarked with 1-, 2-, and 3-dimensional problems including structured white and Poissonian noise, data windowing and blurring effects. A novel numerical Krylov scheme is shown to be superior in terms of performance and fidelity. These fast inverse methods will ultimately enable the application of sampling techniques to explore complex joint posterior distributions. We outline how the space of the dark-matter density field, the peculiar velocity field, and the power spectrum can be investigated jointly by a Gibbs-sampling process. Such a method can be applied to correct the redshift distortions of the observed galaxies and to perform time-reversal reconstructions of the initial density field. Comment: 40 pages, 11 figures.
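
    As a pointer to what the operator scheme looks like in practice, the sketch below solves a one-dimensional generalized Wiener filter, (S^-1 + N^-1) s = N^-1 d, with a Krylov (conjugate gradient) method, applying the inverse signal covariance diagonally in Fourier space via FFTs. The power spectrum, noise level and grid are toy assumptions, and the code is not the ARGO package:

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(1)
    n = 256                                  # 1D periodic grid, for brevity

    # Assumed toy signal power spectrum P(k), i.e. a diagonal signal
    # covariance in Fourier space, and uncorrelated pixel noise
    k = np.fft.fftfreq(n, d=1.0 / n)
    P = 1.0 / (1.0 + k**2)
    noise_var = 0.05

    # Toy "observed" data: a smooth feature plus white noise
    x = np.arange(n)
    data = np.exp(-0.5 * ((x - 80.0) / 10.0) ** 2)
    data += np.sqrt(noise_var) * rng.normal(size=n)

    # Wiener filter as a linear system: (S^-1 + N^-1) s = N^-1 d,
    # with S^-1 applied via forward/inverse FFTs
    def apply_lhs(s):
        Sinv_s = np.fft.ifft(np.fft.fft(s) / P).real
        return Sinv_s + s / noise_var

    A = LinearOperator((n, n), matvec=apply_lhs, dtype=float)
    s_wf, info = cg(A, data / noise_var)     # conjugate-gradient (Krylov) solve
    assert info == 0                         # 0 means the solver converged
    ```

    The different inverse schemes benchmarked in the paper are, in this picture, different iterative strategies for solving the same operator equation.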

    Towards a definitive symptom structure of obsessive-compulsive disorder: A factor and network analysis of 87 distinct symptoms in 1366 individuals

    Background: The symptoms of obsessive-compulsive disorder (OCD) are highly heterogeneous, and the optimal way to conceptualize this heterogeneity is unclear. This study aimed to establish a comprehensive symptom structure model of OCD across the lifespan using factor and network analytic techniques. Methods: A large multinational cohort of well-characterized children, adolescents, and adults diagnosed with OCD (N = 1366) participated in the study. All completed the Dimensional Yale-Brown Obsessive-Compulsive Scale, which contains an expanded checklist of 87 distinct OCD symptoms. Exploratory and confirmatory factor analyses were used to outline empirically supported symptom dimensions, and interconnections among the resulting dimensions were established using network analysis. Associations between dimensions and sociodemographic and clinical variables were explored using structural equation modeling (SEM). Results: Thirteen first-order symptom dimensions emerged that could be parsimoniously reduced to eight broad dimensions, which were valid across the lifespan: Disturbing Thoughts, Incompleteness, Contamination, Hoarding, Transformation, Body Focus, Superstition, and Loss/Separation. A general OCD factor could be included in the final factor model without a significant decline in model fit according to most fit indices. Network analysis showed that Incompleteness and Disturbing Thoughts were most central (i.e. had the most unique interconnections with other dimensions). SEM showed that the eight broad dimensions were differentially related to sociodemographic and clinical variables. Conclusions: Future research will need to establish whether this expanded hierarchical and multidimensional model can help improve our understanding of the etiology, neurobiology and treatment of OCD. © 2021 The Author(s). Published by Cambridge University Press.
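
    The analyses summarised above would have been carried out in dedicated factor-analytic, network and SEM software. Purely to illustrate what the exploratory factor-analysis step looks like on data of the same shape, here is a hypothetical Python sketch (random stand-in data; the choice of 13 factors merely echoes the first-order solution reported above):

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for the 1366 x 87 symptom-checklist matrix
    # (0/1 endorsement per symptom); the real data are not reproduced here.
    n_subjects, n_symptoms = 1366, 87
    X = rng.integers(0, 2, size=(n_subjects, n_symptoms)).astype(float)

    # Exploratory factor analysis with a fixed number of dimensions
    fa = FactorAnalysis(n_components=13, rotation="varimax", random_state=0)
    scores = fa.fit_transform(X)       # subject scores on each dimension
    loadings = fa.components_.T        # symptom-by-dimension loading matrix

    # Symptoms loading most strongly on the first extracted dimension
    top_symptoms = np.argsort(-np.abs(loadings[:, 0]))[:5]
    ```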

    Optimal ISW detection and joint likelihood for cosmological parameter estimation

    We analyse the local variance effect in the standard method for detecting the integrated Sachs-Wolfe (ISW) effect by cross-correlating the cosmic microwave background (CMB) with the large-scale structure (LSS). Local variance is defined as the systematic noise in the ISW detection that originates in the realisation of the matter distribution in the observed Universe. We show that the local variance contributes about 11 per cent to the total variance in the standard method, if a perfect and complete LSS survey up to z ~ 2 is assumed. Due to local variance, the estimated detection significance and cosmological parameter constraints in the standard method are biased. In this work, we present an optimal method for reducing the local variance effect in the ISW detection by working conditionally on the LSS data. The variance of the optimal method, and hence the signal-to-noise ratio, depends on the actual realisation of the matter distribution in the observed Universe. We show that for an ideal galaxy survey, the average signal-to-noise ratio is enhanced by about 7 per cent in the optimal method, as compared to the standard method. Furthermore, in the optimal method there is no need to estimate the covariance matrix by Monte Carlo simulations as in the standard method, which saves time and increases accuracy. Finally, we derive the correct joint likelihood function for cosmological parameters given CMB and LSS data within the linear LSS formation regime, which includes a small coupling of the two datasets due to the ISW effect. Comment: Proof added to appendix. Minor changes to text and formulae. Results and figures unchanged. Matches published MNRAS version.
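
    The quantity at the centre of both the standard and the optimal method is the CMB-LSS cross-correlation. The sketch below measures the standard cross angular power spectrum on toy maps, assuming the healpy library; the input spectra and maps are mock stand-ins, and the conditional, LSS-dependent variance that defines the optimal method is not implemented here:

    ```python
    import numpy as np
    import healpy as hp

    nside, lmax = 64, 128

    # Toy flat input spectra standing in for the CMB temperature and the
    # projected galaxy-density auto-spectra (arbitrary units)
    cl_toy = 1e-3 * np.ones(lmax + 1)

    # In a real analysis these would be the observed CMB map and the LSS
    # galaxy overdensity map; here they are independent Gaussian mocks.
    cmb_map = hp.synfast(cl_toy, nside, lmax=lmax)
    gal_map = hp.synfast(cl_toy, nside, lmax=lmax)

    # Measured cross angular power spectrum C_ell^{Tg} between CMB and LSS
    cl_cross = hp.anafast(cmb_map, map2=gal_map, lmax=lmax)
    ell = np.arange(cl_cross.size)
    ```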

    Bayesian analysis of cosmic structures

    We review the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications that arise due to the non-Gaussian character of the galaxy and matter distribution. In particular, we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and working on scales of about 4 h^{-1} Mpc, we find that over-dense regions are reconstructed excellently; however, under-dense regions (void statistics) are recovered poorly in quantitative terms. In contrast to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in under-dense regions, we obtain lower densities than in N-body simulations. This is because the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime on scales below ~ 10 h^{-1} Mpc, for which higher-order correlations would be required to describe the matter statistics. However, we confirm, as was recently shown in the context of Ly-alpha forest tomography, that the Poisson-lognormal model provides the correct two-point statistics (or power spectrum). Comment: 11 pages, 1 figure; report for the Astrostatistics and Data Mining workshop, La Palma, Spain, 30 May - 3 June 2011; to appear in Springer Series on Astrostatistics.
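
    As a reading aid for the Poisson-lognormal data model discussed above (not the authors' Hamiltonian sampler), the sketch below runs the model forward: a Gaussian field is generated, exponentiated into a lognormal density, and Poisson-sampled into galaxy counts. The power spectrum, grid size and mean galaxy density are toy assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 256                                  # 1D grid, for brevity

    # Gaussian field delta_g with a toy power-law spectrum (arbitrary norm)
    k = np.fft.rfftfreq(n, d=1.0)
    pk = np.zeros_like(k)
    pk[1:] = k[1:] ** -1.5
    modes = (rng.normal(size=k.size) + 1j * rng.normal(size=k.size)) * np.sqrt(pk)
    delta_g = np.fft.irfft(modes, n)
    delta_g *= 0.5 / delta_g.std()           # fix the field's variance (toy choice)

    # Lognormal transform: a positive density field with unit mean
    rho = np.exp(delta_g - 0.5 * delta_g.var())

    # Poisson sampling: galaxy counts per cell for an assumed mean density
    nbar = 5.0                               # mean galaxies per cell (assumed)
    counts = rng.poisson(nbar * rho)
    ```

    Inference then inverts this chain: given the counts, the posterior over the underlying density field is explored, which is where the Hamiltonian sampling enters.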