
    Breaking the Degeneracy: Optimal Use of Three-point Weak Lensing Statistics

    We study the optimal use of third-order statistics in the analysis of weak lensing by large-scale structure. These higher-order statistics have long been advocated as a powerful tool to break measured degeneracies between cosmological parameters. Using ray-tracing simulations that incorporate important survey features, such as a realistic depth-dependent redshift distribution, we find that a joint two- and three-point correlation function analysis is a much stronger probe of cosmology than the skewness statistic. We compare different observing strategies, showing that for a limited survey time there is an optimal depth for the measurement of third-order statistics, which balances statistical noise and cosmic variance against signal amplitude. We find that the chosen CFHTLS observing strategy was optimal and forecast that a joint two- and three-point analysis of the completed CFHTLS-Wide will constrain the amplitude of the matter power spectrum σ_8 to 10% and the matter density parameter Ω_m to 17%, a factor of ~2.5 improvement over the two-point analysis alone. Our error analysis includes all non-Gaussian terms; we find that the coupling between cosmic variance and shot noise is a non-negligible contribution that should be included in any future analytical error calculations.

    Comment: 27 pages, 13 figures, 3 tables
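    As a rough illustration of the two kinds of statistic being compared, the sketch below estimates the skewness statistic and an azimuthally averaged two-point correlation function from a convergence map. This is a minimal sketch assuming a square map on a regular grid; the function names, binning and Gaussian toy field are illustrative, not the paper's ray-tracing pipeline.

        import numpy as np

        def skewness_s3(kappa):
            # Skewness statistic S3 = <kappa^3> / <kappa^2>^2 of a convergence map.
            k = kappa - kappa.mean()
            return np.mean(k**3) / np.mean(k**2)**2

        def two_point_correlation(kappa, nbins=20):
            # Azimuthally averaged two-point correlation via the correlation theorem.
            k = kappa - kappa.mean()
            power = np.abs(np.fft.fft2(k))**2
            xi2d = np.fft.fftshift(np.fft.ifft2(power).real) / k.size
            ny, nx = k.shape
            y, x = np.indices((ny, nx))
            r = np.hypot(x - nx // 2, y - ny // 2)
            edges = np.linspace(0.0, r.max() / 2, nbins + 1)
            idx = np.digitize(r.ravel(), edges) - 1
            xi = np.array([xi2d.ravel()[idx == i].mean() for i in range(nbins)])
            return 0.5 * (edges[1:] + edges[:-1]), xi

        # Toy usage on a Gaussian random field: its skewness is ~0, which is
        # exactly why third-order statistics probe non-Gaussian structure.
        rng = np.random.default_rng(0)
        kappa = rng.normal(size=(256, 256))
        print(skewness_s3(kappa))
        theta_pix, xi = two_point_correlation(kappa)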

    Some Good Reasons to Use Matched Filters for the Detection of Point Sources in CMB Maps

    In this draft we comment on results that recently appeared in the literature concerning the performance of matched filters, scale-adaptive filters and the Mexican hat wavelet in the context of point source detection in Cosmic Microwave Background maps. In particular, we show that, contrary to what has been claimed, the matched filter still appears to be the most reliable and efficient method to disentangle point sources from the backgrounds, even when using a detection criterion that, unlike the classic nσ thresholding rule, takes into account not only the height of the peaks in the signal corresponding to the candidate sources but also their curvature.

    Comment: Replacement after submission to A&A and referee's comments. Astronomy and Astrophysics, in press, JNL/2003/473
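    As an illustration of the technique being defended, the sketch below applies a Fourier-space matched filter, MF(k) = tau(k)/P(k), followed by a simple nσ thresholding rule. The Gaussian beam profile, white background spectrum and all names are illustrative assumptions, and the peak-curvature criterion discussed in the draft is not implemented here.

        import numpy as np

        def matched_filter(map2d, fwhm_pix, background_power):
            # Filter with MF(k) = tau(k) / P(k), normalized so that the peak
            # amplitude in the filtered map estimates the source flux.
            ny, nx = map2d.shape
            ky = np.fft.fftfreq(ny)[:, None]
            kx = np.fft.fftfreq(nx)[None, :]
            k = np.hypot(kx, ky)
            sigma = fwhm_pix / (2.0 * np.sqrt(2.0 * np.log(2.0)))
            tau = np.exp(-2.0 * (np.pi * sigma * k) ** 2)  # Fourier beam profile
            mf = tau / background_power(k)
            mf /= (tau * mf).sum() / map2d.size            # unit-response normalization
            return np.fft.ifft2(np.fft.fft2(map2d) * mf).real

        # Toy usage: white background, one injected unresolved source, and a
        # classic n-sigma thresholding detection on the filtered map.
        rng = np.random.default_rng(1)
        sky = rng.normal(size=(256, 256))
        sky[128, 128] += 10.0
        filtered = matched_filter(sky, 3.0, lambda k: np.ones_like(k))
        detections = np.argwhere(filtered > 5.0 * filtered.std())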

    The Use of Pivot Point in Ship Handling for Safer and More Accurate Ship Manoeuvring

    The size of ships has increased notably over recent decades, while the size of harbours and ports has not grown in proportion. As a result, ship manoeuvring in harbours and ports has become more problematic, and more of an art than a science. The ‘pivot point’ concept can be useful in analysing slow ship manoeuvring and has therefore been widely adopted by practitioners and training institutions, many of whom now routinely plan confined manoeuvring around it. Traditionally, however, the ‘pivot point’ has been defined in a number of contradictory and inaccurate ways, leading to confusion and mystification, so many practitioners and trainers rely on intuition to bridge the gap between reality and a flawed understanding of the theory. In this presentation the theoretical aspects of the pivot point are reviewed, and correct definitions are put forward and applied to basic and ‘special’ ship manoeuvres.
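    For reference, the kinematic definition reviewed here fits in a few lines: the pivot point is the point on the centreline where the transverse velocity vanishes, so with sway velocity v at midships and yaw rate r, the lateral speed a distance x forward of midships is v + x·r, giving x_p = -v/r. A minimal sketch follows, with variable names and figures that are purely illustrative:

        def pivot_point(sway_velocity, yaw_rate):
            # Distance of the pivot point forward of midships (metres):
            # the transverse speed v + x*r vanishes at x_p = -v / r.
            if yaw_rate == 0.0:
                raise ValueError("no rotation: the pivot point is undefined")
            return -sway_velocity / yaw_rate

        # A ship turning to starboard (r > 0) while slipping to port (v < 0)
        # has its pivot point well forward of midships, as handlers expect.
        print(pivot_point(sway_velocity=-0.6, yaw_rate=0.01))  # 60 m forward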

    Relations between three-point configuration space shear and convergence statistics

    With the growing interest in using weak lensing studies to probe the non-Gaussian properties of the matter density field, there is an increasing need for the study of suitable statistical measures, e.g. shear three-point statistics. In this paper we establish the relations between three-point configuration-space shear and convergence statistics, which are an important missing link between different weak lensing three-point statistics and provide an alternative way of relating observation and theory. The method we use also allows us to derive the relations between other two- and three-point correlation functions. We show that the relations obtained are consistent with already established results and demonstrate how they can be evaluated numerically. As a direct application, we use these relations to formulate the condition for E/B-mode decomposition of lensing three-point statistics, which is the basis for constructing new three-point statistics that allow for exact E/B-mode separation. Our work also applies to other two-dimensional polarization fields, such as that of the Cosmic Microwave Background.

    Comment: 17 pages, 5 figures, submitted to A&A
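    The paper works in configuration space, but the E/B split it builds on is easiest to state in Fourier space on the flat sky: the shear is a rotation of the convergence by the phase e^{2iφ_k}, so undoing the rotation separates E- and B-modes. A minimal sketch under those assumptions, with grid and conventions that are illustrative rather than the paper's:

        import numpy as np

        def eb_decompose(gamma1, gamma2):
            # Return (kappa_E, kappa_B); lensing produces E-modes only, so
            # significant B-mode power flags systematics.
            ny, nx = gamma1.shape
            k1 = np.fft.fftfreq(nx)[None, :]
            k2 = np.fft.fftfreq(ny)[:, None]
            ksq = k1**2 + k2**2
            ksq[0, 0] = 1.0                                # avoid 0/0 at k = 0
            rot = ((k1**2 - k2**2) - 2j * k1 * k2) / ksq   # e^{-2i phi_k}
            kappa = np.fft.ifft2(rot * np.fft.fft2(gamma1 + 1j * gamma2))
            return kappa.real, kappa.imag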

    Substitution Delone Sets with Pure Point Spectrum are Inter Model Sets

    The paper establishes an equivalence between pure point diffraction and certain types of model sets, called inter model sets, in the context of substitution point sets and substitution tilings. The key ingredients are a new type of coincidence condition in substitution point sets, which we call algebraic coincidence, and the use of a recent characterization of model sets through dynamical systems associated with the point sets or tilings.

    Comment: 29 pages; revised version with updates
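    As a concrete toy instance of the objects studied here, the sketch below generates the Fibonacci chain, the standard example of a substitution point set with pure point diffraction that is also a model set; it is only an illustration, not the paper's coincidence-condition machinery.

        PHI = (1 + 5 ** 0.5) / 2

        def fibonacci_word(n):
            # Apply the substitution a -> ab, b -> a, n times, starting from "a".
            word = "a"
            for _ in range(n):
                word = "".join("ab" if c == "a" else "a" for c in word)
            return word

        def point_set(word):
            # Left endpoints of the tiles: "a" has length PHI, "b" has length 1.
            points, x = [], 0.0
            for c in word:
                points.append(x)
                x += PHI if c == "a" else 1.0
            return points

        print(point_set(fibonacci_word(5))[:8])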

    Use Case Point Approach Based Software Effort Estimation using Various Support Vector Regression Kernel Methods

    Software effort estimation is a critical task in the early stages of the software development life cycle, when the details of the requirements are usually not clearly identified. Various optimization techniques help in improving the accuracy of effort estimation. Support Vector Regression (SVR) is one of several soft-computing techniques that help in obtaining optimal estimated values. The idea of SVR is based upon the computation of a linear regression function in a high-dimensional feature space into which the input data are mapped via a nonlinear function. Furthermore, SVR kernel methods can be applied to transform the input data, and based on these transformations an optimal boundary between the possible outputs can be obtained. The main objective of the work carried out in this paper is to estimate software effort using the use case point approach, which relies on the use case diagram to estimate the size and effort of software projects. An attempt is then made to optimize the results obtained from use case point analysis using various SVR kernel methods to achieve better prediction accuracy.

    Comment: 13 pages, 6 figures, 11 tables, International Journal of Information Processing (IJIP)
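    A minimal sketch of the two stages described above, assuming the standard use case weights (5/10/15 for simple/average/complex use cases) and scikit-learn's SVR; the training data, feature choice and kernel settings are toy assumptions, not the paper's dataset or tuning.

        import numpy as np
        from sklearn.svm import SVR

        def use_case_points(simple, average, complex_, actor_weight, tcf, ecf):
            # UCP = (UUCW + UAW) * TCF * ECF with the usual use case weights.
            uucw = 5 * simple + 10 * average + 15 * complex_
            return (uucw + actor_weight) * tcf * ecf

        # Toy projects: UCP size vs. actual effort in person-hours.
        X = np.array([[use_case_points(3, 4, 2, 12, 0.95, 1.02)],
                      [use_case_points(5, 2, 1, 9, 1.00, 0.98)],
                      [use_case_points(8, 6, 4, 15, 1.05, 1.10)],
                      [use_case_points(2, 3, 0, 7, 0.90, 0.95)]])
        y = np.array([2400.0, 1500.0, 5200.0, 1100.0])

        # Compare several SVR kernel methods, as the paper does.
        for kernel in ("linear", "rbf", "poly", "sigmoid"):
            model = SVR(kernel=kernel, C=100.0).fit(X, y)
            print(kernel, model.predict(X[:1]))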