6,642 research outputs found

    Statistical models with covariance constraints

    Get PDF
    Imperial Users only

    Feasibility Study of Enhanced Arboricultural Education at the Arboretum

    Get PDF

    INFLUENCE OF MIST ON FLORAL INITIATION OF PHARBITIS NIL

    Get PDF

    Why are two mistakes not worse than one? A proposal for controlling the expected number of false claims

    Get PDF
    Multiplicity is common in clinical studies, and the current standard is to use the familywise error rate to ensure that errors are kept at a prespecified level. In this paper, we show that, in certain situations, familywise error rate control does not account for all errors made. To counteract this problem, we propose the use of the expected number of false claims (EFC). We show that a (weighted) Bonferroni approach can be used to control the EFC, discuss how a study that uses the EFC can be powered for co-primary, exchangeable, and hierarchical endpoints, and show how the weight for the weighted Bonferroni test can be determined in this manner.
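    The weighted Bonferroni control of the EFC described in the abstract can be sketched numerically: under any null configuration, linearity of expectation bounds the expected number of false claims by the sum of the per-test levels. The weights and trial count below are illustrative choices, not values from the paper.

```python
import random

def weighted_bonferroni_reject(pvals, weights, alpha=0.05):
    # Claim (reject) H_i when p_i <= alpha * w_i, with weights summing to 1.
    return [p <= alpha * w for p, w in zip(pvals, weights)]

# Under the global null, each p-value is Uniform(0, 1), so by linearity
# E[# false claims] = sum_i alpha * w_i = alpha, whatever the dependence.
random.seed(0)
alpha = 0.05
weights = [0.4, 0.3, 0.2, 0.1]      # hypothetical unequal weights
trials = 200_000
false_claims = 0
for _ in range(trials):
    pvals = [random.random() for _ in range(len(weights))]
    false_claims += sum(weighted_bonferroni_reject(pvals, weights, alpha))
```

    Averaging the count over the trials gives an EFC close to alpha = 0.05, whereas testing each endpoint unweighted at level alpha would inflate the EFC m-fold.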

    Stellar Photometry and Astrometry with Discrete Point Spread Functions

    Full text link
    The key features of the MATPHOT algorithm for precise and accurate stellar photometry and astrometry using discrete Point Spread Functions are described. A discrete Point Spread Function (PSF) is a sampled version of a continuous PSF that describes the two-dimensional probability distribution of photons from a point source (star) just above the detector. The shape information about the photon scattering pattern of a discrete PSF is typically encoded using a numerical table (matrix) or a FITS image file. Discrete PSFs are shifted within an observational model using a 21-pixel-wide damped sinc function, and position partial derivatives are computed using a five-point numerical differentiation formula. Precise and accurate stellar photometry and astrometry are achieved with undersampled CCD observations by using supersampled discrete PSFs that are sampled 2, 3, or more times more finely than the observational data. The precision and accuracy of the MATPHOT algorithm are demonstrated by using the C-language MPD code to analyze simulated CCD stellar observations; measured performance is compared with a theoretical performance model. Detailed analysis of simulated Next Generation Space Telescope observations demonstrates that millipixel relative astrometry and millimag photometric precision are achievable with complicated space-based discrete PSFs. For further information about MATPHOT and MPD, including source code and documentation, see http://www.noao.edu/staff/mighell/matphot (Comment: 19 pages, 22 figures, accepted for publication in MNRAS.)
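    The fractional-pixel shifting step can be illustrated in one dimension. The Gaussian damping window and its scale below are sketch-level assumptions; MATPHOT's exact damped sinc kernel is defined in the paper.

```python
import numpy as np

def damped_sinc_kernel(frac_shift, width=21, damp=3.25):
    # 21-tap sinc interpolation kernel, damped here by a Gaussian window.
    # The damping scale (damp) is an assumed value for this sketch.
    half = width // 2
    x = np.arange(-half, half + 1) - frac_shift
    return np.sinc(x) * np.exp(-(x / damp) ** 2)

def shift_1d(signal, frac_shift):
    # Shift a 1-D profile by a fractional pixel amount via convolution.
    k = damped_sinc_kernel(frac_shift)
    k /= k.sum()                      # preserve total flux
    return np.convolve(signal, k, mode="same")
```

    Shifting a delta function by 0.5 pixels moves its centroid by 0.5 pixels while conserving its summed flux, which is the property needed for photometry-preserving PSF placement.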

    Graphics for uncertainty

    Get PDF
    Graphical methods such as colour shading and animation, which are widely available, can be very effective in communicating uncertainty. In particular, the idea of a ‘density strip’ provides a conceptually simple representation of a distribution; this is explored in a variety of settings, including comparisons of means, regression, and models for contingency tables. Animation is also a very useful device for conveying uncertainty, particularly in the context of flexible models expressed as curves and surfaces whose structure is of particular interest. Animation can further provide a helpful mechanism for exploring data in several dimensions, illustrated here in the simple but very important setting of spatiotemporal data.

    Wavelet entropy and fractional Brownian motion time series

    Full text link
    We study the functional link between the Hurst parameter and the Normalized Total Wavelet Entropy when analyzing synthetically generated fractional Brownian motion (fBm) time series. Both quantifiers are mainly used to identify fractional Brownian motion processes (Fractals 12 (2004) 223). The aim of this work is to understand the differences, if any, in the information obtained from them. (Comment: 10 pages, 2 figures, submitted to Physica A for publication.)
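    The Normalized Total Wavelet Entropy can be sketched with a plain Haar decomposition: the relative detail energy per level defines a probability distribution whose Shannon entropy, normalized by its maximum, lies in [0, 1]. The Haar basis and six-level depth here are illustrative assumptions, not necessarily the paper's choices.

```python
import numpy as np

def haar_detail_energies(x, levels):
    # Energy of the Haar detail coefficients at each decomposition level.
    a = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a = a[: (len(a) // 2) * 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)   # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)   # approximation
        energies.append(float(np.sum(d ** 2)))
    return np.array(energies)

def normalized_wavelet_entropy(x, levels=6):
    e = haar_detail_energies(x, levels)
    p = e / e.sum()                   # relative wavelet energy per level
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(levels))
```

    A signal whose energy sits in a single level (for example a pure Nyquist-frequency oscillation) gives entropy 0, while broadband noise spreads energy across levels and pushes the entropy toward 1.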

    Statistically Stable Estimates of Variance in Radioastronomical Observations as Tools for RFI Mitigation

    Full text link
    A selection of statistically stable (robust) algorithms for calculating data variance has been made, and their properties have been analyzed via computer simulation. These algorithms would be useful in radio astronomy observations made in the presence of strong sporadic radio frequency interference (RFI). Several observational results are presented to demonstrate the effectiveness of these algorithms in RFI mitigation.
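    A standard example of such a robust estimator is the median-absolute-deviation (MAD) scale estimate, which the paper's specific algorithm selection may or may not include; the sketch below shows why a robust estimate survives impulsive RFI that ruins the ordinary sample variance.

```python
import numpy as np

def mad_std(x):
    # Robust standard deviation via the median absolute deviation;
    # the factor 1.4826 makes it consistent for Gaussian data.
    med = np.median(x)
    return float(1.4826 * np.median(np.abs(x - med)))

rng = np.random.default_rng(42)
clean = rng.normal(0.0, 1.0, 10_000)     # noise-like radiometer samples
rfi = clean.copy()
rfi[:100] += 50.0                        # 1% strong impulsive interference
```

    On the contaminated data the sample standard deviation jumps to roughly five times the true value, while mad_std stays near 1, so a detection threshold built on it remains usable during interference.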

    A scheduling theory framework for GPU tasks efficient execution

    Get PDF
    Concurrent execution of tasks on GPUs can reduce the computation time of a workload by overlapping data transfer and execution commands. However, it is difficult to implement an efficient run-time scheduler that minimizes the workload makespan, as many execution orderings must be evaluated. In this paper, we employ scheduling theory to build a model that takes into account the device capabilities, workload characteristics, constraints, and objective functions. In our model, GPU task scheduling is reformulated as a flow shop scheduling problem, which allows us to apply and compare well-known methods already developed in the operations research field. In addition, we develop a new heuristic, specifically focused on executing GPU commands, that achieves better scheduling results than previous techniques. Finally, a comprehensive evaluation, showing the suitability and robustness of this new approach, is conducted on three different NVIDIA architectures (Kepler, Maxwell, and Pascal). (Project TIN2016-0920R, Universidad de Málaga (Campus de Excelencia Internacional Andalucía Tech), and the NVIDIA Corporation donation program.)
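    As a toy version of the flow shop reformulation, a GPU task can be reduced to two serialized stages, a host-to-device copy and a kernel execution, which is exactly the classic two-machine flow shop (F2 || Cmax) solved optimally by Johnson's rule. The two-stage model and the job times below are illustrative simplifications, not the paper's full command model or its heuristic.

```python
def makespan(order, jobs):
    # jobs[i] = (copy time, execution time); copies share one DMA engine,
    # kernels share one compute engine, and a kernel needs its copy done.
    t_copy = t_exec = 0.0
    for j in order:
        copy, exec_ = jobs[j]
        t_copy += copy
        t_exec = max(t_exec, t_copy) + exec_
    return t_exec

def johnson_order(jobs):
    # Johnson's rule: minimizes makespan for the 2-machine flow shop.
    front = sorted((i for i, (a, b) in enumerate(jobs) if a <= b),
                   key=lambda i: jobs[i][0])
    back = sorted((i for i, (a, b) in enumerate(jobs) if a > b),
                  key=lambda i: jobs[i][1], reverse=True)
    return front + back

# Hypothetical (copy, execution) times in milliseconds.
jobs = [(3.0, 6.0), (5.0, 2.0), (1.0, 2.0), (6.0, 6.0)]
```

    For these jobs, Johnson's order [2, 0, 3, 1] finishes in 18 ms versus 21 ms for plain submission order, showing how reordering alone shortens the makespan that a run-time scheduler targets.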