8 research outputs found

    A Simulated Annealing Strategy for Cluster Detection

    No full text
    We discuss and implement a new strategy for spatial cluster detection. A test statistic based on the likelihood ratio is used, as formulated by Kulldorff and Nagarwalla. Unlike these authors, our test is not restricted to the detection of clusters with a fixed shape, such as rectangles or circles, but looks for connected clusters with arbitrary geometry. This can be advantageous in real situations, where spatial clusters frequently form along rivers or transport routes, for example. A new adaptive simulated annealing technique is developed, focused on finding local maxima of a likelihood function over the space of connected subgraphs of the graph associated with the map of populations and geo-referenced cases. The algorithm has applications to the study of disease clusters and hot spots of criminality. We present a case study of homicides in the city of Belo Horizonte, Brazil.
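    The scan-statistic idea can be sketched as follows: score any connected candidate zone with the Kulldorff-Nagarwalla log-likelihood ratio and explore the space of connected subgraphs with an annealing-style search. The sketch below is a minimal illustration under assumed inputs (a region adjacency dictionary plus case and population counts per region) and a plain geometric cooling schedule; it is not the adaptive annealing scheme developed in the paper.

```python
import math
import random

def kulldorff_llr(cases_in, pop_in, total_cases, total_pop):
    """Kulldorff-Nagarwalla log-likelihood ratio for a candidate zone."""
    expected = total_cases * pop_in / total_pop
    if expected == 0 or cases_in <= expected:
        return 0.0
    outside = total_cases - cases_in
    inside_term = cases_in * math.log(cases_in / expected)
    outside_term = (outside * math.log(outside / (total_cases - expected))
                    if outside > 0 else 0.0)
    return inside_term + outside_term

def is_connected(zone, adjacency):
    """Check that the regions in `zone` induce a connected subgraph."""
    zone = set(zone)
    start = next(iter(zone))
    seen, stack = {start}, [start]
    while stack:
        for nb in adjacency[stack.pop()]:
            if nb in zone and nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return seen == zone

def anneal_cluster(cases, pop, adjacency, n_iter=20000, t0=1.0, cooling=0.9995):
    """Search for a high-LLR connected zone by simulated annealing (illustrative)."""
    total_cases, total_pop = sum(cases.values()), sum(pop.values())
    zone = {random.choice(list(adjacency))}            # start from a random region
    llr = kulldorff_llr(sum(cases[r] for r in zone),
                        sum(pop[r] for r in zone), total_cases, total_pop)
    best_zone, best_llr = set(zone), llr
    temp = t0
    for _ in range(n_iter):
        # Propose adding a neighbouring region, or dropping one while keeping connectivity.
        boundary = {nb for r in zone for nb in adjacency[r]} - zone
        if boundary and (len(zone) == 1 or random.random() < 0.5):
            candidate = zone | {random.choice(sorted(boundary))}
        else:
            candidate = zone - {random.choice(sorted(zone))}
            if not candidate or not is_connected(candidate, adjacency):
                continue
        new_llr = kulldorff_llr(sum(cases[r] for r in candidate),
                                sum(pop[r] for r in candidate), total_cases, total_pop)
        if new_llr >= llr or random.random() < math.exp((new_llr - llr) / temp):
            zone, llr = candidate, new_llr
            if llr > best_llr:
                best_zone, best_llr = set(zone), llr
        temp *= cooling
    return best_zone, best_llr
```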

    Asymptotics for a Weighted Least Squares Estimator of the Disease Onset Distribution Function for . . .

    No full text
    In carcinogenicity experiments with animals where the tumour is not palpable, it is common to observe only the time of death of the animal, the cause of death (the tumour or another independent cause, such as sacrifice) and whether the tumour was present at the time of death; the last two indicator variables are determined at autopsy. A weighted least squares estimator for the distribution function of the disease onset was proposed by van der Laan et al. (1997). Some asymptotic properties of their estimator are established. A minimax lower bound for the estimation of the disease onset distribution is obtained, as well as the local asymptotic distribution of their estimator.
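    As a minimal illustration of the observation scheme described above (only the death time, the cause of death and the tumour-presence indicator are recorded, while the onset time stays latent), one might simulate data as follows; all rates, the terminal-sacrifice time and the function name are assumptions made for the sketch, not part of the paper.

```python
import random

def simulate_animal(onset_rate=0.02, lethality_rate=0.05,
                    other_rate=0.01, sacrifice_time=104.0):
    """One animal under the observation scheme described above: only the death
    time, the cause of death and tumour presence at death are recorded; the
    onset time itself is never observed.  All rates (per week) and the
    terminal-sacrifice time are illustrative assumptions."""
    onset = random.expovariate(onset_rate)               # latent tumour onset time
    tumour_death = onset + random.expovariate(lethality_rate)
    other_death = random.expovariate(other_rate)          # independent cause of death
    death_time = min(tumour_death, other_death, sacrifice_time)
    if death_time == tumour_death:
        cause = "tumour"
    elif death_time == other_death:
        cause = "other"
    else:
        cause = "sacrifice"
    return {"death_time": death_time,
            "cause_of_death": cause,
            "tumour_present": onset <= death_time}         # found at autopsy

data = [simulate_animal() for _ in range(50)]
```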

    Universidade Federal de Minas Gerais

    No full text
    This paper addresses the problem of optimally inserting idle time into a single-machine schedule when the job sequence is fixed and the cost of each job is a convex function of its completion time. We propose a pseudo-polynomial time algorithm that finds a solution within some tolerance of optimality in the solution space. The proposed algorithm generalises several previous works on the subject.
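    A generic way to make the problem concrete is a grid-based dynamic program over candidate completion times: for a fixed sequence, the minimum cost of the first j jobs with job j completing at time t is that job's own cost plus the best cost of the prefix finishing no later than t minus the job's processing time. The sketch below is pseudo-polynomial in the time horizon and accurate only up to the grid step; it illustrates the problem setting, not the paper's algorithm, and all names and parameters are assumptions.

```python
import math

def insert_idle_time(proc_times, cost_fns, horizon, step=1.0):
    """Grid-based DP that inserts idle time into a fixed single-machine sequence
    to minimise a sum of convex completion-time costs.  Assumes `horizon` is
    large enough to fit all jobs; accuracy is limited by `step`."""
    n = len(proc_times)
    grid = [i * step for i in range(int(horizon / step) + 1)]
    m = len(grid)
    INF = float("inf")

    # prefix_min[k]: best cost of the jobs handled so far, finishing at or
    # before grid[k]; prefix_arg[k]: grid index attaining that minimum.
    prefix_min, prefix_arg = [0.0] * m, [0] * m
    back = []                                   # back-pointers per job

    for j, p in enumerate(proc_times):
        cost, arg = [INF] * m, [0] * m
        for k, t in enumerate(grid):
            kp = int(math.floor((t - p) / step + 1e-9))   # previous job must finish by t - p
            if kp < 0 or prefix_min[kp] == INF:
                continue
            cost[k] = cost_fns[j](t) + prefix_min[kp]
            arg[k] = prefix_arg[kp]             # best completion index of job j-1
        back.append(arg)

        prefix_min, prefix_arg, best, best_k = [], [], INF, 0
        for k, v in enumerate(cost):
            if v < best:
                best, best_k = v, k
            prefix_min.append(best)
            prefix_arg.append(best_k)

    # Recover completion times by walking the back-pointers from the optimum.
    k = prefix_arg[-1]
    completions = [0.0] * n
    for j in reversed(range(n)):
        completions[j] = grid[k]
        k = back[j][k]
    return prefix_min[-1], completions
```

    For instance, `cost_fns` could hold quadratic earliness/tardiness penalties such as `lambda t, d=d: (t - d) ** 2` built from per-job due dates `d`, which are convex in the completion time.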

    Universidade Federal de Minas Gerais

    No full text
    This paper concentrates on the estimation of the optimal bandwidth h_opt. Before addressing this problem, we recall some properties of F_n; more details can be found in Nadaraya (1964) and Bessegato (2002). The expectation and the variance of F_n(x) are given by E[F_n(x)] = F(x) + h^2 C_2 + o(h^2) and Var[F_n(x)] = (1/n){F(x)[1 - F(x)] - h C_1} + o(h/n), respectively, where C_1 = 2 F'(x) ∫ z W(z) w(z) dz and C_2 = (1/2) F''(x) ∫ z^2 w(z) dz. We note that the estimator is biased and that the bias does not depend directly on the sample size n, but on the bandwidth h.
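    For concreteness, the estimator in question is the smoothed empirical distribution function F_n(x) = n^{-1} Σ_i W((x - X_i)/h), with W the integral of the kernel w; minimising the leading mean-squared-error terms h^4 C_2^2 - (h/n) C_1 suggests h_opt = (C_1 / (4 C_2^2 n))^{1/3}. Below is a small sketch assuming a Gaussian kernel (so W is the standard normal CDF); the kernel choice is an assumption, and in practice C_1 and C_2 involve the unknown F'(x) and F''(x) and must themselves be estimated.

```python
import math

def F_n(x, data, h):
    """Kernel distribution function estimator F_n(x) = n^{-1} sum_i W((x - X_i)/h),
    with W the integral of the kernel w; here w is the standard normal density,
    so W is the normal CDF (an illustrative choice)."""
    W = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sum(W((x - xi) / h) for xi in data) / len(data)

def h_opt(c1, c2, n):
    """Bandwidth minimising the leading MSE terms h^4 C2^2 - (h/n) C1, i.e.
    h_opt = (C1 / (4 C2^2 n))^(1/3); c1 and c2 are plug-in estimates of the
    constants C_1 and C_2 defined in the abstract above."""
    return (c1 / (4.0 * c2 ** 2 * n)) ** (1.0 / 3.0)
```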

    Universidade Federal de Minas Gerais

    No full text
    Sensory evaluations to determine the shelf life of food products are routinely conducted in food experimentation. In such experiments, trained panelists are asked to judge food attributes by reference to a scale of numbers (scores varying from 0 to 6, for example). The "failure time" associated with a product unit under test is usually defined as the time required to reach a cut-off point previously defined by the food company. Important issues associated with the planning and execution of this kind of testing include the total sample size, the frequency of sample withdrawals, the panel design, and the statistical analysis of the panel data, to list a few. Different approaches have been proposed for the analysis of this kind of data. In particular, Freitas, Borges and Ho (2001) proposed an alternative model based on a dichotomization of the score data with a Weibull as the underlying distribution for the time to failure. The model was applied to a real situation. The authors also evaluated, through a simulation study, the bias and mean squared error of the estimates obtained for percentiles and fractions defective. That simulation study used only the same sampling plan implemented in the real situation. In this paper we focus on the planning issues associated with these experiments. Sampling plans are contrasted and compared in a simulation study using the approach proposed by Freitas, Borges and Ho (2001).
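    As an illustration of how such sampling plans can be contrasted by simulation, the sketch below generates dichotomized shelf-life data for one plan: units withdrawn at scheduled times are classified as failed when their (Weibull-distributed) failure time has been reached, and the observed fraction failed can then be compared with the true distribution. The Weibull parameters, the withdrawal schedule and the function names are illustrative assumptions, not the design of Freitas, Borges and Ho (2001).

```python
import math
import random

def simulate_plan(withdrawal_times, units_per_time, shape=2.0, scale=30.0):
    """Simulate one run of a shelf-life sampling plan: at each scheduled
    withdrawal time a fresh group of units is scored and dichotomized as
    failed (true failure time <= withdrawal time) or not.  The Weibull
    shape/scale and the plan itself are illustrative assumptions."""
    results = []
    for t in withdrawal_times:
        failures = 0
        for _ in range(units_per_time):
            # Inverse-CDF draw from a Weibull(shape, scale) failure time.
            failure_time = scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)
            failures += failure_time <= t
        results.append((t, failures, units_per_time))
    return results

def estimated_fraction_defective(results):
    """Observed proportion failed at each withdrawal time, to be compared with
    the true Weibull CDF when contrasting sampling plans."""
    return [(t, failed / n) for t, failed, n in results]

# Example: withdrawals every 10 days, 8 units per withdrawal (illustrative).
plan = simulate_plan(withdrawal_times=[10, 20, 30, 40, 50], units_per_time=8)
print(estimated_fraction_defective(plan))
```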

    DESIGN OF ECONOMICALLY OPTIMAL ZERO-DEFECT ACCEPTANCE SAMPLING WITH RECTIFICATION WHEN DIAGNOSIS ERRORS ARE

    No full text
    In this paper we present the optimal sample size for zero-defect acceptance sampling with rectification under diagnosis errors. The development is based on an economic model. The procedures are implemented in a Matlab program and illustrated with an example.
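    To make the idea concrete, a zero-defect plan accepts a lot only when none of the n sampled items is classified as defective, where classification is subject to a false-positive rate e1 and a false-negative rate e2; an economic design then picks the n that minimises an expected-cost function. The cost structure in the sketch below (sampling cost, escaped-defective penalty when the lot is accepted, full rectifying inspection when it is rejected) is a simplified placeholder, not the economic model developed in the paper, and all parameter values are illustrative.

```python
def expected_cost(n, lot_size, p, e1, e2, unit_inspect_cost, escape_cost):
    """Expected lot cost under a zero-defect plan with sample size n and
    diagnosis errors e1 = P(flag a conforming item) and e2 = P(miss a
    defective item).  The cost structure is a simplified placeholder."""
    q = p * (1.0 - e2) + (1.0 - p) * e1          # P(an item is flagged defective)
    p_accept = (1.0 - q) ** n                    # accept only if no sampled item is flagged
    cost_sample = unit_inspect_cost * n
    cost_accept = escape_cost * p * (lot_size - n)      # defectives reach the customer
    cost_reject = unit_inspect_cost * (lot_size - n)    # 100% rectifying inspection
    return cost_sample + p_accept * cost_accept + (1.0 - p_accept) * cost_reject

def optimal_sample_size(lot_size, p, e1, e2, unit_inspect_cost, escape_cost):
    """Enumerate n = 0..lot_size and keep the cheapest plan."""
    return min(range(lot_size + 1),
               key=lambda n: expected_cost(n, lot_size, p, e1, e2,
                                           unit_inspect_cost, escape_cost))

# Illustrative numbers only.
n_star = optimal_sample_size(lot_size=1000, p=0.01, e1=0.02, e2=0.05,
                             unit_inspect_cost=1.0, escape_cost=40.0)
print(n_star)
```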