169 research outputs found

    Incorporating Contextual Information in White Blood Cell Identification

    Get PDF
    In this paper we propose a technique for incorporating contextual information into object classification. In the real world, the identity of an object can be ambiguous due to noise in the measurements on which the classification is based. The ambiguity can be reduced by exploiting extra information, referred to as context, which in our case is the identities of the accompanying objects. This technique is applied to white blood cell classification. Comparisons against a "no context" approach demonstrate the superior classification performance achieved by using context. In our particular application, it significantly reduces the false alarm rate and thus greatly reduces the cost of expensive clinical tests.
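The idea of combining a noisy per-object measurement with a context-derived prior can be sketched in a few lines. This is a minimal Bayesian illustration, not the paper's actual model; the likelihood and prior values below are hypothetical.

```python
import numpy as np

def contextual_posterior(likelihoods, context_prior):
    """Combine per-object class likelihoods p(x|c) with a prior p(c|context)
    estimated from the accompanying objects: posterior ∝ likelihood × prior."""
    joint = likelihoods * context_prior
    return joint / joint.sum()

# Two classes; the measurement alone is ambiguous (55/45),
# but the context strongly favors class 0.
likelihoods = np.array([0.55, 0.45])
context_prior = np.array([0.9, 0.1])
post = contextual_posterior(likelihoods, context_prior)
# post[0] is now well above the no-context value of 0.55.
```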

    Monotonicity Hints

    Get PDF
    A hint is any piece of side information about the target function to be learned. We consider the monotonicity hint, which states that the target function is monotonic in some or all of the input variables. The application of monotonicity hints is demonstrated on two real-world problems: a credit card application task and a problem in medical diagnosis. A measure of the monotonicity error of a candidate function is defined, and an objective function for the enforcement of monotonicity is derived from Bayesian principles. We report experimental results showing that monotonicity hints lead to a statistically significant improvement in performance on both problems.
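A monotonicity-error measure of the kind described can be sketched as a sampled penalty: perturb inputs upward along one coordinate and penalize any decrease in the model's output. This is a minimal illustration, not the paper's exact objective; the model `f` below is a hypothetical stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Stand-in candidate model, used only to illustrate the penalty:
    # non-monotone in x0, strictly increasing in x1.
    return x[:, 0] ** 2 + 0.5 * x[:, 1]

def monotonicity_penalty(f, X, dim, delta=0.1):
    """Average squared violation of 'f is increasing in coordinate dim',
    estimated on upward-perturbed copies of the sample X."""
    X_up = X.copy()
    X_up[:, dim] += delta
    violation = np.maximum(f(X) - f(X_up), 0.0)  # > 0 wherever f decreases
    return float(np.mean(violation ** 2))

X = rng.uniform(-1, 1, size=(1000, 2))
p1 = monotonicity_penalty(f, X, dim=1)  # f is increasing in x1 -> zero penalty
p0 = monotonicity_penalty(f, X, dim=0)  # f is non-monotone in x0 -> positive
```

In training, such a term would be added to the data-fit loss with a weight controlling how strictly monotonicity is enforced.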

    Status and Prospects of Top-Quark Physics

    Full text link
    The top quark is the heaviest elementary particle observed to date. Its large mass of about 173 GeV/c^2 makes the top quark act differently from other elementary fermions: it decays before it hadronises, passing its spin information on to its decay products. In addition, the top quark plays an important role in higher-order loop corrections to standard model processes, which makes the top quark mass a crucial parameter for precision tests of the electroweak theory. The top quark is also a powerful probe for new phenomena beyond the standard model. At the time of its discovery at the Tevatron in 1995, only a few properties of the top quark could be measured. In recent years, since the start of Tevatron Run II, the field of top-quark physics has changed and entered a precision era. This report summarises the latest measurements and studies of top-quark properties and gives prospects for future measurements at the Large Hadron Collider (LHC). Comment: 76 pages, 35 figures, submitted to Progress in Particle and Nuclear Physics

    stairs and fire

    Get PDF

    Discussing environmental education in everyday school life: developing projects in the school and initial and continuing teacher education

    Get PDF
    This study discusses how Environmental Education (EE) has been addressed in primary education, and how the teachers of a state school in the municipality of Tangará da Serra/MT, Brazil, understand EE and incorporate it into everyday school life. To that end, interviews were conducted with the teachers taking part in an interdisciplinary EE project at the school studied. The school's project was found to be falling short of its stated objectives because of: teachers' unfamiliarity with the project; deficient teacher training; failure to understand EE as a teaching-learning process; lack of teaching resources; and inadequate planning of activities. Based on these findings, we discuss the impossibility of treating the topic outside interdisciplinary work and, above all, the importance of a deeper study of EE, linking theory and practice, both in teacher education and in school projects, in order to move beyond the traditional association of "EE with ecology, garbage, and vegetable gardens".

    Monotonicity and connectedness in learning systems

    Get PDF
    This thesis studies two properties, monotonicity and connectedness, in the context of machine learning. The first part of the thesis examines the role of monotonicity constraints in machine learning from both practical and theoretical perspectives. Two techniques for enforcing monotonicity in machine learning models are proposed. The first method adds to the objective function a penalty term measuring the degree to which the model violates monotonicity. The penalty term can be interpreted as a Bayesian prior favoring functions that obey monotonicity. This method can enforce monotonicity only approximately, making it appropriate for situations where strict monotonicity may not hold. The second approach is a model which is monotonic by virtue of its functional form. This model is shown to have universal approximation capabilities with respect to the class M of monotonic functions. A variety of theoretical results are also presented regarding M. The generalization behavior of this class is shown to depend heavily on the probability distribution over the input space. Although the VC dimension of M is infinite, the VC entropy (i.e., the expected number of dichotomies) is modest for many distributions, allowing us to obtain bounds on the generalization error. Monte Carlo techniques for estimating the capacity and VC entropy of M are presented. The second part of the thesis considers broader issues in learning theory. Generalization error bounds based on the VC dimension describe a function class by counting the number of dichotomies it induces. In this thesis, a more detailed characterization is presented which takes into account the diversity of a set of dichotomies in addition to its cardinality. Many function classes in common usage are shown to possess a property called connectedness. Models with this property induce dichotomy sets which are highly clustered and have little diversity. We derive an improvement to the VC bound which applies to function classes with the connectedness property.
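Monte Carlo estimation of the number of dichotomies a class induces, as mentioned above, can be sketched by drawing random members of the class and collecting the distinct label patterns they produce on a sample. This is a minimal illustration using random linear threshold functions, not the thesis's estimator for the class M.

```python
import numpy as np

rng = np.random.default_rng(1)

def count_dichotomies(X, n_models=20000):
    """Monte Carlo estimate of the number of distinct dichotomies
    induced on the sample X by random linear threshold functions."""
    n, d = X.shape
    W = rng.normal(size=(n_models, d))
    b = rng.normal(size=(n_models, 1))
    labels = (W @ X.T + b) > 0            # (n_models, n) boolean label patterns
    patterns = {tuple(row) for row in labels}
    return len(patterns)

X = rng.uniform(-1, 1, size=(5, 2))
n_found = count_dichotomies(X)
# Linear classifiers in 2-D realize far fewer than 2^5 = 32 labelings
# of 5 points; the estimate is bounded by Cover's count of 22.
```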

    Monotonic Networks

    Get PDF
    Monotonicity is a constraint which arises in many application domains. We present a machine learning model, the monotonic network, for which monotonicity can be enforced exactly, i.e., by virtue of functional form. A straightforward method for implementing and training a monotonic network is described. Monotonic networks are proven to be universal approximators of continuous, differentiable monotonic functions. We apply monotonic networks to a real-world task in corporate bond rating prediction and compare them to other approaches. 1 Introduction Several recent papers in machine learning have emphasized the importance of priors and domain-specific knowledge. In their well-known presentation of the bias-variance tradeoff, Geman and Bienenstock (1992) conclude by arguing that the crucial issue in learning is the determination of the "right biases" which constrain the model in the appropriate way given the task at hand. The No-Free-Lunch theorem of Wolpert (Wolpert, ...
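Monotonicity by virtue of functional form can be sketched with a min-max construction: take the maximum over groups of the minimum over linear units whose slopes are constrained positive. This is a minimal one-input sketch under those assumptions, not a full implementation of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(2)

def monotonic_net(x, W, b):
    """Min-max network on a scalar input x:
    y = max_k min_j ( exp(W[k, j]) * x + b[k, j] ).
    exp() forces every slope positive, so the output is increasing
    in x by construction, for ANY parameter values W, b."""
    z = np.exp(W) * x + b          # (K groups, J units per group)
    return float(np.max(np.min(z, axis=1)))

K, J = 3, 4                         # hypothetical architecture sizes
W = rng.normal(size=(K, J))
b = rng.normal(size=(K, J))
xs = np.linspace(-2.0, 2.0, 50)
ys = [monotonic_net(x, W, b) for x in xs]
# ys is non-decreasing regardless of how W and b were chosen,
# which is exactly the "enforced exactly" property.
```

Because min and max of increasing functions are themselves increasing, no penalty term or post-hoc check is needed; training only has to fit the data.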

    Generalization Bounds for Connected Function Classes

    No full text
    We derive an improvement to the Vapnik-Chervonenkis bounds on generalization error which applies to many commonly used function classes, including feedforward neural networks. The VC analysis simply counts the number of dichotomies induced by a function class on a set of input vectors. We can achieve a richer description of the flexibility of a function class by taking into account the diversity of the set of dichotomies induced by a class in addition to its cardinality. We show that many popular machine learning models have a property called connectedness. These models induce dichotomies which are highly clustered. By taking this property into account, a tighter bound on out-of-sample performance can be obtained. 1 Introduction Although the Vapnik-Chervonenkis analysis of learning systems (Vapnik and Chervonenkis, 1971; Vapnik, 1982, 1995) has provided valuable insight into the process of learning, the VC bounds are rarely useful when evaluated numerically. The bounds typically requi...