22 research outputs found

    Produzir de outro modo [Producing in another way]

    Research on farming systems enables an integrated understanding of the economic and agronomic reasons that lead farmers to intensify their production methods, with damaging consequences for the environment. This methodological background underpins proposals that seek to reconcile specialized grain production with a reduction in input use, without discouraging farmers through poorer economic results.

    Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences

    We consider fitting non-categorical data to a parametric family of distributions by means of tests based on (h, ϕ)-divergence estimates. The class of (h, ϕ)-divergences, introduced in Salicrú et al. (1993), includes the well-known classes of φ-divergences, Bregman divergences and distortion measures; the most classic examples are the Kullback-Leibler, Rényi and Tsallis divergences. Most (h, ϕ)-divergences are associated with (h, ϕ)-entropy functionals, e.g., the Kullback-Leibler divergence with Shannon entropy. Distributions maximizing (h, ϕ)-entropies under moment constraints arise in numerous applications and are also of theoretical interest. Besides the family of exponential distributions, which maximize Shannon entropy, see, e.g., Bercher (2014) for an overview of various information inequalities involving the so-called q-Gaussian distributions, i.e., distributions maximizing Rényi (or Tsallis) entropy under variance constraints. For distributions maximizing Shannon or Rényi entropy under moment constraints, the related divergence is well known to reduce to an entropy difference, so estimating the divergence reduces to estimating the entropy; see Girardin and Lequesne (2013a, 2013b). A commonly used non-parametric procedure for estimating entropy is the nearest-neighbors method; see Vasicek (1976) for Shannon entropy and Leonenko et al. (2008) for Rényi entropy. Vasicek (1976) derived a test of normality whose statistic involves a Shannon entropy difference, opening the way for numerous authors who adapted or extended the procedure to obtain goodness-of-fit tests for various sub-families of exponential distributions. Recently, Girardin and Lequesne (2013b) considered goodness-of-fit tests for q-Gaussian distributions (among which the non-standard Student distribution arises as a meaningful example) based on Rényi divergence and entropy differences. We show how this methodology extends to families of distributions maximizing other (h, ϕ)-entropies.
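    To make the entropy-difference idea concrete, the following minimal Python sketch (not from the paper; the function names and the window size m are our own illustrative choices) implements Vasicek's spacing estimator of Shannon entropy and the resulting normality statistic exp(H_mn)/s. Because the Gaussian density maximizes Shannon entropy for a given standard deviation, this statistic tends under normality to sqrt(2*pi*e) ≈ 4.13, and values well below that are evidence against normality (the actual test uses critical values tabulated in Vasicek, 1976).

        import numpy as np

        def vasicek_entropy(x, m):
            # Vasicek (1976) spacing estimator of Shannon entropy:
            #   H_mn = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
            # where order statistics below x_(1) (above x_(n)) are clamped
            # to x_(1) (x_(n)); requires 1 <= m < n/2.
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            i = np.arange(n)
            spacings = x[np.minimum(i + m, n - 1)] - x[np.maximum(i - m, 0)]
            return np.mean(np.log(n * spacings / (2.0 * m)))

        def vasicek_normality_statistic(x, m=10):
            # exp(H_mn) / s: under H0 (normal data) this approaches
            # sqrt(2*pi*e) ~= 4.1327, the largest value achievable by any
            # density with standard deviation s, so small values reject H0.
            s = np.std(x)  # ML standard deviation, as in Vasicek's statistic
            return np.exp(vasicek_entropy(x, m)) / s

        rng = np.random.default_rng(0)
        print(vasicek_normality_statistic(rng.normal(size=500)))       # close to 4.13
        print(vasicek_normality_statistic(rng.exponential(size=500)))  # clearly smaller

    The same template carries over to the Rényi case: replacing the Shannon entropy estimator by a Rényi entropy estimator yields the q-Gaussian goodness-of-fit statistics of Girardin and Lequesne (2013b).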

    Anti-microbial activity of Mucosal Associated Invariant T cells

    Mucosal-associated invariant T (MAIT) lymphocytes are characterized by two evolutionarily conserved features: an invariant TCRα chain and restriction by the MHC-related protein MR1. Here we show that MAIT cells are activated by cells infected with different strains of bacteria and yeasts, but not viruses, in both humans and mice. This activation requires a cognate interaction between the invariant T cell receptor (TCR) and MR1, which can present a bacteria-derived ligand. In humans, we observe a striking diminution of MAIT cell blood numbers in patients with bacterial infections such as tuberculosis. In mice, MAIT cells protect against infection by Mycobacterium and Escherichia coli. Thus, MAIT cells are evolutionarily conserved innate-like lymphocytes that sense and help fight off microbial infections.