
    Modelling distributed lag effects in epidemiological time series studies

    The paper argues that much of the existing literature on air pollution and mortality deals only with the transient effects of air pollution. Policy, on the other hand, needs to know when, whether and to what extent pollution-induced increases in mortality are reversed. This involves modelling the entire distributed lag effect of air pollution. Borrowing from econometrics, this paper presents a method by which distributed lag effects can be estimated parsimoniously yet plausibly. The paper presents a time series study, employing this technique, into the relationship between ambient levels of air pollution and daily mortality counts for Manchester. Black Smoke (BS) is shown to have a highly significant effect on mortality counts in the short term. Nevertheless, we find that 80 percent of the deaths attributable to BS would have occurred anyway within one week, whereas the remaining 20 percent of individuals would otherwise have enjoyed a normal life expectancy.
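
    To make the estimation strategy concrete, the following is a minimal sketch of one standard econometric device for a parsimonious distributed lag: an Almon (polynomial) lag inside a Poisson regression of daily deaths on lagged pollution. The synthetic data, lag length and polynomial degree are illustrative assumptions, not the paper's actual specification.

    # Sketch: polynomial (Almon) distributed lag in a Poisson mortality regression.
    # Illustrative only -- the paper's exact specification is not reproduced here.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    T, L, degree = 1000, 14, 3             # days, maximum lag, polynomial degree

    # Synthetic daily Black Smoke concentrations and mortality counts
    bs = rng.gamma(shape=2.0, scale=10.0, size=T)
    deaths = rng.poisson(lam=15.0, size=T)

    # Lag matrix X[t, l] = bs[t - l]
    X = np.column_stack([np.roll(bs, l) for l in range(L + 1)])[L:]
    y = deaths[L:]

    # Almon restriction: lag weight beta_l = sum_j theta_j * (l/L)**j,
    # so X @ beta = (X @ P) @ theta with P[l, j] = (l/L)**j
    P = np.vander(np.arange(L + 1) / L, degree + 1, increasing=True)
    Z = sm.add_constant(X @ P)

    fit = sm.GLM(y, Z, family=sm.families.Poisson()).fit()
    theta = fit.params[1:]                 # polynomial coefficients
    lag_weights = P @ theta                # implied effect of pollution at each lag
    print(lag_weights)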

    The amenity value of the Italian climate

    The hedonic price literature suggests that locations with more favourable characteristics should display compensating wage and house price differentials. Estimates of the marginal willingness to pay for small changes in climate variables are derived by applying the hedonic price technique to Italian data. A hedonic price model is specified in terms of January and July averages. There is considerable empirical support for the hypothesis that amenity values for climate are embedded in the labour and housing markets. Italians would prefer a drier climate during the winter months, while higher summertime temperatures are shown to reduce welfare. These results may have relevance to the task of determining the economic impact of future climate change.
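
    The hedonic calculation can be illustrated with a short sketch: regress log house prices on climate attributes and read an approximate marginal willingness to pay off the coefficients. The data below are synthetic and the variable list is an assumption; this is not the Italian dataset or the paper's specification.

    # Sketch: a hedonic price regression of the kind described above, on synthetic data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500

    jan_temp = rng.normal(7, 4, n)         # mean January temperature (deg C)
    jul_temp = rng.normal(25, 3, n)        # mean July temperature (deg C)
    jan_rain = rng.gamma(2.0, 30.0, n)     # mean January precipitation (mm)
    rooms = rng.integers(2, 6, n)

    # Synthetic log house prices with some dependence on climate amenities
    log_price = (11.5 + 0.02 * jan_temp - 0.01 * jul_temp - 0.001 * jan_rain
                 + 0.15 * rooms + rng.normal(0, 0.2, n))

    X = sm.add_constant(np.column_stack([jan_temp, jul_temp, jan_rain, rooms]))
    fit = sm.OLS(log_price, X).fit()

    # In a semi-log model, d(price)/d(attribute) = beta * price; evaluate at the mean
    mean_price = np.exp(log_price).mean()
    mwtp_jan_temp = fit.params[1] * mean_price   # implicit price per degree of January warmth
    print(fit.params, mwtp_jan_temp)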

    Modelling distributed lag effects in mortality and air pollution studies: the case of Santiago

    Most of the epidemiological literature on air pollution and mortality deals only with single- or dual-pollutant models, whose results are hard to interpret and of questionable value from the policy perspective. In addition, much of the existing literature deals only with the very short-term effects of air pollution, whereas policy makers need to know when, whether and to what extent pollution-induced increases in mortality counts are reversed. This involves modelling the infinite distributed lag effects of air pollution. Borrowing from econometrics, this paper presents a method by which the infinite distributed lag effects can be estimated parsimoniously yet plausibly. The paper presents a time series study, employing this technique, into the relationship between ambient levels of air pollution and daily mortality counts for Santiago, which confirms that the infinite lag effects are highly significant. It is also shown that day-to-day variations in NO2 concentrations, and in the concentrations of both fine and coarse particulates, are associated with short-term variations in death rates. These findings are made in the context of a model that simultaneously includes six different pollutants. Evidence is found pointing to the operation of a very short-term harvesting effect.
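
    As a companion to the Manchester abstract above, here is a sketch of one common way to impose an infinite distributed lag parsimoniously: a geometrically decaying (exponentially filtered) exposure whose decay rate is profiled over. The data are synthetic and the device is a generic one, not necessarily the paper's exact estimator.

    # Sketch: an infinite (geometric) distributed lag via an exponentially filtered
    # exposure, with the decay parameter chosen by profiling the log-likelihood.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    T = 1500
    no2 = rng.gamma(2.0, 20.0, T)          # synthetic daily NO2 concentrations
    deaths = rng.poisson(12.0, T)          # synthetic daily mortality counts

    def filtered(x, lam):
        """x*_t = sum_{l>=0} lam**l * x_{t-l}, computed recursively."""
        out = np.empty_like(x)
        out[0] = x[0]
        for t in range(1, len(x)):
            out[t] = x[t] + lam * out[t - 1]
        return out

    best = None
    for lam in np.linspace(0.05, 0.95, 19):       # profile over the decay rate
        X = sm.add_constant(filtered(no2, lam))
        fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
        if best is None or fit.llf > best[0]:
            best = (fit.llf, lam, fit.params[1])

    llf, lam_hat, beta_hat = best
    lag_effects = beta_hat * lam_hat ** np.arange(15)   # implied effect at lags 0..14
    print(lam_hat, lag_effects)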

    Valuing congestion costs in the British Museum

    Museums are potentially congestible resources because the exhibits they contain are, in any relevant sense of the word, irreproducible. Insofar as visitor congestion diminishes the value of individuals’ visits it constitutes an additional reason for charging for admission to museums, albeit one not previously considered. A policy of free access to a museum containing unique treasures may dissipate the economic benefits of the museum. Within the context of an empirical study undertaken for the British Museum using stated preference techniques, it is shown that the congestion cost imposed by the marginal visitor is quite high. Notwithstanding the argument that visits to the museum may possess external benefits, this points to the desirability of introducing charges for admission. Furthermore, it is shown that the marginal congestion cost decreases, at least over a range, as visitor numbers increase. In other words, beyond certain levels introducing more visitors does not worsen congestion. This suggests that, contrary to what is often assumed, charging more during periods of high demand may be undesirable. Insofar as congestion is a widespread phenomenon in important museums, galleries and sites of historical heritage, the issues raised in this paper, as well as the methodology devised to determine congestion costs, could have widespread application.
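
    The congestion-cost calculation can be illustrated with a stated-preference sketch: a binary logit on hypothetical accept/reject responses to combinations of admission charge and crowding, with the marginal congestion cost recovered as a ratio of coefficients. The survey design, variable names and numbers below are illustrative assumptions, not the British Museum study's data or design.

    # Sketch: recovering a marginal congestion cost from stated-preference choices
    # with a binary logit, on made-up data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 2000
    price = rng.uniform(0, 20, n)              # hypothetical admission charge (GBP)
    crowding = rng.uniform(0, 100, n)          # hypothetical crowding index

    # Synthetic accept/reject responses from a latent utility with logistic error
    utility = 2.0 - 0.15 * price - 0.02 * crowding + rng.logistic(0, 1, n)
    accept = (utility > 0).astype(int)

    X = sm.add_constant(np.column_stack([price, crowding]))
    fit = sm.Logit(accept, X).fit(disp=False)

    beta_price, beta_crowd = fit.params[1], fit.params[2]
    # Admission-price cut needed to offset one extra unit of crowding
    # (positive when both coefficients are negative)
    marginal_congestion_cost = beta_crowd / beta_price
    print(marginal_congestion_cost)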

    Galactic Cannibalism: the Origin of the Magellanic Stream

    We are in a privileged location in the Universe which allows us to observe galactic interactions from close range -- the merger of our two nearest dwarf satellite galaxies, the LMC and SMC. It is important to understand the local merger process before we can have confidence in understanding mergers at high redshift. We present high resolution N-body+SPH simulations of the disruption of the LMC and SMC and the formation of the Magellanic Stream, and discuss the implications for galaxy formation and evolution. Comment: 2 pages, 1 figure, to appear in "The Evolution of Galaxies II: Basic Building Blocks" (2002), ed. M. Sauvage et al. (Kluwer)

    High-resolution N-body Simulations of Galactic Cannibalism: The Magellanic Stream

    Hierarchical clustering represents the favoured paradigm for galaxy formation throughout the Universe; due to its proximity, the Magellanic system offers one of the few opportunities for astrophysicists to decompose the full six-dimensional phase-space history of a satellite in the midst of being cannibalised by its host galaxy. The availability of improved observational data for the Magellanic Stream and parallel advances in computational power have led us to revisit the canonical tidal model describing the disruption of the Small Magellanic Cloud and the consequent formation of the Stream. We suggest improvements to the tidal model in light of these recent advances. Comment: 6 pages, 4 figures, LaTeX (gcdv.sty). Refereed contribution to the 5th Galactic Chemodynamics conference held in Swinburne, July 2003. Accepted for publication in PASA. Version with high resolution figures available at http://astronomy.swin.edu.au/staff/tconnors/publications.htm
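
    For readers unfamiliar with the tidal picture invoked in the two Magellanic Stream abstracts above, here is a toy test-particle sketch of tidal stripping: a satellite on a circular orbit around a fixed point-mass host, with a co-moving ring of test particles that is sheared into leading and trailing debris. The units, masses and leapfrog step are arbitrary illustrative choices; this is not the papers' N-body+SPH setup.

    # Sketch: tidal stripping of a test-particle ring around a satellite orbiting
    # a point-mass host. Toy units; purely illustrative.
    import numpy as np

    G, M_host, M_sat, soft = 1.0, 1.0, 1e-3, 0.02
    dt, n_steps = 1e-3, 20000

    # Satellite on a circular orbit of radius 1 around the host at the origin
    sat = np.array([1.0, 0.0])
    sat_v = np.array([0.0, np.sqrt(G * M_host / 1.0)])

    # Ring of test particles initially co-moving with the satellite
    theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    pos = sat + 0.05 * np.column_stack([np.cos(theta), np.sin(theta)])
    vel = np.tile(sat_v, (200, 1))

    def accel(p, sat_pos):
        """Acceleration from the host (point mass at origin) plus softened satellite."""
        r = np.linalg.norm(p, axis=-1, keepdims=True)
        a_host = -G * M_host * p / r ** 3
        d = p - sat_pos
        a_sat = -G * M_sat * d / (np.sum(d * d, axis=-1, keepdims=True) + soft ** 2) ** 1.5
        return a_host + a_sat

    for _ in range(n_steps):               # leapfrog (kick-drift-kick)
        vel += 0.5 * dt * accel(pos, sat)
        sat_v += 0.5 * dt * accel(sat, sat)
        pos += dt * vel
        sat += dt * sat_v
        vel += 0.5 * dt * accel(pos, sat)
        sat_v += 0.5 * dt * accel(sat, sat)

    # Stripped particles spread inward and outward of the satellite's orbit
    r_debris = np.linalg.norm(pos, axis=1)
    print("debris spans radii", r_debris.min(), "to", r_debris.max())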

    Implementation of a geometrically informed and energetically constrained mesoscale eddy parameterization in an ocean circulation model

    The global stratification and circulation of the ocean and their sensitivities to changes in forcing depend crucially on the representation of the mesoscale eddy field. Here, a geometrically informed and energetically constrained parameterization framework for mesoscale eddies, termed GEOMETRIC, is proposed and implemented in three-dimensional primitive equation channel and sector models. The GEOMETRIC framework closes mesoscale eddy fluxes according to the standard Gent--McWilliams scheme, but with the eddy transfer coefficient constrained by the depth-integrated eddy energy field, provided through a prognostic eddy energy budget evolving with the mean state. It is found that coarse-resolution calculations employing GEOMETRIC broadly reproduce the sensitivities of the eddy-permitting reference calculations in the emergent circumpolar transport, the meridional overturning circulation profile and the depth-integrated eddy energy signature; in particular, eddy saturation emerges in the sector configuration. Some differences arise, attributed here to the simple prognostic eddy energy budget employed, to be improved upon in future investigations. The GEOMETRIC framework thus proposes a shift in paradigm, from a focus on how to close for eddy fluxes to a focus on the representation of eddy energetics. Comment: 19 pages, 9 figures, submitted to Journal of Physical Oceanography; comments welcome. (Copyright statement: see section 7a of https://www.ametsoc.org/ams/index.cfm/publications/ethical-guidelines-and-ams-policies/ams-copyright-policy/)
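
    The energetic constraint at the heart of a GEOMETRIC-style closure can be sketched for a single water column: a vertically uniform Gent--McWilliams coefficient is set so that the depth integral of kappa * M^2/N equals alpha times the depth-integrated eddy energy. The profiles and the value of alpha below are made-up illustrative inputs; the actual implementation uses a prognostic eddy energy budget and model fields rather than a uniform coefficient.

    # Sketch: kappa * integral(M^2/N dz) = alpha * integral(E dz) for one column.
    import numpy as np

    alpha = 0.04                                 # non-dimensional efficiency parameter
    z = np.linspace(0.0, -4000.0, 401)           # depth (m)
    dz = np.abs(np.diff(z).mean())

    N2 = 1e-5 * np.exp(z / 1000.0) + 1e-7        # stratification N^2 (s^-2), illustrative
    M2 = 1e-8 * np.exp(z / 800.0)                # horizontal buoyancy gradient |grad_h b| (s^-2)
    E = 1e-2 * np.exp(z / 500.0)                 # eddy energy density (m^2 s^-2)

    N = np.sqrt(N2)
    kappa = alpha * np.sum(E) * dz / (np.sum(M2 / N) * dz)    # m^2 s^-1
    print("eddy transfer coefficient kappa =", kappa, "m^2/s")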

    Uneconomical Diagnosis of Cladograms: Comments on Wheeler and Nixon's Method for Sankoff Optimization

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/74972/1/j.1096-0031.1997.tb00249.x.pd

    Contrastive Learning Can Find An Optimal Basis For Approximately View-Invariant Functions

    Contrastive learning is a powerful framework for learning self-supervised representations that generalize well to downstream supervised tasks. We show that multiple existing contrastive learning methods can be reinterpreted as learning kernel functions that approximate a fixed positive-pair kernel. We then prove that a simple representation obtained by combining this kernel with PCA provably minimizes the worst-case approximation error of linear predictors, under a straightforward assumption that positive pairs have similar labels. Our analysis is based on a decomposition of the target function in terms of the eigenfunctions of a positive-pair Markov chain, and a surprising equivalence between these eigenfunctions and the output of Kernel PCA. We give generalization bounds for downstream linear prediction using our Kernel PCA representation, and show empirically on a set of synthetic tasks that applying Kernel PCA to contrastive learning models can indeed approximately recover the Markov chain eigenfunctions, although the accuracy depends on the kernel parameterization as well as on the augmentation strength. Comment: Published at ICLR 202
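
    The kernel-plus-PCA recipe described here can be sketched directly: build the Gram matrix of a contrastive-style kernel on a batch of embeddings, double-centre it, and take the leading eigenvectors as the representation. The random embeddings and the exponential (InfoNCE-style) kernel below are stand-in assumptions for a trained encoder and for whichever positive-pair kernel a given method approximates.

    # Sketch: Kernel PCA applied to a contrastive-style kernel, with stand-in embeddings.
    import numpy as np

    rng = np.random.default_rng(4)
    n, d, tau, n_components = 200, 32, 0.1, 16

    # Stand-in for encoder outputs f(x); a trained contrastive model would go here.
    f = rng.normal(size=(n, d))
    f /= np.linalg.norm(f, axis=1, keepdims=True)

    # Kernel implied by an InfoNCE-style similarity: k(x, x') = exp(f(x).f(x') / tau)
    K = np.exp(f @ f.T / tau)

    # Kernel PCA: double-centre the Gram matrix, then take leading eigenvectors
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    representation = eigvecs[:, idx] * np.sqrt(np.clip(eigvals[idx], 0, None))
    print(representation.shape)      # (n, n_components) features for linear prediction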

    Efficient FPT algorithms for (strict) compatibility of unrooted phylogenetic trees

    In phylogenetics, a central problem is to infer the evolutionary relationships between a set of species X; these relationships are often depicted via a phylogenetic tree -- a tree having its leaves univocally labeled by elements of X and without degree-2 nodes -- called the "species tree". One common approach for reconstructing a species tree consists in first constructing several phylogenetic trees from primary data (e.g. DNA sequences originating from some species in X), and then constructing a single phylogenetic tree maximizing the "concordance" with the input trees. The so-obtained tree is our estimation of the species tree and, when the input trees are defined on overlapping -- but not identical -- sets of labels, is called a "supertree". In this paper, we focus on two problems that are central when combining phylogenetic trees into a supertree: the compatibility and the strict compatibility problems for unrooted phylogenetic trees. These problems are strongly related, respectively, to the notions of "containing as a minor" and "containing as a topological minor" in the graph community. Both problems are known to be fixed-parameter tractable in the number of input trees k, by using their expressibility in Monadic Second Order Logic and a reduction to graphs of bounded treewidth. Motivated by the fact that the dependency on k of these algorithms is prohibitively large, we give the first explicit dynamic programming algorithms for solving these problems, both running in time 2^{O(k^2)} · n, where n is the total size of the input. Comment: 18 pages, 1 figure
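
    As a small illustration of the objects involved (not of the paper's fixed-parameter dynamic programming algorithms), the sketch below represents unrooted phylogenetic trees by their leaf sets and nontrivial splits, and checks the "displays" relation underlying compatibility and strict compatibility: a candidate supertree displays an input tree if every split of the input tree is induced by the supertree on the input tree's leaves, and strictly displays it if the induced splits match exactly. The example trees are made up.

    # Sketch: the "displays" relation behind (strict) compatibility, on a toy split
    # representation of unrooted trees. This only checks a given candidate supertree.
    def split(a, b):
        """Canonical representation of the bipartition a | b."""
        return frozenset({frozenset(a), frozenset(b)})

    def restrict(splits, leaves):
        """Nontrivial splits induced on the leaf subset `leaves`."""
        out = set()
        for s in splits:
            a, b = (side & leaves for side in s)
            if len(a) >= 2 and len(b) >= 2:
                out.add(split(a, b))
        return out

    def displays(supertree, tree):
        """Every split of `tree` is induced by `supertree` on the tree's leaves."""
        super_leaves, super_splits = supertree
        leaves, splits = tree
        return leaves <= super_leaves and splits <= restrict(super_splits, leaves)

    def strictly_displays(supertree, tree):
        """The supertree induces exactly the splits of `tree` (topological-minor flavour)."""
        super_leaves, super_splits = supertree
        leaves, splits = tree
        return leaves <= super_leaves and splits == restrict(super_splits, leaves)

    # Toy example: a caterpillar supertree on {a, b, c, d, e}
    T = (frozenset("abcde"), {split("ab", "cde"), split("abc", "de")})
    t1 = (frozenset("abcd"), {split("ab", "cd")})   # displayed, in fact strictly
    t2 = (frozenset("acde"), {split("ad", "ce")})   # not displayed
    print(displays(T, t1), strictly_displays(T, t1), displays(T, t2))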