Causality and Association: The Statistical and Legal Approaches
This paper discusses different needs and approaches to establishing
``causation'' that are relevant in legal cases involving statistical input
based on epidemiological (or more generally observational or population-based)
information. We distinguish between three versions of ``cause'': the first
involves negligence in providing or allowing exposure, the second involves
``cause'' as it is shown through a scientifically proved increased risk of an
outcome from the exposure in a population, and the third considers ``cause'' as
it might apply to an individual plaintiff based on the first two. The
population-oriented ``cause'' is that commonly addressed by statisticians, and
we propose a variation on the Bradford Hill approach to testing such causality
in an observational framework, and discuss how such a systematic series of
tests might be considered in a legal context. We review some current legal
approaches to using probabilistic statements, and link these with the
scientific methodology as developed here. In particular, we provide an approach
both to the idea of individual outcomes being caused on a balance of
probabilities, and to the idea of material contribution to such outcomes.
Statistical terminology and legal usage of terms such as ``proof on the balance
of probabilities'' or ``causation'' can easily become confused, largely due to
similar language describing dissimilar concepts; we conclude, however, that a
careful analysis can identify and separate those areas in which a legal
decision alone is required and those areas in which scientific approaches are
useful.
Comment: Published at http://dx.doi.org/10.1214/07-STS234 in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
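One quantitative bridge between the population-level and individual-level senses of ``cause'' discussed above is the probability of causation derived from a relative risk. As a hedged sketch (a standard epidemiological formula, not a construction from this paper), it exceeds 0.5, the balance-of-probabilities threshold, exactly when RR > 2:

```python
def probability_of_causation(rr: float) -> float:
    """Attributable fraction among the exposed, (RR - 1)/RR.

    A standard (though contested) bridge between a population relative
    risk and the legal 'more likely than not' threshold: it exceeds 0.5
    exactly when RR > 2.
    """
    if rr <= 0:
        raise ValueError("relative risk must be positive")
    return max(0.0, (rr - 1.0) / rr)

# Example: a relative risk of 3 gives a probability of causation of ~0.67.
print(probability_of_causation(3.0))
```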
Lung cancer and passive smoking: reconciling the biochemical and epidemiological approaches.
The accurate determination of exposure to environmental tobacco smoke is notoriously difficult. To date there have been two approaches to determining this exposure in studies of the association between passive smoking and lung cancer: the biochemical approach, using cotinine as the principal marker, and the epidemiological approach. Typically the former has yielded much lower relative risks than the latter and has tended to be ignored in its favour, although there has been considerable debate as to the logical basis for this. We settle this question by showing that, using the epidemiologically based meta-analysis technique of Wald et al. (1986) and the misclassification models in the EPA Draft Review (1990), one arrives, using all current studies, at a result virtually identical with the biochemically based conclusions of Darby and Pike (1988) or Repace and Lowry (1990). The conduct of this meta-analysis itself raises a number of important methodological questions, including the validity of the inclusion of studies, the use of estimates adjusted for covariates, and the statistical significance of estimates based on meta-analysis of the epidemiological data. The best estimate of relative risk from spousal smoking is shown to be approximately 1.05-1.10 based on either of these approaches, but it is suggested that considerable extra work is needed to establish whether this is significantly raised.
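As a hedged illustration of the pooling step underlying such a meta-analysis (a generic inverse-variance fixed-effect scheme with invented inputs, not the Wald et al. procedure or the study data):

```python
import math

def pooled_relative_risk(rrs, ci_uppers):
    """Fixed-effect (inverse-variance) pooling of relative risks.

    rrs: per-study relative risk estimates.
    ci_uppers: corresponding upper 95% confidence limits, from which
    each study's standard error on the log scale is recovered.
    """
    log_rrs = [math.log(rr) for rr in rrs]
    # SE(log RR) recovered from a 95% CI: (log upper - log RR) / 1.96
    ses = [(math.log(u) - lr) / 1.96 for u, lr in zip(ci_uppers, log_rrs)]
    weights = [1.0 / se**2 for se in ses]
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    lo = math.exp(pooled_log - 1.96 * pooled_se)
    hi = math.exp(pooled_log + 1.96 * pooled_se)
    return math.exp(pooled_log), (lo, hi)

# Illustrative (invented) inputs, not the paper's data:
rr, ci = pooled_relative_risk([1.2, 0.9, 1.4], [2.0, 1.5, 2.4])
print(f"pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```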
Microscopic model of diffusion limited aggregation and electrodeposition in the presence of levelling molecules
A microscopic model of the effect of unbinding in diffusion limited
aggregation based on a cellular automata approach is presented. The geometry
resembles electrochemical deposition - ``ions'' diffuse at random from the top
of a container until encountering a cluster in contact with the bottom, to
which they stick. The model exhibits dendritic (fractal) growth in the
diffusion limited case. The addition of a field eliminates the fractal nature
but the density remains low. The addition of molecules which unbind atoms from
the aggregate transforms the deposit to a 100% dense one (in 3D). The molecules
are remarkably adept at avoiding being trapped. This mimics the effect of
so-called ``leveller'' molecules which are used in electrochemical deposition.
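A minimal sketch of the diffusion limited core of such a cellular automata model, under simplifying assumptions of our own (2D square lattice, one walker at a time, sticking on first contact; the applied field, unbinding molecules, and 3D geometry of the paper are omitted):

```python
import random

def dla_deposit(width=64, height=64, particles=500, seed=0):
    """Diffusion limited deposition on a 2D square lattice.

    A walker is released at the top, random-walks (with periodic side
    walls), and sticks on first contact with the deposit growing from
    the bottom row.
    """
    random.seed(seed)
    occupied = {(x, 0) for x in range(width)}  # bottom row: the substrate
    for _ in range(particles):
        x, y = random.randrange(width), height - 1
        while True:
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x = (x + dx) % width
            y = max(1, min(y + dy, height - 1))
            # Stick as soon as a lattice neighbour belongs to the deposit.
            if any(((x + ex) % width, y + ey) in occupied
                   for ex, ey in [(1, 0), (-1, 0), (0, 1), (0, -1)]):
                occupied.add((x, y))
                break
    return occupied

deposit = dla_deposit()
print(f"{len(deposit)} sites occupied")
```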
• Global warming is a statistically confirmed long-term phenomenon.
• Somewhat surprisingly, its most visible consequence is not the warming itself but the increased climate variability.
• In this talk, we explain why increased climate variability is more visible than the global warming itself.
• In this explanation, we use general system theory ideas.
Proton lifetime bounds from chirally symmetric lattice QCD
We present results for the matrix elements relevant for proton decay in Grand
Unified Theories (GUTs). The calculation is performed at a fixed lattice
spacing a^{-1}=1.73(3) GeV using 2+1 flavors of domain wall fermions on
lattices of size 16^3\times32 and 24^3\times64 with a fifth dimension of length
16. We use the indirect method, which relies on an effective field theory
description of proton decay and requires estimating the low energy constants
\alpha = -0.0112(25) GeV^3 and \beta = 0.0120(26) GeV^3. We relate these low
energy constants to the proton decay matrix elements using leading order
chiral perturbation theory. These can then be combined with experimental
bounds on the proton lifetime to bound parameters of individual GUTs.
Comment: 17 pages, 9 figures
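As a hedged aside (a textbook dimensional estimate, not the paper's lattice calculation): the way a lifetime bound constrains GUT parameters can be seen from the rough scaling \tau_p \sim M_X^4/(\alpha_{GUT}^2 m_p^5):

```python
# Back-of-envelope only: order-of-magnitude proton lifetime from the
# dimensional estimate tau_p ~ M_X^4 / (alpha_GUT^2 * m_p^5), with
# illustrative values for the GUT coupling and gauge-boson mass M_X.
GEV_INV_TO_SEC = 6.582e-25      # hbar in GeV*s
SEC_PER_YEAR = 3.156e7

def proton_lifetime_years(m_x_gev, alpha_gut=1 / 40, m_p=0.938):
    """Order-of-magnitude proton lifetime for GUT scale m_x_gev (GeV)."""
    width_gev = alpha_gut**2 * m_p**5 / m_x_gev**4   # decay width in GeV
    return (1.0 / width_gev) * GEV_INV_TO_SEC / SEC_PER_YEAR

# Example: M_X = 1e16 GeV gives tau ~ 1e35 years, above current
# experimental limits of order 1e34 years for p -> e+ pi0.
print(f"{proton_lifetime_years(1e16):.2e} years")
```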
Trust, professionalism and regulation: a critical comparison of Medicine and Law
Background & Aims: Trust, professionalism and regulation are complex social phenomena, which are contextually dependent and dynamic. This project aims to explore the concept of 'trust' in Law and Medicine, questioning what it means to be a 'trustworthy' professional and how these understandings relate to ideas of professionalism and regulation. Methods: This study draws on a comprehensive review of the literature and interviews with thirty participants from within, or related to, the UK legal and medical professions. Participants included practitioners, those creating and implementing policy, and public representatives. Data were analysed using the 'logics approach' from Political Discourse Theory (PDT). This helped us draw out taken-for-granted ideas and beliefs about trust, professionalism and regulation and expose these to critique. Results: Participants highly valued patient/client trust, seeing it as fundamental to the functioning of their professional 'service'. Trust was seen as attributed primarily to the individual practitioner and maintained through demonstrating measurable 'professionalism'. Practitioners were understood to be individually responsible for preserving their image as a 'good professional', via evidencing their 'professionalism' to the patient/client and the regulator. Discussion: Current ways of thinking about trust permitted trust in individuals to be maintained even when trust in the professions as a whole was challenged. However, for medical professionals particularly, this was predicated on a need to 'evidence' that one was a 'good professional' through intensive and continual regulation. This created an increased dependency on a 'trust industry' of regulatory bodies and systems. This project critically questions how regulation shapes and impacts trust in the professions. It is a problem-driven approach, which seeks to break with current patterns of thinking and ask: 'what might be possible instead?' This opens up an ideological space and new viewpoints, whereby audiences are encouraged to consider future change.
Guidelines for human gene nomenclature.
Standardized gene naming is crucial for effective communication about genes, and as genomics becomes increasingly important in healthcare, the need for a consistent language for human genes becomes ever more vital. Here we present the current HUGO Gene Nomenclature Committee (HGNC) guidelines for naming not only protein-coding but also RNA genes and pseudogenes, and outline the changes in approach and ethos that have resulted from the discoveries of the last few decades.
Funding: National Human Genome Research Institute (NHGRI) grant U24HG003345 (1.5.2018-30.4.2023); Wellcome Trust grant 208349/Z/17/Z (1.9.2017-31.8.2022).
Random billiards with wall temperature and associated Markov chains
By a random billiard we mean a billiard system in which the standard specular
reflection rule is replaced with a Markov transition probabilities operator P
that, at each collision of the billiard particle with the boundary of the
billiard domain, gives the probability distribution of the post-collision
velocity for a given pre-collision velocity. A random billiard with
microstructure (RBM) is a random billiard for which P is derived from a choice
of geometric/mechanical structure on the boundary of the billiard domain. RBMs
provide simple and explicit mechanical models of particle-surface interaction
that can incorporate thermal effects and permit a detailed study of
thermostatic action from the perspective of the standard theory of Markov
chains on general state spaces.
We focus on the operator P itself and how it relates to the
mechanical/geometric features of the microstructure, such as mass ratios,
curvatures, and potentials. The main results are as follows: (1) we
characterize the stationary probabilities (equilibrium states) of P and show
how standard equilibrium distributions studied in classical statistical
mechanics, such as the Maxwell-Boltzmann distribution and the Knudsen cosine
law, arise naturally as generalized invariant billiard measures; (2) we obtain
some basic functional theoretic properties of P. Under very general conditions,
we show that P is a self-adjoint operator of norm 1 on an appropriate Hilbert
space. In a simple but illustrative example, we show that P is a compact
(Hilbert-Schmidt) operator. This leads to the issue of relating the spectrum of
eigenvalues of P to the features of the microstructure; (3) we explore the
latter issue both analytically and numerically in a few representative
examples; (4) we present a general algorithm for simulating these Markov
chains based on a geometric description of the invariant volumes of classical
statistical mechanics.
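A hedged toy sketch in the spirit of these chains (a simplified wall-temperature rule of our own, not the operator P derived from a microstructure): post-collision normal speeds form a Markov chain whose stationary law, here the flux-weighted Rayleigh distribution at the wall temperature, plays the role the Knudsen cosine law plays in the paper:

```python
import math
import random

def step(v, wall_T, accommodation=0.5):
    """One collision of a toy Markov chain on post-collision speeds.

    With probability `accommodation` the particle thermalizes: its new
    normal speed is drawn from the wall's flux-weighted (Rayleigh)
    distribution at temperature wall_T (unit mass, k_B = 1); otherwise
    it reflects specularly and keeps its speed.
    """
    if random.random() < accommodation:
        # Rayleigh sample via inverse CDF: v = sqrt(-2 T ln U)
        return math.sqrt(-2.0 * wall_T * math.log(random.random()))
    return v

random.seed(1)
v, wall_T = 5.0, 1.0
energies = []
for _ in range(20000):
    v = step(v, wall_T)
    energies.append(v * v / 2.0)  # kinetic energy of the normal motion

# The mean energy relaxes to the wall value <v^2/2> = T for the
# Rayleigh (flux-weighted) stationary distribution.
print(sum(energies[5000:]) / len(energies[5000:]))  # ~1.0
```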
Water, Climate, and Social Change in a Fragile Landscape
We present here and in the companion papers an analysis of sustainability in the Middle Rio Grande region of the U.S.-Mexico border and propose an interdisciplinary research agenda focused on the coupled human and natural dimensions of water resources sustainability in the face of climate and social change in an international border region. Key threats to water sustainability in the Middle Rio Grande River region include: (1) increasing salinization of surface and ground water, (2) increasing water demand from a growing population in the El Paso/Ciudad Juarez area on top of an already high base demand from irrigated agriculture, (3) water quality impacts from agricultural, municipal, and industrial discharges to the river, (4) changing regional climate that portends increased frequency and intensity of droughts interspersed with more intensive rainfall and flooding events, and (5) disparate water planning and management systems between different states in the U.S. and between the U.S. and Mexico. In addition to these challenges, there is an increasing demand from a significant regional population that is (and historically has been) underserved in terms of access to affordable potable water. To address these challenges to water resources sustainability, we have focused on: (1) the determinants of resilience and transformability in an ecological/social setting on an international border and how they can be measured and predicted; and (2) the drivers of change: what are they (climate, social, etc.) and how are they impacting the coupled human and natural dimensions of water sustainability on the border? To tackle these challenges, we propose a research agenda based on a complex systems approach that focuses on the linkages and feedbacks of the natural, built/managed, and social dimensions of the surface and groundwater budget of the region. The approach that we propose incorporates elements of systems analysis, complexity science, and the use of modeling tools such as scenario planning and back-casting to link the quantitative with the qualitative. This approach is unique for our region, as are our bi-national focus and our conceptualization of water capital. In particular, the concept of water capital provides the basis for a new interdisciplinary paradigm that integrates social, economic, and natural sectors within a systems framework in order to understand and characterize water resources sustainability. This proposed approach would not only provide a framework for water sustainability decision making for our bi-national region at the local, state, and federal levels, but could serve as a model for similar border regions and/or international rivers in arid and semi-arid regions in the Middle East, Africa, Asia, and Latin America.
Visual parameter optimisation for biomedical image processing
Background: Biomedical image processing methods require users to optimise input parameters to ensure high quality
output. This presents two challenges. First, it is difficult to optimise multiple input parameters for multiple
input images. Second, it is difficult to achieve an understanding of underlying algorithms, in particular, relationships
between input and output.
Results: We present a visualisation method that transforms users' ability to understand algorithm behaviour by
integrating input and output, and by supporting exploration of their relationships. We discuss its application to a
colour deconvolution technique for stained histology images and show how it enabled a domain expert to
identify suitable parameter values for the deconvolution of two types of images, and metrics to quantify
deconvolution performance. It also enabled a breakthrough in understanding by invalidating an underlying
assumption about the algorithm.
Conclusions: The visualisation method presented here provides analysis capability for multiple inputs and outputs
in biomedical image processing that is not supported by previous analysis software. The analysis supported by our
method is not feasible with conventional trial-and-error approaches.
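For context, a hedged sketch of the kind of colour deconvolution the case study tunes (the classic Ruifrok and Johnston optical-density formulation; the stain vectors below are illustrative values, not parameters from the paper):

```python
import numpy as np

def colour_deconvolve(rgb, stain_matrix):
    """Separate stains in an RGB histology image (Ruifrok & Johnston style).

    rgb: float array (..., 3) with values in (0, 1].
    stain_matrix: 3x3, rows are unit RGB absorption vectors of each stain.
    Returns per-pixel stain concentrations of shape (..., 3).
    """
    od = -np.log10(np.clip(rgb, 1e-6, 1.0))   # optical density transform
    return od @ np.linalg.inv(stain_matrix)   # unmix into concentrations

# Illustrative haematoxylin / eosin / residual stain vectors (assumed):
stains = np.array([[0.650, 0.704, 0.286],
                   [0.072, 0.990, 0.105],
                   [0.268, 0.570, 0.776]])
stains /= np.linalg.norm(stains, axis=1, keepdims=True)

pixel = np.array([0.4, 0.3, 0.6])             # one RGB pixel
print(colour_deconvolve(pixel, stains))
```

The input parameters a user must optimise here are exactly the stain vectors: small changes to them change the unmixed concentrations, which is the kind of input-output relationship the visualisation method is designed to expose.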