1,491 research outputs found
Defining and refining effectiveness: Applying narrative and dialogue methods in aid monitoring and evaluation
In this chapter we argue that definitions of 'effectiveness' should be negotiated and that this can be assisted through the careful selection of monitoring and evaluation methods. A flexible view of 'effectiveness' is necessary because the world is complex and ever-changing, and even more importantly because the concept is contested. Inevitably, different stakeholders in development aid hold different perspectives on what 'effectiveness' means in their context. We propose that the use of certain types of qualitative methods in monitoring and evaluation is an important way to promote dialogue between stakeholders on the different definitions of effectiveness. In support of our argument, we demonstrate how two open-ended inquiry methods were applied in the field to first define and then refine the meaning of effectiveness. We also apply a validation framework to test the quality of these methods and to generate insight into the strengths and weaknesses of their application in the field. A key strength of these methods is found to be their attention to capturing diversity of perspective. In addition, both the narrative and dialogue methods are indeed seen to facilitate the negotiation of the meaning of effectiveness between the different project stakeholders. We conclude that a thoughtful and flexible approach to monitoring and evaluation that incorporates such qualitative methods enables effectiveness to be defined and refined, and is conducive to more appropriate, better managed aid.
Consistency of Age Reporting on Death Certificates and Social Security Administration Records Among Elderly African-American Decedents
This paper investigates the quality of age reporting in vital statistics and Social Security/Medicare data among elderly African-Americans. The authors examine whether the death certificate or the Social Security record is more likely to accurately reflect the decedent's true age at death by matching their sample to the US Censuses of 1900, 1910 and 1920, and identify factors associated with consistency of age reporting on death certificates and Social Security records. The results reveal significant discrepancies in age-at-death data. Birth record availability and literacy were identified as key predictors of age agreement. The match to an early-life census record showed greater agreement with the Social Security age than with the death certificate age at death. The results have implications for the collection of age information in surveys of elderly African-Americans.
Case-control study to detect protective factors on pig farms with low Salmonella prevalence
The prevalence of Salmonella in UK pigs is amongst the highest in Europe, highlighting the risk to public health and the need to investigate on-farm controls. The objective of this study was to identify factors currently in operation on pig farms that had maintained a low Salmonella seroprevalence. For this purpose a case-control study was designed in which pig farms with a low (<10%) seroprevalence were each compared against two randomly selected control farms sharing the same geographical region and production type. A total of 11,452 samples, including pooled and individual floor faeces and environmental samples from pigs and their vicinity, were tested and prevalence examined. In addition, detailed questionnaires were completed during the farm visits to collect descriptive data for risk factor analysis. Control farms had significantly higher prevalence than case farms (19.4% vs. 4.3% for pooled samples and 6.7% vs. 0.1% for individual samples). The two risk factor analyses identified multiple variables associated with Salmonella prevalence, including variables related to feed, effectiveness of cleaning and disinfection, biosecurity and batch production.
Improving the condition number of estimated covariance matrices
High dimensional error covariance matrices and their inverses are used to weight the contribution of observation and background information in data assimilation procedures. As observation error covariance matrices are often obtained by sampling methods, estimates are often degenerate or ill-conditioned, making it impossible to invert an observation error covariance matrix without the use of techniques to reduce its condition number. In this paper we present new theory for two existing methods that can be used to 'recondition' any covariance matrix: ridge regression, and the minimum eigenvalue method. We compare these methods with multiplicative variance inflation, which cannot alter the condition number of a matrix, but is often used to account for neglected correlation information. We investigate the impact of reconditioning on variances and correlations of a general covariance matrix in both a theoretical and practical setting. Improved theoretical understanding provides guidance to users regarding method selection and choice of target condition number. The new theory shows that, for the same target condition number, both methods increase variances compared to the original matrix, with larger increases for ridge regression than for the minimum eigenvalue method. We prove that the ridge regression method strictly decreases the absolute value of off-diagonal correlations. Theoretical comparison of the impact of reconditioning and multiplicative variance inflation on the data assimilation objective function shows that variance inflation alters information across all scales uniformly, whereas reconditioning has a larger effect on scales corresponding to smaller eigenvalues. We then consider two examples: a general correlation function, and an observation error covariance matrix arising from interchannel correlations. The minimum eigenvalue method results in smaller overall changes to the correlation matrix than ridge regression, but can increase off-diagonal correlations. Data assimilation experiments reveal that reconditioning corrects spurious noise in the analysis but underestimates the true signal compared to multiplicative variance inflation.
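To make the two reconditioning methods concrete, here is a minimal numpy sketch assuming their standard formulations: ridge regression adds a scalar to the diagonal, and the minimum eigenvalue method raises the smallest eigenvalues to a floor. The function names, target condition number, and example matrix are illustrative, not taken from the paper.

```python
import numpy as np

def ridge_recondition(C, kappa):
    """Ridge regression reconditioning: add a scalar delta to the diagonal so
    the condition number of the result equals the target kappa (assumes kappa
    is below the current condition number, so delta > 0)."""
    lam = np.linalg.eigvalsh(C)                    # eigenvalues, ascending
    delta = (lam[-1] - kappa * lam[0]) / (kappa - 1.0)
    return C + delta * np.eye(C.shape[0])

def min_eig_recondition(C, kappa):
    """Minimum eigenvalue method: raise every eigenvalue below lam_max / kappa
    to that threshold, leaving the rest of the spectrum unchanged."""
    lam, V = np.linalg.eigh(C)
    lam_floored = np.maximum(lam, lam[-1] / kappa)
    return (V * lam_floored) @ V.T                 # V diag(lam_floored) V^T

# Example: recondition a sampled (hence poorly conditioned) covariance estimate.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 40))
C = np.cov(X, rowvar=False)                        # 40 x 40 sample covariance
for recondition in (ridge_recondition, min_eig_recondition):
    C_new = recondition(C, kappa=100.0)
    print(recondition.__name__, np.linalg.cond(C_new))  # ~100 for both
```

Consistent with the theory summarised in the abstract, the ridge approach shifts every eigenvalue (and hence every variance) upward by the same amount, whereas the minimum eigenvalue method leaves the part of the spectrum above the threshold untouched.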
Exosome-mediated shuttling of microRNA-29 regulates HIV Tat and morphine-mediated neuronal dysfunction.
Neuronal damage is a hallmark feature of HIV-associated neurological disorders (HANDs). Opiate drug abuse accelerates the incidence and progression of HAND; however, the mechanisms underlying the potentiation of neuropathogenesis by these drugs remain elusive. Opiates such as morphine have been shown to enhance HIV transactivation protein Tat-mediated toxicity in both human neurons and neuroblastoma cells. In the present study, we demonstrate reduced expression of the trophic factor platelet-derived growth factor (PDGF)-B with a concomitant increase in miR-29b in the basal ganglia region of the brains of morphine-dependent simian immunodeficiency virus (SIV)-infected macaques compared with the SIV-infected controls. The in vitro relevance of these findings was corroborated in cultures of astrocytes exposed to morphine and HIV Tat, which led to increased release of miR-29b in exosomes. Subsequent treatment of the neuronal SH-SY5Y cell line with exosomes from treated astrocytes resulted in decreased expression of PDGF-B, with a concomitant decrease in the viability of neurons. Furthermore, PDGF-B was shown to be a target of miR-29b, as evidenced by the fact that binding of miR-29 to the 3'-untranslated region of PDGF-B mRNA resulted in its translational repression in SH-SY5Y cells. Understanding the regulation of PDGF-B expression may provide insights into the development of potential therapeutic targets for neuronal loss in HIV-1-infected opiate abusers.
A dependent nominal type theory
Nominal abstract syntax is an approach to representing names and binding pioneered by Gabbay and Pitts. So far nominal techniques have mostly been studied using classical logic or model theory, not type theory. Nominal extensions to simple, dependent and ML-like polymorphic languages have been studied, but decidability and normalization results have only been established for simple nominal type theories. We present an LF-style dependent type theory extended with name-abstraction types, prove soundness and decidability of beta-eta-equivalence checking, discuss adequacy and canonical forms via an example, and discuss extensions such as dependently-typed recursion and induction principles.
Spectral structure and decompositions of optical states, and their applications
We discuss the spectral structure and decomposition of multi-photon states. Ordinarily 'multi-photon states' and 'Fock states' are regarded as synonymous. However, when the spectral degrees of freedom are included this is not the case, and the class of 'multi-photon' states is much broader than the class of 'Fock' states. We discuss the criteria for a state to be considered a Fock state. We then address the decomposition of general multi-photon states into bases of orthogonal eigenmodes, building on existing multi-mode theory, and introduce an occupation number representation that provides an elegant description of such states and in many situations simplifies calculations. Finally we apply this technique to several example situations that are highly relevant for state-of-the-art experiments. These include Hong-Ou-Mandel interference, spectral filtering, finite bandwidth photo-detection, homodyne detection and the conditional preparation of Schrödinger kitten and Fock states.
Comment: 12 pages
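As a small illustration of why spectral structure matters in the first of the listed examples, the sketch below computes the Hong-Ou-Mandel coincidence probability from the overlap of two single-photon spectral amplitudes, using the textbook relation P_cc = (1 - |<phi1|phi2>|^2) / 2 for a balanced beam splitter. The Gaussian mode shapes and all names are illustrative assumptions, not the paper's formalism.

```python
import numpy as np

# Frequency grid (arbitrary units) and Gaussian spectral amplitudes.
omega = np.linspace(-10.0, 10.0, 4001)
domega = omega[1] - omega[0]

def gaussian_mode(center, width):
    """Normalized Gaussian spectral amplitude phi(omega)."""
    phi = np.exp(-((omega - center) ** 2) / (4.0 * width ** 2))
    return phi / np.sqrt(np.sum(np.abs(phi) ** 2) * domega)

phi1 = gaussian_mode(center=0.0, width=1.0)
phi2 = gaussian_mode(center=0.5, width=1.0)   # slightly detuned second photon

# The mode overlap sets the HOM dip: identical modes give overlap 1,
# hence zero coincidence probability; distinguishable modes lift it.
overlap = np.sum(np.conj(phi1) * phi2) * domega
p_cc = 0.5 * (1.0 - np.abs(overlap) ** 2)
print(f"coincidence probability: {p_cc:.3f}")
```

Even a detuning of half a spectral width lifts the coincidence probability off zero, an effect that a description tracking only photon number, without spectral degrees of freedom, cannot capture.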
Local temperature and ecological similarity drive distributional dynamics of tropical mammals worldwide
Aim: Identifying the underlying drivers of species' distributional dynamics is critical for predicting change and managing biological diversity. While anthropogenic factors such as climate change can affect species distributions through time, other naturally occurring ecological processes can also have an influence. Theory predicts that interactions between species can influence distributional dynamics, yet empirical evidence remains sparse. A powerful approach is to monitor and model local colonization and extinction, the processes that generate change in distributions over time, and to identify their abiotic and biotic associations. Intensive camera-trap monitoring provides an opportunity to assess the role of temperature and species interactions in the colonization and extinction dynamics of tropical mammals, many of which are species of conservation concern. Using data from a pan-tropical monitoring network, we examined how short-term local temperature change and ecological similarity between species (a proxy for the strength of species interactions) influenced the processes that drive distributional shifts.
Location: Tropical forests worldwide.
Time period: 2007-2016.
Major taxa studied: Terrestrial mammals.
Methods: We used dynamic occupancy models to assess the influence of the abiotic and biotic environment on the distributional dynamics of 42 mammal populations from 36 species on 7 tropical elevation gradients around the world.
Results: Overall, temperature, ecological similarity, or both were linked to colonization or extinction dynamics in 29 populations. For six species, the effect of temperature depended upon the local mammal community similarity. This result suggests that the way in which temperature influences local colonization and extinction dynamics depends on local mammal community composition.
Main conclusions: These results indicate that varying temperatures influence tropical mammal distributions in surprising ways and suggest that interactions between species mediate distributional dynamics.
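For readers unfamiliar with the Methods above, the following sketch shows the colonization-extinction recursion at the heart of a dynamic occupancy model, with temperature entering both rates through a logit link. The coefficients, covariate values, and function names are illustrative assumptions, not the fitted multi-species model from this study.

```python
import numpy as np

def logit_inv(x):
    """Inverse logit (logistic) link."""
    return 1.0 / (1.0 + np.exp(-x))

def occupancy_trajectory(psi0, temperature, beta_gamma, beta_eps):
    """Propagate occupancy probability through time:
    psi[t+1] = psi[t] * (1 - eps[t]) + (1 - psi[t]) * gamma[t],
    with colonization (gamma) and extinction (eps) as logit-linear
    functions of a local temperature covariate."""
    psi = [psi0]
    for temp in temperature:
        gamma = logit_inv(beta_gamma[0] + beta_gamma[1] * temp)  # colonization
        eps = logit_inv(beta_eps[0] + beta_eps[1] * temp)        # extinction
        psi.append(psi[-1] * (1.0 - eps) + (1.0 - psi[-1]) * gamma)
    return np.array(psi)

# Example: a warming trend over 9 survey intervals (10 seasons, as in 2007-2016),
# with warming suppressing colonization and raising extinction (made-up values).
temps = np.linspace(0.0, 1.5, 9)   # standardized temperature anomaly per interval
psi = occupancy_trajectory(0.6, temps, beta_gamma=(-1.0, -0.8), beta_eps=(-2.0, 1.2))
print(np.round(psi, 3))            # declining occupancy probability over time
```

In the study's framing, a biotic covariate such as community similarity would enter the same linear predictors alongside temperature, which is how a temperature-by-community interaction of the kind reported above can be expressed.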
Mathematical practice, crowdsourcing, and social machines
The highest level of mathematics has traditionally been seen as a solitary endeavour, to produce a proof for review and acceptance by research peers. Mathematics is now at a remarkable inflexion point, with new technology radically extending the power and limits of individuals. Crowdsourcing pulls together diverse experts to solve problems; symbolic computation tackles huge routine calculations; and computers check proofs too long and complicated for humans to comprehend.
Mathematical practice is an emerging interdisciplinary field which draws on philosophy and social science to understand how mathematics is produced. Online mathematical activity provides a novel and rich source of data for empirical investigation of mathematical practice - for example the community question answering system mathoverflow contains around 40,000 mathematical conversations, and polymath collaborations provide transcripts of the process of discovering proofs. Our preliminary investigations have demonstrated the importance of "soft" aspects such as analogy and creativity, alongside deduction and proof, in the production of mathematics, and have given us new ways to think about the roles of people and machines in creating new mathematical knowledge. We discuss further investigation of these resources and what it might reveal.
Crowdsourced mathematical activity is an example of a "social machine", a new paradigm, identified by Berners-Lee, for viewing a combination of people and computers as a single problem-solving entity, and the subject of major international research endeavours. We outline a future research agenda for mathematics social machines, a combination of people, computers, and mathematical archives to create and apply mathematics, with the potential to change the way people do mathematics, and to transform the reach, pace, and impact of mathematics research.
Comment: To appear, Springer LNCS, Proceedings of Conferences on Intelligent Computer Mathematics, CICM 2013, July 2013, Bath, UK