Join Up, Scale Up: How Integration Can Defeat Disease and Poverty
This report addresses three core areas essential to achieving the MDGs: primary healthcare, clean water and sanitation, and nutrition. It highlights examples from 17 countries of how bringing different development approaches together (i.e., integration) is helping to tackle poverty and disease, and it calls on the international community, including donor and developing-country governments, to prioritize and invest in these joined-up programs. The experiences and lessons learned from the case studies described in this report offer real-world examples of how to make integration work and why it is so important to do so.
Scale-up of a system for hydrocarbon production by electrochemical reduction of CO2
This work addresses the scale-up of a system for the electrochemical reduction of CO2 to hydrocarbons that can be used as fuel in a regenerative energy-storage cycle. The challenges involved in such a task are discussed. Scale-up results for a system based on high-surface-area electrodes with modified copper deposits are described. Current densities around 100 mA/cm2 were obtained, which corresponds to the current-density threshold that enables technological applications. At potentials as negative as -1.6 V, CO2 reduction still dominated over the hydrogen evolution reaction.
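As a back-of-the-envelope illustration of what a ~100 mA/cm2 threshold means for production (a rough sketch under assumptions of our own, not a calculation from the paper), Faraday's law converts current into a molar production rate; here we assume CO2 is reduced all the way to methane (8 electrons per molecule) at ideal Faradaic efficiency:

    # Illustrative only: production rate implied by a current density via
    # Faraday's law, assuming CO2 -> CH4 (8 e- per molecule, a common target
    # product) and 100% Faradaic efficiency. Not taken from the paper.

    F = 96485.0  # Faraday constant, C per mol of electrons

    def production_rate(current_density_mA_cm2, area_cm2,
                        electrons_per_molecule=8, faradaic_efficiency=1.0):
        """Return molar production rate (mol/s) for a given electrode."""
        current_A = current_density_mA_cm2 * 1e-3 * area_cm2
        return faradaic_efficiency * current_A / (electrons_per_molecule * F)

    # 100 mA/cm2 over a hypothetical 100 cm2 electrode:
    rate = production_rate(100.0, 100.0)
    print(f"{rate:.2e} mol CH4/s, {rate * 86400:.2f} mol CH4/day")
    # -> about 1.3e-05 mol/s, roughly 1.1 mol of methane per day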
Estimating population size using the network scale up method
We develop methods for estimating the size of hard-to-reach populations from data collected using network-based questions on standard surveys. Such data arise by asking respondents how many people they know in a specific group (e.g., people named Michael, intravenous drug users). The Network Scale-up Method (NSUM) is a tool for producing population size estimates using these indirect measures of respondents' networks. Killworth et al. [Soc. Netw. 20 (1998a) 23-50; Evaluation Review 22 (1998b) 289-308] proposed maximum likelihood estimators of population size for a fixed effects model in which respondents' degrees or personal network sizes are treated as fixed. We extend this by treating personal network sizes as random effects, yielding principled statements of uncertainty. This allows us to generalize the model to account for variation in people's propensity to know people in particular subgroups (barrier effects), such as their tendency to know people like themselves, as well as their lack of awareness of or reluctance to acknowledge their contacts' group memberships (transmission bias). NSUM estimates also suffer from recall bias, in which respondents tend to underestimate the number of members of larger groups that they know, and conversely for smaller groups. We propose a data-driven adjustment method to deal with this. Our methods perform well in simulation studies, generating improved estimates and calibrated uncertainty intervals, as well as in back estimates of real sample data. We apply them to data from a study of HIV/AIDS prevalence in Curitiba, Brazil. Our results show that when transmission bias is present, external information about its likely extent can greatly improve the estimates. The methods are implemented in the NSUM R package.

Comment: Published at http://dx.doi.org/10.1214/15-AOAS827 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
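As a point of reference for the estimators being extended (a minimal sketch of the classical Killworth fixed-degree estimator, not the paper's random-effects model and not the API of the NSUM R package), the scale-up logic can be written in a few lines:

    # Classical network scale-up estimator (Killworth et al. style), sketched
    # for illustration; variable names and the toy data are hypothetical.
    import numpy as np

    def killworth_estimate(y_hidden, y_known, known_sizes, N):
        """
        y_hidden:    (n,)   reported contacts in the hidden group, per respondent
        y_known:     (n, K) reported contacts in K groups of known size
        known_sizes: (K,)   true sizes of those K reference groups
        N:           total population size
        """
        # Degree estimate per respondent: d_i = N * sum_k y_ik / sum_k N_k
        degrees = N * y_known.sum(axis=1) / known_sizes.sum()
        # Scale-up estimate: N_hidden = N * sum_i y_i / sum_i d_i
        return N * y_hidden.sum() / degrees.sum()

    # Toy simulation: 500 respondents with average degree 300 in a city of
    # 10 million, hidden group of true size 20,000.
    rng = np.random.default_rng(0)
    N, d, true_hidden = 10_000_000, 300, 20_000
    known_sizes = np.array([50_000, 120_000, 30_000])
    y_known = rng.poisson(d * known_sizes / N, size=(500, 3))
    y_hidden = rng.poisson(d * true_hidden / N, size=500)
    print(f"estimate: {killworth_estimate(y_hidden, y_known, known_sizes, N):,.0f}")

In this fixed-degree baseline, the calibrated uncertainty statements the abstract mentions are exactly what is missing; the paper's contribution is to treat the degrees as random effects and to adjust for barrier, transmission, and recall biases.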
Preferred antiretroviral drugs for the next decade of scale up
Global commitments aim to provide antiretroviral therapy (ART) to 15 million people living with HIV by 2015, and recent studies have demonstrated the potential for widespread ART to prevent HIV transmission. Increasingly, countries are adapting their national guidelines to start ART earlier, for both clinical and preventive benefit. To maximize the benefits of ART in resource-limited settings, six key principles need to guide ART choice: simplicity, tolerability and safety, durability, universal applicability, affordability, and heat stability. Currently available drugs, combined with those in late-stage clinical development, hold great promise to simplify treatment in the short term. Over the longer term, newer technologies, such as long-acting formulations and nanotechnology, could radically alter the treatment paradigm. This commentary reviews recommendations made in an expert consultation on treatment scale-up in resource-limited settings.
Models of LHC Diphoton Excesses Valid up to the Planck scale
We discuss a possibility to explain the LHC diphoton excesses at 750 GeV by a new scalar that couples to the gauge bosons through loops of new massive particles with Standard Model charges. We assume that the new particles decay into Standard Model particles at tree level. We systematically examine the models that preserve vacuum stability and perturbativity up to the Planck scale. When we take scalars as the new particles, we find that only a few diquark and dilepton models can explain the observed diphoton cross section without conflicting with the experimental mass bounds. When we take vector-like fermions as the new particles, we find rather different situations depending on whether their couplings to the new scalar are of scalar or pseudoscalar type. In the former case, a few models are allowed if we introduce only one species of fermions; the more fermion species we introduce, the more models are allowed. In the latter case, most of the models are allowed because of the large coupling between the new scalar and the photon. It is interesting that the allowed mass regions of the scalar particles might be reached by next-generation lepton colliders.

Comment: 35 pages, 4 figures; the maximum values for the scalar extensions are recalculated, typos corrected, references added (v2); references added, typos corrected, version to appear in PRD (v3).
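For context (a standard textbook statement of the criteria named in the abstract, not this paper's model-specific conditions), demanding vacuum stability and perturbativity up to the Planck scale means requiring, for every coupling evolved by its renormalization group equation,

    \lambda(\mu) > 0 \quad \text{(stability)}, \qquad
    |\lambda(\mu)| < 4\pi \quad \text{(perturbativity)}, \qquad
    \text{for all } \mu \lesssim M_{\mathrm{Pl}} \approx 1.2 \times 10^{19}\ \mathrm{GeV},

where \lambda stands for the scalar quartic couplings and analogous 4\pi bounds are imposed on the Yukawa and gauge couplings. Large new couplings, such as those needed to produce the diphoton rate, tend to drive \lambda negative or non-perturbative below M_{\mathrm{Pl}}, which is what restricts the set of allowed models.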
- …
