Growth or decline in the Church of England during the decade of Evangelism: did the Churchmanship of the Bishop matter?
The Decade of Evangelism occupied the attention of the Church of England throughout the 1990s. The present study employs the statistics routinely published by the Church of England to assess two matters: the extent to which these statistics suggest that the 43 individual dioceses finished the decade in a stronger or weaker position than they entered it, and the extent to which the performance of dioceses led by bishops shaped in the Evangelical tradition differed from that of dioceses led by bishops shaped in the Catholic tradition. The data demonstrated that the majority of dioceses were performing less effectively at the end of the decade than at the beginning, across a range of membership statistics, and that the rate of decline varied considerably from one diocese to another. The only exception to the trend was the diocese of London, which experienced some growth. The data also demonstrated that little depended on the churchmanship of the diocesan bishop in shaping diocesan outcomes on the performance indicators employed in the study.
Internet searching produces misleading findings regarding violent deaths in crisis settings: short report
Donor and agency priorities are influenced by a variety of political, social, and media-related forces that can have a profound impact on response and resource provision. We attempted to assess how well internet searches articulate the span of violent death rates for five current "crisis" settings. In three graduate classes (two in public health, one in information science) at US universities, during a four-month period in 2017-2018, we asked approximately 60 graduate students to conduct an internet search to determine which of five countries had the highest and lowest "violence-specific mortality rate": Venezuela, Syria, Yemen, the Central African Republic (CAR), or Mali. Students were divided into groups of three, and within each group explored this question by three approaches. Many graduate students in all groups could not determine the relative rates, especially which country had the lowest violence-specific mortality rate. Of the 34 searches that identified a country with the highest violent death rate, 27.5 (81%) concluded it was Venezuela, followed by Syria (4.5; 13%), Mali (1; 3%) and CAR (1; 3%). Of the 26 searches that identified a country with the lowest rate, 21.5 (83%) reported either CAR or Mali, followed by Yemen (2.5; 10%) and Syria (2; 8%). Aside from the lack of data on CAR and Mali, students were perplexed about whether to include suicides or executions in the measure. As a result, almost half of all inquiries were unable to identify a highest and lowest rate among these five countries. Where conclusions were drawn, it is likely the internet drew students to the opposite conclusion from reality. There are several reasons for this discordance, such as differing categories of violent deaths as defined by the World Health Organization, and search engine algorithms. It is probable, however, that larger issues of the connectivity of individual societies with each other and the outside world play a profound role in the deceptive results found in this exercise.
This insight emphasizes the internet's under-reporting in the world's poorest and most remote locations, and highlights the importance of primary data collection and reporting in such settings.
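The percentages reported in the abstract follow from simple tallies of conclusive searches; a minimal sketch using the counts above (the half-counts presumably arise when a group's answer was split between two countries, which is an assumption on our part):

```python
# Counts of conclusive student searches, taken from the abstract.
# Half-counts are assumed to represent split answers within a group.
highest = {"Venezuela": 27.5, "Syria": 4.5, "Mali": 1, "CAR": 1}
lowest = {"CAR or Mali": 21.5, "Yemen": 2.5, "Syria": 2}

def shares(tally):
    """Return each answer's share of conclusive searches, as whole percentages."""
    total = sum(tally.values())
    return {answer: round(100 * n / total) for answer, n in tally.items()}

print(shares(highest))  # Venezuela dominates at 81% of the 34 conclusive searches
print(shares(lowest))   # CAR or Mali account for 83% of the 26 conclusive searches
```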
Commentary: Ensuring health statistics in conflict are evidence-based
The author argues that measuring mortality in conflict settings is fraught with limitations that mostly result in under-estimation of mortality. Some recent publications on this subject have been based upon convenient surveillance processes, or even press reports. The author calls for vigilance against such studies and argues that war-related, surveillance-based mortality estimates should include measures of sensitivity and representativeness.
Reporting Iraqi civilian fatalities in a time of war
In February, 2007, the Associated Press (AP) conducted a poll of 1,002 adults in the United States about their attitudes towards the war in Iraq. Respondents were remarkably accurate in estimating the current death toll of US soldiers, yet grossly inaccurate in estimating the current death toll of Iraqi civilians. We conducted a search of newspaper reports to determine the extent of the discrepancy between reporting of Coalition and Iraqi civilian deaths, hypothesizing that Coalition deaths would be over-represented compared to Iraqi civilian deaths. We examined 11 U.S. newspapers and 5 non-U.S. newspapers using electronic databases or newspaper web-archives, recording any reports between March 2003 and March 2008 of Coalition and Iraqi deaths that included a numeric indicator. Reports were classified as "events" where they described a specific occurrence involving fatalities and "tallies" where they mentioned the number of deaths over a period of time. We recorded the number of events and tallies related to Coalition deaths, Iraqi civilian deaths, and Iraqi combatant deaths. U.S. newspapers reported more events and tallies related to Coalition deaths than Iraqi civilian deaths, although the proportions differed substantially among the U.S. newspapers. In four of the five non-U.S. newspapers, the pattern was reversed. This difference in reporting trends may partly explain the discrepancy in how well people are informed about U.S. and Iraqi civilian fatalities in Iraq. Furthermore, it calls into question the role of the media in reporting and sustaining armed conflict, and the extent to which newspaper and other media reports can be used as data to assess fatalities or trends in time of war.
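The event/tally distinction above could be approximated with a keyword heuristic; the sketch below is hypothetical (the keyword list and function are our own, not the authors' coding protocol):

```python
import re

# Hypothetical heuristic for the paper's two report categories: an "event"
# describes a specific occurrence with fatalities, while a "tally" sums
# deaths over a period of time. Time-span wording signals a tally.
PERIOD_WORDS = re.compile(
    r"\b(since|so far|to date|this (year|month|week)|in total)\b", re.I
)

def classify_report(text: str) -> str:
    """Label a fatality report as a 'tally' if it references a time span,
    otherwise treat it as a single 'event'."""
    return "tally" if PERIOD_WORDS.search(text) else "event"

print(classify_report("A car bomb killed 12 people in Baghdad on Tuesday."))  # event
print(classify_report("3,972 U.S. soldiers have died since the war began."))  # tally
```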
Quark Mass Textures and sin 2 beta
Recent precise measurements of sin 2 beta from the B-factories (BABAR and BELLE) and a better known strange quark mass from lattice QCD make precision tests of predictive texture models possible. The models tested include those hierarchical N-zero textures classified by Ramond, Roberts and Ross, as well as any other hierarchical matrix Ansatz with non-zero 12 = 21 and vanishing 11 and 13 elements. We calculate the maximally allowed value for sin 2 beta in these models and show that all the aforementioned models with vanishing 11 and 13 elements are ruled out at the 3 sigma level. While at present sin 2 beta and |Vub/Vcb| are equally good for testing N-zero texture models, in the near future the former will surpass the latter in constraining power.
Comment: 1+20 pages, 2 figures, JHEP3 class
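Schematically, the class of Ansätze described above (a non-zero, symmetric 12 = 21 entry and vanishing 11 and 13 entries) takes the form below; the notation is ours, and the precise hierarchy among the remaining entries varies between the specific models tested:

```latex
\[
M \;\sim\;
\begin{pmatrix}
0            & \lambda_{12} & 0            \\
\lambda_{12} & \lambda_{22} & \lambda_{23} \\
0            & \lambda_{32} & \lambda_{33}
\end{pmatrix},
\qquad
|\lambda_{12}| \ll |\lambda_{22}|,\, |\lambda_{23}| \ll |\lambda_{33}| ,
\]
```

with texture zeros in the 11 and 13 positions and the hierarchy growing toward the 33 entry.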
Punctuated equilibria and 1/f noise in a biological coevolution model with individual-based dynamics
We present a study by linear stability analysis and large-scale Monte Carlo simulations of a simple model of biological coevolution. Selection is provided through a reproduction probability that contains quenched, random interspecies interactions, while genetic variation is provided through a low mutation rate. Both selection and mutation act on individual organisms. Consistent with some current theories of macroevolutionary dynamics, the model displays intermittent, statistically self-similar behavior with punctuated equilibria. The probability density for the lifetimes of ecological communities is well approximated by a power law with exponent near -2, and the corresponding power spectral densities show 1/f noise (flicker noise) over several decades. The long-lived communities (quasi-steady states) consist of a relatively small number of mutualistically interacting species, and they are surrounded by a "protection zone" of closely related genotypes that have a very low probability of invading the resident community. The extent of the protection zone affects the stability of the community in a way analogous to the height of the free-energy barrier surrounding a metastable state in a physical system. Measures of biological diversity are on average stationary with no discernible trends, even over our very long simulation runs of approximately 3.4x10^7 generations.
Comment: 20 pages RevTex. Minor revisions consistent with published version
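A lifetime distribution with exponent near -2 of the kind reported above can be checked numerically; a minimal sketch on synthetic data (not the paper's model) that draws lifetimes from P(tau) ~ tau^(-2) by inverse-transform sampling and recovers the exponent with a maximum-likelihood (Hill) estimator:

```python
import math
import random

random.seed(42)

# Draw lifetimes from P(tau) ~ tau^(-2) for tau >= 1 via inverse-transform
# sampling: the CDF is F(tau) = 1 - 1/tau, so tau = 1/(1 - u) for uniform u.
n = 50_000
lifetimes = [1.0 / (1.0 - random.random()) for _ in range(n)]

# Hill (maximum-likelihood) estimator of alpha in P(tau) ~ tau^(-alpha)
# with tau_min = 1: alpha = 1 + n / sum(ln tau_i).
alpha_hat = 1.0 + n / sum(math.log(t) for t in lifetimes)
print(f"estimated exponent: -{alpha_hat:.3f}")  # close to -2
```

The same estimator applied to measured community lifetimes would test how well the power law holds over the simulated range.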
Photometric Monitoring of Open Clusters I. The Survey
Open clusters, which have age, abundance, and extinction information from studies of main-sequence turn-off stars, are the ideal location in which to determine the mass-luminosity-radius relation for low-mass stars. We have undertaken a photometric monitoring survey of open clusters in the Galaxy designed to detect low-mass eclipsing binary systems through variations in their relative light curves. Our aim is to provide an improved calibration of the mass-luminosity-radius relation for low-mass stars and brown dwarfs, to test stellar structure and evolution models, and to help quantify the contribution of low-mass stars to the global mass census in the Galaxy. In this paper we present our survey, describing the data and outlining the analysis techniques. We study six nearby open clusters, with a range of ages up to 4 Gyr and metallicities from approximately solar to -0.2 dex. We monitor a field of view of > 1 square degree per target cluster, well beyond the characteristic cluster radius, over timescales of hours, days, and months, with a sampling rate optimised for the detection of eclipsing binaries with periods of hours to days. Our survey depth is designed to detect eclipse events in a binary with a primary star of ≲ 0.3 M_sun. Our data reach mmag-level photometric precision.
Comment: 50 pages, 18 figures, accepted for publication in A
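Detecting an eclipsing binary in monitoring data of this kind amounts to phase-folding the light curve on a trial period and looking for a recurring dip; a minimal sketch on synthetic data (the period, eclipse depth, and binning here are illustrative assumptions, not the survey's actual pipeline):

```python
import random

random.seed(1)

# Synthetic light curve: flat at 1.0 with small noise, plus a 2% eclipse
# recurring with an assumed 0.5-day period between phases 0.40 and 0.45.
true_period = 0.5  # days, assumed for illustration
times = [random.uniform(0.0, 30.0) for _ in range(5000)]

def flux(t):
    phase = (t % true_period) / true_period
    dip = 0.02 if 0.40 < phase < 0.45 else 0.0
    return 1.0 - dip + random.gauss(0.0, 0.003)

fluxes = [flux(t) for t in times]

def folded_min_bin(times, fluxes, period, nbins=20):
    """Phase-fold on a trial period, bin the fluxes, and return the index
    and mean flux of the faintest bin -- a candidate eclipse phase."""
    sums = [0.0] * nbins
    counts = [0] * nbins
    for t, f in zip(times, fluxes):
        b = int(((t % period) / period) * nbins) % nbins
        sums[b] += f
        counts[b] += 1
    means = [sums[i] / counts[i] for i in range(nbins)]
    faintest = min(range(nbins), key=means.__getitem__)
    return faintest, means[faintest]

bin_idx, depth = folded_min_bin(times, fluxes, true_period)
print(bin_idx, round(depth, 3))  # faintest bin sits at the eclipse phase (~0.4)
```

In practice the fold would be repeated over a grid of trial periods, keeping the period that produces the most significant dip.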
The structure of the PapD-PapGII pilin complex reveals an open and flexible P5 pocket
P pili are hairlike polymeric structures that mediate binding of uropathogenic Escherichia coli to the surface of the kidney via the PapG adhesin at their tips. PapG is composed of two domains: a lectin domain at the tip of the pilus followed by a pilin domain that comprises the initial polymerizing subunit of the 1,000-plus-subunit heteropolymeric pilus fiber. Prior to assembly, periplasmic pilin domains bind to a chaperone, PapD. PapD mediates donor strand complementation, in which a beta strand of PapD temporarily completes the pilin domain's fold, preventing premature, nonproductive interactions with other pilin subunits and facilitating subunit folding. Chaperone-subunit complexes are delivered to the outer membrane usher where donor strand exchange (DSE) replaces PapD's donated beta strand with an amino-terminal extension on the next incoming pilin subunit. This occurs via a zip-in-zip-out mechanism that initiates at a relatively accessible hydrophobic space termed the P5 pocket on the terminally incorporated pilus subunit. Here, we solve the structure of PapD in complex with the pilin domain of isoform II of PapG (PapGIIp). Our data revealed that PapGIIp adopts an immunoglobulin fold with a missing seventh strand, complemented in parallel by the G1 PapD strand, typical of pilin subunits. Comparisons with other chaperone-pilin complexes indicated that the interactive surfaces are highly conserved. Interestingly, the PapGIIp P5 pocket was in an open conformation, which, as molecular dynamics simulations revealed, switches between an open and a closed conformation due to the flexibility of the surrounding loops. Our study reveals the structural details of the DSE mechanism
Streamlining Ground Station Network Compatibility Test for Small Satellites
A team of eight subject matter experts at NASA Goddard Space Flight Center (GSFC) completed a Lean Six Sigma project to identify process improvements for the compatibility test process for small satellites planning to use the NASA Near Earth Network (NEN). Ground station network compatibility testing is designed to reduce the risk to missions by resolving issues between the spacecraft's flight communication and navigation components and the ground systems prior to launch. Compatibility testing, which consists of a series of tests performed over a period of months and documented in reports, is an important step meant to prevent post-launch anomalies that could lead to expensive troubleshooting or mission failure. Compared to traditional missions, small satellite missions typically have smaller budgets and compressed schedules, which can result in small satellite projects' willingness to accept the risk associated with less comprehensive compatibility testing. Optimization or refinement of the compatibility test process for small satellite missions could alleviate some of the pressures inherent in these factors. The goal of the Lean Six Sigma project was to develop alternative, scalable methods of compatibility testing for small satellites. The Lean Six Sigma approach and the results of the project are reviewed in this paper.
Magnetic intensity loss and core diagenesis in long-core samples from the East Cortez Basin and the San Nicolas Basin (California Borderland)