The mechanism and kinetics of methyl isobutyl ketone synthesis from acetone over ion-exchanged hydroxyapatite
The synthesis of methyl isobutyl ketone (MIBK) can be carried out by the condensation of acetone in the presence of hydrogen over a supported metal catalyst. Previous studies have shown that hydroxyapatite is an excellent catalyst for condensation reactions. The present investigation was undertaken to elucidate the reaction mechanism and site requirements for acetone coupling to MIBK over a physical mixture of hydroxyapatite and Pd/SiO2. The reaction is found to proceed by consecutive aldol addition to form diacetone alcohol (DAA), dehydration of DAA to mesityl oxide (MO), and hydrogenation of MO to MIBK. The products formed by feeding DAA and MO reveal that aldol addition of acetone is rapid and reversible, and that the subsequent dehydration of DAA is rate-limiting. Pyridine and CO2 titrations show that aldol dehydration occurs over basic sites via an E1cB mechanism. A series of cation-substituted hydroxyapatite samples was prepared by ion exchange to further investigate the role of acid-base strength in catalyst performance. Characterization of these samples by PXRD, BET, ICP-OES, XPS, CO2-TPD, and Raman spectroscopy demonstrated that the exchange procedure used does not affect the bulk properties of hydroxyapatite. DFT calculations reveal that, in addition to affecting the Lewis acidity/basicity of the support, the size of the cation plays a significant role in the chemistry: cations that are too large (Ba2+) or too small (Mg2+) adversely affect reaction rates due to excessive stabilization of intermediate species. Strontium-exchanged hydroxyapatite was found to be the most active catalyst because it promoted α-hydrogen abstraction and C–O bond cleavage of DAA efficiently.
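As a rough illustration of the rate-limiting step identified above, the consecutive scheme acetone ⇌ DAA → MO → MIBK can be integrated numerically. All rate constants below are hypothetical, chosen only so that aldol addition is fast and reversible and DAA dehydration is slow, mirroring the abstract's conclusions; this is a toy sketch, not the paper's kinetic model.

```python
# Toy kinetic model (hypothetical rate constants) of the consecutive scheme:
# 2 acetone <-> DAA (fast, reversible) -> MO (slow, rate-limiting) -> MIBK (fast).
def simulate(k_aldol=10.0, k_rev=5.0, k_dehyd=0.1, k_hydro=10.0,
             dt=1e-3, steps=20000):
    """Forward-Euler integration of the reaction network; concentrations
    in arbitrary units, starting from pure acetone."""
    acetone, daa, mo, mibk = 1.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        r_aldol = k_aldol * acetone**2 - k_rev * daa  # fast, reversible aldol
        r_dehyd = k_dehyd * daa                       # slow: rate-limiting
        r_hydro = k_hydro * mo                        # fast hydrogenation
        acetone += (-2 * r_aldol) * dt
        daa     += (r_aldol - r_dehyd) * dt
        mo      += (r_dehyd - r_hydro) * dt
        mibk    += r_hydro * dt
    return acetone, daa, mo, mibk
```

Because hydrogenation is fast, MO stays at a low quasi-steady concentration and overall MIBK production is throttled by the dehydration step, as the feeding experiments indicate.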
Evidence of a high incidence of subclinically affected calves in a herd of cattle with fatal cases of Bovine Neonatal Pancytopenia (BNP).
BACKGROUND: Bovine Neonatal Pancytopenia (BNP) is a disease of calves characterised by bone marrow trilineage hypoplasia, mediated by ingestion of alloantibodies in colostrum. Suspected subclinical forms of BNP have been reported, suggesting that observed clinical cases may not represent the full extent of the disease. However, to date there are no objective data available on the incidence of subclinical disease or its temporal distribution. This study aimed to 1) ascertain whether subclinical BNP occurs and, if so, to determine the incidence on an affected farm, and 2) determine whether there is evidence of temporal clustering of BNP cases on this farm. To achieve these aims, haematological screening of calves born on the farm during one calving season was carried out, utilising blood samples collected at defined ages. These data were then analysed in comparison to data from both known BNP-free control animals and histopathologically confirmed BNP cases. An ordinal logistic regression model was used to create a composite haematology score to predict the probability of a calf being normal, based on its haematology measurements at 10–14 days old. RESULTS: This study revealed that 15% (21 of 139) of the clinically normal calves on this farm had profoundly abnormal haematology (<5% chance of being normal) and could be defined as affected by subclinical BNP. Together with clinical BNP cases, this gave the study farm a BNP incidence of 18%. Calves with BNP were found to be distributed throughout the calving period, with no clustering, and no significant differences in the date of birth of cases or subclinical cases were found compared to the rest of the calves.
This study did not find any evidence of increased mortality or increased time from birth to sale in subclinical BNP calves, but, as the study involved only a single farm and adverse effects may be modulated by other intercurrent diseases, it remains possible that subclinical BNP has a detrimental impact on the health and productivity of calves under certain circumstances. CONCLUSIONS: Subclinical BNP was found to occur at a high incidence in a herd of cattle with fatal cases of BNP.
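The screening logic above (predict a calf's probability of being normal from its haematology, flag <5% as subclinical) can be sketched as follows. The study fitted an ordinal logistic regression; this minimal illustration uses a plain logistic score with entirely invented coefficients and predictor variables, only to show how a probability cut-off turns continuous measurements into a subclinical classification.

```python
import math

# Hypothetical composite haematology score. The real model was an ordinal
# logistic regression on the study's haematology panel; the two predictors,
# weights, and intercept here are invented for illustration.
COEFS = {"platelets": 0.015, "leukocytes": 0.4}  # hypothetical weights
INTERCEPT = -8.0                                  # hypothetical

def prob_normal(platelets_10e9_l, leukocytes_10e9_l):
    """Predicted probability that a calf is haematologically normal."""
    z = (INTERCEPT
         + COEFS["platelets"] * platelets_10e9_l
         + COEFS["leukocytes"] * leukocytes_10e9_l)
    return 1.0 / (1.0 + math.exp(-z))

def is_subclinical_bnp(platelets, leukocytes, cutoff=0.05):
    """Flag a clinically normal calf as subclinical BNP if its predicted
    probability of being normal falls below the 5% cut-off."""
    return prob_normal(platelets, leukocytes) < cutoff
```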
Zeolite-Catalyzed Isobutene Amination: Mechanism and Kinetics
Amination of isobutene with NH3 was investigated over Brønsted acidic zeolites at 1 atm and 453-483 K. To compare catalytic activities over different zeolites, the measured reaction rates are normalized by the number of active sites determined by tert-butylamine temperature-programmed desorption (TPD). Small- and medium-pore zeolites with one-dimensional channels exhibit low activity because of pore blockage by adsorbed tert-butylammonium ions. However, turnover frequencies and activation energies are not sensitive to framework identity, as long as the active site is accessible to isobutene and tert-butylamine. Kinetic measurements and FTIR spectroscopy reveal that the Brønsted acid sites in MFI are covered predominantly with tert-butylammonium ions under reaction conditions. The desorption of tert-butylamine is assisted by the concurrent adsorption of isobutene. DFT simulations show that at very low tert-butylamine partial pressures, for example, at the inlet to the reactor, tert-butylamine desorption is rate-limiting. However, at sufficiently high tert-butylamine partial pressures (>0.03 kPa), protonation of isobutene to the corresponding carbenium ion limits the rate of amination.
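The site-normalization step described above is simple arithmetic: divide the measured rate per gram of catalyst by the acid-site density obtained from tert-butylamine TPD to get a turnover frequency. A minimal sketch, with illustrative (not measured) numbers:

```python
# Site normalisation: rate per gram divided by acid sites per gram gives a
# turnover frequency (TOF). Values below are illustrative, not from the paper.
def turnover_frequency(rate_mol_per_g_s, sites_mol_per_g):
    """TOF in s^-1."""
    return rate_mol_per_g_s / sites_mol_per_g

rate = 2.0e-8   # mol isobutene converted / (g catalyst * s), hypothetical
sites = 4.0e-4  # mol Bronsted acid sites / g, from t-BuA TPD, hypothetical
tof = turnover_frequency(rate, sites)  # -> 5e-5 s^-1
```

Normalizing this way is what lets turnover frequencies be compared across frameworks with very different site densities.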
Temporal influence over the Last.fm social network
In a previous result, we showed that the influence of social contacts spreads information about new artists through the Last.fm social network. We successfully decomposed influence from the effects of trends, global popularity, and homophily or the shared environment of friends. In this paper, we present new experiments that use a mathematically sound formula for defining and measuring influence in the network. We provide new baseline and influence models and evaluation measures, both batch and online, for real-time recommendations with very strong temporal aspects. Our experiments are carried out over the 2-year “scrobble” history of 70,000 Last.fm users. In our results, we formally define and distil the effect of social influence. © 2015, Springer-Verlag Wien
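The basic event being measured above, a user first playing an artist shortly after a friend first played the same artist, can be sketched in a few lines. This is an illustrative counting scheme, not the paper's actual influence formula, and the 14-day window is an arbitrary choice:

```python
from datetime import datetime, timedelta

# Illustrative sketch (not the paper's measure): count candidate influence
# events, i.e. a user's first scrobble of an artist within a fixed window
# after a friend's first scrobble of the same artist.
def influence_events(first_play, friends, window_days=14):
    """first_play: {(user, artist): datetime of first scrobble};
    friends: {user: set of that user's friends}."""
    window = timedelta(days=window_days)
    events = 0
    for (user, artist), t in first_play.items():
        for friend in friends.get(user, ()):
            t_friend = first_play.get((friend, artist))
            if t_friend is not None and t_friend < t <= t_friend + window:
                events += 1
                break  # count each adoption at most once
    return events
```

Separating such events from trend, popularity, and homophily effects is exactly the decomposition problem the abstract describes.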
Why is it difficult to implement e-health initiatives? A qualitative study
Background: The use of information and communication technologies in healthcare is seen as essential for high quality and cost-effective healthcare. However, implementation of e-health initiatives has often been problematic, with many failing to demonstrate predicted benefits. This study aimed to explore and understand the experiences of implementers - the senior managers and other staff charged with implementing e-health initiatives - and their assessment of factors which promote or inhibit the successful implementation, embedding, and integration of e-health initiatives.
Methods: We used a case study methodology, with semi-structured interviews with implementers for data collection. Case studies were selected to provide a range of healthcare contexts (primary, secondary, and community care), e-health initiatives, and degrees of normalization. The initiatives studied were a Picture Archiving and Communication System (PACS) in secondary care, a Community Nurse Information System (CNIS) in community care, and Choose and Book (C&B) across the primary-secondary care interface. Implementers were selected to provide a range of seniority, including chief executive officers, middle managers, and staff with 'on the ground' experience. Interview data were analyzed using a framework derived from Normalization Process Theory (NPT).
Results: Twenty-three interviews were completed across the three case studies. There were wide differences in experiences of implementation and embedding across these case studies; these differences were well explained by the collective action components of NPT. New technology was most likely to 'normalize' where implementers perceived that it had a positive impact on interactions between professionals and patients and between different professional groups, and fit well with the organisational goals and skill sets of existing staff. However, where implementers perceived problems in one or more of these areas, they also perceived a lower level of normalization.
Conclusions: Implementers had rich understandings of barriers and facilitators to successful implementation of e-health initiatives, and their views should continue to be sought in future research. NPT can be used to explain observed variations in implementation processes, and may be useful in drawing planners' attention to potential problems with a view to addressing them during implementation planning.
The Hubble Constant
I review the current state of determinations of the Hubble constant, which gives the length scale of the Universe by relating the expansion velocity of objects to their distance. There are two broad categories of measurements. The first uses individual astrophysical objects which have some property that allows their intrinsic luminosity or size to be determined, or allows the determination of their distance by geometric means. The second category comprises the use of the all-sky cosmic microwave background, or correlations between large samples of galaxies, to determine information about the geometry of the Universe and hence the Hubble constant, typically in combination with other cosmological parameters. Many, but not all, object-based measurements give values of around 72-74 km/s/Mpc, with typical errors of 2-3 km/s/Mpc. This is in mild discrepancy with CMB-based measurements, in particular those from the Planck satellite, which give values of 67-68 km/s/Mpc and typical errors of 1-2 km/s/Mpc. The size of the remaining systematics indicates that accuracy rather than precision is the remaining problem in a good determination of the Hubble constant. Whether a discrepancy exists, and whether new physics is needed to resolve it, depends on details of the systematics of the object-based methods, and also on the assumptions about other cosmological parameters and which datasets are combined in the case of the all-sky methods.
Comment: Extensively revised and updated since the 2007 version; accepted by Living Reviews in Relativity as a major (2014) update of LRR 10, 4, 200
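The "mild discrepancy" quoted above can be checked arithmetically: divide the gap between the two central values by the quadrature sum of their errors. The specific inputs below are illustrative mid-points of the ranges in the text, assuming independent Gaussian uncertainties:

```python
import math

# Significance of the gap between an object-based value (~73 +/- 2.5 km/s/Mpc)
# and a CMB-based value (~67.5 +/- 1.5 km/s/Mpc), assuming independent
# Gaussian errors. Inputs are illustrative mid-points of the quoted ranges.
def tension_sigma(h1, e1, h2, e2):
    return abs(h1 - h2) / math.sqrt(e1**2 + e2**2)

sigma = tension_sigma(73.0, 2.5, 67.5, 1.5)  # roughly 2 sigma
```

A gap of about two standard deviations is consistent with the review's characterization of the tension as mild rather than decisive.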
Developing a digital intervention for cancer survivors: an evidence-, theory- and person-based approach
This paper illustrates a rigorous approach to developing digital interventions using an evidence-, theory- and person-based approach. Intervention planning included a rapid scoping review which identified cancer survivors’ needs, including barriers and facilitators to intervention success. Review evidence (N=49 papers) informed the intervention’s Guiding Principles, theory-based behavioural analysis and logic model. The intervention was optimised based on feedback on a prototype intervention through interviews (N=96) with cancer survivors and focus groups with NHS staff and cancer charity workers (N=31). Interviews with cancer survivors highlighted barriers to engagement, such as concerns about physical activity worsening fatigue. Focus groups highlighted concerns about support appointment length and how to support distressed participants. Feedback informed intervention modifications to maximise acceptability, feasibility and likelihood of behaviour change. Our systematic method for understanding user views enabled us to anticipate and address important barriers to engagement. This methodology may be useful to others developing digital interventions.
Reduced Diversity and High Sponge Abundance on a Sedimented Indo-Pacific Reef System: Implications for Future Changes in Environmental Quality
Although coral reef health across the globe is declining as a result of anthropogenic impacts, relatively little is known of how environmental variability influences reef organisms other than corals and fish. Sponges are an important component of coral reef fauna that perform many important functional roles, and changes in their abundance and diversity as a result of environmental change have the potential to affect overall reef ecosystem functioning. In this study, we examined patterns of sponge biodiversity and abundance across a range of environments to assess the potential key drivers of differences in benthic community structure. We found that sponge assemblages were significantly different across the study sites, but were dominated by a single species, Lamellodysidea herbacea (42% of all sponge patches recorded), and that the differential rate of sediment deposition was the most important variable driving differences in abundance patterns. Lamellodysidea herbacea abundance was positively associated with sedimentation rates, while total sponge abundance excluding Lamellodysidea herbacea was negatively associated with rates of sedimentation. Overall variation in sponge assemblage composition was correlated with a number of variables, although each variable explained only a small amount of the overall variation. Although sponge abundance remained similar across environments, diversity was negatively affected by sedimentation, with the most sedimented sites being dominated by a single sponge species. Our study shows how some sponge species are able to tolerate high levels of sediment, and that any transition of coral reefs to more sedimented states may result in a shift to a low-diversity, sponge-dominated system, which is likely to have subsequent effects on ecosystem functioning. © 2014 Powell et al
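The link between single-species dominance and lower diversity can be made concrete with the Shannon index. The abundances below are invented, except that one community puts 42% of patches in one species, mirroring the Lamellodysidea herbacea figure above; the comparison community has the same species richness but an even spread:

```python
import math

# Shannon diversity H' = -sum(p_i * ln(p_i)). The "dominated" community puts
# 42% of patches in one species (as in the abstract); "even" has the same
# number of species spread evenly. All abundance values are invented.
def shannon(counts):
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

dominated = [42, 10, 10, 10, 10, 10, 8]  # one species at 42% of patches
even      = [14] * 7                      # same richness, even spread

h_dom, h_even = shannon(dominated), shannon(even)
# h_dom < h_even: dominance depresses H' even at equal species richness
```

This is why the most sedimented sites can keep similar total sponge abundance yet score markedly lower on diversity.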
Measurement of the B+ and B0 lifetimes and search for CP(T) violation using reconstructed secondary vertices
The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 → bb̄ decays were tagged using displaced secondary vertices and high momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are
τ(B+) = 1.643 ± 0.037 ± 0.025 ps
τ(B0) = 1.523 ± 0.057 ± 0.053 ps
τ(B+)/τ(B0) = 1.079 ± 0.064 ± 0.041,
where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP- and CPT-violating effects by comparison of inclusive b and b̄ hadron decays. No evidence for such effects is seen. The CP violation parameter Re(ε_B) is measured to be
Re(ε_B) = 0.001 ± 0.014 ± 0.003
and the fractional difference between b and b̄ hadron lifetimes is measured to be
(Δτ/τ)_b = [τ(b hadron) − τ(b̄ hadron)] / τ(average) = −0.001 ± 0.012 ± 0.008.
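The quoted ratio can be cross-checked from the individual lifetimes, combining each lifetime's statistical and systematic errors in quadrature and propagating relative errors to the ratio. This naive propagation ignores the correlations the actual analysis accounts for, so it reproduces the central value but not the exact quoted uncertainty:

```python
import math

# Cross-check of tau(B+)/tau(B0): combine stat and syst errors in quadrature
# for each lifetime, then add relative errors in quadrature for the ratio
# (correlations between the two measurements are neglected here).
def ratio_with_error(a, ea_stat, ea_syst, b, eb_stat, eb_syst):
    r = a / b
    rel_a = math.hypot(ea_stat, ea_syst) / a
    rel_b = math.hypot(eb_stat, eb_syst) / b
    return r, r * math.hypot(rel_a, rel_b)

r, er = ratio_with_error(1.643, 0.037, 0.025, 1.523, 0.057, 0.053)
# r ~ 1.079, consistent with the quoted central value of tau(B+)/tau(B0)
```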
A Measurement of Rb using a Double Tagging Method
The fraction of Z → bb̄ events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z → bb̄ decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 ± 0.0011 ± 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z → cc̄ events in hadronic Z decays, is not included in the errors. The dependence on Rc is ΔRb/Rb = −0.056 ΔRc/Rc, where ΔRc is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 ± 0.0003 predicted by the Standard Model.
Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal
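The double-tagging idea that reduces the systematics can be sketched algebraically. With a per-hemisphere tag efficiency eps and N_had hadronic Z decays, the expected tagged-hemisphere count is N_t = 2·N_had·Rb·eps and the double-tagged event count is N_tt = N_had·Rb·eps², so both eps and Rb can be solved for from the data alone. This sketch neglects backgrounds and the small hemisphere correlations that the real analysis corrects for:

```python
# Idealised double-tag extraction: with hemisphere tag efficiency eps,
#   N_t  = 2 * N_had * Rb * eps     (tagged hemispheres)
#   N_tt = N_had * Rb * eps**2      (events with both hemispheres tagged)
# => eps = 2*N_tt/N_t and Rb = N_t**2 / (4 * N_tt * N_had).
# Backgrounds and hemisphere correlations are neglected in this sketch.
def solve_rb(n_t, n_tt, n_had):
    eps = 2.0 * n_tt / n_t
    rb = n_t**2 / (4.0 * n_tt * n_had)
    return rb, eps

# Round-trip check with invented numbers:
n_had, rb_true, eps_true = 1_000_000, 0.2178, 0.25
n_t = 2 * n_had * rb_true * eps_true
n_tt = n_had * rb_true * eps_true**2
rb, eps = solve_rb(n_t, n_tt, n_had)  # recovers rb_true and eps_true
```

Because the efficiency is measured from the data itself, simulation enters only through the small hemisphere-correlation correction, which is why the method shrinks the systematic error.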
