2,831 research outputs found
Achiral phenolic N-oxides as additives: an alternative strategy for asymmetric cyanosilylation of ketones
The activation of chiral titanium(IV) complexes with achiral phenolic N-oxide additives is found to provide an alternative strategy for the asymmetric cyanosilylation of ketones in excellent yield with up to 82% ee. (C) 2004 Elsevier Ltd. All rights reserved.
Identification of genes differentially expressed in Jining Grey and Liaoning Cashmere goats ovaries
To search for genes controlling the high prolificacy of Chinese indigenous goats, differential display reverse transcription-polymerase chain reaction (DDRT-PCR) was used to screen differentially expressed cDNA bands in the sexually mature ovaries of 3-year-old prolific Jining Grey goats and monotocous Liaoning Cashmere goats, with 24 combinations of three anchored primers and eight arbitrary primers. Twenty-two expressed sequence tags (ESTs) were confirmed as positive bands by Northern hybridization. They comprised 10 known ESTs and 12 ESTs without homologous sequences in GenBank. These results indicate that several genes, such as GATA-4, metallothionein-like protein, the CAT genes and unknown ESTs (CV983340 and CV983341), were expressed only in Jining Grey goats.
Keywords: differential display reverse transcription-polymerase chain reaction, goat, ovary, prolificacy
African Journal of Biotechnology Vol. 12(27), pp. 4408-441
Interpreting a 1 fb^-1 ATLAS Search in the Minimal Anomaly Mediated Supersymmetry Breaking Model
Recent LHC data significantly extend the exclusion limits for supersymmetric
particles, particularly in the jets plus missing transverse momentum channels.
The most recent such data have so far been interpreted by the experiment in
only two different supersymmetry breaking models: the constrained minimal
supersymmetric standard model (CMSSM) and a simplified model with only squarks
and gluinos and massless neutralinos. We compare kinematical distributions of
supersymmetric signal events predicted by the CMSSM and anomaly mediated
supersymmetry breaking (mAMSB) before calculating exclusion limits in mAMSB. We
obtain a lower limit of 900 GeV on squark and gluino masses at the 95%
confidence level in the equal-mass limit, for tan(beta)=10 and mu>0.
Comment: 18 pages, 11 figures
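As a rough illustration of how a 95% confidence-level exclusion of this kind works, a toy Poisson counting experiment can be sketched. All event counts below are placeholders for illustration, not values from the ATLAS analysis.

```python
import math

def poisson_cdf(n_obs, mu):
    """P(N <= n_obs) for a Poisson distribution with mean mu."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(n_obs + 1))

def excluded_at_95cl(n_obs, n_background, n_signal):
    """A model point is excluded if observing <= n_obs events is less than
    5% probable under the signal-plus-background hypothesis."""
    return poisson_cdf(n_obs, n_background + n_signal) < 0.05

# Placeholder counts: 3 events observed, 2.5 expected from background.
print(excluded_at_95cl(3, 2.5, 10.0))  # True: 12.5 expected vs 3 seen
print(excluded_at_95cl(3, 2.5, 1.0))   # False: 3.5 expected is compatible
```

Scanning the predicted signal yield over the (squark mass, gluino mass) plane with a test like this is what traces out an exclusion contour.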
Dark Matter, Muon g-2 and Other SUSY Constraints
Recent developments constraining the SUSY parameter space are reviewed within
the framework of SUGRA GUT models. The WMAP data is seen to reduce the error in
the density of cold dark matter by about a factor of four, implying that the
lightest stau is only 5-10 GeV heavier than the lightest neutralino when m_0,
m_{1/2} < 1 TeV. The CMD-2 re-analysis of their data has reduced the
disagreement between the Standard Model prediction and the Brookhaven
measurement of the muon magnetic moment to 1.9 sigma, while using the tau decay
data plus CVC, the disagreement is 0.7 sigma. (However, the two sets of data
remain inconsistent at the 2.9 sigma level.) The recent Belle and BABAR
measurements of the B -> phi K CP violating parameters and branching ratios are
discussed. They are analyzed theoretically within the BBNS improved
factorization method. The CP parameters are in disagreement with the Standard
Model at the 2.7 sigma level, and the branching ratios are low by a factor of
two or more over most of the parameter space. It is shown that both anomalies
can naturally be accounted for by adding a non-universal cubic soft breaking
term at M_G mixing the second and third generations.
Comment: 16 pages, 7 figures, plenary talk at Beyond The Desert '03, Castle Ringberg, Germany, June 9, 2003. Typos corrected
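As a side note on the sigma figures quoted above, a discrepancy "in sigma" is just the pull: the difference between measurement and prediction divided by the combined uncertainty. A minimal sketch, with placeholder inputs chosen only so the pull lands near the 1.9 sigma quoted for the e+e- based prediction (these are not the actual CMD-2/Brookhaven numbers):

```python
import math

def pull(measured, predicted, err_measured, err_predicted):
    """Discrepancy in units of the combined standard deviation."""
    return abs(measured - predicted) / math.hypot(err_measured, err_predicted)

# Placeholder inputs (in arbitrary units), not the published values.
print(round(pull(11659203.0, 11659184.0, 8.0, 6.0), 1))  # 1.9
```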
Maverick dark matter at colliders
Assuming that dark matter is a weakly interacting massive particle (WIMP)
species X produced in the early Universe as a cold thermal relic, we study the
collider signal of pp or ppbar -> XXbar + jets and its distinguishability from
standard-model background processes associated with jets and missing energy. We
assume that the WIMP is the sole particle related to dark matter within reach
of the LHC--a "maverick" particle--and that it couples to quarks through a
higher dimensional contact interaction. We simulate the WIMP final-state signal
XXbar + jet and dominant standard-model (SM) background processes and find that
the dark-matter production process results in higher energies for the colored
final state partons than do the standard-model background processes, resulting
in more QCD radiation and a higher jet multiplicity. As a consequence, the
detectable signature of maverick dark matter is an excess over standard-model
expectations of events consisting of large missing transverse energy, together
with large leading jet transverse momentum and scalar sum of the transverse
momenta of the jets. Existing Tevatron data and forthcoming LHC data can
constrain (or discover!) maverick dark matter.
Comment: 11 pages, 7 figures
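The signature described above (large missing transverse energy, a hard leading jet and a large scalar sum of jet transverse momenta) can be sketched as a simple cut-based event filter. All thresholds here are illustrative assumptions, not the cuts used in the analysis.

```python
# A minimal sketch of a cut-based selection on MET, leading-jet pT and HT.
# Thresholds are illustrative placeholders, not the values from the paper.

def passes_selection(met, jet_pts, met_cut=300.0, lead_jet_cut=250.0, ht_cut=500.0):
    """Return True if the event passes the MET / leading-jet pT / HT cuts.
    met: missing transverse energy in GeV; jet_pts: jet pT values in GeV,
    assumed sorted in descending order."""
    if not jet_pts:
        return False
    ht = sum(jet_pts)  # scalar sum of jet transverse momenta
    return met > met_cut and jet_pts[0] > lead_jet_cut and ht > ht_cut

# Toy events (GeV): a signal-like event and a softer background-like one.
print(passes_selection(420.0, [310.0, 150.0, 90.0]))  # True
print(passes_selection(180.0, [120.0, 60.0]))         # False
```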
Slepton mass-splittings as a signal of LFV at the LHC
Precise measurements of slepton mass-splittings might represent a powerful
tool to probe supersymmetric (SUSY) lepton flavour violation (LFV) at the LHC.
We point out that mass-splittings of the first two generations of sleptons are
especially sensitive to LFV effects involving \mu-\tau transitions. If these
mass-splittings are LFV induced, high-energy LFV processes like the neutralino
decay \tilde{\chi}^0_2 \to \tilde{\chi}^0_1 \tau^{\pm}\mu^{\mp} as well as low-energy LFV processes
like \tau \to \mu\gamma are unavoidable. We show that precise slepton
mass-splitting measurements and LFV processes both at the high- and low-energy
scales are highly complementary in the attempt to (partially) reconstruct the
flavour sector of the SUSY model at work. The present study represents another
proof of the synergy and interplay existing between the LHC, i.e. the {\em
high-energy frontier}, and high-precision low-energy experiments, i.e. the {\em
high-intensity frontier}.
Comment: 11 pages, 5 figures. v2: added discussion on backgrounds, added references, version to be published in JHEP
Quality Appraisal in Systematic Literature Reviews of Studies Eliciting Health State Utility Values: Conceptual Considerations.
BACKGROUND: The increasing number of studies that generate health state utility values (HSUVs) and the impact of HSUVs on cost-utility analyses make a robust tailored quality appraisal (QA) tool for systematic reviews of these studies necessary. OBJECTIVE: This study aimed to address conceptual issues regarding QA in systematic reviews of studies eliciting HSUVs by establishing a consensus on the definitions, dimensions and scope of a QA tool specific to this context. METHODS: A modified Delphi method was used in this study. An international multidisciplinary panel of seven experts was purposively assembled. The experts engaged in two anonymous online survey rounds. After each round, the experts received structured and controlled feedback on the previous phase. Controlled feedback allowed the experts to re-evaluate and adjust their positions based on collective insights. Following these surveys, a virtual face-to-face meeting was held to resolve outstanding issues. Consensus was defined a priori at all stages of the modified Delphi process. RESULTS: The response rates to the first-round and second-round questionnaires and the virtual consensus meeting were 100%, 86% and 71%, respectively. The entire process culminated in a consensus on the definitions of scientific quality, QA, the three QA dimensions-reporting, relevance and methodological quality-and the scope of a QA tool specific to studies that elicit HSUVs. CONCLUSIONS: Achieving this consensus marks a pivotal step towards developing a QA tool specific to systematic reviews of studies eliciting HSUVs. Future research will build on this foundation, identify QA items, signalling questions and response options, and develop a QA tool specific to studies eliciting HSUVs
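The a priori consensus rule and the quoted response rates can be illustrated with a small sketch. The 70% agreement threshold below is an assumption for illustration only; the study defines its own consensus criteria, which the abstract does not spell out.

```python
# A minimal sketch of an a priori consensus check in a Delphi round.
# The 70% threshold is a hypothetical example, not the study's criterion.

def consensus_reached(votes, threshold=0.70):
    """votes: list of responses; consensus if the most common response
    reaches the threshold share of all votes cast."""
    if not votes:
        return False
    top = max(votes.count(v) for v in set(votes))
    return top / len(votes) >= threshold

panel = ["agree"] * 6 + ["disagree"]      # 7-expert panel, 6 in agreement
print(consensus_reached(panel))           # True: 6/7 (about 86%) agree
print(round(5 / 7, 2))                    # 0.71: 5 of 7 experts responding
```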
On encoding symbol degrees of array BP-XOR codes
Low-density parity-check (LDPC) codes, LT codes and digital fountain techniques have received significant attention from both academia and industry in the past few years. By employing the ideas underlying the efficient belief propagation (BP) decoding process (also called the iterative message-passing decoding process) on binary erasure channels (BEC) in LDPC codes, Wang recently introduced the concept of array BP-XOR codes and gave the necessary and sufficient conditions for MDS [k+2, k] and [n, 2] array BP-XOR codes. In this paper, we analyze the encoding-symbol degree requirements for array BP-XOR codes and present new necessary conditions for them. These new necessary conditions are used as a guideline for constructing several array BP-XOR codes, for presenting a complete characterization (necessary and sufficient conditions) of degree-two array BP-XOR codes, and for designing new edge-colored graphs. Meanwhile, these new necessary conditions are used to show that the codes by Feng, Deng, Bao, and Shen in IEEE Transactions on Computers are incorrect.
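The BP (peeling) decoding idea on an erasure channel that underlies these codes can be sketched generically: any received encoding symbol whose neighbour set contains exactly one unresolved source symbol releases that symbol, and substitution repeats until everything is recovered or decoding stalls. This is a generic illustration of BP erasure decoding for XOR codes, not Wang's array BP-XOR construction.

```python
# Peeling (BP) decoder sketch for an XOR code on a binary erasure channel.
# Each received encoding symbol is the XOR of a set of source symbols.

def bp_xor_decode(received, k):
    """received: list of (value, set_of_source_indices); k source symbols.
    Returns the recovered source symbols, or None if decoding stalls."""
    source = [None] * k
    pending = [(val, set(idxs)) for val, idxs in received]
    progress = True
    while progress:
        progress = False
        for i, (val, idxs) in enumerate(pending):
            # Substitute every already-recovered source symbol into this check.
            known = {j for j in idxs if source[j] is not None}
            for j in known:
                val ^= source[j]
            idxs -= known
            pending[i] = (val, idxs)
            if len(idxs) == 1:            # degree-one symbol: release it
                (j,) = idxs
                source[j] = val
                pending[i] = (0, set())   # mark this check as consumed
                progress = True
    return source if all(s is not None for s in source) else None

# k = 3 source bits (1, 0, 1); encoding symbols of degrees 1, 2 and 3.
print(bp_xor_decode([(1, {0}), (1, {0, 1}), (0, {0, 1, 2})], 3))  # [1, 0, 1]
```

A degree-two-only code, the case characterized in the paper, decodes with the same peeling loop; it stalls exactly when no degree-one symbol is ever exposed.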
Solubility evaluation of murine hybridoma antibodies
The successful development of antibody therapeutics depends on the molecules having properties that are suitable for manufacturing, as well as use by patients. Because high solubility is a desirable property for antibodies, screening for solubility has become an essential step during the early candidate selection process. In considering the screening process, we formed a hypothesis that hybridoma antibodies are filtered by nature to possess high solubility and tested this hypothesis using a large number of murine hybridoma-derived antibodies. Using the cross-interaction chromatography (CIC) method, we screened the solubility of 92 murine hybridoma-derived monoclonal antibodies and found that all of these molecules exhibited CIC profiles that are indicative of high solubility (>100 mg/mL). Further investigations revealed that variable region N-linked glycosylation or isoelectric parameters are unlikely to contribute to the high solubility of these antibodies. These results support the general hypothesis that hybridoma monoclonal antibodies are highly soluble.