The impact of the ATLAS zero-lepton, jets and missing momentum search on a CMSSM fit
Recent ATLAS data significantly extend the exclusion limits for
supersymmetric particles. We examine the impact of such data on global fits of
the constrained minimal supersymmetric standard model (CMSSM) to indirect and
cosmological data. We calculate the likelihood map of the ATLAS search, taking
into account systematic errors on the signal and on the background. We validate
our calculation against the ATLAS determination of 95% confidence level
exclusion contours. A previous CMSSM global fit is then re-weighted by the
likelihood map, which takes a bite at the high probability density region of
the global fit, pushing scalar and gaugino masses up. Comment: 16 pages, 7 figures. v2 has bigger figures and fixed typos. v3 has clarified explanation of our handling of signal systematics.
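The re-weighting step described in this abstract amounts to importance re-weighting of existing posterior samples by the new search likelihood. A minimal sketch, assuming uniform toy samples and a hypothetical sigmoid exclusion likelihood (neither is the actual CMSSM fit or the ATLAS likelihood map):

```python
import numpy as np

# Toy posterior samples over scalar (m0) and gaugino (m12) masses, in GeV.
# These are illustrative, not samples from the actual CMSSM global fit.
rng = np.random.default_rng(0)
m0 = rng.uniform(100, 2000, size=10000)
m12 = rng.uniform(100, 1000, size=10000)

def search_likelihood(m0, m12):
    """Hypothetical exclusion likelihood: light spectra are disfavoured."""
    return 1.0 / (1.0 + np.exp((800.0 - (m0 + 2.0 * m12)) / 100.0))

# Importance weights: each old sample is re-weighted by the new likelihood.
w = search_likelihood(m0, m12)
w /= w.sum()

# The weighted posterior mean of m12 shifts toward heavier masses,
# mirroring the "pushing scalar and gaugino masses up" effect.
print(np.average(m12, weights=w), m12.mean())
```

The same one-line re-weighting applies to any scan whose samples carry posterior weights, provided the new likelihood is independent of the data already in the fit.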
The Turkey Ig-like receptor family: identification, expression and function.
The chicken leukocyte receptor complex located on microchromosome 31 encodes the chicken Ig-like receptors (CHIR), a vastly expanded gene family which can be further divided into three subgroups: activating CHIR-A, bifunctional CHIR-AB and inhibitory CHIR-B. Here, we investigated the presence of CHIR homologues in other bird species. The available genome databases of turkey, duck and zebra finch were screened with different strategies including BLAST searches employing various CHIR sequences, and keyword searches. We could not identify CHIR homologues in the distantly related zebra finch and duck; however, several partial and complete sequences of CHIR homologues were identified on chromosome 3 of the turkey genome. They were designated as turkey Ig-like receptors (TILR). Using cDNA derived from turkey blood and spleen RNA, six full-length TILR could be amplified and further divided according to the typical sequence features into one activating TILR-A, one inhibitory TILR-B and four bifunctional TILR-AB. Since the TILR-AB sequences all displayed the critical residues shown to be involved in binding to IgY, we next confirmed the IgY binding using a soluble TILR-AB1-huIg fusion protein. This fusion protein reacted with IgY derived from various gallinaceous birds, but not with IgY from other bird species. Finally, we tested various mAb directed against CHIR for their cross-reactivity with either turkey or duck leukocytes. Whereas no staining was detectable with duck cells, the CHIR-AB1-specific mAb 8D12 and the CHIR-A2-specific mAb 13E2 both reacted with a leukocyte subpopulation that was further identified as thrombocytes by double immunofluorescence employing B-cell-, T-cell- and thrombocyte-specific reagents. In summary, although the turkey harbors LRC genes similar to those of the chicken, their distribution seems to be distinct, with predominance on thrombocytes rather than lymphocytes.
Reconstructing particle masses from pairs of decay chains
A method is proposed for determining the masses of the new particles N,X,Y,Z
in collider events containing a pair of effectively identical decay chains Z to
Y+jet, Y to X+l_1, X to N+l_2, where l_1, l_2 are opposite-sign same-flavour
charged leptons and N is invisible. By first determining the upper edge of the
dilepton invariant mass spectrum, we reduce the problem to a curve for each
event in the 3-dimensional space of mass-squared differences. The region
through which most curves pass then determines the unknown masses. A
statistical approach is applied to take account of mismeasurement of jet and
missing momenta. The method is easily visualized and rather robust against
combinatorial ambiguities and finite detector resolution. It can be successful
even for small event samples, since it makes full use of the kinematical
information from every event. Comment: 12 pages, 5 figures.
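The dilepton endpoint used in the first step of the method has a standard closed form for massless leptons in the chain Y to X + l_1, X to N + l_2. A small helper (the function name and the example masses are illustrative choices, not values from the paper):

```python
import math

def dilepton_edge(mY, mX, mN):
    """Kinematic endpoint of the dilepton invariant mass for the sequential
    two-body decays Y -> X + l1, X -> N + l2, with massless leptons:
        m_ll^max = sqrt((mY^2 - mX^2) * (mX^2 - mN^2)) / mX
    Masses may be in any consistent unit (e.g. GeV)."""
    return math.sqrt((mY**2 - mX**2) * (mX**2 - mN**2)) / mX

# Illustrative spectrum: mY = 200, mX = 150, mN = 100 GeV.
print(dilepton_edge(200.0, 150.0, 100.0))
```

Measuring this endpoint supplies one relation among the unknown masses, which is what reduces each event to a curve in the three-dimensional space of mass-squared differences.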
Bayesian Conditioning, the Reflection Principle, and Quantum Decoherence
The probabilities a Bayesian agent assigns to a set of events typically
change with time, for instance when the agent updates them in the light of new
data. In this paper we address the question of how an agent's probabilities at
different times are constrained by Dutch-book coherence. We review and attempt
to clarify the argument that, although an agent is not forced by coherence to
use the usual Bayesian conditioning rule to update his probabilities, coherence
does require the agent's probabilities to satisfy van Fraassen's [1984]
reflection principle (which entails a related constraint pointed out by
Goldstein [1983]). We then exhibit the specialized assumption needed to recover
Bayesian conditioning from an analogous reflection-style consideration.
Bringing the argument to the context of quantum measurement theory, we show
that "quantum decoherence" can be understood in purely personalist
terms---quantum decoherence (as supposed in a von Neumann chain) is not a
physical process at all, but an application of the reflection principle. From
this point of view, the decoherence theory of Zeh, Zurek, and others as a story
of quantum measurement has the plot turned exactly backward.Comment: 14 pages, written in memory of Itamar Pitowsk
A re-randomisation design for clinical trials
Background: Recruitment to clinical trials is often problematic, with many trials failing to recruit to their target sample size. As a result, patient care may be based on suboptimal evidence from underpowered trials or non-randomised studies. Methods: For many conditions patients will require treatment on several occasions, for example, to treat symptoms of an underlying chronic condition (such as migraines, where treatment is required each time a new episode occurs), or until they achieve treatment success (such as fertility, where patients undergo treatment on multiple occasions until they become pregnant). We describe a re-randomisation design for these scenarios, which allows each patient to be independently randomised on multiple occasions. We discuss the circumstances in which this design can be used. Results: The re-randomisation design will give asymptotically unbiased estimates of treatment effect and correct type I error rates under the following conditions: (a) patients are only re-randomised after the follow-up period from their previous randomisation is complete; (b) randomisations for the same patient are performed independently; and (c) the treatment effect is constant across all randomisations. Provided the analysis accounts for correlation between observations from the same patient, this design will typically have higher power than a parallel group trial with an equivalent number of observations. Conclusions: If used appropriately, the re-randomisation design can increase the recruitment rate for clinical trials while still providing an unbiased estimate of treatment effect and correct type I error rates. In many situations, it can increase the power compared to a parallel group design with an equivalent number of observations.
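Conditions (b) and (c) of the design are easy to exercise in a toy Monte-Carlo sketch. The outcome model below (a shared patient-level random effect plus noise) is an assumption made for illustration, not the paper's analysis model:

```python
import random
import statistics

def simulate_rerandomisation(n_patients=500, max_episodes=3,
                             effect=1.0, seed=0):
    """Sketch of a re-randomisation trial: each patient is freshly and
    independently randomised for every new episode (condition (b)), with
    a constant treatment effect across randomisations (condition (c)).
    A patient-level random effect makes observations from the same
    patient correlated. Returns the difference-in-means estimate."""
    rng = random.Random(seed)
    treated, control = [], []
    for _ in range(n_patients):
        patient_effect = rng.gauss(0, 1)  # induces within-patient correlation
        for _ in range(rng.randint(1, max_episodes)):
            arm = rng.randint(0, 1)       # independent randomisation per episode
            y = effect * arm + patient_effect + rng.gauss(0, 1)
            (treated if arm else control).append(y)
    return statistics.mean(treated) - statistics.mean(control)
```

With many patients the point estimate lands near the true effect, as the abstract claims; correct standard errors would additionally require an analysis that clusters on patient, which this sketch deliberately omits.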
The fine-tuning price of the early LHC
LHC already probed and excluded half of the parameter space of the
Constrained Minimal Supersymmetric Standard Model allowed by previous
experiments. Only about 0.3% of the CMSSM parameter space survives. This
fraction rises to about 0.9% if the bound on the Higgs mass can be
circumvented. Comment: 7 pages. v3: updated with new bounds from ATLAS and CMS at 1.1/fb, presented at the EPS-HEP-2011 conference.
Sixteen years of bathymetry and waves at San Diego beaches.
Sustained, quantitative observations of nearshore waves and sand levels are essential for testing beach evolution models, but comprehensive datasets are relatively rare. We document beach profiles and concurrent waves monitored at three southern California beaches during 2001-2016. The beaches include offshore reefs, lagoon mouths, hard substrates, and cobble and sandy (medium-grained) sediments. The data span two energetic El Niño winters and four beach nourishments. Quarterly surveys of 165 total cross-shore transects (all sites) at 100 m alongshore spacing were made from the backbeach to 8 m depth. Monthly surveys of the subaerial beach were obtained at alongshore-oriented transects. The resulting dataset consists of (1) raw sand elevation data, (2) gridded elevations, (3) interpolated elevation maps with error estimates, (4) beach widths, subaerial and total sand volumes, (5) locations of hard substrate and beach nourishments, (6) water levels from a NOAA tide gauge, (7) wave conditions from a buoy-driven regional wave model, and (8) time periods and reaches with alongshore uniform bathymetry, suitable for testing 1-dimensional beach profile change models.
On Convergence and Threshold Properties of Discrete Lotka-Volterra Population Protocols
In this work we focus on a natural class of population protocols whose
dynamics are modelled by the discrete version of Lotka-Volterra equations. In
such protocols, when an agent a of type (species) i interacts with an agent b of type (species) j, with a as the initiator, then b's type becomes i with probability P_ij. In such an interaction, we think of a as the predator and b as the prey, and the type of the prey is either converted to that of the predator or stays as is. Such protocols capture the dynamics of some
opinion spreading models and generalize the well-known Rock-Paper-Scissors
discrete dynamics. We consider the pairwise interactions among agents that are
scheduled uniformly at random. We start by considering the convergence time and
show that any Lotka-Volterra-type protocol on an n-agent population converges to some absorbing state in time polynomial in n, w.h.p., when any pair of
agents is allowed to interact. By contrast, when the interaction graph is a
star, even the Rock-Paper-Scissors protocol requires exponential time to
converge. We then study threshold effects exhibited by Lotka-Volterra-type
protocols with 3 and more species under interactions between any pair of
agents. We start by presenting a simple 4-type protocol in which the
probability difference of reaching the two possible absorbing states is
strongly amplified by the ratio of the initial populations of the two other
types, which are transient, but "control" convergence. We then prove that the
Rock-Paper-Scissors protocol reaches each of its three possible absorbing
states with almost equal probability, starting from any configuration
satisfying some sub-linear lower bound on the initial size of each species.
That is, Rock-Paper-Scissors is a realization of a "coin-flip consensus" in a
distributed system. Some of our techniques may be of independent value.
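The Rock-Paper-Scissors dynamics discussed above can be sketched as a toy simulation under uniformly random pairwise interactions on the complete interaction graph. Fixing the conversion probability at 1 is a simplifying assumption; the general protocol converts the prey with probability P_ij:

```python
import random

def rps_protocol(counts, seed=0):
    """Simulate the discrete Rock-Paper-Scissors population protocol.

    counts: initial sizes of the three species (rock=0, paper=1, scissors=2).
    At each step an ordered pair of distinct agents is drawn uniformly at
    random; if the initiator beats the responder, the responder adopts the
    initiator's type (conversion probability 1, an illustrative choice).
    Runs until the population is monochromatic (an absorbing state) and
    returns (winning_type, number_of_interactions)."""
    rng = random.Random(seed)
    pop = [0] * counts[0] + [1] * counts[1] + [2] * counts[2]
    beats = {(1, 0), (2, 1), (0, 2)}  # paper>rock, scissors>paper, rock>scissors
    steps = 0
    while len(set(pop)) > 1:
        i, j = rng.sample(range(len(pop)), 2)  # initiator i, responder j
        if (pop[i], pop[j]) in beats:
            pop[j] = pop[i]  # prey converted to the predator's type
        steps += 1
    return pop[0], steps
```

Running this from a balanced start many times with different seeds illustrates the "coin-flip consensus" behaviour: each of the three absorbing states is reached with roughly equal frequency.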
Characterization and topical delivery of phenylethyl resorcinol
Objective:
Phenylethyl resorcinol (PR) has been used widely in the personal care industry as a novel skin lightening ingredient. Surprisingly, there is only limited information describing the physicochemical properties of this active. Therefore, the primary objective of this study was to perform a comprehensive characterization of PR. A secondary objective was to investigate the delivery of this molecule to mammalian skin.
Methods:
Phenylethyl resorcinol was characterized using differential scanning calorimetry (DSC), thermogravimetric analysis (TGA) and nuclear magnetic resonance (NMR). A new high‐performance liquid chromatographic (HPLC) method for analysis of PR was developed and validated. The log P (octanol-water partition coefficient) value, solubility and short‐term stability of PR in a series of vehicles were also determined using HPLC. The evaporation of the selected vehicles was examined using dynamic vapour sorption (DVS). The permeation profiles of PR were investigated under finite dose conditions in porcine and human skin.
Results:
The melting point of PR was determined to be 79.13 °C and the measured log P (octanol-water partition coefficient) at 21 °C was 3.35 ± 0.03. The linearity of the HPLC analytical method was confirmed with an r² value of 0.99. Accuracy of the method was evaluated by average recovery rates at three tested concentrations, and the values ranged from 99 to 106%. The limit of detection (LOD) and limit of quantification (LOQ) were 0.19 and 0.57 μg mL−1, respectively. The solubility of PR in PG, DMI and glycerol was within the range of 367 to 877 mg mL−1. The stability of PR in the tested solvents was also confirmed by the 72 h stability studies. From the DVS studies, 70–125% of applied formulations were recovered at 24 h. The permeation through porcine skin at 24 h ranged from 4 to 13 μg cm−2, while the corresponding amounts of PR delivered through human skin were 2 to 10 μg cm−2.
Conclusion:
The physicochemical properties of PR confirm it is suitable for dermal delivery. In this study, propylene glycol was the most promising vehicle for PR delivery to human skin. Future work will expand the range of vehicles studied and explore the percutaneous absorption from more complex formulations.