Improving introspection to inform free will regarding the choice by healthy individuals to use or not use cognitive enhancing drugs
A commentary in Nature entitled "Towards responsible use of cognitive-enhancing drugs by the healthy" (Greely et al. 2008, Nature 456: 702–705) offers an opportunity to move toward a humane societal appreciation of mind-altering drugs. Using cognitive-enhancing drugs as an exemplar, this article presents a series of hypotheses concerning how an individual might learn optimal use. The essence of the proposal is that individuals can cultivate sensitivity to the effects of ever-smaller amounts of psychoactive drugs, thereby making harm less likely and benign effects more probable. Four interrelated hypotheses are presented and briefly discussed. 1. Humans can learn to discriminate ever-smaller doses of at least some mind-altering drugs; a learning program can be designed or discovered that will have this outcome. 2. The skill to discriminate drugs and doses can be generalized, i.e. once learned with one drug, a second is easier, and so on. 3. Cultivating this skill would be beneficial, leading to choices informed by a more accurate sense of mind-body interactions. 4. From a philosophical point of view, learning the effects of ever-smaller doses of psychoactive agents offers a novel path into, and a means to transcend, the objective/subjective barrier and the mind/body problem.
Signal and noise in bridging PCR
BACKGROUND: In a variant of the standard PCR reaction termed bridging, or jumping, PCR the primer-bound sequences are originally on separate template molecules. Bridging can occur if, and only if, the templates contain a region of sequence similarity: a strand whose synthesis terminates within this region in one round can prime on the other template in the next. In principle, bridging PCR (BPCR) can detect a subpopulation of one template that terminates synthesis in the region of sequence shared by the other template. This study considers the sensitivity and noise of BPCR as a quantitative assay for backbone interruptions. Bridging synthesis is also important to some methods for computing with DNA. RESULTS: BPCR was tested over a 328 base pair segment of the E. coli lac operon, and a signal-to-noise ratio (S/N) of approximately 10 was obtained under normal PCR conditions with Taq polymerase. With special precautions in the case of Taq, or by using the Stoffel fragment, the S/N was improved to 100, i.e. 1 part of cut input DNA yielded the same output as 100 parts of intact input DNA. CONCLUSIONS: In the E. coli lac operator region studied here, depending on details of protocol, between 3 and 30% per kilobase of final PCR product resulted from bridging. Other systems are expected to differ in the proportion of bridged product, depending on the PCR protocol and the sequence analyzed. In many cases physical bridging during PCR has no informational consequence because the bridged templates are of identical sequence, but in a number of special cases bridging creates or destroys information.
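The quoted signal-to-noise figures translate directly into an estimate of how much BPCR output is genuinely bridged for a given template mixture. As a minimal sketch (the function name and the 1:10 mixture are illustrative, not from the study): with S/N = 100, one part of cut template yields the same output as 100 parts of intact template, so the noise contribution scales as intact parts divided by S/N.

```python
def bridged_fraction(cut_parts, intact_parts, s_to_n):
    """Estimate the fraction of BPCR output attributable to true bridging
    (signal) in a mixture of cut and intact templates. Per the abstract's
    definition, 1 part cut input yields the same output as s_to_n parts
    of intact input, so noise scales as intact_parts / s_to_n."""
    signal = cut_parts
    noise = intact_parts / s_to_n
    return signal / (signal + noise)

# e.g. a 1:10 mix of cut:intact template at S/N = 100
frac = bridged_fraction(1.0, 10.0, 100.0)  # ~0.91: most output is signal
```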
Jet Trimming
Initial state radiation, multiple interactions, and event pileup can
contaminate jets and degrade event reconstruction. Here we introduce a
procedure, jet trimming, designed to mitigate these sources of contamination in
jets initiated by light partons. This procedure is complementary to existing
methods developed for boosted heavy particles. We find that jet trimming can
achieve significant improvements in event reconstruction, especially at high
energy/luminosity hadron colliders like the LHC. (Comment: 20 pages, 11 figures, 3 tables; minor changes to text/figures)
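The trimming procedure itself is simple: recluster a jet's constituents into small-radius subjets and discard subjets carrying less than a fixed fraction f_cut of the jet's pT, so that soft, wide-angle contamination is removed while the hard core survives. The toy sketch below illustrates the idea; it uses a crude eta-phi grid in place of a real subjet-clustering algorithm (actual analyses would use a package such as FastJet), and the parameter values are illustrative.

```python
from collections import defaultdict

def trim_jet(constituents, r_sub=0.2, f_cut=0.03):
    """Toy jet trimming: keep only constituents belonging to 'subjets'
    (here, eta-phi grid cells of size r_sub) whose summed pT exceeds
    f_cut times the total jet pT. Constituents are (pt, eta, phi)."""
    jet_pt = sum(pt for pt, _, _ in constituents)
    cells = defaultdict(list)
    for pt, eta, phi in constituents:
        key = (int(eta / r_sub), int(phi / r_sub))  # crude subjet finding
        cells[key].append((pt, eta, phi))
    kept = []
    for subjet in cells.values():
        sub_pt = sum(pt for pt, _, _ in subjet)
        if sub_pt >= f_cut * jet_pt:  # retain only hard subjets
            kept.extend(subjet)
    return kept

# usage: a hard core plus soft, wide-angle contamination
jet = [(100.0, 0.05, 0.05), (40.0, 0.10, 0.02), (1.0, 0.9, 0.9), (0.8, -0.7, 0.4)]
trimmed = trim_jet(jet)  # the two soft stray constituents are dropped
```

The two knobs mirror the ones named in trimming studies: r_sub sets the subjet granularity and f_cut the hardness threshold.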
General Neutralino NLSPs at the Early LHC
Gauge mediated supersymmetry breaking (GMSB) is a theoretically
well-motivated framework with rich and varied collider phenomenology. In this
paper, we study the Tevatron limits and LHC discovery potential for a wide
class of GMSB scenarios in which the next-to-lightest superpartner (NLSP) is a
promptly-decaying neutralino. These scenarios give rise to signatures involving
hard photons, Z's, W's, jets, and/or Higgses, plus missing energy. In order
to characterize these signatures, we define a small number of minimal spectra,
in the context of General Gauge Mediation, which are parameterized by the mass
of the NLSP and the gluino. Using these minimal spectra, we determine the most
promising discovery channels for general neutralino NLSPs. We find that the
2010 dataset can already cover new ground with strong production for all NLSP
types. With the upcoming 2011-2012 dataset, we find that the LHC will also have
sensitivity to direct electroweak production of neutralino NLSPs. (Comment: 26 pages)
Anticoagulant vs. antiplatelet therapy in patients with cryptogenic stroke and patent foramen ovale: an individual participant data meta-analysis
Aims: The preferred antithrombotic strategy for secondary prevention in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO) is unknown. We pooled multiple observational studies and used propensity score-based methods to estimate the comparative effectiveness of oral anticoagulation (OAC) compared with antiplatelet therapy (APT). Methods and results: Individual participant data from 12 databases of medically treated patients with CS and PFO were analysed with Cox regression models to estimate database-specific hazard ratios (HRs) comparing OAC with APT, for both the primary composite outcome [recurrent stroke, transient ischaemic attack (TIA), or death] and stroke alone. Propensity scores were applied via inverse probability of treatment weighting to control for confounding. We synthesized database-specific HRs using random-effects meta-analysis models. This analysis included 2385 patients (OAC = 804 and APT = 1581) with 227 composite endpoints (stroke/TIA/death). The difference between OAC and APT was not statistically significant for the primary composite outcome [adjusted HR = 0.76, 95% confidence interval (CI) 0.52-1.12] or for the secondary outcome of stroke alone (adjusted HR = 0.75, 95% CI 0.44-1.27). Results were consistent in analyses applying alternative weighting schemes, with the exception that OAC had a statistically significant beneficial effect on the composite outcome in analyses standardized to the patient population who actually received APT (adjusted HR = 0.64, 95% CI 0.42-0.99). Subgroup analyses did not detect statistically significant heterogeneity of treatment effects across clinically important patient groups. Conclusion: We did not find a statistically significant difference comparing OAC with APT; our results justify randomized trials comparing different antithrombotic approaches in these patients.
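The pooling step described above, synthesizing database-specific hazard ratios with a random-effects model, can be sketched as follows. This is an illustrative DerSimonian-Laird implementation run on made-up numbers, not the study's code or data; inputs are per-database HRs and standard errors of log(HR).

```python
import math

def pool_random_effects(hrs, ses):
    """DerSimonian-Laird random-effects pooling of log hazard ratios.
    hrs: per-database hazard ratios; ses: standard errors of log(HR).
    Returns the pooled HR and a 95% confidence interval."""
    y = [math.log(h) for h in hrs]
    w = [1.0 / se ** 2 for se in ses]                      # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-database variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]          # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_mu = math.sqrt(1.0 / sum(w_re))
    lo, hi = math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu)
    return math.exp(mu), (lo, hi)

# three hypothetical databases, each with its own HR and SE of log(HR)
hr, ci = pool_random_effects([0.8, 0.7, 0.9], [0.3, 0.25, 0.4])
```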
Measuring the Polarization of Boosted Hadronic Tops
We propose a new technique for measuring the polarization of hadronically
decaying boosted top quarks. In particular, we apply a subjet-based technique
to events where the decay products of the top are clustered within a single
jet. The technique requires neither b-tagging nor W-reconstruction, and does
not rely on assumptions about either the top production mechanism or the
sources of missing energy in the event. We include results for various new
physics scenarios made with different Monte Carlo generators to demonstrate the
robustness of the technique. (Comment: v2, version accepted for publication in JHEP)
Heavy Squarks at the LHC
The LHC, with its seven-fold increase in energy over the Tevatron, is capable
of probing regions of SUSY parameter space exhibiting qualitatively new
collider phenomenology. Here we investigate one such region in which first
generation squarks are very heavy compared to the other superpartners. We find
that the production of these squarks, which is dominantly associative, only
becomes rate-limited at mSquark > 4(5) TeV for L~10(100) fb-1. However,
discovery of this scenario is complicated because heavy squarks decay primarily
into a jet and boosted gluino, yielding a dijet-like topology with missing
energy (MET) pointing along the direction of the second hardest jet. The result
is that many signal events are removed by standard jet/MET anti-alignment cuts
designed to guard against jet mismeasurement errors. We suggest replacing these
anti-alignment cuts with a measurement of jet substructure that can
significantly extend the reach of this channel while still removing much of the
background. We study a selection of benchmark points in detail, demonstrating
that mSquark= 4(5) TeV first generation squarks can be discovered at the LHC
with L~10(100) fb-1.
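The anti-alignment cut referred to above is a standard requirement that the missing-energy vector not point along any of the leading jets, which guards against jet mismeasurement faking MET. A minimal sketch (the function name, dphi_min threshold, and event values are illustrative) shows why a heavy-squark-like event, with MET parallel to the second jet, fails such a cut:

```python
import math

def passes_antialignment(jet_phis, met_phi, dphi_min=0.5):
    """Standard anti-alignment cut: reject the event if MET points within
    dphi_min (in azimuth) of any leading jet, since aligned MET usually
    indicates a mismeasured jet rather than invisible particles."""
    def dphi(a, b):
        d = abs(a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    return all(dphi(phi, met_phi) > dphi_min for phi in jet_phis)

# Heavy-squark-like topology: MET roughly parallel to the second jet
# (the boosted-gluino side), so the standard cut rejects the signal event.
signal_kept = passes_antialignment([0.0, 3.1], met_phi=3.0)  # False
```

This is exactly the failure mode the abstract describes, motivating the substructure-based replacement for the cut.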
A qualitative study of stakeholders' perspectives on the social network service environment
Over two billion people are using the Internet at present, assisted by the mediating activities of software agents, which deal with the diversity and complexity of information. There are, however, ethical issues due to the monitoring and surveillance capabilities, data mining, and autonomous nature of software agents. In this context, this study aims to comprehend stakeholders' perspectives on the social network service environment in order to identify the main considerations for the design of software agents in social network services in the near future. Twenty-one stakeholders, belonging to three key stakeholder groups, were recruited using a purposive sampling strategy for unstandardised semi-structured e-mail interviews. The interview data were analysed using a qualitative content analysis method. Three main considerations for the design of software agents in social network services were identified, classified into the following categories: comprehensive understanding of users' perception of privacy, user type recognition algorithms for software agent development, and enhancement of existing software agents.
New Physics Signals in Longitudinal Gauge Boson Scattering at the LHC
We introduce a novel technique designed to look for signatures of new physics
in vector boson fusion processes at the TeV scale. This functions by measuring
the polarization of the vector bosons to determine the relative longitudinal to
transverse production. In studying this ratio we can directly probe the high
energy E^2-growth of longitudinal vector boson scattering amplitudes
characteristic of models with non-Standard Model (SM) interactions. We will
focus on studying models parameterized by an effective Lagrangian that include
a light Higgs with non-SM couplings arising from TeV scale new physics
associated with the electroweak symmetry breaking, although our technique can
be used in more general scenarios. We will show that this technique is stable
against the large uncertainties that can result from variations in the
factorization scale, improving upon previous studies that measure the cross section
alone.
Two-Body B Meson Decays to eta and eta' -- Observation of B -> eta' K
In a sample of 6.6 million produced B mesons we have observed decays B ->
eta' K, with branching fractions BR(B+ -> eta' K+) = (6.5 +1.5 -1.4 +- 0.9) x 10^-5
and BR(B0 -> eta' K0) = (4.7 +2.7 -2.0 +- 0.9) x 10^-5. We have
searched with comparable sensitivity for 17 related decays to final states
containing an eta or eta' meson accompanied by a single particle or low-lying
resonance. Our upper limits for these constrain theoretical interpretations of
the B -> eta' K signal. (Comment: 12-page postscript file, also available through
http://w4.lns.cornell.edu/public/CLN)