Anomaly Detection for Resonant New Physics with Machine Learning
Despite extensive theoretical motivation for physics beyond the Standard
Model (BSM) of particle physics, searches at the Large Hadron Collider (LHC)
have found no significant evidence for BSM physics. Therefore, it is essential
to broaden the sensitivity of the search program to include unexpected
scenarios. We present a new model-agnostic anomaly detection technique that
naturally benefits from modern machine learning algorithms. The only
requirement on the signal for this new procedure is that it is localized in at
least one known direction in phase space. Any other directions of phase space
that are uncorrelated with the localized one can be used to search for
unexpected features. This new method is applied to the dijet resonance search
to show that it can turn a modest 2 sigma excess into a 7 sigma excess for a
model with an intermediate BSM particle that is not currently targeted by a
dedicated search.

Comment: Replaced with short PRL version. 7 pages, 2 figures. Revised long version will be submitted separately.
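A minimal sketch of the strategy the abstract describes, assuming Python with scikit-learn; the variable names, window edges, and thresholds are invented for illustration, and this is not the paper's implementation:

```python
# Sketch: train a classifier to distinguish a signal region from its
# sidebands in a resonant variable, using only features assumed to be
# uncorrelated with it, then select events by classifier score before
# a standard bump hunt. All numbers are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def enhanced_bump_hunt(mjj, features, window=(3300.0, 3700.0), keep_frac=0.01):
    """mjj: resonant variable (e.g. dijet mass); features: array of
    additional observables assumed uncorrelated with mjj."""
    lo, hi = window
    in_window = (mjj > lo) & (mjj < hi)            # signal region
    sideband = ((mjj > lo - 400.0) & (mjj <= lo)) | \
               ((mjj >= hi) & (mjj < hi + 400.0))  # neighbouring sidebands

    X = np.concatenate([features[in_window], features[sideband]])
    y = np.concatenate([np.ones(in_window.sum()), np.zeros(sideband.sum())])
    clf = GradientBoostingClassifier().fit(X, y)

    # Keep only the most signal-region-like events everywhere: a signal
    # localized in mjj survives the cut, while smooth background is
    # suppressed, sharpening any excess in the mjj spectrum.
    scores = clf.predict_proba(features)[:, 1]
    cut = np.quantile(scores, 1.0 - keep_frac)
    return mjj[scores > cut]   # pass this selection to a standard bump hunt
```

In practice the classifier must be trained and applied in a cross-validated way so that it cannot select on statistical fluctuations of the very events being searched; that bookkeeping is omitted here.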
L-branes
The superembedding approach to p-branes is used to study a class of
p-branes which have linear multiplets on the worldvolume. We refer to these
branes as L-branes. Although linear multiplets are related to scalar multiplets
(with 4 or 8 supersymmetries) by dualising one of the scalars of the latter to
a (d-1)-form field strength, in many geometrical situations it is the linear
multiplet version which arises naturally. Furthermore, in the case of 8
supersymmetries, the linear multiplet is off-shell in contrast to the scalar
multiplet. The dynamics of the L-branes are obtained by using a systematic
procedure for constructing the Green-Schwarz action from the superembedding
formalism. This action has a Dirac-Born-Infeld type structure for the (d-1)-form.
In addition, a set of equations of motion is postulated directly in superspace,
and is shown to agree with the Green-Schwarz equations of motion.

Comment: revised version, minor changes, references added, 22 pages, no figures, LaTeX
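For orientation, the scalar-to-form dualisation mentioned above is the standard one: on a d-dimensional worldvolume, the 1-form field strength of a scalar is traded for its Hodge dual (normalisation schematic):

```latex
% Standard dualisation of a worldvolume scalar (schematic): the 1-form
% field strength d\phi is traded for its Hodge dual, a (d-1)-form field
% strength H with (d-2)-form potential B.
H_{(d-1)} = \star\, d\phi, \qquad H_{(d-1)} = dB_{(d-2)} \ \text{(on-shell)}.
```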
Manipulating Memory Associations Changes Decision-making Preferences in a Preconditioning Task
Memories of past experiences can guide our decisions. Thus, if memories are undermined or distorted, decision making should be affected. Nevertheless, little empirical research has examined the role of memory in reinforcement decision-making. We hypothesized that if memories guide choices in a conditioning decision-making task, then manipulating these memories would change decision preferences for gaining reward. We manipulated participants' memories by providing false feedback that their memory associations were wrong before they made decisions that could lead them to win money. Participants' memory ratings decreased significantly after receiving false feedback. More importantly, we found that participants' decision bias disappeared after their memory associations were undermined by the false feedback. Our results suggest that reinforcement decision-making can be altered by false feedback on memories. The results are discussed in terms of memory mechanisms such as spreading-activation theories.
Low-degree multi-spectral p-mode fitting
We combine unresolved-Sun velocity and intensity observations at multiple wavelengths from the Helioseismic and Magnetic Imager and the Atmospheric Imaging Assembly onboard the Solar Dynamics Observatory to investigate the possibility of multi-spectral mode-frequency estimation at low spherical harmonic degree. We test a simple multi-spectral algorithm that uses a common line width and frequency for each mode, with separate amplitude, background, and asymmetry parameters for each observable, and compare the results with those from fits to the individual spectra. The preliminary results suggest that this approach may provide a more stable fit than using the observables separately.
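A minimal sketch of such a joint fit, assuming Python with SciPy; the Lorentzian profile, parameter layout, and least-squares objective are illustrative simplifications rather than the authors' method:

```python
# Sketch: jointly fit one p-mode peak across several power spectra,
# sharing frequency and line width while giving each observable its own
# amplitude and background. Asymmetry is omitted for brevity.
import numpy as np
from scipy.optimize import least_squares

def lorentzian(freq, nu0, gamma, amp, bkg):
    return amp / (1.0 + ((freq - nu0) / (gamma / 2.0)) ** 2) + bkg

def joint_residuals(params, freq, spectra):
    nu0, gamma = params[0], params[1]           # shared across spectra
    res = []
    for i, power in enumerate(spectra):
        amp, bkg = params[2 + 2 * i], params[3 + 2 * i]
        res.append(power - lorentzian(freq, nu0, gamma, amp, bkg))
    return np.concatenate(res)

def fit_mode(freq, spectra, nu0_guess, gamma_guess):
    """spectra: list of power spectra (velocity, intensity at each
    wavelength) over the same frequency grid."""
    p0 = [nu0_guess, gamma_guess]
    for power in spectra:
        p0 += [power.max(), power.min()]        # per-spectrum amp, bkg
    fit = least_squares(joint_residuals, p0, args=(freq, spectra))
    return fit.x[0], fit.x[1]                   # shared nu0, gamma
```

A realistic peak-bagging analysis would maximise the likelihood appropriate to chi-squared-distributed power spectra and include the asymmetry parameter; plain least squares is used here only to keep the sketch short.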
Mapping Observations of DNC and HN^13C in Dark Cloud Cores
We present results of mapping observations of the DNC, HN^13C, and H^13CO^+
lines (J=1-0) toward 4 nearby dark cloud cores, TMC-1, L1512, L1544, and L63,
along with observations of the DNC and HN^13C lines (J=2-1) toward selected
positions. By use of statistical equilibrium calculations based on the LVG
model, the H_2 densities are derived to be (1.4-5.5)*10^5 cm^-3, and the
[DNC]/[HN^13C] ratios are derived to be 1.25-5.44, with a typical uncertainty
of a factor of 2. The observed [DNC]/[HNC] ratios range from 0.02 to 0.09,
assuming a [^12C]/[^13C] ratio of 60. The distributions of DNC and HN^13C are
generally similar to each other, whereas the distribution of H^13CO^+ is more
extended than those of DNC and HN^13C, indicating that DNC and HN^13C reside
in a more interior part of the cores than HCO^+. The [DNC]/[HN^13C] ratio is
rather constant within each core, although small systematic gradients are
observed in TMC-1 and L63. Notably, no such systematic gradient is found in
L1512 and L1544, where significant depletion of molecules is reported toward
the central part of the cores. This suggests that the [DNC]/[HNC] ratio is not
very sensitive to the depletion factor, unlike the [DCO^+]/[HCO^+] ratio. On
the other hand, the core-to-core variation of the [DNC]/[HNC] ratio, which
ranges over an order of magnitude, is more remarkable than the variation
within each core.
These results are interpreted qualitatively as a combination of three competing
time-dependent processes: gas-phase deuterium fractionation, depletion of
molecules onto grain surfaces, and dynamical evolution of a core.

Comment: 22 pages, 8 EPS figures, AASTeX 5.0, accepted to The Astrophysical Journal
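The rescaling behind the quoted [DNC]/[HNC] values is a one-line computation; a minimal sketch in Python, with the numbers taken from the abstract:

```python
# [DNC]/[HNC] follows from [DNC]/[HN13C] under the assumed isotope ratio:
# [DNC]/[HNC] = ([DNC]/[HN13C]) * ([HN13C]/[HNC]) = ([DNC]/[HN13C]) / 60.
c12_c13 = 60.0                      # assumed [12C]/[13C] abundance ratio
for dnc_hn13c in (1.25, 5.44):      # range quoted in the abstract
    dnc_hnc = dnc_hn13c / c12_c13
    print(f"[DNC]/[HN13C] = {dnc_hn13c:.2f} -> [DNC]/[HNC] = {dnc_hnc:.3f}")
# Prints ~0.021 and ~0.091, matching the quoted 0.02-0.09 range.
```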
Methods of Studying False Memory
The study of memory is one of those domains in psychology that has clear practical relevance. Think, for example, about people with Alzheimer's disease. The devastating dysfunction experienced by these patients makes it abundantly evident that memory plays an overarching and critical role in our daily life. However, in the study of memory there is another phenomenon that also carries enormous theoretical and practical implications, namely, memory illusions. That is, people frequently claim to remember details, or even entire events, that never actually happened. These false memories can have serious consequences when they appear in the testimony of witnesses, victims, or suspects in legal cases (Howe & Knott, 2015; Otgaar, De Ruiter, Howe, Hoetmer, & van Reekum, in press). A person, for example, might falsely remember that he or she was sexually abused as a child, and this memory illusion might lead to false accusations that result in wrongful convictions.
Fibre Bundles and Generalised Dimensional Reduction
We study some geometrical and topological aspects of the generalised
dimensional reduction of supergravities in D=11 and D=10 dimensions, which give
rise to massive theories in lower dimensions. In these reductions, a global
symmetry is used in order to allow some of the fields to have a non-trivial
dependence on the compactifying coordinates. Global consistency in the internal
space imposes topological restrictions on the parameters of the
compactification as well as the structure of the space itself. Examples that we
consider include the generalised reduction of the type IIA and type IIB
theories on a circle, and also the massive ten-dimensional theory obtained by
the generalised reduction of D=11 supergravity.

Comment: 23 pages, LaTeX
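Schematically, a generalised (Scherk-Schwarz) reduction of this kind gives a shift-symmetric field linear dependence on the compact coordinate (notation illustrative):

```latex
% Generalised reduction ansatz (schematic): a field \chi with a global
% shift symmetry acquires linear dependence on the circle coordinate z,
% and the slope m becomes a mass parameter of the lower-dimensional theory.
\chi(x, z) = \chi(x) + m\, z, \qquad \partial_z \chi = m .
```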
A Clifford analysis approach to superspace
A new framework for studying superspace is given, based on methods from
Clifford analysis. This leads to the introduction of both orthogonal and
symplectic Clifford algebra generators, allowing for an easy and canonical
introduction of a super-Dirac operator, a super-Laplace operator and the like.
This framework is then used to define a super-Hodge coderivative, which,
together with the exterior derivative, factorizes the Laplace operator.
Finally, both the cohomology of the exterior derivative and the homology of
the Hodge operator on the level of polynomial-valued super-differential forms
are studied. This leads to some interesting graphical representations and
provides better insight into the definition of the Berezin integral.

Comment: 15 pages, accepted for publication in Annals of Physics
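Schematically, the factorisation referred to above mirrors the classical Hodge-theoretic identity (the paper's version replaces the operators with their superspace counterparts; signs and conventions here are illustrative):

```latex
% Classical analogue of the factorisation: the Laplace operator as the
% anticommutator of the exterior derivative d and the Hodge coderivative
% \delta, itself built from d and the Hodge star.
\Delta = d\,\delta + \delta\, d = \{d, \delta\}, \qquad
\delta = \pm \star\, d\, \star \ \text{(sign depending on form degree and dimension)}.
```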
A Reference-Free Algorithm for Computational Normalization of Shotgun Sequencing Data
Deep shotgun sequencing and analysis of genomes, transcriptomes, amplified
single-cell genomes, and metagenomes has enabled investigation of a wide range
of organisms and ecosystems. However, sampling variation in short-read data
sets and high sequencing error rates of modern sequencers present many new
computational challenges in data interpretation. These challenges have led to
the development of new classes of mapping tools and {\em de novo} assemblers.
These algorithms are challenged by the continued improvement in sequencing
throughput. Here we describe digital normalization, a single-pass computational
algorithm that systematizes coverage in shotgun sequencing data sets, thereby
decreasing sampling variation, discarding redundant data, and removing the
majority of errors. Digital normalization substantially reduces the size of
shotgun data sets and decreases the memory and time requirements for {\em de
novo} sequence assembly, all without significantly impacting the content of the
generated contigs. We apply digital normalization to the assembly of microbial
genomic data, amplified single-cell genomic data, and transcriptomic data. Our
implementation is freely available for use and modification.
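A minimal sketch of the core idea in Python; the k-mer size, coverage cutoff, and exact dictionary are illustrative, whereas the published implementation uses a fixed-memory count-min sketch:

```python
# Sketch of digital normalization: stream reads once, estimate each
# read's coverage as the median count of its k-mers seen so far, and
# keep the read only if that estimate is below a cutoff.
from collections import defaultdict
from statistics import median

K = 20          # k-mer size (assumed)
CUTOFF = 20     # target coverage (assumed)

def kmers(seq, k=K):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def normalize(reads):
    counts = defaultdict(int)
    for read in reads:
        kms = kmers(read)
        if not kms:
            continue
        # The median k-mer count approximates this read's current coverage.
        if median(counts[km] for km in kms) < CUTOFF:
            for km in kms:
                counts[km] += 1   # only kept reads update the counts
            yield read

# Usage: kept = list(normalize(reads)); high-coverage reads are discarded,
# so total data shrinks while low-coverage regions are retained in full.
```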
