Four-modulus "Swiss Cheese" chiral models
We study the 'Large Volume Scenario' on explicit, new, compact, four-modulus
Calabi-Yau manifolds. We pay special attention to the chirality problem pointed
out by Blumenhagen, Moster and Plauschinn. Namely, we thoroughly analyze the
possibility of generating neutral, non-perturbative superpotentials from
Euclidean D3-branes in the presence of chirally intersecting D7-branes. We find
that taking proper account of the Freed-Witten anomaly on non-spin cycles and
of the Kaehler cone conditions imposes severe constraints on the models.
Nevertheless, we are able to create setups where the constraints are solved,
and up to three moduli are stabilized. Comment: 40 pages, 10 figures, clarifying comments added, minor mistakes corrected
The Statistics of Supersymmetric D-brane Models
We investigate the statistics of the phenomenologically important D-brane
sector of string compactifications. In particular for the class of intersecting
D-brane models, we generalise methods known from number theory to determine the
asymptotic statistical distribution of solutions to the tadpole cancellation
conditions. Our approach allows us to compute the statistical distribution of
gauge theoretic observables like the rank of the gauge group, the number of
chiral generations or the probability of an SU(N) gauge factor. Concretely, we
study the statistics of intersecting branes on T^2, T^4/Z_2, and T^6/Z_2 x
Z_2 orientifolds. Intriguingly, we find a statistical correlation between the
rank of the gauge group and the number of chiral generations. Finally, we
combine the statistics of the gauge theory sector with the statistics of the
flux sector and study how distributions of gauge theoretic quantities are
affected. Comment: 62 pages, 31 figures, harvmac; v3: sections 3.2 + 3.7 added, figs. 7, 28, 29 added, figs. 24, 25, 26 corrected, refs. added, typos corrected
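As an illustrative sketch of the kind of counting the abstract describes (a toy model with invented variables, not the paper's actual tadpole-cancellation conditions on T^2): treat each brane stack as a multiplicity N_a (gauge factor U(N_a)) times a positive integer charge l_a, impose sum_a N_a l_a = L, and read off the distribution of the total gauge-group rank sum_a N_a over all solutions.

```python
from itertools import product
from collections import Counter

# Toy model (NOT the paper's actual tadpole conditions): a brane stack is a
# pair (N_a, l_a); the "tadpole" condition fixes sum_a N_a * l_a = L.
def solutions(L, max_stacks=4):
    """Enumerate unordered stack configurations saturating the tadpole L."""
    results = []
    def extend(remaining, min_pair, stacks):
        if remaining == 0:
            results.append(tuple(stacks))
            return
        if len(stacks) == max_stacks:
            return
        for N, l in product(range(1, remaining + 1), repeat=2):
            # enforce nondecreasing pairs so each multiset is counted once
            if (N, l) >= min_pair and N * l <= remaining:
                extend(remaining - N * l, (N, l), stacks + [(N, l)])
    extend(L, (1, 1), [])
    return results

def rank_distribution(L):
    """Distribution of the total gauge-group rank sum_a N_a over solutions."""
    return Counter(sum(N for N, _ in sol) for sol in solutions(L))
```

The number-theoretic methods in the paper extract the large-L asymptotics of exactly this kind of distribution, which brute-force enumeration can only probe at small L.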
How to Find More Supernovae with Less Work: Object Classification Techniques for Difference Imaging
We present the results of applying new object classification techniques to
difference images in the context of the Nearby Supernova Factory supernova
search. Most current supernova searches subtract reference images from new
images, identify objects in these difference images, and apply simple threshold
cuts on parameters such as statistical significance, shape, and motion to
reject objects such as cosmic rays, asteroids, and subtraction artifacts.
Although most static objects subtract cleanly, even a very low false positive
detection rate can lead to hundreds of non-supernova candidates which must be
vetted by human inspection before triggering additional followup. In comparison
to simple threshold cuts, more sophisticated methods such as Boosted Decision
Trees, Random Forests, and Support Vector Machines provide dramatically better
object discrimination. At the Nearby Supernova Factory, we reduced the number
of non-supernova candidates by a factor of 10 while increasing our supernova
identification efficiency. Methods such as these will be crucial for
maintaining a reasonable false positive rate in the automated transient alert
pipelines of upcoming projects such as Pan-STARRS and LSST. Comment: 25 pages; 6 figures; submitted to Ap
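The gain the abstract reports — an ensemble of decision trees outperforming single threshold cuts — can be sketched with a toy bagged-stump classifier on synthetic candidates carrying the three features the abstract names (significance, shape, motion). This is an illustrative stand-in, not the Supernova Factory's actual pipeline or data.

```python
import random

# Synthetic stand-in for difference-image candidates (NOT real survey data):
# features are (significance, shape/elongation, motion).
random.seed(0)

def make_candidate(is_sn):
    if is_sn:  # supernovae: significant, round, static
        return (random.gauss(8.0, 2.0), random.gauss(1.0, 0.2),
                random.gauss(0.0, 0.1)), 1
    # artifacts / cosmic rays / asteroids: fainter, elongated, or moving
    return (random.gauss(4.0, 2.0), random.gauss(1.8, 0.5),
            random.gauss(0.5, 0.4)), 0

data = [make_candidate(i % 2 == 0) for i in range(200)]
train, test = data[:150], data[150:]

def best_stump(sample):
    """Weak learner: the single-feature threshold cut with lowest error."""
    best = None
    for f in range(3):
        for x, _ in sample:
            for pol in (1, -1):
                err = sum((1 if pol * (xi[f] - x[f]) > 0 else 0) != y
                          for xi, y in sample)
                if best is None or err < best[0]:
                    best = (err, f, x[f], pol)
    return best[1:]

def forest(sample, n_trees=15):
    """Bagging: each stump trains on a bootstrap resample -- a crude
    random-forest analogue, for illustration only."""
    return [best_stump(random.choices(sample, k=len(sample)))
            for _ in range(n_trees)]

def predict(stumps, x):
    votes = sum(1 if pol * (x[f] - t) > 0 else 0 for f, t, pol in stumps)
    return 1 if 2 * votes > len(stumps) else 0

stumps = forest(train)
accuracy = sum(predict(stumps, x) == y for x, y in test) / len(test)
```

The key design point mirrors the abstract: a majority vote over many weak cuts carves out a far better decision boundary in feature space than any single threshold, which is what drives down the human-vetting load.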
Constraining the string scale: from Planck to Weak and back again
String and field theory ideas have greatly influenced each other since the so
called second string revolution. We review this interrelation paying particular
attention to its phenomenological implications. Our guiding principle is the
radical shift in the way that we think about the fundamental scale, in
particular the way in which string models have been able to accommodate values
from the Planck scale, around 10^19 GeV, down to the electroweak scale, around
a TeV. Comment: Invited review aimed at an experimental audience
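The textbook mechanism behind such a shift (a standard illustration, not text drawn from this review) is the large-extra-dimension relation: with n compact dimensions of volume V_n, the observed 4d Planck scale is a derived quantity, so a low fundamental scale M_* is compatible with M_Pl ~ 10^19 GeV if V_n is large in fundamental units.

```latex
% Standard large-extra-dimension relation (illustrative):
% M_* = fundamental (string/gravity) scale, V_n = volume of the
% n compact extra dimensions.
\[
  M_{\mathrm{Pl}}^{2} \;\sim\; M_{*}^{\,2+n}\, V_{n}
\]
% For M_* ~ 1 TeV this requires V_n to be exponentially large in
% fundamental units, e.g. sub-millimetre radii for n = 2.
```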
Yukawa Corrections from Four-Point Functions in Intersecting D6-Brane Models
We discuss corrections to the Yukawa matrices of the Standard Model (SM)
fermions in intersecting D-brane models due to four-point interactions.
Recently, an intersecting D-brane model has been found where it is possible to
obtain correct masses and mixings for all quarks as well as the tau lepton.
However, the masses for the first two charged leptons come close to the right
values but are not quite correct. Since the electron and muon are quite light,
it is likely that there are additional corrections to their masses which cannot
be neglected. With this in mind, we consider contributions to the SM fermion
mass matrices from four-point interactions. In an explicit model, we show that
it is indeed possible to obtain the SM fermion masses and mixings which are a
better match to those resulting from experimental data extrapolated at the
unification scale when these corrections are included. These corrections may
have broader application to other models. Comment: 24 pages, 4 figures
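The effect the abstract describes — a small correction shifting the light fermion masses toward their measured values — can be illustrated numerically with an invented 2x2 charged-lepton Yukawa block (all numbers are made up for illustration; this is not the paper's model). Physical masses are the singular values of the Yukawa matrix times the electroweak VEV, so a correction dY that is small in absolute terms can still move the lightest eigenvalue substantially.

```python
import math

def singular_values(M):
    """Singular values of a real 2x2 matrix via eigenvalues of M M^T."""
    a, b = M[0]
    c, d = M[1]
    A = [[a*a + b*b, a*c + b*d], [a*c + b*d, c*c + d*d]]
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] ** 2
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    return sorted(math.sqrt(max(x, 0.0)) for x in ((tr - disc) / 2,
                                                   (tr + disc) / 2))

v = 174.0                                   # GeV, electroweak VEV (rough)
Y0 = [[1.0e-6, 0.0], [0.0, 5.5e-4]]         # leading-order Yukawas (toy)
dY = [[2.0e-6, 1.0e-5], [1.0e-5, 5.0e-5]]   # "four-point" correction (toy)

m0 = [v * s for s in singular_values(Y0)]   # masses before the correction
Y1 = [[Y0[i][j] + dY[i][j] for j in range(2)] for i in range(2)]
m1 = [v * s for s in singular_values(Y1)]   # masses after the correction
```

With these invented numbers the lighter eigenvalue is pulled up by the off-diagonal correction while the heavier one barely moves — the qualitative behaviour the abstract invokes for the electron and muon.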
The Hanabi Challenge: A New Frontier for AI Research
From the early days of computing, games have been important testbeds for
studying how well machines can do sophisticated decision making. In recent
years, machine learning has made dramatic advances with artificial agents
reaching superhuman performance in challenge domains like Go, Atari, and some
variants of poker. As with their predecessors of chess, checkers, and
backgammon, these game domains have driven research by providing sophisticated
yet well-defined challenges for artificial intelligence practitioners. We
continue this tradition by proposing the game of Hanabi as a new challenge
domain with novel problems that arise from its combination of purely
cooperative gameplay with two to five players and imperfect information. In
particular, we argue that Hanabi elevates reasoning about the beliefs and
intentions of other agents to the foreground. We believe developing novel
techniques for such theory of mind reasoning will not only be crucial for
success in Hanabi, but also in broader collaborative efforts, especially those
with human partners. To facilitate future research, we introduce the
open-source Hanabi Learning Environment, propose an experimental framework for
the research community to evaluate algorithmic advances, and assess the
performance of current state-of-the-art techniques. Comment: 32 pages, 5 figures, in press (Artificial Intelligence)
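The belief reasoning the abstract foregrounds can be sketched in miniature (this is a hand-rolled toy, not the Hanabi Learning Environment's API): in Hanabi each colour contains ranks 1,1,1,2,2,3,3,4,4,5, a player cannot see their own cards, and a colour hint plus the publicly visible cards of that colour pin down a posterior over the hinted card's rank.

```python
from collections import Counter

# Copies of each rank per colour in a standard Hanabi deck.
DECK_RANKS = Counter({1: 3, 2: 2, 3: 2, 4: 2, 5: 1})

def rank_posterior(hinted_color, visible):
    """P(rank) for a card hinted as `hinted_color`, given the publicly
    visible (color, rank) cards already removed from the deck."""
    remaining = Counter(DECK_RANKS)
    for color, rank in visible:
        if color == hinted_color:
            remaining[rank] -= 1      # that copy cannot be my card
    total = sum(remaining.values())
    return {r: n / total for r, n in remaining.items() if n > 0}
```

Full Hanabi play layers intention on top of this card counting — a good hint is chosen *because* of what the receiver will infer from it — which is why the authors frame the game as a theory-of-mind challenge rather than a pure filtering problem.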
A Domain-Independent Algorithm for Plan Adaptation
The paradigms of transformational planning, case-based planning, and plan
debugging all involve a process known as plan adaptation - modifying or
repairing an old plan so it solves a new problem. In this paper we provide a
domain-independent algorithm for plan adaptation, demonstrate that it is sound,
complete, and systematic, and compare it to other adaptation algorithms in the
literature. Our approach is based on a view of planning as searching a graph of
partial plans. Generative planning starts at the graph's root and moves from
node to node using plan-refinement operators. In planning by adaptation, a
library plan - an arbitrary node in the plan graph - is the starting point for
the search, and the plan-adaptation algorithm can apply both the same
refinement operators available to a generative planner and can also retract
constraints and steps from the plan. Our algorithm's completeness ensures that
the adaptation algorithm will eventually search the entire graph and its
systematicity ensures that it will do so without redundantly searching any
parts of the graph. Comment: See http://www.jair.org/ for any accompanying files
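The plan-graph view above can be sketched as a tiny search (an illustrative toy, not the paper's actual algorithm): represent a plan as a frozenset of constraints, let refinement add a constraint and retraction remove one, start from an arbitrary library plan, and keep a visited set so no node is expanded twice — the analogue of systematicity — while exhausting the finite graph gives completeness.

```python
from collections import deque

def adapt(library_plan, goal_test, constraints):
    """Breadth-first adaptation over the graph of constraint sets.
    Refinement operators add a constraint; retraction operators remove
    one, so the search can move 'backwards' from the library plan."""
    start = frozenset(library_plan)
    frontier = deque([start])
    visited = {start}                 # systematicity: never revisit a node
    while frontier:
        plan = frontier.popleft()
        if goal_test(plan):
            return plan
        neighbors = [plan | {c} for c in constraints if c not in plan]  # refine
        neighbors += [plan - {c} for c in plan]                         # retract
        for nxt in neighbors:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(nxt)
    return None                       # finite graph exhausted: no solution
```

Starting from the empty plan recovers generative planning as a special case, which mirrors the paper's point that adaptation generalises the generative search rather than replacing it.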
A Model For Implementing An Optimized Casino Degree Curriculum Within The Two-Year College
A research project was undertaken to develop a casino degree curriculum model for two-year colleges. The study began with an overview of the gambling industry and the advent of gaming curricula at various institutions of higher learning. The casino model was developed using Bloom's Taxonomy for course structure and Kalani's Model for curricular framework. Additional studies were conducted on curriculum development methods and various other curriculum designs and models. The research design included data from two questionnaires and one personal interview instrument. Further data were provided by gaming employment statistics, gaming revenue statistics, and proprietary gaming school programs. These data were used to develop a proposed Casino Curriculum Model for Clark County Community College in Las Vegas. The model utilized a four-step approach to curriculum design encompassing (1) a Demand Factor, (2) a Selection Factor, (3) a Skills and Knowledge Factor, and (4) a Curriculum Factor. The Demand Factor determined the various gaming occupations available; the Selection Factor determined the highest employment opportunities; the Skills and Knowledge Factor determined core and specialized learning; and the Curriculum Factor determined basic curricular elements. Also shown were model variations for specialized programs and a comparison between the proposed and existing CCCC curriculum models. Recommendations included the development of various gaming certificates and degrees, further studies on the potential of gaming programs in other institutions and locales, and a greater vocational/technical emphasis for CCCC.