The hyperbolic lattice point problem in conjugacy classes
For Γ a cocompact or cofinite Fuchsian group, we study the hyperbolic lattice point problem in conjugacy classes, which is a modification of the classical hyperbolic lattice point problem. We use large sieve inequalities for the Riemann surfaces Γ∖ℍ to obtain average results for the error term, which are conjecturally optimal. We give a new proof of the error bound O(X^{2/3}), due to Good. For SL2(ℤ) we interpret our results in terms of indefinite quadratic forms.
Quenched disorder and spin-glass correlations in XY nematics
We present a theoretical study of the equilibrium ordering in a 3D XY nematic
system with quenched random disorder. Within this model, treated with the
replica trick and Gaussian variational method, the correlation length is
obtained as a function of the local nematic order parameter and the effective
disorder strength. These results clarify what happens in the limiting cases of
diminishing order parameter and disorder strength, that is near a phase
transition of a pure system. In particular, it is found that quenched disorder
is irrelevant as the order parameter tends to zero and hence does not change
the character of the continuous XY nematic to isotropic phase transition. We
discuss how these results compare with experiments and simulations.
Comment: 19 pages, 6 figures, corrected typo
An engine selection methodology for high fidelity serious games
Serious games represent the state-of-the-art in the convergence of electronic gaming technologies with instructional design principles and pedagogies. Whilst the selection criteria for entertainment game engines are often transparent, the range of available platforms and engines makes the choice of platform an emerging challenge for serious games, whose selection often has substantially different objectives and technical requirements depending upon context and usage. Additionally, the convergence of training simulations with serious gaming, made possible by increasing hardware rendering capacity, is enabling the creation of high-fidelity serious games which challenge existing design and instructional approaches. This paper highlights some of the differences between the technical requisites of high-fidelity serious and leisure games, and proposes a selection methodology based upon these emergent characteristics. A case study of part of a high-fidelity model of Ancient Rome is used to compare aspects of four different game engines according to elements defined in the proposed methodology.
Game engines selection framework for high-fidelity serious applications
Serious games represent the state-of-the-art in the convergence of electronic gaming technologies with instructional design principles and pedagogies. Despite the value of high-fidelity content in engaging learners and providing realistic training environments, building games which deliver high levels of visual and functional realism is a complex, time-consuming and expensive process. Therefore, commercial game engines, which provide a development environment and resources to more rapidly create high-fidelity virtual worlds, are increasingly used for serious as well as for entertainment applications. To this end, the authors propose a new framework for the selection of game engines for serious applications and set out five elements for the analysis of engines, in order to create a benchmarking approach to the validation of game engine selection. Selection criteria for game engines and the choice of platform for Serious Games are substantially different from those for entertainment games, as Serious Games have very different objectives, emphases and technical requirements. In particular, the convergence of training simulators with serious games, made possible by increasing hardware rendering capacity, is enabling the creation of high-fidelity serious games, which challenge existing instructional approaches. This paper overviews several game engines that are suitable for high-fidelity serious games, using the proposed framework.
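In spirit, the benchmarking approach described in the two abstracts above amounts to scoring candidate engines against weighted criteria. The sketch below is a minimal, hypothetical illustration: the criterion names, weights and scores are placeholders invented here, not the five analysis elements or any data from the proposed framework.

```python
# Hypothetical sketch of weighted multi-criteria engine comparison.
# Criteria, weights and scores are illustrative placeholders, NOT the
# five elements defined in the papers' framework.

CRITERIA_WEIGHTS = {
    "visual_fidelity": 0.30,
    "functional_fidelity": 0.25,
    "tooling": 0.20,
    "licensing_cost": 0.15,      # higher score = cheaper/friendlier license
    "platform_support": 0.10,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-10) into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Made-up scores for two unnamed engines.
engine_a = {"visual_fidelity": 9, "functional_fidelity": 7,
            "tooling": 8, "licensing_cost": 4, "platform_support": 6}
engine_b = {"visual_fidelity": 6, "functional_fidelity": 8,
            "tooling": 7, "licensing_cost": 9, "platform_support": 8}

# Rank the candidates, best first.
ranking = sorted({"A": engine_a, "B": engine_b}.items(),
                 key=lambda kv: weighted_score(kv[1]), reverse=True)
```

The interesting design question such a framework settles is the weighting itself: a serious-games project might weight functional fidelity and licensing differently from an entertainment title even when comparing the same engines.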
Nematic-Isotropic Transition with Quenched Disorder
Nematic elastomers do not show the discontinuous, first-order, phase
transition that the Landau-De Gennes mean field theory predicts for a
quadrupolar ordering in 3D. We attribute this behavior to the presence of
network crosslinks, which act as sources of quenched orientational disorder. We
show that the addition of weak random anisotropy results in a singular
renormalization of the Landau-De Gennes expression, adding an energy term
proportional to the inverse quartic power of order parameter Q. This reduces
the first-order discontinuity in Q. For sufficiently high disorder strength the
jump disappears altogether and the phase transition becomes continuous, in some
ways resembling the supercritical transitions in an external field.
Comment: 12 pages, 4 figures, to be published on PR
Levels of interaction: a user-guided experience in large-scale virtual environments
This paper investigates a range of challenges faced in the design of a serious game teaching history to a player immersed in an 'open' virtual environment. In the context of this paper, such an environment is described as an exploratory, expansive virtual world within which a user may interact in a non-linear, situated fashion with both the environment and virtual characters. The main contribution of this paper is the introduction of the levels of interaction (LoI), a novel framework designed to assist in the creation of interactions between the player and characters. The LoI approach also addresses the need to balance computational efficiency against believable, interactive virtual characters by allowing varying degrees of animation, display and, ultimately, interaction detail. This paper demonstrates the challenges faced when implementing such a technique, as well as the potential benefits it brings.
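The idea of varying animation, display and interaction detail per character resembles a level-of-detail selector. A minimal sketch under assumed semantics follows; the level names and distance thresholds are invented here for illustration and are not taken from the paper.

```python
# Hypothetical LoI-style selector: level names and thresholds are
# illustrative placeholders, not the paper's actual framework.
from enum import Enum

class InteractionLevel(Enum):
    BACKGROUND = 0    # coarse animation only, no dialogue
    AMBIENT = 1       # full animation, scripted reactions
    CONVERSATION = 2  # full detail, interactive dialogue

# (max_distance, level) pairs, checked nearest-first.
_THRESHOLDS = [(5.0, InteractionLevel.CONVERSATION),
               (25.0, InteractionLevel.AMBIENT)]

def level_for(distance_to_player: float) -> InteractionLevel:
    """Pick a character's interaction detail from its distance."""
    for max_dist, level in _THRESHOLDS:
        if distance_to_player <= max_dist:
            return level
    return InteractionLevel.BACKGROUND
```

In this shape, the computational saving comes from running expensive dialogue and animation systems only for the handful of characters near the player, while distant crowds stay at the cheap background level.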
Combined collider constraints on neutralinos and charginos
Searches for supersymmetric electroweakinos have entered a crucial phase, as
the integrated luminosity of the Large Hadron Collider is now high enough to
compensate for their weak production cross-sections. Working in a framework
where the neutralinos and charginos are the only light sparticles in the
Minimal Supersymmetric Standard Model, we use GAMBIT to perform a detailed
likelihood analysis of the electroweakino sector. We focus on the impacts of
recent ATLAS and CMS searches with 36 fb⁻¹ of 13 TeV proton-proton
collision data. We also include constraints from LEP and invisible decays of
the Z and Higgs bosons. Under the background-only hypothesis, we show that
current LHC searches do not robustly exclude any range of neutralino or
chargino masses. However, a pattern of excesses in several LHC analyses points
towards a possible signal, with the masses of the four neutralinos in the
ranges 8–155, 103–260, 130–473 and 219–502 GeV, and the masses of the two
charginos in the ranges 104–259 and 224–507 GeV, at the 95% confidence
level. The lightest neutralino is mostly bino, with a
possible modest Higgsino or wino component. We find that this excess has a
combined local significance of 3.3σ, subject to a number of cautions. If
one includes LHC searches for charginos and neutralinos conducted with 8 TeV
proton-proton collision data, the local significance is lowered to 2.9σ.
We briefly consider the implications for dark matter, finding that the correct
relic density can be obtained through the Higgs-funnel and Z-funnel
mechanisms, even assuming that all other sparticles are decoupled. All samples,
GAMBIT input files and best-fit models from this study are available on Zenodo.
Comment: 38 pages, 16 figures, v3 is the version accepted by EPJ
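The local significances quoted in the abstract above rest on standard Poisson counting statistics. As a toy illustration only (this is not GAMBIT's machinery, and the event counts below are invented), the significance of a single counting excess can be estimated from a likelihood ratio via Wilks' theorem:

```python
# Toy local significance for one Poisson counting excess. Real analyses
# (e.g. GAMBIT's) combine many signal regions with correlated
# systematics; the numbers here are made up for illustration.
import math

def poisson_logl(n_obs: int, mu: float) -> float:
    """Poisson log-likelihood for n_obs events, dropping the n! term."""
    return n_obs * math.log(mu) - mu

def local_significance(n_obs: int, background: float) -> float:
    """Z = sqrt(q0), where q0 is twice the log-likelihood ratio between
    the best-fit expectation (mu_hat = n_obs) and background only."""
    if n_obs <= background:
        return 0.0  # no excess, no significance
    q0 = 2.0 * (poisson_logl(n_obs, n_obs) - poisson_logl(n_obs, background))
    return math.sqrt(q0)
```

This reduces to the familiar Z = sqrt(2(n ln(n/b) − (n − b))); for example, 25 observed events on an expected background of 10 give a local significance just under 4σ.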
