Relevance-based Word Embedding
Learning a high-dimensional dense representation for vocabulary terms, also
known as a word embedding, has recently attracted much attention in natural
language processing and information retrieval tasks. The embedding vectors are
typically learned based on term proximity in a large corpus. This means that
the objective in well-known word embedding algorithms, e.g., word2vec, is to
accurately predict adjacent word(s) for a given word or context. However, this
objective is not necessarily equivalent to the goal of many information
retrieval (IR) tasks. The primary objective in various IR tasks is to capture
relevance instead of term proximity, syntactic, or even semantic similarity.
This is the motivation for developing unsupervised relevance-based word
embedding models that learn word representations based on query-document
relevance information. In this paper, we propose two learning models with
different objective functions; one learns a relevance distribution over the
vocabulary set for each query, and the other classifies each term as belonging
to the relevant or non-relevant class for each query. To train our models, we
used over six million unique queries and the top ranked documents retrieved in
response to each query, which are assumed to be relevant to the query. We
extrinsically evaluate our learned word representation models using two IR
tasks: query expansion and query classification. Both query expansion
experiments on four TREC collections and query classification experiments on
the KDD Cup 2005 dataset suggest that the relevance-based word embedding models
significantly outperform state-of-the-art proximity-based embedding models,
such as word2vec and GloVe.Comment: to appear in the proceedings of The 40th International ACM SIGIR
Conference on Research and Development in Information Retrieval (SIGIR '17
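The two objectives described above can be illustrated with a minimal sketch: the first model normalises query-word scores into a relevance distribution over the whole vocabulary, while the second treats each term as an independent binary relevant/non-relevant decision. All sizes, parameters, and targets below are stand-ins for illustration, not the paper's actual architecture or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM = 1000, 50  # hypothetical sizes, not from the paper

# Hypothetical parameters: a query representation and a word-embedding matrix.
W = rng.normal(scale=0.1, size=(VOCAB, DIM))   # word embeddings
q = rng.normal(scale=0.1, size=DIM)            # query representation

def softmax(z):
    z = z - z.max()                            # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Model 1: a relevance distribution over the vocabulary for the query.
p_rel = softmax(W @ q)                         # shape (VOCAB,), sums to 1

# Model 2: a per-term binary relevant/non-relevant probability.
p_bin = 1.0 / (1.0 + np.exp(-(W @ q)))         # shape (VOCAB,)

# Training would minimise cross-entropy against targets estimated from
# pseudo-relevant documents (the top-ranked results for each query).
target = softmax(rng.normal(size=VOCAB))       # stand-in target distribution
loss1 = -(target * np.log(p_rel + 1e-12)).sum()

labels = (rng.random(VOCAB) < 0.01).astype(float)  # stand-in relevance labels
loss2 = -(labels * np.log(p_bin + 1e-12)
          + (1 - labels) * np.log(1 - p_bin + 1e-12)).mean()
```

The two losses differ only in how relevance is modelled: a single distribution over terms versus independent per-term decisions.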
The Effect of CKD Fineness for Karbala Cement Plant on the Engineering Properties of Cement When Added as a Partial Replacement
Cement kiln dust (CKD) is a waste product of cement manufacturing, and managing its disposal has become an environmental challenge. To address this problem, researchers have sought effective ways to utilize it in various applications, one of which is adding it as a partial substitute for cement. The aim of this study is to investigate the effect of CKD fineness on the engineering properties of cement when CKD is used as a partial replacement. The CKD was ground in a jet mill and classified into four groups by fineness (3000, 6000, 8000, and 10000 cm2/gm); blends were then prepared with 5, 10, 15, 20, 25, 30, 35, and 40% replacement by CKD for each fineness. The results showed that increasing fineness leads to an increasing water demand for standard consistency, and that setting times (initial and final) are retarded as CKD fineness increases. The compressive strength of samples containing up to 20% CKD at a fineness above 6000 cm2/gm is enhanced. These results may stem from the increased specific-surface activity of the CKD compounds, which affects cement hydration and improves the packing of hydrated cement particles, yielding a denser and more compact hardened cement.
Keywords: Cement, CKD, fineness, consistency, setting time, compressive strength, Karbala
Higgs for Graviton: Simple and Elegant Solution
A Higgs mechanism for gravity is presented, where four scalars with global
Lorentz symmetry are employed. We show that in the broken symmetry phase a
graviton absorbs all the scalars and becomes a massive spin-2 particle with five
degrees of freedom. The resulting theory is unitary and free of ghosts.
Comment: 8 pages, references added. The decoupling of the ghost state is analyzed in detail
Supersymmetric QCD: Exact Results and Strong Coupling
We revisit two longstanding puzzles in supersymmetric gauge theories. The
first concerns the question of the holomorphy of the coupling, and related to
this the possible definition of an exact (NSVZ) beta function. The second
concerns instantons in pure gluodynamics, which appear to give sensible, exact
results for certain correlation functions, which nonetheless differ from those
obtained using systematic weak coupling expansions. For the first question, we
extend an earlier proposal of Arkani-Hamed and Murayama, showing that if their
regulated action is written suitably, the holomorphy of the couplings is
manifest, and it is easy to determine the renormalization scheme for which the
NSVZ formula holds. This scheme, however, is seen to be one of an infinite
class of schemes, each leading to an exact beta function; the NSVZ scheme,
while simple, is not selected by any compelling physical consideration. For the
second question, we explain why the instanton computation in the pure
supersymmetric gauge theory is not reliable, even at short distances. The
semiclassical expansion about the instanton is purely formal; if infrared
divergences appear, they spoil arguments based on holomorphy. We demonstrate
that infrared divergences do not occur in the perturbation expansion about the
instanton, but explain that there is no reason to think this captures all
contributions from the sector with unit topological charge. That one expects
additional contributions is illustrated by dilute gas corrections. These are
infrared divergent, and so difficult to define, but if non-zero give order one,
holomorphic, corrections to the leading result. Exploiting an earlier analysis
of Davies et al, we demonstrate that in the theory compactified on a circle of
radius beta, due to infrared effects, finite contributions indeed arise which
are not visible in the formal limit that beta goes to infinity.
Comment: 28 pages, two references added, one typo corrected
UV-Completion by Classicalization
We suggest a novel approach to UV-completion of a class of non-renormalizable
theories, according to which the high-energy scattering amplitudes get
unitarized by production of extended classical objects (classicalons), playing
a role analogous to black holes, in the case of non-gravitational theories. The
key property of classicalization is the existence of a classicalizer field that
couples to energy-momentum sources. Such localized sources are excited in
high-energy scattering processes and lead to the formation of classicalons. Two
kinds of natural classicalizers are Nambu-Goldstone bosons (or, equivalently,
longitudinal polarizations of massive gauge fields) and scalars coupled to
energy-momentum type sources. Classicalization has interesting phenomenological
applications for the UV-completion of the Standard Model both with or without
the Higgs. In the Higgsless Standard Model the high-energy scattering amplitudes
of longitudinal W-bosons self-unitarize via classicalization, without the
help of any new weakly-coupled physics. Alternatively, in the presence of a
Higgs boson, classicalization could explain the stabilization of the hierarchy.
In both scenarios the high-energy scatterings are dominated by the formation of
classicalons, which subsequently decay into many particle states. The
experimental signatures at the LHC are quite distinctive, with sharp
differences in the two cases.
Comment: 37 pages
Massive Gravity: Exorcising the Ghost
We consider Higgs massive gravity [1,2] and investigate whether a nonlinear
ghost in this theory can be avoided. We show that although the theory
considered in [10,11] is ghost free in the decoupling limit, the ghost
nevertheless reappears in the fourth order away from the decoupling limit. We
also demonstrate that there is no direct relation between the value of the
Vainshtein scale and the existence of a nonlinear ghost. We discuss how massive
gravity should be modified to avoid the appearance of the ghost.
Comment: 16 pages
Adhesion of volcanic ash particles under controlled conditions and implications for their deposition in gas turbines
A particular (representative) type of ash has been used in this study, having a particle size range of ~10-70 µm. Experimental particle adhesion rate data are considered in conjunction with CFD modelling of particle velocities and temperatures. This ash becomes soft above ~700°C, and it has been confirmed that a sharp increase is observed in the likelihood of adhesion as particle temperatures move into this range. Particle size is important, and particles in the approximate range 10-30 µm are most likely to adhere. This corresponds fairly closely with the size range that is most likely to enter a combustion chamber and turbine.
This work forms part of a research programme funded by EPSRC (EP/K027530/1). In conjunction with this project, a consortium of partners has been set up under the PROVIDA ("PROtection against Volcanic ash Induced Damage in Aeroengines") banner, and information about its operation is available at http://www.ccg.msm.cam.ac.uk/initiatives/provida. The invaluable assistance of Kevin Roberts (Materials Department in Cambridge) with operation of the plasma spray facility is gratefully acknowledged. The authors are also grateful to Dr. Margaret Hartley, of the University of Manchester, for kindly collecting the Laki ash (and several other types) during field trips to Iceland, which were funded by EasyJet.
This is the author accepted manuscript. The final version is available from Wiley via http://dx.doi.org/10.1002/adem.201500371. In compliance with current EPSRC requirements, input data for the modelling described in this paper, including meshing and boundary condition specifications, are available at the following URL: www.ccg.msm.cam.ac.uk/publications/resources. These files can be downloaded and used in COMSOL Multiphysics packages. Data supplied are for a representative case.
Ultrasensitive force and displacement detection using trapped ions
The ability to detect extremely small forces is vital for a variety of
disciplines including precision spin-resonance imaging, microscopy, and tests
of fundamental physical phenomena. Current force-detection sensitivity limits
have surpassed 1 aN/√Hz (atto = 10^-18) through coupling of micro- or
nanofabricated mechanical resonators to a variety of physical systems including
single-electron transistors, superconducting microwave cavities, and individual
spins. These experiments have allowed for probing studies of a variety of
phenomena, but sensitivity requirements are ever-increasing as new regimes of
physical interactions are considered. Here we show that trapped atomic ions are
exquisitely sensitive force detectors, with a measured sensitivity more than
three orders of magnitude better than existing reports. We demonstrate
detection of forces as small as 174 yN (yocto = 10^-24), with a
sensitivity of 390 yN/√Hz using crystals of Be+
ions in a Penning trap. Our technique is based on the excitation of normal
motional modes in an ion trap by externally applied electric fields, with
detection via phase-coherent Doppler velocimetry, which allows for the discrimination
of ion motion with amplitudes on the scale of nanometers. These experimental
results and extracted force-detection sensitivities in the single-ion limit
validate proposals suggesting that trapped atomic ions are capable of detecting
forces with sensitivity approaching 1 yN/√Hz. We anticipate that
this demonstration will be strongly motivational for the development of a new
class of deployable trapped-ion-based sensors, and will permit scientists to
access new regimes in materials science.
Comment: Expanded introduction and analysis. Methods section added. Subject to press embargo
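The relation between a quoted force sensitivity (in yN per root hertz) and the smallest resolvable force follows the standard white-noise averaging rule F_min = S/√τ; a minimal sketch, where the 5 s integration time is an illustrative assumption, not a figure from the paper:

```python
import math

def min_detectable_force(sensitivity, averaging_time):
    """Smallest resolvable force for a white-noise-limited detector.

    sensitivity    : force sensitivity in yN/sqrt(Hz)
    averaging_time : integration time in seconds
    Returns the minimum detectable force in yN, F_min = S / sqrt(tau).
    """
    return sensitivity / math.sqrt(averaging_time)

# With the ~390 yN/sqrt(Hz) sensitivity reported above, a few seconds of
# averaging brings the resolvable force down to the ~174 yN scale quoted
# in the abstract.
f_min = min_detectable_force(390.0, 5.0)
```

This illustrates why sensitivity, rather than a single force value, is the fundamental figure of merit: longer averaging trades bandwidth for resolution.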
On Non-Linear Actions for Massive Gravity
In this work we present a systematic construction of the potentially
ghost-free non-linear massive gravity actions. The most general action can be
regarded as a 2-parameter deformation of a minimal massive action. Further
extensions vanish in 4 dimensions. The general mass term is constructed in
terms of a "deformed" determinant from which this property can clearly be seen.
In addition, our formulation identifies non-dynamical terms that appear in
previous constructions and which do not contribute to the equations of motion.
We elaborate on the formal structure of these theories as well as some of their
implications.
Comment: v3: 22 pages, minor comments added, version to appear in JHEP
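The "deformed determinant" structure mentioned above can be sketched via the expansion of a determinant in elementary symmetric polynomials e_n; the matrix K and the square-bracket trace notation below are generic placeholders, not necessarily the paper's own conventions:

```latex
% det(1 + K) expanded in elementary symmetric polynomials e_n(K);
% in 4 dimensions e_n(K) = 0 for n > 4, which is why further
% extensions of the mass term vanish there.
\det\!\left(\mathbb{1} + \mathcal{K}\right) = \sum_{n=0}^{4} e_n(\mathcal{K}),
\qquad
e_0 = 1, \quad
e_1 = [\mathcal{K}], \quad
e_2 = \tfrac{1}{2}\!\left([\mathcal{K}]^2 - [\mathcal{K}^2]\right), \quad \dots
```

where [K] denotes the trace of K. The finite truncation of this sum in 4 dimensions makes the claimed absence of further extensions manifest.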