Beta Reduction is Invariant, Indeed (Long Version)
Slot and van Emde Boas' weak invariance thesis states that reasonable
machines can simulate each other within a polynomial overhead in time. Is the
λ-calculus a reasonable machine? Is there a way to measure the
computational complexity of a λ-term? This paper presents the first
complete positive answer to this long-standing problem. Moreover, our answer is
completely machine-independent and based on a standard notion in the theory
of the λ-calculus: the length of a leftmost-outermost derivation to normal
form is an invariant cost model. Such a theorem cannot be proved by directly
relating the λ-calculus with Turing machines or random access machines,
because of the size explosion problem: there are terms that in a linear number
of steps produce an exponentially long output. The first step towards the
solution is to shift to a notion of evaluation for which the length and the
size of the output are linearly related. This is done by adopting the linear
substitution calculus (LSC), a calculus of explicit substitutions modelled
after linear logic and proof-nets and admitting a decomposition of
leftmost-outermost derivations with the desired property. Thus, the LSC is
invariant with respect to, say, random access machines. The second step is to
show that the LSC is invariant with respect to the λ-calculus. The size
explosion problem seems to imply that this is not possible: having the same
notions of normal form, evaluation in the LSC is exponentially longer than in
the λ-calculus. We solve this impasse by introducing a new form of
shared normal form and shared reduction, deemed useful. Useful evaluation
avoids those steps that only unshare the output without contributing to
β-redexes, i.e., the steps that cause the blow-up in size.
Comment: 29 pages
(Leftmost-Outermost) Beta Reduction is Invariant, Indeed
Slot and van Emde Boas' weak invariance thesis states that reasonable
machines can simulate each other within a polynomial overhead in time. Is
lambda-calculus a reasonable machine? Is there a way to measure the
computational complexity of a lambda-term? This paper presents the first
complete positive answer to this long-standing problem. Moreover, our answer is
completely machine-independent and based on a standard notion in the theory
of lambda-calculus: the length of a leftmost-outermost derivation to normal
form is an invariant cost model. Such a theorem cannot be proved by directly
relating lambda-calculus with Turing machines or random access machines,
because of the size explosion problem: there are terms that in a linear number
of steps produce an exponentially long output. The first step towards the
solution is to shift to a notion of evaluation for which the length and the
size of the output are linearly related. This is done by adopting the linear
substitution calculus (LSC), a calculus of explicit substitutions modeled after
linear logic proof nets and admitting a decomposition of leftmost-outermost
derivations with the desired property. Thus, the LSC is invariant with respect
to, say, random access machines. The second step is to show that LSC is
invariant with respect to the lambda-calculus. The size explosion problem seems
to imply that this is not possible: having the same notions of normal form,
evaluation in the LSC is exponentially longer than in the lambda-calculus. We
solve such an impasse by introducing a new form of shared normal form and
shared reduction, deemed useful. Useful evaluation avoids those steps that only
unshare the output without contributing to beta-redexes, i.e., the steps that
cause the blow-up in size. The main technical contribution of the paper is
indeed the definition of useful reductions and the thorough analysis of their
properties.
Comment: arXiv admin note: substantial text overlap with arXiv:1405.331
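The size explosion problem invoked in both abstracts can be made concrete with a toy example (my own illustration, not the paper's formal family of terms): iterating the duplicator δ = λx. x x, each beta step doubles the size of its argument, so n steps starting from a variable yield a normal form of size 2^n while the starting term has size linear in n.

```python
# Toy illustration of size explosion (an assumed encoding, not the
# paper's construction): terms are strings (variables),
# ('lam', x, body) abstractions, or ('app', t, u) applications.

def size(t):
    if isinstance(t, str):
        return 1
    if t[0] == 'lam':
        return 1 + size(t[2])
    return size(t[1]) + size(t[2])

def subst(t, x, u):
    # Naive, capture-unsafe substitution; safe here because the only
    # binder in play is the duplicator's own 'x'.
    if isinstance(t, str):
        return u if t == x else t
    if t[0] == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, u))
    return ('app', subst(t[1], x, u), subst(t[2], x, u))

delta = ('lam', 'x', ('app', 'x', 'x'))  # the duplicator \x. x x

t = 'y'
for _ in range(10):                  # ten innermost beta steps
    t = subst(delta[2], 'x', t)      # contract (\x. x x) t  ->  t t

print(size(t))                       # 1024 = 2**10
```

Ten steps produce a term of size 1024; plain leftmost-outermost evaluation of the same family instead duplicates redexes before firing them, which is exactly the mismatch the shared ("useful") evaluation of the LSC is designed to avoid.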
Exact beta function from the holographic loop equation of large-N QCD_4
We construct and study a previously defined quantum holographic effective
action whose critical equation implies the holographic loop equation of large-N
QCD_4 for planar self-avoiding loops in a certain regularization scheme. We
extract from the effective action the exact beta function in the given scheme.
For the Wilsonian coupling constant the beta function is exactly one-loop and
the first coefficient agrees with its value in perturbation theory. For the
canonical coupling constant the exact beta function has a NSVZ form and the
first two coefficients agree with their values in perturbation theory.
Comment: 42 pages, LaTeX. The exponent of the Vandermonde determinant in the
quantum effective action has been changed, because a holomorphic rather than
a Hermitian resolution of identity has been employed in the functional
integral. Beta function unchanged. New explanations and references added;
typos corrected.
On emergence in gauge theories at the 't Hooft limit
The aim of this paper is to contribute to a better conceptual understanding
of gauge quantum field theories, such as quantum chromodynamics, by discussing
a famous physical limit, the 't Hooft limit, in which the theory concerned
often simplifies.
The idea of the limit is that the number of colours (or charges) goes to
infinity. The simplifications that can happen in this limit, and that we will
consider, are: (i) the theory's Feynman diagrams can be drawn on a plane
without lines intersecting (called `planarity'); and (ii) the theory, or a
sector of it, becomes integrable, and indeed corresponds to a well-studied
system, viz. a spin chain. Planarity is important because it shows how a
quantum field theory can exhibit extended, in particular string-like,
structures; in some cases, this gives a connection with string theory, and thus
with gravity.
Previous philosophical literature about how one theory (or a sector, or
regime, of a theory) might be emergent from, and/or reduced to, another one has
tended to emphasize cases, such as occur in statistical mechanics, where the
system before the limit has finitely many degrees of freedom. But here, our
quantum field theories, including those on the way to the 't Hooft limit, will
have infinitely many degrees of freedom.
Nevertheless, we will show how a recent schema by Butterfield and taxonomy by
Norton apply to the quantum field theories we consider; and we will classify
three physical properties of our theories in these terms. These properties are
planarity and integrability, as in (i) and (ii) above; and the behaviour of the
beta-function reflecting, for example, asymptotic freedom.
Our discussion of these properties, especially the beta-function, will also
relate to recent philosophical debate about the propriety of assessing quantum
field theories, whose rigorous existence is not yet proven.
Comment: 44 pp. arXiv admin note: text overlap with arXiv:1012.3983,
arXiv:hep-ph/9802419, arXiv:1012.3997 by other authors
Dimensionally reduced gravity theories are asymptotically safe
4D Einstein gravity coupled to scalars and abelian gauge fields in its
2-Killing vector reduction is shown to be quasi-renormalizable to all loop
orders at the expense of introducing infinitely many essential couplings. The
latter can be combined into one or two functions of the `area radius'
associated with the two Killing vectors. The renormalization flow of these
couplings is governed by beta functionals expressible in closed form in terms
of the (one coupling) beta function of a symmetric space sigma-model.
Generically, the matter-coupled systems are asymptotically safe; that is, the
flow possesses a non-trivial UV stable fixed point at which the trace anomaly
vanishes. The main exception is a minimal coupling of 4D Einstein gravity to
massless free scalars, in which case the scalars decouple from gravity at the
fixed point.
Comment: 47 pages, LaTeX, 1 figure
A formulation of the Yang-Mills theory as a deformation of a topological field theory based on background field method and quark confinement problem
By making use of the background field method, we derive a novel reformulation
of the Yang-Mills theory which was proposed recently by the author to derive
quark confinement in QCD. This reformulation identifies the Yang-Mills theory
with a deformation of a topological quantum field theory. The relevant
background is given by the topologically non-trivial field configuration,
especially, the topological soliton which can be identified with the magnetic
monopole current in four dimensions. We argue that the gauge fixing term
becomes dynamical and that the gluon mass generation takes place by a
spontaneous breakdown of the hidden supersymmetry caused by the dimensional
reduction. We also propose a numerical simulation to confirm the validity of
the scheme we have proposed. Finally we point out that the gauge fixing part
may have a geometric meaning from the viewpoint of global topology where the
magnetic monopole solution represents the critical point of a Morse function in
the space of field configurations.
Comment: 45 pages, 3 figures included in LaTeX
Changes in the microsomal proteome of tomato fruit during ripening
The variations in the membrane proteome of tomato fruit pericarp during ripening have been investigated by mass spectrometry-based label-free proteomics. Mature green (MG30) and red ripe (R45) stages were chosen because they are pivotal in the ripening process: MG30 corresponds to the end of cellular expansion, when fruit growth has stopped and fruit starts ripening, whereas R45 corresponds to the mature fruit. Protein patterns were markedly different: among the 1315 proteins identified with at least two unique peptides, 145 significantly varied in abundance in the process of fruit ripening. The subcellular and biochemical fractionation resulted in GO term enrichment for organelle proteins in our dataset, and allowed the detection of low-abundance proteins that were not detected in previous proteomic studies on tomato fruits. Functional annotation showed that the largest proportion of identified proteins were involved in cell wall metabolism, vesicle-mediated transport, hormone biosynthesis, secondary metabolism, lipid metabolism, protein synthesis and degradation, carbohydrate metabolic processes, signalling and response to stress.
Dynamic texture recognition using time-causal and time-recursive spatio-temporal receptive fields
This work presents a first evaluation of using spatio-temporal receptive
fields from a recently proposed time-causal spatio-temporal scale-space
framework as primitives for video analysis. We propose a new family of video
descriptors based on regional statistics of spatio-temporal receptive field
responses and evaluate this approach on the problem of dynamic texture
recognition. Our approach generalises a previously used method, based on joint
histograms of receptive field responses, from the spatial to the
spatio-temporal domain and from object recognition to dynamic texture
recognition. The time-recursive formulation enables computationally efficient
time-causal recognition. The experimental evaluation demonstrates competitive
performance compared to the state of the art. In particular, it is shown that binary
versions of our dynamic texture descriptors achieve improved performance
compared to a large range of similar methods using different primitives either
handcrafted or learned from data. Further, our qualitative and quantitative
investigation into parameter choices and the use of different sets of receptive
fields highlights the robustness and flexibility of our approach. Together,
these results support the descriptive power of this family of time-causal
spatio-temporal receptive fields, validate our approach for dynamic texture
recognition and point towards the possibility of designing a range of video
analysis methods based on these new time-causal spatio-temporal primitives.
Comment: 29 pages, 16 figures
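The descriptor idea in the abstract above — regional joint histograms of binarized receptive field responses — can be sketched in miniature. The sketch below is a toy (it substitutes two first-order difference filters for the paper's time-causal scale-space receptive fields, and uses a single global region); only the histogram-of-binarized-responses structure is taken from the abstract.

```python
# Toy sketch of a binary joint-histogram video descriptor. Assumption:
# a temporal and a spatial finite difference stand in for the paper's
# time-causal receptive fields.

def descriptor(video):
    """video[t][y][x]: nested lists of grey values; returns a 4-bin
    normalized joint histogram of the signs of a temporal and a
    spatial derivative response over the whole video region."""
    hist = [0, 0, 0, 0]
    T, Y, X = len(video), len(video[0]), len(video[0][0])
    for t in range(1, T):
        for y in range(Y):
            for x in range(1, X):
                dt = video[t][y][x] - video[t - 1][y][x]  # temporal response
                dx = video[t][y][x] - video[t][y][x - 1]  # spatial response
                hist[2 * (dt > 0) + (dx > 0)] += 1        # joint binary bin
    n = sum(hist)
    return [h / n for h in hist]
```

Two videos can then be compared by any histogram distance (e.g. chi-squared) between their descriptors; the real method uses many receptive fields and finer quantization, but the recognition pipeline has this shape.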
On the Trace Anomaly and the Anomaly Puzzle in N=1 Pure Yang-Mills
The trace anomaly of the energy-momentum tensor is usually quoted in the form
which is proportional to the beta function of the theory. However, there are in
general many definitions of gauge couplings depending on renormalization
schemes, and hence many beta functions. In particular, N=1 supersymmetric pure
Yang-Mills has the holomorphic gauge coupling whose beta function is one-loop
exact, and the canonical gauge coupling whose beta function is given by the
Novikov-Shifman-Vainshtein-Zakharov beta function. In this paper, we study
which beta function should appear in the trace anomaly in N=1 pure Yang-Mills.
We calculate the trace anomaly by employing the N=4 regularization of N=1 pure
Yang-Mills. It is shown that the trace anomaly is given by one-loop exact form
if the composite operator appearing in the trace anomaly is renormalized in a
preferred way. This result gives the simplest resolution to the anomaly puzzle
in N=1 pure Yang-Mills. The most important point is to examine in which scheme
the quantum action principle is valid, which is crucial in the derivation of
the trace anomaly.
Comment: 25 pages, 1 figure; v2: slight correction in sec. 5, minor addition in
appendix
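For reference, the two beta functions contrasted in the abstract above take the following standard forms for pure N=1 super-Yang-Mills with gauge group SU(N) (standard background results, not taken from this listing):

```latex
% Holomorphic coupling: one-loop exact running.
\[
  \mu \frac{d}{d\mu}\,\frac{8\pi^2}{g_h^2} \;=\; 3N .
\]
% Canonical coupling: NSVZ form, with \alpha = g^2/4\pi.
\[
  \beta(\alpha) \;=\; -\,\frac{3N}{1 - N\alpha/2\pi}\,\frac{\alpha^2}{2\pi} .
\]
```

The anomaly puzzle is precisely the question of which of these two runnings multiplies the gauge kinetic operator in the trace anomaly.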