The Dirac equation, the concept of quanta, and the description of interactions in quantum electrodynamics
In this article the Dirac equation is used as a guideline to the historical emergence of the concept of quanta, associated with the quantum field. In Pascual Jordan’s approach, electrons as quanta result from the quantization of a classical field described by the Dirac equation. With this quantization procedure – also used for the electromagnetic field – the concept of quanta becomes a central piece of quantum electrodynamics. This does not, however, avoid the apparent impossibility of using the concept of quanta when interacting fields are considered together as a closed system. In this article it is argued that the type of analysis that leads to such drastic conclusions can be avoided if we look beyond the mathematical structure of the theory and take into account the physical ideas inscribed in that mathematical structure. In this case we see that in quantum electrodynamics we are not considering a closed system of interacting fields; what we have is a description of the interactions between distinct fields. In this situation the concept of quanta is central, the Fock space being the natural mathematical structure that permits maintaining the concept of quanta when considering the interaction between the fields.
Approximate Computation and Implicit Regularization for Very Large-scale Data Analysis
Database theory and database practice are typically the domain of computer
scientists who adopt what may be termed an algorithmic perspective on their
data. This perspective is very different from the more statistical perspective
adopted by statisticians, computational scientists, machine learners, and others who
work on what may be broadly termed statistical data analysis. In this article,
I will address fundamental aspects of this algorithmic-statistical disconnect,
with an eye to bridging the gap between these two very different approaches. A
concept that lies at the heart of this disconnect is that of statistical
regularization, a notion that has to do with how robust the output of an
algorithm is to the noise properties of the input data. Although it is nearly
completely absent from computer science, which historically has taken the input
data as given and modeled algorithms discretely, regularization in one form or
another is central to nearly every application domain that applies algorithms
to noisy data. By using several case studies, I will illustrate, both
theoretically and empirically, the nonobvious fact that approximate
computation, in and of itself, can implicitly lead to statistical
regularization. This and other recent work suggests that, by exploiting in a
more principled way the statistical properties implicit in worst-case
algorithms, one can in many cases satisfy the bicriteria of having algorithms
that are scalable to very large-scale databases and that also have good
inferential or predictive properties.
Comment: To appear in the Proceedings of the 2012 ACM Symposium on Principles of Database Systems (PODS 2012)
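The paper's central claim, that approximate computation can itself act as a regularizer, can be illustrated with a standard example not taken from the article: stopping gradient descent on a least-squares problem early shrinks the solution toward zero, much as explicit ridge regularization would.

```python
import numpy as np

# Illustrative data: a generic noisy least-squares problem (not from the paper).
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 20))
y = X @ rng.standard_normal(20) + rng.standard_normal(50)

# Exact least-squares solution.
beta_exact = np.linalg.lstsq(X, y, rcond=None)[0]

# Approximate solution: a modest number of gradient-descent steps from zero.
beta_gd = np.zeros(20)
lr = 1e-3
for _ in range(100):
    beta_gd -= lr * X.T @ (X @ beta_gd - y)

# Early stopping leaves every spectral component of the solution shrunk,
# so the approximate answer has strictly smaller norm than the exact one --
# an implicit regularization effect of the inexact computation itself.
```

The qualitative shrinkage is the same one that ridge regression imposes explicitly, yet the inexact algorithm never sees a penalty term.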
A General Framework for Fast Stagewise Algorithms
Forward stagewise regression follows a very simple strategy for constructing
a sequence of sparse regression estimates: it starts with all coefficients
equal to zero, and iteratively updates the coefficient (by a small amount
$\epsilon$) of the variable that achieves the maximal absolute inner product
with the current residual. This procedure has an interesting connection to the
lasso: under some conditions, it is known that the sequence of forward
stagewise estimates exactly coincides with the lasso path, as the step size
$\epsilon$ goes to zero. Furthermore, essentially the same equivalence holds
outside of least squares regression, with the minimization of a differentiable
convex loss function subject to an $\ell_1$ norm constraint (the stagewise
algorithm now updates the coefficient corresponding to the maximal absolute
component of the gradient).
Even when they do not match their $\ell_1$-constrained analogues, stagewise
estimates provide a useful approximation, and are computationally appealing.
Their success in sparse modeling motivates the question: can a simple,
effective strategy like forward stagewise be applied more broadly in other
regularization settings, beyond the $\ell_1$ norm and sparsity? The current
paper is an attempt to do just this. We present a general framework for
stagewise estimation, which yields fast algorithms for problems such as
group-structured learning, matrix completion, image denoising, and more.
Comment: 56 pages, 15 figures
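The update rule in the abstract's first paragraph can be sketched in a few lines (the data, step size, and iteration count below are illustrative, not from the paper):

```python
import numpy as np

def forward_stagewise(X, y, step=0.01, n_iter=5000):
    """Forward stagewise regression: start at zero and repeatedly nudge,
    by a fixed small step, the coefficient of the variable with the
    largest absolute inner product with the current residual."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        r = y - X @ beta                     # current residual
        corr = X.T @ r                       # inner products with residual
        j = np.argmax(np.abs(corr))          # most correlated variable
        beta[j] += step * np.sign(corr[j])   # small update in its direction
    return beta
```

As the abstract notes, shrinking `step` toward zero traces out, under some conditions, the lasso path.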
Reconstruction of shapes and refractive indices from backscattering experimental data using the adaptivity
We consider the inverse problem of the reconstruction of the spatially
distributed dielectric constant $\varepsilon_{r}\left(\mathbf{x}\right)$, $\mathbf{x}\in \mathbb{R}^{3}$, and of the corresponding refractive index $n\left(\mathbf{x}\right) =\sqrt{\varepsilon_{r}\left(\mathbf{x}\right)}$, from backscattering experimental data. The coefficient $\varepsilon_{r}\left(\mathbf{x}\right)$ is reconstructed using a
two-stage reconstruction procedure. In the first stage an approximately
globally convergent method, proposed earlier, is applied to get a good first
approximation of the exact solution. In the second stage a locally convergent
adaptive finite element method is applied, taking the solution of the first
stage as the starting point of the minimization of the Tikhonov functional.
This functional is minimized on a sequence of locally refined meshes. It is
shown here that all three components of interest of the targets can be
simultaneously and accurately imaged: refractive indices, shapes, and locations.
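In a much simplified linear setting (the names and closed form below are illustrative; the paper minimizes the functional over locally refined finite element meshes), the second-stage idea of regularizing toward the first-stage solution looks like:

```python
import numpy as np

def tikhonov_solve(A, y, x0, alpha):
    """Minimize the Tikhonov functional ||A x - y||^2 + alpha ||x - x0||^2,
    where x0 plays the role of the first-stage approximation; the minimizer
    solves the linear system (A^T A + alpha I) x = A^T y + alpha x0."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y + alpha * x0)
```

A good first-stage `x0` matters because the Tikhonov minimizer is pulled toward it; the locally convergent second stage then only needs to refine, not to find, the solution.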
Theory and Applications of Robust Optimization
In this paper we survey the primary research, both theoretical and applied,
in the area of Robust Optimization (RO). Our focus is on the computational
attractiveness of RO approaches, as well as the modeling power and broad
applicability of the methodology. In addition to surveying prominent
theoretical results of RO, we also present some recent results linking RO to
adaptable models for multi-stage decision-making problems. Finally, we
highlight applications of RO across a wide spectrum of domains, including
finance, statistics, learning, and various areas of engineering.
Comment: 50 pages
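One core RO idea such surveys cover is replacing an uncertain constraint by a deterministic robust counterpart. A minimal sketch with hypothetical numbers: if each coefficient of a linear constraint `a @ x <= b` is known only to lie in a box `[a_nom - delta, a_nom + delta]`, the constraint holds for every realization exactly when `a_nom @ x + delta @ |x| <= b`.

```python
import numpy as np

def robust_feasible(a_nom, delta, b, x):
    """Worst-case check of a @ x <= b over the box a_nom +/- delta:
    the adversary adds delta_i * |x_i| to each term, so the robust
    counterpart is a_nom @ x + delta @ |x| <= b."""
    return a_nom @ x + delta @ np.abs(x) <= b

a_nom = np.array([1.0, 1.0])      # nominal coefficients (illustrative)
delta = np.array([0.2, 0.2])      # uncertainty half-widths
b = 10.0

x_nominal = np.array([5.0, 5.0])  # feasible for the nominal data only
x_robust = np.array([4.0, 4.0])   # leaves slack for every realization
```

The robust counterpart stays a tractable deterministic constraint, which is the computational attractiveness the abstract emphasizes.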