2,583 research outputs found
Psychology and neurobiology of simple decisions
Patterns of neural firing linked to eye movement decisions show that behavioral decisions are predicted by the differential firing rates of cells coding selected and nonselected stimulus alternatives. These results can be interpreted using models developed in mathematical psychology to account for behavioral decisions. Current models assume that decisions are made by accumulating noisy stimulus information until sufficient information for a response is obtained. Here, the models, and the techniques used to test them against response-time distribution and accuracy data, are described. Such models provide a quantitative link between the time-course of behavioral decisions and the growth of stimulus information in neural firing data.
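The accumulation-to-threshold account can be sketched as a simple two-alternative diffusion model (a generic illustration; the drift, threshold, and noise values are assumptions, not parameters fitted to the studies described): noisy evidence accumulates until it reaches one of two boundaries, yielding both a choice and a response time.

```python
import random

def diffusion_trial(drift=0.2, threshold=1.0, noise=1.0, dt=0.001, rng=None):
    """One trial of a two-alternative accumulation-to-threshold model:
    evidence x drifts toward the correct boundary (+threshold) but noise
    can carry it to the error boundary (-threshold) instead.
    Returns (correct?, response_time)."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        # Accumulate noisy stimulus information in small time steps.
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return x > 0, t

rng = random.Random(0)
trials = [diffusion_trial(rng=rng) for _ in range(1000)]
accuracy = sum(correct for correct, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
```

Fitting such a model to data, as the abstract describes, amounts to adjusting drift, threshold, and noise until the simulated (or analytically derived) response-time distributions and accuracy match the observations; the right-skewed response-time distributions this model produces are its characteristic signature.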
Efficient Bayesian inference via Monte Carlo and machine learning algorithms
International Mention in the doctoral degree
In many fields of science and engineering, we are faced with an inverse problem where
we aim to recover an unobserved parameter or variable of interest from a set of observed
variables. Bayesian inference is a probabilistic approach for inferring this unknown parameter
that has become extremely popular, finding application in myriad problems in
fields such as machine learning, signal processing, remote sensing and astronomy. In
Bayesian inference, all the information about the parameter is summarized by the posterior
distribution. Unfortunately, the study of the posterior distribution requires the computation
of complicated integrals, that are analytically intractable and need to be approximated.
Monte Carlo methods are a large family of sampling algorithms for performing optimization
and numerical integration that have become the main workhorse for carrying out Bayesian
inference. The main idea of Monte Carlo is that we can approximate the posterior distribution
by a set of samples, obtained by an iterative process that involves sampling from a
known distribution. Markov chain Monte Carlo (MCMC) and importance sampling (IS)
are two important groups of Monte Carlo algorithms. This thesis focuses on developing
and analyzing Monte Carlo algorithms (either MCMC, IS, or a combination of both)
under different challenging scenarios presented below. In summary, in this thesis we address
several important points, enumerated (a)–(f), that currently represent a challenge in
Bayesian inference via Monte Carlo. A first challenge that we address is the problematic
exploration of the parameter space by off-the-shelf MCMC algorithms when there
is (a) multimodality, or with (b) highly concentrated posteriors. Another challenge that
we address is the (c) proposal construction in IS. Furthermore, in recent applications we
need to deal with (d) expensive posteriors, and/or we need to handle (e) noisy posteriors.
Finally, the Bayesian framework also offers a way of comparing competing hypotheses
(models) in a principled way by means of marginal likelihoods. Hence, a task that arises
as of fundamental importance is (f) marginal likelihood computation.
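The sampling idea summarized above, approximating the posterior with samples generated iteratively from a known distribution, can be illustrated with a minimal random-walk Metropolis-Hastings sketch (the Gaussian target and step size are illustrative assumptions, not an algorithm from the thesis):

```python
import math
import random

def log_post(x):
    # Unnormalized log-posterior; a standard Gaussian target for illustration.
    return -0.5 * x * x

def metropolis_hastings(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_samples):
        # Propose from a known distribution centered at the current state.
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, post(proposal) / post(x)).
        if math.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(50_000)
posterior_mean = sum(samples) / len(samples)
```

Posterior expectations are then approximated by averages over the chain; for this symmetric target the estimated mean should be near zero and the sample variance near one.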
Chapters 2 and 3 deal with (a), (b), and (c). In Chapter 2, we propose a novel population
MCMC algorithm called Parallel Metropolis-Hastings Coupler (PMHC). PMHC is
well suited to multimodal scenarios since it works with a population of states, instead
of a single one, hence allowing information to be shared. PMHC combines independent
exploration by the use of parallel Metropolis-Hastings algorithms, with cooperative exploration
by the use of a population MCMC technique called Normal Kernel Coupler.
In Chapter 3, population MCMC is combined with IS within the layered adaptive IS
(LAIS) framework. The combination of MCMC and IS serves two purposes. First, an
automatic proposal construction. Second, it aims at increasing the robustness, since the
MCMC samples are not used directly to form the sample approximation of the posterior.
The use of minibatches of data is proposed to deal with highly concentrated posteriors.
Other extensions for reducing the costs with respect to the vanilla LAIS framework, based on recycling and clustering, are discussed and analyzed.
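The MCMC-plus-IS construction can be sketched in a few lines (a toy illustration in the spirit of LAIS, not the thesis implementation; the bimodal target, chain length, and proposal scale are all assumptions): states of an MCMC chain supply the location parameters of a mixture proposal, and the posterior approximation is built from importance-weighted samples rather than from the chain itself.

```python
import math
import random

rng = random.Random(1)

def log_post(x):
    # Unnormalized log-posterior: an equal mixture of Gaussians at -2 and +2.
    return math.log(0.5 * math.exp(-0.5 * (x + 2.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x - 2.0) ** 2))

# Upper layer: a short Metropolis-Hastings run supplies proposal locations.
means, x = [], 0.0
for _ in range(200):
    proposal = x + rng.gauss(0.0, 2.0)
    if math.log(rng.random()) < log_post(proposal) - log_post(x):
        x = proposal
    means.append(x)

# Lower layer: importance sampling from an equally weighted Gaussian mixture
# centered at the chain states; the chain itself never enters the estimate.
SIGMA = 1.0

def log_mixture(y):
    dens = sum(math.exp(-0.5 * ((y - m) / SIGMA) ** 2) for m in means)
    return math.log(dens / (len(means) * SIGMA * math.sqrt(2.0 * math.pi)))

ys, ws = [], []
for _ in range(5000):
    y = rng.choice(means) + rng.gauss(0.0, SIGMA)
    ys.append(y)
    ws.append(math.exp(log_post(y) - log_mixture(y)))

post_mean = sum(w * y for w, y in zip(ws, ys)) / sum(ws)  # ~0 by symmetry
z_hat = sum(ws) / len(ws)  # estimates the normalizing constant (sqrt(2*pi) here)
```

Note that the average of the unnormalized weights also estimates the marginal likelihood, which is one reason MCMC-driven IS schemes are attractive for model comparison.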
Chapters 4, 5 and 6 deal with (c), (d) and (e). The use of nonparametric approximations
of the posterior plays an important role in the design of efficient Monte Carlo algorithms.
Nonparametric approximations of the posterior can be obtained using machine learning
algorithms for nonparametric regression, such as Gaussian Processes and Nearest Neighbors.
Then, they can serve as cheap surrogate models, or for building efficient proposal
distributions. In Chapter 4, in the context of expensive posteriors, we propose adaptive
quadrature rules for posterior expectations and the marginal likelihood, using a sequential algorithm
that builds and refines a nonparametric approximation of the posterior. In Chapter
5, we propose Regression-based Adaptive Deep Importance Sampling (RADIS), an adaptive
IS algorithm that uses a nonparametric approximation of the posterior as the proposal
distribution. We illustrate the proposed algorithms in applications of astronomy and remote
sensing. Chapters 4 and 5 consider noiseless posterior evaluations for building the
nonparametric approximations. More generally, in Chapter 6 we give an overview and
classification of MCMC and IS schemes using surrogates built with noisy evaluations.
The motivation here is the study of posteriors that are both costly and noisy. The classification
reveals a connection between algorithms that use the posterior approximation as a
cheap surrogate, and algorithms that use it for building an efficient proposal. We illustrate
specific instances of the classified schemes in an application of reinforcement learning.
Finally, in Chapter 7 we study noisy IS, namely, IS when the posterior evaluations are
noisy, and derive optimal proposal distributions for the different estimators in this setting.
Chapter 8 deals with (f): we provide an exhaustive review of methods
for marginal likelihood computation, with special focus on the ones based on Monte
Carlo. We derive many connections among the methods and compare them in several
simulation setups. Finally, in Chapter 9 we summarize the contributions of this thesis
and discuss some potential avenues of future research.
Doctoral Programme in Mathematical Engineering, Universidad Carlos III de Madrid
President: Valero Laparra Pérez-Muelas.- Secretary: Michael Peter Wiper.- Panel member: Omer Deniz Akyildi
Balancing with Vibration: A Prelude for “Drift and Act” Balance Control
Stick balancing at the fingertip is a powerful paradigm for the study of the control of human balance. Here we show that the mean stick balancing time is increased by about two-fold when a subject stands on a vibrating platform that produces vertical vibrations at the fingertip (0.001 m, 15–50 Hz). High speed motion capture measurements in three dimensions demonstrate that vibration does not shorten the neural latency for stick balancing or change the distribution of the changes in speed made by the fingertip during stick balancing, but does decrease the amplitude of the fluctuations in the relative positions of the fingertip and the tip of the stick in the horizontal plane, A(x,y). The findings are interpreted in terms of a time-delayed “drift and act” control mechanism in which controlling movements are made only when controlled variables exceed a threshold, i.e. the stick survival time measures the time to cross a threshold. The amplitude of the oscillations produced by this mechanism can be decreased by parametric excitation. It is shown that a plot of the logarithm of the vibration-induced increase in stick balancing skill, a measure of the mean first passage time, versus the standard deviation of the A(x,y) fluctuations, a measure of the distance to the threshold, is linear, as expected for the times to cross a threshold in a stochastic dynamical system. These observations suggest that the balanced state represents a complex time-dependent state which is situated in a basin of attraction that is of the same order of size. The fact that vibration amplitude can benefit balance control raises the possibility of minimizing the risk of falling through appropriate changes in the design of footwear and the roughness of walking surfaces.
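The “drift and act” idea (intervene only when the controlled variable crosses a threshold, and measure skill by the first-passage time to a larger failure threshold) can be sketched as a one-dimensional stochastic simulation. All dynamics and parameters below are illustrative assumptions, not a model fitted to the balancing experiment:

```python
import math
import random

def mean_survival_time(act_threshold=0.5, fail_threshold=1.0,
                       n_trials=20, seed=0):
    """Mean first-passage ("the stick falls") time for a 1-D drift-and-act
    controller: the state drifts away from upright and is corrected only
    when it crosses act_threshold."""
    rng = random.Random(seed)
    dt, drift, noise = 0.01, 0.5, 0.05
    times = []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < fail_threshold and t < 500.0:  # cap is a safety cutoff
            # "Drift" phase: unstable dynamics plus sensorimotor noise.
            x += drift * x * dt + noise * rng.gauss(0.0, dt ** 0.5)
            # "Act" phase: a noisy corrective kick, applied only past the
            # threshold; occasional overshoots are what end the trial.
            if abs(x) > act_threshold:
                x -= math.copysign(rng.gauss(0.7, 0.5), x)
            t += dt
        times.append(t)
    return sum(times) / len(times)

survival = mean_survival_time()
```

In this sketch, shrinking the fluctuation amplitude (e.g. the kick noise) lengthens the mean first-passage time, the same qualitative effect the vibration experiments report.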
Transience and Persistence in the Depositional Record of Continental Margins
Continental shelves and coastal plains are large persistent depositional landforms, which are stationary (nonmigrating) at their proximal ends and characterized by relatively steady long-term growth. In detail, however, their surface form and stratigraphic record are built of transient freely migrating landscape elements. We derive the timescales of crossover from transient to persistent topographic forms using empirical scaling relations for mean sediment accumulation as a function of averaging time, based upon tens of thousands of empirical measurements. A stochastic (noisy) diffusion model with drift predicts all the gross features of the empirical data. It satisfies first-order goals of describing both the surface morphology and stratigraphic completeness of depositional systems. The model crossover from noise-dominated to drift-dominated behavior corresponds to the empirical crossover from transport-dominated (autogenic) transient behavior to accommodation-dominated (subsidence) persistent behavior, which begins at timescales of 10²–10³ years and is complete by scales of 10⁴–10⁵ years. Because the same long-term scaling behavior emerges for off-shelf environments, it is not entirely explicable by steady subsidence. Fluctuations in sediment supply and routing probably have significant influence. At short-term (transient) scales, the exponents of the scaling relations vary with environment, particularly the prevalence of channeled sediment transport. At very small scales, modeling sediment transport as a diffusive process is inappropriate. Our results indicate that some of the timescales of interest for climate interpretation may fall within the transitional interval where neither accommodation nor transport processes are negligible and deconvolution is most challenging.
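The noisy-diffusion-with-drift picture implies a testable scaling: apparent accumulation rates measured over short averaging windows are inflated by noise, while long windows recover the steady drift rate, with the crossover near (noise/drift)². A minimal sketch (all parameter values are illustrative assumptions):

```python
import random

def mean_apparent_rate(window, drift=1.0, noise=50.0, n=2000, seed=0):
    """Mean apparent accumulation rate |dh| / dt over many windows of a
    given length, for elevation following a noisy random walk with drift:
    dh ~ Normal(drift * window, noise**2 * window)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        dh = drift * window + noise * (window ** 0.5) * rng.gauss(0.0, 1.0)
        total += abs(dh) / window
    return total / n

# Short windows are noise-dominated (inflated apparent rates); long windows
# approach the true drift rate of 1.0.  The crossover in this sketch sits
# near (noise / drift)**2 = 2500 time units.
short_rate = mean_apparent_rate(10)
long_rate = mean_apparent_rate(100_000)
```

The decline of apparent rate with averaging window, flattening once drift dominates, mirrors the empirical transient-to-persistent crossover described in the abstract.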
Trajectory encounter volume as a diagnostic of mixing potential in fluid flows
© The Author(s), 2017. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Nonlinear Processes in Geophysics 24 (2017): 189-202, doi:10.5194/npg-24-189-2017.
Fluid parcels can exchange water properties when coming into contact with each other, leading to mixing. The trajectory encounter mass and a related simplified quantity, the encounter volume, are introduced as a measure of the mixing potential of a flow. The encounter volume quantifies the volume of fluid that passes close to a reference trajectory over a finite time interval. Regions characterized by a low encounter volume, such as the cores of coherent eddies, have a low mixing potential, whereas turbulent or chaotic regions characterized by a large encounter volume have a high mixing potential. The encounter volume diagnostic is used to characterize the mixing potential in three flows of increasing complexity: the Duffing oscillator, the Bickley jet and the altimetry-based velocity in the Gulf Stream extension region. An additional example is presented in which the encounter volume is combined with the u approach of Pratt et al. (2016) to characterize the mixing potential for a specific tracer distribution in the Bickley jet flow. Analytical relationships are derived that connect the encounter volume to the shear and strain rates for linear shear and linear strain flows, respectively. It is shown that in both flows the encounter volume is proportional to time.
This work was supported by NSF grants OCE-1154641, OCE-1558806 and EAR-1520825 as well as by ONR grant N00014-11-10087 and NASA grant NNX14AH29G. Publication of this article was supported by the Office of Naval Research, grant no. N00014-16-1-2492.
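The encounter-volume diagnostic can be sketched numerically: advect an ensemble of parcels and, for a reference trajectory, count how many distinct parcels pass within a small radius of it over the time window. The flow field, radius, and ensemble below are illustrative assumptions, not the authors' implementation:

```python
import random

def encounter_count(ref, parcels, velocity, t_max=10.0, dt=0.02, radius=0.1):
    """Count the distinct parcels that pass within `radius` of the reference
    trajectory during [0, t_max]: a discrete stand-in for the encounter
    volume (the count times the volume each parcel represents)."""
    met = set()
    state = list(ref)
    others = [list(p) for p in parcels]
    for _ in range(int(t_max / dt)):
        # Advance the reference and every parcel with a forward-Euler step.
        vx, vy = velocity(*state)
        state[0] += vx * dt
        state[1] += vy * dt
        for i, p in enumerate(others):
            vx, vy = velocity(*p)
            p[0] += vx * dt
            p[1] += vy * dt
            if (p[0] - state[0]) ** 2 + (p[1] - state[1]) ** 2 < radius ** 2:
                met.add(i)
    return len(met)

def shear(x, y):
    # Linear shear flow u = (y, 0), for which the abstract predicts an
    # encounter volume that grows linearly in time.
    return (y, 0.0)

rng = random.Random(0)
parcels = [(rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)) for _ in range(2000)]
n_met = encounter_count((0.0, 0.0), parcels, shear)
```

High counts flag chaotic or strongly sheared regions with large mixing potential; a reference parcel in the core of a coherent eddy would meet far fewer neighbors over the same interval.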
The History of the Quantitative Methods in Finance Conference Series. 1992-2007
This report charts the history of the Quantitative Methods in Finance (QMF) conference from its beginning in 1993 to the 15th conference in 2007. It lists alphabetically the 1037 speakers who presented at all 15 conferences and the titles of their papers.