Human Motion Trajectory Prediction: A Survey
With growing numbers of intelligent autonomous systems in human environments,
the ability of such systems to perceive, understand and anticipate human
behavior becomes increasingly important. Specifically, predicting future
positions of dynamic agents and planning that considers such predictions are key
tasks for self-driving vehicles, service robots and advanced surveillance
systems. This paper provides a survey of human motion trajectory prediction. We
review, analyze and structure a large selection of work from different
communities and propose a taxonomy that categorizes existing methods based on
the motion modeling approach and level of contextual information used. We
provide an overview of the existing datasets and performance metrics. We
discuss limitations of the state of the art and outline directions for further
research.
Comment: Submitted to the International Journal of Robotics Research (IJRR), 37 pages
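Among the model families such surveys cover, physics-based approaches are the simplest: a constant-velocity baseline extrapolates the last observed motion. A minimal sketch (function name and data are illustrative, not taken from the paper):

```python
import numpy as np

def predict_constant_velocity(track, n_steps, dt=1.0):
    """Extrapolate a 2-D trajectory assuming constant velocity.

    track: (T, 2) array of observed positions, one row per time step.
    Returns an (n_steps, 2) array of predicted future positions.
    """
    track = np.asarray(track, dtype=float)
    velocity = (track[-1] - track[-2]) / dt          # last observed velocity
    steps = np.arange(1, n_steps + 1)[:, None] * dt  # 1*dt, 2*dt, ...
    return track[-1] + steps * velocity

# A pedestrian walking along x at 1 m/s:
history = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]
future = predict_constant_velocity(history, n_steps=3)
# -> [[3, 0], [4, 0], [5, 0]]
```

Despite its simplicity, this kind of baseline is a common yardstick against which learned predictors are evaluated.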
InferenceMAP: Mapping of Single-Molecule Dynamics with Bayesian Inference
Single-particle tracking (SPT) grants unprecedented insight into cellular
function at the molecular scale [1]. Throughout the cell, the movement of
single molecules is generally heterogeneous and complex. Hence, there is an
imperative to understand the multi-scale nature of single-molecule dynamics in
biological systems. We have previously shown that with high-density SPT,
spatial maps of the parameters that dictate molecule motion can be generated to
intricately describe cellular environments [2,3,4]. To date, however, there
exist no publicly available tools that reconcile trajectory data to generate
the aforementioned maps. We address this void in the SPT community with
InferenceMAP: an interactive software package that uses a powerful Bayesian
method to map the dynamic cellular space experienced by individual
biomolecules.
Comment: 56 pages
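The core quantity such SPT analyses infer is a local diffusion coefficient from observed displacements. A much-simplified sketch, assuming free 2-D Brownian motion (this is a single-parameter maximum-likelihood estimate, not InferenceMAP's spatially resolved Bayesian machinery; all names are illustrative):

```python
import numpy as np

def estimate_diffusion_2d(track, dt):
    """Maximum-likelihood diffusion coefficient for a 2-D Brownian track.

    For free diffusion each displacement component is Gaussian with
    variance 2*D*dt, so D_hat = <|dx|^2> / (4*dt).
    """
    track = np.asarray(track, dtype=float)
    disp = np.diff(track, axis=0)                  # (N-1, 2) step displacements
    return np.mean(np.sum(disp**2, axis=1)) / (4.0 * dt)

# Synthetic check: simulate a Brownian track with known D and recover it.
rng = np.random.default_rng(0)
D_true, dt, n = 0.5, 0.01, 20000
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n, 2))
track = np.cumsum(steps, axis=0)
D_hat = estimate_diffusion_2d(track, dt)
```

Mapping approaches extend this idea by estimating such parameters per spatial bin, with a prior coupling neighbouring bins.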
Advances in computational modelling for personalised medicine after myocardial infarction
Myocardial infarction (MI) is a leading cause of premature morbidity and mortality worldwide. Determining which patients will experience heart failure and sudden cardiac death after an acute MI is notoriously difficult for clinicians. The extent of heart damage after an acute MI is informed by cardiac imaging, typically using echocardiography or sometimes cardiac magnetic resonance (CMR). These scans provide complex data sets that are only partially exploited by clinicians in daily practice, implying potential for improved risk assessment. Computational modelling of left ventricular (LV) function can bridge the gap towards personalised medicine using cardiac imaging in post-MI patients. Several novel biomechanical parameters have theoretical prognostic value and may be useful to reflect the biomechanical effects of novel preventive therapy for adverse remodelling post-MI. These parameters include myocardial contractility (regional and global), stiffness and stress. Further, the parameters can be delineated spatially to correspond with infarct pathology and the remote zone. While these parameters hold promise, there are challenges for translating MI modelling into clinical practice, including model uncertainty, validation and verification, as well as time-efficient processing. More research is needed to (1) simplify imaging with CMR in post-MI patients, while preserving diagnostic accuracy and patient tolerance, and (2) assess and validate novel biomechanical parameters against established prognostic biomarkers, such as LV ejection fraction and infarct size. Accessible software packages with minimal user interaction are also needed. Translating benefits to patients will be achieved through a multidisciplinary approach including clinicians, mathematicians, statisticians and industry partners.
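The abstract names LV ejection fraction as an established prognostic biomarker against which novel biomechanical parameters would be validated. Its definition is standard; the function name and example volumes below are illustrative:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Left-ventricular ejection fraction as a percentage.

    EF = (EDV - ESV) / EDV * 100, computed from the end-diastolic and
    end-systolic volumes (in millilitres) measured on cardiac imaging.
    """
    stroke_volume = edv_ml - esv_ml
    return 100.0 * stroke_volume / edv_ml

# Typical volumes for a healthy adult LV:
ef = ejection_fraction(edv_ml=120.0, esv_ml=50.0)  # ~58.3%
```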
Bayesian modeling of networks in complex business intelligence problems
Complex network data problems are increasingly common in many fields of
application. Our motivation is drawn from strategic marketing studies
monitoring customer choices of specific products, along with co-subscription
networks encoding multiple purchasing behavior. Data are available for several
agencies within the same insurance company, and our goal is to efficiently
exploit co-subscription networks to inform targeted advertising of cross-sell
strategies to currently mono-product customers. We address this goal by
developing a Bayesian hierarchical model, which clusters agencies according to
common mono-product customer choices and co-subscription networks. Within each
cluster, we efficiently model customer behavior via a cluster-dependent mixture
of latent eigenmodels. This formulation provides key information on
mono-product customer choices and multiple purchasing behavior within each
cluster, informing targeted cross-sell strategies. We develop simple algorithms
for tractable inference, and assess performance in simulations and an
application to business intelligence.
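The latent eigenmodel at the heart of this formulation represents each node (here, a customer) by a low-dimensional latent vector and sets the edge probability from a weighted inner product of the endpoints' vectors. A minimal forward-model sketch, assuming the standard logistic eigenmodel form (the hierarchical clustering and mixture layers of the paper are omitted; all names are illustrative):

```python
import numpy as np

def edge_probabilities(U, lam, alpha=0.0):
    """Edge probabilities under a latent eigenmodel.

    P(edge i~j) = sigmoid(alpha + u_i^T diag(lam) u_j), where U holds one
    latent coordinate vector per node and lam weights each latent dimension.
    """
    logits = alpha + U @ np.diag(lam) @ U.T
    return 1.0 / (1.0 + np.exp(-logits))

# Two tight "communities" in a 1-D latent space:
U = np.array([[1.0], [1.0], [-1.0], [-1.0]])
P = edge_probabilities(U, lam=[2.0])
# Same-sign pairs get sigmoid(2) ~ 0.88; opposite-sign pairs sigmoid(-2) ~ 0.12
```

In a Bayesian treatment, priors are placed on U, lam and alpha, and the posterior is explored by MCMC given an observed adjacency matrix.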
Bayesian cosmic density field inference from redshift space dark matter maps
We present a self-consistent Bayesian formalism to sample the primordial
density fields compatible with a set of dark matter density tracers after
cosmic evolution observed in redshift space. Previous works on density
reconstruction did not self-consistently consider redshift space distortions or
included an additional iterative distortion correction step. We present here
the analytic solution of coherent flows within a Hamiltonian Monte Carlo
posterior sampling of the primordial density field. We test our method within
the Zel'dovich approximation, presenting also an analytic solution including
tidal fields and spherical collapse on small scales using augmented Lagrangian
perturbation theory. Our resulting reconstructed fields are isotropic and their
power spectra are unbiased compared to the true one defined by our mock
observations. Novel algorithmic implementations are introduced regarding the
mass assignment kernels when defining the dark matter density field and
optimization of the time step in the Hamiltonian equations of motion. Our
algorithm, dubbed barcode, promises to be especially suited for analysis of the
dark matter cosmic web down to scales of a few Megaparsecs. This large scale
structure is implied by the observed spatial distribution of galaxy clusters
--- such as obtained from X-ray, SZ or weak lensing surveys --- as well as that
of the intergalactic medium sampled by the Lyman alpha forest or perhaps even
by deep hydrogen intensity mapping. In these cases, virialized motions are
negligible, and the tracers cannot be modeled as point-like objects. It could
be used in all of these contexts as a baryon acoustic oscillation
reconstruction algorithm.
Comment: 34 pages, 25 figures, 1 table. Submitted to MNRAS. Accompanying code at https://github.com/egpbos/barcod
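The Hamiltonian Monte Carlo machinery underlying this kind of posterior sampling alternates momentum resampling, leapfrog integration of Hamilton's equations, and a Metropolis accept/reject on the total energy. A generic single-step sketch on a toy 1-D Gaussian target (this illustrates plain HMC, not barcode's cosmological likelihood; all names are illustrative):

```python
import numpy as np

def hmc_step(q, log_prob, log_prob_grad, eps=0.1, n_leapfrog=20, rng=None):
    """One Hamiltonian Monte Carlo step with leapfrog integration."""
    rng = rng or np.random.default_rng()
    p = rng.normal(size=q.shape)                   # resample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * log_prob_grad(q_new)      # initial half kick
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new                       # drift
        p_new += eps * log_prob_grad(q_new)        # full kick
    q_new += eps * p_new
    p_new += 0.5 * eps * log_prob_grad(q_new)      # final half kick
    h_old = -log_prob(q) + 0.5 * np.dot(p, p)      # total energy before
    h_new = -log_prob(q_new) + 0.5 * np.dot(p_new, p_new)
    if rng.random() < np.exp(h_old - h_new):       # Metropolis test
        return q_new
    return q

# Sample a standard 1-D Gaussian: log p(q) = -q^2 / 2
logp = lambda q: -0.5 * np.dot(q, q)
grad = lambda q: -q
rng = np.random.default_rng(1)
q, samples = np.zeros(1), []
for _ in range(2000):
    q = hmc_step(q, logp, grad, rng=rng)
    samples.append(q[0])
```

The time-step optimization mentioned in the abstract corresponds to tuning `eps` (and the trajectory length `eps * n_leapfrog`) so that the integrator's energy error, and hence the rejection rate, stays small.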