1,768 research outputs found
LIPIcs, Volume 251, ITCS 2023, Complete Volume
Refining Diffusion Planner for Reliable Behavior Synthesis by Automatic Detection of Infeasible Plans
Diffusion-based planning has shown promising results in long-horizon,
sparse-reward tasks by training trajectory diffusion models and conditioning
the sampled trajectories using auxiliary guidance functions. However, due to
their nature as generative models, diffusion models are not guaranteed to
generate feasible plans, resulting in failed execution and precluding planners
from being useful in safety-critical applications. In this work, we propose a
novel approach to refine unreliable plans generated by diffusion models by
providing refining guidance to error-prone plans. To this end, we suggest a new
metric named restoration gap for evaluating the quality of individual plans
generated by the diffusion model. A restoration gap is estimated by a gap
predictor which produces restoration gap guidance to refine a diffusion
planner. We additionally present an attribution map regularizer to prevent
adversarial refining guidance that could be generated from the sub-optimal gap
predictor, which enables further refinement of infeasible plans. We demonstrate
the effectiveness of our approach on three different benchmarks in offline
control settings that require long-horizon planning. We also illustrate that
our approach presents explainability by presenting the attribution maps of the
gap predictor and highlighting error-prone transitions, allowing for a deeper
understanding of the generated plans.
Comment: NeurIPS 2023. First two authors contributed equally. Code at http://github.com/leekwoon/rg
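The refinement loop the abstract describes resembles classifier-guidance-style sampling: at each denoising step, the plan is nudged down the gradient of the predicted restoration gap. A minimal sketch, assuming hypothetical `denoise` and `gap_predictor_grad` callables (these names are illustrative stand-ins, not the released code):

```python
import numpy as np

def refine_step(plan, denoise, gap_predictor_grad, guidance_scale=0.1):
    """One guided denoising step: denoise the plan, then subtract a scaled
    gradient of the predicted restoration gap so error-prone plans improve."""
    plan = denoise(plan)                                      # ordinary diffusion step
    plan = plan - guidance_scale * gap_predictor_grad(plan)   # restoration-gap guidance
    return plan
```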
Decision-making with gaussian processes: sampling strategies and monte carlo methods
We study Gaussian processes and their application to decision-making in the real world. We begin by reviewing the foundations of Bayesian decision theory and show how these ideas give rise to methods such as Bayesian optimization. We investigate practical techniques for carrying out these strategies, with an emphasis on estimating and maximizing acquisition functions. Finally, we introduce pathwise approaches to conditioning Gaussian processes and demonstrate key benefits for representing random variables in this manner.
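Pathwise conditioning of a Gaussian process follows Matheron's rule: draw a sample from the prior, then correct it toward the observed data. The sketch below is illustrative only (an RBF kernel and toy dimensions are my assumptions, not the thesis's implementation):

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def pathwise_posterior_sample(x_train, y_train, x_test, noise=1e-2, seed=0):
    """Posterior sample via Matheron's rule: prior draw + data-driven update."""
    rng = np.random.default_rng(seed)
    x_all = np.concatenate([x_train, x_test])
    K = rbf(x_all, x_all) + 1e-6 * np.eye(len(x_all))   # jitter for stability
    f_all = rng.multivariate_normal(np.zeros(len(x_all)), K)  # joint prior draw
    n = len(x_train)
    f_tr, f_te = f_all[:n], f_all[n:]
    eps = rng.normal(0.0, np.sqrt(noise), n)            # simulated observation noise
    K_tr = rbf(x_train, x_train) + noise * np.eye(n)
    K_te_tr = rbf(x_test, x_train)
    # Matheron update: shift the prior sample toward the residual of the data
    update = K_te_tr @ np.linalg.solve(K_tr, y_train - (f_tr + eps))
    return f_te + update
```

Because the update reuses the same prior draw at the test points, the result is a coherent function sample rather than a marginal draw, which is the key benefit for downstream Monte Carlo acquisition estimates.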
This Year's Nobel Prize (2022) in Physics for Entanglement and Quantum Information: the New Revolution in Quantum Mechanics and Science
The paper discusses this year’s Nobel Prize in physics, awarded for experiments on entanglement “establishing the violation of Bell inequalities and pioneering quantum information science”, in a much wider, including philosophical, context: the authority of the Nobel Prize legitimizes a new scientific area outside “classical” quantum mechanics, which is tied to Pauli’s “particle” paradigm of energy conservation and thus to the Standard Model that obeys it. An eventual future theory of quantum gravitation is justified as belonging to the newly established quantum information science. Entanglement, which involves non-Hermitian operators for its rigorous description, non-unitarity, and nonlocal, superluminal physical signals that “spookily” (Einstein’s flowery epithet) synchronize and transfer some nonzero action at a distance, can be considered quantum gravity, so that its local counterpart is Einstein’s gravitation according to general relativity; this pioneers an alternative pathway to quantum gravitation, different from the “secondary quantization” of the Standard Model. Thus the entanglement experiments, once awarded the Nobel Prize, launch the corresponding theory of quantum gravitation grounded in “quantum information science”, granted to be nonclassical quantum mechanics within the shared framework of a generalized quantum mechanics obeying quantum-information conservation rather than energy conservation alone. The concept of a “dark phase” of the universe, naturally linked to the well-confirmed “dark matter” and “dark energy” and opposed to the “light phase” inherent to classical quantum mechanics and the Standard Model, obeys quantum-information conservation, under which reversible causality, or the mutual transformation of energy and information, is valid.
The mythical Big Bang, after which energy conservation holds universally, is to be replaced by an omnipresent and omnitemporal medium of decoherence of the dark, nonlocal phase into the light, local phase. The former notion is only an integral image of the latter, borrowed in fact from religion rather than from science. Physical, methodological, and properly philosophical conclusions follow from the paradigm shift heralded by this year’s Nobel Prize in physics. For example, a scientific theory of thinking should also originate from the dark phase of the universe: it is probably only approximately modeled by neural networks, which belong physically and thoroughly to the light phase. A few crucial philosophical consequences follow from the break with Pauli’s paradigm: (1) the establishment of the “dark” phase of the universe as opposed to its “light” phase, to which alone the Cartesian dichotomy of “body” and “mind” applies; (2) quantum-information conservation as relevant to the dark phase, generalizing the energy conservation of the light phase and allowing physical entities to appear “ex nihilo”, i.e., from the dark phase, in which energy and time are not yet separable from each other; (3) reversible causality as inherent to the dark phase; (4) an interpretation of gravitation in purely mathematical terms, as an interpretation of the incompleteness of finiteness relative to infinity, for example following the Gödel dichotomy (“either contradiction or incompleteness”) about the relation of arithmetic to set theory; (5) the restriction of the concept of hierarchy to the light phase only; (6) the commensurability of both physical extremes, a quantum and the universe as a whole, in the dark phase obeying quantum-information conservation, akin to Nicholas of Cusa’s philosophical and theological worldview.
Measuring the impact of COVID-19 on hospital care pathways
Care pathways in hospitals around the world reported significant disruption during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can be useful for hospital management to measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach for patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions affecting hospital care pathways. We found that during the pandemic, both A&E and maternity pathways had measurable reductions in the mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in the monthly mean values of length of stay or conformance throughout the phases of the installation of the hospital’s new Command Centre approach. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
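The conformance measurement described here can be pictured in miniature: compare each recorded pathway against a normative model and report the conforming fraction. This is a toy sketch under assumed event names, not the study's process-mining pipeline (which would use replay against a discovered process model):

```python
# Hypothetical normative pathway model: the set of allowed event sequences.
NORMATIVE = {("arrive", "triage", "treat", "discharge")}

def conformance_rate(traces, normative=NORMATIVE):
    """Fraction of event traces that exactly match a normative sequence."""
    ok = sum(1 for t in traces if tuple(t) in normative)
    return ok / len(traces)
```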
Discovering Causal Relations and Equations from Data
Physics is a field of science that has traditionally used the scientific
method to answer questions about why natural phenomena occur and to make
testable models that explain the phenomena. Discovering equations, laws and
principles that are invariant, robust and causal explanations of the world has
been fundamental in physical sciences throughout the centuries. Discoveries
emerge from observing the world and, when possible, performing interventional
studies in the system under study. With the advent of big data and the use of
data-driven methods, causal and equation discovery fields have grown and made
progress in computer science, physics, statistics, philosophy, and many applied
fields. All these domains are intertwined and can be used to discover causal
relations, physical laws, and equations from observational data. This paper
reviews the concepts, methods, and relevant works on causal and equation
discovery in the broad field of Physics and outlines the most important
challenges and promising future lines of research. We also provide a taxonomy
for observational causal and equation discovery, point out connections, and
showcase a complete set of case studies in Earth and climate sciences, fluid
dynamics and mechanics, and the neurosciences. This review demonstrates that
discovering fundamental laws and causal relations by observing natural
phenomena is being revolutionised with the efficient exploitation of
observational data, modern machine learning algorithms and the interaction with
domain knowledge. Exciting times are ahead with many challenges and
opportunities to improve our understanding of complex systems.
Comment: 137 pages
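One family of data-driven equation-discovery methods the review covers is sparse regression over a library of candidate terms (SINDy-style). A minimal sketch, with an assumed polynomial library; this is my illustration, not a method from the paper itself:

```python
import numpy as np

def discover_equation(x, dx, thresh=0.1, iters=10):
    """Recover sparse coefficients c so that dx ≈ c0*1 + c1*x + c2*x^2,
    via least squares with sequential thresholding of small coefficients."""
    lib = np.column_stack([np.ones_like(x), x, x ** 2])   # candidate terms
    coef = np.linalg.lstsq(lib, dx, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(coef) < thresh
        coef[small] = 0.0                                  # prune tiny terms
        big = ~small
        if big.any():                                      # refit surviving terms
            coef[big] = np.linalg.lstsq(lib[:, big], dx, rcond=None)[0]
    return coef
```

The thresholding step is what turns a dense regression fit into an interpretable, sparse "law", which is the core idea behind this class of discovery methods.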
(b2023 to 2014) The UNBELIEVABLE similarities between the ideas of some people (2006-2016) and my ideas (2002-2008) in physics (quantum mechanics, cosmology), cognitive neuroscience, philosophy of mind, and philosophy (this manuscript would require a REVOLUTION in international academy environment!)
LieDetect: Detection of representation orbits of compact Lie groups from point clouds
We suggest a new algorithm to estimate representations of compact Lie groups
from finite samples of their orbits. Unlike other reported techniques,
our method allows the retrieval of the precise representation type as a direct
sum of irreducible representations. Moreover, the knowledge of the
representation type permits the reconstruction of its orbit, which is useful to
identify the Lie group that generates the action. Our algorithm is general for
any compact Lie group, but only instantiations for SO(2), T^d, SU(2) and SO(3)
are considered. Theoretical guarantees of robustness in terms of Hausdorff and
Wasserstein distances are derived. Our tools are drawn from geometric measure
theory, computational geometry, and optimization on matrix manifolds. The
algorithm is tested for synthetic data up to dimension 16, as well as real-life
applications in image analysis, harmonic analysis, and classical mechanics
systems, achieving very accurate results.
Comment: 84 pages, 16 figures
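For intuition about what "identifying an orbit" means in the simplest case: the orbit of an SO(2) representation on the plane is a circle, so a point cloud sampled from it has nearly constant distance to its centroid. The following toy check is my own illustration, far simpler than the paper's algorithm:

```python
import numpy as np

def looks_like_so2_orbit(points, tol=0.05):
    """Crude test that a 2-D point cloud lies on a circle (an SO(2) orbit):
    the relative spread of distances to the centroid must be small."""
    c = points.mean(axis=0)
    r = np.linalg.norm(points - c, axis=1)
    return (r.max() - r.min()) / r.mean() < tol
```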
Deep Generative Modelling of Human Behaviour
Human action is naturally intelligible as a time-varying graph of connected joints constrained by locomotor anatomy and physiology. Its prediction allows the anticipation of actions with applications across healthcare, physical rehabilitation and training, robotics, navigation, manufacture, entertainment, and security. In this thesis we investigate deep generative approaches to the problem of understanding human action. We show that learning the generative qualities of the distribution may render discriminative tasks more robust to distributional shift and real-world variations in data quality. We further build, from the bottom up, a novel deep stochastic generative model tailored to the problem of human motion, and demonstrate many of its state-of-the-art properties, such as anomaly detection, imputation in the face of incomplete examples, and synthesis, including conditional synthesis, of new samples, evaluated on massive open-source human motion datasets against multiple baselines derived from the most relevant literature.
Northeastern Illinois University, Academic Catalog 2023-2024