Eighth International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI at ECAI 2020)
Proceedings of the 8th International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI 2020), co-located with the 24th European Conference on Artificial Intelligence (ECAI 2020), Santiago de Compostela, Spain, August 29, 2020.
Advances in Robotics, Automation and Control
The book presents an excellent overview of recent developments in the different areas of Robotics, Automation and Control. Through its 24 chapters, the book presents topics related to control and robot design; it also introduces new mathematical tools and techniques devoted to improving system modeling and control. An important point is the use of rational agents and heuristic techniques to cope with the computational complexity required for controlling complex systems. The book also covers navigation and vision algorithms, automatic handwriting comprehension, and speech recognition systems that will be included in the next generation of production systems developed by man.
Statistical and Graph-Based Signal Processing: Fundamental Results and Application to Cardiac Electrophysiology
The goal of cardiac electrophysiology is to obtain information about the mechanism, function, and performance of the electrical activities of the heart, the identification of deviations from normal patterns, and the design of treatments. By offering better insight into the comprehension and management of cardiac arrhythmias, signal processing can help the physician enhance treatment strategies, in particular in the case of atrial fibrillation (AF), a very common atrial arrhythmia associated with significant morbidities, such as increased risk of mortality, heart failure, and thromboembolic events. Catheter ablation of AF is a therapeutic technique that uses radiofrequency energy to destroy the atrial tissue involved in sustaining the arrhythmia, typically aiming at the electrical disconnection of the pulmonary vein triggers. However, the recurrence rate is still very high, showing that the very complex and heterogeneous nature of AF still represents a challenging problem.
Leveraging the tools of non-stationary and statistical signal processing, the first part of our work has a twofold focus. First, we compare the performance of two different ablation technologies, based on contact-force sensing or remote magnetic control, using signal-based criteria as surrogates for lesion assessment; we also investigate the role of ablation parameters in lesion formation using late-gadolinium-enhanced magnetic resonance imaging. Second, we hypothesized that in human atria the frequency content of the bipolar signal is directly related to the local conduction velocity (CV), a key parameter characterizing substrate abnormality and influencing atrial arrhythmias. Comparing the degree of spectral compression among signals recorded at different points of the endocardial surface in response to decreasing pacing rate, our experimental data demonstrate a significant correlation between CV and the corresponding spectral centroids.
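The spectral-centroid measure mentioned here can be illustrated with a short generic sketch (not the thesis's actual pipeline): the centroid is the magnitude-weighted mean frequency of a signal's spectrum, so a spectrum compressed toward lower frequencies yields a lower centroid.

```python
import numpy as np

def spectral_centroid(signal, fs):
    """Spectral centroid (Hz): magnitude-weighted mean frequency
    of the one-sided spectrum of a 1-D signal sampled at fs Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.sum(freqs * spectrum) / np.sum(spectrum)

# Sanity check: a pure 10 Hz tone sampled at 1 kHz for 1 s
# has its centroid at (approximately) 10 Hz.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t)
print(round(spectral_centroid(x, fs), 1))  # → 10.0
```

Tracking how this quantity shifts as the pacing rate changes is one simple way to quantify the spectral compression described above.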
However, the complex spatio-temporal propagation patterns characterizing AF have spurred the need for new signal acquisition and processing methods. Multi-electrode catheters allow whole-chamber panoramic mapping of electrical activity but produce amounts of data that need to be preprocessed and analyzed to provide clinically relevant support to the physician. Graph signal processing (GSP) has shown its potential in a variety of applications involving high-dimensional data on irregular domains and complex networks. Nevertheless, though state-of-the-art graph-based methods have been successful for many tasks, so far they predominantly ignore the time dimension of data.
To address this shortcoming, in the second part of this dissertation we put forth a Time-Vertex Signal Processing framework as a particular case of multi-dimensional graph signal processing. Linking time-domain signal processing techniques with the tools of GSP, time-vertex signal processing facilitates the analysis of graph-structured data that also evolve in time. We motivate our framework by leveraging the notion of partial differential equations on graphs. We introduce joint operators, such as time-vertex localization, and we present a novel approach that significantly improves the accuracy of fast joint filtering. We also illustrate how to build time-vertex dictionaries, providing conditions for efficient invertibility and examples of constructions.
The experimental results on a variety of datasets suggest that the proposed tools can bring significant benefits in various signal processing and learning tasks involving time-series on graphs. We close the gap between the two parts by illustrating the application of graph and time-vertex signal processing to the challenging case of multi-channel intracardiac signals.
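As a toy illustration of the time-vertex idea (a minimal sketch, not the framework's actual implementation), the joint time-vertex Fourier transform of a signal defined on the vertices of a graph and evolving in time combines the graph Fourier transform (eigenbasis of the graph Laplacian) across vertices with the ordinary DFT across time:

```python
import numpy as np

# Toy graph: a 4-node ring. L is the combinatorial graph Laplacian.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis: orthonormal eigenvectors of the Laplacian.
_, U = np.linalg.eigh(L)

def jft(X):
    """Joint time-vertex Fourier transform of X (vertices x time):
    graph Fourier transform along vertices, DFT along time."""
    return np.fft.fft(U.T @ X, axis=1)

def ijft(X_hat):
    """Inverse joint transform."""
    return U @ np.fft.ifft(X_hat, axis=1).real

# Round trip: a random time-varying graph signal is recovered exactly.
X = np.random.default_rng(0).standard_normal((4, 8))
print(np.allclose(ijft(jft(X)), X))  # → True
```

Joint filters and dictionaries of the kind described above act on the coefficients in this joint spectral domain.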
Degassing of open-vent low-silica volcanoes
Open-vent activity at volcanoes of low-silica composition, such as Stromboli (Italy), Villarrica (Chile), and Mt. Erebus (Antarctica), is characterised by persistent passive gas emission and recurrent mild explosive outgassing. Four styles of bubble bursting activity have been recognised at such volcanoes: seething magma, small short-lived lava fountains, strombolian explosions and gas puffing. At Villarrica, one of the two case-study volcanoes, seething magma consists of continual bursts of bubbles up to a few metres in diameter, with varying strength over the entire surface of the lava lake. Small lava fountains, seen as a vigorous extension of seething magma, commonly last 20-120 s and reach 10-40 m above the lava free surface. Strombolian explosions can last for less than a second in a single bubble burst that erupts mainly bombs, as seen at the lava lakes of Mt. Erebus and Villarrica volcanoes, or for more than 30 seconds accompanied by large amounts of ash, as seen at Stromboli and Mt. Etna volcanoes. At Stromboli, the second case-study volcano, gas puffing consists of small but repetitive bubble bursts with a generally stable eruption frequency in the range 0.2-1.2 s⁻¹. More vigorous explosive phenomena, such as lava fountains hundreds of metres high or very strong (paroxysmal) explosions, may occur during eruptions or episodes of elevated activity.
Correlations between seismicity and visual observations at Villarrica volcano indicate that the seismic tremor is mostly caused by explosive outgassing. Real-time Seismic Amplitude Measurements (RSAM) and SO2 emission rates (measured by FLYSPEC) show a very good positive linear correlation across periods of background and elevated activity. Higher SO2 emissions appear to be related to higher levels of the lava lake, stronger bubble bursting activity, and changes in the morphology and texture of the crater floor. Background (low) levels of activity correspond to a lava lake located >80 m below the crater rim, a small and/or blocky morphology of the roof, seismic amplitude (RSAM) lower than 25 units, few volcano-tectonic earthquakes, and daily averages of SO2 emission below 600 Mg d⁻¹.
Convection of magma in the narrow conduits of the plumbing system can explain the sustained degassing with negligible effusion of lava, while supporting the variable outgassing styles at open-vent volcanoes. Theoretical analysis and laboratory experiments carried out with immiscible fluids in vertical and inclined pipes constrain the convection in terms of a 'flux coefficient' that depends on the viscosity ratio between the liquids, the flow regime, the angle of inclination of the pipe, and the position of the interface between the fluids. Prediction of the flux coefficient is possible within an acceptable range of error. Application of this model to Villarrica and Stromboli volcanoes, along with the analysis of the physical properties of the magma and gas data collated from the literature, allows the estimation of two parameters that constrain the dimensions of the convection: the magma flow rate and the equivalent radius. Magma degassing at Villarrica is characterised by the ascent of a relatively degassed magma. Most of the gas exsolves at shallow levels in the system, leading to continuous bubble bursting activity at the lava lake. At Stromboli, magma degassing takes place in an inclined dyke (or dykes). Within this geometry, magma convection adopts a stratified regime of gas-rich magma overlying the degassed melt, which favours coalescence of bubbles and efficient convection. Interconnected conduits at the uppermost part of the system constrain the release of the large gas slugs observed during strombolian explosions.
An Information-Theoretic Framework for Consistency Maintenance in Distributed Interactive Applications
Distributed Interactive Applications (DIAs) enable geographically dispersed users
to interact with each other in a virtual environment. A key factor to the success
of a DIA is the maintenance of a consistent view of the shared virtual world for
all the participants. However, maintaining consistent states in DIAs is difficult
under real networks. State changes communicated by messages over such networks
suffer latency leading to inconsistency across the application. Predictive Contract
Mechanisms (PCMs) combat this problem through reducing the number of messages
transmitted in return for perceptually tolerable inconsistency. This thesis examines
the operation of PCMs using concepts and methods derived from information theory.
This information theory perspective results in a novel information model of PCMs
that quantifies and analyzes the efficiency of such methods in communicating the
reduced state information, and a new adaptive multiple-model-based framework for
improving consistency in DIAs.
The first part of this thesis introduces information measurements of user behavior
in DIAs and formalizes the information model for PCM operation. In presenting the
information model, the statistical dependence in the entity state, which makes using
extrapolation models to predict future user behavior possible, is evaluated. The
efficiency of a PCM to exploit such predictability to reduce the amount of network
resources required to maintain consistency is also investigated. It is demonstrated
that from the information theory perspective, PCMs can be interpreted as a form
of information reduction and compression.
The second part of this thesis proposes an Information-Based Dynamic Extrapolation
Model for dynamically selecting between extrapolation algorithms based on
information evaluation and inferred network conditions. This model adapts PCM
configurations to both user behavior and network conditions, and makes the most
information-efficient use of the available network resources. In doing so, it improves
PCM performance and consistency in DIAs.
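The predictive contract idea can be sketched with a toy first-order dead-reckoning scheme (an illustrative example only; the thesis's models, error measures, and thresholds differ): the sender transmits a state update only when the receiver's extrapolated state would deviate from the true one by more than an agreed threshold, trading bounded inconsistency for fewer messages.

```python
# Hypothetical sketch of a first-order predictive contract (dead reckoning)
# for a 1-D entity trajectory sampled at unit time steps.

def simulate_pcm(positions, velocities, threshold):
    """Return the (time, position, velocity) updates actually sent."""
    sent = [(0, positions[0], velocities[0])]       # initial state always sent
    last_t, last_p, last_v = sent[0]
    for t in range(1, len(positions)):
        predicted = last_p + last_v * (t - last_t)  # receiver's extrapolation
        if abs(positions[t] - predicted) > threshold:
            last_t, last_p, last_v = t, positions[t], velocities[t]
            sent.append((t, last_p, last_v))        # contract violated: update
    return sent

# A mostly linear trajectory with one turn needs only two updates
# instead of eight raw state messages.
pos = [0, 1, 2, 3, 4, 3, 2, 1]
vel = [1, 1, 1, 1, -1, -1, -1, -1]
print(len(simulate_pcm(pos, vel, threshold=0.5)))  # → 2
```

The information-theoretic view described above quantifies exactly this effect: the more predictable the entity's behaviour, the less state information must cross the network to keep inconsistency within the threshold.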
Solutions for large scale, efficient, and secure Internet of Things
The design of a general architecture for the Internet of Things (IoT) is a complex task, due to the heterogeneity of devices, communication technologies, and applications that are part of such systems. Therefore, there are significant opportunities to improve the state of the art, whether to better the performance of the system or to solve actual issues in current systems. This thesis focuses on three aspects of the IoT in particular. First, issues of cyber-physical systems are analysed. In these systems, IoT technologies are widely used to monitor, control, and act on physical entities. One of the most important issues in these scenarios relates to the communication layer, which must be characterized by high reliability, low latency, and high energy efficiency. Some solutions for the channel access scheme of such systems are proposed, each tailored to a different specific scenario. These solutions, which exploit the capabilities of state-of-the-art radio transceivers, prove effective in improving the performance of the considered systems. Positioning services for cyber-physical systems are also investigated, in order to improve the accuracy of such services. Next, the focus moves to network and service optimization for traffic-intensive applications, such as video streaming. This type of traffic is common amongst non-constrained devices, like smartphones and augmented/virtual reality headsets, which form an integral part of the IoT ecosystem. The proposed solutions are able to increase the video Quality of Experience while wasting less bandwidth than state-of-the-art strategies. Finally, the security of IoT systems is investigated. While often overlooked, this aspect is fundamental to enabling the ubiquitous deployment of IoT. Therefore, security issues of commonly used IoT protocols are presented, together with a proposal for an authentication mechanism based on physical channel features.
This authentication strategy proved to be effective as a standalone mechanism or as an additional security layer to improve the security level of legacy systems.
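As a rough, generic illustration of channel-based authentication (not the thesis's specific mechanism; all names and numbers below are illustrative), a receiver can enroll a fingerprint of the legitimate channel, for example per-subcarrier gains, and later accept a transmitter only if the observed features remain within a distance threshold of that fingerprint, exploiting the fact that an attacker at a different physical location experiences a different channel:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def authenticate(enrolled, observed, threshold):
    """Accept iff the observed channel features match the enrolled profile."""
    return euclidean(enrolled, observed) <= threshold

profile = [0.9, 1.1, 1.0, 0.8]             # enrolled channel gains (illustrative)
legit   = [0.92, 1.08, 1.01, 0.79]         # same link, small fading noise
spoofed = [0.4, 1.6, 0.7, 1.3]             # different physical location
print(authenticate(profile, legit, 0.1))   # → True
print(authenticate(profile, spoofed, 0.1)) # → False
```

In practice the threshold must be tuned against channel variability over time, which is precisely what makes physical-layer authentication a research problem rather than a drop-in mechanism.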
Understanding Quantum Technologies 2022
Understanding Quantum Technologies 2022 is a creative-commons ebook that
provides a unique 360 degrees overview of quantum technologies from science and
technology to geopolitical and societal issues. It covers quantum physics
history, quantum physics 101, gate-based quantum computing, quantum computing
engineering (including quantum error corrections and quantum computing
energetics), quantum computing hardware (all qubit types, including quantum
annealing and quantum simulation paradigms, history, science, research,
implementation and vendors), quantum enabling technologies (cryogenics, control
electronics, photonics, components fabs, raw materials), quantum computing
algorithms, software development tools and use cases, unconventional computing
(potential alternatives to quantum and classical computing), quantum
telecommunications and cryptography, quantum sensing, quantum technologies
around the world, quantum technologies societal impact and even quantum fake
sciences. The main audience is computer science engineers, developers and IT
specialists, as well as quantum scientists and students who want to acquire a
global view of how quantum technologies work, and particularly quantum
computing. This version is an extensive update to the 2021 edition published in
October 2021.
Comment: 1132 pages, 920 figures, Letter format
Computer Aided Verification
This open access two-volume set, LNCS 11561 and 11562, constitutes the refereed proceedings of the 31st International Conference on Computer Aided Verification, CAV 2019, held in New York City, USA, in July 2019. The 52 full papers presented together with 13 tool papers and 2 case studies were carefully reviewed and selected from 258 submissions. The papers are organized in the following topical sections: Part I: automata and timed systems; security and hyperproperties; synthesis; model checking; cyber-physical systems and machine learning; probabilistic systems and runtime techniques; dynamical, hybrid, and reactive systems. Part II: logics, decision procedures, and solvers; numerical programs; verification; distributed systems and networks; verification and invariants; and concurrency.