405 research outputs found
Effective Aggregation and Querying of Probabilistic RFID Data in a Location Tracking Context
RFID applications usually rely on RFID deployments to manage high-level events, such as tracking the locations that products visit in supply-chain management, localizing intruders for alerting services, and so on. However, transforming low-level streams into high-level events poses a number of challenges. In this paper, we deal with the well-known issues of data redundancy and data-information mismatch: we propose an on-line summarization mechanism that provides a small-space representation of massive probabilistic RFID data streams while preserving the meaningfulness of the information. We also show that common information needs, i.e., detecting complex events meaningful to applications, can be effectively answered by executing temporal probabilistic SQL queries directly on the summarized data. All the techniques presented in this paper are implemented in a complete framework and successfully evaluated in real-world location tracking scenarios.
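The on-line summarization idea can be illustrated with a minimal sketch: consecutive probabilistic readings of the same tag at the same location are merged into time intervals carrying an aggregate probability. The record layout (tag, location, probability, timestamp) and the running-mean aggregation are our assumptions for illustration, not the paper's actual scheme, and the temporal probabilistic SQL layer is not modeled here.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """Small-space summary of consecutive readings of one tag at one location."""
    tag: str
    loc: str
    t_start: int
    t_end: int
    prob: float  # running mean of the detection probabilities in the interval

def summarize(readings):
    """Merge consecutive probabilistic readings (tag, loc, prob, time) of the
    same tag at the same location into one interval per stay."""
    intervals = []
    for tag, loc, p, t in sorted(readings, key=lambda r: (r[0], r[3])):
        last = intervals[-1] if intervals else None
        if last and last.tag == tag and last.loc == loc and t == last.t_end + 1:
            # extend the open interval, updating the running mean probability
            n = last.t_end - last.t_start + 1
            last.prob = (last.prob * n + p) / (n + 1)
            last.t_end = t
        else:
            intervals.append(Interval(tag, loc, t, t, p))
    return intervals

# Three raw readings collapse into two intervals: a stay at "dock", then "gate".
summary = summarize([("t1", "dock", 0.9, 1),
                     ("t1", "dock", 0.8, 2),
                     ("t1", "gate", 0.7, 3)])
```

A query such as "was tag t1 at the dock during [1, 2]?" can then be answered on the two summary intervals instead of the full stream.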
An HMM–ensemble approach to predict severity progression of ICU treatment for hospitalized Covid–19 patients
COVID–19–related pneumonia requires different modalities of Intensive Care Unit (ICU) intervention at different times to facilitate breathing, depending on severity progression. The ability of clinical staff to predict, on a daily basis, whether patients admitted to hospital will require more or less ICU treatment is critical to ICU management. For real datasets that are sparse and incomplete, and where the most important state transitions (dismissal, death) are rare, a standard Hidden Markov Model (HMM) approach is insufficient, as it is prone to overfitting. In this paper we propose a more sophisticated ensemble-based approach that involves training multiple HMMs, each specialized in a subset of the state transitions, and then deriving the most plausible predictions by either selecting or combining the models. We have validated the approach on a live dataset of about 1,000 patients from a partner hospital. Our results show that, on rare events as well as on transitions to the most severe treatments, our approach outperforms state-of-the-art approaches.
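The model-selection variant of the ensemble can be sketched as follows: each specialist model contributes a transition matrix, the one that best explains the patient's observed history is kept, and its row for the current state gives the next-day prediction. The severity states and all matrix entries below are toy assumptions for illustration, not fitted to any clinical data, and a real HMM would also carry an emission model.

```python
import numpy as np

# Hypothetical severity states: 0=ward, 1=CPAP, 2=ventilated, 3=discharged.
# One specialist matrix favors stable stays, the other escalations.
T_stable = np.array([[0.85, 0.05, 0.05, 0.05],
                     [0.10, 0.70, 0.10, 0.10],
                     [0.05, 0.15, 0.70, 0.10],
                     [0.05, 0.05, 0.05, 0.85]])
T_escalate = np.array([[0.10, 0.70, 0.15, 0.05],
                       [0.05, 0.20, 0.70, 0.05],
                       [0.05, 0.10, 0.80, 0.05],
                       [0.05, 0.05, 0.05, 0.85]])

def seq_loglik(T, seq):
    """Log-likelihood of an observed state sequence under transition matrix T."""
    return sum(np.log(T[a, b]) for a, b in zip(seq, seq[1:]))

def ensemble_predict(models, history):
    """Model selection: keep the specialist that explains the history best
    and return its next-state distribution for the current state."""
    best = max(models, key=lambda T: seq_loglik(T, history))
    return best[history[-1]]

# A patient who stayed on the ward for three days: the stable specialist wins.
next_day = ensemble_predict([T_stable, T_escalate], [0, 0, 0])
```

The combining variant mentioned in the abstract would instead average the per-model rows, e.g. weighted by each model's likelihood on the history.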
A weak KAM approach to the periodic stationary Hartree equation
We present, through weak KAM theory, an investigation of the stationary Hartree equation in the periodic setting. In more detail, we study the Mean Field asymptotics of quantum many-body operators thanks to various integral identities providing the energy of the ground state and the minimum value of the Hartree functional. Finally, the ground state in the multiple-well case is studied in the semiclassical asymptotics thanks to the Agmon metric.
No users, no dataspaces! Query-driven dataspace orchestration
Data analysis in rich spaces of heterogeneous data sources is an increasingly common activity. Examples include querying the web of linked data and personal information management. Such analytics on dataspaces is often iterative and dynamic, in an open-ended interaction between discovery and data orchestration. The current state of the art in integration and orchestration in dataspaces is primarily geared towards closed-ended analysis, targeting the discovery of stable data mappings or one-time, pay-as-you-go ad hoc data mappings. The perspective here is dataspace-centric. In this paper, we propose a shift to a user-centric perspective on dataspace orchestration. We outline basic conceptual and technical challenges in supporting data analytics that is open-ended and always evolving, as users respond to new discoveries and connections.
Approximating expressive queries on graph-modeled data: The GeX approach
We present the GeX (Graph-eXplorer) approach for the approximate matching of complex queries on graph-modeled data. GeX generalizes existing approaches and provides a highly expressive graph-based query language that supports queries ranging from keyword-based to structured ones. The GeX query answering model gracefully blends label approximation with structural relaxation, under the primary objective of delivering only meaningfully approximated results. GeX implements ad hoc data structures that are exploited by a top-k retrieval algorithm which enhances the approximate matching of complex queries. An extensive experimental evaluation on real-world datasets demonstrates the efficiency of GeX query answering.
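The flavor of top-k approximate matching can be conveyed with a minimal best-first sketch: partial matches are expanded from a priority queue ordered by accumulated approximation cost, so the k cheapest complete matches pop out first. The cost model here (one unit per label mismatch on a linear label-path query) is a deliberate simplification of our own, not GeX's actual blend of label approximation and structural relaxation.

```python
import heapq

def topk_matches(graph, labels, query, k=3):
    """Best-first top-k matching of a label-path query on a labeled graph.
    graph: adjacency dict, labels: node -> label, query: list of labels.
    Cost of a match = number of label mismatches along the path."""
    heap = [(0 if labels[n] == query[0] else 1, [n]) for n in graph]
    heapq.heapify(heap)
    results = []
    while heap and len(results) < k:
        cost, path = heapq.heappop(heap)
        if len(path) == len(query):
            results.append((cost, path))  # cheapest complete matches pop first
            continue
        for nxt in graph[path[-1]]:
            step = 0 if labels[nxt] == query[len(path)] else 1
            heapq.heappush(heap, (cost + step, path + [nxt]))
    return results

# Tiny example: an exact match (cost 0) ranks before an approximated one.
graph = {"a": ["b"], "b": ["c"], "c": []}
labels = {"a": "X", "b": "Y", "c": "Z"}
hits = topk_matches(graph, labels, ["X", "Y"])
```

Because expansion is ordered by cost, the search can stop as soon as k complete matches have been emitted, without enumerating all candidate paths.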
Dealing with data and software interoperability issues in digital factories
The digital factory paradigm comprises a multi-layered integration of the information related to various activities along the factory and product lifecycle and the related manufacturing resources. A central aspect of a digital factory is that of enabling the product lifecycle stakeholders to collaborate through the use of software solutions. The digital factory thus expands outside the actual company boundaries and offers the opportunity for the business and its suppliers to collaborate on business processes that affect the whole supply chain. This paper discusses an interoperability architecture for digital factories. To this end, it delves into the issue by analysing the main challenges that must be addressed to support an integrated and scalable factory architecture characterized by access to services, aggregation of data, and orchestration of production processes. Then, it reviews the state of the art in the light of these requirements and proposes a general architectural framework conjugating the most interesting features of service-oriented architectures and data sharing architectures. The study is exemplified through a case study.
Density based kinetic Monte Carlo methods
The goal of the present work is the development of new methods for the simulation of epitaxial growth at the mesoscopic scale, in particular close to thermodynamic equilibrium. Epitaxial growth was and still is one of the main themes of solid-state physics.
Due to the complexity of the systems considered, a quantitative understanding of epitaxial growth and the prediction of the structures which can form on the surface is possible only using computer simulations. Because of the wide range of time and length scales that have to be considered to describe epitaxial processes, the simulation of such a system cannot be based on a single method. Different methods have therefore been developed which, depending on the time and length scale, are suited to given applications or problems. The mesoscopic scale is particularly important for practical applications, because it is the typical scale for semiconductor devices. Kinetic Monte Carlo (KMC) is a widely used method for simulations at this scale. Based on a careful analysis, this work shows how it is possible to optimize and speed up the KMC method. The focus has been on high temperatures close to thermodynamic equilibrium, which are particularly important for technological applications. Based on this analysis, a new method has been proposed which represents an extension of KMC: a density approach to KMC. Density methods have been applied efficiently in other fields of physics and chemistry. An example is the calculation of electronic structures, where Density Functional Theory is nowadays a standard tool. The adatom density is the key element of the newly developed method (similarly to the electron density in the description of electronic structures). The adatom density method is a combination of the KMC method and the solution of a diffusion-like equation for the adatom density. Two different methods have been developed: (i) Adatom Density Kinetic Monte Carlo (AD-KMC) and (ii) Adatom Probability Kinetic Monte Carlo (AP-KMC). The two methods are closely related, and AD-KMC can be considered a further approximation of AP-KMC. Both methods have been derived directly from the master equation, from which it is also possible to derive the KMC method.
In AP-KMC there is a corresponding adatom density for every adatom on the growing surface, and it gives good results for simulations at any temperature. At high temperatures the densities of the different adatoms spread over the surface in a short time. Under these conditions it is possible to sum all the single adatom densities into a total adatom density. This is what has been done in AD-KMC, which is faster than AP-KMC but contains more approximations. Statistical tests have been performed to check the accuracy and the field of application of the new methods. The advantage of these methods over KMC has been shown in a CPU-time comparison for test simulations. To develop these methods it was necessary to describe the nucleation processes within the adatom density approach. For this reason, a local nucleation term has been introduced for the first time, as far as we know. This term is still very approximate; similarly to Density Functional Theory, further developments are necessary to improve the local nucleation term.
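The core substitution the density approach makes can be sketched in a few lines: instead of hopping a discrete adatom stochastically (as KMC does), one evolves its probability density with a deterministic diffusion-like update on the lattice. The 1D periodic lattice, the explicit finite-difference scheme, and the dimensionless hop rate below are our illustrative assumptions; the actual AD-KMC/AP-KMC methods include hop rates derived from the master equation and the local nucleation term, both omitted here.

```python
import numpy as np

def density_step(rho, D=0.25):
    """One explicit finite-difference step of the diffusion-like equation
    for the adatom density on a 1D periodic lattice (nucleation omitted).
    D is a dimensionless hop rate; D <= 0.5 keeps the scheme stable."""
    return rho + D * (np.roll(rho, 1) - 2 * rho + np.roll(rho, -1))

# A single adatom, represented as a density peak rather than a random walker.
rho = np.zeros(16)
rho[8] = 1.0
for _ in range(50):
    rho = density_step(rho)
```

One deterministic trajectory of the density replaces many stochastic KMC realizations, which is where the CPU-time advantage at high temperature comes from; note that the update conserves the total density (one adatom) exactly.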
Data-driven, AI-based clinical practice: experiences, challenges, and research directions
Clinical practice is evolving rapidly, away from the traditional but inefficient detect-and-cure approach, and towards a Preventive, Predictive, Personalised and Participative (P4) vision that focuses on extending people’s wellness state. This vision is increasingly data-driven, AI-based, and is underpinned by many forms of "Big Health Data" including periodic clinical assessments and electronic health records, but also new forms of self-assessment, such as mobile-based questionnaires and personal wearable devices. Over the last few years, we have been conducting a fruitful research collaboration with the Infectious Disease Clinic of the University Hospital of Modena, with the main aim of exploring specific opportunities offered by data-driven, AI-based approaches to support diagnosis, hospital organization and clinical research. Drawing from this experience, in this paper we provide an overview of the main research challenges that need to be addressed to design and implement data-driven healthcare applications. We present concrete instantiations of these challenges in three real-world use cases, summarise the specific solutions we devised to address them and, finally, propose a research agenda that outlines the future of research in this field.