1,543 research outputs found

    SPADE4: Sparsity and Delay Embedding based Forecasting of Epidemics

    Predicting the evolution of diseases is challenging, especially when data are scarce and incomplete. The most popular tools for modelling and predicting infectious disease epidemics are compartmental models, which stratify the population into compartments according to health status and model the dynamics of these compartments using dynamical systems. However, these predefined systems may not capture the true dynamics of the epidemic because of the complexity of disease transmission and human interactions. To overcome this drawback, we propose Sparsity and Delay Embedding based Forecasting (SPADE4) for predicting epidemics. SPADE4 predicts the future trajectory of an observable variable without knowledge of the other variables or of the underlying system. We use a random features model with sparse regression to handle data scarcity and employ Takens' delay embedding theorem to capture the nature of the underlying system from the observed variable. We show that our approach outperforms compartmental models when applied to both simulated and real data.
    Comment: 24 pages, 13 figures, 2 tables
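    The abstract names SPADE4's ingredients (a delay embedding of a single observed variable, a random features model, sparse regression) but not its exact formulation. The sketch below is only a rough illustration of how those pieces can fit together, not the paper's implementation; the simulated case series, embedding dimension, feature count and Lasso penalty are placeholder assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical cumulative case counts: the single observed variable.
y = np.cumsum(np.random.poisson(5, 200)).astype(float)

# Takens-style delay embedding: each row stacks m delayed copies of y.
def delay_embed(series, m, tau=1):
    n = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(m)])

m, horizon = 10, 7
X = delay_embed(y, m)              # embedded states
targets = y[m - 1 + horizon:]      # value `horizon` steps ahead of each state
X = X[: len(targets)]

# Random features: lift the embedded states through fixed random weights,
# then fit a sparse linear model on the lifted features.
rng = np.random.default_rng(0)
W = rng.normal(size=(X.shape[1], 300))
b = rng.uniform(0, 2 * np.pi, 300)
Phi = np.cos(X @ W + b)
model = Lasso(alpha=1e-3, max_iter=10000).fit(Phi, targets)

# Forecast `horizon` steps ahead from the most recent embedded state.
latest = delay_embed(y, m)[-1:]
print(model.predict(np.cos(latest @ W + b)))
```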

    Smart territories

    The concept of smart cities is relatively new in research. Thanks to the colossal advances in Artificial Intelligence over the last decade, we are now able to do what we once thought impossible: we build cities driven by information and technology. In this keynote, we look at the success stories of smart city-related projects and analyse the factors that led to their success. The development of interactive, reliable and secure systems, both connectionist and symbolic, is often a time-consuming process in which numerous experts are involved. However, intuitive and automated tools like “Deep Intelligence”, developed by DCSc and BISITE, facilitate this process. Furthermore, this talk analyses the importance of complementary technologies such as IoT and Blockchain in the development of intelligent systems, as well as the use of edge platforms and fog computing.

    Graph Signal Processing: Overview, Challenges and Applications

    Research in Graph Signal Processing (GSP) aims to develop tools for processing data defined on irregular graph domains. In this paper we first provide an overview of core ideas in GSP and their connection to conventional digital signal processing. We then summarize recent progress on basic GSP tools, including methods for sampling, filtering and graph learning. Next, we review progress in several application areas that use GSP, including the processing and analysis of sensor network data and biological data, as well as applications to image processing and machine learning. We finish with a brief historical perspective highlighting how concepts recently developed in GSP build on prior research in other areas.
    Comment: To appear, Proceedings of the IEEE
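    As a concrete, if tiny, example of one of the basic GSP tools mentioned above, the sketch below low-pass filters a noisy signal defined on a 5-node path graph using the graph Fourier transform given by the Laplacian's eigenvectors. The graph, the signal and the spectral response are illustrative choices, not taken from the paper.

```python
import numpy as np

# A 5-node path graph with a noisy signal on its nodes.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian

x = np.array([1.0, 1.2, 5.0, 1.1, 0.9])   # signal with an outlier at node 2

# Graph Fourier transform: project onto Laplacian eigenvectors,
# attenuate high "graph frequencies", and transform back.
eigvals, U = np.linalg.eigh(L)
x_hat = U.T @ x
h = 1.0 / (1.0 + 2.0 * eigvals)           # simple low-pass spectral response
x_smooth = U @ (h * x_hat)
print(x_smooth)
```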

    Big Data Research in Italy: A Perspective

    The aim of this article is to concisely describe the research projects that a selection of Italian universities is undertaking in the context of big data. Far from being exhaustive, the article offers a sample of distinct applications that address the issue of managing huge amounts of data in Italy, collected across diverse domains.

    Epidemic Modeling with Generative Agents

    This study offers a new paradigm of individual-level modeling to address the grand challenge of incorporating human behavior in epidemic models. Using generative artificial intelligence in an agent-based epidemic model, each agent is empowered to make its own reasoning and decisions by connecting to a large language model such as ChatGPT. Through various simulation experiments, we present compelling evidence that generative agents mimic real-world behaviors such as quarantining when sick and self-isolating when cases rise. Collectively, the agents demonstrate patterns akin to the multiple waves observed in recent pandemics, followed by an endemic period. Moreover, the agents successfully flatten the epidemic curve. This study creates the potential to improve dynamic system modeling by offering a way to represent the human brain, reasoning, and decision making.
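    The abstract does not give the study's prompts or simulation loop; the sketch below only illustrates the general pattern of letting each agent decide in natural language whether to self-isolate. The ask_llm function is a hypothetical stand-in for a call to a model such as ChatGPT, and the population size and infection probability are arbitrary.

```python
import random

def ask_llm(prompt: str) -> str:
    """Stand-in for a call to a large language model such as ChatGPT.
    A real implementation would send `prompt` to an LLM API; here a canned
    answer is returned so the sketch runs on its own."""
    return "yes" if "feeling sick" in prompt or "cases are rising" in prompt else "no"

class Agent:
    def __init__(self, name):
        self.name = name
        self.infected = random.random() < 0.05
        self.isolating = False

    def decide(self, case_count):
        # Each agent reasons about its own situation in natural language.
        prompt = (
            f"You are {self.name}. "
            f"{'You are feeling sick. ' if self.infected else ''}"
            f"{'Local cases are rising. ' if case_count > 10 else ''}"
            "Should you stay home today? Answer yes or no."
        )
        self.isolating = ask_llm(prompt).strip().lower().startswith("yes")

agents = [Agent(f"agent-{i}") for i in range(100)]
cases = sum(a.infected for a in agents)
for a in agents:
    a.decide(cases)
print(sum(a.isolating for a in agents), "of", len(agents), "agents choose to isolate")
```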

    Big data analytics: Computational intelligence techniques and application areas

    Big Data has a significant impact on the development of functional smart cities and the support of modern societies. In this paper, we investigate the importance of Big Data in modern life and the economy, and discuss the challenges arising from Big Data utilization. Different computational intelligence techniques have been considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications for real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study on intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and its deployment.

    Beyond Econometrics: Using Google Trends and Social Media Data to Forecast Unemployment - OECD analysis of accuracy gains and robustness of predictions

    Dissertation presented as a partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Business Analytics.
    Google Trends has been used in academia for less than two decades to forecast outcomes using various techniques. While most research has focused on developed countries, there are clear information gaps that have not been fully addressed. Previous studies in this field indicate that non-linear algorithms with feature-set selection over a large set of queries can yield better results across more countries. However, it is unlikely that these methods will be widely and rapidly adopted given the skills required. Therefore, the objective of this research is to explore whether the abundance of digital data sources, specifically Google searches, can aid agents such as institutions and policy makers in their modeling efforts. The aim is to fill the gap in analysis for less influential countries and to explore whether the use of Google search data can be extended to multiple countries using a simple and agile methodology based on a widely used statistics-based modeling approach (ARIMAX). For this purpose, we selected the unemployment rate as the variable of interest. However, our findings show that only 30% of countries had promising results using Google-augmented ARIMAs. Thus, more computationally intensive empirical strategies would be needed to extract more predictive power from the Google queries information pool for unemployment rate modelling.
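    The dissertation's actual query sets, countries and model orders are not reproduced in the abstract; as a minimal sketch of the Google-augmented ARIMAX approach it describes, the snippet below fits an ARIMA(1,1,1) model to a synthetic unemployment series with a synthetic search-interest index as an exogenous regressor, using statsmodels. All variable names and values are placeholder assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly data: an unemployment rate series and a related
# Google Trends search-interest index (both synthetic placeholders).
rng = np.random.default_rng(1)
n = 60
searches = 50 + rng.normal(0, 5, n)                        # e.g. interest in "jobs"
unemployment = pd.Series(6 + 0.03 * searches + rng.normal(0, 0.2, n),
                         name="unemployment_rate")

# ARIMAX: ARIMA(1,1,1) for unemployment with the search index entering as
# an exogenous regressor (the "Google-augmented ARIMA" idea).
model = SARIMAX(unemployment, exog=searches.reshape(-1, 1), order=(1, 1, 1))
fit = model.fit(disp=False)

# Forecast 3 months ahead, given assumed future values of the search index.
future_searches = np.array([[52.0], [53.0], [51.0]])
print(fit.forecast(steps=3, exog=future_searches))
```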