
    Algorithmic futures

    In the last few years, tracking systems that harvest web data to identify trends, calculate predictions, and warn about potential epidemic outbreaks have proliferated. These systems integrate crowdsourced data and digital traces, collecting information from a variety of online sources, and they promise to change the way governments, institutions, and individuals understand and respond to health concerns. This article examines some of the conceptual and practical challenges raised by the online algorithmic tracking of disease by focusing on the case of Google Flu Trends (GFT). Launched in 2008, GFT was Google’s flagship syndromic surveillance system, specializing in ‘real-time’ tracking of outbreaks of influenza. GFT mined massive amounts of data about online search behavior to extract patterns and anticipate the future of viral activity. But it did a poor job, and Google shut the system down in 2015. This paper focuses on GFT’s shortcomings, which were particularly severe during flu epidemics, when GFT struggled to make sense of unexpected surges in the number of search queries. I suggest two reasons for GFT’s difficulties. First, it failed to keep track of the dynamics of contagion, at once biological and digital, as it affected what I call here the ‘googling crowds’. Search behavior during epidemics stems in part from a sort of viral anxiety not easily amenable to algorithmic anticipation, to the extent that the algorithm’s predictive capacity remains dependent on past data and patterns. Second, I suggest that GFT’s troubles were the result of how it collected data and performed what I call ‘epidemic reality’. GFT’s data became severed from the processes Google aimed to track, and the data took on a life of their own: a trackable life, in which there was little flu left. The story of GFT, I suggest, offers insight into contemporary tensions between the indomitable intensity of collective life and stubborn attempts at its algorithmic formalization.
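    The abstract describes GFT as mining search behavior to anticipate flu activity. As a rough illustration of that general idea (a minimal sketch, not Google's actual model), a logit-scale linear regression can map the share of flu-related search queries onto a rate of influenza-like-illness (ILI) physician visits; the data and variable names below are entirely hypothetical.

```python
# Toy illustration (NOT Google's actual GFT model): regress the logit of the
# flu-related query share onto the logit of the ILI physician-visit rate,
# then "nowcast" ILI from a newly observed query share.
# All numbers below are made up for demonstration purposes only.
import numpy as np
from scipy.special import logit, expit

# Hypothetical weekly observations: query share and matching ILI visit rate.
query_fraction = np.array([0.002, 0.004, 0.009, 0.015, 0.011, 0.005])
ili_rate       = np.array([0.010, 0.018, 0.035, 0.052, 0.040, 0.021])

# Fit one linear model on the logit scale: logit(ILI) ~ a * logit(Q) + b
a, b = np.polyfit(logit(query_fraction), logit(ili_rate), deg=1)

# Estimate the ILI rate for a new week from its observed query share.
new_query_fraction = 0.012
predicted_ili = expit(a * logit(new_query_fraction) + b)
print(f"Estimated ILI visit rate: {predicted_ili:.3f}")
```

    A model of this kind only extrapolates from past query-to-ILI correlations, which is exactly the dependence on historical patterns that the article identifies as a source of GFT's failures during unexpected surges.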

    Disaster and Pandemic Management Using Machine Learning: A Survey

    This article provides a literature review of state-of-the-art machine learning (ML) algorithms for disaster and pandemic management. Most nations are concerned about disasters and pandemics, which, in general, are highly unlikely events. To date, various technologies, such as IoT, object sensing, UAVs, 5G and cellular networks, smartphone-based systems, and satellite-based systems, have been used for disaster and pandemic management. ML algorithms can handle the large volumes of multidimensional data that occur naturally in environments related to disaster and pandemic management and are particularly well suited for important related tasks, such as recognition and classification. ML algorithms are useful for predicting disasters and assisting in disaster management tasks, such as determining crowd evacuation routes, analyzing social media posts, and handling the post-disaster situation. ML algorithms also find great application in pandemic management scenarios, such as predicting pandemics, monitoring pandemic spread, and diagnosing disease. This article first presents a tutorial on ML algorithms. It then presents a detailed review of several ML algorithms and how they can be combined with other technologies to address disaster and pandemic management. It also discusses various challenges, open issues, and directions for future research.
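    One of the tasks the survey mentions is analyzing social media posts during a disaster. As a minimal sketch of what such a classifier might look like (the tiny corpus, labels, and model choice below are assumptions for illustration, not taken from the survey), a TF-IDF plus logistic regression baseline can flag disaster-related posts.

```python
# Minimal sketch: classify social media posts as disaster-related or not.
# The example texts and labels are invented purely for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "Flooding on Main Street, roads are impassable",
    "Earthquake felt downtown, buildings evacuated",
    "Great concert last night, see you next week",
    "Wildfire smoke visible from the highway, stay indoors",
    "New coffee shop opening on 5th Avenue",
    "Power outage across the district after the storm",
]
labels = [1, 1, 0, 1, 0, 1]  # 1 = disaster-related, 0 = unrelated

# TF-IDF features feeding a logistic regression: a common simple baseline
# for text classification tasks of this kind.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

print(model.predict(["Bridge closed due to rising floodwater"]))  # expect [1]
```

    In practice, the survey's point is that such classifiers would be trained on far larger labeled corpora and often combined with other data sources (sensor networks, satellite imagery, UAV feeds) rather than used in isolation.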