
    Social media monitoring: Responsive governance in the shadow of surveillance?

    Social media monitoring is gradually becoming a common practice in public organizations in the Netherlands. The main purposes of social media monitoring are strategic control and responsiveness. Social media monitoring poses normative questions in terms of transparency, accountability and privacy. We investigate practices of social media monitoring in four Dutch public organizations. Policy departments seem to be more strongly oriented towards monitoring, whereas organizations involved in policy implementation seem to be more inclined to progress to webcare. The paper argues for more transparency on social media monitoring.

    Understanding the role of social media monitoring in generating external intelligence

    Social media data are becoming increasingly critical for businesses to capture, analyse, and utilise in a timely manner. However, the unstructured and distributed nature and volume of this information make the task of extracting useful and practical information challenging. Given the dynamic evolution of social media and social media monitoring, our current understanding of how social media monitoring can help organisations to create business value is inadequate. As a result, there is a need to study how organisations can (a) extract and analyse social media data related to their business (Sensing), and (b) utilise external intelligence gained from social media monitoring for specific business initiatives (Seizing). This study uses a qualitative approach with a multiple embedded case study design to understand the phenomenon of social media monitoring and its outcome for organisations. Anticipated contributions are presented.

    Social-media monitoring for cold-start recommendations

    Generating personalized movie recommendations to users is a problem that most commonly relies on user-movie ratings. These ratings are generally used either to understand user preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, Social-Media platforms, such as Twitter, collect great amounts of user feedback on movies, as these platforms are very popular nowadays. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies and show how it can be used to tackle the Cold-Start problem. It also proposes, at a finer grain, to explore the reputation of directors and actors on IMDb to tackle the Cold-Start problem. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. Additionally, the algorithm also proved to be useful when recommending movies in an extreme Cold-Start scenario, where both new movies and users are affected by the Cold-Start problem.
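
    The abstract above outlines a reputation-enhanced fallback for the Cold-Start problem. The sketch below is a minimal, hypothetical illustration of that idea, not the thesis' actual algorithm: it blends an assumed Twitter mention count with the average IMDb rating of a movie's director and cast as a prior, and falls back to that prior when no user ratings exist. All field names, weights and toy data are assumptions.

```python
# Minimal sketch of a reputation-based cold-start fallback (hypothetical data
# structures and weights; the thesis' actual algorithm may differ).
from statistics import mean

def reputation_score(movie, twitter_mentions, person_ratings):
    """Blend Twitter buzz with the average IMDb rating of the director and cast."""
    buzz = twitter_mentions.get(movie["title"], 0)
    people = [movie["director"]] + movie["cast"]
    known = [person_ratings[p] for p in people if p in person_ratings]
    reputation = mean(known) if known else 5.0           # neutral prior on a 0-10 scale
    buzz_component = min(buzz / 1000.0, 1.0) * 10.0      # crude normalisation of mention counts
    return 0.7 * reputation + 0.3 * buzz_component       # illustrative weighting

def recommend_score(movie, user_movie_ratings, twitter_mentions, person_ratings):
    """Use existing ratings when they exist, otherwise fall back to reputation."""
    past = user_movie_ratings.get(movie["title"])
    if past:                                             # movie already rated by someone
        return mean(past)
    return reputation_score(movie, twitter_mentions, person_ratings)

# Example: a new, unrated movie scored purely from reputation signals.
new_movie = {"title": "New Release", "director": "Director A", "cast": ["Actor B", "Actor C"]}
print(recommend_score(new_movie, {}, {"New Release": 2400}, {"Director A": 7.8, "Actor B": 6.9}))
```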

    Social Media Monitoring

    The developed system aims to integrate and monitor a user's Spanish-language information from social networks (Facebook, Twitter and web news of interest) through a single web application. It is based on three main components: a module that implements a wide variety of Natural Language Processing (NLP) tasks, an information retrieval module that captures social network data by means of crawlers and stores the processed results, and a web application that presents a user interface for visualising the retrieved information in an engaging and interactive way. The proposed solution therefore allows users to stay updated and keep control of their social networks, following both their publications and their interests in a single, simple and intuitive graphical interface. This work was sponsored in part by the Big Data and Cognitive Systems Group of the Instituto Tecnológico de Aragón. The dissemination of this work was partially funded by the Programa Operativo FSE para Aragón (2014-2020).
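
    As a rough illustration of the three-component architecture described above (crawler, NLP module, web front end), the following sketch wires together placeholder functions; none of the names correspond to the project's real modules or to any social network API.

```python
# Minimal sketch of the crawler -> NLP -> storage flow feeding a web UI.
# All functions are placeholders, not the project's actual components.

def crawl_sources(user):
    """Placeholder crawler: would pull the user's Facebook/Twitter posts and news items."""
    return [{"source": "twitter", "text": "Ejemplo de publicación"},
            {"source": "news", "text": "Titular de interés"}]

def analyse(item):
    """Placeholder NLP step: language ID, sentiment and entities would be computed here."""
    return {**item, "lang": "es", "sentiment": "neutral", "entities": []}

def store(results, db):
    """Persist processed items so the web application can visualise them."""
    db.extend(results)

db = []
store([analyse(item) for item in crawl_sources("demo-user")], db)
print(len(db), "items ready for the dashboard")
```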

    Neue Einblicke – Social Media Monitoring in der Stadtplanung

    The social web is a communication channel that is now widespread and can generally be regarded as a standard. It has been described many times how social media can also be used in spatial planning for communication and participation (cf. Habbel, Huber 2008; VHW 2011; Haller, Höffken 2011). Thanks to the continuous development of hardware and software and an increasingly complex internet, today's urban planning has a multitude of instruments at its disposal for carrying out successful planning. It is now possible to support and handle the various fields of work with digital tools. One example is the use of social media in the context of citizen participation. To carry out more precise analyses of user behaviour in this setting, monitoring tools such as Google Analytics can be employed. The focus here is on the possibility of responding even more precisely to users' interests and suggestions in order to achieve more targeted communication (von Dobeneck, 2012). The aim of this paper is to find out what the use of an accompanying social media strategy can look like and what added value can be achieved for urban planning. After an introductory part, the fundamentals of social media and the added value of digital instruments in urban planning are presented. The central element of the paper is the results of examining a practical example, a weblog for an urban development project. Based on the data collected there, insights regarding possible analyses and evaluations, as well as limits and obstacles, are presented. On the basis of this practice-oriented analysis, the general findings and the benefits of use in urban planning are outlined, from which concrete recommendations for action can be derived. Possible risks that the collection of user-generated data entails with regard to data protection and the right to privacy are also presented. This critical reflection is intended to counteract excessive euphoria about data analysis and to provide a realistic assessment of its application potential for urban planning.

    Firsthand Opiates Abuse on Social Media: Monitoring Geospatial Patterns of Interest Through a Digital Cohort

    In the last decade, drug overdose deaths reached staggering proportions in the US. Beyond the raw yearly death count, which is worrisome in itself, an alarming picture emerges from the steep acceleration of this rate, which increased by 21% from 2015 to 2016. While traditional public health surveillance suffers from its own biases and limitations, digital epidemiology offers a new lens to extract signals from the Web and social media that might be complementary to official statistics. In this paper we present a computational approach to identify a digital cohort that might provide an updated and complementary view on the opioid crisis. We introduce an information retrieval algorithm suitable for identifying relevant subspaces of discussion on social media, and use it to mine data from users showing explicit interest in discussions about opioid consumption on Reddit. Moreover, despite the pseudonymous nature of the user base, almost 1.5 million users were geolocated at the US state level, in good agreement with the census population distribution. A measure of the prevalence of interest in opiate consumption has been estimated at the state level, producing a novel indicator with information that is not entirely encoded in standard surveillance. Finally, we further provide a domain-specific vocabulary containing informal lexicon and street nomenclature extracted from user-generated content that can be used by researchers and practitioners to implement novel digital public health surveillance methodologies for supporting policy makers in fighting the opioid epidemic. Comment: Proceedings of the 2019 World Wide Web Conference (WWW '19).
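
    The state-level prevalence indicator described above can be illustrated with a small, hypothetical computation: the share of geolocated users in each state who show explicit interest in opioid-related discussion. The toy records and field names below are assumptions, not the paper's dataset.

```python
# Minimal sketch of a per-state prevalence-of-interest measure (toy data).
from collections import Counter

users = [
    {"id": "u1", "state": "OH", "opioid_interest": True},
    {"id": "u2", "state": "OH", "opioid_interest": False},
    {"id": "u3", "state": "CA", "opioid_interest": False},
    {"id": "u4", "state": "CA", "opioid_interest": True},
    {"id": "u5", "state": "CA", "opioid_interest": False},
]

geolocated = Counter(u["state"] for u in users)                       # all geolocated users per state
interested = Counter(u["state"] for u in users if u["opioid_interest"])  # users posting in opioid subspaces

# Prevalence of interest per state: interested users / all geolocated users.
prevalence = {state: interested[state] / geolocated[state] for state in geolocated}
print(prevalence)   # e.g. {'OH': 0.5, 'CA': 0.333...}
```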

    The role of visualisations in social media monitoring systems

    Social-media streams constantly supply vast volumes of real-time user-generated content through platforms such as Twitter, Facebook, and Instagram, which makes them a challenge to monitor and understand. Understanding social conversations has now become a major interest for businesses, PR and advertising agencies, as well as law enforcement and government bodies. Monitoring of social media allows us to observe large numbers of spontaneous, real-time interactions and varied expressions of opinion, often fleeting and private. However, human, expert monitoring is generally unfeasible due to the high volumes of data. This has been a major reason for recent research and development work on automated social-media monitoring systems. Such systems often keep the human "out of the loop" as an NLP (Natural Language Processing) pipeline and other data-mining algorithms deal with analysing and extracting features and meaning from the data. This approach is plagued by a variety of problems, mostly due to the heterogeneous, inconsistent and context-poor nature of social-media data, and as a result the accuracy and efficacy of such systems suffer. Nevertheless, automated social-media monitoring systems provide a scalable, streamlined and often efficient way of dealing with big-data streams. The integration of processing outputs from automated systems and feedback to human experts is a challenge and deserves to be addressed in the research literature. This paper will establish the role of the human in the social-media monitoring loop, based on prior systems work in this area. The focus of our investigation will be on the use of visualisations for effective feedback to human experts. A case study of a specific, custom-built system in a social-media monitoring scenario will be considered, and suggestions on how to bring the human back "into the loop" will be provided. Some related ethical questions will also be briefly considered. It is hoped that this work will inform and provide valuable insight to help improve the development of automated social-media monitoring systems.
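
    One concrete way to bring the human back "into the loop", sketched below under assumed field names and an assumed confidence threshold, is to route low-confidence pipeline outputs to an expert review queue that a visualisation dashboard could then surface. This is an illustration of the general idea, not the custom-built system studied in the paper.

```python
# Minimal sketch of human-in-the-loop triage for an automated monitoring pipeline.
REVIEW_THRESHOLD = 0.7   # assumed confidence cut-off, not taken from the paper

def triage(posts):
    """Split pipeline outputs into auto-accepted results and items needing expert review."""
    accepted, review_queue = [], []
    for post in posts:
        target = accepted if post["confidence"] >= REVIEW_THRESHOLD else review_queue
        target.append(post)
    return accepted, review_queue

posts = [
    {"text": "Great service today!", "label": "positive", "confidence": 0.93},
    {"text": "hmm not sure about this...", "label": "negative", "confidence": 0.51},
]

accepted, review_queue = triage(posts)
print(len(accepted), "auto-accepted;", len(review_queue), "sent to a reviewer dashboard")
```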

    Using Social Media Monitoring Data to Forecast Online Word-of-Mouth Valence: A Network Autoregressive Approach

    Managers increasingly use social media for marketing research, particularly to monitor what consumers think about brands. Although social media monitoring can provide rich insights into consumer attitudes, marketers typically use it in a backward-looking manner, that is, to measure past online word-of-mouth (WOM) valence (i.e., sentiment). This article proposes a novel method for using social media monitoring in a forward-looking manner to forecast brands’ future online WOM valence. The approach takes into account information on related brands, based on the premise that consumers’ attitudes toward one brand are likely relative to, and therefore associated with, attitudes toward other brands. The method infers associative relations between brands from social media monitoring data by observing which brands are mentioned at the same time in the same social media sources, thus enabling construction of time-varying brand “networks” that represent interdependencies between brands. The authors test six possible methods for capturing brand interdependencies (Jaccard, Dice, anti-Dice, correlation, normalized correlation, and Euclidean distance) and examine the relative performance of each alternative method with a view to identifying the best approach.
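
    As an illustration of one of the six interdependence measures named above, the sketch below computes Jaccard similarities between brands from the sets of sources that co-mention them in a single time window. The toy data and variable names are assumptions, and the article's other measures (Dice, anti-Dice, correlation, normalized correlation, Euclidean distance) would slot into the same structure.

```python
# Minimal sketch of a Jaccard-based brand co-mention network for one time window.
from itertools import combinations

# source id -> brands mentioned in that source during the window (toy data)
mentions = {
    "post1": {"BrandA", "BrandB"},
    "post2": {"BrandA"},
    "post3": {"BrandB", "BrandC"},
    "post4": {"BrandA", "BrandC"},
}

# invert to brand -> set of sources mentioning it
sources_per_brand = {}
for source, brands in mentions.items():
    for brand in brands:
        sources_per_brand.setdefault(brand, set()).add(source)

def jaccard(a, b):
    """Intersection over union of the sources mentioning brands a and b."""
    inter = len(sources_per_brand[a] & sources_per_brand[b])
    union = len(sources_per_brand[a] | sources_per_brand[b])
    return inter / union if union else 0.0

# edge weights of the time-varying brand "network" for this window
network = {pair: jaccard(*pair) for pair in combinations(sorted(sources_per_brand), 2)}
print(network)
```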