
    Online Friendships and the Bird’s Nest Drawing in the Age of the Internet

    This study was a qualitative exploration of friendships facilitated through the internet and online video games. The goal was to investigate how online friendships compare to in-person friendships in terms of quality. Three English-speaking participants who played an online video game and had an online friendship provided unique case studies describing the differences between an online and an in-person friendship. The Bird’s Nest Drawing art assessment (Kaiser, 1996; 2016) revealed themes of attachment security that helped explain the variations in the friendships. The findings of this study open the topic of online friendships for further exploration in the field of art therapy, both in research and in therapy settings.

    Enhancing Cloud System Runtime to Address Complex Failures

    As reliance on cloud systems intensifies in our progressively digital world, understanding and reinforcing their reliability becomes more crucial than ever. Despite impressive advances in augmenting the resilience of cloud systems, the growing incidence of complex failures now poses a substantial challenge to the availability of these systems. As cloud systems continue to scale and increase in complexity, failures not only become more elusive to detect but can also lead to more catastrophic consequences. Such failures call into question the foundational premises of conventional fault-tolerance designs and necessitate novel system designs to counteract them. This dissertation aims to enhance distributed systems’ capabilities to detect, localize, and react to complex failures at runtime. To this end, it makes contributions that address three emerging categories of failures in cloud systems. The first part investigates partial failures and introduces OmegaGen, a tool that generates tailored checkers for detecting and localizing such failures. The second part addresses silent semantic failures prevalent in cloud systems, presenting our study findings and introducing Oathkeeper, a tool that leverages past failures to infer rules and expose these silent issues. The third part explores solutions to slow failures via RESIN, a framework designed to detect, diagnose, and mitigate memory leaks in cloud-scale infrastructures, developed in collaboration with Microsoft Azure. The dissertation concludes by offering insights into future directions for the construction of reliable cloud systems.
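
    A minimal sketch may help make the idea of runtime checkers concrete. The code below is not OmegaGen, Oathkeeper, or RESIN; it is an illustrative watchdog, with invented names (Worker, watchdog, stall_threshold_s), that flags one kind of partial failure: a process that stays alive but stops making progress.

```python
# Illustrative sketch only (not the dissertation's tooling): a watchdog
# checker that reports a suspected partial failure when a worker process
# remains alive but stops making progress. All names are invented.
import threading
import time


class Worker:
    """Toy worker that records a timestamp whenever it makes progress."""

    def __init__(self):
        self.last_progress = time.monotonic()

    def do_work(self):
        # Real work would go here; under a partial failure the process
        # keeps running but this update stops happening.
        self.last_progress = time.monotonic()


def watchdog(worker, stall_threshold_s=5.0, period_s=1.0):
    """Checker: flag the worker if it has made no recent progress."""
    while True:
        stalled_for = time.monotonic() - worker.last_progress
        if stalled_for > stall_threshold_s:
            print(f"suspected partial failure: no progress for {stalled_for:.1f}s")
        time.sleep(period_s)


if __name__ == "__main__":
    worker = Worker()
    threading.Thread(target=watchdog, args=(worker,), daemon=True).start()
    for _ in range(3):          # simulate a few rounds of normal progress
        worker.do_work()
        time.sleep(1)
```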

    Specialized translation at work for a small, expanding company: my experience of internationalization into Chinese for Bioretics© S.r.l.

    Global markets are currently immersed in two all-encompassing and unstoppable processes: internationalization and globalization. While the former pushes companies to look beyond the borders of their country of origin to forge relationships with foreign trading partners, the latter fosters standardization across countries by reducing spatiotemporal distances and breaking down geographical, political, economic, and socio-cultural barriers. In recent decades, another domain has emerged to propel these unifying drives: Artificial Intelligence, with its advanced technologies that aim to reproduce human cognitive abilities in machines. The “Language Toolkit – Le lingue straniere al servizio dell’internazionalizzazione dell’impresa” project, promoted by the Department of Interpreting and Translation (Forlì Campus) in collaboration with the Romagna Chamber of Commerce (Forlì-Cesena and Rimini), seeks to help Italian SMEs make their way into the global market. This dissertation was conceived within that project. Its purpose is to present the translation and localization project from English into Chinese of a series of texts produced by Bioretics© S.r.l.: an investor deck, the company website, and part of the installation and use manual of the Aliquis© framework software, its flagship product. The dissertation is structured as follows: Chapter 1 presents the project and the company in detail; Chapter 2 outlines the internationalization and globalization processes and the Artificial Intelligence market in both Italy and China; Chapter 3 provides the theoretical foundations for every aspect related to Specialized Translation, including website localization; Chapter 4 describes the resources and tools used to perform the translations; Chapter 5 proposes an analysis of the source texts; and Chapter 6 is a commentary on translation strategies and choices.

    NCC: Natural Concurrency Control for Strictly Serializable Datastores by Avoiding the Timestamp-Inversion Pitfall

    Strictly serializable datastores greatly simplify the development of correct applications by providing strong consistency guarantees. However, existing techniques pay unnecessary costs for naturally consistent transactions, which arrive at servers in an order that is already strictly serializable. We find that these transactions are prevalent in datacenter workloads. We exploit this natural arrival order by executing transaction requests with minimal costs while optimistically assuming they are naturally consistent, and then leverage a timestamp-based technique to efficiently verify whether the execution is indeed consistent. In the process of designing such a timestamp-based technique, we identify a fundamental pitfall in relying on timestamps to provide strict serializability, and name it the timestamp-inversion pitfall. We find that timestamp-inversion has affected several existing works. We present Natural Concurrency Control (NCC), a new concurrency control technique that guarantees strict serializability and ensures minimal costs -- i.e., one-round latency with lock-free, non-blocking execution -- in the best (and common) case by leveraging natural consistency. NCC is enabled by three key components: non-blocking execution, decoupled response control, and a timestamp-based consistency check. NCC avoids timestamp-inversion with a new technique, response timing control, and proposes two optimization techniques, asynchrony-aware timestamps and smart retry, to reduce false aborts. Moreover, NCC designs a specialized protocol for read-only transactions, the first to achieve optimal best-case performance while ensuring strict serializability without relying on synchronized clocks. Our evaluation shows that NCC outperforms state-of-the-art solutions by an order of magnitude on many workloads.
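
    The general pattern the abstract describes, executing requests optimistically in their natural arrival order and then verifying consistency with timestamps, can be sketched as follows. This is not NCC’s actual protocol; the store layout, version timestamps, and function names are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the NCC protocol): execute a transaction
# optimistically in its natural arrival order, then run a timestamp-based
# check and commit only if nothing it read has since been overwritten.
import itertools

_clock = itertools.count(1)   # toy logical clock for commit timestamps
store = {}                    # key -> (value, version_timestamp)


def execute_optimistically(read_keys, writes):
    """Read immediately, remembering the version of every key observed."""
    values = {k: store.get(k, (None, 0))[0] for k in read_keys}
    observed = {k: store.get(k, (None, 0))[1] for k in read_keys}
    return values, observed, writes


def verify_and_commit(observed, writes):
    """Commit only if the observed versions are still current; otherwise
    the transaction would be retried or aborted."""
    if any(store.get(k, (None, 0))[1] != ts for k, ts in observed.items()):
        return False
    commit_ts = next(_clock)
    for k, value in writes.items():
        store[k] = (value, commit_ts)
    return True


values, observed, writes = execute_optimistically({"x"}, {"y": 42})
print("read:", values, "committed:", verify_and_commit(observed, writes))
```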

    Cognitive Machine Individualism in a Symbiotic Cybersecurity Policy Framework for the Preservation of Internet of Things Integrity: A Quantitative Study

    This quantitative study examined the complex nature of modern cyber threats in order to propose establishing cyber as an interdisciplinary field of public policy, initiated through the creation of a symbiotic cybersecurity policy framework. For the public good (and to maintain ideological balance), there must be recognition that public policies are at a transition point: the digital public square is a tangible reality that is more than a collection of technological widgets. The academic contribution of this research project is the fusion of humanistic principles with Internet of Things (IoT) technologies, which alters our perception of the machine from an instrument of human engineering into a thinking peer and elevates cyber from technical esoterism into an interdisciplinary field of public policy. The contribution to the US national cybersecurity policy body of knowledge is a unified policy framework (manifested in the symbiotic cybersecurity policy triad) that could transform cybersecurity policies from network-based to entity-based. A correlational archival data design was used. For RQ1, the frequency of malicious software attacks was the dependent variable and the diversity of intrusion techniques the independent variable. For RQ2, the frequency of detection events was the dependent variable and the diversity of intrusion techniques the independent variable. Self-Determination Theory provides the theoretical framework, as the cognitive machine can recognize, self-endorse, and maintain its own identity based on a sense of self-motivation that is progressively shaped by its ability to learn. The transformation of cyber policies from technical esoterism into an interdisciplinary field of public policy starts with the recognition that the cognitive machine is an independent consumer of, advisor to, and entity influenced by public policy theories, philosophical constructs, and societal initiatives.
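
    As a worked illustration of the correlational design described above (not the study’s data), the sketch below computes a Pearson correlation between an invented series of intrusion-technique diversity scores and attack frequencies, mirroring the RQ1 variable pairing.

```python
# Hypothetical illustration of the RQ1 correlational design: diversity of
# intrusion techniques (independent variable) vs. frequency of malicious
# software attacks (dependent variable). The numbers are invented purely
# to show the computation; they are not the study's data.
from math import sqrt

technique_diversity = [3, 5, 2, 8, 6, 4, 7]       # IV: distinct techniques seen
attack_frequency = [14, 22, 9, 35, 27, 18, 30]    # DV: attacks observed


def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)


print(f"r = {pearson_r(technique_diversity, attack_frequency):.3f}")
```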

    Second-Person Surveillance: Politics of User Implication in Digital Documentaries

    This dissertation analyzes digital documentaries that utilize second-person address and roleplay to make users feel implicated in contemporary refugee crises, mass incarceration in the U.S., and state and corporate surveillance. Digital documentaries are seemingly more interactive and participatory than linear film and video documentary: they are composed of a variety of auditory, visual, and written media, utilize networked technologies, and turn the documentary audience into a documentary user. I draw on scholarship from documentary, game, new media, and surveillance studies to analyze how second-person address in digital documentaries is configured through user positioning and direct address within the works themselves, in how organizations and creators frame their productions, and in how users and players respond in reviews, discussion forums, and Let’s Plays. I build on Michael Rothberg’s theorization of the implicated subject to explore how these digital documentaries bring the user into complicated relationality with national and international crises. Visually and experientially implying that users bear responsibility to the subjects and subject matter, these works can, on the one hand, replicate modes of liberal empathy for suffering, distant “others” and, on the other, simulate users’ own surveillant modes of observation or behavior, mirroring them back and opening up users’ offline thoughts and actions as a site of critique. This dissertation charts how second-person address shapes and limits the political potentialities of documentary projects and connects them to a lineage of direct address from educational and propaganda films, museum exhibits, and serious games. By centralizing the user’s individual experience, the interventions that second-person digital documentaries can make into social discourse change from public, institution-based education to more privatized forms of sentimental education geared toward personal edification and self-realization. I argue that, unless tied to larger initiatives or movements, digital documentaries reaffirm a neoliberal politics of individual self-regulation and governance instead of public education or collective, social intervention.

    Chapter one focuses on 360-degree virtual reality (VR) documentaries that utilize the feeling of presence to position users as if among refugees and as witnesses to refugee experiences in camps outside of Europe and various dwellings in European cities. My analysis of Clouds Over Sidra (Gabo Arora and Chris Milk 2015) and The Displaced (Imraan Ismail and Ben C. Solomon 2015) shows how these VR documentaries utilize observational realism to make believable and immersive their representations of already empathetic refugees. The empathetic refugee is often young, vulnerable, depoliticized, and dehistoricized, a well-known trope in other forms of humanitarian media that continues into VR documentaries. Forced to Flee (Zahra Rasool 2017), I am Rohingya (Zahra Rasool 2017), So Leben Flüchtlinge in Berlin (Berliner Morgenpost 2017), and Limbo: A Virtual Experience of Waiting for Asylum (Shehani Fernando 2017) disrupt easy immersion into realistic-looking VR experiences of stereotyped representations and user identifications and can, instead, reflect back the user’s political inaction and surveillant modes of looking.

    Chapter two analyzes web- and social media messenger-based documentaries that position users as outsiders to U.S. mass incarceration. Users are noir-style co-investigators into the crime of the prison-industrial complex in Fremont County, Colorado in Prison Valley: The Prison Industry (David Dufresne and Philippe Brault 2009) and co-riders on a bus transporting prison inmates’ loved ones to visitations at correctional facilities in Upstate New York in A Temporary Contact (Nirit Peled and Sara Kolster 2017). Both projects construct an experience of carceral constraint for users to reinscribe seemingly “outside” places, people, and experiences as within the continuation of the racialized and classed politics of state control through mass incarceration. These projects utilize interfaces that create a tension: they replicate an exploitative hierarchy between non-incarcerated users and those subject to mass incarceration, while also de-immersing users from these experiences to mirror back the user’s supposed distance from this mode of state regulation.

    Chapter three investigates a type of digital game I term dataveillance simulation games, which position users as surveillance agents in ambiguously dystopian nation-states and force users to use their own critical thinking and judgment to construct the criminality of state-sanctioned surveillance targets. Project Perfect Citizen (Bad Cop Studios 2016), Orwell: Keeping an Eye on You (Osmotic Studios 2016), and Papers, Please (Lucas Pope 2013) all create a dual empathy: players empathize with bureaucratic surveillance agents while also empathizing with surveillance targets whose emails, text messages, documents, and social media profiles reveal them to be “normal” people. I argue that while these games show criminality to be a construct, they also utilize a racialized fear of the loss of one’s individual privacy to make players feel that they too could be surveillance targets.

    Chapter four examines personalized digital documentaries that turn users and their data into the subject matter. Do Not Track (Brett Gaylor 2015), A Week with Wanda (Joe Derry Hall 2019), Stealing Ur Feelings (Noah Levenson 2019), Alfred Premium (Joël Ronez, Pierre Corbinais, and Émilie F. Grenier 2019), How They Watch You (Nick Briz 2021), and Fairly Intelligent™ (A.M. Darke 2021) track, monitor, and confront users with their own online behavior to reflect back a corporate surveillance that collects, analyzes, and exploits user data for profit. These digital documentaries utilize emotional fear- and humor-based appeals to persuade users that these technologies are controlling them, shaping their desires and needs, and dehumanizing them through algorithmic surveillance.

    Pandemic Protagonists: Viral (Re)Actions in Pandemic and Corona Fictions

    During the first mandatory lockdowns of the Covid-19 pandemic, citizens worldwide turned to “pandemic fictions” or started to produce their own “Corona Fictions” across different media. These accounts of (previously) experienced or imagined health crises feature a great variety of protagonists and their (re)actions in response to the exceptional circumstances. The contributors to this volume take a closer look at different pandemic protagonists in fictional narratives relating to the Covid-19 pandemic as well as in existing pandemic fictions. They thereby provide new insights into pandemic narratives, from antiquity to today, from a cultural, literary, and media studies perspective.

    Cross-Supply Chain Collaboration Platform for Pallet Management

    Standardized pallets are an important factor in today’s logistics sector, enabling efficient processes in transport, storage, and handling. An open exchange pool for pallets creates additional opportunities for horizontal and vertical collaboration among actors from different supply chains. The dissertation “Cross-Supply Chain Collaboration Platform for Pallet Management” investigates the potential of a digital platform for such cross-actor collaboration in pallet management. The designed platform provides dedicated mechanisms for balancing pallet debts that arise in the network and for the joint planning of empty-pallet flows. The impact of the designed platform on logistics processes, especially transports, is explored using simulation modeling. Furthermore, blockchain technology is investigated as a possible basis for implementing the platform concept and for generating trust in a network of unknown actors. In this context, an empirical online experiment is used to analyze which specific features of blockchain technology generate trust in the technology and how these features interact with each other.
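
    To make the idea of balancing pallet debts concrete, here is a toy sketch, not the dissertation’s actual model: actors in an open exchange pool owe each other pallets, and the platform nets offsetting debts before any empty-pallet transport is planned. Actor names and figures are invented.

```python
# Toy sketch (assumed, not the dissertation's model): net out offsetting
# pallet debts in an open exchange pool so that only the remaining balances
# require physical empty-pallet transports. All names and figures invented.
debt = {
    ("carrier_A", "shipper_B"): 120,   # carrier_A owes shipper_B 120 pallets
    ("shipper_B", "carrier_A"): 80,
    ("shipper_B", "retailer_C"): 50,
}


def net_debts(debt):
    """Cancel mutual debts; return only the net balance per actor pair."""
    netted = {}
    for (a, b), owed in debt.items():
        if (a, b) in netted or (b, a) in netted:
            continue                    # pair already handled
        back = debt.get((b, a), 0)
        if owed > back:
            netted[(a, b)] = owed - back
        elif back > owed:
            netted[(b, a)] = back - owed
    return netted


print(net_debts(debt))
# -> {('carrier_A', 'shipper_B'): 40, ('shipper_B', 'retailer_C'): 50}
```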

    Algorithmic authority in music creation: the beauty of losing control

    Algorithms have taken over an increasing number of human tasks in the realm of the arts. With newly available AI techniques, typically relying on the concept of self-learning, the line separating computer assistance from actual algorithmic creation is blurring. To provide context, I give an overview, through recent illustrative examples, of the ways in which artists are making use of sophisticated algorithms in their work, and of how this new form of AI-driven art might differ from early computer-generated art. Then, based on recent theories concerning a post-humanist society and the rise of what some authors call dataism, I discuss issues related to autonomy, collaboration, authorship, and compositional control over the creation of sound and music. From a practical perspective, I present the Free/Libre and Open Source Software (FLOSS) that has been a consistent presence in my compositional process, and discuss how its ecosystem, mainly in the domain of AI, has shaped my work over the last decade from both aesthetic and practical standpoints. A series of three mixed pieces produced as part of this research-creation project is then analyzed. I present the different dimensions in which the concept of algorithmic authority took part in my composition process, from technical approaches to aesthetic choices. Finally, I propose strategies for music notation, based on open standards, aiming to ensure the readability, and therefore the longevity, of my work.
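
    As a small, purely illustrative example of handing a compositional decision to an algorithm (not code from the dissertation), the sketch below trains a first-order Markov chain on an invented seed melody and lets the chain, rather than the composer, choose each next pitch.

```python
# Illustrative sketch only (not from the dissertation): a first-order Markov
# chain picks the next pitches, shifting a small compositional decision from
# the composer to the algorithm. Seed melody and note names are invented.
import random
from collections import defaultdict

seed_melody = ["C4", "E4", "G4", "E4", "A4", "G4", "E4", "C4"]

# Transition table: which pitches have followed which in the seed.
transitions = defaultdict(list)
for current, nxt in zip(seed_melody, seed_melody[1:]):
    transitions[current].append(nxt)


def generate(start="C4", length=8, seed=42):
    """Let the chain, not the composer, choose each next note."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        melody.append(rng.choice(options) if options else start)
    return melody


print(generate())
```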

    DLAS: An Exploration and Assessment of the Deep Learning Acceleration Stack

    Deep Neural Networks (DNNs) are extremely computationally demanding, which presents a large barrier to their deployment on resource-constrained devices. Since such devices are where many emerging deep learning applications lie (e.g., drones, vision-based medical technology), significant bodies of work from both the machine learning and systems communities have attempted to provide optimizations to accelerate DNNs. To help unify these two perspectives, in this paper we combine machine learning and systems techniques within the Deep Learning Acceleration Stack (DLAS), and demonstrate through an across-stack perturbation study how tightly its layers can depend on each other. We evaluate the impact on accuracy and inference time when varying different parameters of DLAS across two datasets, seven popular DNN architectures, four DNN compression techniques, three algorithmic primitives with sparse and dense variants, untuned and auto-scheduled code generation, and four hardware platforms. Our evaluation highlights how perturbations across DLAS parameters can cause significant variation and across-stack interactions. The highest-level observation from our evaluation is that model size, accuracy, and inference time are not guaranteed to be correlated. Overall we make 13 key observations, including that speedups provided by compression techniques are very hardware dependent, and that compiler auto-tuning can significantly alter which algorithm is best for a given configuration. With DLAS, we aim to provide a reference framework to aid machine learning and systems practitioners in reasoning about the context in which their respective DNN acceleration solutions exist. With our evaluation strongly motivating the need for co-design, we believe that DLAS can be a valuable concept for exploring the next generation of co-designed accelerated deep learning solutions.
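
    The across-stack perturbation study can be pictured as a joint sweep over stack parameters. The sketch below is an assumed illustration, not the DLAS tooling: benchmark() is a hypothetical stand-in for compiling, running, and measuring a configuration, and the parameter values are examples only.

```python
# Assumed illustration (not the DLAS tooling): sweep several acceleration-
# stack parameters jointly and record accuracy and inference time per
# configuration. benchmark() is a hypothetical placeholder that returns
# invented numbers instead of really training, compiling, and measuring.
import itertools
import random


def benchmark(model, compression, primitive, hardware):
    """Placeholder measurement: returns made-up (accuracy, latency_ms)."""
    rng = random.Random(f"{model}|{compression}|{primitive}|{hardware}")
    return round(rng.uniform(0.60, 0.95), 3), round(rng.uniform(2.0, 50.0), 1)


models = ["resnet18", "mobilenet_v2"]
compressions = ["none", "int8_quantization", "pruning"]
primitives = ["dense_conv", "sparse_conv"]
platforms = ["server_gpu", "embedded_cpu"]

results = []
for cfg in itertools.product(models, compressions, primitives, platforms):
    accuracy, latency_ms = benchmark(*cfg)
    results.append((*cfg, accuracy, latency_ms))

# A joint sweep makes cross-layer interactions visible, e.g. whether a
# compression technique's speedup holds on every hardware platform.
for row in sorted(results, key=lambda r: r[-1])[:5]:
    print(row)
```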