
    Recalibrating machine learning for social biases: demonstrating a new methodology through a case study classifying gender biases in archival documentation

    This thesis proposes a recalibration of Machine Learning for social biases to minimize harms from existing approaches and practices in the field. Prioritizing quality over quantity, accuracy over efficiency, representativeness over convenience, and situated thinking over universal thinking, the thesis demonstrates an alternative approach to creating Machine Learning models. Drawing on GLAM, the Humanities, the Social Sciences, and Design, the thesis focuses on understanding and communicating biases in a specific use case. 11,888 metadata descriptions from the University of Edinburgh Heritage Collections' Archives catalog were manually annotated for gender biases, and text classification models were then trained on the resulting dataset of 55,260 annotations. Evaluations of the models' performance demonstrate that annotating gender biases can be automated; however, the subjectivity of bias as a concept complicates the generalizability of any one approach. The contributions are: (1) an interdisciplinary and participatory Bias-Aware Methodology, (2) a Taxonomy of Gendered and Gender Biased Language, (3) data annotated for gender biased language, (4) gender biased text classification models, and (5) a human-centered approach to model evaluation. The contributions have implications for Machine Learning, demonstrating how bias is inherent to all data and models; more specifically for Natural Language Processing, providing an annotation taxonomy, annotated datasets, and classification models for analyzing gender biased language at scale; for the Gallery, Library, Archives, and Museum sector, offering guidance to institutions seeking to reconcile with histories of marginalizing communities through their documentation practices; and for historians, who utilize cultural heritage documentation to study and interpret the past. Through a real-world application of the Bias-Aware Methodology in a case study, the thesis illustrates the need to shift away from removing social biases and towards acknowledging them, creating data and models that surface the uncertainty and multiplicity characteristic of human societies.
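    As a rough illustration of the classification step described above, the sketch below trains a text classifier on annotated metadata descriptions. It uses a generic scikit-learn pipeline rather than the thesis's actual models, and the file name, column names, and label values are hypothetical.

```python
# A minimal, generic text-classification pipeline for annotated metadata
# descriptions. File name, column names, and labels are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

df = pd.read_csv("annotations.csv")  # hypothetical: one row per description

X_train, X_test, y_train, y_test = train_test_split(
    df["description"], df["bias_label"],  # hypothetical column names
    test_size=0.2, random_state=0, stratify=df["bias_label"],
)

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
    ("clf", LogisticRegression(max_iter=1000, class_weight="balanced")),
])
model.fit(X_train, y_train)

# Per-class precision and recall matter more than overall accuracy here:
# bias labels are subjective and typically imbalanced.
print(classification_report(y_test, model.predict(X_test)))
```

    Reporting per-class precision and recall rather than a single accuracy figure fits the abstract's emphasis on human-centered evaluation of subjective, imbalanced labels.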

    Node-Based Native Solution to Procedural Game Level Generation

    Procedural Content Generation (PCG) applied to game development has become a prominent topic with increasing implementations and use cases. However, existing standalone and plugin PCG solutions, which use node-based interfaces and other high-level approaches, face limitations in integration, interactivity, and responsiveness within the game development pipeline. These limitations hinder the overall user experience and restrain the true potential of PCG systems. Adopting an Action-Research methodology, a preliminary study was conducted through interviews with experts in the field. The relevance of this native methodology, and the most suitable visual approach for its interface, was assessed through a series of prototypes. Subsequently, a functional prototype was implemented, and a case study was conducted with a sample consisting of PCG experts and game developers. The participants performed a series of exercises documented with the respective tutorials. After completing the exercises, the solution's relevance and user experience were evaluated through a questionnaire. In developing a native node-based PCG methodology integrated into the game engine, we identified limitations and concluded that several challenges are yet to be overcome before a complex and extensive system can be fully implemented.
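    To make the node-based idea concrete, here is a minimal sketch of the evaluation model such interfaces typically expose: a graph whose nodes compute values from their upstream nodes, evaluated on demand. The node types (a noise source feeding a threshold filter that yields a toy tile map) are illustrative only and are not taken from the prototype described above.

```python
# A toy version of the node-graph model behind node-based PCG interfaces:
# each node computes its value from upstream nodes, so a level is described
# as a graph rather than as imperative code. Node types are illustrative.
import random

class Node:
    def __init__(self, op, *inputs, **params):
        self.op = op          # function implementing the node
        self.inputs = inputs  # upstream nodes
        self.params = params  # node settings exposed in the UI

    def evaluate(self):
        args = [node.evaluate() for node in self.inputs]
        return self.op(*args, **self.params)

def noise_grid(width, height, seed=0):
    # Source node: a grid of random floats in [0, 1).
    rng = random.Random(seed)
    return [[rng.random() for _ in range(width)] for _ in range(height)]

def threshold(grid, level):
    # Filter node: turn the float grid into walls (#) and floor (.).
    return [["#" if v > level else "." for v in row] for row in grid]

# Wire a two-node graph: noise source -> threshold filter.
noise = Node(noise_grid, width=16, height=8, seed=42)
level = Node(threshold, noise, level=0.6)

for row in level.evaluate():
    print("".join(row))
```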

    Anime Studies: Media-Specific Approaches to Neon Genesis Evangelion

    Anime Studies: Media-Specific Approaches to Neon Genesis Evangelion aims at advancing the study of anime, understood as largely TV-based genre fiction rendered in cel, or cel-look, animation with a strong affinity to participatory cultures and media convergence. Making Neon Genesis Evangelion (Shin Seiki Evangerion, 1995-96) its central case and nodal point, this volume foregrounds anime as a medium with clearly recognizable aesthetic properties, (sub)cultural affordances, and situated discourses.

    50 Years of Quantum Chromodynamics – Introduction and Review


    Impact of Geomatic Techniques on Topo-Bathymetric Surveys for Coastal Analysis

    Sandy coasts represent vital areas whose preservation and maintenance also involve economic and tourist interests. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly realized by combining engineering solutions with management policies, and their effects must be evaluated over time. Monitoring activities are the fundamental instrument for obtaining deep knowledge of the investigated phenomenon. Thanks to technological development, several options are available in terms of both geomatic surveying techniques and processing tools, allowing high performance and accuracy to be achieved. Nevertheless, when the littoral definition includes both emerged and submerged beaches, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated according to the individual application, with the reference system, accuracy, and spatial resolution as primary aspects. This study evaluates the available geomatic techniques, processing approaches, and derived products, aiming to optimise the entire coastal-monitoring workflow by adopting an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an additional value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications for numerical modelling as supporting tools. Moreover, the theme of coastal monitoring is addressed throughout this thesis from a practical point of view, linked to the activities performed by Arpae (the regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.
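    As a small illustration of the "reference system" aspect mentioned above, the sketch below reprojects survey points into a projected CRS so that emerged and submerged datasets can be merged in metric coordinates. The CRS codes and sample coordinates are assumptions for illustration, not values from the thesis.

```python
# Illustrative sketch: bringing emerged (e.g. GNSS/photogrammetry) survey
# points into a metric projected CRS before merging with bathymetric data.
# EPSG codes and coordinates are assumptions, not taken from the thesis.
from pyproj import Transformer

# Points in geographic coordinates: (longitude, latitude, height in metres).
beach_points = [(12.565, 44.055, 46.2), (12.570, 44.057, 45.8)]

# Reproject to UTM zone 33N (EPSG:32633), which covers the Adriatic coast
# of Emilia-Romagna, so distances and resolutions are expressed in metres.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32633", always_xy=True)
for lon, lat, h in beach_points:
    easting, northing = transformer.transform(lon, lat)
    print(f"E={easting:.2f} m, N={northing:.2f} m, h={h:.2f} m")
```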

    Decisioning 2022: Collaboration in knowledge discovery and decision making: Applications to sustainable agriculture

    Sustainable agriculture is one of the Sustainable Development Goals (SDGs) proposed by the United Nations (UN), but little systematic work on knowledge discovery and decision making has been applied to it. Knowledge discovery and decision making have become active research areas in recent years. In the era of FAIR (Findable, Accessible, Interoperable, Reusable) data science, linked data with a high degree of variety and different degrees of veracity can be easily correlated and put in perspective, giving an empirical and scientific view of best practices in the sustainable agriculture domain. This requires combining multiple methods and technologies, such as elicitation, specification, validation, semantic web technologies, information retrieval, formal concept analysis, collaborative work, semantic interoperability, ontological matching, smart contracts, and multiple-criteria decision making. Decisioning 2022 is the first workshop on Collaboration in knowledge discovery and decision making: Applications to sustainable agriculture. It has been organized by six research teams from France, Argentina, Colombia, and Chile to explore the current frontier of knowledge and applications in different areas related to knowledge discovery and decision making. The format of the workshop aims at discussion and knowledge exchange between academia and industry members.

    Novel Architectures for Offloading and Accelerating Computations in Artificial Intelligence and Big Data

    Due to the end of Moore's Law and Dennard Scaling, performance gains in general-purpose architectures have significantly slowed in recent years. While raising the number of cores has been a viable approach for further performance increases, Amdahl's Law and its implications for parallelization also limit further gains. Consequently, research has shifted towards different approaches, including domain-specific custom architectures tailored to specific workloads. This has led to a new golden age for computer architecture, as noted in the Turing Award Lecture by Hennessy and Patterson, and has spawned several new architectures and architectural advances targeted at currently prominent workloads, including Machine Learning. This thesis introduces a hierarchy of architectural improvements ranging from minor incremental changes, such as High-Bandwidth Memory, to more complex architectural extensions that offload workloads from the general-purpose CPU towards more specialized accelerators. Finally, we introduce novel architectural paradigms, namely Near-Data and In-Network Processing, as the most complex architectural improvements. This cumulative dissertation then investigates several architectural improvements to accelerate Sum-Product Networks, a Machine Learning approach from the class of Probabilistic Graphical Models. Furthermore, we use these improvements as case studies to discuss the impact of novel architectures, showing that both minor and major architectural changes can significantly increase performance in Machine Learning applications. In addition, this thesis presents recent work on Near-Data Processing, which introduces Smart Storage Devices as a novel architectural paradigm that is especially interesting in the context of Big Data. We discuss how Near-Data Processing can be applied to improve performance in different database settings by offloading database operations to smart storage devices. Offloading data-reductive operations, such as selections, reduces the amount of data transferred, thus improving performance and alleviating bandwidth-related bottlenecks. Using Near-Data Processing as a use case, we also discuss how Machine Learning approaches, like Sum-Product Networks, can improve novel architectures. Specifically, we introduce an approach for offloading Cardinality Estimation using Sum-Product Networks that could enable more intelligent decision-making in smart storage devices. Overall, we show that Machine Learning can benefit from the development of novel architectures, while also showing that Machine Learning can be applied to improve the applications of novel architectures.
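    For readers unfamiliar with Sum-Product Networks, the sketch below shows bottom-up SPN evaluation: leaf nodes score single variables, product nodes combine children over disjoint variable scopes, and sum nodes form weighted mixtures. The structure and parameters here are illustrative only, not taken from the dissertation. Inference reduces to multiplications and weighted additions, which is what makes SPNs amenable to the custom accelerators and offloading schemes the abstract discusses.

```python
# Minimal bottom-up evaluation of a Sum-Product Network (SPN).
# Structure and parameters are illustrative only.
import math

class Leaf:
    """Gaussian leaf over one variable index."""
    def __init__(self, var, mean, std):
        self.var, self.mean, self.std = var, mean, std
    def value(self, x):
        z = (x[self.var] - self.mean) / self.std
        return math.exp(-0.5 * z * z) / (self.std * math.sqrt(2 * math.pi))

class Product:
    """Multiplies children defined over disjoint variable scopes."""
    def __init__(self, *children):
        self.children = children
    def value(self, x):
        result = 1.0
        for child in self.children:
            result *= child.value(x)
        return result

class Sum:
    """Weighted mixture of children over the same variable scope."""
    def __init__(self, weighted_children):
        self.weighted_children = weighted_children  # list of (weight, node)
    def value(self, x):
        return sum(w * c.value(x) for w, c in self.weighted_children)

# Two-variable SPN: a mixture of two independence assumptions.
spn = Sum([
    (0.3, Product(Leaf(0, 0.0, 1.0), Leaf(1, 0.0, 1.0))),
    (0.7, Product(Leaf(0, 3.0, 0.5), Leaf(1, 2.0, 0.5))),
])
print(spn.value([2.8, 1.9]))  # joint density estimate at one point
```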

    A review of commercialisation mechanisms for carbon dioxide removal

    The deployment of carbon dioxide removal (CDR) needs to be scaled up to achieve net zero emission pledges. In this paper we survey the policy mechanisms currently in place globally to incentivise CDR, together with an estimate of what different mechanisms are paying per tonne of CDR and how those costs are currently distributed. Incentive structures are grouped into three categories: market-based, public procurement, and fiscal mechanisms. We find that the majority of mechanisms currently in operation are under-resourced and pay too little to enable a portfolio of CDR that could support the achievement of net zero. Most mechanisms are concentrated in market-based and fiscal structures, specifically carbon markets and subsidies. While not primarily motivated by CDR, these mechanisms tend to support established afforestation and soil carbon sequestration methods. Mechanisms for geological CDR remain largely underdeveloped relative to the requirements of modelled net zero scenarios. Commercialisation pathways for CDR require suitable policies and markets throughout a project's development cycle. Discussion and investment in CDR have tended to focus on technology development; our findings suggest that an equal or greater emphasis on policy innovation may be required if future requirements for CDR are to be met. This study can further support research and policy on identifying incentive gaps and the realistic potential for CDR globally.