5,616 research outputs found

    Endogenous measures for contextualising large-scale social phenomena: a corpus-based method for mediated public discourse

    This work presents an interdisciplinary methodology for developing endogenous measures of group membership through analysis of pervasive linguistic patterns in public discourse. Focusing on political discourse, it critiques the conventional approach to the study of political participation, which is premised on decontextualised, exogenous measures to characterise groups. Considering the theoretical and empirical weaknesses of decontextualised approaches to large-scale social phenomena, this work suggests that contextualisation using endogenous measures might provide a complementary perspective to mitigate such weaknesses. It develops a sociomaterial perspective on political participation in mediated discourse as affiliatory action performed through language. While the affiliatory function of language is often performed consciously (such as statements of identity), this work is concerned with unconscious features (such as patterns in lexis and grammar). It argues that pervasive patterns in such features, which emerge through socialisation, are resistant to change and manipulation, and might therefore serve as endogenous measures of sociopolitical contexts, and hence of groups. In terms of method, the work takes a corpus-based approach to the analysis of data from the Twitter messaging service, whereby patterns in users’ speech are examined statistically in order to trace potential community membership. The method is applied in the US state of Michigan during the second half of 2018, which included the 6 November midterm (i.e. non-Presidential) elections in the United States. The corpus is assembled from the original posts of 5,889 users, who are nominally geolocalised to 417 municipalities. These users are clustered according to pervasive language features. Comparing the linguistic clusters according to the municipalities they represent reveals regular sociodemographic differentials across clusters. This is understood as an indication of social structure, suggesting that endogenous measures derived from pervasive patterns in language may indeed offer a complementary, contextualised perspective on large-scale social phenomena.
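
    As a rough illustration of the clustering step described above, the sketch below groups users by normalised frequencies of linguistic features. All data, the feature count, the number of clusters, and the choice of k-means are assumptions for demonstration, not the study's actual pipeline.

```python
# Minimal sketch of the clustering step: users are represented by rates
# of pervasive lexico-grammatical features and grouped with k-means.
# All data and parameters below are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical per-user rates of function-word / grammatical features
# (e.g. occurrences per 1,000 tokens), one row per user.
features = rng.gamma(shape=2.0, scale=1.5, size=(5889, 40))

X = StandardScaler().fit_transform(features)
clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Each user now carries a cluster label that can be cross-tabulated
# against the sociodemographics of their geolocalised municipality.
print(np.bincount(clusters))
```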

    CITIES: Energetic Efficiency, Sustainability; Infrastructures, Energy and the Environment; Mobility and IoT; Governance and Citizenship

    This book collects important contributions on smart cities. It was created in collaboration with ICSC-CITIES2020, held in San José (Costa Rica) in 2020, and gathers articles on energetic efficiency and sustainability; infrastructures, energy and the environment; mobility and IoT; and governance and citizenship.

    Development of a comprehensive greenhouse gas management tool for decision-making against climate change at the regional and local level in the Comunitat Valenciana

    Thesis by compendium. Currently, regional and local decision-makers lack tools to produce greenhouse gas (GHG) emissions inventories with sufficient scientific and technical rigour, accuracy, and completeness to prioritise and invest the available resources efficiently in the measures needed to fight climate change. This thesis therefore presents the development of a territorial and sectoral information system (SITE) for monitoring GHG emissions, intended as a local and regional climate governance tool. The system combines the advantages of the top-down and bottom-up methodological approaches into an innovative hybrid approach for accounting for and managing GHG emissions efficiently. The thesis defines the methodological developments, both general and specific to key Intergovernmental Panel on Climate Change (IPCC) sectors (building, transport, forestry, etc.), a software implementation of the system's back-end (the server-side part of SITE), and seven representative case-study implementations at different scales (local and regional) applied to different sectors. These implementations demonstrate the system's potential as a decision-support tool against climate change at the regional and local level. The pilot cases, at the regional level in the Comunitat Valenciana and at the local level in large (València) and medium-sized municipalities (Quart de Poblet and Llíria), show the tool's capacity for territorial and sectoral adaptation. The methodologies developed for the specific sectors of road transport, building, and forestry offer quantifications at a spatial resolution well suited to optimising local and regional policies. The tool therefore has great potential for scalability and continuous improvement through the inclusion of new methodological approaches, the adaptation of methodologies to data availability, specific methodologies for key sectors, and updates to the best available methodologies arising from research in the scientific community.
    Lorenzo Sáez, E. (2022). Desarrollo de una herramienta integral de gestión de gases de efecto invernadero para la toma de decisión contra el cambio climático a nivel regional y local en la Comunitat Valenciana [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/181662
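
    The hybrid top-down/bottom-up idea can be illustrated with a toy reconciliation step, in which bottom-up sectoral estimates are scaled to match a top-down regional total. The sectors, figures, and proportional scaling rule below are invented for illustration and are not taken from SITE itself.

```python
# Toy sketch of hybrid GHG accounting: a top-down regional total is used
# to reconcile (proportionally scale) bottom-up sectoral estimates.
# All figures and sector names are invented placeholders.
regional_total_ktco2e = 12_000.0  # top-down inventory total (assumed)

bottom_up_ktco2e = {  # bottom-up estimates per IPCC-style sector (assumed)
    "road_transport": 4_900.0,
    "buildings": 3_200.0,
    "forestry": -600.0,  # net carbon sink
    "other": 3_700.0,
}

scale = regional_total_ktco2e / sum(bottom_up_ktco2e.values())
reconciled = {sector: v * scale for sector, v in bottom_up_ktco2e.items()}

for sector, value in reconciled.items():
    print(f"{sector:>15}: {value:8.1f} ktCO2e")
```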

    Digital asset management via distributed ledgers

    Distributed ledgers rose to prominence with the advent of Bitcoin, the first provably secure protocol to solve consensus in an open-participation setting. Subsequently, active research and engineering efforts have proposed a multitude of applications and alternative designs, the most prominent being Proof-of-Stake (PoS). This thesis expands the scope of secure and efficient asset management over a distributed ledger around three axes: i) cryptography; ii) distributed systems; iii) game theory and economics. First, we analyze the security of various wallets. We start with a formal model of hardware wallets, followed by an analytical framework of PoS wallets, each outlining the unique properties of Proof-of-Work (PoW) and PoS respectively. The latter also provides a rigorous design for forming collaborative participating entities, called stake pools. We then propose Conclave, a stake pool design which enables a group of parties to participate in a PoS system collaboratively, without a central operator. Second, we focus on efficiency. Decentralized systems are aimed at thousands of users across the globe, so a rigorous design for minimizing memory and storage consumption is a prerequisite for scalability. To that end, we frame ledger maintenance as an optimization problem and design a multi-tier framework for wallets which ensures that updates increase the ledger’s global state only to a minimal extent, while preserving the guarantees outlined in the security analysis. Third, we explore incentive-compatibility and analyze blockchain systems from both a micro- and a macroeconomic perspective. We enrich our cryptographic and systems results by analyzing the incentives of collective pools and designing a state-efficient Bitcoin fee function. We then analyze the Nash dynamics of distributed ledgers, introducing a formal model that evaluates whether rational, utility-maximizing participants are disincentivized from exhibiting undesirable infractions, and highlighting the differences between PoW- and PoS-based ledgers, both in a standalone setting and under external parameters, such as market price fluctuations. We conclude by introducing a macroeconomic principle, cryptocurrency egalitarianism, and describing two mechanisms for enabling taxation in blockchain-based currency systems.
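
    As a toy illustration of collaborative stake pools, the sketch below splits a pool's reward proportionally to each member's delegated stake. This generic proportional scheme is an assumption for demonstration only and is not the Conclave reward mechanism.

```python
# Toy sketch of proportional reward sharing in a stake pool: members
# receive rewards in proportion to their delegated stake. Names and
# stake amounts are invented; this is not the Conclave design itself.
def split_rewards(stakes: dict[str, float], total_reward: float) -> dict[str, float]:
    """Distribute total_reward proportionally to each member's stake."""
    total_stake = sum(stakes.values())
    return {member: total_reward * s / total_stake for member, s in stakes.items()}

pool = {"alice": 600.0, "bob": 300.0, "carol": 100.0}  # assumed stakes
print(split_rewards(pool, total_reward=50.0))
# {'alice': 30.0, 'bob': 15.0, 'carol': 5.0}
```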

    Full stack development toward a trapped ion logical qubit

    Quantum error correction is a key step toward the construction of a large-scale quantum computer, preventing small infidelities in quantum gates from accumulating over the course of an algorithm. Detecting and correcting errors is achieved by using multiple physical qubits to form a smaller number of robust logical qubits. The physical implementation of a logical qubit requires multiple qubits on which high-fidelity gates can be performed. The project aims to realize a logical qubit based on ions confined on a microfabricated surface trap. Each physical qubit will be a microwave dressed-state qubit based on 171Yb+ ions. Gates are intended to be realized through RF and microwave radiation in combination with magnetic field gradients. The project vertically integrates the software stack down to the hardware compilation layers in order to deliver, in the near future, a fully functional small device demonstrator. This thesis presents novel results on multiple layers of a full-stack quantum computer model. On the hardware level, a robust quantum gate is studied and ion displacement over the X-junction geometry is demonstrated. The experimental organization is optimized through automation and compressed waveform data transmission. A new quantum assembly language dedicated purely to trapped ion quantum computers is introduced. The demonstrator is aimed at testing implementations of quantum error correction codes while preparing for larger-scale iterations.
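
    The logical-qubit idea can be illustrated with the classical three-bit repetition code, where majority voting corrects any single bit-flip. This toy simulation ignores phase errors and stabilizer measurement, and it is not the error correction code targeted by the project.

```python
# Minimal illustration of the logical-qubit idea: a three-bit repetition
# code corrects any single bit-flip error by majority vote. A classical
# toy only: real codes must also handle phase errors via stabilizers.
import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]  # one logical bit -> three physical bits

def apply_noise(bits: list[int], p_flip: float) -> list[int]:
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits: list[int]) -> int:
    return int(sum(bits) >= 2)  # majority vote

trials, p = 100_000, 0.05
errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
# Logical error rate ~ 3p^2 = 0.75%, well below the physical rate p = 5%.
print(errors / trials)
```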

    Industrial Robotics for Advanced Machining

    This work presents a literature review of the current state of robotic machining with industrial machining robots, primarily six-axis robots of serial-link (anthropomorphic) construction. Various disadvantages of robotic machining in industry are presented, along with the methods applied to mitigate them and discussions of their effects. From this review, the methods of dynamic modelling, stability prediction, and configuration control are selected for application to the optimisation of a robotic machining cell for drilling operations. Matrix Structural Analysis (MSA) and methods developed by Klimchik et al. are used for compliance modelling; stability prediction methods developed by Altintas et al. and machining stability lobe prediction are then applied to a robotic drilling process, as explored by Mousavi et al. This optimisation method is applied using the measured and estimated properties of an ABB IRB 6640 robot, and results are compared with previous experiments on the physical robot and with analytical stability predictions for the same cutting parameters from the CutPro software. Results are discussed in the concluding chapters, together with discontinued parts of the project and suggestions for future work.
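
    The stability-lobe analysis referenced above rests on the classic single-mode regenerative-chatter limit, in which the critical depth of cut is a_lim = -1 / (2 Kt Re[G(jw)]) wherever the real part of the structure's frequency response G is negative. The sketch below evaluates this limit with placeholder parameters; they are not measured IRB 6640 values.

```python
# Hedged sketch of the single-mode stability limit used in Altintas-style
# chatter analysis. All structural and cutting parameters are invented.
import numpy as np

k, zeta, fn = 2.0e7, 0.03, 50.0   # stiffness [N/m], damping ratio, natural freq [Hz]
Kt = 2.0e9                        # specific cutting-force coefficient [N/m^2]
wn = 2 * np.pi * fn

w = 2 * np.pi * np.linspace(40, 150, 4000)                  # chatter freqs [rad/s]
G = 1.0 / (k * (1 - (w / wn) ** 2 + 2j * zeta * (w / wn)))  # single-mode FRF

valid = G.real < 0                       # lobes exist only where Re[G] < 0
a_lim = -1.0 / (2.0 * Kt * G.real[valid])  # critical depth of cut [m]

print(f"absolute stability limit: {a_lim.min() * 1e3:.3f} mm depth of cut")
```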

    Graphical scaffolding for the learning of data wrangling APIs

    In order for students across the sciences to avail themselves of modern data streams, they must first know how to wrangle data: how to reshape ill-organised tabular data into another format, and how to do this programmatically, in languages such as Python and R. Despite the cross-departmental demand and the ubiquity of data wrangling in analytical workflows, research on how to optimise its instruction has been minimal. Although data wrangling as a programming domain presents distinctive challenges, characterised by on-the-fly syntax lookup and code example integration, it also presents opportunities. One such opportunity is that tabular data structures are easily visualised. To leverage the inherent visualisability of data wrangling, this dissertation evaluates three types of graphics that could be employed as scaffolding for novices: subgoal graphics, thumbnail graphics, and parameter graphics. Using a specially built e-learning platform, this dissertation documents a multi-institutional, randomised, controlled experiment that investigates the pedagogical effects of these graphics. Our results indicate that the graphics are well received, that subgoal graphics boost completion rates, and that thumbnail graphics improve navigability within a command menu. We also obtained several non-significant results, as well as indications that parameter graphics are counter-productive. We discuss these findings in the context of general scaffolding dilemmas and how they fit into a wider research programme on data wrangling instruction.
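
    For readers unfamiliar with the domain, a typical wrangling task of the kind the dissertation teaches is reshaping a wide table into tidy long format. The small pandas example below is illustrative only and is not taken from the e-learning platform.

```python
# A small example of the reshaping tasks the dissertation targets:
# converting wide, ill-organised tabular data to tidy long format.
# The data here is invented for illustration.
import pandas as pd

wide = pd.DataFrame({
    "site": ["A", "B"],
    "2021": [10, 20],
    "2022": [15, 25],
})

# melt: one row per (site, year) observation instead of one column per year
tidy = wide.melt(id_vars="site", var_name="year", value_name="count")
print(tidy)
```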

    Developing automated meta-research approaches in the preclinical Alzheimer's disease literature

    Alzheimer’s disease is a devastating neurodegenerative disorder for which there is no cure. A crucial part of the drug development pipeline involves testing therapeutic interventions in animal disease models. However, promising findings in preclinical experiments have not translated into clinical trial success. Reproducibility has often been cited as a major issue affecting biomedical research, where experimental results obtained in one laboratory cannot be replicated in another. By using meta-research (research on research) approaches such as systematic reviews, researchers aim to identify and summarise all available evidence relating to a specific research question. By conducting a meta-analysis, researchers can also combine the results of different experiments statistically to understand the overall effect of an intervention and to explore reasons for variation across publications. Systematic reviews of the preclinical Alzheimer’s disease literature could inform decision making, encourage research improvement, and identify gaps in the literature to guide future research. However, due to the vast amount of potentially useful evidence from animal models of Alzheimer’s disease, it remains difficult to make sense of and utilise these data effectively. Systematic reviews are common practice within evidence-based medicine, yet their application to preclinical research is often limited by the time and resources required. In this thesis, I develop, build upon, and implement automated meta-research approaches to collect, curate, and evaluate the preclinical Alzheimer’s disease literature. I searched several biomedical databases to obtain all research relevant to Alzheimer’s disease. I developed a novel deduplication tool to automatically identify and remove duplicate publications found across different databases with minimal human effort. I trained a crowd of reviewers to annotate a subset of the identified publications and used these data to train a machine learning algorithm to screen the remaining publications for relevance. I developed text-mining tools to extract model, intervention, and treatment information from publications, and I improved existing automated tools for extracting reported measures to reduce the risk of bias. Using these tools, I created a categorised database of research in transgenic Alzheimer’s disease animal models and a visual summary of this dataset on an interactive, openly accessible online platform. Using the techniques described, I also identified relevant publications within the categorised dataset to perform systematic reviews of two key outcomes of interest in transgenic Alzheimer’s disease models: (1) synaptic plasticity and transmission in hippocampal slices and (2) motor activity in the open field test. Over 400,000 publications were identified across biomedical research databases, of which 230,203 were unique. In a performance evaluation across different preclinical datasets, the automated deduplication tool I developed identified over 97% of duplicate citations and had an error rate similar to that of human performance. When evaluated on a test set of publications, the machine learning classifier trained to identify relevant research in transgenic models was highly sensitive (capturing 96.5% of relevant publications) and excluded 87.8% of irrelevant publications.
    Tools to identify the model(s) and outcome measure(s) within the full text of publications may reduce the burden on reviewers and were found to be more sensitive than searching only the title and abstract of citations. Automated tools to assess risk-of-bias reporting were highly sensitive and could have the potential to monitor research improvement over time. The final dataset of categorised Alzheimer’s disease research contained 22,375 publications, which were then visualised in the interactive web application. Within the application, users can see how many publications report measures to reduce the risk of bias and how many have been classified as using each transgenic model, testing each intervention, and measuring each outcome. Users can also filter to obtain curated lists of relevant research, allowing them to perform systematic reviews at an accelerated pace, with reduced effort required to search across databases and a reduced number of publications to screen for relevance. Both systematic reviews and meta-analyses highlighted failures to report key methodological information within publications. Poor transparency of reporting limited the statistical power available to understand the sources of between-study variation. However, some variables were found to explain a significant proportion of the heterogeneity. The transgenic animal model used had a significant impact on results in both reviews. For certain open field test outcomes, the wall colour of the open field arena and the reporting of measures to reduce the risk of bias were found to affect results. For in vitro electrophysiology experiments measuring synaptic plasticity, several electrophysiology parameters, including the magnesium concentration of the recording solution, were found to explain a significant proportion of the heterogeneity. Automated meta-research approaches and curated web platforms summarising preclinical research have the potential to accelerate the conduct of systematic reviews and maximise the potential of existing evidence to inform translation.
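
    As a sketch of what automated citation deduplication can look like, the snippet below normalises titles and combines DOI equality with a string-similarity threshold. The records, the 0.9 threshold, and the use of difflib are assumptions for illustration, not the thesis's actual tool.

```python
# Illustrative citation deduplication: prefer exact DOI matches, then
# fall back to a similarity ratio over normalised titles. The real tool
# described in the thesis is more sophisticated than this sketch.
import re
from difflib import SequenceMatcher

def normalise(title: str) -> str:
    """Lowercase and strip punctuation so near-identical titles compare equal."""
    return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

def is_duplicate(a: dict, b: dict, threshold: float = 0.9) -> bool:
    if a.get("doi") and a.get("doi") == b.get("doi"):
        return True
    ratio = SequenceMatcher(None, normalise(a["title"]), normalise(b["title"])).ratio()
    return ratio >= threshold

rec1 = {"title": "Synaptic plasticity in APP/PS1 mice", "doi": "10.1000/x1"}
rec2 = {"title": "Synaptic Plasticity in APP-PS1 Mice.", "doi": None}
print(is_duplicate(rec1, rec2))  # True: titles match after normalisation
```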

    English for specific academic purposes: ICT classroom. Practical course of English 1

    This electronic textbook, «English for specific academic purposes: ICT classroom», is recommended for classroom work and self-study by first-year students of the Faculty of Informatics and Computer Science majoring in Information Systems and Technologies (speciality 126) at the National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”. The main goal of the publication is to develop communicative skills in a professionally oriented English-language environment, covering listening, reading, speaking, and writing, and to improve the translation of specialised vocabulary. The textbook consists of nine units, a grammar reference, appendices, and a list of abbreviations and acronyms, the content of which reflects the professional activities of future specialists in the field of information technology. The textbook meets the requirements of the syllabus.