
    Language integrated relational lenses

    Relational databases are ubiquitous. Such monolithic databases accumulate large amounts of data, yet applications typically only work on small portions of the data at a time. A subset of the database defined as a computation on the underlying tables is called a view. Querying views is helpful, but it is also desirable to update them and have these changes be applied to the underlying database. This view update problem has been the subject of much previous work, but support by database servers is limited and only rarely available. Lenses are a popular approach to bidirectional transformations, a generalization of the view update problem in databases to arbitrary data. However, perhaps surprisingly, lenses have seldom actually been used to implement updatable views in databases. Bohannon, Pierce and Vaughan propose an approach to updatable views called relational lenses. However, to the best of our knowledge this proposal had not been implemented or evaluated prior to the work reported in this thesis. This thesis proposes programming language support for relational lenses. Language-integrated relational lenses support expressive and efficient view updates without relying on updatable view support from the database server. By integrating relational lenses into the programming language, application development becomes easier and less error-prone, avoiding the impedance mismatch of working in two programming languages. Integrating relational lenses into the language poses additional challenges. First, as defined by Bohannon et al., relational lenses completely recompute the database, making them inefficient as the database scales. Second, some parts of the well-formedness conditions are too general for implementation: Bohannon et al. specify predicates using possibly infinite abstract sets and define the type checking rules using relational algebra. Incremental relational lenses equip relational lenses with change-propagating semantics that map small changes to the view into (potentially) small changes to the source tables. We prove that our incremental semantics are functionally equivalent to the non-incremental semantics, and our experimental results show orders of magnitude improvement over the non-incremental approach. This thesis introduces a concrete predicate syntax, shows how the required checks are performed on these predicates, and shows that they satisfy the abstract predicate specifications. We discuss trade-offs between static predicates that are fully known at compile time and dynamic predicates that are only known during execution, and introduce hybrid predicates that take inspiration from both approaches. This thesis adapts the typing rules for relational lenses from sequential composition to a functional style of sub-expressions. We prove that a well-typed sequential lens can be derived from any well-typed functional relational lens expression. We use these additions to relational lenses as the foundation for two practical implementations: an extension of the Links functional language and a library written in Haskell. The second implementation demonstrates how type-level computation can be used to implement relational lenses without changes to the compiler. These two implementations attest to the possibility of turning relational lenses into a practical language feature.
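    To make the lens idea concrete, the following is a minimal, self-contained sketch of a "select" lens over an in-memory table, in the spirit of Bohannon et al.'s relational lenses. The class name, the delta representation in put_delta, and the example table are illustrative assumptions, not the thesis's actual Links or Haskell API.

```python
# Illustrative sketch only: a "select" relational lens over an in-memory
# table. Names and the delta representation are hypothetical.

class SelectLens:
    """View = rows of the source satisfying `pred`; put writes changes back."""

    def __init__(self, pred):
        self.pred = pred

    def get(self, source):
        # Forward direction: compute the view from the source table.
        return [row for row in source if self.pred(row)]

    def put(self, source, view):
        # Backward direction: rows outside the view's scope are kept,
        # rows inside it are replaced wholesale (non-incremental semantics).
        return [row for row in source if not self.pred(row)] + list(view)

    def put_delta(self, source, inserted, deleted):
        # Incremental semantics: apply a small view delta directly to the
        # source instead of recomputing the whole table.
        kept = [row for row in source if row not in deleted]
        return kept + list(inserted)


employees = [{"name": "ada", "dept": "eng"}, {"name": "bob", "dept": "hr"}]
lens = SelectLens(lambda r: r["dept"] == "eng")

view = lens.get(employees)                  # [{'name': 'ada', 'dept': 'eng'}]
view.append({"name": "cy", "dept": "eng"})  # edit the view...
employees = lens.put(employees, view)       # ...and push the edit back

# Round-trip laws (PutGet/GetPut) that well-behaved lenses must satisfy:
assert lens.get(lens.put(employees, view)) == view
assert lens.put(employees, lens.get(employees)) == employees
```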

    The Development of Novel Genomic Technologies that Classify Pathogens of Public Health Significance

    Recent advances in whole genome sequencing (WGS) have led to the routine sequencing of pathogenic bacteria that burden public health care. Taking advantage of the freely available genomes of tens of thousands of bacterial isolates, this thesis developed novel genomic classification technologies that described the long- and short-term genomic epidemiology of two pathogens of significant disease burden, Vibrio cholerae and Staphylococcus aureus. V. cholerae is the etiological agent of cholera and has circumnavigated the globe in a series of seven pandemics. In Chapter 3, a novel tool named Multilevel Genome Typing (MGT) was developed to classify the V. cholerae species with a focus on the seventh pandemic. The V. cholerae MGT analysed a seventh pandemic dataset (n=4,770), described the seventh pandemic's global population structure, and reconfirmed the origins of the 2016 Yemen outbreak, considered the worst cholera outbreak in modern history. Informed by the successful application of the V. cholerae MGT, in Chapter 4 an S. aureus MGT was developed. S. aureus asymptomatically colonises approximately 30% of the population and is responsible for the onset of over a dozen diseases. The MGT characterised the global population structure of a clone that emerged in the early 2000s and became the major cause of infections in North America. The S. aureus MGT further investigated the persistent colonisation of patients not associated with hospitals, describing the carriage of multiple isolates colonising the same patient. To gain a high-level view of the population structure consistent with phylogenetic divisions, in Chapter 5 a novel genomic classification tool named the S. aureus Lineage Typer (SaLTy) was developed for the species-level classification of S. aureus. We applied SaLTy to a species dataset (n=50,481) to generate a snapshot of the species population structure and identified six large lineages representing most of the species. To summarise, this thesis developed three novel genomic technologies that, when applied, improved the description of S. aureus and V. cholerae genomic epidemiology. The classifications defined by these technologies can inform the design of prevention and control strategies aiming to lower the disease and economic burdens caused by S. aureus and V. cholerae.
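    The core idea behind multilevel typing can be shown with a short sketch: each MGT level types a genome against a progressively larger set of loci, so isolates that share a coarse type can still be separated at finer levels. The loci, allele profiles and sequence-type numbering below are invented for illustration and do not correspond to the published MGT schemes.

```python
# Toy illustration of the multilevel typing idea (not the published MGT tool):
# each level uses a larger set of loci, so types get progressively finer.

LEVELS = {
    1: ["locusA", "locusB"],                      # coarse: few loci
    2: ["locusA", "locusB", "locusC", "locusD"],  # finer: more loci
}

def assign_types(isolates):
    """Map each isolate's allele profile to a sequence type (ST) per level."""
    registries = {level: {} for level in LEVELS}  # profile -> ST number
    assignments = {}
    for name, alleles in isolates.items():
        sts = {}
        for level, loci in LEVELS.items():
            profile = tuple(alleles[locus] for locus in loci)
            # First time a profile is seen at this level, mint a new ST.
            sts[level] = registries[level].setdefault(
                profile, len(registries[level]) + 1)
        assignments[name] = sts
    return assignments

isolates = {
    "iso1": {"locusA": 1, "locusB": 2, "locusC": 7, "locusD": 3},
    "iso2": {"locusA": 1, "locusB": 2, "locusC": 9, "locusD": 3},
}
print(assign_types(isolates))
# {'iso1': {1: 1, 2: 1}, 'iso2': {1: 1, 2: 2}} -- same coarse type,
# different fine type: the multilevel resolution MGT exploits.
```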

    Concurrent Asynchronous Byzantine Agreement in Expected-Constant Rounds, Revisited

    It is well known that without randomization, Byzantine agreement (BA) requires a linear number of rounds in the synchronous setting, while it is flat out impossible in the asynchronous setting. The primitive that allows parties to bypass this limitation is known as an oblivious common coin (OCC). It allows parties to agree with constant probability on a random coin, where agreement is oblivious, i.e., players are not aware whether or not agreement has been achieved. The starting point of our work is the observation that no known protocol exists for information-theoretic multi-valued OCC---i.e., OCC where the coin might take a value from a domain of cardinality larger than 2---with optimal resiliency in the asynchronous (with eventual message delivery) setting. This apparent hole in the literature is particularly problematic, as multi-valued OCC is implicitly or explicitly used in several constructions. (In fact, it is often falsely attributed to the asynchronous BA result by Canetti and Rabin [STOC ’93], which, however, only achieves binary OCC and does not translate to a multi-valued OCC protocol.) In this paper, we present the first information-theoretic multi-valued OCC protocol in the asynchronous setting with optimal resiliency, i.e., tolerating t < n/3 corruptions, thereby filling this important gap. Further, our protocol efficiently implements OCC with an exponential-size domain, a property which is not even achieved by known constructions in the simpler, synchronous setting. We then turn to the problem of round-preserving parallel composition of asynchronous BA. A protocol for this task was proposed by Ben-Or and El-Yaniv [Distributed Computing ’03]. Their construction, however, is flawed in several ways: For starters, it relies on multi-valued OCC instantiated by Canetti and Rabin's result (which, as mentioned above, only provides binary OCC). This shortcoming can be repaired by plugging in our above multi-valued OCC construction. However, as we show, even with this fix it remains unclear whether the protocol of Ben-Or and El-Yaniv achieves its goal of expected-constant-round parallel asynchronous BA, as the proof is incorrect. Thus, as a second contribution, we provide a simpler, more modular protocol for the above task. Finally, and as a contribution of independent interest, we provide proofs in Canetti's Universal Composability framework; this makes our work the first one offering composability guarantees, which are important as BA is a core building block of secure multi-party computation protocols.
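    The OCC abstraction itself is easy to state operationally. The toy Monte Carlo below simulates only the ideal multi-valued OCC behaviour described above: with some constant probability all parties output the same uniformly random domain element, and no party can tell which case occurred. It is a sanity check of the definition, in no way the information-theoretic protocol of the paper, and the agreement probability p_agree is an arbitrary placeholder.

```python
# Toy model of the *ideal* multi-valued oblivious common coin (OCC)
# functionality, not a protocol: with probability p_agree every party
# receives the same uniformly random coin; otherwise outputs may diverge.
import random

def ideal_occ(n_parties, domain_size, p_agree):
    """One invocation: return each party's coin output."""
    if random.random() < p_agree:
        coin = random.randrange(domain_size)      # everyone gets the same coin
        return [coin] * n_parties
    # Otherwise outputs may diverge (independent values stand in for whatever
    # an adversary could cause). Parties cannot tell which case occurred.
    return [random.randrange(domain_size) for _ in range(n_parties)]

trials = 100_000
agreed = sum(len(set(ideal_occ(7, 2**10, 0.25))) == 1 for _ in range(trials))
print(f"empirical agreement rate: {agreed / trials:.3f}")  # ~0.25 here
```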

    “Oh my god, how did I spend all that money?”: Lived experiences in two commodified fandom communities

    This research explores the role of commodification in participation in celebrity-centric fandom communities, applying a leisure studies framework to understand the constraints fans face in their quest to participate and the negotiations they engage in to overcome these constraints. In fan studies scholarship, there is a propensity to focus on the ways fans oppose commodified industry structures; however, this ignores the many fans who happily participate within them. Using the fandoms for the pop star Taylor Swift and the television series Supernatural as case studies, this project uses a mixed-methods approach to speak directly to fans via surveys and semi-structured interviews, developing an understanding of fans’ lived experiences based on their own words. By focusing on celebrity-centric fandom communities rather than on the more frequently studied textual fandoms, this thesis turns to the role of the celebrity in fans’ ongoing desire to participate in commodified spaces. I argue that fans are motivated to continue spending money to participate within their chosen fandom when this form of participation is tied to the opportunity for engagement with the celebrity. While many fans seek community from their fandom participation, this research finds that for others, social ties are a secondary outcome of their overall desire for celebrity attention, which becomes a hobby in which they build a “leisure career” (Stebbins 2014). When fans successfully gain attention from their celebrity object of fandom, they gain status within their community, creating intra-fandom hierarchies based largely on financial resources and on freedom from structural constraints related to education, employment, and caring. Ultimately, this thesis argues that the broad neglect of celebrity fandom practices means we have overlooked the experiences of many fans, necessitating a much broader future scope for the field.

    Towards an integrated vulnerability-based approach for evaluating, managing and mitigating earthquake risk in urban areas

    Doctoral thesis in Civil Engineering. Strong seismic events like those in Türkiye-Syria (2023) or Mexico (2017) should direct our attention to the design and implementation of proactive actions aimed at identifying vulnerable assets. This work proposes a suitable, easy-to-implement workflow for performing large-scale seismic vulnerability assessments in historic environments by means of digital tools. A vulnerability-oriented model based on parameters is adopted, given its affinity with the Mexican Catalogue of Historical Monuments. A first large-scale implementation of this method in the historical city of Atlixco (Puebla, Mexico) demonstrated its suitability and some limitations, which led to the development of a strategy for quantifying and accounting for the epistemic uncertainties encountered during data acquisition. Given the volume of data these analyses involve, it was necessary to develop robust data acquisition, storage and management strategies. The use of Geographical Information System environments, together with customised Python-based programs and cloud-based file distribution, made it possible to assemble urban-scale databases that facilitate field data acquisition, vulnerability and damage calculations, and the representation of outcomes. This development was the basis for a second large-scale assessment in selected municipalities of the state of Morelos (Mexico). The characterisation of the seismic vulnerability of more than 160 buildings permitted an assessment of the representativeness of the parametric vulnerability approach by comparing theoretical damage estimations against the damage observed after the 2017 Puebla-Morelos earthquakes. This comparison is the basis for a machine-learning-assisted process of calibration and adjustment, a feasible strategy for calibrating such vulnerability models using the empirical evidence of damage in post-seismic scenarios. This work was partly financed by FCT/MCTES through national funds (PIDDAC) under the R&D Unit Institute for Sustainability and Innovation in Structural Engineering (ISISE), reference UIDB/04029/2020. This research had financial support provided by the Portuguese Foundation of Science and Technology (FCT) through the Analysis and Mitigation of Risks in Infrastructures (InfraRisk) program under the PhD grant PD/BD/150385/2019.
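    As a rough illustration of the parameter-based model family the thesis adopts, the sketch below computes a normalised vulnerability index as a weighted sum of per-parameter class scores. The parameters, class scores and weights are placeholders in the spirit of GNDT/Vicente-type vulnerability index methods, not the calibrated values used in the thesis.

```python
# Hedged sketch of a parameter-based vulnerability index: each building is
# scored on a set of parameters, and the weighted sum gives a normalised
# index. All names, scores and weights below are illustrative placeholders.

# class scores per parameter (A = best seismic behaviour ... D = worst)
SCORES = {"A": 0, "B": 5, "C": 20, "D": 50}

# (parameter, weight) pairs -- illustrative only
PARAMETERS = [
    ("structural_system", 1.5),
    ("conservation_state", 1.0),
    ("plan_configuration", 0.5),
    ("roof_type", 0.75),
]

def vulnerability_index(classes):
    """Weighted sum of class scores, normalised to [0, 100]."""
    raw = sum(SCORES[classes[p]] * w for p, w in PARAMETERS)
    max_raw = sum(SCORES["D"] * w for _, w in PARAMETERS)
    return 100 * raw / max_raw

building = {"structural_system": "C", "conservation_state": "B",
            "plan_configuration": "D", "roof_type": "B"}
print(f"Iv = {vulnerability_index(building):.1f} / 100")  # Iv = 34.0 / 100
```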

    Designing adaptivity in educational games to improve learning

    The study of pedagogy has shown that students have different ways of learning and processing information. Students in a classroom learn best when taught by a teacher who is able to adapt or change the pedagogical model being used to better suit those students and/or the subject being taught. When considering other teaching mediums, such as computer-assisted learning systems or educational video games, research has also identified the benefits of adapting educational features to better teach players. However, effective methods for adaptation in educational video games are less well researched. This study addresses four points regarding adaptivity within educational games. Firstly, a framework for making any game adaptive was extracted from the literature. Secondly, an algorithm capable of monitoring, modelling and executing adaptations was developed and explained using the framework. Thirdly, the algorithm's effect on learning gains in players was evaluated using a customised version of Minecraft as the educational game and topics from critical thinking as the educational content. Lastly, a methodology for applying the algorithm to any educational game was detailed and evaluated.
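    The monitor/model/execute cycle such an algorithm follows can be sketched briefly. The skill-update rule, thresholds and adaptation actions below are invented placeholders showing the control flow, not the algorithm evaluated in the study.

```python
# Minimal sketch of a monitor/model/adapt cycle for an educational game.
# Thresholds and actions are hypothetical placeholders.

class AdaptivityEngine:
    def __init__(self, skill=0.5, rate=0.2):
        self.skill = skill      # learner model: estimated mastery in [0, 1]
        self.rate = rate        # how quickly the model tracks new evidence

    def monitor(self, correct):
        """Fold one observed player action into the learner model."""
        self.skill += self.rate * ((1.0 if correct else 0.0) - self.skill)

    def adapt(self):
        """Pick the next in-game adaptation from the current model."""
        if self.skill < 0.3:
            return "offer hint and easier challenge"
        if self.skill > 0.8:
            return "raise challenge difficulty"
        return "keep current difficulty"

engine = AdaptivityEngine()
for outcome in [False, False, True, True, True]:
    engine.monitor(outcome)
    print(f"skill={engine.skill:.2f} -> {engine.adapt()}")
```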

    Sampling-Based Exploration Strategies for Mobile Robot Autonomy

    A novel sampling-based exploration strategy is introduced for Unmanned Ground Vehicles (UGVs) to efficiently map large GPS-deprived underground environments. It performs on a level similar to state-of-the-art approaches, while not being designed for a specific robot or sensor configuration as the other approaches are. The introduced exploration strategy, called Random-Sampling-Based Next-Best View Exploration (RNE), uses a Rapidly-exploring Random Graph (RRG) to find possible view points in an area around the robot. These candidates are evaluated with computation-efficient Sparse Ray Polling (SRP) in a voxel grid to find the next-best view for the exploration. Each node in the exploration graph built with the RRG is evaluated regarding the ability of the UGV to traverse it, which is derived from an occupancy grid map. The map is also used to create a topology-based graph in which nodes are placed centrally, to reduce the risk of collisions and increase the amount of observable space. Nodes that fall outside the local exploration area are stored in a global graph and are connected with a Traveling Salesman Problem solver so they can be explored later.
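    A stripped-down 2D version of the next-best-view selection conveys the idea: sample candidate view points around the robot, score each by sparsely polling rays in an occupancy grid and counting the unknown cells it would reveal, and pick the highest scorer. The grid layout, ray count, sensor range and sampling radius below are invented for the example, and traversability checks are omitted.

```python
# Simplified 2D next-best-view selection: candidates sampled around the robot
# are scored by sparse ray polling in an occupancy grid. Toy values throughout.
import math, random

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def poll_rays(grid, x, y, n_rays=16, max_range=10):
    """Sparse ray polling: count unknown cells visible from (x, y)."""
    seen = 0
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        for step in range(1, max_range + 1):
            cx = int(round(x + step * math.cos(angle)))
            cy = int(round(y + step * math.sin(angle)))
            if not (0 <= cx < len(grid) and 0 <= cy < len(grid[0])):
                break                      # ray left the map
            if grid[cx][cy] == OCCUPIED:
                break                      # ray blocked by an obstacle
            if grid[cx][cy] == UNKNOWN:
                seen += 1                  # this view would reveal the cell
    return seen

def next_best_view(grid, robot, n_samples=50, radius=8):
    """Sample candidate view points near the robot; return the best scorer."""
    best, best_gain = robot, -1
    for _ in range(n_samples):
        x = robot[0] + random.uniform(-radius, radius)
        y = robot[1] + random.uniform(-radius, radius)
        gain = poll_rays(grid, x, y)
        if gain > best_gain:
            best, best_gain = (x, y), gain
    return best, best_gain

grid = [[UNKNOWN] * 30 for _ in range(30)]
for i in range(30):
    grid[i][15] = FREE                     # a known free corridor
goal, gain = next_best_view(grid, robot=(15.0, 15.0))
print(f"next-best view {goal} reveals ~{gain} unknown cells")
```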

    Security and Privacy of Resource Constrained Devices

    The thesis aims to present a comprehensive and holistic overview of cybersecurity and privacy and data protection aspects related to IoT resource-constrained devices. Chapter 1 introduces the current technical landscape by providing a working definition and architecture taxonomy of ‘Internet of Things’ and ‘resource-constrained devices’, coupled with a threat landscape in which each specific attack is linked to a layer of the taxonomy. Chapter 2 lays down the theoretical foundations for an interdisciplinary approach and a unified, holistic vision of cybersecurity, safety and privacy, justified by the ‘IoT revolution’, through the so-called infraethical perspective. Chapter 3 investigates whether and to what extent the fast-evolving European cybersecurity regulatory framework addresses the security challenges brought about by the IoT by allocating legal responsibilities to the right parties. Chapters 4 and 5 focus, on the other hand, on ‘privacy’, understood by proxy to include EU data protection. In particular, Chapter 4 addresses three legal challenges that ubiquitous IoT data and metadata processing poses to the EU privacy and data protection legal frameworks, i.e., the ePrivacy Directive and the GDPR. Chapter 5 casts light on the risk management tool enshrined in EU data protection law, that is, the Data Protection Impact Assessment (DPIA), and proposes an original DPIA methodology for connected devices, building on the CNIL (French data protection authority) model.