University of Waterloo

University of Waterloo's Institutional Repository
    18,714 research outputs found

    Optimizing Differential Computation for Large-Scale Graph Processing

    Full text link
    Diverse applications spanning areas such as fraud detection, risk assessment, recommendations, and telecommunications process datasets characterized by entities and their relationships. Graphs naturally emerge as the most intuitive abstraction for modeling these datasets. Many practical applications seek the ability to share computations across multiple snapshots of evolving graphs to efficiently perform analyses, such as evaluating changing road conditions in transportation networks or performing contingency analysis on infrastructure networks. The research in this thesis is motivated by the challenge of efficiently supporting such applications on large datasets. Differential computation (DC) has emerged as a powerful general technique for incrementally maintaining computations over evolving datasets, even those containing arbitrarily nested loops. It is thus a promising technique that can be used to build the kinds of applications that motivate this thesis. We present a study of DC that explores how it can be used to build practical data systems. In particular, this thesis addresses two challenges that impede the adoption of DC: (i) the lack of high-level interfaces that can be used to develop graph-specific applications; and (ii) scalability challenges that arise due to the general maintenance technique used by DC, making it less efficient for application-specific workloads. The main contribution of this thesis is to show how DC can be made more practical for graph processing systems. To address the lack of high-level interfaces, we built GraphSurge, a system that can be used to create and analyze multiple views over static graphs using a declarative programming interface. When users perform graph computations on a collection of views, GraphSurge internally uses DC to share the computation across all views. We develop several optimizations that improve the scalability of DC. 
Within GraphSurge, we identify two optimization problems, which we call collection ordering and collection splitting, and present algorithms to solve them. These optimizations improve the runtime of GraphSurge applications by up to an order of magnitude. In the reference implementation of DC, we identify two design bottlenecks in how data is indexed and processed within operators. To address these bottlenecks, we implement a new index and an optimization called Fast Empty Difference Verification, which together improve the runtime of graph processing workloads by up to 14x. Our work was informed by insights from a non-technical user survey we conducted to understand how graphs are used in practice.
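The core idea of differential computation can be sketched in a few lines: every collection is represented as (record, multiplicity) differences, and operators consume and emit only differences rather than whole snapshots. The loop-free toy below (hypothetical code, not the GraphSurge or reference DC implementation, which also handles arbitrarily nested loops via multidimensional timestamps) maintains out-degree counts across evolving edge batches:

```python
from collections import defaultdict

def apply_diffs(degrees, edge_diffs):
    """Maintain out-degree counts from a batch of edge differences:
    ((src, dst), +1) for an insertion, ((src, dst), -1) for a deletion.
    Only changed vertices are touched, and the change to the derived
    collection is itself emitted as differences, so a downstream
    operator could be maintained the same way."""
    out_diffs = []
    for (src, dst), mult in edge_diffs:
        before = degrees[src]
        degrees[src] += mult
        if before:
            out_diffs.append(((src, before), -1))        # retract old count
        if degrees[src]:
            out_diffs.append(((src, degrees[src]), +1))  # assert new count
    return out_diffs

degrees = defaultdict(int)
# Initial snapshot: three edges, processed as pure insertions.
apply_diffs(degrees, [((1, 2), +1), ((1, 3), +1), ((2, 3), +1)])
# Next snapshot: one deletion and one insertion; only these two
# differences are processed, not the whole graph.
apply_diffs(degrees, [((1, 3), -1), ((3, 1), +1)])
```

In a real DC dataflow, downstream operators would consume the emitted differences in the same way, which is what allows one computation to be shared across many views or snapshots.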

    Variation in powder size distribution for electron beam melting of Ti48Al2Cr2Nb - density and surface roughness effects

    No full text
    Additive manufacturing (AM), also known as 3D printing, is an innovative method for producing complex geometries with minimal material waste. The aerospace, medical, and battery fields are a few of the sectors that employ AM to drive innovation in materials design, structural design, and structural/mechanical property development. The aerospace industry is one such sector, continuously seeking ways to reduce energy/fuel consumption by shifting towards lightweight, high-temperature-resistant materials. Ti-48Al-2Cr-2Nb (γ-TiAl) is a popular choice due to its desirable weight-to-density ratio and superior high-temperature mechanical properties. However, traditional manufacturing methods for the Ti-48Al-2Cr-2Nb alloy are expensive. AM, specifically electron beam melting (EBM), is a cost-effective alternative that offers the potential for optimal ductility and fracture toughness for γ-TiAl. To promote economic sustainability, this study explores varying the particle size distribution (PSD) in the EBM process. The research focuses on the Ti-48Al-2Cr-2Nb alloy for PSDs of 45-150 µm, 38-150 µm, and 38-180 µm, with particular attention to the potential of the 38-180 µm PSD as a more cost-effective, reliable material. By examining density and porosity morphology, as well as surface roughness, the study investigates how to maintain good density and surface quality irrespective of PSD, as the mechanical properties of the printed materials depend on the density, pore formation, and surface quality obtained during the AM process. To mitigate pore defects and improve surface quality, the process parameters involved in AM must be tuned to ensure the target density, porosity type, and surface roughness of the final product. In this work, gamma titanium aluminide (Ti-48Al-2Cr-2Nb) was deployed in an EBM powder bed fusion process.
The focus was to analyze and classify the printed part quality in terms of surface roughness (SA, STR, and SPC), relative density, lack of fusion defects, and gas pore defects to establish trends across the parameter space. This thesis follows various sets of experiments that detail process parameter combinations aimed at producing a wide range of part quality results, thereby gathering enough unique data to conduct a full analysis of the behavior of the EBM-printed parts. The experimental study involves data analysis and data visualization to identify the relationships between the process inputs (parameters and engineered features such as normalized enthalpy and volumetric energy density), the resulting density and pore type (gas pores and lack of fusion pores), and the surface roughness parameters (spatial, feature, and height parameters). The EBM process can suffer from low density and poor surface quality due to the applied energy and powder characteristics. More specifically, this thesis examines the relationship between PSD and porosity by identifying and investigating lack of fusion (LOF) and gas porosity types within threshold levels of 'Excellent,' 'Good,' 'Poor,' and 'Failed,' with pre-defined ranges for each class. Additionally, the relationship between PSD and surface roughness (SA, STR, and SPC) is also examined, utilizing a full factorial design of experiments (DOE) to create three sub-experimental hypotheses that explore the prominent surface quality influencers (layer thickness and a higher-energy electron beam). Based on this experimental study, it was found that bulk density spanned 88% - 99.99%, with gas porosity spanning 0.01% - 0.3% and lack of fusion porosity spanning 0.001% - 12%.
Additionally, for surface roughness, SA spans 39.41 µm – 109.74 µm, STR spans 0.0984 – 0.8126, and SPC spans 2344.2 1/µm – 4084.31 1/µm across the reference PSD of 45-150 µm and the widest PSD in this study, 38-180 µm. Based on the resulting outcomes and the literature, data distributions and statistical analysis were employed to classify and quantitatively define the results obtained from the experimental study. The results demonstrate the relationships between the process parameters, density, porosity type, and surface roughness. The study contributes to a comprehensive understanding of the interplay between PSD and porosity as crucial factors in the EBM AM process.
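Volumetric energy density, one of the engineered features mentioned above, is conventionally computed from beam power, scan speed, hatch spacing, and layer thickness. A minimal sketch with hypothetical parameter values (not the settings used in this thesis):

```python
def volumetric_energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    """Conventional volumetric energy density (J/mm^3) used as an
    engineered feature in powder bed fusion studies:
    VED = P / (v * h * t)."""
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

# Illustrative (hypothetical) EBM parameter set: 600 W beam,
# 2000 mm/s scan speed, 0.1 mm line offset, 0.09 mm layer thickness.
ved = volumetric_energy_density(power_w=600.0, speed_mm_s=2000.0,
                                hatch_mm=0.1, layer_mm=0.09)
```

Sweeping such features across the parameter space is what lets trends in density and porosity be compared on a common energy scale.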

    Transtextual Narratives about the European Union. A qualitative study of structuring, sense-making constructions of reality from a linguistic perspective

    Full text link
    Narratives are never finished products but constantly evolving linguistic constructs. Actors use them to position themselves and, in the process, develop the narratives further. In this respect, narratives are to be understood as specifically interpreted constructions of reality that lie between the level of text and discourse. The aim of this paper is to analyze such narratives about the European Union in three political speeches by the European Commission presidents Jean-Claude Juncker and Ursula von der Leyen. To this end, the narratives are methodically analyzed using the narrative constituents 'chronotope', 'actors', 'plot development/emplotment', and 'attitude'. These narrative constituents are in turn divided into sub-constituents that can be analyzed intratextually. The analysis of transtextual narratives offers the possibility of examining constructed sequences of actions that are contextualized in an actor-specific way. In the argumentation of policies or political goals, for example, legitimization strategies can be traced. The constitution of functional actor constellations, as well as of spatio-temporally distinct contexts, can justify projects and motivate actions. The qualitative-explorative approach of this paper is intended to promote a better understanding of transtextual narratives and thus of a form of discursively constructed realities.

    Analysis of the Economic and Carbon Emission Reduction Potential of Fuel Cell Electric Vehicle-to-Grid in Alberta and Ontario

    Full text link
    Connecting battery electric vehicles (BEVs) to the grid is a way of utilizing the existing BEV fleet to cut energy storage costs and provide monetary incentives to vehicle owners. By coordinating the charging and discharging of the growing BEV fleet, the grid load can be shifted. Meanwhile, fuel cell electric vehicles (FCEVs) are gaining popularity, especially in the heavy-duty vehicle market, because of the advantages of hydrogen over batteries, such as higher gravimetric energy density and faster refueling times. Similarly, the FCEV fleet can also be connected to the grid (FCEV2G), becoming mobile generators that produce electricity from hydrogen and supply it to the grid. The hydrogen used can be produced locally with cheap, excess electricity, or at a centralized production site at lower cost. A profit could be made from the high electricity prices during peak hours and shared among FCEV owners and the FCEV2G coordinator. This study analyzes an FCEV2G station that can connect a few FCEVs to the grid to generate electricity. The operation, including local hydrogen production and storage, hydrogen purchased from a centralized plant, and FCEV2G schedules, is modeled as a mixed integer linear programming problem. Using historical electricity price and generation mix data for Alberta and Ontario in 2019 and 2022, the profits of this FCEV2G station with different configurations are optimized and compared. Parameters including component efficiency and the onsite electrolyzer are studied to investigate their impact on the optimization results. The carbon emission reduction potential of FCEV2G is also evaluated. The results for Alberta show that an annual net revenue as high as 66k USD could have been made in 2022 via FCEV2G, as the high and volatile electricity prices amplify the load-balancing function of FCEV2G.
In addition, 185 t of CO2 emissions could have been avoided by using clean hydrogen to generate electricity for the carbon-intensive Alberta grid. However, under the base case assumptions, such an FCEV2G station could not make a profit in 2019 in Alberta because of the efficiency losses of the electrolyzer and fuel cells, as well as the relatively stable electricity price. This means that high and volatile electricity prices throughout the year are the key factor in making FCEV2G profitable. Ontario, on the other hand, has an abundant nuclear and hydro power supply and hence maintains a stable electricity price profile. A parametric study is conducted to examine how profitability will depend on future technological improvements; using the 2022 data, it finds that the FCEV2G station becomes profitable once the market hydrogen cost divided by the fuel cell efficiency falls below 86 USD/MWh. Meanwhile, the carbon intensity of electricity varies widely in Ontario because natural gas is primarily used to meet peak demand. This allows an FCEV2G pathway to reduce carbon emissions during peak hours, and the results show that as much as 213 t of CO2 emissions could be reduced in the 2022 base case.
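The breakeven condition above (market hydrogen cost divided by fuel cell efficiency versus the electricity price) can be sketched as a simplified hourly dispatch rule. The thesis solves a full mixed integer linear program with storage and an electrolyzer; this toy, with hypothetical prices, only illustrates when discharging is profitable:

```python
def fcev2g_revenue(prices_usd_mwh, h2_cost_usd_mwh, fc_eff, power_mw=0.1):
    """Simplified hourly dispatch: discharge to the grid only in hours
    where the electricity price exceeds the marginal cost of hydrogen
    per MWh of electricity delivered (h2 cost / fuel-cell efficiency).
    A greedy illustration of the breakeven condition, not the MILP."""
    marginal_cost = h2_cost_usd_mwh / fc_eff
    revenue = 0.0
    for price in prices_usd_mwh:
        if price > marginal_cost:
            # Net profit for one hour of discharge at power_mw.
            revenue += (price - marginal_cost) * power_mw
    return revenue

# Hypothetical hourly prices (USD/MWh); with hydrogen at 50 USD/MWh and
# 55% fuel cell efficiency, the breakeven price is 50/0.55 ~= 90.9 USD/MWh.
prices = [40.0, 85.0, 120.0, 300.0, 60.0]
rev = fcev2g_revenue(prices, h2_cost_usd_mwh=50.0, fc_eff=0.55)
```

Under a flat price profile (as in Ontario, or Alberta in 2019), no hour clears the breakeven price and the rule yields zero revenue, matching the qualitative finding above.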

    ProofFrog: A Tool For Verifying Game-Hopping Proofs

    Full text link
    Cryptographic proofs allow researchers to provide theoretical guarantees on the security that their constructions provide. A proof of security can completely eliminate a class of attacks by potential adversaries. Human fallibility, however, means that even a proof reviewed by experts may still hide flaws or outright errors. Proof assistants are software tools built for the purpose of formally verifying each step in a proof and, as such, have the potential to prevent erroneous proofs from being published and insecure constructions from being implemented. Unfortunately, existing tooling for verifying cryptographic proofs has found limited adoption in the cryptographic community, in part due to concerns with ease of use. This thesis presents ProofFrog: a new tool for verifying cryptographic game-hopping proofs. ProofFrog is designed with the average cryptographer in mind, using an imperative syntax for specifying games and a syntax for proofs that closely models pen-and-paper arguments. As opposed to other proof assistant tools, which largely operate by manipulating logical formulae, ProofFrog manipulates abstract syntax trees (ASTs) into a canonical form to establish indistinguishable or equivalent behaviour for pairs of games in a user-provided sequence. We detail the domain-specific language developed for use with the ProofFrog proof engine, as well as present a sequence of worked examples that demonstrate ProofFrog’s capacity for verifying proofs and the exact transformations it applies to canonicalize ASTs. A tool like ProofFrog that prioritizes ease of use can lower the barrier of entry to using computer-verified proofs and aid in catching insecure constructions before they are made public.
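The canonicalize-then-compare strategy can be illustrated with a toy AST transformation in Python (hypothetical code; ProofFrog's own rewrite rules for cryptographic games are far richer): renaming variables to a canonical order makes alpha-equivalent programs compare equal.

```python
import ast

class Canonicalizer(ast.NodeTransformer):
    """Rename variables to a canonical form (v0, v1, ...) in order of
    first occurrence, so alpha-equivalent programs yield identical ASTs."""
    def __init__(self):
        self.names = {}

    def visit_Name(self, node):
        if node.id not in self.names:
            self.names[node.id] = f"v{len(self.names)}"
        return ast.copy_location(
            ast.Name(id=self.names[node.id], ctx=node.ctx), node)

def canonical(src):
    """Parse source, canonicalize variable names, and serialize the AST."""
    return ast.dump(Canonicalizer().visit(ast.parse(src)))

# Two "games" that differ only in variable naming are judged equivalent;
# games that differ in behaviour are not.
assert canonical("x = k; y = x + m") == canonical("a = b; c = a + d")
assert canonical("x = 1") != canonical("x = 2")
```

A real verifier must of course establish semantic (not merely syntactic) equivalence, which is why ProofFrog applies a battery of behaviour-preserving transformations before comparing canonical forms.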

    Structurally Enhanced Electrodes for Redox Flow Batteries Produced via Electrospinning

    Full text link
    The vanadium redox flow battery is one of the most promising secondary batteries for energy storage systems due to its design flexibility, attributed to the large, adjustable capacity of the storage tanks filled with electrolyte solution. However, the vanadium redox flow battery is not yet widely deployed owing to its low power density. This thesis describes the construction of a fibrous electrode with a novel structure to overcome this flaw. General electrospun polyacrylonitrile materials were synthesized with substantially lower porosity than standard materials by applying compression during the stabilization stage. The objective was to create flow battery electrodes with higher volumetric surface area. The flexibility of the electrospinning technique, combined with adjustable post-processing steps such as stabilization and carbonization, allowed for the creation of layers with very specific structural and transport properties. In-plane permeability was found to remain relatively constant compared to the original uncompressed fibrous structure. On the other hand, the fibers compacted into a flat ribbon shape hurt the through-plane permeability, so artificial holes were created using a CO2 laser to perforate the structure. The loss of specific surface area caused by laser perforation was negligible, and the perforated electrode still showed improvement. Overall, the novel flow-through electrode produced in this study successfully improved the transport properties as well as the electrochemical reaction rate, leading to optimal power density for a vanadium redox flow battery. In addition, a 2-dimensional half-cell model was created with multi-physics simulation to predict the change in performance with respect to the structural properties of the fibrous electrode. The performance was evaluated based on polarization behavior, the pumping power required to operate the cell, and operating efficiency.
Moreover, the electrode was constructed as a multi-layered structure with graded profiles of permeability, fiber size, and porosity. Vanadium ions could then be distributed uniformly over the entire electrode region, enabling a larger portion of the fiber surface to be utilized for reaction and improving power density while maintaining low pumping power. Based on the model's predictions, experimental work was conducted on a multi-layered structure built from the novel electrospun fibrous layers. Two different flow channel designs were considered: interdigitated and parallel. The convective flow induced by the interdigitated flow channel design distributed the vanadium ions effectively throughout the electrode, resulting in higher power density. The multi-layered electrode provided higher net power density despite the increased pumping power requirement compared to the single-layer case. The body of work presented in this thesis contributes significantly to understanding the mass transport phenomena taking place in electrodes built with novel fibrous structures. It highlights the preparation of this media through electrospinning, as well as numerical and experimental methods for characterizing and understanding these processes. All the work presented here promotes the development of flow batteries through a better understanding of the flow battery electrode.
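The trade-off between electrode permeability and pumping power described above follows Darcy's law. A minimal sketch, using hypothetical electrode dimensions and flow rates rather than values from this work:

```python
def pumping_power_w(flow_m3_s, visc_pa_s, length_m, area_m2, perm_m2,
                    pump_eff=0.9):
    """Darcy's-law sketch of the permeability / pumping-power trade-off:
    pressure drop dP = mu * u * L / k, with superficial velocity u = Q / A;
    pumping power = Q * dP / pump efficiency."""
    u = flow_m3_s / area_m2          # superficial velocity (m/s)
    dp = visc_pa_s * u * length_m / perm_m2  # Darcy pressure drop (Pa)
    return flow_m3_s * dp / pump_eff

# Hypothetical electrode: halving the permeability doubles pumping power,
# which is why through-plane permeability losses had to be recovered
# (here, conceptually, by laser perforation).
p1 = pumping_power_w(1e-6, 1e-3, 0.002, 0.0025, 1e-11)
p2 = pumping_power_w(1e-6, 1e-3, 0.002, 0.0025, 5e-12)
```

This inverse dependence on permeability is what makes net power density (cell output minus pumping power) the right figure of merit for comparing electrode structures.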

    Not All Pull Request Rejections Are The Same

    Full text link
    In the Open Source Software (OSS) development landscape, evaluating pull requests extends beyond code quality assessment. Recent research has revealed the significant influence of social dynamics and perceptions on pull request evaluations, a notion our study seeks to expand upon. By examining the intricate reasons behind pull request rejections, we aim to move beyond the traditional view of rejections as a monolithic category. Utilizing a dataset comprising 52,829 pull requests across 3,931 projects, we conduct a large-scale, comprehensive analysis identifying twelve distinct categories of rejection reasons. Our findings underscore that although social ties and technical abilities are factors that influence pull request decisions, they are not consistent across all rejection reasons. Notably, certain characteristics, such as extensive line changes and team size, exhibit varied impacts on different types of rejections, indicating the complex interplay between social and technical factors in pull request assessments. This study provides a multifaceted understanding of the OSS contribution evaluation process, highlighting the complexity and diversity of rejection reasons. By describing the specific features that influence distinct types of rejections, we contribute to the development of more nuanced strategies for managing contributions. Our findings offer valuable insights for both contributors and project maintainers, emphasizing the need for a tailored approach to understanding and enhancing the pull request evaluation process in OSS projects.

    Advancements in Road Lane Mapping: Comparative Analysis of Deep Learning-based Semantic Segmentation Methods Using Aerial Imagery

    Full text link
    The rapid advancement of autonomous vehicles (AVs) underscores the necessity for high-definition (HD) maps, with road lane information being crucial for their navigation. The widespread use of Earth observation data, including aerial imagery, provides invaluable resources for constructing these maps. However, to fully exploit the potential of aerial imagery for HD road map creation, it is essential to leverage the capabilities of artificial intelligence (AI) and deep learning technologies. Meanwhile, the domain of remote sensing has not yet fully explored the development of specialized models for road lane extraction, an area where the field of computer vision has made significant progress with the introduction of advanced semantic segmentation models. This research undertakes a comprehensive comparative analysis of twelve deep learning-based semantic segmentation models, specifically to assess their performance in road lane marking extraction, with a special emphasis on a novel dataset characterized by partially labeled instances. This investigation aims to examine the models' performance when applied to scenarios with minimal labeled data, examining their efficiency, accuracy, and ability to adapt under conditions of limited annotation and transfer learning. The outcome of this study highlights the distinct advantage of Transformer-based models over their Convolutional Neural Network (CNN) counterparts in the context of extracting road lanes from aerial imagery. Remarkably, state-of-the-art models such as the Segmenting Transformer (SegFormer), Shifted Window (Swin) Transformer, and Twins Scaled Vision Transformer (Twins-SVT) exhibit superior performance. The empirical results on the Waterloo Urban Scene dataset mark substantial progress, with mean Intersection over Union (IoU) scores ranging from 33.56% to 76.11%, precision from 64.33% to 77.44%, recall from 66.0% to 98.96%, and F1 scores from 44.34% to 85.35%.
These findings underscore the benefits of model pretraining and the distinctive attributes of the dataset in strengthening the effectiveness of models for HD road map development, opening new possibilities in the advancement of autonomous vehicle navigation systems.
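The reported metrics (IoU, precision, recall, F1) are standard binary segmentation measures. A generic pure-Python sketch on toy flat masks, not the evaluation code used in the study:

```python
def seg_metrics(pred, truth):
    """Binary segmentation metrics over flat 0/1 masks: IoU, precision,
    recall, and F1, computed from true/false positives and false negatives."""
    tp = sum(1 for p, t in zip(pred, truth) if p and t)
    fp = sum(1 for p, t in zip(pred, truth) if p and not t)
    fn = sum(1 for p, t in zip(pred, truth) if t and not p)
    iou = tp / (tp + fp + fn)
    prec = tp / (tp + fp)
    rec = tp / (tp + fn)
    f1 = 2 * prec * rec / (prec + rec)
    return iou, prec, rec, f1

# Toy 6-pixel masks: 2 true positives, 1 false positive, 1 false negative.
pred  = [1, 1, 0, 0, 1, 0]
truth = [1, 0, 0, 1, 1, 0]
iou, prec, rec, f1 = seg_metrics(pred, truth)
```

In practice these are computed per class over full images and averaged, giving the mean IoU figures quoted above.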

    Explaining Expert Search and Team Formation Systems with ExES

    Full text link
    Expert search and team formation systems operate on collaboration networks, with nodes representing individuals, labeled with their skills, and edges denoting collaboration relationships. Given a query corresponding to a set of desired skills, these systems identify experts or teams that best match the query. However, state-of-the-art solutions to this problem lack transparency and interpretability. To address this issue, we propose ExES, an interactive tool designed to explain black-box expert search systems. Our system leverages saliency and counterfactual methods from the field of explainable artificial intelligence (XAI). ExES enables users to understand why individuals were or were not included in the query results, and what individuals could do, in terms of perturbing skills or connections, to be included in or excluded from the results. Based on several experiments using real-world datasets, we verify the quality and efficiency of our explanation generation methods. We demonstrate that ExES takes a significant step toward interactivity, achieving an average latency reduction of 50% compared to an exhaustive approach while maintaining over 82% precision in producing saliency explanations and over 70% precision in identifying optimal counterfactual explanations.
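The counterfactual idea can be sketched as a search over single-skill perturbations against a black-box scorer. The names, scorer, and data below are hypothetical stand-ins, not ExES's actual methods:

```python
def topk(people, query, k):
    """Rank individuals by skill overlap with the query (a stand-in for
    the black-box expert scorer) and return the top-k names."""
    scored = sorted(people, key=lambda p: -len(set(p["skills"]) & query))
    return [p["name"] for p in scored[:k]]

def counterfactual_skill(people, query, k, name):
    """Find a single skill whose addition would put `name` into the
    top-k: try each missing query skill and re-run the black-box search."""
    target = next(p for p in people if p["name"] == name)
    for skill in query - set(target["skills"]):
        target["skills"].append(skill)
        included = name in topk(people, query, k)
        target["skills"].remove(skill)  # restore the original profile
        if included:
            return skill
    return None

# Hypothetical collaboration network (skills only, no edges).
people = [{"name": "ana", "skills": ["ml", "db"]},
          {"name": "bo",  "skills": ["ml"]},
          {"name": "cy",  "skills": ["db", "viz"]}]
cf = counterfactual_skill(people, {"ml", "db", "viz"}, 2, "bo")
```

An exhaustive version of this search over all skill and connection perturbations is what ExES prunes to achieve its reported latency reduction.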

    Evolution of the Laurentide Ice Sheet in north-central Ontario from subglacial sediments

    No full text
    Till stratigraphic analysis of 10 sediment bluffs near Ogoki Post in the Hudson and James Bay Lowlands.

    17,696 full texts
    18,834 metadata records
    Updated in the last 30 days.
    University of Waterloo's Institutional Repository is based in Canada.