7,529 research outputs found

    Optimising water quality outcomes for complex water resource systems and water grids

    As the world progresses, water resources are likely to be subjected to much greater pressures than in the past. Although the principal water problem revolves around inadequate and uncertain water supplies, water quality management plays an equally important role. The availability of good-quality water is paramount to the sustainability of the human population as well as the environment. Achieving water quality and quantity objectives can be conflicting and becomes more complicated with challenges such as climate change, growing populations and changed land uses. Maintaining adequate water quality in a reservoir is complicated by multiple inflows with different water quality levels, often resulting in poor stored water quality. Hence, it is fundamental to approach this issue in a more systematic, comprehensive, and coordinated fashion. Most previous studies of water resources management focused on water quantity and considered water quality separately. This research study, however, considered water quantity and quality objectives simultaneously in a single model to explore and understand the relationship between them in a reservoir system. A case study area with water quantity and quality challenges was identified in Western Victoria, Australia: Taylors Lake in the Grampians System receives water from multiple sources of differing quality and quantity and exhibits the problems described above. A combined simulation and optimisation approach was adopted to carry out the analysis, and a multi-objective optimisation approach was applied to achieve optimal water availability and quality in the storage. The multi-objective optimisation model included three objective functions: water volume and two water quality parameters, salinity and turbidity. Results showed the competing nature of the water quantity and quality objectives and established the trade-offs.
It further showed that a range of optimal solutions can be generated to manage those trade-offs effectively. The trade-off analysis showed that selective harvesting of inflows is effective in improving water quality in storage; however, under strict water quality restrictions there is a considerable loss in water volume. The robustness of the optimisation approach used in this study was confirmed through sensitivity and uncertainty analyses. The research also incorporated various spatio-temporal scenario analyses to systematically articulate long-term and short-term operational planning strategies, and established operational decisions around possible harvesting regimes that achieve optimal water quantity and quality while meeting all water demands. The climate change analysis revealed that optimal management of water quantity and quality in storage becomes extremely challenging under future climate projections. The projected reduction in storage volume will lead to several challenges, such as water supply shortfalls and an inability to undertake selective harvesting due to reduced water quality levels; in this context, selective harvesting of inflows based on water quality will no longer be an option for managing water quantity and quality optimally in storage. Significant conclusions of this research include the establishment of trade-offs between water quality and quantity objectives particular to this configuration of water supply system. The work demonstrated that selective harvesting of inflows will improve stored water quality, and this finding, along with the approach used, is a significant contribution for decision makers working within the water sector. The simulation-optimisation approach is very effective in providing a range of optimal solutions, which can be used to make more informed decisions around achieving optimal water quality and quantity in storage.
It was further demonstrated that there is a range of planning periods, both long-term (>10 years) and short-term (<1 year), each of which offers distinct advantages and provides useful insights, making this an additional key contribution of the work. Importantly, climate change was also considered, and it was found that diminishing water resources, particular to this geographic location, make it increasingly difficult to optimise both quality and quantity in storage, providing further useful insight from this work. Doctor of Philosophy
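    The volume-versus-quality trade-off described above can be illustrated with a minimal Pareto-filtering sketch. The inflow sources, volumes and quality values below are invented for illustration and are not data from the study; the thesis's simulation-optimisation model is far richer than this brute-force enumeration of harvest decisions.

```python
import itertools

# Hypothetical inflow sources as (volume in ML, salinity, turbidity).
# These numbers are invented placeholders, not data from the study.
SOURCES = [(100, 300, 5), (80, 1200, 40), (60, 600, 15)]

def evaluate(decision):
    """Blend the harvested inflows; return (volume, salinity, turbidity)."""
    taken = [s for s, take in zip(SOURCES, decision) if take]
    vol = sum(s[0] for s in taken)
    if vol == 0:
        return (0, 0.0, 0.0)
    sal = sum(s[0] * s[1] for s in taken) / vol
    tur = sum(s[0] * s[2] for s in taken) / vol
    return (vol, sal, tur)

def dominates(a, b):
    """True if a is no worse than b in every objective and better in one.
    Volume is maximised; salinity and turbidity are minimised."""
    no_worse = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
    better = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
    return no_worse and better

# Enumerate every harvest decision and keep the non-dominated ones.
candidates = [evaluate(d)
              for d in itertools.product([0, 1], repeat=len(SOURCES))]
pareto = [c for c in candidates
          if not any(dominates(o, c) for o in candidates)]
```

    In this toy instance, harvesting only the cleanest source sits on the Pareto front alongside the full-harvest option, mirroring the competing-objectives result the thesis reports: more stored volume comes at the cost of worse salinity and turbidity.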

    Resilience and food security in a food systems context

    This open access book compiles a series of chapters written by internationally recognized experts known for their in-depth but critical views on questions of resilience and food security. The book assesses rigorously and critically the contribution of the concept of resilience in advancing our understanding and ability to design and implement development interventions in relation to food security and humanitarian crises. For this, the book departs from the narrow beaten tracks of agriculture and trade, which have influenced the mainstream debate on food security for nearly 60 years, and adopts instead a wider, more holistic perspective, framed around food systems. The foundation for this new approach is the recognition that in the current post-globalization era, the food and nutritional security of the world’s population no longer depends just on the performance of agriculture and policies on trade, but rather on the capacity of the entire (food) system to produce, process, transport and distribute safe, affordable and nutritious food for all, in ways that remain environmentally sustainable. In that context, adopting a food system perspective provides a more appropriate frame, as it invites us to broaden conventional thinking and to acknowledge the systemic nature of the different processes and actors involved. This book is written for a large audience, from academics to policymakers, and from students to practitioners.

    “Oh my god, how did I spend all that money?”: Lived experiences in two commodified fandom communities

    This research explores the role of commodification in participation in celebrity-centric fandom communities, applying a leisure studies framework to understand the constraints fans face in their quest to participate and the negotiations they engage in to overcome these constraints. In fan studies scholarship, there is a propensity to focus on the ways fans oppose commodified industry structures; however, this ignores the many fans who happily participate within them. Using the fandoms for the pop star Taylor Swift and the television series Supernatural as case studies, this project uses a mixed-methodological approach to speak directly to fans via surveys and semi-structured interviews to develop an understanding of fans’ lived experiences based on their own words. By focusing on celebrity-centric fandom communities rather than on the more frequently studied textual fandoms, this thesis turns to the role of the celebrity in fans’ ongoing desire to participate in commodified spaces. I argue that fans are motivated to continue spending money to participate within their chosen fandom when this form of participation is tied to the opportunity for engagement with the celebrity. While many fans seek community from their fandom participation, this research finds that for others, social ties are a secondary outcome of their overall desire for celebrity attention, which becomes a hobby in which they build a “leisure career” (Stebbins 2014). When fans successfully gain attention from their celebrity object of fandom, they gain status within their community, creating intra-fandom hierarchies based largely on financial resources and on freedom from structural constraints related to education, employment, and caring responsibilities. Ultimately, this thesis argues that the broad neglect of celebrity fandom practices means we have overlooked the experiences of many fans, necessitating a much broader future scope for the field.

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting, and could benefit complex sectors that have only scarce data with which to predict business viability. To begin the execution of the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which are organised into a risk taxonomy. Labour was the most commonly reported top challenge; therefore, research was conducted to explore lean principles to improve productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects.
This enabled flexible computation without precise production or financial data, improving economic estimation accuracy. The model assessed two VPF cases (one in the UK and another in Japan), demonstrating the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases. An environmental impact assessment model was developed, allowing VPF operators to evaluate their carbon footprint compared to traditional agriculture using life-cycle assessment. I explore strategies for net-zero-carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
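    The probabilistic financial-risk idea can be sketched with a plain Monte Carlo simulation. All ranges below are invented placeholders, not data from the interviewed farms, and the thesis's imprecise-data techniques go well beyond simple triangular sampling; this only illustrates how uncertain inputs propagate into a profit distribution and a probability of loss.

```python
import random

random.seed(7)

# Illustrative triangular (low, mode, high) ranges; invented, not farm data.
YIELD_KG = (18_000, 25_000, 30_000)    # annual crop yield, kg
PRICE_KG = (4.0, 6.0, 9.0)             # sale price per kg
OPEX     = (90_000, 120_000, 160_000)  # annual operating cost

def tri(low, mode, high):
    # Note: random.triangular takes its arguments as (low, high, mode).
    return random.triangular(low, high, mode)

def simulate(n=10_000):
    """Draw n annual-profit scenarios under imprecise inputs."""
    return [tri(*YIELD_KG) * tri(*PRICE_KG) - tri(*OPEX) for _ in range(n)]

profits = simulate()
loss_probability = sum(p < 0 for p in profits) / len(profits)
```

    The fraction of simulated years with negative profit is a crude stand-in for the kind of financial-risk measure the DSS framework would surface to a prospective operator.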

    Root Locus Design for the Synchronization of Multi-Agent Systems in General Directed Networks

    This paper considers the synchronization problem of multi-agent SISO systems with general unidirectional communication structures. A distributed control strategy is presented which relies on relative output differences of neighboring agents and, thus, does not need relative state information. We propose a root locus design method to determine the synchronization gain. Since in directed networks the characteristic equation for synchronization might be complex valued, we use tools from the complex root locus technique to solve the synchronization task

    On the Principles of Evaluation for Natural Language Generation

    Natural language processing is concerned with the ability of computers to understand natural language texts, which is, arguably, one of the major bottlenecks on the way to the holy grail of general Artificial Intelligence. Given the unprecedented success of deep learning technology, the natural language processing community has been almost entirely focused on practical applications, with state-of-the-art systems emerging and competing for human-parity performance at an ever-increasing pace. For that reason, fair and adequate evaluation and comparison, responsible for ensuring trustworthy, reproducible and unbiased results, have long occupied the scientific community, not only in natural language processing but also in other fields. A popular example is the ISO-9126 evaluation standard for software products, which outlines a wide range of evaluation concerns, such as cost, reliability, scalability, security, and so forth. The European project EAGLES-1996, an acclaimed extension of ISO-9126, laid out the fundamental principles specifically for evaluating natural language technologies, which underpin succeeding methodologies in the evaluation of natural language. Natural language processing encompasses an enormous range of applications, each with its own evaluation concerns, criteria and measures. This thesis cannot hope to be comprehensive, but particularly addresses evaluation in natural language generation (NLG), which touches on, arguably, one of the most human-like natural language applications. In this context, research on quantifying day-to-day progress with evaluation metrics lays the foundation of the fast-growing NLG community. However, previous works have failed to address high-quality metrics in multiple scenarios, such as evaluating long texts and settings where human references are not available; more prominently, these studies are limited in scope, lacking a holistic view of principled NLG evaluation.
In this thesis, we aim for a holistic view of NLG evaluation from three complementary perspectives, driven by the evaluation principles in EAGLES-1996: (i) high-quality evaluation metrics, (ii) rigorous comparison of NLG systems for properly tracking progress, and (iii) understanding of evaluation metrics. To this end, we identify the current challenges derived from the inherent characteristics of these perspectives, and then present novel metrics, rigorous comparison approaches, and explainability techniques for metrics to address the identified issues. We hope that our work on evaluation metrics, system comparison and explainability for metrics inspires more research towards principled NLG evaluation, and contributes to fair and adequate evaluation and comparison in natural language processing.
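    As a toy illustration of the reference-based metric paradigm discussed above (not a metric proposed in the thesis), clipped n-gram precision, the building block of BLEU-style metrics, can be computed as follows; its reliance on a human reference is exactly the kind of limitation the thesis highlights for reference-free scenarios.

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def ngram_precision(candidate, reference, n=2):
    """Clipped n-gram precision of a candidate against one reference.
    A toy illustration of reference-based evaluation, not a thesis metric."""
    cand_counts = Counter(ngrams(candidate.split(), n))
    ref_counts = Counter(ngrams(reference.split(), n))
    if not cand_counts:
        return 0.0
    # Each candidate n-gram is credited at most as often as it occurs
    # in the reference ("clipping").
    overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
    return overlap / sum(cand_counts.values())
```

    For example, "the cat sat on the mat" scores 3/5 in bigram precision against "the cat is on the mat": three of its five bigrams appear in the reference.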

    Evaluating Stability in Massive Social Networks: Efficient Streaming Algorithms for Structural Balance

    Structural balance theory studies stability in networks. Given an n-vertex complete graph G=(V,E) whose edges are labeled positive or negative, the graph is considered balanced if every triangle either consists of three positive edges (three mutual "friends"), or one positive edge and two negative edges (two "friends" with a common "enemy"). From a computational perspective, structural balance turns out to be a special case of correlation clustering with the number of clusters at most two. The two main algorithmic problems of interest are: (i) detecting whether a given graph is balanced, or (ii) finding a partition that approximates the frustration index, i.e., the minimum number of edge flips that turn the graph balanced. We study these problems in the streaming model, where edges are given one by one, and focus on memory efficiency. We provide randomized single-pass algorithms for: (i) determining whether an input graph is balanced with O(log n) memory, and (ii) finding a partition that induces a (1 + ε)-approximation to the frustration index with O(n · polylog(n)) memory. We further provide several new lower bounds, complementing different aspects of our algorithms, such as the need for randomization or approximation. To obtain our main results, we develop a method using pseudorandom generators (PRGs) to sample edges between independently chosen vertices in graph streaming. Furthermore, our algorithm that approximates the frustration index improves the running time of the state-of-the-art correlation clustering with two clusters (the Giotis-Guruswami algorithm [SODA 2006]) from n^(O(1/ε^2)) to O(n^2 log^3 n / ε^2 + n log n · (1/ε)^(O(1/ε^4))) time for (1+ε)-approximation. These results may be of independent interest.
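    The balance criterion is equivalent to the vertices being separable into two camps such that positive edges stay within a camp and negative edges cross between camps. A classical offline check for this (not the paper's O(log n)-memory streaming algorithm) uses union-find augmented with edge parity:

```python
class ParityDSU:
    """Union-find where parity[v] records the sign relation of v to its
    root: 0 = same camp as the root, 1 = opposite camp."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.parity = [0] * n

    def find(self, v):
        if self.parent[v] != v:
            root, root_par = self.find(self.parent[v])
            self.parity[v] ^= root_par   # re-base v's parity onto the root
            self.parent[v] = root        # path compression
        return self.parent[v], self.parity[v]

    def union(self, u, v, rel):
        """rel = 0 for a positive edge (same camp), 1 for negative.
        Returns False if the edge contradicts earlier edges."""
        ru, pu = self.find(u)
        rv, pv = self.find(v)
        if ru == rv:
            return (pu ^ pv) == rel
        self.parent[ru] = rv
        self.parity[ru] = pu ^ pv ^ rel
        return True

def is_balanced(n, signed_edges):
    """signed_edges: iterable of (u, v, sign) with sign +1 or -1."""
    dsu = ParityDSU(n)
    return all(dsu.union(u, v, 0 if s > 0 else 1)
               for u, v, s in signed_edges)
```

    An all-positive triangle and a one-positive/two-negative triangle both pass, while a triangle with exactly one negative edge is detected as unbalanced, matching the two allowed triangle types in the definition above.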

    Nonholonomic Motion Planning as Efficient as Piano Mover's

    We present an algorithm for non-holonomic motion planning (or 'parking a car') that is as computationally efficient as a simple approach to solving the famous Piano-mover's problem, in which the non-holonomic constraints are ignored. The core of the approach is a graph discretization of the problem. The graph discretization is provably accurate in modeling the non-holonomic constraints, yet is nearly as small as the straightforward regular grid discretization of the Piano-mover's problem into a 3D volume of 2D position plus angular orientation. Where the Piano-mover's graph has one vertex with edges to six neighbors each, we have three vertices with a total of ten edges, increasing the graph size by less than a factor of two, and this factor does not depend on spatial or angular resolution. The local edge connections are organized so that they represent globally consistent turn and straight segments. The graph can be used with Dijkstra's algorithm, A*, value iteration or any other graph algorithm. Furthermore, the graph has a structure that lends itself to processing with deterministic massive parallelism: the turn and straight curves divide the configuration space into many parallel groups, which we use to develop a customized 'kernel-style' graph processing method. The result is an N-turn planner that requires no heuristics or load balancing and is as efficient as a simple solution to the Piano-mover's problem even in sequential form. In parallel form it is many times faster than sequential processing of the graph, and can run many times a second on a consumer-grade GPU while exploring a configuration-space pose grid with very high spatial and angular resolution. We prove approximation quality and computational complexity and demonstrate that it is a flexible, practical, reliable, and efficient component for a production solution.
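    For contrast, the naive Piano-mover's-style baseline the paper compares against can be sketched as Dijkstra over an (x, y, heading) lattice. The 45° angular resolution, the turn penalty and the move set below are arbitrary choices for illustration; this sketch deliberately lacks the paper's three-vertex structure and globally consistent turn/straight segments.

```python
import heapq
import math

# Naive (x, y, heading) lattice in the spirit of the Piano-mover's
# discretization; parameters are illustrative, not from the paper.
HEADINGS = 8  # 45-degree angular resolution

def neighbors(x, y, h):
    """Move one cell roughly along heading h, optionally turning one step."""
    for dh in (-1, 0, 1):
        nh = (h + dh) % HEADINGS
        ang = 2 * math.pi * nh / HEADINGS
        nx, ny = x + round(math.cos(ang)), y + round(math.sin(ang))
        cost = 1.0 if dh == 0 else 1.2   # arbitrary mild turn penalty
        yield (nx, ny, nh), cost

def shortest_path_cost(start, goal_xy, size=20, blocked=frozenset()):
    """Dijkstra over the (x, y, heading) graph; the goal accepts any heading."""
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (x, y, h) = heapq.heappop(pq)
        if d > dist.get((x, y, h), math.inf):
            continue  # stale queue entry
        if (x, y) == goal_xy:
            return d
        for (nx, ny, nh), c in neighbors(x, y, h):
            if not (0 <= nx < size and 0 <= ny < size) or (nx, ny) in blocked:
                continue
            nd = d + c
            if nd < dist.get((nx, ny, nh), math.inf):
                dist[(nx, ny, nh)] = nd
                heapq.heappush(pq, (nd, (nx, ny, nh)))
    return math.inf
```

    Because edges here encode only local heading changes, paths may chain turns that no real car could follow; the paper's graph avoids exactly this by making the turn and straight segments globally consistent.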

    Modelling and Solving the Single-Airport Slot Allocation Problem

    Currently, there are about 200 overly congested airports where airport capacity does not suffice to accommodate airline demand. These airports play a critical role in the global air transport system, since they handle 40% of global passenger demand and act as a bottleneck for the entire air transport system. This imbalance between airport capacity and airline demand leads to excessive delays, as well as multi-billion-dollar economic costs and huge environmental and societal costs. Concurrently, the implementation of airport capacity expansion projects requires time and space and is subject to significant resistance from local communities. As a short- to medium-term response, Airport Slot Allocation (ASA) has been used as the main demand management mechanism. The main goal of this thesis is to improve ASA decision-making through the proposition of models and algorithms that provide enhanced ASA decision support. In doing so, this thesis is organised into three distinct chapters that shed light on the following questions (I–V), which remain unaddressed by the existing literature. In parentheses, we identify the chapters of this thesis that relate to each research question. I. How to improve the modelling of airline demand flexibility and the utility that each airline assigns to each available airport slot? (Chapters 2 and 4) II. How can one model the dynamic and endogenous adaptation of the airport’s landside and airside infrastructure to the characteristics of airline demand? (Chapter 2) III. How to consider operational delays in strategic ASA decision-making? (Chapter 3) IV. How to involve the pertinent stakeholders in the ASA decision-making process to select a commonly agreed schedule; and how can one reduce the inherent decision complexity without compromising the quality and diversity of the schedules presented to the decision-makers? (Chapter 3) V.
Given that the ASA process involves airlines (submitting requests for slots) and coordinators (assigning slots to requests based on a set of rules and priorities), how can one jointly consider the interactions between these two sides to improve ASA decision-making? (Chapter 4) With regard to research questions (I) and (II), the thesis proposes a Mixed Integer Programming (MIP) model that considers airlines’ timing flexibility (research question I) and constraints that enable the dynamic and endogenous allocation of the airport’s resources (research question II). The proposed modelling variant addresses several additional problem characteristics and policy rules, and considers multiple efficiency objectives, while integrating all constraints that may affect airport slot scheduling decisions, including the asynchronous use of the different airport resources (runway, aprons, passenger terminal) and the endogenous consideration of the capabilities of the airport’s infrastructure to adapt to the airline demand’s characteristics and the aircraft/flight type associated with each request. The proposed model is integrated into a two-stage solution approach that considers all primary and several secondary policy rules of ASA. New combinatorial results and valid tightening inequalities that facilitate the solution of the problem are proposed and implemented. An extension of the above MIP model that considers the trade-offs among schedule displacement, maximum displacement, and the number of displaced requests, is integrated into a multi-objective solution framework. The proposed framework holistically considers the preferences of all ASA stakeholder groups (research question IV) concerning multiple performance metrics and models the operational delays associated with each airport schedule (research question III).
The delays of each schedule/solution are macroscopically estimated, and a subtractive clustering algorithm and a parameter tuning routine reduce the inherent decision complexity by pruning non-dominated solutions without compromising the representativeness of the alternatives offered to the decision-makers (research question IV). Following the determination of the representative set, the expected delay estimates of each schedule are further refined by considering the whole airfield’s operations, the landside, and the airside infrastructure. The representative schedules are ranked based on the preferences of all ASA stakeholder groups concerning each schedule’s displacement-related and operational-delay performance. Finally, in considering the interactions between airlines’ timing flexibility and utility, and the policy-based priorities assigned by the coordinator to each request (research question V), the thesis models the ASA problem as a two-sided matching game and provides guarantees on the stability of the proposed schedules. A Stable Airport Slot Allocation Model (SASAM) capitalises on the flexibility considerations introduced for addressing research question (I) through the exploitation of data submitted by the airlines during the ASA process and provides functions that proxy each request’s value considering both the airlines’ timing flexibility for each submitted request and the requests’ prioritisation by the coordinators when considering the policy rules defining the ASA process. The thesis argues on the compliance of the proposed functions with the primary regulatory requirements of the ASA process and demonstrates their applicability for different types of slot requests. SASAM guarantees stability through sets of inequalities that prune allocations blocking the formation of stable schedules. A multi-objective Deferred-Acceptance (DA) algorithm guaranteeing the stability of each generated schedule is developed. 
The algorithm can generate all stable non-dominated points by considering the trade-off between the spilled airline and passenger demand and maximum displacement. The work conducted in this thesis addresses several problem characteristics and sheds light on their implications for ASA decision-making, hence having the potential to improve ASA decision-making. Our findings suggest that the consideration of airlines’ timing flexibility (research question I) results in improved capacity utilisation and scheduling efficiency. The endogenous consideration of the ability of the airport’s infrastructure to adapt to the characteristics of airline demand (research question II) enables a more efficient representation of airport declared capacity that results in the scheduling of additional requests. The concurrent consideration of airlines’ timing flexibility and the endogenous adaptation of airport resources to airline demand achieves an improved alignment between the airport infrastructure and the characteristics of airline demand, thus proposing schedules of improved efficiency. The modelling and evaluation of the peak operational delays associated with the different airport schedules (research question III) allows the study of the implications of strategic ASA decision-making for operations and quantifies the impact of the airport’s declared capacity on each schedule’s operational performance. In considering the preferences of the relevant ASA stakeholders (airlines, coordinators, airport, and air traffic authorities) concerning multiple operational and strategic ASA efficiency metrics (research question IV), the thesis assesses the impact of alternative preference considerations and indicates a commonly preferred schedule that balances the stakeholders’ preferences.
The proposition of representative subsets of alternative schedules reduces decision complexity without significantly compromising the quality of the alternatives offered to the decision-making process (research question IV). The modelling of the ASA problem as a two-sided matching game (research question V) results in stable schedules consisting of request-to-slot assignments that provide no incentive to airlines and coordinators to reject or alter the proposed timings. Furthermore, the proposition of stable schedules results in more intensive use of airport capacity, while simultaneously improving scheduling efficiency. The models and algorithms developed as part of this thesis are tested using airline requests and airport capacity data from coordinated airports. Computational results that are relevant to the context of the considered airport instances provide evidence on the potential improvements for the current ASA process and facilitate data-driven policy and decision-making. In particular, with regard to the alignment of airline demand with the capabilities of the airport’s infrastructure (questions I and II), computational results report improved slot allocation efficiency and airport capacity utilisation, which for the considered airport instance translate to improvements ranging between 5% and 24% for various schedule performance metrics. In reducing the difficulty associated with the assessment of multiple ASA solutions by the stakeholders (question IV), instance-specific results suggest reductions in the number of alternative schedules by 87%, while maintaining the quality of the solutions presented to the stakeholders above 70% (expressed in relation to the initially considered set of schedules).
Meanwhile, computational results suggest that the concurrent consideration of ASA stakeholders’ preferences (research question IV) with regard to both operational (research question III) and strategic performance metrics leads to alternative airport slot scheduling solutions that inform on the trade-offs between the schedules’ operational and strategic performance and the stakeholders’ preferences. Concerning research question (V), the application of SASAM and the DA algorithm suggests improvements to the number of unaccommodated flights and passengers (13% and 40% improvements, respectively) at the expense of requests concerning fewer passengers and days of operations (increasing the number of rejected requests by 1.2% in relation to the total number of submitted requests). The research conducted in this thesis aids in the identification of limitations that should be addressed by future studies to further improve ASA decision-making. First, the thesis focuses on exact solution approaches that consider the landside and airside infrastructure of the airport and generate multiple schedules. Pre-processing techniques that identify the bottleneck of the airport’s capacity, i.e., landside and/or airside, could be used to reduce the size of the proposed formulations and improve the required computational times. Meanwhile, the development of multi-objective heuristic algorithms that consider several problem characteristics and generate multiple efficient schedules in reasonable computational times could extend the capabilities of the models proposed in this thesis and provide decision support for some of the world’s most congested airports. Furthermore, the thesis models and evaluates the operational implications of strategic airport slot scheduling decisions.
The explicit consideration of operational delays as an objective in ASA optimisation models and algorithms is an issue that merits investigation, since it may further improve the operational performance of the generated schedules. In accordance with current practice, the models proposed in this work consider deterministic capacity parameters. Future research could propose formulations that consider stochastic representations of airport declared capacity and improve strategic ASA decision-making through the anticipation of operational uncertainty and weather-induced capacity reductions. Finally, in modelling airlines’ utility for each submitted request and available time slot, the thesis proposes time-dependent functions that utilise available data to approximate airlines’ scheduling preferences. Future studies wishing to improve the accuracy of the proposed functions could utilise commercial data sources that provide route-specific information; or, in cases where such data are unavailable, employ data mining and machine learning methodologies to extract airlines’ time-dependent utility and preferences.
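    The deferred-acceptance mechanism underlying SASAM can be illustrated with the classic airline-proposing Gale-Shapley loop below. The airlines, slots and preference lists are invented for illustration; the thesis's multi-objective DA algorithm extends this basic procedure with displacement and demand objectives.

```python
def deferred_acceptance(airline_prefs, slot_prefs):
    """Textbook Gale-Shapley deferred acceptance, airlines proposing.
    airline_prefs[a]: list of slots, most preferred first.
    slot_prefs[s]:    list of airlines, highest coordinator priority first.
    Returns a stable airline -> slot matching."""
    rank = {s: {a: i for i, a in enumerate(p)} for s, p in slot_prefs.items()}
    next_choice = {a: 0 for a in airline_prefs}
    held = {}                       # slot -> airline tentatively held
    free = list(airline_prefs)
    while free:
        a = free.pop()
        if next_choice[a] >= len(airline_prefs[a]):
            continue                # airline a exhausted its list
        s = airline_prefs[a][next_choice[a]]
        next_choice[a] += 1
        if s not in held:
            held[s] = a             # slot tentatively accepts
        elif rank[s][a] < rank[s][held[s]]:
            free.append(held[s])    # slot trades up; old holder re-proposes
            held[s] = a
        else:
            free.append(a)          # proposal rejected; a tries its next slot
    return {a: s for s, a in held.items()}

# Invented toy instance: two airlines competing for two slots.
match = deferred_acceptance(
    {"A": ["s1", "s2"], "B": ["s1", "s2"]},
    {"s1": ["B", "A"], "s2": ["A", "B"]},
)
```

    Both airlines propose to s1 first; the coordinator's priority list breaks the tie in favour of B, and A settles for s2. No airline-slot pair would rather deviate from the result, which is the stability property the thesis's schedules guarantee.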

    Optimization-based selection of hydrants and valves control in water distribution networks for fire incidents management

    In England and Wales, water utilities reduce hydraulic pressure to a minimum regulatory threshold in order to reduce leakage and avoid financial penalties. However, utilities are not legally bound to guarantee specific flow rates from fire hydrants, thus posing a risk for firefighting. We formulate a biobjective mixed-integer nonlinear program (MINLP) to simultaneously determine control valve settings and the location of fire hydrants to be utilized in a water distribution network during urban fire incidents. The goal is to provide the required flow rate from the fire hydrants while minimizing 1) the distance of the utilized fire hydrants from the fire location and 2) the impact on customer supply. As the solution is required in real time, we propose an optimization-based heuristic, which relies on iteratively solving an NLP approximation and relaxation of the MINLP formulation. Furthermore, we assess the quality of the heuristic solutions for the presented case study by calculating global optimality bounds. The proposed heuristic is applied to an operational water distribution network.
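    As a much-simplified illustration of the hydrant-selection side of the problem (ignoring the network hydraulics, valve settings and customer-supply objective that the paper's MINLP heuristic handles), one can greedily open the nearest hydrants until a required flow is met. The hydrant names, distances and flows below are invented.

```python
def select_hydrants(hydrants, required_flow):
    """Greedy baseline: open hydrants nearest to the fire until the
    required flow is met.  A deliberate simplification: real achievable
    flows are coupled through the network hydraulics, which this ignores.
    hydrants: list of (name, distance_to_fire, achievable_flow)."""
    chosen, flow = [], 0.0
    for name, dist, q in sorted(hydrants, key=lambda h: h[1]):
        if flow >= required_flow:
            break
        chosen.append(name)
        flow += q
    return chosen, flow
```

    In the biobjective setting of the paper, such a distance-greedy choice is only one extreme of the trade-off curve, since concentrating demand near the fire can worsen the impact on customer supply elsewhere in the network.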