
    Water accounting in the Oroua River Catchment : a thesis submitted in partial fulfilment of the requirements for the degree of Master in Applied Science in Agricultural Engineering at Massey University

    With a growing population and limited water resources, there is an increasing need worldwide for better management of water resources. This is especially true when all, or nearly all, water resources are allocated to various uses. Effective strategies for obtaining more productivity while maintaining or improving the environment must be formulated. This can be achieved only after the water quantity, quality and uses have been understood and evaluated. One tool for analysing the situation, in order to gain a deeper understanding and possibly identify opportunities for better water management, is the recently proposed methodology of water accounting, which considers the components of the water balance and classifies them according to uses and the productivity of those uses. Identified changes in the quantity and quality of water can provide important clues for increasing water productivity. The water accounting methodology was tried in the Oroua River Catchment to evaluate its use as a way of assessing water availability and to identify opportunities for water savings in the catchment. The use of the methodology in a basin-wide water assessment was not successful because of insufficient rainfall data, especially in the State Forest Park, where most of the streamflow (approximately 80%) originates during low flows. In addition, the monthly climatic water balance model used failed to produce a reliable estimate of streamflow: the estimated streamflow volume was greatly underestimated compared with the recorded streamflow. Streamflow water accounting was able to assess water availability in the lower portion of the Oroua River, as the indicators gave a clear picture of the state of the river during the summer months. Water depletions from instream uses, which include waste assimilation, environmental maintenance, and free-water evaporation, comprised the largest part of the total streamflow depletions in the lower Oroua River. In some instances, the combined depletion from waste assimilation and free-water evaporation was more than three times the available water. Depletions from offstream uses, including municipal and industrial (M&I) and irrigation abstractions, comprised only a small portion of the total streamflow depletion. However, one limitation of the approach is that it did not account for return flows from irrigation and M&I diversions. Despite the limitations of the study, the use of the indicators helped in understanding the situation: the Depleted Fraction (DF available) indicator clearly showed how much further abstraction is allowed, and the Process Fraction (PF depleted) readily showed an opportunity for better use of water. It is recommended that the pollution effect also be included in the original water accounting methodology of Molden (1997). The pollution effect of different contaminants could be quantified by their dilution factor, i.e., the physical amount of water lost to pollution from the discharge of effluents is measured by the amount of upstream water that would be required to dilute the effluent back down to the maximum allowed concentration of pollutants.
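
    To make the recommended dilution-factor idea concrete, the short sketch below works through the implied mass balance; it is illustrative only, uses hypothetical variable names, and assumes complete mixing with a known background concentration, none of which is taken from the thesis itself.

        def dilution_depletion(effluent_volume, effluent_conc, max_conc, background_conc=0.0):
            # Upstream water volume needed to dilute an effluent discharge back down to
            # the maximum allowed concentration, assuming complete mixing (illustrative).
            # Mass balance: (C_e*V_e + C_b*V_d) / (V_e + V_d) = C_max, solved for V_d.
            return effluent_volume * (effluent_conc - max_conc) / (max_conc - background_conc)

        # Example: 1,000 m3/day of effluent at 50 mg/L against a 5 mg/L limit, with clean
        # upstream water, "depletes" 9,000 m3/day of river water through dilution.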

    Flow Smoothing and Denoising: Graph Signal Processing in the Edge-Space

    This paper focuses on devising graph signal processing tools for the treatment of data defined on the edges of a graph. We first show that conventional tools from graph signal processing may not be suitable for the analysis of such signals. More specifically, we discuss how the underlying notion of a 'smooth signal', inherited from (the typically considered variants of) the graph Laplacian, is not suitable when dealing with edge signals that encode a notion of flow. To overcome this limitation we introduce a class of filters based on the Edge-Laplacian, a special case of the Hodge-Laplacian for simplicial complexes of order one. We demonstrate how this Edge-Laplacian leads to low-pass filters that enforce (approximate) flow conservation in the processed signals. Moreover, we show how these new filters can be combined with more classical Laplacian-based processing methods on the line-graph. Finally, we illustrate the developed tools by denoising synthetic traffic flows on the London street network.
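
    As a rough illustration of the filtering idea (a minimal sketch, not the authors' implementation), an Edge-Laplacian low-pass filter can be written as a single linear solve, assuming a node-edge incidence matrix B with fixed, arbitrary edge orientations:

        import numpy as np

        def denoise_flow(B, f_noisy, alpha=1.0):
            # B: node-edge incidence matrix (n x m); f_noisy: noisy edge flow of length m.
            # Solves min_f ||f - f_noisy||^2 + alpha * ||B f||^2, a low-pass filter built
            # from the Edge-Laplacian L_e = B^T B that damps the divergent (non-flow-
            # conserving) component of the signal.
            m = B.shape[1]
            L_e = B.T @ B
            return np.linalg.solve(np.eye(m) + alpha * L_e, f_noisy)

    Larger values of alpha push the output closer to an exactly divergence-free flow, since the divergence penalty then dominates the data-fidelity term.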

    The Contribution of Vocational Training to Employment, Job-Related Skills and Productivity: Evidence from Madeira Island

    In this paper, we analyze the transition to the labour market of participants in vocational training in Madeira Island. In a first stage, we investigate how the employment status at different dates (one month, one year, and two years after the completion of the training program) depends on relevant variables, such as age, gender, education and the content and duration of the training. In a second stage, we use the individuals’ self-assessment regarding the effectiveness of the training program along three dimensions: employment, job-related skills and productivity. We find that respondents score training activities high in every dimension. Moreover, we find that training is more effective among the educated, indicating that vocational training is far from being remedial. We also find that long training programs and training in the area of tourism are particularly effective.
    Keywords: job-related skills, productivity, employment, training, ordered logit

    Network Inference from Consensus Dynamics

    We consider the problem of identifying the topology of a weighted, undirected network G from observing snapshots of multiple independent consensus dynamics. Specifically, we observe the opinion profiles of a group of agents for a set of M independent topics, and our goal is to recover the precise relationships between the agents, as specified by the unknown network G. In order to overcome the under-determinacy of the problem at hand, we leverage concepts from spectral graph theory and convex optimization to unveil the underlying network structure. More precisely, we formulate the network inference problem as a convex optimization that seeks to endow the network with certain desired properties, such as sparsity, while being consistent with the spectral information extracted from the observed opinions. This is complemented with theoretical results proving consistency as the number M of topics grows large. We further illustrate our method by numerical experiments, which showcase the effectiveness of the technique in recovering synthetic and real-world networks.
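
    The spectral idea can be sketched roughly as follows. The sketch assumes a diffusion-style consensus model in which the opinion covariance shares eigenvectors with the unknown Laplacian, treats cvxpy as an assumed dependency, and uses an illustrative objective and scaling constraint rather than the paper's exact program.

        import numpy as np
        import cvxpy as cp

        def infer_laplacian(Y):
            # Y: n x M matrix with one observed opinion profile per topic (column).
            n, M = Y.shape
            C = (Y @ Y.T) / M                   # sample covariance over the M topics
            _, V = np.linalg.eigh(C)            # spectral templates: estimated eigenvectors
            lam = cp.Variable(n, nonneg=True)   # Laplacian eigenvalues to optimise over
            L = V @ cp.diag(lam) @ V.T          # candidate Laplacian with those eigenvectors
            off_diag = np.ones((n, n)) - np.eye(n)
            constraints = [
                cp.multiply(off_diag, L) <= 0,  # non-positive off-diagonal (valid edge weights)
                L @ np.ones(n) == 0,            # zero row sums
                cp.sum(lam) == n,               # fix the scale to exclude the trivial solution
            ]
            objective = cp.Minimize(cp.sum(cp.abs(cp.multiply(off_diag, L))))  # promote sparsity
            cp.Problem(objective, constraints).solve()
            return L.value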

    Spectral partitioning of time-varying networks with unobserved edges

    We discuss a variant of 'blind' community detection, in which we aim to partition an unobserved network from the observation of a (dynamical) graph signal defined on the network. We consider a scenario where our observed graph signals are obtained by filtering white noise input, and the underlying network is different for every observation. In this fashion, the filtered graph signals can be interpreted as defined on a time-varying network. We model each of the underlying network realizations as generated by an independent draw from a latent stochastic blockmodel (SBM). To infer the partition of the latent SBM, we propose a simple spectral algorithm for which we provide a theoretical analysis and establish consistency guarantees for the recovery. We illustrate our results using numerical experiments on synthetic and real data, highlighting the efficacy of our approach.
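
    The core of such a spectral approach might be sketched as follows; the paper's actual algorithm and guarantees differ in detail, and scikit-learn is an assumed dependency for the clustering step.

        import numpy as np
        from sklearn.cluster import KMeans  # assumed dependency for the clustering step

        def blind_partition(Y, num_groups):
            # Y: n x T matrix of observed graph signals, one filtered-noise snapshot per
            # column, each generated on an unobserved network drawn from the same latent SBM.
            n, T = Y.shape
            C = (Y @ Y.T) / T                                # aggregate sample covariance
            eigvals, eigvecs = np.linalg.eigh(C)
            idx = np.argsort(-np.abs(eigvals))[:num_groups]  # dominant spectral components
            embedding = eigvecs[:, idx]
            return KMeans(n_clusters=num_groups, n_init=10).fit_predict(embedding)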

    Graph-based Semi-Supervised & Active Learning for Edge Flows

    We present a graph-based semi-supervised learning (SSL) method for learning edge flows defined on a graph. Specifically, given flow measurements on a subset of edges, we want to predict the flows on the remaining edges. To this end, we develop a computational framework that imposes certain constraints on the overall flows, such as (approximate) flow conservation. These constraints render our approach different from classical graph-based SSL for vertex labels, which posits that tightly connected nodes share similar labels and leverages the graph structure accordingly to extrapolate from a few vertex labels to the unlabeled vertices. We derive bounds for our method's reconstruction error and demonstrate its strong performance on synthetic and real-world flow networks from transportation, physical infrastructure, and the Web. Furthermore, we provide two active learning algorithms for selecting informative edges on which to measure flow, which has applications for optimal sensor deployment. The first strategy selects edges to minimize the reconstruction error bound and works well on flows that are approximately divergence-free. The second approach clusters the graph and selects bottleneck edges that cross cluster boundaries, which works well on flows with global trends.
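
    A bare-bones version of the conservation-based reconstruction might look like the following sketch, which assumes a node-edge incidence matrix B, treats the measured flows as hard labels, and adds a simple ridge term; the formulation and error bounds in the paper differ.

        import numpy as np

        def reconstruct_flows(B, labeled_idx, labeled_vals, lam=0.1):
            # B: node-edge incidence matrix (n x m); labeled_idx/labeled_vals: measured edges.
            # Fills in the remaining flows by (approximately) enforcing flow conservation:
            # minimise ||B f||^2 + lam^2 ||f_unlabeled||^2 with measured entries held fixed.
            m = B.shape[1]
            labeled_idx = np.asarray(labeled_idx)
            labeled_vals = np.asarray(labeled_vals, dtype=float)
            unlabeled_idx = np.setdiff1d(np.arange(m), labeled_idx)
            f = np.zeros(m)
            f[labeled_idx] = labeled_vals
            b = -B[:, labeled_idx] @ labeled_vals            # divergence induced by measurements
            A = B[:, unlabeled_idx]
            k = len(unlabeled_idx)
            stacked_A = np.vstack([A, lam * np.eye(k)])
            stacked_b = np.concatenate([b, np.zeros(k)])
            f[unlabeled_idx] = np.linalg.lstsq(stacked_A, stacked_b, rcond=None)[0]
            return f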