
    Bio-inspired optimization in integrated river basin management

    Water resources worldwide are facing severe challenges in terms of quality and quantity. It is essential to conserve, manage, and optimize water resources and their quality through integrated water resources management (IWRM). IWRM is an interdisciplinary field that works on multiple levels to maximize the socio-economic and ecological benefits of water resources. Since these benefits are directly influenced by the river’s ecological health, the point of interest should start at the basin level. The main objective of this study is to evaluate the application of bio-inspired optimization techniques in integrated river basin management (IRBM). This study demonstrates the application of versatile, flexible, and yet simple metaheuristic bio-inspired algorithms in IRBM. In a novel approach, the bio-inspired optimization algorithms Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) are used to spatially distribute mitigation measures within a basin to reduce the long-term annual mean total nitrogen (TN) concentration at the outlet of the basin. The Upper Fuhse river basin, set up in the hydrological model Hydrological Predictions for the Environment (HYPE), is used as a case study. ACO and PSO are coupled with the HYPE model to distribute a set of measures and compute the resulting TN reduction. The algorithms spatially distribute nine crop- and subbasin-level mitigation measures under four categories. Both algorithms successfully yield a discrete combination of measures that reduces the long-term annual mean TN concentration. They achieved an 18.65% reduction, and their performance was on par with each other. This study has established the applicability of these bio-inspired optimization algorithms in successfully distributing TN mitigation measures within the river basin.
    Stakeholder involvement is a crucial aspect of IRBM. It ensures that researchers and policymakers are aware of the ground reality through the large amounts of information collected from stakeholders. Including stakeholders in policy planning and decision-making legitimizes the decisions and eases their implementation. Therefore, a socio-hydrological framework is developed and tested in the Larqui river basin, Chile, based on a field survey, to explore the conditions under which farmers would implement or extend the width of vegetative filter strips (VFS) to prevent soil erosion. The framework consists of a behavioral social model (the extended Theory of Planned Behavior, TPB) and an agent-based model (ABM, developed in NetLogo) coupled with the results of the vegetative filter model (Vegetative Filter Strip Modeling System, VFSMOD-W). The results showed that the ABM corroborates the survey results: farmers are willing to extend the width of the VFS as long as their utility stays positive. This framework can be used to develop tailor-made policies based on the conditions of the river basin and the stakeholders' requirements, motivating them to adopt sustainable practices.
    It is vital to assess whether the proposed management plans achieve the expected results for the river basin and whether the stakeholders will accept and implement them. Assessment via simulation tools ensures effective implementation and realization of the targets stipulated by the decision-makers. In this regard, this dissertation introduces the application of bio-inspired optimization techniques to the field of IRBM. The successful discrete combinatorial optimization of the spatial distribution of mitigation measures by ACO and PSO, together with the novel socio-hydrological framework using the ABM, demonstrates the strength and diverse applicability of bio-inspired optimization algorithms.
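
    As an illustration of the kind of discrete combinatorial search described above, the following sketch applies a basic Ant Colony Optimization loop to assign one mitigation measure to each subbasin. The subbasin count, the measure catalogue, and the surrogate estimate_tn function are hypothetical stand-ins for the HYPE model coupling used in the dissertation.

```python
# Illustrative sketch: Ant Colony Optimization assigning one mitigation
# measure to each subbasin so that a surrogate estimate of the mean total
# nitrogen (TN) concentration at the basin outlet is minimized.  In the
# dissertation the evaluation is done by the HYPE model; here `estimate_tn`
# and the measure names are hypothetical placeholders.
import random

N_SUBBASINS = 12          # hypothetical number of subbasins
MEASURES = ["none", "buffer_strip", "catch_crop",
            "reduced_fertilizer", "wetland"]   # hypothetical measures

def estimate_tn(assignment):
    """Hypothetical surrogate for the HYPE-computed mean TN concentration.
    Lower is better; a real study would run the hydrological model here."""
    effect = {"none": 0.0, "buffer_strip": 0.04, "catch_crop": 0.03,
              "reduced_fertilizer": 0.05, "wetland": 0.06}
    baseline = 5.0  # mg/l, placeholder
    return baseline * (1.0 - sum(effect[m] for m in assignment) / N_SUBBASINS)

def aco(n_ants=20, n_iter=50, rho=0.1, q=1.0):
    # One pheromone value per (subbasin, measure) pair.
    tau = [[1.0] * len(MEASURES) for _ in range(N_SUBBASINS)]
    best, best_tn = None, float("inf")
    for _ in range(n_iter):
        solutions = []
        for _ in range(n_ants):
            # Each ant picks a measure for every subbasin, biased by pheromone.
            assignment = [random.choices(MEASURES, weights=tau[s])[0]
                          for s in range(N_SUBBASINS)]
            solutions.append((estimate_tn(assignment), assignment))
        solutions.sort(key=lambda sol: sol[0])
        if solutions[0][0] < best_tn:
            best_tn, best = solutions[0]
        # Evaporate, then deposit pheromone along the iteration-best solution.
        for s in range(N_SUBBASINS):
            for m in range(len(MEASURES)):
                tau[s][m] *= (1.0 - rho)
        for s, measure in enumerate(solutions[0][1]):
            tau[s][MEASURES.index(measure)] += q / solutions[0][0]
    return best, best_tn

if __name__ == "__main__":
    assignment, tn = aco()
    print(f"best TN estimate: {tn:.3f} mg/l")
    print(dict(enumerate(assignment)))
```

    A PSO variant would replace the pheromone update with velocity-style attraction toward personal and global best assignments, but the overall coupling to the evaluation model is the same.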

    Beyond the two-state model of switching in biology and computation

    The thesis presents various perspectives on physical and biological computation. Our fundamental object of study in both these contexts is the notion of switching/erasing a bit. In a physical context, a bit is represented by a particle in a double well, whose dynamics is governed by the Langevin equation. We define the notions of reliability and erasing time-scales in addition to the work required to erase a bit for a given family of control protocols. We call bits “optimal” if they meet the required reliability and erasing time requirements with minimal work cost. We find that optimal bits always saturate the erasing time requirement, but may not saturate the reliability time requirement. This allows us to eliminate several regions of parameter space as sub-optimal. In a biological context, our bits are represented by substrates that are acted upon by catalytic enzymes. We define retroactivity as the back-signal propagated by the downstream system when connected to the upstream system. We analyse certain upstream systems that can help mitigate retroactivity. However, these systems require a substantial pool of resources and are therefore not optimal. As a consequence, we turn our attention to insulating networks called push-pull motifs. We find that high rates of energy consumption are not essential to alleviate retroactivity in push-pull motifs; all we need is to couple weakly to the upstream system. However, this approach is not resilient to cross-talk caused by leak reactions in the circuit. Next, we consider a single enzyme-substrate reaction and analyse its mechanism. Our system has two intermediate states (enzyme-substrate complexes). Our main question is “How should we choose the binding energies of the intermediates to minimize sequestration of substrates (retroactivity), whilst maintaining a minimum flux at steady-state?”. Choosing very low binding energies increases retroactivity, since the system spends a considerable proportion of time in the intermediate states. Choosing binding energies that are very high reduces retroactivity, but hinders the progress of the reaction. As a result, we find that the optimal binding energies are both moderate, and indeed tuned with each other. In particular, their difference is related to the free energy difference between the products and the reactants.
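
    As a rough illustration of the physical setting in the first part, the sketch below integrates an overdamped Langevin particle in a quartic double well and erases the bit by ramping up a linear tilt. The potential, noise strength, and ramp protocol are assumed values for illustration, not the thesis's actual control protocols or its work accounting.

```python
# Illustrative sketch: an overdamped Langevin "bit" in a quartic double well,
# erased by slowly ramping a linear tilt.  All parameter values are
# illustrative assumptions.
import numpy as np

def force(x, tilt):
    # U(x) = x**4/4 - x**2/2 + tilt*x  ->  force = -dU/dx
    return -(x**3 - x + tilt)

def erase_bit(x0=-1.0, kT=0.05, t_erase=20.0, dt=1e-3, tilt_max=0.8, seed=0):
    """Euler-Maruyama integration of dx = F(x) dt + sqrt(2 kT dt) * N(0, 1).
    The tilt ramps linearly from 0 to tilt_max, biasing the particle toward
    the right-hand well (the 'erased' state x > 0)."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_erase / dt)
    x = x0
    for i in range(n_steps):
        tilt = tilt_max * i / n_steps
        x += force(x, -tilt) * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    return x

if __name__ == "__main__":
    # Start from both wells; a reliable erasure ends in the target well either way.
    finals = [erase_bit(x0=x0, seed=s) for s in range(10) for x0 in (-1.0, 1.0)]
    reliability = np.mean(np.array(finals) > 0)
    print(f"fraction of trajectories ending in the target well: {reliability:.2f}")
```

    Sweeping the ramp duration and noise strength in such a toy model is one way to see the trade-off between erasing time, reliability, and the work spent on the protocol.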

    Theory and Engineering of Scheduling Parallel Jobs

    Scheduling is essential for the efficient utilization of modern parallel computing systems. This thesis investigates four main research areas of scheduling: the interplay and distribution of decision makers, efficient schedule computation, efficient scheduling for the memory hierarchy, and energy efficiency. The main result is a provably fast and efficient scheduling algorithm for malleable jobs. Experiments show the importance and the possibilities of scheduling that takes the memory hierarchy into account.
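
    The abstract does not spell out the algorithm, so the sketch below only illustrates a generic two-phase treatment of moldable/malleable jobs: choose a processor allotment per job, then list-schedule the resulting rigid jobs. The job data, the square-root allotment rule, and the linear-speedup assumption are illustrative, not the thesis's provably efficient method.

```python
# Illustrative sketch: a generic two-phase scheduler for moldable parallel jobs.
# Phase 1 picks a processor allotment per job; phase 2 list-schedules the
# resulting rigid jobs on m processors.  All numbers are illustrative.
import heapq
import math

def schedule(jobs, m):
    """jobs: list of (work, max_procs). Returns the makespan, assuming
    linear speedup up to each job's maximum parallelism."""
    allotted = []
    for work, max_p in jobs:
        p = min(max_p, m, max(1, math.ceil(math.sqrt(work))))
        allotted.append((work / p, p))            # (runtime, processors)
    allotted.sort(key=lambda jp: -jp[0])          # longest runtime first
    free, t, events = m, 0.0, []                  # events: (finish_time, procs)
    queue = list(allotted)
    while queue or events:
        # Start every queued job that currently fits on the free processors.
        started = []
        for runtime, p in queue:
            if p <= free:
                free -= p
                heapq.heappush(events, (t + runtime, p))
                started.append((runtime, p))
        for job in started:
            queue.remove(job)
        # Advance time to the next completion and release its processors.
        if events:
            t, p = heapq.heappop(events)
            free += p
    return t

if __name__ == "__main__":
    example_jobs = [(100, 8), (60, 4), (30, 2), (90, 16), (10, 1)]
    print(f"makespan on 16 processors: {schedule(example_jobs, 16):.2f}")
```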

    Methodology for Analyzing and Characterizing Error Generation in Presence of Autocorrelated Demands in Stochastic Inventory Models

    Most techniques that describe and solve stochastic inventory problems rely upon the assumption of independently and identically distributed (IID) demands. Stochastic inventory formulations that fail to capture serially correlated components in the demand lead to serious errors. This dissertation provides a robust method that approximates solutions to the stochastic inventory problem where the control review system is continuous, the demand contains autocorrelated components, and the lost-sales case is considered. A simulation optimization technique based on simulated annealing (SA), pattern search (PS), and ranking and selection (R&S) is developed and used to generate near-optimal solutions. The proposed method accounts for the randomness and dependency of the demand as well as for the inherent constraints of the inventory model. The impact of serially correlated demand is investigated for discrete and continuous dependent input models. For the discrete dependent model, the autocorrelated demand is assumed to behave as a discrete Markov-modulated chain (DMC), while a first-order autoregressive AR(1) process is assumed for describing the continuous demand. The effects of these demand patterns, combined with structural cost variations, on estimating both total costs and control-policy parameters were examined. Results demonstrated that formulations that ignore the serially correlated component performed worse than those that considered it. In this setting, the effect of holding cost and its interaction with penalty cost become stronger and more significant as the serially correlated component increases. The growth rate of the error generated in total costs by formulations that ignore dependency components is significant and fits exponential models. To verify the effectiveness of the proposed simulation optimization method in finding the near-optimal inventory policy, total costs and stockout rates were estimated at different levels of the autocorrelation factor. The results provide additional evidence that serially correlated components in the demand have a relevant impact on determining inventory control policies and estimating measures of performance.
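
    As a hedged illustration of the simulation-optimization idea, the sketch below couples AR(1)-autocorrelated demand with a continuous-review (r, Q) lost-sales simulation and searches the policy parameters with simulated annealing only. All cost parameters, the AR(1) coefficients, and the annealing schedule are assumptions, and the pattern-search and ranking-and-selection stages of the dissertation's method are omitted.

```python
# Illustrative sketch: simulated annealing over a continuous-review (r, Q)
# lost-sales policy under AR(1)-autocorrelated demand.  All parameter values
# are illustrative assumptions.
import math
import random

def ar1_demand(n, mu=20.0, phi=0.6, sigma=4.0, rng=None):
    """Daily demand with lag-1 autocorrelation phi (truncated at zero)."""
    rng = rng or random.Random(0)
    d, demands = mu, []
    for _ in range(n):
        d = mu + phi * (d - mu) + rng.gauss(0.0, sigma)
        demands.append(max(0.0, d))
    return demands

def simulate_cost(r, q, horizon=2000, lead_time=5, h=1.0, p=20.0, k=50.0, seed=0):
    """Average daily cost of an (r, Q) lost-sales policy: h = holding cost per
    unit per day, p = lost-sale penalty per unit, k = fixed ordering cost.
    The fixed seed gives common random numbers across candidate policies."""
    demands = ar1_demand(horizon, rng=random.Random(seed))
    on_hand, on_order, cost = r + q, [], 0.0
    for t, d in enumerate(demands):
        # Receive orders whose lead time has elapsed.
        on_hand += sum(amt for due, amt in on_order if due <= t)
        on_order = [(due, amt) for due, amt in on_order if due > t]
        # Satisfy demand; unmet demand is lost and penalized.
        sold = min(on_hand, d)
        cost += p * (d - sold)
        on_hand -= sold
        cost += h * on_hand
        # Continuous review: reorder Q when the inventory position drops to r.
        position = on_hand + sum(amt for _, amt in on_order)
        if position <= r:
            on_order.append((t + lead_time, q))
            cost += k
    return cost / horizon

def anneal(n_iter=300, t0=50.0, cooling=0.99, seed=1):
    rng = random.Random(seed)
    r, q = 50.0, 100.0                      # initial policy guess
    cur = best = simulate_cost(r, q)
    best_rq, temp = (r, q), t0
    for _ in range(n_iter):
        nr = max(0.0, r + rng.gauss(0, 5))
        nq = max(1.0, q + rng.gauss(0, 5))
        cand = simulate_cost(nr, nq)
        # Accept improvements always, worse moves with Boltzmann probability.
        if cand < cur or rng.random() < math.exp((cur - cand) / temp):
            r, q, cur = nr, nq, cand
            if cur < best:
                best, best_rq = cur, (nr, nq)
        temp *= cooling
    return best_rq, best

if __name__ == "__main__":
    (r, q), cost = anneal()
    print(f"near-optimal policy: r={r:.1f}, Q={q:.1f}, avg cost/day={cost:.2f}")
```

    Swapping the AR(1) generator for a discrete Markov-modulated chain, and following the annealing stage with pattern search and ranking-and-selection over replicated simulations, would bring the sketch closer to the procedure described in the abstract.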