66 research outputs found

    Priorities for ecological restoration in the Western Woodlands Way


    Informing tropical mammal conservation in human-modified landscapes using remote technologies and hierarchical modelling

    The aggressive expansion of anthropogenic activities is placing increasing pressure on biodiversity, particularly in tropical regions. Here, conservation efforts are hindered by poor understanding of species ecology and the failure of policy instruments to account for multiple stressors of land-use change. While protected areas are central to conservation strategies, there is a general consensus that the future of tropical biodiversity will be determined by how well modified landscapes are managed. In this thesis I advance our understanding of biodiversity persistence in modified tropical landscapes to inform emerging incentive-based policy mechanisms and supply-chain initiatives. Capitalising on recent advances in remote sensing and hierarchical occupancy modelling, I provide a spatial appraisal of biodiversity in a modified landscape in Sabah, Malaysian Borneo. Fieldwork was conducted at the Stability of Altered Forest Ecosystems (SAFE) project, a large-scale landscape modification experiment comprising a degradation gradient of old growth forest, selectively logged forest, remnant forest patches and oil palm plantations. The assessment focused on camera-trapping of tropical mammals, as they are sensitive to anthropogenic stressors, occupy key trophic positions, and are prioritised in conservation. In Chapter 2 I link mammal occupancy data to airborne multispectral remote-sensing information to show how the conservation value of modified landscapes is dictated by the intensity of the underlying land-use. Logged forests retained appreciable levels of mammal diversity, whereas oil palm areas were largely devoid of forest specialists and threatened taxa. Moreover, many mammal species disproportionately occupied forested areas that retained old growth structural characteristics. The most influential structural measures accounted for vertical and horizontal components in environmental space, which cannot currently be derived from conventional satellite data.
Using a novel application of ecological threshold analysis, I demonstrate how multispectral data and multi-scale occupancy models can help identify conservation and restoration areas in degraded forests. In Chapter 3 I assess the potential for carbon-orientated policy mechanisms (the High Carbon Stock (HCS) Approach and REDD+) to prioritise high carbon areas with corresponding biodiversity value in highly modified landscapes. The areas of highest carbon value prioritised via HCS supported comparable species diversity to old growth forest. However, the strength, nature and extent of the biodiversity co-benefit depended on how carbon was characterised, the spatial resolution of carbon data, and the species considered. In Chapter 4 I further scrutinised HCS protocols to evaluate how well they delineated high priority forest patches that safeguard the species most vulnerable to land-use change (i.e. IUCN threatened species). The minimum core area required to define a high priority patch (100 ha) supported only 35% of the mammal community. In fact, the core area criterion would need to increase to 3,199 ha to sustain intact mammal assemblages, and by an order of magnitude more if hunting pressure were considered. These findings underline the importance of integrating secondary disturbance impacts into spatial conservation planning. Provided landscape interventions are directed to where they will have the greatest impact, they can be financially self-sustaining and garner local support for conservation. To this end I provide recommendations to guide policy implementation in modified tropical landscapes and support holistic conservation strategies.
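A minimal sketch of the single-season occupancy model that underlies this kind of camera-trap analysis (not the thesis's exact multi-scale, hierarchical formulation): each site is occupied with probability psi, and an occupied site yields a detection with probability p on each survey visit. The detection histories and the crude grid-search fit below are purely illustrative.

```python
import math
from itertools import product

def site_likelihood(history, psi, p):
    """Likelihood of one site's detection history (0/1 per visit) under a
    single-season occupancy model."""
    d = sum(history)
    k = len(history)
    # Occupied with probability psi: Bernoulli detections on each visit.
    occupied = psi * p ** d * (1 - p) ** (k - d)
    # Unoccupied: only an all-zero history is possible.
    unoccupied = (1 - psi) if d == 0 else 0.0
    return occupied + unoccupied

def neg_log_lik(histories, psi, p):
    return -sum(math.log(site_likelihood(h, psi, p)) for h in histories)

def fit_grid(histories, step=0.01):
    """Crude maximum-likelihood fit of (psi, p) over a probability grid."""
    grid = [i * step for i in range(1, int(1 / step))]
    return min(product(grid, repeat=2),
               key=lambda pair: neg_log_lik(histories, *pair))

# Hypothetical detection histories for five camera-trap sites, three visits:
histories = [[1, 0, 1], [0, 0, 0], [1, 1, 0], [0, 0, 0], [0, 1, 0]]
psi_hat, p_hat = fit_grid(histories)
```

The key property the model captures is that an all-zero history can arise either from true absence or from an occupied site that was never detected, which is why psi and p must be estimated jointly.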

    Safe Class and Data Evolution in Large and Long-Lived Java Applications

    There is a growing class of applications implemented in object-oriented languages that are large and complex, that exploit object persistence, and that need to run uninterrupted for long periods of time. Development and maintenance of such applications can present challenges in the following interrelated areas: consistent and scalable evolution of persistent data and code, optimal build management, and runtime changes to applications. The research presented in this thesis addresses these issues. Since Java is becoming an increasingly popular platform for implementing large and long-lived applications, it was chosen for our experiments. The first part of the research was undertaken in the context of the PJama system, an orthogonally persistent platform for Java. A technology that supports persistent class and object evolution for this platform was designed, built and evaluated. This technology integrates build management, persistent class evolution, and support for several forms of eager conversion of persistent objects. Research in build management for Java has resulted in the creation of a generally applicable, compiler-independent smart recompilation technology, which can be re-used in a Java IDE, or as a standalone Java-specific utility similar to make. The technology for eager object conversion that we developed allows developers to perform arbitrarily complex changes to persistent objects and their collections. A high level of developer control over the conversion process was achieved in part due to the introduction of a mechanism for dynamic renaming of old class versions. This mechanism was implemented using minor non-standard extensions to the Java language. However, we also demonstrate how to achieve nearly the same results without modifying the language specification. In this form, we believe, our technology can be largely re-used with practically any persistent object solution for Java.
The second part of this research was undertaken using as an implementation platform the HotSpot Java Virtual Machine (JVM), which is currently Sun's main production JVM. A technology was developed that allows engineers to redefine classes on-the-fly in the running VM. Our main focus was on the runtime evolution of server-type applications, though we also address modification of applications running in the debugger. Unlike the only other similar system for Java known to us, our technology supports redefinition of classes that have currently active methods. Several policies for handling such methods have been proposed; one of them is currently operational, and another is at the experimental stage. We also propose to re-use the runtime evolution technology for dynamic fine-grain profiling of applications.
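The core idea of smart recompilation can be sketched as follows. This is a hypothetical Python sketch, not the thesis's Java implementation; the names and data structures are invented. A class is recompiled only if its own source changed, or if the *interface* (public signatures) of a class it depends on changed; body-only edits do not propagate to dependents, unlike a timestamp-based make.

```python
# Hypothetical sketch of compiler-independent smart recompilation.

def plan_recompilation(src_hash, iface_hash, deps, prev_src, prev_iface):
    """src_hash/iface_hash: current hashes of each class's source and of
    its public interface; deps: class -> set of classes it depends on;
    prev_*: the same hashes recorded at the previous successful build."""
    changed_src = {c for c in src_hash if src_hash[c] != prev_src.get(c)}
    changed_iface = {c for c in iface_hash
                     if iface_hash[c] != prev_iface.get(c)}
    # Only interface changes force dependent classes to be recompiled.
    affected = {c for c in deps if deps[c] & changed_iface}
    return changed_src | affected

deps = {"Main": {"Util"}, "Util": set(), "Log": set()}
prev_src = {"Main": "s1", "Util": "s2", "Log": "s3"}
prev_iface = {"Main": "i1", "Util": "i2", "Log": "i3"}

# Util's method body changed, but its interface hash is unchanged:
plan = plan_recompilation({"Main": "s1", "Util": "s2b", "Log": "s3"},
                          {"Main": "i1", "Util": "i2", "Log": "i3"},
                          deps, prev_src, prev_iface)
# Only Util is rebuilt; a timestamp-based make would also rebuild Main.
```

Hashing the interface separately from the full source is what makes the scheme compiler-independent: any compiler can be used, as long as the build manager can extract and compare public signatures.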

    Phylogenetic and functional growth form diversification in the Cape grass genus Ehrharta Thunb.

    This thesis uses phylogenetic and comparative data to test a hypothesis of adaptive radiation in the Cape grass genus Ehrharta Thunb. sensu stricto. Morphological data and sequence data from two noncoding regions of DNA (ITS1 and trnL-F) are used to produce a phylogenetic hypothesis for the tribe Ehrharteae. Combined analysis of these data sets resolves four principal clades that approximate the genera Ehrharta s.s., Microlaena, Tetrarrhena and Zotovia, and this result thus supports a four-genus classification. Poor resolution and a reduction in branch length at the base of a clade nested within Ehrharta s.s. suggest past radiation. Parsimony-based reconstruction of ancestral habitats and growth form attributes indicates that this radiation is associated with a historical transition to seasonally drier but more fertile habitats, and the coincident or subsequent evolution of several growth form novelties (e.g. buried and swollen culm bases and annualness). These traits are interpreted to reflect divergent strategies for surviving seasonal drought (i.e. via seed or storage). Much higher transpiration rates in summer-deciduous leaves than in perennating culms of two species suggest that the evolution of summer-deciduous foliage was important in the occupation of seasonally-arid habitats. Controlled growth experiments are used to test the hypothesis that divergence in persistence traits is associated with differences in seedling biomass allocation and relative growth rate (RGR). Ehrharta s.s. shows wide variation in seedling RGR, and regressions based on phylogenetically independent contrasts suggest that differences are better explained by early biomass allocation than by leaf area indices. Species with a high allocation to leaves grow faster and flower sooner, so these traits are typical of seeding species.
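The comparative method behind those regressions, Felsenstein's phylogenetically independent contrasts, can be illustrated with a small sketch. The three-taxon tree, branch lengths and trait values below are hypothetical, not the Ehrharta data.

```python
# Minimal sketch of phylogenetically independent contrasts (PIC).

def contrasts(node):
    """node is ('name', trait_value) for a tip, or a pair
    ((left_subtree, branch_length), (right_subtree, branch_length)).
    Returns (trait estimate at node, extra branch length to add above the
    node, list of standardised contrasts collected in the subtree)."""
    if isinstance(node[0], str):                 # tip
        return node[1], 0.0, []
    (left, bl), (right, br) = node
    xl, el, cl = contrasts(left)
    xr, er, cr = contrasts(right)
    vl, vr = bl + el, br + er                    # lengths plus corrections
    c = (xl - xr) / (vl + vr) ** 0.5             # standardised contrast
    x = (xl / vl + xr / vr) / (1 / vl + 1 / vr)  # weighted ancestral value
    return x, vl * vr / (vl + vr), cl + cr + [c]

inner = ((("A", 1.0), 1.0), (("B", 3.0), 1.0))   # A and B are sister tips
root = ((inner, 0.5), (("C", 2.0), 2.0))
x_root, extra, cs = contrasts(root)
# cs holds one contrast per internal node; regressing contrasts in one
# trait on contrasts in another (through the origin) tests for correlated
# evolution without treating related species as independent data points.
```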

    Water Allocation Under Uncertainty – Potential Gains from Optimisation and Market Mechanisms

    This thesis first develops a range of wholesale water market design options, based on an optimisation approach to market-clearing, as in electricity markets, focusing on the extent to which uncertainty is accounted for in bidding, market-clearing and contract formation. We conclude that the most promising option is bidding for, and trading, a combination of fixed and proportionally scaled contract volumes, which are based on optimised outputs. Other options include those which are based on a post-clearing fit (e.g. regression) to the natural optimised outputs, or constraining the optimisation such that cleared allocations are in the contractual form required by participants. Alternatively, participants could rely on financial markets to trade instruments, but informed by a centralised market-clearing simulation. We then describe a computational modelling system, using Stochastic Constructive Dynamic Programming (CDDP), and use it to assess the importance of modelling uncertainty, and correlations, in reservoir optimisation and/or market-clearing, under a wide range of physical and economic assumptions, with or without a market. We discuss a number of bases of comparison, but focus on the benefit gain achieved as a proportion of the perfectly competitive market value (price times quantity), calculated using the market clearing price from Markov Chain optimisation. With inflow and demand completely out of phase, high inflow seasonality and volatility, and a constant elasticity of -0.5, the greatest contribution of stochastic (Markov) optimisation, as a proportion of market value was 29%, when storage capacity was only 25% of mean monthly inflow, and with effectively unlimited release capacity. This proportional gain fell only slowly for higher storage capacities, but nearly halved for lower release capacities, around the mean monthly inflow, mainly because highly constrained systems produce high prices, and hence raise market value. 
The highest absolute gain actually occurred when release capacity was only 75% of mean monthly inflow. On average, over a storage capacity range from 2% to 1200%, and a release capacity range from 100% to 400%, of the mean monthly inflow, the gains from using Markov Chain and Stochastic Independent optimisation, rather than deterministic optimisation, were 18% and 13% of market value, respectively. As expected, the gains from stochastic optimisation rose rapidly for lower elasticities, and when vertical steps were added to the demand curve. But they became nearly negligible when the (absolute value of) elasticity rose to 0.75 and beyond, when inflow was in phase with demand, or when the range of either seasonal variation or intra-month variability was reduced to ±50% of the mean monthly inflow. Still, our results indicate that there is a wide range of reservoir and economic systems where accounting for uncertainty directly in the water allocation process could yield significant gains, whether in a centrally controlled or market context. Price and price risk, which affect individual participants, were significantly more sensitive. Our hope is that this work helps inform parties who are considering enhancing their water allocation practices with improved stochastic optimisation, and potentially market-based mechanisms.
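The gap between deterministic and stochastic reservoir optimisation can be illustrated with a toy finite-horizon stochastic dynamic program. This is a sketch of the general technique, not the thesis's CDDP system; the storage grid, inflow scenarios and benefit function are invented for illustration.

```python
# Toy stochastic dynamic program for reservoir release.
import math

STORAGE = range(0, 11)              # discretised storage levels
INFLOWS = [(2, 0.5), (6, 0.5)]      # (inflow, probability) scenarios
CAP = 10                            # storage capacity; excess inflow spills

def benefit(release):
    return math.sqrt(release)       # concave: diminishing marginal value

def solve(horizon):
    """Backward induction: value[s] = expected benefit-to-go from storage s."""
    value = {s: 0.0 for s in STORAGE}
    policy = {}
    for _ in range(horizon):
        new_value = {}
        for s in STORAGE:
            best, best_r = -1.0, 0
            for r in range(s + 1):  # release at most the stored volume
                expected = sum(p * value[min(s - r + q, CAP)]
                               for q, p in INFLOWS)
                total = benefit(r) + expected
                if total > best:
                    best, best_r = total, r
            new_value[s], policy[s] = best, best_r
        value = new_value
    return value, policy

value, policy = solve(horizon=12)
# policy[s] is the optimal release in the first period given storage s.
```

A deterministic variant would replace the inflow scenarios with their mean; the difference between the two value functions is the kind of "gain from stochastic optimisation" the abstract quantifies.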

    Efficient Adaptive Hard Real-time Multi-processor Systems

    Modern computing systems are based on multi-processor systems, i.e. multiple cores on the same chip. Hard real-time systems are required to perform particular tasks within a certain amount of time; failure to do so constitutes unacceptable behaviour. Hard real-time systems are found in safety-critical applications, e.g. airbag control software, flight control software, etc. In safety-critical applications, failure to meet the real-time constraints can have catastrophic effects. The safe and, at the same time, efficient deployment of applications with hard real-time constraints on multi-processors is a challenging task. Scheduling methods and Models of Computation that provide safe deployments require a realistic estimation of the Worst-Case Execution Time (WCET) of tasks. The simultaneous access of shared resources by parallel tasks causes interference delays due to hardware arbitration. Interference delays can be accounted for with the pessimistic assumption that all possible interference can happen. The resulting schedules would be exceedingly conservative, and the benefits of multi-processors would be significantly negated. Producing less pessimistic schedules is challenging due to the inter-dependency between WCET estimation and deployment optimisation. Accurate estimation of interference delays, and thus of task WCETs, depends on the way an application is deployed; deployment is an optimisation problem that depends on the estimation of task WCETs. Another efficiency gap, which is of consequence in several systems (e.g. airbag control), stems from the fact that tasks rarely execute for their WCET. Safe runtime adaptation based on the actual execution times can yield additional improvements in terms of latency (more responsive systems). To achieve efficiency and retain adaptability, we propose that interference analysis should be coupled with the deployment process.
The proposed interference analysis method estimates the possible amount of interference based on an architecture model and an application model. As more information is provided, such as scheduling or memory mapping, the per-task interference estimation becomes more accurate. Thus, the method computes interference-sensitive WCET estimations (isWCETs). Based on the isWCET method, we propose a method to break the inter-dependency between WCET estimation and deployment optimisation. Initially, the isWCETs are over-approximated by assuming worst-case interference, and a safe deployment is derived. Subsequently, the proposed method computes accurate isWCETs by spatio-temporal exclusion, i.e. by excluding interference from tasks that share resources (space) but do not overlap in time. Based on the accurate isWCETs, the deployment solution is improved to provide better latency guarantees. We also propose a distributed runtime adaptation technique that aims to improve runtime latency. Using isWCET estimations restricts the possible adaptations, as an adaptation might increase the interference and violate the safety guarantees. The proposed technique introduces static scheduling dependencies between tasks that prevent additional interference. At runtime, a self-timed scheduling policy that respects these dependencies is applied; it is proven to be safe and has minimal overhead. Experimental evaluation on the Kalray MPPA-256 shows that our methods improve isWCETs by up to 36%, guaranteed latency by up to 46%, and runtime performance by up to 42%, with a consolidated performance gain of 50%.
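The spatio-temporal exclusion idea can be sketched as follows. This is a hypothetical model, not the thesis's analysis: the task parameters, the sharing relation and the one-delay-per-contended-access interference model are all invented for illustration.

```python
# Hypothetical sketch of spatio-temporal exclusion for isWCET computation.

def is_wcet(tasks, shares, overlaps, delay_per_access=1):
    """tasks: name -> (isolation WCET, number of shared-resource accesses).
    shares: pairs of tasks that share a resource (space).
    overlaps: pairs that may overlap in time; None means assume every pair
    may overlap, i.e. fully pessimistic interference."""
    result = {}
    for t, (wcet, acc) in tasks.items():
        interference = 0
        for u, (_, u_acc) in tasks.items():
            pair = frozenset({t, u})
            if u == t or pair not in shares:
                continue                    # no spatial contention
            if overlaps is not None and pair not in overlaps:
                continue                    # temporally excluded
            # Each contended access may be delayed once by this contender.
            interference += min(acc, u_acc) * delay_per_access
        result[t] = wcet + interference
    return result

tasks = {"A": (100, 20), "B": (80, 10), "C": (60, 15)}
shares = {frozenset({"A", "B"}), frozenset({"A", "C"})}
pessimistic = is_wcet(tasks, shares, overlaps=None)
# After scheduling, suppose A and C can never run concurrently:
refined = is_wcet(tasks, shares, overlaps={frozenset({"A", "B"})})
# refined["A"] drops from 125 to 110; C's isWCET falls back to 60.
```

The circular dependency the abstract describes appears here directly: the `overlaps` relation comes from a schedule, and the schedule is built from the isWCETs, which motivates the start-pessimistic-then-refine iteration.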

    Data description and manipulation in persistent programming languages


    Enabling caches in probabilistic timing analysis

    Hardware and software complexity of future critical real-time systems challenges the scalability of traditional timing analysis methods. Measurement-Based Probabilistic Timing Analysis (MBPTA) has recently emerged as an industrially-viable alternative technique for dealing with complex hardware and software. Yet, MBPTA requires certain timing properties in the system under analysis that are not satisfied in conventional systems. In this thesis, we introduce, for the first time, hardware and software solutions to satisfy those requirements as well as to improve MBPTA applicability. We focus on one of the hardware resources with the highest impact on both average performance and Worst-Case Execution Time (WCET) in current real-time platforms: the cache. Along these lines, the contributions of this thesis follow three axes: hardware solutions to enable MBPTA, software solutions to enable MBPTA, and MBPTA analysis enhancements for systems featuring caches. At hardware level, we set the foundations of MBPTA-compliant processor designs, and define efficient time-randomised cache designs for single- and multi-level hierarchies of arbitrary complexity, including unified caches, which can be time-analysed for the first time. We propose three new software randomisation approaches (one dynamic and two static variants) to control, in an MBPTA-compliant manner, the cache jitter in Commercial off-the-shelf (COTS) processors in real-time systems. To that end, all variants randomly vary the location of programs' code and data in memory across runs, to achieve probabilistic timing properties similar to those achieved with customised hardware cache designs. We propose a novel method to estimate the WCET of a program using MBPTA, without requiring the end-user to identify worst-case paths and inputs, improving its applicability in industry. We also introduce Probabilistic Timing Composability, which allows integrated systems to reduce their WCET in the presence of time-randomised caches.
With the above contributions, this thesis pushes the limits in the use of complex real-time embedded processor designs equipped with caches and paves the way towards the industrialisation of MBPTA technology.
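The MBPTA workflow the abstract describes can be illustrated with a simplified sketch: execution-time measurements from a time-randomised platform are reduced to block maxima and fitted to an extreme-value (Gumbel) distribution, from which a pWCET estimate at a target exceedance probability is read off. The method-of-moments fit and the synthetic measurements below are simplifications for illustration, not the thesis's actual protocol.

```python
# Simplified illustration of MBPTA-style pWCET estimation.
import math
import random

def pwcet(measurements, block=50, exceedance=1e-6):
    # Reduce the measured execution times to block maxima.
    maxima = [max(measurements[i:i + block])
              for i in range(0, len(measurements) - block + 1, block)]
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((m - mean) ** 2 for m in maxima) / n
    beta = math.sqrt(6 * var) / math.pi       # Gumbel scale (moments fit)
    mu = mean - 0.5772156649 * beta           # Gumbel location
    # Gumbel quantile at probability 1 - exceedance.
    return mu - beta * math.log(-math.log(1 - exceedance))

random.seed(0)
# Synthetic execution times: a base cost plus a random (cache-like) tail.
times = [1000 + random.expovariate(1 / 50) for _ in range(5000)]
estimate = pwcet(times)   # pWCET exceeded with probability ~1e-6 per run
```

The point of the time-randomised hardware and software in the thesis is precisely to make execution times behave like i.i.d. random variables, so that an extreme-value fit of this kind is statistically justified.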