
    Multiple perspectives of resilience: A holistic approach to resilience assessment using cognitive maps in practitioner engagement

    Resilience has become a regulatory concept influencing investment decisions in the water and wastewater sector. However, current assessments focus predominantly on technical resilience and engineering solutions. Here we propose an alternative, more holistic approach that captures multiple perspectives of resilience by eliciting and comparing cognitive maps of diverse agents from both within and outside a wastewater utility. We use Fuzzy Cognitive Mapping as a practical tool to elicit subjective views on resilience mechanisms and illustrate the methodology in co-production with professionals from the wastewater sector in the Belfast area (Northern Ireland). We find that the proposed participatory process facilitates a more reflective, inclusive and integrated assessment than current approaches. Screening for risks and vulnerabilities using this new approach can foster an integrated system perspective by (i) systematically identifying connections between (sub)systems which are normally assessed separately, (ii) detecting feedbacks between system components which may reveal unintended consequences of resilience interventions, and (iii) obtaining a wider portfolio of potential interventions to increase overall resilience. We conclude that the suggested approach may be useful for strategic planning purposes within a utility and for improving cross-departmental communication among both internal and external agents. © 2020 The Author(s)
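    The core of a Fuzzy Cognitive Map is a weighted concept graph iterated to a steady state. A minimal sketch of that update rule, with a hypothetical three-concept toy map (the concept names, weights and squashing parameter are illustrative assumptions, not the study's elicited maps):

    ```python
    import numpy as np

    def fcm_step(state, W, lam=1.0):
        """One synchronous FCM update: squash weighted influences with a sigmoid.

        Concepts hold activation levels in [0, 1]; W[i, j] is the elicited
        causal influence of concept i on concept j, in [-1, 1].
        """
        return 1.0 / (1.0 + np.exp(-lam * (W.T @ state)))

    def fcm_run(state, W, steps=200, tol=1e-6):
        """Iterate until the map settles into a fixed point (or hits the limit)."""
        for _ in range(steps):
            new_state = fcm_step(state, W)
            if np.max(np.abs(new_state - state)) < tol:
                break
            state = new_state
        return state

    # Hypothetical toy map: concept 0 strengthens concept 1,
    # concept 1 dampens concept 2, concept 2 weakly reinforces concept 0.
    W = np.array([[0.0, 0.8, 0.0],
                  [0.0, 0.0, -0.6],
                  [0.3, 0.0, 0.0]])
    steady = fcm_run(np.array([0.5, 0.5, 0.5]), W)
    ```

    Comparing the steady states reached from different stakeholders' weight matrices is what allows the perspectives to be contrasted.
    
    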

    Valuing deaths or years of life lost? Economic benefits of avoided mortality from early heat warning systems

    The study aims to explore the main drivers influencing the economic appraisal of heat warning systems by integrating epidemiological modelling and benefit-cost analysis. To shed light on heat wave mortality valuation, we consider three valuation schemes: (i) a traditional one, where the value of a statistical life (VSL) is applied to both displaced and premature mortality; (ii) an intermediate one, with VSL applied to premature mortality and the value of a life year (VOLY) to displaced mortality; and (iii) a conservative one, where both premature and displaced mortality are quantified in terms of loss of life expectancy and then valued using the VOLY approach. When applying these three schemes to Madrid (Spain), we obtain a benefit-cost ratio varying from 12 to 3700. We find that the choice of the valuation scheme has the largest influence, whereas other parameters such as attributable risk, displaced mortality ratio, or the comprehensiveness and effectiveness of the heat warning system are less influential. The results raise the question of which approach is most appropriate for valuing mortality in the context of heat waves, given that the lower-bound estimate for the benefit-cost ratio (option iii, using VOLY) is up to two orders of magnitude lower than the value based on the traditional VSL approach (option i). The choice of valuation methodology has significant implications for public health authorities at the local and regional scale, and becomes highly relevant for locations where the application of the VOLY approach could lead to benefit-cost ratios significantly lower than 1. We propose that specific metrics for premature and displaced VOLYs should be developed for the context of heat waves. Until such values are available, we suggest testing the economic viability of heat warning systems under the three proposed valuation schemes (i)–(iii) and using values for VOLY commonly applied in air pollution, as the health endpoints are similar.
Lastly, periodic reassessment of heat alert plans should be performed by public health authorities to monitor their long-term viability and cost-effectiveness. © 2018, The Author(s). This study is part of the project BASE (Bottom-up Climate Adaptation Strategies for a Sustainable Europe) funded by the European Union's Seventh Framework Programme for research, technological development, and demonstration under Grant Agreement No. 308337. Marc B. Neumann also acknowledges financial support from the Ramón y Cajal Research Fellowship of the Ministry of Economy and Competitiveness of Spain (no. RYC-2013-13628)
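The three valuation schemes differ only in how avoided deaths and life-years lost are monetised before dividing by the programme cost. A minimal sketch, with all numerical inputs (deaths, life-years, VSL, VOLY, cost) as illustrative assumptions rather than the study's Madrid figures:

```python
def bcr(deaths_premature, deaths_displaced, lyl_premature, lyl_displaced,
        vsl, voly, cost, scheme):
    """Benefit-cost ratio of a heat warning system under one valuation scheme.

    scheme "i":   VSL applied to all avoided deaths (traditional).
    scheme "ii":  VSL for premature deaths, VOLY for displaced life-years.
    scheme "iii": VOLY applied to all life-years lost (conservative).
    """
    if scheme == "i":
        benefit = vsl * (deaths_premature + deaths_displaced)
    elif scheme == "ii":
        benefit = vsl * deaths_premature + voly * lyl_displaced
    elif scheme == "iii":
        benefit = voly * (lyl_premature + lyl_displaced)
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return benefit / cost

# Illustrative inputs: 100 premature deaths (~1000 life-years),
# 50 displaced deaths (~10 life-years, i.e. days-to-weeks each).
args = dict(deaths_premature=100, deaths_displaced=50,
            lyl_premature=1000.0, lyl_displaced=10.0,
            vsl=3.0e6, voly=6.0e4, cost=1.0e6)
r_i = bcr(scheme="i", **args)
r_ii = bcr(scheme="ii", **args)
r_iii = bcr(scheme="iii", **args)
```

Because displaced deaths correspond to very few life-years each, scheme (iii) yields a much smaller ratio than scheme (i), which is the spread the abstract reports.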

    Responses of sub-Saharan smallholders to climate change: Strategies and drivers of adaptation

    Rural farm households in sub-Saharan Africa are vulnerable to climate variability due to their limited adaptive capacity. This paper explores how adaptation strategies are adopted by smallholders in sub-Saharan Africa as a function of their adaptive capacity. The latter is characterised by five types of capital: natural, physical, financial, human, and social. We use responses from 1536 farm households in sub-Saharan Africa obtained by the Climate Change, Agriculture and Food Security (CCAFS) programme. These data provide information on the adoption of adaptation practices during the study period as well as information with which we develop indicators for the five types of capital. The results suggest that all five types of capital positively influence the adoption of adaptation practices. Human and social capital both displayed a positive and significant effect on the uptake of most adaptation practices. This finding suggests that the effect of less tangible kinds of capital such as knowledge, individual perceptions, farmers' networks and access to information may be stronger than normally assumed. Directing more development policies towards enhancing human and social capital may therefore be more cost-effective than further investments into physical and financial capital, and could help in overcoming social barriers to adaptation to climate change. © 2018 Elsevier Ltd. Marc B. Neumann acknowledges financial support from the Ramón y Cajal Fellowship of the Ministry of Economy and Competitiveness of Spain (no. RYC-2013-13628)
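    Adoption as a function of capital endowments is naturally expressed as a binary-choice model. A minimal logit sketch, where the coefficients are hypothetical values chosen only to mirror the qualitative finding that human and social capital weigh most, not the paper's estimates:

    ```python
    import math

    CAPITALS = ["natural", "physical", "financial", "human", "social"]

    def adoption_probability(capitals, coefs, intercept=-1.0):
        """Logit model: P(adopt) = sigmoid(intercept + sum_k beta_k * x_k)."""
        z = intercept + sum(coefs[k] * capitals[k] for k in CAPITALS)
        return 1.0 / (1.0 + math.exp(-z))

    # Hypothetical coefficients: human and social capital dominate.
    coefs = {"natural": 0.2, "physical": 0.3, "financial": 0.3,
             "human": 0.8, "social": 0.7}
    household = {k: 0.5 for k in CAPITALS}   # capital indices scaled to [0, 1]
    p = adoption_probability(household, coefs)
    ```

    In the study such coefficients would be estimated from the household survey data, with one regression per adaptation practice.
    
    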

    Accounting for erroneous model structures in biokinetic process models

    In engineering practice, model-based design requires not only a good process-based model, but also a good description of stochastic disturbances and measurement errors to learn credible parameter values from observations. However, typical methods use Gaussian error models, which often cannot describe the complex temporal patterns of residuals. This results in overconfidence in the identified parameters and, in turn, optimistic reactor designs. In this work, we assess the strengths and weaknesses of a method that statistically describes these patterns with autocorrelated error models. This method widens the credible prediction intervals once the bias term is included, in turn leading to more conservative design choices. However, we also show that the augmented error model is not a universal tool, as its application cannot guarantee the desired reliability of the resulting wastewater reactor design. © 2020 Elsevier Ltd. Marc B. Neumann acknowledges financial support provided by the Spanish Government through the BC3 María de Maeztu excellence accreditation 2018–2022 (MDM-2017-0714) and the Ramón y Cajal grant (RYC-2013-13628); and by the Basque Government through the BERC 2018-2021 program
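    The simplest autocorrelated error model of the kind described is an AR(1) process on the residuals; its stationary variance exceeds that of an i.i.d. Gaussian model with the same innovation scale, which is why the credible intervals widen. A sketch under that assumption (parameter values are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_residuals(n, sigma, phi=0.0):
        """Residual series e_t = phi * e_{t-1} + w_t, with w_t ~ N(0, sigma^2).

        phi = 0 recovers the classical i.i.d. Gaussian error model;
        0 < phi < 1 gives the autocorrelated (AR(1)) bias term.
        """
        w = rng.normal(0.0, sigma, n)
        e = np.empty(n)
        e[0] = w[0]
        for t in range(1, n):
            e[t] = phi * e[t - 1] + w[t]
        return e

    # Same innovation sigma, very different spread:
    # stationary variance of AR(1) is sigma^2 / (1 - phi^2).
    iid = simulate_residuals(20000, sigma=1.0, phi=0.0)
    ar1 = simulate_residuals(20000, sigma=1.0, phi=0.8)
    ```

    Fitting the same data while ignoring phi attributes the extra spread to the parameters instead, producing the overconfidence the abstract warns about.
    
    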

    A pre-crisis vs. crisis analysis of peripheral EU stock markets by means of wavelet transform and a nonlinear causality test

    This paper presents an analysis of EU peripheral (so-called PIIGS) stock market indices and the S&P Europe 350 index (SPEURO), as a European benchmark market, over the pre-crisis (2004–2007) and crisis (2008–2011) periods. We computed a rolling-window wavelet correlation for the market returns and applied a non-linear Granger causality test to the wavelet decomposition coefficients of these stock market returns. Our results show that the correlation is stronger during the crisis than during the pre-crisis period. The stock market indices from Portugal, Italy and Spain were more interconnected among themselves during the crisis than with the SPEURO. The Portuguese stock market is the most sensitive and vulnerable PIIGS member, whereas the Greek stock market tended to move away from the European benchmark market from the 2008 financial crisis until 2011. The non-linear causality test indicates that in the first three wavelet scales (intraweek, weekly and fortnightly) the number of uni-directional and bi-directional causalities is greater during the crisis than in the pre-crisis period, because of financial contagion. Furthermore, the causality analysis shows that the direction of the Granger cause–effect relations for the pre-crisis and crisis periods is not invariant across the considered time-scales, and that the causality directions among the studied stock markets do not seem to have a preferential direction. These results are relevant to better understand the behaviour of vulnerable stock markets, especially for investors and policymakers. © 2017 Elsevier B.V.
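    The rolling-window correlation step of the methodology is straightforward to sketch. The example below applies a sliding Pearson correlation to two synthetic return series (the paper additionally decomposes returns into wavelet scales first; that step is omitted here, and all data and window sizes are illustrative):

    ```python
    import numpy as np

    def rolling_correlation(x, y, window):
        """Pearson correlation of x and y over a sliding window of given length."""
        n = len(x)
        out = np.full(n, np.nan)        # undefined until a full window is available
        for t in range(window - 1, n):
            xs = x[t - window + 1 : t + 1]
            ys = y[t - window + 1 : t + 1]
            out[t] = np.corrcoef(xs, ys)[0, 1]
        return out

    # Synthetic daily returns: a benchmark series and a correlated
    # "peripheral market" series (stand-ins for SPEURO and a PIIGS index).
    rng = np.random.default_rng(1)
    benchmark = rng.normal(0.0, 0.01, 500)
    peripheral = 0.6 * benchmark + rng.normal(0.0, 0.008, 500)
    corr = rolling_correlation(benchmark, peripheral, window=60)
    ```

    Plotting such a series for pre-crisis versus crisis windows is what reveals the strengthening of co-movement the abstract reports.
    
    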

    Enhanced Quantum Estimation via Purification

    We analyze the estimation of a finite ensemble of quantum bits which have been sent through a depolarizing channel. Instead of using the depolarized qubits directly, we first apply a purification step and show that this improves the fidelity of subsequent quantum estimation. Even though we lose some qubits of our finite ensemble, the information is concentrated in the remaining purified ones. Comment: 6 pages, including 3 figures
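    The depolarizing channel itself has a standard closed form, ρ → (1−p)ρ + p·I/2, under which a pure state's fidelity drops to 1 − p/2. A minimal numerical illustration of that channel (the purification protocol of the paper is not reproduced here):

    ```python
    import numpy as np

    def depolarize(rho, p):
        """Single-qubit depolarizing channel: rho -> (1 - p) * rho + p * I/2."""
        return (1 - p) * rho + p * np.eye(2) / 2

    def fidelity_pure(psi, rho):
        """Fidelity <psi|rho|psi> of state rho with a pure target |psi>."""
        return np.real(psi.conj() @ rho @ psi)

    psi = np.array([1.0, 0.0])                 # target pure state |0>
    rho = np.outer(psi, psi.conj())            # its density matrix
    f = fidelity_pure(psi, depolarize(rho, 0.3))   # 1 - p/2 = 0.85
    ```

    The paper's point is that trading some of the ensemble for purification raises this fidelity for the qubits that remain.
    
    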

    Long-Time Behavior of Macroscopic Quantum Systems: Commentary Accompanying the English Translation of John von Neumann's 1929 Article on the Quantum Ergodic Theorem

    The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann's 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the "quantum H-theorem", is actually a much weaker statement than Boltzmann's classical H-theorem, the other theorem, which he calls the "quantum ergodic theorem", is a beautiful and very non-trivial result. It expresses a fact we call "normal typicality" and can be summarized as follows: For a "typical" finite family of commuting macroscopic observables, every initial wave function ψ_0 from a micro-canonical energy shell so evolves that for most times t in the long run, the joint probability distribution of these observables obtained from ψ_t is close to their micro-canonical distribution. Comment: 34 pages LaTeX, no figures; v2: minor improvements and additions. The English translation of von Neumann's article is available as arXiv:1003.213

    Output spectrum of a detector measuring quantum oscillations

    We consider a two-level quantum system (qubit) which is continuously measured by a detector and calculate the spectral density of the detector output. In the weakly coupled case the spectrum exhibits a moderate peak at the frequency of quantum oscillations and a Lorentzian-shaped increase of the detector noise at low frequency. With increasing coupling the spectrum transforms into a single Lorentzian corresponding to random jumps between two states. We prove that the Bayesian formalism for the selective evolution of the density matrix gives the same spectrum as the conventional master equation approach, despite the significant difference in interpretation. The effects of detector nonideality and a finite-temperature environment are also discussed. Comment: 8 pages, 6 figures
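    The weak-coupling spectrum described, a flat noise floor plus a Lorentzian peak at the qubit oscillation frequency and a Lorentzian rise at zero frequency, can be sketched numerically. This is only an illustration of that qualitative shape under assumed parameters, not the paper's derived formula:

    ```python
    import numpy as np

    def lorentzian(omega, center, width):
        """Unnormalized Lorentzian line shape of half-width `width`/2."""
        return (width / 2) / ((omega - center) ** 2 + (width / 2) ** 2)

    def output_spectrum(omega, s0=1.0, omega_q=1.0, gamma=0.05,
                        peak_weight=0.5, low_freq_weight=0.1):
        """Illustrative weak-coupling spectrum: noise floor + oscillation peak
        at omega_q + low-frequency Lorentzian rise (all weights assumed)."""
        return (s0
                + peak_weight * lorentzian(omega, omega_q, gamma)
                + low_freq_weight * lorentzian(omega, 0.0, gamma))

    omega = np.linspace(0.0, 2.0, 2001)
    s = output_spectrum(omega)   # dominant peak sits at omega_q
    ```

    As coupling grows, the oscillation peak and the zero-frequency feature merge into the single Lorentzian of the jump regime mentioned in the abstract.
    
    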

    Selective quantum evolution of a qubit state due to continuous measurement

    We consider a two-level quantum system (qubit) which is continuously measured by a detector. The information provided by the detector is taken into account to describe the evolution during a particular realization of the measurement process. We discuss the Bayesian formalism for such "selective" evolution of an individual qubit and apply it to several solid-state setups. In particular, we show how to suppress the qubit decoherence using continuous measurement and a feedback loop. Comment: 15 pages (including 9 figures)

    Classical approach in quantum physics

    The application of a classical approach to various quantum problems is reviewed: the secular perturbation approach to the quantization of a hydrogen atom in external fields and of a helium atom, the adiabatic switching method for calculating the semiclassical spectrum of a hydrogen atom in crossed electric and magnetic fields, spontaneous decay of excited states of a hydrogen atom, Gutzwiller's approach to the Stark problem, long-lived excited states of a helium atom recently discovered with the help of Poincaré sections, and inelastic transitions in slow and fast electron-atom and ion-atom collisions. Further, a classical representation in quantum theory is discussed. In this representation the quantum states are treated as an ensemble of classical states. This approach opens the way to an accurate description of the initial and final states in the classical trajectory Monte Carlo (CTMC) method and a purely classical explanation of the tunneling phenomenon. The general aspects of the structure of the semiclassical series, such as renormalization-group symmetry and the criterion of accuracy, are reviewed as well. In conclusion, the relation between quantum theory, classical physics and measurement is discussed. Comment: This review paper was rejected from J.Phys.A with referee's comment "The author has made many worthwhile contributions to semiclassical physics, but this article does not meet the standard for a topical review"