2,093 research outputs found

    Evolution: Complexity, uncertainty and innovation

    Complexity science provides a general mathematical basis for evolutionary thinking. It makes us face the inherent, irreducible nature of uncertainty and the limits to knowledge and prediction. Complex, evolutionary systems work through ongoing, continuous internal processes of exploration, experimentation and innovation at their underlying levels. These are acted upon by the level above, leading to a selection process on the lower levels and a probing of the stability of the level above, which could be either an organizational level or the potential marketplace. Models aimed at predicting system behaviour therefore rest on assumed constraints at the micro-level, which, because of inertia or conformity, may hold approximately for some unspecified time. However, systems without strong mechanisms of repression and conformity will evolve, innovate and change, creating new emergent structures, capabilities and characteristics. Systems with no individual freedom at their lower levels will have predictable behaviour in the short term, but will not survive in the long term. Creative, innovative, evolving systems, on the other hand, are more likely to survive over longer times, but will not have predictable characteristics or behaviour. These minimal mechanisms are all that is required to explain (though not predict) the co-evolutionary processes occurring in markets, organizations and, indeed, emergent, evolutionary communities of practice. Some examples are presented briefly.
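    The two minimal mechanisms the abstract names, open-ended exploration at the lower level and selection imposed by the level above, can be made concrete with a toy simulation. The sketch below is illustrative only: the scalar trait, the Gaussian mutation step and the fitness function are assumptions for demonstration, not part of the original work.

        import random

        def fitness(x):
            # Hypothetical stand-in for the constraint imposed by the level above.
            return -(x - 1.0) ** 2

        def evolve(population, generations=100, mutation_scale=0.1):
            """Minimal exploration-plus-selection loop: lower-level agents
            mutate at random (exploration); the level above keeps the
            better half (selection) and refills the population."""
            for _ in range(generations):
                explored = [x + random.gauss(0, mutation_scale) for x in population]
                ranked = sorted(explored, key=fitness, reverse=True)
                survivors = ranked[: len(ranked) // 2]
                population = survivors + [random.choice(survivors) for _ in survivors]
            return population

        print(max(evolve([random.random() for _ in range(20)]), key=fitness))

    The population reliably approaches the optimum of the imposed fitness, while the particular exploration path varies from run to run.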

    Extended object reconstruction in adaptive-optics imaging: the multiresolution approach

    We propose the application of multiresolution transforms, such as wavelets (WT) and curvelets (CT), to the reconstruction of images of extended objects that have been acquired with adaptive optics (AO) systems. Such multichannel approaches normally make use of probabilistic tools in order to distinguish significant structures from noise and reconstruction residuals. Furthermore, we aim to check the historical assumption that image-reconstruction algorithms using static PSFs are not suitable for AO imaging. We convolve an image of Saturn taken with the Hubble Space Telescope (HST) with AO PSFs from the 5-m Hale telescope at the Palomar Observatory and add both shot and readout noise. Subsequently, we apply different approaches to the blurred and noisy data in order to recover the original object. The approaches include multi-frame blind deconvolution (with the algorithm IDAC), myopic deconvolution with regularization (with MISTRAL) and wavelet- or curvelet-based static-PSF deconvolution (the AWMLE and ACMLE algorithms). We use the mean squared error (MSE) and the structural similarity index (SSIM) to compare the results, and we discuss the strengths and weaknesses of the two metrics. We find that CT produces better results than WT, as measured in terms of MSE and SSIM. Multichannel deconvolution with a static PSF produces results which are generally better than those obtained with the myopic/blind approaches (for the images we tested), thus showing that the ability of a method to suppress the noise and to track the underlying iterative process is just as critical as the capability of the myopic/blind approaches to update the PSF.
    Comment: in revision in Astronomy & Astrophysics. 19 pages, 13 figures.
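    The simulated-degradation half of the pipeline is straightforward to sketch. The fragment below uses toy stand-ins for the HST Saturn image and the Hale-telescope AO PSF and reproduces only the blur-plus-noise step and the MSE metric; the deconvolution algorithms themselves (IDAC, MISTRAL, AWMLE, ACMLE) are not reproduced here, and all sizes and noise levels are illustrative assumptions.

        import numpy as np
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(0)

        def degrade(obj, psf, readout_sigma=5.0):
            """Blur a reference object with a (static) PSF, then add shot
            and readout noise, mimicking a simulated AO observation."""
            blurred = fftconvolve(obj, psf, mode="same")
            shot = rng.poisson(np.clip(blurred, 0, None)).astype(float)  # photon noise
            return shot + rng.normal(0.0, readout_sigma, obj.shape)      # readout noise

        def mse(a, b):
            return np.mean((a - b) ** 2)

        # Toy stand-ins for the Saturn image and an AO PSF.
        obj = np.zeros((64, 64))
        obj[24:40, 24:40] = 1000.0
        g = np.exp(-np.linspace(-3, 3, 64) ** 2)
        psf = np.outer(g, g)
        psf /= psf.sum()

        noisy = degrade(obj, psf)
        print("MSE of the degraded frame against the object:", mse(noisy, obj))
        # SSIM (the paper's second metric) is available in scikit-image as
        # skimage.metrics.structural_similarity(noisy, obj, data_range=obj.max()).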

    Seismic Risk Assessment Tools Workshop

    Held in the European Crisis Management Laboratory on 11-12 May 2017, the Workshop brought together, on one side, the developers of some of the most widely used modern seismic risk assessment tools and, on the other, a number of Civil Protection authorities from countries of the European Civil Protection Mechanism. The objective was to demonstrate the use and capabilities of the tools, explore their possible use in near-real-time impact assessment and promote their use in risk planning and disaster response. The systems presented in the workshop demonstrated very high sophistication and increased flexibility in accepting data from a large number of sources and formats. Systems that were initially developed on a national scale can now work on a global level with little effort, and the use of global-scale exposure data is almost seamless. An urgent need for more accurate, openly available exposure data was identified, as well as the need for proper use of fragility curves. Inter-system collaboration, and in some cases interoperability to increase ease of use, was greatly appreciated and encouraged. All systems participated in a real-time simulation exercise on previously unknown seismic data provided by the JRC; some additional automation might be in order, but in general all systems demonstrated a capacity to produce results on a near-real-time basis. The demonstrations were unanimously welcomed as very useful by the participating Civil Protection Authorities, most of which are either using a locally developed system or moving towards using one of those presented in the workshop.
    JRC.E.1 - Disaster Risk Management

    Pseudorandom Strings from Pseudorandom Quantum States

    A fundamental result in classical cryptography is that pseudorandom generators are equivalent to one-way functions and are, in fact, implied by nearly every classical cryptographic primitive requiring computational assumptions. In this work, we consider a variant of pseudorandom generators called quantum pseudorandom generators (QPRGs), which are quantum algorithms that (pseudo)deterministically map short random seeds to long pseudorandom strings. We provide evidence that QPRGs can be as useful as PRGs by presenting cryptographic applications of QPRGs such as commitments and encryption schemes. Our main result shows that QPRGs can be constructed assuming the existence of logarithmic-length quantum pseudorandom states. This raises the possibility of basing QPRGs on assumptions weaker than one-way functions. We also consider quantum pseudorandom functions (QPRFs) and show that QPRFs can be based on the existence of logarithmic-length pseudorandom function-like states. Our primary technical contribution is a method for pseudodeterministically extracting uniformly random strings from Haar-random states.
    Comment: 45 pages, 1 figure.
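    For contrast with the quantum object, the classical primitive being generalized is easy to state in code. The toy below stretches a short seed into a long string by hashing the seed with a counter; SHA-256 in counter mode is purely illustrative and carries no security claim. A QPRG replaces this map with a quantum algorithm whose output on a fixed seed is only pseudodeterministic, i.e. identical across runs except with small probability.

        import hashlib

        def prg(seed: bytes, out_len: int) -> bytes:
            """Toy classical PRG interface: short seed -> long pseudorandom string."""
            out = b""
            counter = 0
            while len(out) < out_len:
                out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
                counter += 1
            return out[:out_len]

        print(prg(b"short seed", 64).hex())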

    Particle filtering in high-dimensional chaotic systems

    We present an efficient particle filtering algorithm for multiscale systems, adapted to simple atmospheric dynamics models that are inherently chaotic. Particle filters represent the posterior conditional distribution of the state variables by a collection of particles, which evolves and adapts recursively as new information becomes available. The difference between the estimated state and the true state of the system constitutes the error in specifying or forecasting the state, and this error is amplified in chaotic systems that have a number of positive Lyapunov exponents. The purpose of the present paper is to show that the homogenization method developed in Imkeller et al. (2011), which is applicable to high-dimensional multiscale filtering problems, along with importance sampling and control methods, can be used as a basic and flexible tool for the construction of the proposal density inherent in particle filtering. Finally, we apply the general homogenized particle filtering algorithm developed here to the Lorenz'96 atmospheric model, which mimics mid-latitude atmospheric dynamics with microscopic convective processes.
    Comment: 28 pages, 12 figures.
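    As a point of reference for the filtering setup, the sketch below implements a plain bootstrap particle filter on the Lorenz'96 model: propagate each particle, weight it by the observation likelihood, resample. It deliberately omits the paper's contribution, the homogenization-based proposal density; the integrator, noise levels and full-state observations are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def lorenz96(x, forcing=8.0):
            """Lorenz'96 tendency: dx_k/dt = (x_{k+1} - x_{k-2}) * x_{k-1} - x_k + F."""
            return (np.roll(x, -1, -1) - np.roll(x, 2, -1)) * np.roll(x, 1, -1) - x + forcing

        def step(x, dt=0.01):
            return x + dt * lorenz96(x)  # forward Euler, for brevity

        def bootstrap_filter(observations, n_particles=500, dim=40, obs_sigma=1.0):
            particles = rng.normal(0.0, 1.0, (n_particles, dim))
            for y in observations:
                # Propagate with a little model noise.
                particles = step(particles) + rng.normal(0.0, 0.1, particles.shape)
                # Weight by the Gaussian likelihood of the observation.
                logw = -0.5 * np.sum((y - particles) ** 2, axis=1) / obs_sigma**2
                w = np.exp(logw - logw.max())
                w /= w.sum()
                # Resample to concentrate particles in high-likelihood regions.
                particles = particles[rng.choice(n_particles, n_particles, p=w)]
            return particles.mean(axis=0)  # posterior-mean state estimate

        # Synthetic truth and noisy full-state observations.
        truth = rng.normal(0.0, 1.0, 40)
        observations = []
        for _ in range(50):
            truth = step(truth)
            observations.append(truth + rng.normal(0.0, 1.0, 40))
        print(bootstrap_filter(observations)[:5])

    In high dimensions this plain filter degenerates quickly, with a handful of particles carrying nearly all the weight, which is precisely why a better proposal density, such as one built from the homogenized dynamics, is needed.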

    A multi-criteria methodology for the integration of Risk Assessment into Spatial Planning as a basis for territorial resilience. The case of Seismic Risk

    Rapid urban development and continuous demands for space have increased the pressure on the territory. The need for this “usable” space, whatever the purpose, leads existing areas to exceed their capacity and new areas to be created, both of which significantly increase the level of exposure to natural disasters. Statistics show that over almost two decades, from 1994 to 2013, 218 million people were affected by natural disasters annually (CRED, 2015). Where the demand for growth is accompanied by a growing potential for economic, social, environmental or cultural damage, disaster risk management (DRM) has become an important research focus. The way communities and urban systems react to a natural disaster is tightly related to economic and technological development, as well as to data availability. Developed countries have the capacity to consider mitigation strategies before an event occurs, which is not always feasible for developing and poor countries. Also, as emphasized by Gaillard & Mercer (2012), disasters disproportionately affect those who are marginalized and have partial or no access to resources and means of protection. This paradigm imposes the need to develop preventive strategies focused on the community, which is directly affected by the aftermath of these natural events. The analysis of natural disasters and their impact on society and the built environment is complex and requires the integration of multi-disciplinary information, from the social to the exact sciences. The main issue hindering the entire process is the effectiveness of transmitting such information between stakeholders: experts, responsible local and national authorities, and the community itself. This process is even more difficult where information and appropriate tools are lacking and where the community lacks risk perception, especially for disasters with a relatively long return period, such as earthquakes. The purpose of this research is to analyse how disaster risk information can be integrated into planning instruments, aiming towards an inclusive disaster risk reduction (DRR) process, through the proposal of a local-scale risk assessment methodology for seismic events. The analysis is carried out through a hierarchic system of parameters characterizing, first, the hazard itself and, second, the built environment in terms of exposure and vulnerability, combining multi-scale information (building and local scale). The selection of relevant parameters, their values, their relationships to one another and their contributions will be based on thorough literature research, site visits, questionnaires and expert opinions. The results will be presented as visual spatial information using mapping processes. The main objective is that the proposed methodology will serve as a preliminary tool for several decision-making processes in terms of strategic risk reduction measures, policies, prioritization, fund allocation, etc. The methodology is also intended to serve as a node connecting the community, experts and responsible authorities towards an inclusive disaster risk reduction approach.
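    The hierarchic, multi-criteria aggregation the abstract describes can be illustrated with a minimal weighted-sum sketch. The criteria, parameter names, weights and scoring scale below are hypothetical placeholders for the values the thesis derives from literature research, site visits, questionnaires and expert opinion.

        # Two-level hierarchy: criteria -> weighted parameters, all scores in [0, 1].
        HIERARCHY = {
            "hazard":        {"weight": 0.4, "params": {"pga": 0.7, "soil_class": 0.3}},
            "exposure":      {"weight": 0.3, "params": {"population_density": 0.6, "building_density": 0.4}},
            "vulnerability": {"weight": 0.3, "params": {"building_age": 0.5, "structural_type": 0.5}},
        }

        def risk_index(scores: dict) -> float:
            """Weighted sum over the hierarchy, yielding a composite index in [0, 1]."""
            total = 0.0
            for criterion, spec in HIERARCHY.items():
                sub = sum(w * scores[criterion][p] for p, w in spec["params"].items())
                total += spec["weight"] * sub
            return total

        example_area = {
            "hazard":        {"pga": 0.8, "soil_class": 0.6},
            "exposure":      {"population_density": 0.7, "building_density": 0.5},
            "vulnerability": {"building_age": 0.9, "structural_type": 0.4},
        }
        print(f"Composite seismic risk index: {risk_index(example_area):.2f}")

    An index computed per area in this way is what the mapping step would then render as visual spatial information for decision-makers.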

    Ideal quantum protocols in the non-ideal physical world

    The development of quantum protocols from conception to experimental realization is one of the main sources of the stimulating exchange between fundamental and experimental research characteristic of quantum information processing. In this thesis we contribute to the development of two recent quantum protocols, Universal Blind Quantum Computation (UBQC) and Quantum Digital Signatures (QDS). UBQC allows a client to delegate a quantum computation to a more powerful quantum server while keeping the input and the computation private. We analyse the resilience of the privacy of UBQC under imperfections. We then introduce approximate blindness, quantifying any compromise to privacy, and propose a protocol which enables arbitrary levels of security despite imperfections. Subsequently, we investigate the adaptability of UBQC to alternative implementations with practical advantages. QDS allow a party to send a message to other parties in such a way that the message cannot be forged, modified or repudiated. We analyse the security properties of a first proof-of-principle QDS experiment, implemented in an optical system. We estimate the security failure probabilities of our system as a function of the protocol parameters, under all but the most general types of attacks. Additionally, we develop new techniques for analysing transformations between symmetric sets of states, utilized not only in the security proofs of QDS but in other applications as well.