
    Statistical analysis of natural disasters and related losses

    The study of disaster statistics and disaster occurrence is a complicated interdisciplinary field that draws on new theoretical findings from several sciences, including mathematics, physics, and computer science. Statistical studies of how natural disasters occur rely largely on fundamental results in the statistics of rare events derived in the 20th century. What is new is not the recognition of the problem's importance for mankind during the last third of the 20th century (the myths of ancient civilizations show that the problem of disasters has always been recognized) but the fact that mankind now possesses the theoretical and practical tools to study natural disasters effectively, which in turn supports effective, major practical measures to minimize their impact. These factors have produced considerable progress in natural disaster research. The substantial material accrued on natural disasters and the use of advanced recording techniques have opened new doors for empirical analysis. Despite this progress, however, the situation remains far from ideal: sufficiently complete catalogs of events are still unavailable for many types of disasters, and the methodological and even terminological bases of the field need further development and standardization. The present monograph summarizes recent advances in disaster statistics, focusing primarily on disasters whose occurrence is described by heavy-tailed distributions. Such disasters occur over a very broad range of scales, and the rare largest events can cause losses comparable to the total losses from all smaller disasters of the same type. Audience: this SpringerBrief will be a valuable resource for those working in natural disaster research, risk assessment, and loss mitigation at regional and federal governing bodies and in the insurance business, as well as for a broad range of readers interested in natural disasters and their effects on human life.
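
    The claim that the rare largest events can dominate total losses is a direct consequence of heavy-tailed loss distributions. The following minimal sketch (not taken from the monograph; the Pareto model, tail index, and sample sizes are illustrative assumptions) shows the effect numerically:

        import numpy as np

        rng = np.random.default_rng(0)
        alpha = 1.1                                 # tail index; the tail grows heavier as alpha -> 1
        losses = rng.pareto(alpha, 100_000) + 1.0   # classical Pareto(alpha) losses, minimum loss 1

        losses.sort()
        top_share = losses[-100:].sum() / losses.sum()   # share of loss held by the 100 largest events
        print(f"largest 0.1% of events carry {top_share:.0%} of total losses")

    With a tail index this close to 1, a run of the sketch typically attributes a sizable fraction of the total loss to the top 0.1% of events, which is the qualitative point the abstract makes.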

    On the Distribution of the Maximum of a Random Sample
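
    No abstract is available for this entry. For orientation only (standard background, not the paper's own text): for an i.i.d. sample X_1, ..., X_n with distribution function F, the sample maximum M_n = max(X_1, ..., X_n) satisfies P(M_n <= x) = F(x)^n, the classical starting point for the statistics of rare events referenced in the monograph above.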


    A Neural Based Comparative Analysis for Feature Extraction from ECG Signals

    Automated ECG analysis and classification are nowadays fundamental tools for properly monitoring a patient's heart activity. The features most commonly used in the literature are the raw data of a time window, temporal attributes, and frequency information obtained by eigenvector techniques. This paper compares these approaches from a topological point of view, using linear and nonlinear projections together with a neural network to assess the resulting classification quality. The nonlinearity of the feature-data manifold carries most of the QRS-complex information: it yields high classification rates with the smallest number of features. This is most evident when temporal features are used, where nonlinear dimensionality reduction allows very large data compression at the expense of a slight loss of accuracy, an advantage in applications where computing time is a critical factor. If the classification is instead performed offline, the raw-data technique performs best.
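
    A minimal, hypothetical sketch of the kind of comparison described above (not the authors' code): linear (PCA) versus nonlinear (kernel PCA) dimensionality reduction feeding a small neural classifier, with synthetic arrays standing in for labelled ECG beat windows:

        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(600, 180))    # stand-in for 600 beats x 180-sample raw windows
        y = rng.integers(0, 2, size=600)   # stand-in binary beat labels

        for name, reducer in [("PCA", PCA(n_components=10)),
                              ("kernel PCA", KernelPCA(n_components=10, kernel="rbf"))]:
            clf = make_pipeline(StandardScaler(), reducer,
                                MLPClassifier(hidden_layer_sizes=(20,), max_iter=500))
            score = cross_val_score(clf, X, y, cv=3).mean()
            print(f"{name}: mean cross-validated accuracy = {score:.2f}")

    On real beat windows, the number of retained components and the kernel choice are the knobs that trade classification accuracy against the data compression discussed in the abstract; on the random stand-in data above, both pipelines hover near chance level, as expected.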