Quantification of the environmental structural risk with spoiling ties: Is randomization worth it?
Many recent works show that copulas are useful in a variety of applications, especially in the environmental sciences. Here the variables of interest are usually continuous (times, lengths, weights, and so on). Unfortunately, the corresponding observations may suffer from (instrumental) rounding and adjustments, and may therefore show several repeated values (i.e., ties). In turn, on the one hand, a tricky issue of model identifiability arises and, on the other hand, the assessment of the risk may be adversely affected. A possible remedy is to introduce suitable randomization procedures: here, three different jittering strategies are outlined. The target of the work is to carry out a simulation study in order to evaluate the effects of randomizing multivariate observations when ties are present. In particular, it is investigated whether, how, and to what extent the randomization may change the estimation of the structural risk; for this purpose, a coastal engineering example is used as an archetype of a broad class of models and problems in engineering practice. Practical advice and warnings about the use of randomization techniques are then given.
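The kind of jittering the abstract refers to can be sketched as follows. This is a minimal illustration of one possible strategy (uniform noise scaled by the smallest gap between distinct values), not the paper's specific procedures; the function name and default scale are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def jitter(x, scale=None):
    """Break ties by adding small uniform noise (one common jittering
    strategy; the paper compares several, this is only an illustrative sketch).

    `scale` defaults to half the smallest gap between sorted distinct
    values, so the noise cannot reorder distinct observations.
    """
    x = np.asarray(x, dtype=float)
    if scale is None:
        gaps = np.diff(np.unique(x))
        scale = 0.5 * gaps.min() if gaps.size else 1.0
    return x + rng.uniform(-scale, scale, size=x.shape)

# Rounded (tied) observations: ranks are ambiguous before jittering,
# which is exactly the identifiability problem described above.
x = np.array([1.2, 1.2, 1.2, 3.4, 3.4, 5.0])
x_j = jitter(x)
print(np.unique(x).size, np.unique(x_j).size)  # ties removed after jittering
```

After jittering, all ranks are distinct, so rank-based copula estimators can be applied without arbitrary tie-breaking.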
Goodness-of-fit testing based on a weighted bootstrap: A fast large-sample alternative to the parametric bootstrap
The process comparing the empirical cumulative distribution function of the
sample with a parametric estimate of the cumulative distribution function is
known as the empirical process with estimated parameters and has been
extensively employed in the literature for goodness-of-fit testing. The
simplest way to carry out such goodness-of-fit tests, especially in a
multivariate setting, is to use a parametric bootstrap. Although very easy to
implement, the parametric bootstrap can become very computationally expensive
as the sample size, the number of parameters, or the dimension of the data
increase. An alternative resampling technique based on a fast weighted
bootstrap is proposed in this paper, and is studied both theoretically and
empirically. The outcome of this work is a generic and computationally
efficient multiplier goodness-of-fit procedure that can be used as a
large-sample alternative to the parametric bootstrap. In order to approximately
determine how large the sample size needs to be for the parametric and weighted
bootstraps to have roughly equivalent powers, extensive Monte Carlo experiments
are carried out in dimension one, two and three, and for models containing up
to nine parameters. The computational gains resulting from the use of the
proposed multiplier goodness-of-fit procedure are illustrated on trivariate
financial data. A by-product of this work is a fast large-sample
goodness-of-fit procedure for the bivariate and trivariate t distribution whose
degrees of freedom are fixed.
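The baseline that this paper speeds up, the parametric bootstrap, can be sketched as follows. This is not the paper's multiplier procedure; it is a minimal univariate illustration (a Cramér-von Mises test of normality) of why the parametric bootstrap is expensive: the model must be refitted on every resample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def cvm_stat(x, cdf):
    """Cramer-von Mises statistic comparing the empirical CDF with a
    fitted parametric CDF (the 'estimated parameters' setting above)."""
    x = np.sort(x)
    n = x.size
    u = cdf(x)
    i = np.arange(1, n + 1)
    return np.sum((u - (2 * i - 1) / (2 * n)) ** 2) + 1.0 / (12 * n)

def parametric_bootstrap_pvalue(x, n_boot=500):
    """Goodness-of-fit test of normality via the parametric bootstrap:
    simulate from the fitted model and refit on each resample, which is
    what makes the procedure costly for large samples or many parameters."""
    mu, sigma = x.mean(), x.std(ddof=0)
    t_obs = cvm_stat(x, stats.norm(mu, sigma).cdf)
    count = 0
    for _ in range(n_boot):
        xb = rng.normal(mu, sigma, size=x.size)   # simulate under the fit
        mub, sigb = xb.mean(), xb.std(ddof=0)     # refit on the resample
        if cvm_stat(xb, stats.norm(mub, sigb).cdf) >= t_obs:
            count += 1
    return (count + 1) / (n_boot + 1)

x = rng.normal(0.0, 1.0, size=200)
print(parametric_bootstrap_pvalue(x))  # p-value for a genuinely normal sample
```

The multiplier approach replaces the refitting loop with perturbations of a single asymptotic expansion, which is what yields the computational gains reported above.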
An overview of the goodness-of-fit test problem for copulas
We review the main "omnibus procedures" for goodness-of-fit testing for copulas: tests based on the empirical copula process, on probability integral transformations, on Kendall's dependence function, etc., together with some corresponding dimension-reduction techniques. The problems of finding asymptotically distribution-free test statistics and of computing reliable p-values are discussed. Some particular cases, such as convenient tests for time-dependent copulas and for Archimedean or extreme-value copulas, are dealt with. Finally, the practical performance of the proposed approaches is briefly summarized.
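The basic objects underlying the empirical-copula-process tests surveyed here are the pseudo-observations and the empirical copula built from them. A minimal sketch (function names are illustrative; ties are assumed absent, per the continuous setting):

```python
import numpy as np

def pseudo_observations(x):
    """Rank-transform each margin to (0, 1): U_ij = R_ij / (n + 1).
    These pseudo-observations are the standard input to empirical-copula
    goodness-of-fit statistics."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    # Double argsort yields ranks; ties (if any) are broken arbitrarily.
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    return ranks / (n + 1)

def empirical_copula(u, pts):
    """Evaluate the empirical copula at the rows of `pts`:
    C_n(t) = (1/n) * #{i : U_i <= t componentwise}."""
    return np.array([(u <= t).all(axis=1).mean() for t in pts])

rng = np.random.default_rng(3)
x = rng.normal(size=(100, 2))          # independent margins for illustration
u = pseudo_observations(x)
print(empirical_copula(u, np.array([[0.5, 0.5], [1.0, 1.0]])))
```

A test statistic such as Cramér-von Mises then measures the distance between this empirical copula and a parametric copula fitted to the same pseudo-observations.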
A goodness-of-fit test for multivariate multiparameter copulas based on multiplier central limit theorems
ACLInternational audienceno abstrac
Business-Oriented Leadership Competencies of K-12 Educational Leaders
Contemporary K-12 educational leaders must fulfill many roles and responsibilities similar to those fulfilled by traditional business leaders. There is, however, a lack of information about the business-oriented competencies of K-12 educational leaders in comparison with business executive norms. This lack of information places K-12 institutions at risk of selecting leaders who are not capable of accomplishing institutional goals and objectives, improving the efficiency and sustainability of business operations, meeting stakeholder expectations, managing social responsibilities, and improving the educational foundation of the next-generation workforce. Grounded in leadership theory, this nonexperimental study included the California Psychological Inventory 260 assessment to capture leadership scale values of 20 K-12 educational leaders in the United States. A 2-tailed, 1-sample t test was used to examine the difference between the leadership scale mean of the sample (n = 20) and the leadership scale mean test value of 62 as measured by the Center for Creative Leadership within a group of business executives (n = 5,610). Using a 95% confidence level, the calculated leadership scale mean value for the sample was 61.96 (p = .982). Although no significant difference existed between the leadership scale means, the identification of gaps in business-oriented leadership competencies indicates that some K-12 leaders may require additional professional development. The findings from this study may influence positive social change by providing human resource and hiring managers with knowledge about using leadership scale measurements to improve the selection and professional development of K-12 educational leaders.
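The reported comparison (sample mean 61.96 against a test value of 62, n = 20, two-tailed) is a one-sample t test. A minimal sketch with hypothetical scores (the study's raw data are not reproduced here; the values below are invented for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical leadership-scale scores for n = 20 leaders (illustrative
# values only, chosen so the sample mean is close to the norm of 62).
scores = np.array([58, 63, 60, 65, 61, 59, 64, 62, 60, 66,
                   57, 63, 61, 64, 59, 62, 65, 60, 63, 67], dtype=float)

# Two-tailed one-sample t test against the business-executive test value.
t_stat, p_value = stats.ttest_1samp(scores, popmean=62)
print(round(scores.mean(), 2), round(p_value, 3))
```

A large p-value, as in the study (p = .982), means the sample mean is statistically indistinguishable from the executive norm at the 95% confidence level.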
Probabilistic Models for Droughts: Applications in Trigger Identification, Predictor Selection and Index Development
The current practice of drought declaration (e.g., the US Drought Monitor) provides a hard classification of droughts using various hydrologic variables. However, this method does not yield model uncertainty and is very limited for forecasting upcoming droughts. The primary goal of this thesis is to develop and implement methods that incorporate uncertainty estimation into drought characterization, thereby enabling more informed decision making by water users and managers. Probabilistic models using hydrologic variables are developed, yielding new insights into drought characterization and enabling fundamental applications in trigger identification, predictor selection, and index development.
An integrated assessment framework for quantifying and forecasting water-related multi-hazard risk
Disaster risks induced by different kinds of hazard may emerge in any place where human
activities or properties exist. Most human settlements are exposed to more than one hazard.
The multi-hazard risk analysis that assesses the potential loss caused by multiple natural
hazards can provide a valuable reference for regional land-use planning, disaster prevention
and emergency management. Although an increasing number of risk assessment methods
related to multi-hazard have been developed recently, three main challenges remain in the
current practices: (1) the disparate characteristics of hazards increase the difficulty of their
combination and comparison, (2) the dependence and interactions between different hazards
are often neglected, and (3) the results of multi-hazard risk assessment are rarely
quantitative enough to express the probability of disaster loss.
This thesis aims to construct an integrated framework to quantify and forecast the risk of
multiple water-related hazards including heavy rainfall, extreme river flow, and storm surge.
The framework consists of the three typical components of disaster risk assessment, namely
hazard, vulnerability, and risk analysis, and is applied to Greater London and the Eden
Catchment, UK. For hazard analysis, the joint probability and return period distributions are
fitted for the three water-related hazards on the basis of dependence analysis and copula
theory. A newly developed 2D hydrodynamic model is enhanced with automatic input-output
control and processing on a multi-GPU platform to drive numerous flood simulations. The
frequency-inundation curves due to the combination of the three hazards are generated by
connecting the joint return period functions and the results of flood simulations. The
distribution of human life and properties in the research area is analysed and classified with
different vulnerability curves that quantify the potential damage due to the severity of
inundation. The component of risk analysis evaluates the probability of loss for human life or
different types of properties according to the results from the hazard and vulnerability
analysis.
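The copula-based joint return periods mentioned in the hazard analysis can be sketched as follows. This is an illustrative two-hazard example using a Gumbel copula; the thesis's actual copula families, marginal fits, and three-hazard extension are not reproduced, and the function names are assumptions.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula, a common extreme-value choice for
    positively dependent hydrological hazards (theta >= 1)."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """'OR' and 'AND' joint return periods for two hazards whose annual
    maxima have marginal non-exceedance probabilities u and v; mu is the
    mean inter-arrival time (1 year for annual-maximum series)."""
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)             # at least one hazard exceeds its level
    t_and = mu / (1.0 - u - v + c)    # both hazards exceed their levels
    return t_or, t_and

# Example: two 10-year marginal events (u = v = 0.9), moderate dependence.
t_or, t_and = joint_return_periods(0.9, 0.9, theta=2.0)
print(t_or, t_and)
```

As expected, the "OR" return period is shorter than the 10-year marginal period and the "AND" return period is longer, which is exactly the kind of dependence effect that is lost when hazards are treated independently.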
The risk assessment framework considers the interaction and dependence between the
multiple hazards by using hydrodynamic modelling and joint probability analysis,
respectively. It can produce fully quantitative results such as risk curves quantifying the
probability of different damage states, and risk maps illustrating the expected loss in the
research region. With the efficient 2D hydrodynamic model and the autoprocessing package,
the framework is further applied to give flood and risk forecasting to the Eden Catchment by
integrating with a numerical weather prediction model.
The framework demonstrates a quantitative approach to multi-hazard risk assessment. It also
provides an integrated procedure of flood risk analysis and forecast in consideration of the
dependence and interactions between different water sources. The methodology and the
findings are of interest to insurance companies, regional planners, economists, disaster-prevention authorities, and residents under the threat of flooding. The main sources of
uncertainty in the framework, together with its limitations, are identified. Future work and further
applications in other regions are recommended.
Funding: Newcastle University, the Sir James Knott Studentship from the Institute for Sustainability, and the Henry Lester Trust.