Wavelet-based response spectrum compatible synthesis of accelerograms-Eurocode application (EC8)
An integrated approach for synthesizing artificial seismic accelerograms compatible with a given displacement design/target spectrum is presented in conjunction with aseismic design applications. Initially, a stochastic dynamics solution is used to obtain a family of simulated non-stationary earthquake records whose response spectra are, on average, in good agreement with the target spectrum. The degree of agreement depends significantly on the adoption of an appropriate parametric evolutionary power spectral form, which is related to the target spectrum in an approximate manner. The performance of two commonly used spectral forms, along with a newly proposed one, is assessed with respect to the elastic displacement design spectrum defined by the European code regulations (EC8). Subsequently, the computational versatility of the family of harmonic wavelets is employed to modify the simulated records iteratively until they satisfy the compatibility criteria for artificial accelerograms prescribed by EC8. In the process, the baseline correction steps ordinarily taken to ensure that the obtained accelerograms have physically meaningful velocity and displacement traces are elucidated. The presented approach can be used not only with EC8, for which extensive numerical results/examples are included, but also with any code provisions mandated by regulatory agencies. In any case, the presented numerical results can be quite useful in any aseismic design process governed by the EC8 specifications.
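A minimal sketch of the kind of iterative correction loop the abstract describes (illustrative assumptions throughout: the Newmark SDOF solver, the FFT band scaling used as a stand-in for harmonic-wavelet coefficient scaling, the crude polynomial baseline correction, and all names and tolerances are the editor's, not the paper's):

import numpy as np

def sdof_peak_disp(acc, dt, period, zeta=0.05):
    """Peak relative displacement of a damped SDOF oscillator driven by
    ground acceleration acc (Newmark, constant average acceleration)."""
    wn = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * zeta * wn, wn ** 2
    keff = k + 2.0 * c / dt + 4.0 * m / dt ** 2
    u = v = 0.0
    a = -acc[0]                      # initial equilibrium with u = v = 0
    peak = 0.0
    for ag in acc[1:]:
        peff = -ag + m * (4.0 * u / dt ** 2 + 4.0 * v / dt + a) \
                   + c * (2.0 * u / dt + v)
        un = peff / keff
        vn = 2.0 * (un - u) / dt - v
        an = 4.0 * (un - u) / dt ** 2 - 4.0 * v / dt - a
        u, v, a = un, vn, an
        peak = max(peak, abs(u))
    return peak

def match_spectrum(acc, dt, periods, target, n_iter=10):
    """Iteratively rescale frequency bands so the record's displacement
    response spectrum approaches target (sampled at periods)."""
    periods = np.asarray(periods, dtype=float)
    n = len(acc)
    t = np.arange(n) * dt
    freqs = np.fft.rfftfreq(n, dt)
    band_f = 1.0 / periods           # centre frequency of each band
    # assign every FFT bin to the nearest control period
    idx = np.argmin(np.abs(freqs[:, None] - band_f[None, :]), axis=1)
    for _ in range(n_iter):
        spec = np.array([sdof_peak_disp(acc, dt, T) for T in periods])
        ratio = target / np.maximum(spec, 1e-12)
        A = np.fft.rfft(acc)
        A *= ratio[idx]              # band-wise scaling, wavelet-style
        acc = np.fft.irfft(A, n)
        # crude baseline correction: remove a fitted quadratic trend
        acc = acc - np.polyval(np.polyfit(t, acc, 2), t)
    return acc

In practice the harmonic-wavelet decomposition gives finer control than plain FFT bands, and convergence would be checked against the EC8 compatibility criteria rather than a fixed iteration count.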
Where do statistical models come from? Revisiting the problem of specification
R. A. Fisher founded modern statistical inference in 1922 and identified its
fundamental problems to be: specification, estimation and distribution. Since
then the problem of statistical model specification has received scant
attention in the statistics literature. The paper traces the history of
statistical model specification, focusing primarily on pioneers like Fisher,
Neyman, and more recently Lehmann and Cox, and attempts a synthesis of their
views in the context of the Probabilistic Reduction (PR) approach. As argued by
Lehmann [11], a major stumbling block for a general approach to statistical
model specification has been the delineation of the appropriate role for
substantive subject matter information. The PR approach demarcates the
interrelated but complementary roles of substantive and statistical information
summarized ab initio in the form of a structural and a statistical model,
respectively. In an attempt to preserve the integrity of both sources of
information, as well as to ensure the reliability of their fusing, a purely
probabilistic construal of statistical models is advocated. This probabilistic
construal is then used to shed light on a number of issues relating to
specification, including the role of preliminary data analysis, structural vs.
statistical models, model specification vs. model selection, statistical vs.
substantive adequacy and model validation.

Comment: Published at http://dx.doi.org/10.1214/074921706000000419 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
Revisiting the Neyman-Scott model: an Inconsistent MLE or an Ill-defined Model?
The Neyman and Scott (1948) model is widely used to demonstrate a serious
weakness of the Maximum Likelihood (ML) method: it can give rise to
inconsistent estimators. The primary objective of this paper is to revisit this example with a view to demonstrating that the culprit for the inconsistent estimation is not the ML method but an ill-defined statistical model. It is also shown that a simple recasting of this model renders it well-defined, and the ML method then gives rise to consistent and asymptotically efficient estimators.
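For context, a standard rendering of the inconsistency (generic notation; the differencing recast below is one well-known fix and is not necessarily the paper's exact reformulation):

\[
X_{ij} \sim \mathsf{N}(\mu_i, \sigma^2), \qquad i = 1, \dots, n, \quad j = 1, 2,
\]
where \(\mu_1, \dots, \mu_n\) are incidental parameters and \(\sigma^2\) is the parameter of interest. Maximizing the likelihood gives
\[
\hat{\mu}_i = \tfrac{1}{2}(X_{i1} + X_{i2}), \qquad
\hat{\sigma}^2 = \frac{1}{2n} \sum_{i=1}^{n} \sum_{j=1}^{2} \bigl(X_{ij} - \hat{\mu}_i\bigr)^2
= \frac{1}{4n} \sum_{i=1}^{n} (X_{i1} - X_{i2})^2 .
\]
Since \(X_{i1} - X_{i2} \sim \mathsf{N}(0, 2\sigma^2)\), the law of large numbers gives \(\hat{\sigma}^2 \to \sigma^2/2\): the MLE misses by a factor of two however large \(n\) grows. Recasting the model in terms of the differences \(D_i = X_{i1} - X_{i2} \sim \mathsf{N}(0, 2\sigma^2)\) eliminates the incidental parameters, and the resulting MLE \(\tilde{\sigma}^2 = \tfrac{1}{2n} \sum_{i=1}^{n} D_i^2\) is consistent.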
The role of the classroom teacher in a speech improvement program.
Thesis (Ed.M.)--Boston University.
Corporate governance in Greece: developments and policy implications
The upgrading of the Greek capital market and the effort to join other mature capital markets have made corporate governance reform a first priority. In addition, the 2004 Olympic Games put the Greek market in the international spotlight and will likely invite interest from foreign investors. More than ever, an efficient corporate governance framework is a condition sine qua non for the competitive transformation of the capital market and the business world. At the same time, the European Union (EU) faces both the pressure and the challenge of harmonizing laws and regulations and converging corporate governance systems, especially after the entry of the new member states. The paper has two objectives: (i) to present the main aspects of corporate governance in Greece, contributing to the relevant growing body of literature, and (ii) to place the current corporate governance developments and trends in Greece within the international debate, especially in the light of the recent debate on improving and converging corporate governance in the EU. Firstly, I review the corporate governance debate and its implications at the EU level. Secondly, I describe the corporate governance framework in Greece in the light of recent key reforms. Finally, I summarize the overall findings and proceed with some critical points and recommendations for the potential future direction of the corporate governance agenda in Greece.

Keywords: corporate governance, rating, disclosure, ownership, Greece
Sequential Logistic Principal Component Analysis (SLPCA): Dimensional Reduction in Streaming Multivariate Binary-State System
Sequential or online dimensional reduction is of interest owing to the explosion of applications based on streaming data and the need for adaptive statistical modeling in many emerging fields, such as the modeling of energy end-use profiles. Principal Component Analysis (PCA) is the classical approach to dimensional reduction; however, traditional PCA based on the Singular Value Decomposition (SVD) fails to model data that deviate substantially from the Gaussian distribution. The Bregman divergence was recently introduced to achieve a generalized PCA framework. If the random variable under dimensional reduction follows a Bernoulli distribution, which occurs in many emerging fields, the generalized PCA is called Logistic PCA (LPCA). In this paper, we extend batch LPCA to a sequential version (i.e. SLPCA) based on sequential convex optimization theory. The convergence of this algorithm is discussed in comparison with the batch version of LPCA (i.e. BLPCA), as is its performance in reducing the dimension of multivariate binary-state systems. Its application in modeling building energy end-use profiles is also investigated.

Comment: 6 pages, 4 figures, conference submission
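A minimal sketch of the sequential idea (assumptions flagged: the paper builds SLPCA on sequential convex optimization, whereas this toy uses plain stochastic gradient steps on the Bernoulli log-likelihood, and the class and method names are invented for illustration):

import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

class SequentialLogisticPCA:
    """Streaming low-rank model for binary data: natural parameter
    theta = W @ z, with x ~ Bernoulli(sigmoid(theta))."""

    def __init__(self, dim, k, lr_w=0.05, lr_z=0.5, z_steps=20, seed=0):
        rng = np.random.default_rng(seed)
        self.W = 0.01 * rng.standard_normal((dim, k))   # loadings, d x k
        self.lr_w, self.lr_z, self.z_steps = lr_w, lr_z, z_steps

    def fit_score(self, x):
        """Infer the k-dim score of one sample with the loadings frozen
        (the log-likelihood is concave in z, so gradient ascent suffices)."""
        z = np.zeros(self.W.shape[1])
        for _ in range(self.z_steps):
            resid = x - sigmoid(self.W @ z)   # d/dtheta of Bernoulli ll
            z += self.lr_z * (self.W.T @ resid)
        return z

    def partial_fit(self, x):
        """One streaming update: score the sample, then take a rank-one
        gradient step on the loadings."""
        z = self.fit_score(x)
        resid = x - sigmoid(self.W @ z)
        self.W += self.lr_w * np.outer(resid, z)
        return z

# toy stream: 50 binary channels driven by 2 latent factors
rng = np.random.default_rng(1)
W_true = rng.standard_normal((50, 2))
model = SequentialLogisticPCA(dim=50, k=2)
for _ in range(2000):
    p = sigmoid(W_true @ rng.standard_normal(2))
    model.partial_fit((rng.random(50) < p).astype(float))

Each arriving binary vector is scored with the loadings frozen and then contributes a single rank-one update, which is what makes the procedure single-pass and streaming-friendly.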