
    Where do statistical models come from? Revisiting the problem of specification

    R. A. Fisher founded modern statistical inference in 1922 and identified its fundamental problems to be: specification, estimation and distribution. Since then the problem of statistical model specification has received scant attention in the statistics literature. The paper traces the history of statistical model specification, focusing primarily on pioneers like Fisher, Neyman, and more recently Lehmann and Cox, and attempts a synthesis of their views in the context of the Probabilistic Reduction (PR) approach. As argued by Lehmann [11], a major stumbling block for a general approach to statistical model specification has been the delineation of the appropriate role for substantive subject matter information. The PR approach demarcates the interrelated but complementary roles of substantive and statistical information, summarized ab initio in the form of a structural and a statistical model, respectively. In an attempt to preserve the integrity of both sources of information, as well as to ensure the reliability of their fusing, a purely probabilistic construal of statistical models is advocated. This probabilistic construal is then used to shed light on a number of issues relating to specification, including the role of preliminary data analysis, structural vs. statistical models, model specification vs. model selection, statistical vs. substantive adequacy, and model validation.
    Comment: Published at http://dx.doi.org/10.1214/074921706000000419 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Revisiting the Neyman-Scott model: an Inconsistent MLE or an Ill-defined Model?

    The Neyman and Scott (1948) model is widely used to demonstrate a serious weakness of the Maximum Likelihood (ML) method: it can give rise to inconsistent estimators. The primary objective of this paper is to revisit this example with a view to demonstrating that the culprit for the inconsistent estimation is not the ML method but an ill-defined statistical model. It is also shown that a simple recasting of this model renders it well-defined, and the ML method then gives rise to consistent and asymptotically efficient estimators.
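    For readers unfamiliar with the example, a minimal sketch of the standard two-observations-per-group setup (the notation is illustrative, not necessarily the paper's):

        \[
        X_{ij} \sim \mathcal{N}(\mu_i, \sigma^2), \qquad i = 1, \dots, n, \; j = 1, 2,
        \]
        \[
        \hat{\sigma}^2_{\mathrm{ML}}
          = \frac{1}{2n} \sum_{i=1}^{n} \sum_{j=1}^{2} \bigl(X_{ij} - \bar{X}_i\bigr)^2
          = \frac{1}{4n} \sum_{i=1}^{n} \bigl(X_{i1} - X_{i2}\bigr)^2
          \;\xrightarrow{p}\; \frac{\sigma^2}{2},
        \]

    since each difference X_{i1} - X_{i2} is N(0, 2\sigma^2): the incidental means \mu_i, whose number grows with n, absorb half the degrees of freedom, so the ML estimator of \sigma^2 converges to \sigma^2/2 rather than \sigma^2.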

    The role of the classroom teacher in a speech improvement program.

    Thesis (Ed.M.)--Boston University.

    Corporate governance in Greece: developments and policy implications

    The upgrading of the Greek capital market and the effort to join other mature capital markets have made corporate governance reform a first priority. In addition, the 2004 Olympic Games put the Greek market in the international spotlight and are likely to invite interest from foreign investors. More than ever, an efficient corporate governance framework is a condition sine qua non for the competitive transformation of the capital market and the business world. At the same time the European Union (EU) faces both the pressure and the challenge of harmonizing laws and regulations and of converging corporate governance systems, especially after the entrance of the new member states. The paper has two objectives: (i) to present the main aspects of corporate governance in Greece, contributing to the relevant growing body of literature, and (ii) to place the current corporate governance developments and trends in Greece within the international debate, especially in the light of the recent efforts to improve and converge corporate governance in the EU. Firstly, I review the corporate governance debate and its implications at the EU level. Secondly, I describe the corporate governance framework in Greece in the light of the recent key reforms. Finally, I summarize the overall findings and proceed with some critical points and recommendations for the potential future direction of the corporate governance agenda in Greece.
    Keywords: corporate governance, rating, disclosure, ownership, Greece

    Sequential Logistic Principal Component Analysis (SLPCA): Dimensional Reduction in Streaming Multivariate Binary-State System

    Sequential or online dimensional reduction is of interest due to the explosion of applications based on streaming data and the requirement for adaptive statistical modeling in many emerging fields, such as the modeling of energy end-use profiles. Principal Component Analysis (PCA) is the classical approach to dimensional reduction. However, traditional Singular Value Decomposition (SVD) based PCA fails to model data that deviate substantially from a Gaussian distribution. The Bregman divergence was recently introduced to achieve a generalized PCA framework. If the random variable under dimensional reduction follows a Bernoulli distribution, which occurs in many emerging fields, the generalized PCA is called Logistic PCA (LPCA). In this paper, we extend the batch LPCA to a sequential version (i.e. SLPCA), based on sequential convex optimization theory. The convergence properties of this algorithm are discussed and compared to the batch version of LPCA (i.e. BLPCA), as is its performance in reducing the dimension of multivariate binary-state systems. Its application in building energy end-use profile modeling is also investigated.
    Comment: 6 pages, 4 figures, conference submission
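    To make the Bernoulli/Bregman-divergence setup concrete, the following is a minimal, hypothetical sketch of a sequential logistic-PCA update in Python: plain stochastic gradient ascent on the Bernoulli log-likelihood with a low-rank natural-parameter model. The paper's actual SLPCA algorithm, based on sequential convex optimization, may differ in its details.

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def slpca_step(x, V, lr=0.1, inner_iters=20):
            # One sequential update: fit the low-dimensional score u for a new
            # binary observation x (shape (d,)) against the current loadings
            # V (shape (d, k)), then take one stochastic gradient step on V.
            # Hypothetical sketch, not the authors' exact SLPCA algorithm.
            d, k = V.shape
            u = np.zeros(k)
            for _ in range(inner_iters):
                # gradient of the Bernoulli log-likelihood with respect to u
                u += lr * (V.T @ (x - sigmoid(V @ u)))
            # single stochastic gradient step on the loadings
            V += lr * np.outer(x - sigmoid(V @ u), u)
            return u, V

        # usage: stream binary vectors (e.g. on/off end-use states) and keep updating V
        rng = np.random.default_rng(0)
        V = 0.01 * rng.standard_normal((10, 2))        # d = 10 features, k = 2 components
        for _ in range(100):
            x = (rng.random(10) < 0.3).astype(float)   # one streaming binary sample
            u, V = slpca_step(x, V)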