
    After the Great Recession: Law and Economics' Topics of Invention and Arrangement and Tropes of Style

    By Michael D. Murray. Abstract: The Great Recession of 2008 and onward has drawn attention to the American economic and financial system and has cast a critical spotlight on the theories, policies, and assumptions of the modern, neoclassical school of law and economics (often labeled the Chicago School), because this school of legal economic thought has had great influence on the American economy and financial system. The Chicago School's positions on deregulation and on the limitation or elimination of oversight and government restraints on stock markets, derivative markets, and other financial practices are the result of decades of neoclassical economic assumptions regarding the efficiency of unregulated markets, a near-religious devotion to a hyper-simplified conception of rationality and self-interest with regard to the persons and institutions participating in the financial system, and a conception of laws and government policies as incentives and costs in a manner that excludes the actual conditions and complications of reality. This Article joins the critical conversation on the Great Recession and the role of law and economics in this crisis by examining neoclassical and contemporary law and economics from the perspective of legal rhetoric. Law and economics has developed into a school of contemporary legal rhetoric that provides topics of invention and arrangement and tropes of style to test and improve general legal discourse in areas beyond the economic analysis of law. The rhetorical canons of law and economics (mathematical and scientific methods of analysis and demonstration; the characterization of legal phenomena as incentives and costs; the rhetorical economic concept of efficiency; and rational choice theory as corrected by modern behavioral social sciences, cognitive studies, and brain science) make law and economics a persuasive method of legal analysis and a powerful school of contemporary legal rhetoric when used in the right hands. My Article is the first to examine the prescriptive implications of the rhetoric of law and economics for general legal discourse, as opposed to examining the benefits and limitations of the economic analysis of law itself. This Article advances the conversation in two areas: first, the study and understanding of the persuasiveness of law and economics, particularly because that persuasiveness played a role in shaping American economic and financial policy leading up to the Great Recession; and second, the study and understanding of the use of economic topics of invention and arrangement and tropes of style in general legal discourse when evaluated in comparison to the other schools of classical and contemporary legal rhetoric. I examine each of the rhetorical canons of law and economics and explain how each can be used to create meaning, inspire imagination, and improve the persuasiveness of legal discourse in every area of law. My conclusion is that the rhetorical canons of law and economics can be used to create meaning and inspire imagination in legal discourse beyond the economic analysis of law, but the canons are tools that are only as good as their user and can be corrupted in ways that helped to bring about the current economic crisis.

    Stochastic Frontier Models With Correlated Error Components

    In the productivity modelling literature, the disturbances U (representing technical inefficiency) and V (representing noise) of the composite error W = V - U of the stochastic frontier model are assumed to be independent random variables. By employing the copula approach to statistical modelling, the joint behaviour of U and V can be parameterised, thereby allowing the data to determine the adequacy of the independence assumption. In this context, three examples of the copula approach are given: the first is algebraic (the Logistic-Exponential stochastic frontier model with margins joined by the Farlie-Gumbel-Morgenstern copula), and the second and third are empirically oriented, using data sets well known in productivity analysis: a cross-section of cost data sampled from the US electrical power industry, and an unbalanced panel of data sampled from the US airline industry.
    Keywords: Stochastic Frontier model; Copula; Copula approach; Sklar's theorem; Families of copulas; Spearman's rho.
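
    As a hedged sketch of the construction the abstract describes (the symbols θ, F_U, and F_V are illustrative, not the paper's exact parameterisation), Sklar's theorem lets the two error components be coupled through a copula while keeping their margins:

    \[
    f_{U,V}(u,v) = c\bigl(F_U(u), F_V(v)\bigr)\, f_U(u)\, f_V(v),
    \qquad W = V - U,
    \]

    where, for the Farlie-Gumbel-Morgenstern family,

    \[
    C(a,b;\theta) = ab\bigl[1 + \theta (1-a)(1-b)\bigr],
    \qquad
    c(a,b;\theta) = 1 + \theta (1-2a)(1-2b),
    \qquad \theta \in [-1,1],
    \]

    so that θ = 0 recovers the usual independence assumption between inefficiency U and noise V.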

    Nonlinear adaptive control using non-parametric Gaussian Process prior models

    Nonparametric Gaussian Process prior models, taken from Bayesian statistics methodology, are used to implement a nonlinear adaptive control law. The expected value of a quadratic cost function is minimised without ignoring the variance of the model predictions. This leads to implicit regularisation of the control signal (caution) and to excitation of the system. The controller has dual features, since it is both tracking a reference signal and learning a model of the system from observed responses. The general method and its main features are illustrated on a simulation example.
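
    A minimal numerical sketch of the one-step-ahead "cautious" idea described above follows; it uses a generic GP regression model, and the toy plant, kernel, reference, and grid of candidate controls are placeholders rather than the authors' simulation example.

    # Hedged sketch: one-step-ahead cautious GP control, not the authors' exact scheme.
    # The toy plant, kernel, reference, and control grid are illustrative placeholders.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    def plant(y, u):
        # unknown nonlinear system the controller must learn while tracking
        return 0.8 * np.sin(y) + 0.5 * u + 0.05 * rng.standard_normal()

    gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-2), normalize_y=True)
    X, Y = [], []                        # data gathered while controlling (dual effect)
    y, r = 0.0, 1.0                      # current output and reference to track
    u_grid = np.linspace(-2.0, 2.0, 81)  # candidate control actions

    for k in range(50):
        if len(Y) >= 3:
            gp.fit(np.array(X), np.array(Y))
            Z = np.column_stack([np.full_like(u_grid, y), u_grid])
            mean, std = gp.predict(Z, return_std=True)
            # expected quadratic cost keeps the predictive variance (caution):
            # E[(y_{k+1} - r)^2] = (mean - r)^2 + std^2
            cost = (mean - r) ** 2 + std ** 2
            u = float(u_grid[np.argmin(cost)])
        else:
            u = float(rng.uniform(-1.0, 1.0))   # initial excitation
        X.append([y, u])
        y = plant(y, u)
        Y.append(y)

    Retaining the std^2 term is the point the abstract makes: the expected cost penalises uncertain predictions, regularising the control signal while the loop simultaneously collects the data used to learn the model.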

    Vibrations of weakly-coupled nanoparticles

    The vibrations of a coupled pair of isotropic silver spheres are investigated and compared with the vibrations of the single isolated spheres. Situations of both strong and weak coupling are investigated using continuum elasticity and perturbation theory. The numerical calculation of the eigenmodes of such dimers is augmented with a symmetry analysis, which checks the convergence and applicability of the numerical method and shows how the eigenmodes of the dimer are constructed from those of the isolated spheres. The frequencies of the lowest-frequency vibrations of such dimers are shown to be very sensitive to the strength of the coupling between the spheres. Some of these modes can be detected by inelastic light scattering and time-resolved optical measurements, providing a convenient way to study the nature of the mechanical coupling in dimers of micro- and nanoparticles.
    Comment: expanded version, 8 pages, 5 figures, 2 tables
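
    As a rough, hedged analogue of why the lowest dimer modes probe the contact (this one-dimensional toy model is not the continuum-elasticity calculation of the paper), treat the two spheres as rigid masses m joined by an effective contact spring of stiffness κ:

    \[
    m\ddot{x}_1 = -\kappa (x_1 - x_2), \qquad m\ddot{x}_2 = -\kappa (x_2 - x_1)
    \quad\Longrightarrow\quad
    \omega_{\mathrm{cm}} = 0, \qquad \omega_{\mathrm{stretch}} = \sqrt{2\kappa/m},
    \]

    so the lowest, stretch-like dimer frequency scales as the square root of the coupling and vanishes as κ → 0, whereas the internal vibrations of each sphere are only weakly shifted for weak coupling; in this sense the low-frequency modes are sensitive to the mechanical contact.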

    Unincorporated associations reform


    Symbolic Maximum Likelihood Estimation with Mathematica

    Mathematica is a symbolic programming language that empowers the user to undertake complicated algebraic tasks. One such task is the derivation of maximum likelihood estimators, demonstrably an important topic in statistics at both the research and expository level. In this paper, a Mathematica package is provided that contains a function entitled SuperLog. This function utilises pattern-matching code that enhances Mathematica's ability to simplify expressions involving the natural logarithm of a product of algebraic terms. This enhancement to Mathematica's functionality can be of particular benefit for maximum likelihood estimation.
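
    The abstract does not reproduce the package code; as a hedged analogue in Python/SymPy of the log-of-product simplification that SuperLog is described as providing, the sketch below expands the log-likelihood of a small i.i.d. exponential sample (the distribution and sample size are purely illustrative):

    # Hedged Python/SymPy analogue of SuperLog-style simplification;
    # not the Mathematica package itself.
    import sympy as sp

    lam = sp.symbols('lambda', positive=True)
    x = sp.symbols('x1:5', positive=True)          # illustrative sample x1..x4

    # likelihood of an i.i.d. Exponential(lambda) sample
    L = sp.Mul(*[lam * sp.exp(-lam * xi) for xi in x])

    # turn log(product) into a sum of logs, the kind of step SuperLog automates
    logL = sp.expand_log(sp.log(L), force=True)    # 4*log(lambda) - lambda*x1 - ... - lambda*x4

    # score equation gives the familiar MLE: lambda_hat = n / sum(x_i)
    lam_hat = sp.solve(sp.Eq(sp.diff(logL, lam), 0), lam)
    print(logL, lam_hat)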

    Higgs fields, bundle gerbes and string structures

    We use bundle gerbes and their connections and curvings to obtain an explicit formula for a de Rham representative of the string class of a loop group bundle. This is related to earlier work on calorons.
    Comment: 15 pages