Some Algorithms for the Conditional Mean Vector and Covariance Matrix
We consider here the problem of computing the mean vector and covariance matrix for a conditional normal distribution, considering especially a sequence of problems where the conditioning variables are changing. The sweep operator provides one simple general approach that is easy to implement and update. A second, more goal-oriented general method avoids explicit computation of the vector and matrix, while enabling easy evaluation of the conditional density for likelihood computation or easy generation from the conditional distribution. The covariance structure that arises from the special case of an ARMA(p, q) time series can be exploited for substantial improvements in computational efficiency.
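For concreteness, the quantity these algorithms compute is the standard Schur-complement form of the conditional mean and covariance of a multivariate normal. Below is a minimal NumPy sketch; the function name, index-based partitioning, and example numbers are illustrative and not taken from the paper, whose sweep-operator and ARMA-specific methods are more efficient ways of organising the same computation when the conditioning set changes.

```python
import numpy as np

def conditional_normal(mu, Sigma, idx_cond, x_cond):
    """Mean and covariance of the remaining variables of N(mu, Sigma),
    conditioned on the variables in idx_cond taking the values x_cond.
    Uses the standard Schur-complement formulas."""
    n = len(mu)
    idx_cond = np.asarray(idx_cond)
    idx_free = np.setdiff1d(np.arange(n), idx_cond)

    mu1, mu2 = mu[idx_free], mu[idx_cond]
    S11 = Sigma[np.ix_(idx_free, idx_free)]
    S12 = Sigma[np.ix_(idx_free, idx_cond)]
    S22 = Sigma[np.ix_(idx_cond, idx_cond)]

    # Solve S22 W = S12^T instead of forming an explicit inverse.
    W = np.linalg.solve(S22, S12.T)          # shape (n_cond, n_free)
    cond_mean = mu1 + W.T @ (x_cond - mu2)   # mu1 + S12 S22^{-1} (x2 - mu2)
    cond_cov = S11 - S12 @ W                 # S11 - S12 S22^{-1} S21
    return cond_mean, cond_cov

# Example (hypothetical numbers): condition X0 on X1 = 1.5 and X2 = -0.3.
mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])
m, C = conditional_normal(mu, Sigma, idx_cond=[1, 2], x_cond=np.array([1.5, -0.3]))
```

Recomputing the Schur complement from scratch every time the conditioning set changes is wasteful; the sweep operator instead maintains a single working matrix and sweeps variables in or out as they enter or leave the conditioning set.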
Judging Risk
Risk assessment plays an increasingly pervasive role in criminal justice in the United States at all stages of the process, from policing, to pre-trial, sentencing, corrections, and during parole. As efforts to reduce incarceration have led to adoption of risk-assessment tools, critics have begun to ask whether various instruments in use are valid and whether they might reinforce rather than reduce bias in criminal justice outcomes. Such work has neglected how decisionmakers use risk assessment in practice. In this Article, we examine in detail the judging of risk assessment and study why decisionmakers so often fail to use such quantitative information consistently.
The application of automated perturbation theory to lattice QCD
Predictions of heavy quark parameters are an integral component of precision tests of the Standard Model of particle physics. Experimental measurements of electroweak processes involving heavy hadrons provide stringent tests of Cabibbo-Kobayashi-Maskawa (CKM) matrix unitarity and serve as a probe of new physics. Hadronic matrix elements parameterise the strong dynamics of these interactions, and these matrix elements must be calculated nonperturbatively. Lattice quantum chromodynamics (QCD) provides the framework for nonperturbative calculations of QCD processes. Current lattices are too coarse to simulate b quarks directly; therefore an effective theory, nonrelativistic QCD (NRQCD), is used to discretise the heavy quarks. High-precision simulations are required, so systematic uncertainties are removed by improving the NRQCD action. Precise simulations also require improved sea quark actions, such as the highly-improved staggered quark (HISQ) action. The renormalisation parameters of these actions cannot feasibly be determined by hand, and thus automated procedures have been developed. In this dissertation I apply automated lattice perturbation theory to a number of heavy quark calculations.

I first review the fundamentals of lattice QCD and the construction of lattice NRQCD. I then motivate and discuss lattice perturbation theory in detail, focussing on the tools and techniques that I use in this dissertation. I calculate the two-loop tadpole improvement factors for improved gluons with improved light quarks. I then compute the renormalisation parameters of NRQCD. I use a mix of analytic and numerical methods to extract the one-loop radiative corrections to the higher-order kinetic operators in the NRQCD action. I then employ a fully automated procedure to calculate the heavy quark energy shift at two loops. I use this result to extract a new prediction of the mass of the b quark from lattice NRQCD simulations by the HPQCD collaboration. I also review the calculation of the radiative corrections to the chromo-magnetic operator in the NRQCD action. This computation is the first outcome of our implementation of background field gauge for automated lattice perturbation theory.

Finally, I calculate the heavy-light currents for highly-improved NRQCD heavy quarks with massless HISQ light quarks and discuss the application of these results to nonperturbative studies by the HPQCD collaboration.
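The way a perturbative energy shift feeds into a b-quark mass determination can be sketched schematically. Because NRQCD removes the heavy-quark rest mass from the action, the simulated meson energy is offset from the physical mass by the shift of each heavy quark; the relations below are a generic illustration with simplified conventions, not the dissertation's exact expressions or correction terms.

```latex
% Schematic NRQCD mass determination (illustrative conventions):
% the physical Upsilon mass differs from the simulated energy by twice the
% per-quark offset m_b^{pole} - E_0, where E_0 is the heavy-quark energy
% shift computed in lattice perturbation theory (here to two loops).
\begin{align}
  M_{\Upsilon}^{\text{expt}} &\simeq E_{\text{sim}}(\Upsilon)
      + 2\bigl(m_b^{\text{pole}} - E_0\bigr), \\
  m_b^{\text{pole}} &\simeq
      \tfrac{1}{2}\bigl[M_{\Upsilon}^{\text{expt}} - E_{\text{sim}}(\Upsilon)\bigr]
      + E_0 ,
\end{align}
% with the pole mass subsequently converted to a short-distance scheme
% such as \overline{\mathrm{MS}}.
```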
Law’s power to safeguard global health: a Lancet–O’Neill Institute, Georgetown University Commission on Global Health and the Law
The law, at global, national, and subnational levels, plays a vital, yet often underappreciated, role in safeguarding and promoting the public’s health. In this article, we launch the Lancet–O’Neill Institute, Georgetown University Commission on Global Health and the Law. Commissioners from around the world will explore the critical opportunities and challenges of using law as a tool, while evaluating the evidence base for legal interventions. The Commission aims to define and systematically describe the current landscape of law that affects global health and safety.
Commissioners were chosen from disciplines that range from health, policy, and law to economics and governance. The Commission aims to present a compelling argument as to why law should be viewed as a major determinant of health and safety and how the law can be used in a powerful and innovative way to address the global burdens of injury and disease. Above all, the Commission will pursue justice, finding innovative ways to narrow existing and unconscionable health inequalities.