Parton Distributions in the Higgs Boson Era
Parton distributions are an essential ingredient of the LHC program. PDFs are
relevant for precision Standard Model measurements, for Higgs boson
characterization as well as for New Physics searches. In this contribution I
review recent progress in the determination of the parton distributions of the
proton during the last year. Important developments include the use of new
LHC measurements to pin down poorly known PDFs, studies of theoretical
uncertainties, higher order calculations for processes relevant for PDF
determinations, PDF benchmarking exercises with LHC data, as well as
methodological and statistical improvements in the global analysis framework. I
conclude with some speculative considerations about future directions in PDF
determinations from the theory point of view.
Comment: 11 pages, 5 figures, write-up of the plenary talk at the XXI International Workshop on Deep-Inelastic Scattering and Related Subjects (DIS2013), Marseille, 22-26 April 2013
PDF uncertainties in the determination of the W boson mass and of the effective lepton mixing angle at the LHC
The precision measurement of the W boson mass makes it possible to perform stringent
consistency tests of the Standard Model by means of global electroweak fits.
The accurate determination of the W boson mass is one of the legacy results of
the Tevatron, where the experimental accuracy is such that the measurement is now
limited by theoretical uncertainties related to the parton distributions of the proton.
In this contribution, we show how to quantify the impact of PDF uncertainties
in the measurement of the W boson mass at the Tevatron and the LHC by means of a
template method, and study the use of both the W transverse mass and the lepton pT
kinematical distributions to generate these templates. We also present
preliminary results on the quantification of the PDF uncertainties in the
determination of the effective lepton mixing angle at the LHC, based on the
same template method as for the W mass determination.
Comment: 5 pages, 4 figures, to appear in the proceedings of the XXI International Workshop on Deep-Inelastic Scattering and Related Subjects (DIS2013), Marseille, 22-26 April 2013
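The abstract above describes the template method only schematically; the sketch below illustrates the basic idea in Python. It is not the authors' code: the histogram arrays, the M_W grid, and the template dictionary are hypothetical placeholders. Templates of a kinematic distribution (lepton pT or W transverse mass) are generated for a grid of M_W hypotheses, the preferred mass is the one whose template best matches the (pseudo)data, and repeating the fit with pseudodata from different PDF sets or replicas quantifies the PDF uncertainty.

```python
import numpy as np

def chi2(data, template, errors):
    """Bin-by-bin chi^2 between a (pseudo)data histogram and a template."""
    return np.sum(((data - template) / errors) ** 2)

def fit_mw(pseudodata, errors, templates, mw_grid):
    """Return the M_W hypothesis whose template best matches the pseudodata.

    templates: dict mapping each M_W value on the grid to a histogram of
    the chosen kinematic distribution (lepton pT or W transverse mass).
    """
    chi2_values = [chi2(pseudodata, templates[mw], errors) for mw in mw_grid]
    return mw_grid[int(np.argmin(chi2_values))]

# PDF uncertainty (schematic): refit with pseudodata generated from each
# PDF set/replica; the spread of the fitted M_W values is the PDF-induced
# uncertainty on the mass measurement.
# fitted = [fit_mw(pseudodata_from(pdf), errors, templates, mw_grid)
#           for pdf in pdf_replicas]          # pdf_replicas is hypothetical
# pdf_uncertainty = np.std(fitted)
```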
Improving quark flavor separation with forward W and Z production at LHCb
We quantify the constraints on the flavour separation between the quarks and
antiquarks in the proton provided by the recent forward weak gauge boson
production data from the LHCb experiment at center-of-mass energies of 7 and 8 TeV. Performed
in the framework of the NNPDF3.1 global analysis, this study highlights the key
role that the LHCb W and Z data have in achieving a robust quark flavour
separation in the large-x region, including the strange and charm quarks. We
demonstrate how the LHCb measurements lead to improved determinations of the
up and down quark PDFs in this region, with an uncertainty
reduction that can be as large as a factor of 2. We also show how the LHCb forward
measurements severely restrict the size of the fitted charm PDF at large x,
imposing stringent constraints on non-perturbative models for the charm content
of the nucleon.
Comment: 5 pages, 5 figures, to appear in the proceedings of the XXV International Workshop on Deep-Inelastic Scattering and Related Subjects, 3-7 April 2017, University of Birmingham, UK
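Since the study above is performed in the NNPDF framework, where PDF uncertainties are represented by an ensemble of Monte Carlo replicas, the quoted factor-of-2 uncertainty reduction can be pictured as a ratio of replica spreads. The sketch below is a minimal illustration of that bookkeeping, not the actual NNPDF code; the replica arrays are hypothetical.

```python
import numpy as np

def replica_stats(replicas):
    """Central value and 1-sigma PDF uncertainty from Monte Carlo replicas.

    replicas: array of shape (n_replicas, n_x), e.g. the up-quark PDF
    sampled on a grid of x values for each replica. In the Monte Carlo
    approach the central value is the replica mean and the uncertainty
    is the standard deviation over replicas.
    """
    return replicas.mean(axis=0), replicas.std(axis=0)

# Uncertainty reduction from adding a dataset: ratio of replica spreads
# from fits without and with the new data (hypothetical arrays).
# _, sigma_before = replica_stats(fit_without_lhcb)
# _, sigma_after = replica_stats(fit_with_lhcb)
# reduction = sigma_before / sigma_after   # ~2 at large x per the abstract
```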
Erich Leo Lehmann---A glimpse into his life and work
Through a system-building approach, one that includes finding common ground
for the various philosophical paradigms within statistics, Erich L. Lehmann
is responsible for much of the synthesis of
classical statistical knowledge that developed from the Neyman--Pearson--Wald
school. A biographical sketch and a brief summary of some of his many
contributions are presented here. His complete bibliography is also included
and the references present many other sources of information on his life and
his work.
Comment: Published at http://dx.doi.org/10.1214/11-AOS927 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
Brief history of the Lehmann Symposia: Origins, goals and motivation
The idea of the Lehmann Symposia as platforms to encourage a revival of
interest in fundamental questions in theoretical statistics, while keeping in
focus issues that arise in contemporary interdisciplinary cutting-edge
scientific problems, developed during a conversation that I had with Victor
Perez Abreu during one of my visits to Centro de Investigación en
Matemáticas (CIMAT) in Guanajuato, Mexico. Our goal was and has been to
showcase relevant theoretical work to encourage young researchers and students
to engage in such work. The First Lehmann Symposium on Optimality took place in
May of 2002 at Centro de Investigación en Matemáticas in Guanajuato,
Mexico. A brief account of the Symposium has appeared in Vol. 44 of the
Institute of Mathematical Statistics series of Lecture Notes and Monographs.
The volume also contains several works presented during the First Lehmann
Symposium. All papers were refereed. The program and a picture of the
participants can be found on-line at the website
http://www.stat.rice.edu/lehmann/lst-Lehmann.html.
Comment: Published at http://dx.doi.org/10.1214/074921706000000347 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org)
Review on recent developments in jet finding
We review recent developments related to jet clustering algorithms and jet
finding. These include fast implementations of sequential recombination
algorithms, new IRC safe algorithms, quantitative determination of jet areas
and quality measures for jet finding, among many others. We also briefly
discuss the status of jet finding in heavy ion collisions, where full QCD jets
have been measured for the first time at RHIC.
Comment: 5 pages, 5 figures, proceedings of the International Symposium on Multiparticle Dynamics 08, 15-20 September 2008, DESY
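For readers unfamiliar with the sequential recombination algorithms mentioned above, the toy implementation below shows the generalized-kt distance measures they cluster on, with a single parameter p selecting inclusive kt (p = 1), Cambridge/Aachen (p = 0), or anti-kt (p = -1). It is a naive O(N^3) illustration, not one of the fast implementations the review discusses (those rely on computational-geometry techniques), and the pt-weighted recombination is a simplified stand-in for full four-momentum addition.

```python
import numpy as np

def deltaR2(a, b):
    """Squared rapidity-azimuth distance between two particles (pt, y, phi)."""
    dphi = abs(a[2] - b[2])
    dphi = min(dphi, 2 * np.pi - dphi)
    return (a[1] - b[1]) ** 2 + dphi ** 2

def cluster(particles, R=0.4, p=-1):
    """Naive O(N^3) generalized-kt sequential recombination.

    particles: list of (pt, y, phi) tuples. Returns the final jets.
    """
    jets = []
    objs = list(particles)
    while objs:
        # Beam distances d_iB = pt_i^(2p) and pairwise distances
        # d_ij = min(pt_i^(2p), pt_j^(2p)) * deltaR_ij^2 / R^2.
        diB = [(o[0] ** (2 * p), ('B', i)) for i, o in enumerate(objs)]
        dij = [(min(objs[i][0] ** (2 * p), objs[j][0] ** (2 * p))
                * deltaR2(objs[i], objs[j]) / R ** 2, (i, j))
               for i in range(len(objs)) for j in range(i + 1, len(objs))]
        _, idx = min(diB + dij, key=lambda t: t[0])
        if idx[0] == 'B':                  # smallest distance is to the beam:
            jets.append(objs.pop(idx[1]))  # promote the object to a final jet
        else:                              # otherwise merge the closest pair
            i, j = idx
            a, b = objs[i], objs[j]
            pt = a[0] + b[0]
            merged = (pt, (a[0] * a[1] + b[0] * b[1]) / pt,
                          (a[0] * a[2] + b[0] * b[2]) / pt)
            objs = [o for k, o in enumerate(objs) if k not in (i, j)]
            objs.append(merged)
    return jets
```

With p = -1 the hardest particles have the smallest distances, so jets grow outward from hard cores, which is what gives anti-kt its regular, cone-like jet areas; the single exponent p is the only difference between the three classic algorithms.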
Study of macroscopic and microscopic properties of liposomes produced using microfluidic methods
For the last few decades, lipid vesicles, or liposomes (vesicles formed by a bilayer of amphiphilic lipids), have been used as a toy model for studying the cell membrane and for applications in cosmetics and drug delivery. Traditional production methods suffer from problems such as heterogeneity in the size and composition of the liposomes produced. A few years ago, a novel method that produces liposomes with homogeneous size and composition was developed. This method uses water-in-oil-in-water ultra-thin double emulsions, with lipids dissolved in the oil phase, as templates for liposome production. These ultra-thin double emulsions are produced using glass-capillary microfluidic devices.
This new method for producing liposomes seems very promising, but since the liposomes are formed by evaporating the oil phase of the double emulsions, the concern arises that residual oil in the bilayer may alter the properties of the liposomes. In this work, phenomena and properties that have been characterized for liposomes produced by conventional methods are studied for liposomes produced using microfluidic methods.