Measurements of the masses, lifetimes and mixings of B hadrons at the Tevatron
The Tevatron, with p p-bar collisions at sqrt(s) = 1.96 TeV, can produce all
flavors of B hadrons and allows for unprecedented studies in the B physics
sector. The CDF and D0 collaborations have recorded more than 5 fb-1 of data. I
present here a selection of results on the masses, lifetimes and mixings of B
hadrons using between 1.0 and 2.8 fb-1 of data.
Comment: 5 pages, 3 figures. Proceedings for Rencontres de Moriond QCD 2009, references added
Risk management: understanding why is as important as understanding how
The management of risk is a key area within a number of ACCA papers, and exam questions related to this area are common. It is vital that students are able to apply risk management techniques, such as using derivative instruments to hedge against risk, and to offer advice and recommendations as required by the scenario in the question. It is equally important that students understand why corporations manage risk in theory and in practice: risk management costs money, so does it actually add value to a corporation? This article explores the circumstances in which the management of risk may lead to an increase in the value of a corporation.
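The hedging technique mentioned above can be illustrated with a toy forward contract. This is a hypothetical example, not taken from the article: the exporter, amounts, and rates are all invented for illustration.

```python
# Hypothetical example: a UK exporter expects to receive $1,000,000 in
# three months and fears the dollar weakening against the pound. Selling
# the dollars forward at a rate agreed today locks in the sterling value
# of the receivable, whatever the future spot rate turns out to be.

receivable_usd = 1_000_000
forward_rate = 0.80  # GBP per USD, agreed today (illustrative value)

def unhedged_value_gbp(spot_rate):
    """Sterling proceeds if the exporter waits and sells at the spot rate."""
    return receivable_usd * spot_rate

def hedged_value_gbp(spot_rate):
    """Sterling proceeds with the forward contract: fixed, whatever spot does."""
    return receivable_usd * forward_rate

for spot in (0.70, 0.80, 0.90):
    print(f"spot {spot:.2f}: unhedged {unhedged_value_gbp(spot):,.0f} GBP, "
          f"hedged {hedged_value_gbp(spot):,.0f} GBP")
```

The hedged value is identical in every scenario, which is exactly the point the article raises: the hedge removes variability, but it also gives up the upside, so whether it adds value to the corporation is a separate question.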
Organisational slack, corporate reputation and financial performance
This discussion paper aims to fill the gap left by the latest research on organisational slack, which has focused on emerging economies, on a single company, or on a single industry. Senior executives' perceptions that contribute to a measure of corporate reputation are tested as a proxy measure of unabsorbed slack. Disaggregating the components that make up reputation enables the perceptions of a company's 'ability to innovate' and how efficiently it 'uses its corporate assets' also to be tested as measures of unabsorbed, perceptual or discretionary slack. The impact of these variables is considered in terms of company performance.
Moral imagination or heuristic toolbox? Events and the risk assessment of structured financial products in the financial bubble
The paper uses the example of the failure of bankers and financial managers to understand the risks of dealing in structured financial products, prior to the financial collapse, to investigate how people respond to crises. It focuses on whether crises cause people to challenge their habitual frames by the application of moral imagination. It is proposed that the structure of financial products and their markets triggered the use of heuristics that contributed to the underestimation of risks. It is further proposed that such framing heuristics are highly specialised to specific contexts, and are part of a wider set of heuristics that people carry in their cognitive 'adaptive toolboxes'. Consequently, it is argued, when a crisis occurs the heuristics are not challenged but simply put away, and other more appropriate heuristics are put to use until a sense of normality returns and the use of the old heuristics is resumed.
Gaussian Process Emulators in coastal wave modelling
Most coastal wave modelling analyses require historical data, either from physical observations or from computer simulations. Such simulators are often computationally expensive (a single evaluation run takes a long time) and are therefore normally a bottleneck in the analysis. Meta-models are increasingly used as surrogates for these complex simulators to improve the efficiency of this bottleneck step. The performance of the meta-model is vital when selecting the model, as it greatly influences the conclusions drawn from the analysis.
In this thesis we apply the Gaussian Process Emulator (GPE) as a meta-model of a wave transformation simulator, SWAN. The GPE is advantageous compared to other meta-models because its predictions take the form of a distribution (mean and variance), and predicting at an event used to train the GPE returns a perfect prediction with no uncertainty. Univariate and multivariate approaches to the GPE are presented and compared in case studies. In addition, simple diagnostics to validate the GPE are discussed.
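A minimal sketch of such an emulator, using a hand-rolled squared-exponential kernel (the kernel settings and toy training function are illustrative assumptions, not values from the thesis), demonstrates the interpolation property just described: predicting at a training input returns the training output with essentially zero variance.

```python
import numpy as np

def kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gpe_predict(x_train, y_train, x_new, jitter=1e-10):
    """Posterior mean and variance of a zero-mean GP emulator at x_new."""
    K = kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = kernel(x_new, x_train)
    Kss = kernel(x_new, x_new)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Train on a few 'simulator' runs (a toy function stands in for SWAN here).
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)

# Predict at one training input (1.0) and one unseen input (1.5): the GPE
# reproduces the training run exactly with near-zero variance, while the
# unseen point comes back with genuine predictive uncertainty.
mean, var = gpe_predict(x_train, y_train, np.array([1.0, 1.5]))
```

This mean-plus-variance output is what distinguishes the GPE from point-prediction surrogates such as a look-up table.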
The look-up table (LUT) approach is a commonly used traditional meta-model in coastal modelling, based on multidimensional linear interpolation of points on a regular grid. A case study shows the performance improvement that can be gained by using the GPE over this traditional LUT approach: the GPE needs less than 2% of the simulator runs required by the LUT to obtain similar accuracy.
When introducing the multivariate GPE we identify two types of multiple outputs. We present a principal component GPE (PC-GPE) and a separable GPE for highly correlated and high-dimensional output, and compare these methods with fitting multiple univariate GPEs. In terms of accuracy the multiple univariate GPEs outperformed the other methods; however, the PC-GPE tends to be more efficient with only a small compromise on accuracy. For low-dimensional output that is weakly correlated we present the linear model of coregionalisation (LMC) GPE, a more flexible technique than the separable GPE. We compared this with the separable GPE and with fitting multiple univariate GPEs. The LMC GPE gave similar results to the multiple univariate GPEs, but it is unstable and took a significant amount of time to fit.
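The LUT approach described above can be sketched with SciPy's regular-grid interpolator. The two-input toy function and the grid below are assumptions for illustration, standing in for SWAN and its input space.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def simulator(h, t):
    """Toy stand-in for a wave-transformation simulator with two inputs."""
    return np.sin(h) * np.cos(t)

# Regular grid of simulator runs: this table of outputs *is* the LUT.
h_grid = np.linspace(0.0, 3.0, 16)
t_grid = np.linspace(0.0, 3.0, 16)
H, T = np.meshgrid(h_grid, t_grid, indexing="ij")
table = simulator(H, T)

lut = RegularGridInterpolator((h_grid, t_grid), table, method="linear")

# Query between grid points: the LUT linearly interpolates the
# neighbouring stored runs rather than calling the simulator again.
approx = lut(np.array([[1.23, 2.34]]))[0]
exact = simulator(1.23, 2.34)
```

Because the LUT's accuracy is tied to the grid spacing, refining it in several dimensions multiplies the number of simulator runs, which is the cost the GPE avoids.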
Finally, we describe three approaches to selecting a design (the simulator runs used to train the GPE). We aim to select a design that maximises the information gained from the simulator, in order to inform the GPE given a limited number of simulator runs.
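One common space-filling option for such a design (an illustration only, not necessarily one of the thesis's three approaches) is a maximin choice among random Latin hypercube candidates:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d):
    """One random Latin hypercube sample of n points in [0, 1]^d:
    each dimension has exactly one point in each of n equal strata."""
    design = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)
        design[:, j] = (perm + rng.random(n)) / n
    return design

def min_pairwise_distance(design):
    """Smallest distance between any two design points (to be maximised)."""
    diffs = design[:, None, :] - design[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)
    return dists.min()

# Pick the most spread-out of 50 candidate designs of 10 runs in 2
# input dimensions; these 10 points would then be run through the
# simulator to train the GPE.
candidates = [latin_hypercube(10, 2) for _ in range(50)]
best = max(candidates, key=min_pairwise_distance)
```

The maximin criterion avoids wasting scarce simulator runs on near-duplicate input settings, which add little information to the emulator.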
The aim of this thesis is to present the GPE methodology in a concise manner, with running examples throughout. The novelty is the application of GPEs to coastal wave modelling, helping to alleviate the computational burden and improve accuracy when using meta-models to avoid the bottleneck in the analysis.
First determination of the CP content of D->pi+pi-pi+pi- and updated determination of the CP contents of D->pi+pi-pi0 and D->K+K-pi0
Quantum-correlated psi(3770)->DDbar decays collected by the CLEO-c experiment are used to perform a first measurement of F+, the fractional CP-even content of the self-conjugate decay D->pi+pi-pi+pi-. An important input to the measurement comes from the use of D->KS pi+pi- and D->KL pi+pi- decays to tag the signal mode. This same technique is applied to the channels D->pi+pi-pi0 and D->K+K-pi0. These measurements are consistent with those of an earlier analysis, based on CP-eigenstate tags, and can be combined to give updated values of F+ for the two modes. The results will enable the three modes to be included in a model-independent manner in measurements of the unitarity triangle angle gamma using B+/- -> DK+/- decays, and in time-dependent studies of CP violation and mixing in the D0-D0bar system.
Comment: Minor revisions following journal acceptance
First determination of the CP content of D->pi+pi-pi0 and D->K+K-pi0
Quantum-correlated psi(3770)->DDbar decays collected by the CLEO-c experiment
are used to perform first measurements of F+, the fractional CP-even content of
the self-conjugate decays D->pi+pi-pi0 and D->K+K-pi0. Values of 0.968 +/-
0.017 +/- 0.006 and 0.731 +/- 0.058 +/- 0.021 are obtained for pi+pi-pi0 and
K+K-pi0, respectively. It is demonstrated how modes of this sort can be cleanly
included in measurements of the unitarity triangle angle gamma using B+/- -> D
K+/- decays. The high CP-even content of D -> pi+pi-pi0, in particular, makes
this a promising mode for improving the precision on gamma.
Comment: 18 pages, 4 figures, submitted to Phys. Lett.
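As a quick check on the quoted results, the statistical and systematic uncertainties on each F+ value can be combined in quadrature (assuming, as is standard, that the two components are independent):

```python
import math

def total_uncertainty(stat, syst):
    """Combine independent statistical and systematic errors in quadrature."""
    return math.sqrt(stat ** 2 + syst ** 2)

# F+(pi+pi-pi0) = 0.968 +/- 0.017 (stat) +/- 0.006 (syst)
err_pipipi0 = total_uncertainty(0.017, 0.006)  # ~0.018

# F+(K+K-pi0) = 0.731 +/- 0.058 (stat) +/- 0.021 (syst)
err_kkpi0 = total_uncertainty(0.058, 0.021)    # ~0.062
```

The pi+pi-pi0 total uncertainty is dominated by its statistical component, consistent with the abstract's point that this mode's high CP-even content makes it a promising input for the gamma measurement.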