Effect of the sub-chronic administration of some commonly used herbal products on the liver enzymes, body weight, and feed intake of albino Wistar rats: Any implication for public health?
The present study assessed the effects of administering three commonly used herbal products, viz. Yoyo cleansing bitters, T. angelica herbal tonic, and Bio-Strath elixir, on liver enzymes, body weight, and feed intake in adult albino Wistar rats. A total of seventy Wistar rats were divided into three major groups. Each group received a particular herbal product, and each group was further subdivided into subgroups that received various dosages of that product. The rats were acclimatized for 14 days, after which they received different doses of each herbal product for 6 weeks. Body weight, feed intake, and modulation of liver enzymes were evaluated. Feed intake and body weight were reduced in animals that received T. angelica herbal tonic and Yoyo cleansing bitters at twice the normal dose once and twice daily, whereas the reverse was the case for rats that received Bio-Strath elixir even at higher doses. Liver enzymes were increased at all doses in rats given Bio-Strath elixir, though not significantly (P > 0.05), while those of rats given Yoyo bitters and T. angelica herbal tonic were significantly increased (P < 0.05), especially at higher doses. Our results suggest that doses higher than the manufacturer's recommended dose, taken for a longer duration, can elevate liver enzymes and thus cause abnormal liver function.
Snowmass 2001: Jet Energy Flow Project
Conventional cone jet algorithms arose from heuristic considerations of LO hard scattering coupled to independent showering. These algorithms implicitly assume that the final states of individual events can be mapped onto a unique set of jets that are in turn associated with a unique set of underlying hard scattering partons. Thus each final state hadron is assigned to a unique underlying parton. The Jet Energy Flow (JEF) analysis described here does not make such assumptions. The final states of individual events are instead described in terms of flow distributions of hadronic energy. Quantities of physical interest are constructed from the energy flow distribution summed over all events. The resulting analysis is less sensitive to higher order perturbative corrections and the impact of showering and hadronization than the standard cone algorithms.
Comment: REVTeX4, 13 pages, 6 figures; Contribution to the P5 Working Group on QCD and Strong Interactions at Snowmass 2001
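The core idea above, describing final states by an event-summed energy flow distribution rather than per-event jet assignments, can be sketched numerically. The following toy (not the JEF implementation; the event generator, grid, and spectrum are invented for illustration) deposits hadron energies into an (eta, phi) grid and sums over events:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy energy flow: per event, deposit each hadron's energy into an
# (eta, phi) grid, then sum the grids over all events. The "event
# generator" below is purely illustrative.
n_events = 100
eta_edges = np.linspace(-3.0, 3.0, 61)        # 60 bins, width 0.1
phi_edges = np.linspace(-np.pi, np.pi, 61)
flow = np.zeros((60, 60))
total_energy = 0.0

for _ in range(n_events):
    n_had = rng.integers(10, 50)
    eta = rng.normal(0.0, 1.0, n_had).clip(-2.99, 2.99)
    phi = rng.uniform(-np.pi, np.pi, n_had)
    energy = rng.exponential(5.0, n_had)      # toy spectrum, GeV
    h, _, _ = np.histogram2d(eta, phi, bins=[eta_edges, phi_edges],
                             weights=energy)
    flow += h
    total_energy += energy.sum()

# Physical quantities are then functionals of `flow`, e.g. the energy
# fraction in the central slice |eta| < 1 (bins 20..39 cover [-1, 1]):
central_fraction = flow[20:40, :].sum() / flow.sum()
```

No per-hadron jet assignment is ever made: all observables are computed from `flow` after the event sum, which is what makes the approach less sensitive to how individual hadrons would have been partitioned into jets.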
Use of Artificial Neural Networks for Improvement of CMS Hadron Calorimeter Resolution
The Compact Muon Solenoid (CMS) experiment features an electromagnetic calorimeter (ECAL) composed of lead tungstate crystals and a sampling hadronic calorimeter (HCAL) made of brass and scintillator, along with other detectors. For hadrons, the response of the electromagnetic and hadronic calorimeters is inherently different. Because sampling calorimeters measure a fraction of the energy spread over several measuring towers, the energy resolution as well as the linearity are not easily preserved, especially at low energies. Several sophisticated algorithms have been developed to optimize the resolution of the CMS calorimeter system for single particles. One such algorithm, based on the artificial neural network application to the combined electromagnetic and hadronic calorimeter system, was developed and applied to test beam data using particles in the momentum range of 2-300 GeV/c. The method improves the energy measurement and linearity, especially at low energies below 10 GeV/c.
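The idea of regressing a corrected energy from combined ECAL and HCAL readings can be illustrated with a minimal sketch. This is a toy only, not the CMS algorithm: the response factors (1.0 for ECAL, 0.75 for HCAL), noise levels, and network size are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: true hadron energy split between ECAL and HCAL with
# different (hypothetical) responses and noise.
n = 2000
E_true = rng.uniform(2.0, 100.0, n)
f = rng.uniform(0.1, 0.9, n)                      # fraction seen by ECAL
ecal = 1.00 * f * E_true + rng.normal(0.0, 0.5, n)
hcal = 0.75 * (1.0 - f) * E_true + rng.normal(0.0, 1.0, n)  # under-response

X = np.column_stack([ecal, hcal])
Xs = (X - X.mean(0)) / X.std(0)                   # standardized inputs
ys = (E_true - E_true.mean()) / E_true.std()
y2 = ys[:, None]

# One hidden tanh layer, trained by full-batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, out0 = forward(Xs)
loss0 = float(np.mean((out0 - y2) ** 2))          # loss before training

lr = 0.05
for _ in range(5000):
    h, out = forward(Xs)
    g_out = 2.0 * (out - y2) / n                  # d(MSE)/d(out)
    gW2 = h.T @ g_out;  gb2 = g_out.sum(0)
    g_z = (g_out @ W2.T) * (1.0 - h ** 2)         # back through tanh
    gW1 = Xs.T @ g_z;   gb1 = g_z.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, out1 = forward(Xs)
loss_final = float(np.mean((out1 - y2) ** 2))     # well below loss0
```

The network learns the relative ECAL/HCAL weighting (and any mild nonlinearity) directly from data, which is the essence of using an ANN rather than a fixed calibration constant for each compartment.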
On the presentation of the LHC Higgs Results
We put forth conclusions and suggestions regarding the presentation of the LHC Higgs results that may help to maximize their impact and their utility to the whole High Energy Physics community.
Comment: Conclusions from the workshops "Likelihoods for the LHC Searches", 21-23 January 2013 at CERN, "Implications of the 125 GeV Higgs Boson", 18-22 March 2013 at LPSC Grenoble, and from the 2013 Les Houches "Physics at TeV Colliders" workshop. 16 pages, 3 figures. Version 2: Comment added on the first publication of signal strength likelihoods in digital form by ATLAS
An Evolutionary Paradigm for Dusty Active Galaxies at Low Redshift
We apply methods from Bayesian inferencing and graph theory to a dataset of 102 mid-infrared spectra, and archival data from the optical to the millimeter, to construct an evolutionary paradigm for z<0.4 ultraluminous infrared galaxies (ULIRGs). We propose that the ULIRG lifecycle consists of three phases. The first phase lasts from the initial encounter until approximately coalescence. It is characterized by homogeneous mid-IR spectral shapes, and IR emission mainly from star formation, with a contribution from an AGN in some cases. At the end of this phase, a ULIRG enters one of two evolutionary paths depending on the dynamics of the merger, the available quantities of gas, and the masses of the black holes in the progenitors. On one branch, the contributions from the starburst and the AGN to the total IR luminosity decline and increase respectively. The IR spectral shapes are heterogeneous, likely due to feedback from AGN-driven winds. Some objects go through a brief QSO phase at the end. On the other branch, the decline of the starburst relative to the AGN is less pronounced, and few or no objects go through a QSO phase. We show that the 11.2 micron PAH feature is a remarkably good diagnostic of evolutionary phase, and identify six ULIRGs that may be archetypes of key stages in this lifecycle.
Comment: ApJ accepted. Comments welcome. We suggest reading section 2 before looking at the figures. 26 pages, 21 figures, 1 table
Reference priors for high energy physics
Bayesian inferences in high energy physics often use uniform prior distributions for parameters about which little or no information is available before data are collected. The resulting posterior distributions are therefore sensitive to the choice of parametrization for the problem and may even be improper if this choice is not carefully considered. Here we describe an extensively tested methodology, known as reference analysis, which allows one to construct parametrization-invariant priors that embody the notion of minimal informativeness in a mathematically well-defined sense. We apply this methodology to general cross section measurements and show that it yields sensible results. A recent measurement of the single top quark cross section illustrates the relevant techniques in a realistic situation.
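A toy numeric illustration of the contrast with a flat prior: for a one-parameter Poisson counting experiment the reference prior coincides with the Jeffreys prior, which is parametrization-invariant. All numbers below (observed count, efficiency times luminosity) are assumed for the example, not taken from any real measurement.

```python
import numpy as np

# Toy counting experiment: n ~ Poisson(k * sigma), where k = eps * L
# (efficiency times integrated luminosity, in pb^-1) is assumed known.
n_obs = 12
k = 2.0

sigma = np.linspace(1e-6, 30.0, 200001)   # cross-section grid, in pb
dsigma = sigma[1] - sigma[0]
log_like = n_obs * np.log(k * sigma) - k * sigma

# Jeffreys/reference prior for a Poisson mean: pi(sigma) ~ sigma^(-1/2).
# Unlike a flat prior, it transforms consistently under reparametrization.
log_post = log_like - 0.5 * np.log(sigma)
post = np.exp(log_post - log_post.max())
post /= post.sum() * dsigma               # normalize on the grid

# Analytically the posterior is Gamma(n + 1/2, k), mean (n + 1/2) / k,
# so the numeric grid result can be checked against it.
mean_numeric = (sigma * post).sum() * dsigma
mean_analytic = (n_obs + 0.5) / k
```

With a flat prior the same data would give a Gamma(n + 1, k) posterior instead; the half-unit shift is exactly the kind of parametrization-dependent artifact the reference analysis formalism is designed to control.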
Top Quark Physics at the Tevatron
The discovery of the top quark in 1995, by the CDF and D0 collaborations at the Fermilab Tevatron, marked the dawn of a new era in particle physics. Since then, enormous efforts have been made to study the properties of this remarkable particle, especially its mass and production cross section. In this article, we review the status of top quark physics as studied by the two collaborations using the p-pbar collider data at sqrt(s) = 1.8 TeV. The combined measurement of the top quark mass, m_t = 173.8 +- 5.0 GeV/c^2, makes it known to a fractional precision better than any other quark mass. The production cross sections are measured as sigma(t-tbar) = 7.6 -1.5 +1.8 pb by CDF and sigma(t-tbar) = 5.5 +- 1.8 pb by D0. Further investigations of t-tbar decays and future prospects are briefly discussed.
Comment: 119 pages, 59 figures, 17 tables. Submitted to Int. J. Mod. Phys. A. Fixed some minor errors.
A Bayesian analysis of the solar neutrino problem
We illustrate how the Bayesian approach can be used to provide a simple but powerful way to analyze data from solar neutrino experiments. The data are analyzed assuming that the neutrinos are unaltered during their passage from the Sun to the Earth. We derive quantitative and easily understood information pertaining to the solar neutrino problem.
Development of an exercise intervention for the prevention of musculoskeletal shoulder problems after breast cancer treatment: the Prevention of Shoulder Problems Trial (UK PROSPER)
Background
Musculoskeletal shoulder problems are common after breast cancer treatment. There is some evidence to suggest that early postoperative exercise is safe and may improve shoulder function. We describe the development and delivery of a complex intervention for evaluation within a randomised controlled trial (RCT), designed to target prevention of musculoskeletal shoulder problems after breast cancer surgery (The Prevention of Shoulder Problems Trial; PROSPER).
Methods
A pragmatic, multicentre RCT to compare the clinical and cost-effectiveness of best-practice usual care versus a physiotherapy-led exercise and behavioural support intervention in women at high risk of shoulder problems after breast cancer treatment. PROSPER will recruit 350 women from approximately 15 UK centres, with follow-up at 6 and 12 months. The primary outcome is shoulder function at 12 months; secondary outcomes include postoperative pain, health-related quality of life, adverse events and healthcare resource use. A multi-phased approach was used to develop the PROSPER intervention, which was underpinned by existing evidence and modified for implementation after input from clinical experts and women with breast cancer. The intervention was tested and refined further after qualitative interviews with patients newly diagnosed with breast cancer; a pilot RCT was then conducted at three UK clinical centres.
Discussion
The PROSPER intervention incorporates three main components: shoulder-specific exercises targeting range of movement and strength; general physical activity; and behavioural strategies to encourage adherence and support exercise behaviour. The final PROSPER intervention is fully manualised with clear, documented pathways for clinical assessment, exercise prescription, use of behavioural strategies, and with guidance for treatment of postoperative complications. This paper adheres to TIDieR and CERT recommendations for the transparent, comprehensive and explicit reporting of complex interventions.
Trial registration: International Standard Randomised Controlled Trial Number: ISRCTN 35358984