Estimation of average bioburden values on flexible gastrointestinal endoscopes after clinical use and cleaning: Assessment of the efficiency of cleaning processes
Background: Endoscopy is a vital part of medical diagnostics. Different kinds of flexible endoscopes are used in medicine; they differ between manufacturers and even between models from the same manufacturer, but all share the same basic components. Infections related to flexible endoscopic procedures are caused by either endogenous flora or exogenous microbes. The first major challenge of reprocessing is infection control: most episodes of infection can be traced to procedural errors in cleaning and disinfecting. The second major challenge is to protect personnel and patients from exposure to the liquid biocides used for disinfection. Because endoscopic accessories are complex in nature, attention and adherence to a validated protocol are critical for their reprocessing. Bioburden is defined as the number of bacteria living on a surface that has not been sterilized. The term is most often used in the context of bioburden testing, also known as microbial limit testing, which is performed on pharmaceutical and medical products for quality-control purposes. Flexible endoscopes, by virtue of the types of body cavities they enter, acquire high levels of microbial contamination (bioburden) during each use. Aim of the work: To determine the average bioburden values on different parts of flexible gastrointestinal endoscopes after clinical use and cleaning, in order to assess the efficiency of the different cleaning processes used in the endoscopy unit. Methods: The study included a total of 120 endoscopes randomly selected from the Medical Research Institute (MRI) hospital, 60 (50%) from the Surgical Department endoscopy unit and 60 (50%) from the Internal Medicine Department endoscopy unit. The endoscopes were divided into 40 sampled after use, 40 after manual cleaning, and 40 after high-level disinfection.
All samples were cultured for aerobic and anaerobic bacteria and for Candida species, and the number of colonies was determined as colony-forming units (cfu)/ml. Results: Microorganisms isolated immediately after use were Staphylococcus, Streptococcus, Klebsiella, Escherichia coli, and Bacteroides, whereas after manual cleaning the isolated strains were Staphylococcus, Streptococcus, Pseudomonas, Klebsiella, Bacteroides, and E. coli. The average bioburden on endoscopes before cleaning ranged from 6 × 10^4 to 3.7 × 10^8 cfu per device (mean 1.4 × 10^7 cfu per device), whereas after manual cleaning it ranged from 2.1 × 10^2 to 3.5 × 10^3 cfu per device (mean 4.9 × 10^2 cfu per device); no colonies were found after sterilization. Manual cleaning resulted in a mean 4.46 log10 reduction in viable colony count, and high-level disinfection (HLD) reduced the cfu count to zero. Conclusions: HLD is superior to manual cleaning in the process of endoscope disinfection. Recommendations: Microbiological screening should be undertaken for all Endoscopy Unit personnel responsible for cleaning, or whenever there is clinical suspicion of cross-infection related to endoscopy. All health-care personnel in an endoscopy unit should be trained in standard infection control to reprocess endoscopes. Safe working practices in the decontamination area of each unit should be written down and understood by all staff.
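The reported 4.46 log10 reduction follows directly from the mean cfu-per-device figures in the abstract. A minimal sketch of that arithmetic (the variable names are illustrative, not from the study):

```python
import math

# Mean cfu per device, as reported in the abstract.
mean_before = 1.4e7  # after clinical use, before cleaning
mean_after = 4.9e2   # after manual cleaning

# Log reduction = log10(count before / count after).
log_reduction = math.log10(mean_before / mean_after)
print(f"{log_reduction:.2f}")  # ≈ 4.46, matching the reported value
```

This confirms the abstract's figures are internally consistent: a drop from ~1.4 × 10^7 to ~4.9 × 10^2 cfu per device is a reduction of roughly four and a half orders of magnitude.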
A Review of Rabbit Diseases in Egypt
Promising approaches by Egyptian governmental as well as non-governmental bodies to develop the rabbit industry, as a means of reducing youth unemployment, require more effort from scientific institutes to support that development. Epidemiological studies are of utmost importance to highlight the nature of diseases and to help in the timely implementation of successful preventive and control measures. The aim of this paper is to review the situation of rabbit diseases of economic impact in Egypt (1952 to 2013). The review highlights the viral infection rabbit hemorrhagic disease; the bacterial diseases colibacillosis, clostridiosis, salmonellosis, pasteurellosis, staphylococcosis and listeriosis; and the parasitic infections coccidiosis and mange. Key words: rabbit, disease, bacteria, viral infection
Severe axonal neuropathy is a late manifestation of SPG11
Complex hereditary spastic paraplegia (HSP) is a clinically heterogeneous group of disorders usually inherited in an autosomal recessive manner. In the past, complex recessive spastic paraplegias have frequently been associated with SPG11 mutations, but also with defects in SPG15, SPG7 and a handful of other rare genes. Pleiotropy exists in HSP genes, exemplified by the recent association of SPG11 mutations with CMT2. In this study, we performed whole-exome sequence analysis and identified two siblings with novel compound heterozygous frameshift SPG11 mutations. The mutations segregated with disease, were not present in control databases, and analysis of skin-fibroblast-derived mRNA indicated that the truncated SPG11 mRNA species were not significantly degraded by nonsense-mediated mRNA decay. These siblings had severe early-onset spastic paraplegia but later in their disease developed severe axonal neuropathy, neuropathic pain and blue/black foot discolouration, likely caused by a combination of the severe neuropathy with autonomic dysfunction and peripheral oedema. We also identified a similar late-onset axonal neuropathy in a Cypriot SPG11 family. Although neuropathy is occasionally present in SPG11, in our SPG11 patients reported here it was particularly severe, highlighting the association of axonal neuropathy with SPG11 and the late manifestation of axonal peripheral nerve damage.
Discrimination of low missing energy look-alikes at the LHC
The problem of discriminating possible scenarios of TeV scale new physics
with large missing energy signature at the Large Hadron Collider (LHC) has
received some attention in the recent past. We consider the complementary, and
yet unexplored, case of theories predicting much softer missing energy spectra.
As there is enough scope for such models to fake each other by having similar
final states at the LHC, we have outlined a systematic method based on a
combination of different kinematic features which can be used to distinguish
among different possibilities. These features often trace back to the
underlying mass spectrum and the spins of the new particles present in these
models. As examples of "low missing energy look-alikes", we consider
Supersymmetry with R-parity violation, Universal Extra Dimensions with both
KK-parity conserved and KK-parity violated and the Littlest Higgs model with
T-parity violated by the Wess-Zumino-Witten anomaly term. Through detailed
Monte Carlo analysis of the four and higher lepton final states predicted by
these models, we show that the models in their minimal forms may be
distinguished at the LHC, while non-minimal variations can always leave scope
for further confusion. We find that, for strongly interacting new particle
mass-scale ~600 GeV (1 TeV), the simplest versions of the different theories
can be discriminated at the LHC running at sqrt{s}=14 TeV within an integrated
luminosity of 5 (30) fb^{-1}. Comment: 40 pages, 10 figures; v2: further
discussions, analysis and one figure added, ordering of certain sections
changed, minor modifications in the abstract; version as published in JHEP.
Gluino Decay as a Probe of High Scale Supersymmetry Breaking
A supersymmetric standard model with heavier scalar supersymmetric particles
has many attractive features. If the scalar mass scale is O(10 - 10^4) TeV, the
standard model like Higgs boson with mass around 125 GeV, which is strongly
favored by the LHC experiment, can be realized. However, in this scenario the
scalar particles are too heavy to be produced at the LHC. In addition, if the
scalar mass is much less than O(10^4) TeV, the lifetime of the gluino is too
short to be measured. Therefore, it is hard to probe the scalar particles at a
collider. However, a detailed study of the gluino decay reveals that two body
decay of the gluino carries important information on the scalar scale. In this
paper, we propose a test of this scenario by measuring the decay pattern of the
gluino at the LHC. Comment: 29 pages, 9 figures; version published in JHEP.
Bigger, Better, Faster, More at the LHC
Multijet plus missing energy searches provide universal coverage for theories
that have new colored particles that decay into a dark matter candidate and
jets. These signals appear at the LHC further out on the missing energy tail
than two-to-two scattering indicates. The simplicity of the searches at the LHC
contrasts sharply with the Tevatron where more elaborate searches are necessary
to separate signal from background. The searches presented in this article
effectively distinguish signal from background for any theory where the LSP is
a daughter or granddaughter of the pair-produced colored parent particle
without ever having to consider missing energies less than 400 GeV. Comment:
26 pages, 8 figures. Minor textual changes, typos fixed and references added.
Emergent Gauge Fields in Holographic Superconductors
Holographic superconductors have been studied so far in the absence of
dynamical electromagnetic fields, namely in the limit in which they coincide
with holographic superfluids. It is possible, however, to introduce dynamical
gauge fields if a Neumann-type boundary condition is imposed on the
AdS-boundary. In 3+1 dimensions, the dual theory is a 2+1 dimensional CFT whose
spectrum contains a massless gauge field, signaling the emergence of a gauge
symmetry. We study the impact of a dynamical gauge field in vortex
configurations where it is known to significantly affect the energetics and
phase transitions. We calculate the critical magnetic fields H_c1 and H_c2,
obtaining that holographic superconductors are of Type II (H_c1 < H_c2). We
extend the study to 4+1 dimensions where the gauge field does not appear as an
emergent phenomena, but can be introduced, by a proper renormalization, as an
external dynamical field. We also compare our predictions with those arising
from a Ginzburg-Landau theory and identify the generic properties of Abrikosov
vortices in holographic models. Comment: 19 pages, 14 figures, few comments added; version published in JHEP.
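For context on the Type II classification the abstract reports, the standard Ginzburg-Landau criterion (textbook background, not a result of the paper) distinguishes the two types by the ratio of penetration depth to coherence length:

```latex
\kappa = \frac{\lambda}{\xi}, \qquad
\kappa > \frac{1}{\sqrt{2}} \;\Longrightarrow\; \text{Type II},
\quad \text{with } H_{c1} < H_{c2}.
```

In a Type II superconductor, magnetic flux penetrates as Abrikosov vortices in the mixed phase $H_{c1} < H < H_{c2}$, which is the regime the holographic vortex configurations above probe.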
Dijet signals of the Little Higgs model with T-parity
The Littlest Higgs model with T-parity (LHT), apart from offering a viable
solution to the naturalness problem of the Standard Model, also predicts a set
of new fermions as well as a candidate for dark matter. We explore the
possibility of discovering the heavy T-odd quark Q_H at the LHC in a final
state comprising two hard jets with a large missing transverse momentum. Also
discussed is the role of heavy-flavor tagging. Comment: Changes in text. Some references added.
DALC: Distributed Automatic LSTM Customization for Fine-Grained Traffic Speed Prediction
Over the past decade, several approaches have been introduced for short-term
traffic prediction. However, providing fine-grained traffic prediction for
large-scale transportation networks where numerous detectors are geographically
deployed to collect traffic data is still an open issue. To address this issue,
in this paper, we formulate the problem of customizing an LSTM model for a
single detector into a finite Markov decision process and then introduce an
Automatic LSTM Customization (ALC) algorithm to automatically customize an LSTM
model for a single detector such that the corresponding prediction accuracy can
be as satisfactory as possible and the time consumption can be as low as
possible. Based on the ALC algorithm, we introduce a distributed approach
called Distributed Automatic LSTM Customization (DALC) to customize an LSTM
model for every detector in large-scale transportation networks. Our experiment
demonstrates that the DALC provides higher prediction accuracy than several
approaches provided by Apache Spark MLlib. Comment: 12 pages, 5 figures, the
34th International Conference on Advanced Information Networking and
Applications (AINA 2020), Springer.
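The core idea of the ALC step, as described in the abstract, is to treat per-detector LSTM customization as a search over a finite space of configurations scored by an accuracy/time trade-off. A heavily hedged sketch of that idea (not the authors' algorithm: the hyperparameter grid, the `evaluate` surrogate, and the greedy policy are all placeholder assumptions standing in for actually training an LSTM on one detector's data):

```python
import itertools

# Hypothetical finite state space: each state is one LSTM configuration.
HIDDEN_UNITS = [16, 32, 64]
LOOKBACK = [6, 12, 24]  # number of past time steps fed to the model

def evaluate(state):
    """Placeholder reward: accuracy surrogate minus a time-cost surrogate.

    In a real system this would train an LSTM for one detector and
    measure prediction error and wall-clock time.
    """
    units, lookback = state
    accuracy = 1.0 - 1.0 / (units * lookback)  # fake: bigger model, better fit
    time_cost = 1e-4 * units * lookback        # fake: bigger model, slower
    return accuracy - time_cost

def customize_detector():
    """Pick the best configuration for a single detector.

    A full MDP treatment would move between neighboring configurations
    action by action; with a small finite space, exhaustive greedy
    scoring illustrates the same accuracy-vs-time objective.
    """
    states = list(itertools.product(HIDDEN_UNITS, LOOKBACK))
    return max(states, key=evaluate)

print(customize_detector())
```

The "distributed" part of DALC would then amount to running `customize_detector` independently for every detector in the network, which parallelizes trivially since detectors do not share state in this sketch.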
On measurement of top polarization as a probe of production mechanisms at the LHC
In this note we demonstrate the use of top polarization in the study of resonances at the LHC, in the possible case where the dynamics implies
a non-zero top polarization. As a probe of top polarization we construct an
asymmetry in the decay-lepton azimuthal angle distribution (corresponding to
the sign of ) in the laboratory. The asymmetry is non-vanishing
even for a symmetric collider like the LHC, where a positive axis is not
uniquely defined. The angular distribution of the leptons has the advantage of
being a faithful top-spin analyzer, unaffected by possible anomalous
couplings, to linear order. We study, for purposes of demonstration, the case
of a as might exist in the little Higgs models. We identify kinematic cuts
which ensure that our asymmetry reflects the polarization in sign and
magnitude. We investigate possibilities at the LHC with two energy options:
TeV and TeV, as well as at the Tevatron. At the
LHC the model predicts net top quark polarization of the order of a few per
cent for GeV, being as high as for a smaller mass
of the of GeV and for the largest allowed coupling in the model, the
values being higher for the TeV option. These polarizations translate to a
deviation from the standard-model value of azimuthal asymmetry of up to about
() for () TeV LHC, whereas for the Tevatron, values as high as
are attained. For the TeV LHC with an integrated luminosity of 10
fb, these numbers translate into a sensitivity over a large
part of the range GeV. Comment: 28 pages, LaTeX, requires JHEP style file,
12 figures. Typos corrected and references added.