Corporate Financing in Great Britain
Background: The antifungal compound ketoconazole has, in addition to its ability to interfere with fungal ergosterol synthesis, effects upon other enzymes including human CYP3A4, CYP17, lipoxygenase and thromboxane synthetase. In the present study, we have investigated whether ketoconazole affects the cellular uptake and hydrolysis of the endogenous cannabinoid receptor ligand anandamide (AEA). Methodology/Principal Findings: The effects of ketoconazole upon endocannabinoid uptake were investigated using HepG2, CaCo2, PC-3 and C6 cell lines. Fatty acid amide hydrolase (FAAH) activity was measured in HepG2 cell lysates and in intact C6 cells. Ketoconazole inhibited the uptake of AEA by HepG2 cells and CaCo2 cells with IC50 values of 17 and 18 µM, respectively. In contrast, it had modest effects upon AEA uptake in PC-3 cells, which have a low expression of FAAH. In cell-free HepG2 lysates, ketoconazole inhibited FAAH activity with an IC50 value (for the inhibitable component) of 34 µM. Conclusions/Significance: The present study indicates that ketoconazole can inhibit the cellular uptake of AEA at pharmacologically relevant concentrations, primarily due to its effects upon FAAH. Ketoconazole may be useful as a template for the design of dual-action FAAH/CYP17 inhibitors as a novel strategy for the treatment of prostate cancer
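An IC50 such as the 17 µM value reported above is typically estimated by fitting a sigmoidal concentration-response curve. The sketch below shows one common approach, a four-parameter Hill (logistic) fit; the concentration and response numbers are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, n):
    """Four-parameter logistic: activity remaining vs. inhibitor concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** n)

# Hypothetical dose-response data: % of control uptake at each concentration (µM)
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
resp = np.array([98.0, 90.0, 65.0, 35.0, 12.0, 5.0])

# p0 gives rough starting guesses: bottom, top, IC50, Hill slope
popt, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 20.0, 1.0])
bottom, top, ic50, slope = popt
print(f"fitted IC50 ~ {ic50:.1f} µM")
```

The fitted IC50 is the concentration at which the response sits halfway between the fitted top and bottom plateaus.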
Proposal to Search for Heavy Neutral Leptons at the SPS
A new fixed-target experiment at the CERN SPS accelerator is proposed that
will use decays of charm mesons to search for Heavy Neutral Leptons (HNLs),
which are right-handed partners of the Standard Model neutrinos. The existence
of such particles is strongly motivated by theory, as they can simultaneously
explain the baryon asymmetry of the Universe, account for the pattern of
neutrino masses and oscillations and provide a Dark Matter candidate.
Cosmological constraints on the properties of HNLs now indicate that the
majority of the interesting parameter space for such particles was beyond the
reach of the previous searches at the PS191, BEBC, CHARM, CCFR and NuTeV
experiments. For HNLs with mass below 2 GeV, the proposed experiment will
improve on the sensitivity of previous searches by four orders of magnitude and
will cover a major fraction of the parameter space favoured by theoretical
models.
The experiment requires a 400 GeV proton beam from the SPS with a total of
2×10^20 protons on target, achievable within five years of data taking. The
proposed detector will reconstruct exclusive HNL decays and measure the HNL
mass. The apparatus is based on existing technologies and consists of a target,
a hadron absorber, a muon shield, a decay volume and two magnetic
spectrometers, each of which has a 0.5 Tm magnet, a calorimeter and a muon
detector. The detector has a total length of about 100 m with a 5 m diameter.
The complete experimental set-up could be accommodated in CERN's North Area.
The discovery of an HNL would have a great impact on our understanding of
nature and open a new area for future research
Shotgun Phage Display - Selection for Bacterial Receptins or other Exported Proteins
Shotgun phage display cloning involves construction of libraries from randomly fragmented bacterial chromosomal DNA, cloned genes, or eukaryotic cDNAs, in a phagemid vector. The library obtained consists of phages expressing polypeptides corresponding to all genes encoded by the organism, or overlapping peptides derived from the cloned gene. From such a library, polypeptides with affinity for another molecule can be isolated by affinity selection (panning). The technique can be used not only to identify bacterial receptins and their minimal binding domains, but also to identify epitopes recognised by antibodies. In addition, after modification of the phagemid vector, the technique has also been used to identify bacterial extracytoplasmic proteins
The LHCb experiment control system : on the path to full automation
The experiment control system is in charge of the configuration, control and monitoring of the different subdetectors and of all areas of the online system. The building blocks of the control system are based on the PVSS SCADA system complemented by a control framework developed in common for the 4 LHC experiments. This framework includes an "expert system"-like tool called SMI++ which is used for the system automation. The experiment's operations are now almost completely automated, driven by a top-level object called Big-Brother, which pilots all the experiment's standard procedures and the most common error-recovery procedures. The architecture, tools and mechanisms used for the implementation as well as some operational examples will be described
Prospective Epidemiological Observations on the Course of the Disease in Fibromyalgia Patients
OBJECTIVES: The aim of the study was to carry out a survey in patients with fibromyalgia (FM), to examine their general health status and work incapacity (disability-pension status), and their views on the effectiveness of therapy received, over a two-year observation period. METHODS: 48 patients diagnosed with FM, according to the American College of Rheumatology (ACR) criteria, took part in the study. At baseline, and on average two years later, the patients underwent clinical investigation (dolorimetry, laboratory diagnostics, medical history taking) and completed the Fibromyalgia questionnaire (Dettmer and Chrostek [1]). RESULTS: 27/48 (56%) patients participated in the two-year follow-up. In general, the patients showed no improvement in their symptoms over the observation period, regardless of the type of therapy they had received. General satisfaction with quality of life improved, as did satisfaction regarding health status and the family situation, although the degree of pain experienced remained unchanged. In comparison with the initial examination, there was no change in either work capacity or disability-pension status. CONCLUSIONS: The FM patients showed no improvement in pain, despite the various treatments received over the two-year period. The increase in general satisfaction over the observation period was believed to be the result of patient instruction and education about the disease. To what extent a population of patients with FM would show similar outcomes if they did not receive any instruction/education about their disorder cannot be ascertained from the present study; and, indeed, the undertaking of a study to investigate this would be ethically questionable. At present, no conclusions can be drawn regarding the influence of therapy on the primary and secondary costs associated with FM
Generic and Layered Framework Components for the Control of a Large Scale Data Acquisition System
The complexity of today's experiments in High Energy Physics results in a large number of readout channels, which can reach a million and above. The experiments in general consist of various subsystems which themselves comprise a large number of detectors requiring sophisticated DAQ and readout electronics. We report here on the structured software layers used to control such a data acquisition system for the case of LHCb, which is one of the four experiments at the LHC. Additional focus is given to the protocols in use as well as the required hardware. An abstraction layer was implemented to allow access to the different and distinct hardware types in a coherent and generic manner. The hierarchical structure which allows commands to be propagated down to the subsystems is explained. Via finite state machines, an expert system with auto-recovery abilities can be modeled
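The hierarchical command propagation and state summarization described above can be sketched as a toy state machine. This is an illustrative sketch only, under the assumption of a simple parent/child tree; the names (`Node`, `command`, `status`) are invented and are not the LHCb or SMI++ API.

```python
class Node:
    """One element of a control hierarchy: a subsystem or a parent of subsystems."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.state = "NOT_READY"

    def command(self, cmd):
        # Commands issued at the top propagate down to every subsystem first,
        # then apply to this node itself.
        for child in self.children:
            child.command(cmd)
        self._apply(cmd)

    def _apply(self, cmd):
        # Minimal transition table standing in for a real finite state machine.
        if cmd == "configure" and self.state == "NOT_READY":
            self.state = "READY"
        elif cmd == "start" and self.state == "READY":
            self.state = "RUNNING"

    def status(self):
        # A parent's reported state summarizes its children: any child error
        # propagates up, which is where auto-recovery logic would hook in.
        if any(c.status() == "ERROR" for c in self.children):
            return "ERROR"
        return self.state

# A top-level node piloting two subsystems, in the spirit of the layered design
daq = Node("DAQ", [Node("subdet-A"), Node("subdet-B")])
daq.command("configure")
daq.command("start")
print(daq.status())  # RUNNING
```

A real system adds guarded transitions, asynchronous command delivery, and recovery rules that react to the summarized child states, but the tree-plus-FSM shape is the same.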
Standardization as Institutional Work: The Regulatory Power of a Responsible Investment Standard
This paper conceptualizes standardization as institutional work to study the emergence of a standard and the deployment of its regulatory power. We rely on unique access to longitudinal archival data for exploring how the FTSE4Good index, a responsible investment index, emerged as a standard for socially responsible corporate behavior. Our results show how three types of standardization work - calculative framing, engaging and valorizing - support the design, legitimation and monitoring processes whereby a standard acquires its regulatory power. Our findings reveal new facets in the dynamics of standardization by approaching standardization as a product of institutional work and in showing how unintended consequences of that work can be recaptured to strengthen the regulatory power of the standard