Quality of life in Type 1 (insulin-dependent) diabetic patients prior to and after pancreas and kidney transplantation in relation to organ function
Improvement of the quality of life in Type 1 (insulin-dependent) diabetic patients with severe late complications is one of the main goals of pancreas and/or kidney grafting. To assess the influence of these treatment modalities on different aspects of quality of life, a cross-sectional study of 157 patients was conducted. They were categorized into patients pre-transplant without dialysis (n=29; Group A), pre-transplant under dialysis (n=44; Group B), post-transplant with pancreas and kidney functioning (n=31; Group C), post-transplant with a functioning kidney but on insulin therapy (n=29; Group D), post-transplant back under dialysis and insulin therapy (n=15; Group E), and patients after single pancreas transplantation and rejection, with good renal function but on insulin therapy (n=9; Group F). All patients answered a mailed, self-administered questionnaire (217 questions) covering a broad spectrum of rehabilitation criteria. The results indicate a better quality of life in Groups C and D compared with the other groups. In general, scores were highest in Group C, but without any significant difference from Group D. Marked, significant differences between Groups C or D and the other groups were found especially in satisfaction with physical capacity, leisure-time activities and overall quality of life. Satisfaction with the latter was highest in Group C (mean±SEM: 4.0±0.2 on a 1-to-5 rating scale; significantly different from A: 3.1±0.1, B: 2.7±0.2 and E: 2.6±0.3; p<0.01), followed by Group D (3.8±0.2; significantly different from B and E; p<0.01). Group F showed a mean of 3.1±0.4, which is not significantly different from C. The percentages of patients not working in each group (A: 38%, B: 64%, C: 74%, D: 66%, E: 87%, F: 78%) indicate that there is no marked improvement in the vocational situation after successful grafting.
A feasibility study incorporating a pilot randomised controlled trial of oral feeding plus pre-treatment gastrostomy tube versus oral feeding plus as-needed nasogastric tube feeding in patients undergoing chemoradiation for head and neck cancer (TUBE trial): study protocol
Background
There are 7000 new cases of head and neck squamous cell cancer (HNSCC) treated by the NHS each year. Stage III and IV HNSCC can be treated non-surgically by radiotherapy (RT) or chemoradiation therapy (CRT). CRT can affect eating and drinking through a range of side effects, with 90% of patients undergoing this treatment requiring nutritional support via gastrostomy (G) or nasogastric (NG) tube feeding.
Long-term dysphagia following CRT is a primary concern for patients. The effect of enteral feeding routes on swallowing function is not well understood, and the two feeding methods have, to date, not been compared to assess which leads to a better patient outcome.
The purpose of this study is to explore the feasibility of conducting a randomised controlled trial (RCT) comparing these two options with particular emphasis on patient willingness to be randomised and clinician willingness to approach eligible patients.
Methods/design
This is a mixed-methods multicentre study to establish the feasibility of a randomised controlled trial comparing oral feeding plus pre-treatment gastrostomy versus oral feeding plus as-needed nasogastric tube feeding in patients with HNSCC. A total of 60 participants will be randomised to the two arms of the study (1:1 ratio). The primary outcome of feasibility is a composite of recruitment (willingness to randomise and be randomised) and retention. A qualitative process evaluation investigating the experiences of patients, family and friends, and staff during trial participation will also be conducted, alongside an economic modelling exercise to synthesise available evidence and provide estimates of cost-effectiveness and value of information. Participants will be assessed at baseline (pre-randomisation), weekly during CRT, and at 3 and 6 months.
Discussion
Clinicians are in equipoise over the enteral feeding options for patients being treated with CRT. Swallowing outcomes have been identified as a top priority for patients following treatment, and this trial would inform a future larger-scale RCT in this area to establish best practice.
Targeting the hypoxic fraction of tumours using hypoxia activated prodrugs
The presence within most tumours of a microenvironment containing regions of low oxygen tension, or hypoxia, has profound biological and therapeutic implications. Tumour hypoxia is known to promote the development of an aggressive phenotype and resistance to both chemotherapy and radiotherapy, and is strongly associated with poor clinical outcome. Paradoxically, it is also recognised as a high-priority therapeutic target, and one strategy designed to eradicate hypoxic cells in tumours is a group of compounds known collectively as hypoxia-activated prodrugs (HAPs) or bioreductive drugs. These drugs are inactive prodrugs that require enzymatic activation (typically by one- or two-electron oxidoreductases) to generate cytotoxic species, with selectivity for hypoxic cells determined by (i) the ability of oxygen to either reverse or inhibit the activation process and (ii) the presence of elevated expression of oxidoreductases in tumours. The concepts underpinning HAP development were established over 40 years ago and have been refined over the years to produce a new generation of HAPs that are under preclinical and clinical development. The purpose of this article is to describe current progress in the development of HAPs, focusing on the mechanisms of action, preclinical properties and clinical progress of leading examples.
Developments in the Photonic Theory of Fluorescence
Conventional fluorescence commonly arises when excited molecules relax to their ground electronic state and most of the surplus energy dissipates in the form of photon emission. The consolidation and full development of theory based on this concept has paved the way for the discovery of several mechanistic variants that can come into play when laser input is involved, most notably the phenomenon of multiphoton-induced fluorescence. However, other effects can become apparent when off-resonant laser input is applied during the lifetime of the initial excited state; examples include a recently identified scheme for laser-controlled fluorescence. Other systems of interest are those in which fluorescence is emitted from a set of two or more coupled nanoemitters. This chapter develops a quantum theoretical outlook to identify and describe these processes, leading to a discussion of potential applications ranging from all-optical switching to the generation of optical vortices.
Acquired immunologic tolerance: with particular reference to transplantation
The first unequivocally successful bone marrow cell transplantation in humans was recorded in 1968 by the University of Minnesota team of Robert A. Good (Gatti et al. Lancet 2: 1366–1369, 1968). This achievement was a direct extension of mouse models of acquired immunologic tolerance that had been established 15 years earlier. In contrast, organ (i.e. kidney) transplantation was accomplished precociously in humans (in 1959), before its feasibility had been demonstrated in any experimental model and in the absence of a defensible immunologic rationale. Because of the striking differences between the outcomes of the two kinds of procedure, the mechanisms of organ engraftment were long thought to differ from the leukocyte chimerism-associated mechanisms of bone marrow transplantation. This and other concepts of alloengraftment and acquired tolerance have changed over time. Current concepts and their clinical implications can best be understood and discussed from the perspective provided by the life and times of Bob Good.
History of clinical transplantation
The emergence of transplantation has seen the development of increasingly potent immunosuppressive agents, progressively better methods of tissue and organ preservation, refinements in histocompatibility matching, and numerous innovations in surgical techniques. Such efforts in combination ultimately made it possible to successfully engraft all of the organs and bone marrow cells in humans. At a more fundamental level, however, the transplantation enterprise hinged on two seminal turning points. The first was the recognition by Billingham, Brent, and Medawar in 1953 that it was possible to induce chimerism-associated neonatal tolerance deliberately. This discovery escalated over the next 15 years to the first successful bone marrow transplantations in humans in 1968. The second turning point was the demonstration during the early 1960s that canine and human organ allografts could self-induce tolerance with the aid of immunosuppression. By the end of 1962, however, it had been incorrectly concluded that the two turning points involved different immune mechanisms. The error was not corrected until well into the 1990s. In this historical account, the vast literature that sprang up during the intervening 30 years is summarized. Although it admirably documented empiric progress in clinical transplantation, its failure to explain organ allograft acceptance predestined organ recipients to lifetime immunosuppression and precluded fundamental changes in treatment policies. After it was discovered in 1992 that long-surviving organ transplant recipients had persistent microchimerism, it became possible to see the mechanistic commonality of organ and bone marrow transplantation. A clarifying central principle of immunology could then be synthesized with which to guide efforts to induce tolerance systematically to human tissues and perhaps ultimately to xenografts.
Fast, Multiphase Volume Adaptation to Hyperosmotic Shock by Escherichia coli
All living cells employ an array of different mechanisms to help them survive changes in extracellular osmotic pressure. The difference in the concentration of chemicals between a bacterium's cytoplasm and the external environment generates an osmotic pressure that inflates the cell. The bacterium Escherichia coli is thought to use a number of interconnected systems to adapt to changes in external pressure, allowing it to maintain turgor and live in surroundings whose external osmolality ranges over more than two-hundred-fold. Here, we use fluorescence imaging to make the first measurements of cell volume changes over time during hyperosmotic shock and subsequent adaptation at the single-cell level in vivo, with a time resolution on the order of seconds. We directly observe two previously unseen phases of cytoplasmic water efflux upon hyperosmotic shock. Furthermore, we monitor cell volume changes during post-shock recovery and observe a two-phase response that depends on the shock magnitude. The initial phase of recovery is fast, on the order of 15–20 min, and shows little cell-to-cell variation. For large sucrose shocks, a secondary phase that lasts several hours adds to the recovery. We find that cells are able to recover fully from shocks as high as 1 Osmol/kg using existing systems, but that for larger shocks protein synthesis is required for full recovery.
Chemoradiation for advanced hypopharyngeal carcinoma: a retrospective study on efficacy, morbidity and quality of life
Chemoradiation (CRT) is a valuable treatment option for advanced hypopharyngeal squamous cell cancer (HSCC). However, long-term toxicity and quality of life (QOL) are scarcely reported. Therefore, the efficacy, acute and long-term toxic effects, and long-term QOL of CRT for advanced HSCC were evaluated in a tertiary hospital setting, using a retrospective study and post-treatment quality-of-life questionnaires. Analysis was performed of 73 patients who had been treated with CRT. Toxicity was rated using the CTCAE score list. The QOL questionnaires EORTC QLQ-C30, QLQ-H&N35, and VHI were analyzed. The most common acute toxic effects were dysphagia and mucositis; dysphagia and xerostomia remained problematic during long-term follow-up. After 3 years, disease-specific survival was 41%, local disease control was 71%, and regional disease control was 97%. The results indicate that CRT for advanced HSCC is associated with high locoregional control and disease-specific survival. However, significant acute and long-term toxic effects occur, and organ preservation is not necessarily equivalent to preservation of function and better QOL.
Allergens induce enhanced bronchoconstriction and leukotriene production in C5 deficient mice
BACKGROUND: Previous genetic analysis has shown that a deletion in the complement component 5 (C5) gene-coding region renders mice more susceptible to allergen-induced airway hyperresponsiveness (AHR) due to reduced IL-12 production. We investigated the role of complement in a murine model of asthma-like pulmonary inflammation. METHODS: In order to evaluate the role of complement, B10 mice either sufficient or deficient in C5 were studied. Both groups of mice were immunized and challenged with a house dust extract (HDE) containing high levels of cockroach allergens. Airway hyperreactivity was determined with whole-body plethysmography. Bronchoalveolar lavage (BAL) was performed to determine pulmonary cellular recruitment and to measure inflammatory mediators. Lung homogenates were assayed for mediators, and plasma levels of IgE were determined. Pulmonary histology was also evaluated. RESULTS: C5-deficient mice showed enhanced AHR to methacholine challenge (474% versus 91% increase above baseline Penh in C5-deficient and C5-sufficient mice, respectively; p < 0.001). IL-12 levels in the lung homogenate (LH) were only slightly reduced, and BAL IL-12 was comparable in C5-sufficient and C5-deficient mice. However, C5-deficient mice had significantly higher cysteinyl-leukotriene levels in the BAL fluid (1913 ± 246 pg/ml in C5-deficient versus 756 ± 232 pg/ml in C5-sufficient mice; p = 0.003). CONCLUSION: These data demonstrate that C5-deficient mice show enhanced AHR due to increased production of cysteinyl-leukotrienes.
