The effect of four-phasic versus three-phasic contrast media injection protocols on extravasation rate in coronary CT angiography: a randomized controlled trial.
OBJECTIVES: Contrast media (CM) extravasation is a well-known complication of CT angiography (CTA). Our prospective randomized controlled study aimed to assess whether a four-phasic CM administration protocol reduces the risk of extravasation compared to the routinely used three-phasic protocol in coronary CTA. METHODS: Patients referred for coronary CTA due to suspected coronary artery disease were included in the study. All patients received iomeprol CM (400 mg/ml) injected with a dual-syringe automated injector. Patients were randomized into a three-phasic injection-protocol group, with a CM bolus of 85 ml followed by 40 ml of a 75%:25% saline/CM mixture and a 30 ml saline chaser bolus, and a four-phasic injection-protocol group, with a saline pacer bolus of 10 ml injected at a lower flow rate before the three-phasic protocol. RESULTS: 2,445 consecutive patients were enrolled (mean age 60.6 +/- 12.1 years; 43.6% female). The overall rate of extravasation was 0.9% (23/2,445): 1.4% (17/1,229) in the three-phasic group and 0.5% (6/1,216) in the four-phasic group (p = 0.034). CONCLUSIONS: The four-phasic CM administration protocol is easy to implement in clinical routine at no extra cost. Compared to the three-phasic protocol, it reduced the extravasation rate by 65% in coronary CTA. KEY POINTS: * A four-phasic CM injection protocol reduces the extravasation rate by 65% compared to a three-phasic protocol. * The saline pacer bolus substantially reduces the risk of CM extravasation. * The four-phasic injection protocol can be implemented at no extra cost.
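The headline arithmetic in this abstract can be checked directly. The sketch below (Python with scipy, not part of the original study) recomputes the per-group extravasation rates and the roughly 65% relative reduction; because the abstract does not state which statistical test produced p = 0.034, both a chi-squared test and Fisher's exact test are shown as assumptions, and their p-values may differ slightly from the reported one.

```python
# Hedged sketch: recompute the rates reported in the abstract
# (17/1,229 vs 6/1,216 extravasations). The paper's exact test is not
# stated here, so chi-squared and Fisher's exact are shown as assumptions.
from scipy.stats import chi2_contingency, fisher_exact

events_3p, n_3p = 17, 1229   # three-phasic protocol group
events_4p, n_4p = 6, 1216    # four-phasic protocol group

rate_3p = events_3p / n_3p               # ~1.4%
rate_4p = events_4p / n_4p               # ~0.5%
relative_reduction = 1 - rate_4p / rate_3p   # ~0.64, i.e. roughly 65%

table = [[events_3p, n_3p - events_3p],
         [events_4p, n_4p - events_4p]]
chi2, p_chi2, _, _ = chi2_contingency(table)
_, p_fisher = fisher_exact(table)

print(f"rates: {rate_3p:.2%} vs {rate_4p:.2%}")
print(f"relative reduction: {relative_reduction:.0%}")
print(f"chi-squared p = {p_chi2:.3f}, Fisher p = {p_fisher:.3f}")
```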
Type 1 Fimbriae, a Colonization Factor of Uropathogenic Escherichia coli, Are Controlled by the Metabolic Sensor CRP-cAMP
Type 1 fimbriae are a crucial factor for the virulence of uropathogenic Escherichia coli during the first steps of infection, mediating adhesion to epithelial cells. They are also required for the subsequent colonization of the tissues and for invasion of the uroepithelium. Here, we studied the role of the specialized signal transduction system CRP-cAMP in the regulation of type 1 fimbriation. Although initially discovered through its role in carbohydrate metabolism, the CRP-cAMP complex controls a major regulatory network in Gram-negative bacteria, encompassing a broad subset of genes spread across different functional categories of the cell. Our results indicate that CRP-cAMP plays a dual role in type 1 fimbriation, affecting both the phase variation process and fimA promoter activity, with an overall repressive outcome on fimbriation. Dissection of the regulatory pathway led us to conclude that CRP-cAMP negatively affects FimB-mediated recombination by an indirect mechanism that requires DNA gyrase activity. Moreover, these studies revealed that CRP-cAMP controls the expression of another global regulator in Gram-negative bacteria, the leucine-responsive regulatory protein Lrp. CRP-cAMP-mediated repression limits the switch from the non-fimbriated to the fimbriated state. Consistently, a drop in the intracellular concentration of cAMP due to altered physiological conditions (e.g. growth in the presence of glucose) increases the percentage of fimbriated cells in the bacterial population. We also provide evidence that the repression of type 1 fimbriae by CRP-cAMP occurs during fast growth (logarithmic phase) and is alleviated during slow growth (stationary phase), which is consistent with an involvement of type 1 fimbriae in the adaptation to stress conditions by promoting biofilm growth or entry into host cells. Our work suggests that the metabolic sensor CRP-cAMP couples the expression of type 1 fimbriae to environmental conditions, thereby also affecting subsequent attachment to and colonization of host tissues.
Numerosity Estimation in Visual Stimuli in the Absence of Luminance-Based Cues
Numerosity estimation is a basic preverbal ability that humans share with many animal species and that is believed to be foundational to numeracy skills. It is notoriously difficult, however, to establish whether numerosity estimation is based on numerosity itself or on one or more non-numerical cues such as, in visual stimuli, spatial extent and density. Frequently, different non-numerical cues are held constant on different trials. This strategy, however, still allows numerosity estimation to be based on a combination of non-numerical cues rather than on any particular one by itself. Here we introduce a novel method, based on second-order (contrast-based) visual motion, to create stimuli that exclude all first-order (luminance-based) cues to numerosity. We show that numerosities can be estimated almost as well in second-order motion as in first-order motion. The results show that numerosity estimation need not be based on first-order spatial filtering, first-order density perception, or any other processing of luminance-based cues to numerosity. Our method can be used as an effective tool to control non-numerical variables in studies of numerosity estimation.
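To make the idea of a stimulus with no luminance-based cues concrete, here is a minimal sketch (Python with NumPy, not the authors' stimulus code) of a contrast-defined, second-order display: items are regions of raised contrast in a zero-mean noise carrier, so mean luminance is identical inside and outside each item. The actual study used second-order motion, which this static example does not implement; the item count, size, and placement below are arbitrary.

```python
# Hedged sketch: items defined purely by local contrast modulation of a
# zero-mean noise carrier, so there is no luminance cue to item locations.
import numpy as np

rng = np.random.default_rng(0)
size, n_items, sigma = 256, 8, 10.0

# Binary noise carrier with zero mean: +1 / -1 with equal probability.
carrier = rng.choice([-1.0, 1.0], size=(size, size))

# Contrast envelope: low baseline contrast everywhere, raised inside
# Gaussian "items" placed at random (hypothetical) locations.
yy, xx = np.mgrid[0:size, 0:size]
envelope = np.full((size, size), 0.1)
for cx, cy in rng.integers(int(3 * sigma), size - int(3 * sigma), size=(n_items, 2)):
    envelope += 0.8 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
envelope = np.clip(envelope, 0.0, 1.0)

# Final image around a constant mean luminance of 0.5: items are visible
# only as regions of higher contrast, not of different luminance.
image = 0.5 + 0.5 * envelope * carrier
print("mean luminance:", image.mean())   # ~0.5 regardless of item count
```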
Effects of Extreme Precipitation to the Distribution of Infectious Diseases in Taiwan, 1994–2008
The incidence of extreme precipitation has increased with the exacerbation of worldwide climate disruption. We hypothesized an association between precipitation and distribution patterns affecting the endemic burden of 8 infectious diseases in Taiwan, including water- and vector-borne infectious diseases. A database integrating daily precipitation and temperature, along with the infectious disease case registry for all 352 townships on the main island of Taiwan, was analysed for the period from 1994 to 2008. Four precipitation levels, <130 mm, 130–200 mm, 200–350 mm and >350 mm, were categorized to represent quantitative differences, and their associations with each specific disease were investigated using a Generalized Additive Mixed Model and afterwards mapped onto a Geographical Information System. Daily precipitation levels were significantly correlated with all 8 mandatorily notified infectious diseases in Taiwan. For water-borne infections, extreme torrential precipitation (>350 mm/day) resulted in the highest relative risk for bacillary dysentery and enterovirus infections when compared to ordinary rain (<130 mm/day). For vector-borne diseases, by contrast, the relative risk of dengue fever and Japanese encephalitis increased with greater precipitation only up to 350 mm. Differential lag effects following precipitation were statistically associated with increased risk of contracting individual infectious diseases. These findings can help the health resource sector better allocate medical resources and prepare for infectious disease outbreaks following future extreme precipitation events.
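As a rough illustration of how categorical precipitation levels translate into relative-risk estimates, the sketch below fits a plain Poisson regression with statsmodels on synthetic data. This is a simplified stand-in, not the study's Generalized Additive Mixed Model: it omits the smooth terms, township-level random effects, and lag structure of the actual analysis, and every number in it is hypothetical.

```python
# Hedged sketch: Poisson GLM relating daily-precipitation category to case
# counts on synthetic data. The real analysis used a GAMM; nothing here
# reproduces the study's data or results.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
levels = ["<130mm", "130-200mm", "200-350mm", ">350mm"]
true_rr = {"<130mm": 1.0, "130-200mm": 1.3, "200-350mm": 1.8, ">350mm": 2.5}  # hypothetical

df = pd.DataFrame({"precip": rng.choice(levels, size=5000)})
df["cases"] = rng.poisson(2.0 * df["precip"].map(true_rr))

# Reference category = ordinary rain (<130 mm/day), as in the abstract.
fit = smf.glm(
    "cases ~ C(precip, Treatment(reference='<130mm'))",
    data=df,
    family=sm.families.Poisson(),
).fit()

# Exponentiated coefficients approximate relative risks vs ordinary rain.
print(np.exp(fit.params))
```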
Guanosine reduces apoptosis and inflammation associated with restoration of function in rats with acute spinal cord injury
Spinal cord injury results in progressive waves of secondary injuries: cascades of noxious pathological mechanisms that substantially exacerbate the primary injury and the resultant permanent functional deficits. Secondary injuries are associated with inflammation, excessive cytokine release, and cell apoptosis. The purine nucleoside guanosine has significant trophic effects, is neuroprotective and antiapoptotic in vitro, and stimulates nerve regeneration. We therefore determined whether systemic administration of guanosine could protect rats from some of the secondary effects of spinal cord injury, thereby reducing neurological deficits. Systemic administration of guanosine (8 mg/kg per day, i.p.) for 14 consecutive days, starting 4 h after moderate spinal cord injury in rats, significantly improved not only motor and sensory functions but also recovery of bladder function. These improvements were associated with a reduction in the inflammatory response to injury, reduced apoptotic cell death, increased sparing of axons, and preservation of myelin. Our data indicate that the therapeutic action of guanosine probably results from reduced inflammation, leading to the protection of axons, oligodendrocytes, and neurons, and from inhibition of apoptotic cell death. These data raise the intriguing possibility that guanosine may also reduce secondary pathological events and thus improve functional outcome after traumatic spinal cord injury in humans.
Northern Eurasia Future Initiative (NEFI): facing the challenges and pathways of global change in the 21st century
During the past several decades, the Earth system has changed significantly, especially across Northern Eurasia. Changes in the socio-economic conditions of the larger countries in the region have also resulted in a variety of regional environmental changes that can have global consequences. The Northern Eurasia Future Initiative (NEFI) has been designed as an essential continuation of the Northern Eurasia Earth Science Partnership Initiative (NEESPI), which was launched in 2004. NEESPI sought to elucidate all aspects of ongoing environmental change, to inform societies and, thus, to better prepare them for future developments. A key principle of NEFI is that these developments must now be secured through science-based strategies co-designed with regional decision makers to lead their societies to prosperity in the face of environmental and institutional challenges. NEESPI scientific research, data, and models have created a solid knowledge base to support the NEFI program. This paper presents the NEFI research vision consensus based on that knowledge. It provides the reader with samples of recent accomplishments in regional studies and formulates new NEFI science questions. To address these questions, nine research foci are identified and their selection is briefly justified. These foci include: warming of the Arctic; changing frequency, pattern, and intensity of extreme and inclement environmental conditions; retreat of the cryosphere; changes in terrestrial water cycles; changes in the biosphere; pressures on land use; changes in infrastructure; societal actions in response to environmental change; and quantification of Northern Eurasia's role in the global Earth system. Powerful feedbacks between the Earth and human systems in Northern Eurasia (e.g., mega-fires, droughts, depletion of the cryosphere essential for water supply, retreat of sea ice) result from past and current human activities (e.g., large-scale water withdrawals, land use and governance change) and potentially restrict or provide new opportunities for future human activities. We therefore propose that Integrated Assessment Models are needed as the final stage of global change assessment. The overarching goal of this NEFI modeling effort is to enable evaluation of economic decisions in response to changing environmental conditions and to justify mitigation and adaptation efforts.
Endocrinologic, neurologic, and visual morbidity after treatment for craniopharyngioma
Craniopharyngiomas are locally aggressive tumors, typically centered in the sellar and suprasellar region near a number of critical neural and vascular structures mediating endocrinologic, behavioral, and visual functions. The present study aims to summarize and compare the published literature regarding morbidity resulting from treatment of craniopharyngioma. We performed a comprehensive search of the published English-language literature to identify studies reporting outcome data for patients undergoing surgery for craniopharyngioma. Comparisons of the rates of endocrine, vascular, neurological, and visual complications were performed using Pearson's chi-squared test, and covariates of interest were fitted into a multivariate logistic regression model. In our data set, 540 patients underwent surgical resection of their tumor, and 138 patients received biopsy alone followed by some form of radiotherapy. Mean overall follow-up for all patients in these studies was 54 ± 1.8 months. The overall rate of new endocrinopathy for all patients undergoing surgical resection of their mass was 37% (95% CI = 33–41). Patients receiving gross total resection (GTR) had over 2.5 times the rate of developing at least one endocrinopathy compared to patients receiving subtotal resection (STR) alone or STR plus radiotherapy (STR + XRT) (52 vs. 19 vs. 20%, χ² P < 0.00001). On multivariate analysis, GTR conferred a significant increase in the risk of endocrinopathy compared to STR + XRT (OR = 3.45, 95% CI = 2.05–5.81, P < 0.00001) after controlling for study size and the presence of significant hypothalamic involvement. There was a statistical trend towards worse visual outcomes in patients receiving XRT after STR compared to GTR or STR alone (GTR 3.5% vs. STR 2.1% vs. STR + XRT 6.4%, P = 0.11). Given the difficulty of obtaining class 1 data regarding the treatment of this tumor, this study can serve as an estimate of expected outcomes for these patients and guide decision making until such data are available.
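The group comparison described above is a 3 x 2 contingency-table test. The sketch below (Python with scipy) shows the form of that Pearson chi-squared comparison; the per-group patient counts are hypothetical placeholders chosen only to roughly match the reported percentages (52%, 19%, 20%) and are not the study's data.

```python
# Hedged sketch of the 3 x 2 chi-squared comparison (new endocrinopathy
# after GTR vs STR vs STR + XRT). All counts below are hypothetical
# placeholders, NOT the study's data.
from scipy.stats import chi2_contingency

groups = {
    "GTR":       (156, 300),  # (patients with new endocrinopathy, group size) - hypothetical
    "STR":       (19, 100),
    "STR + XRT": (28, 140),
}

table = [[events, n - events] for events, n in groups.values()]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```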
Criteria for the use of omics-based predictors in clinical trials: Explanation and elaboration
High-throughput 'omics' technologies that generate molecular profiles for biospecimens have been extensively used in preclinical studies to reveal molecular subtypes and elucidate the biological mechanisms of disease, and in retrospective studies on clinical specimens to develop mathematical models to predict clinical endpoints. Nevertheless, the translation of these technologies into clinical tests that are useful for guiding management decisions for patients has been relatively slow. It can be difficult to determine when the body of evidence for an omics-based test is sufficiently comprehensive and reliable to support claims that it is ready for clinical use, or even that it is ready for definitive evaluation in a clinical trial in which it may be used to direct patient therapy. Reasons for this difficulty include the exploratory and retrospective nature of many of these studies, the complexity of these assays and their application to clinical specimens, and the many potential pitfalls inherent in the development of mathematical predictor models from the very high-dimensional data generated by these omics technologies. Here we present a checklist of criteria to consider when evaluating the body of evidence supporting the clinical use of a predictor to guide patient therapy. Included are issues pertaining to specimen and assay requirements, the soundness of the process for developing predictor models, expectations regarding clinical study design and conduct, and attention to regulatory, ethical, and legal issues. The proposed checklist should serve as a useful guide to investigators preparing proposals for studies involving the use of omics-based tests. The US National Cancer Institute plans to refer to these guidelines for review of proposals for studies involving omics tests, and it is hoped that other sponsors will adopt the checklist as well. © 2013 McShane et al.; licensee BioMed Central Ltd
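One of the pitfalls the checklist addresses, overly optimistic performance estimates when predictor development touches the evaluation data, can be illustrated with a short sketch (Python with scikit-learn, not part of the paper): keeping feature selection inside a cross-validation pipeline yields an honest estimate, which on the random high-dimensional data below hovers near chance.

```python
# Hedged sketch: avoid optimistic bias by doing feature selection inside
# each cross-validation fold rather than on the full data set beforehand.
# The data are pure noise, so an honest accuracy estimate is ~0.5.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5000))    # 100 specimens, 5,000 omics features (random)
y = rng.integers(0, 2, size=100)    # random binary "clinical endpoint"

pipe = Pipeline([
    ("select", SelectKBest(f_classif, k=50)),   # selection refit within each fold
    ("clf", LogisticRegression(max_iter=1000)),
])
scores = cross_val_score(pipe, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())  # ~0.5 on random data
```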