
    Ischemia monitoring in off-pump coronary artery bypass surgery using intravascular near-infrared spectroscopy

    BACKGROUND: In off-pump coronary artery bypass surgery, manipulations on the beating heart can lead to transient interruptions of myocardial oxygen supply, which can generate an accumulation of oxygen-dependent metabolites in coronary venous blood. The objective of this study was to evaluate the reliability of intravascular near-infrared spectroscopy as a monitoring method to detect possible ischemic events in off-pump coronary artery bypass procedures. METHODS: In 15 elective patients undergoing off-pump myocardial revascularization, intravascular near-infrared spectroscopic analysis of coronary venous blood was performed. NIR signals were transferred through a fiberoptic catheter for signal emission and collection. For data analysis and processing, a miniature spectrophotometer with a multivariate statistical package was used. Signal acquisition and analysis were performed before and after revascularization. Spectroscopic data were compared with hemodynamic parameters, electrocardiogram, transesophageal echocardiography and laboratory findings. RESULTS: Conversion to extracorporeal circulation was not necessary in any patient. The mean number of grafts per patient was 3.1 ± 0.6. Intraoperative myocardial ischemia was not evident, as indicated by electrocardiogram and transesophageal echocardiography. Continuous spectroscopic analysis showed reproducible absorption spectra of coronary sinus blood. Because the intraoperative courses were uneventful, clear ischemia-related changes were not detected in any patient. CONCLUSION: Our initial results show that intravascular near-infrared spectroscopy can reliably be used for online intraoperative ischemia monitoring in off-pump coronary artery bypass surgery. However, the method has to be further evaluated and standardized to determine the role of spectroscopy in off-pump coronary artery bypass surgery.

    Influence of genotyping error in linkage mapping for complex traits – an analytic study

    BACKGROUND: Despite the current trend towards large epidemiological studies of unrelated individuals, linkage studies in families are still widely utilized as tools for disease gene mapping. The use of single-nucleotide polymorphism (SNP) array technology in genotyping of family data has the potential to provide more informative linkage data. Nevertheless, SNP array data are not immune to genotyping error which, as has been suggested in the past, can dramatically affect the evidence for linkage, especially in selective designs such as affected sib pair (ASP) designs. The influence of genotyping error on selective designs for continuous traits has not yet been assessed. RESULTS: We use the identity-by-descent (IBD) regression-based paradigm for linkage testing to analytically quantify the effect of simple genotyping error models under specific selection schemes for sibling pairs. We show, for example, that in extremely concordant (EC) designs genotyping error leads to decreased power, whereas it leads to increased type I error in extremely discordant (ED) designs. Perhaps surprisingly, the effect of genotyping error on inference is most severe in designs where selection is least extreme. We suggest a genomic control for genotyping errors via a simple modification of the intercept in the regression for linkage. CONCLUSION: This study extends earlier findings: genotyping error can substantially affect type I error and power in selective designs for continuous traits. Designs involving both EC and ED sib pairs are fairly immune to genotyping error. When those designs are not feasible, the simple genomic control strategy that we suggest offers the potential to deliver more robust inference, especially if genotyping is carried out with SNP array technology.
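
    The regression-based linkage paradigm referred to above can be illustrated with a Haseman-Elston-style fit: squared sib-pair trait differences are regressed on estimated IBD sharing, and linkage shows up as a negative slope. The sketch below uses simulated data, and the attenuation adjustment at the end is purely illustrative; it is not the authors' genomic-control modification of the intercept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: per sib pair, an estimated proportion of alleles shared
# identical-by-descent (IBD) at a marker and the squared trait difference.
n_pairs = 500
ibd_hat = rng.choice([0.0, 0.5, 1.0], size=n_pairs, p=[0.25, 0.5, 0.25])
sq_diff = 2.0 - 0.8 * ibd_hat + rng.normal(0, 1, n_pairs)  # linkage => negative slope

# Haseman-Elston-style regression of squared differences on IBD sharing.
X = np.column_stack([np.ones(n_pairs), ibd_hat])
(intercept, slope), *_ = np.linalg.lstsq(X, sq_diff, rcond=None)
print(f"intercept={intercept:.3f}  slope={slope:.3f}  (negative slope suggests linkage)")

# Genotyping error attenuates the estimated IBD sharing; a crude rescaling of
# the regressor before refitting (illustrative only, NOT the authors' adjustment).
error_rate = 0.02                                   # assumed per-genotype error rate
ibd_corrected = (ibd_hat - error_rate) / (1 - 2 * error_rate)
Xc = np.column_stack([np.ones(n_pairs), ibd_corrected])
beta_c, *_ = np.linalg.lstsq(Xc, sq_diff, rcond=None)
print(f"slope after crude correction={beta_c[1]:.3f}")
```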

    Long-term all-sites cancer mortality time trends in Ohio, USA, 1970–2001: differences by race, gender and age

    BACKGROUND: There have been significant changes in cancer mortality in the USA over the last several decades, both in the whole country and in particular states. However, no in-depth analysis dealing with changes in mortality time trends in the state of Ohio has been published so far. Since Ohio is among the states with a relatively high level of all-sites mortality in both males and females, it is of interest to analyze recent changes in mortality rates and to compare them with the situation in the rest of the USA. The main aim of this study was to analyze, describe and interpret all-sites cancer mortality time trends in the population of the State of Ohio. METHODS: Cancer mortality data by age, sex, race and year for the period 1970–2001 were obtained from the Surveillance Research Program of the National Cancer Institute SEER*Stat software. A joinpoint regression methodology was used to provide estimated annual percentage changes (EAPCs) and to detect points in time where significant changes in the trends occurred. RESULTS: In both males and females, mortality rates were higher in blacks than in whites. The difference was larger in males (39.9%) than in females (23.3%). Mortality rates in Ohio are generally higher than average USA rates; in 1997–2001 the overall difference was 7.5% in men and 6.1% in women. All-sites mortality trends in Ohio and in the whole USA are similar. However, mortality rates in Ohio generally remained elevated compared with USA rates throughout the entire analyzed period, the exceptions being the rates in young and middle-aged African Americans. CONCLUSION: Although the direction of time trends is similar in Ohio and the whole USA, Ohio still has cancer mortality rates higher than the US average. In addition, there is a significant discrepancy in all-sites mortality between the white and black populations of Ohio, to the disadvantage of blacks. To diminish disparities in cancer mortality between African American and white inhabitants of Ohio, efforts should focus on increasing knowledge among African Americans regarding healthy lifestyles and behavioral risk factors, but also on diminishing socioeconomic differences and, last but not least, on improving access to medical care.
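
    The estimated annual percentage change (EAPC) produced by joinpoint regression is, within each segment, obtained from a log-linear fit of the rate on calendar year. A minimal sketch with invented rates (not the Ohio SEER data) is shown below.

```python
import numpy as np

# Illustrative age-adjusted mortality rates per 100,000 (not the Ohio data).
years = np.arange(1990, 2002)
rates = np.array([210.0, 208.5, 207.1, 205.0, 203.2, 201.0,
                  199.5, 197.0, 195.2, 193.0, 191.1, 189.0])

# Log-linear model within one joinpoint segment: log(rate) = a + b * year,
# and EAPC = 100 * (exp(b) - 1).
b, a = np.polyfit(years, np.log(rates), deg=1)
eapc = 100.0 * (np.exp(b) - 1.0)
print(f"EAPC over the segment: {eapc:.2f}% per year")
```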

    Using clinical risk factors and bone mineral density to determine who among patients undergoing bone densitometry should have vertebral fracture assessment

    Vertebral fracture assessment (VFA) is a new method for imaging the thoracolumbar spine on a bone densitometer. Among patients referred for bone densitometry, the selection of patients for VFA testing can be optimized using an index derived from clinical risk factors and bone density measurement. VFA was developed because vertebral fractures, although common and predictive of future fractures, are often not clinically diagnosed. The study objective was to develop a strategy for selecting patients for VFA. A convenience sample from a university hospital bone densitometry center included 892 subjects (795 women) referred for bone mineral density (BMD) testing. We used questionnaires to capture clinical risk factors and dual-energy X-ray absorptiometry to obtain BMD and VFA. The prevalence of vertebral fractures was 18% in women and 31% in men (p = 0.003 for the gender difference). In women, age, height loss, glucocorticoid use, history of vertebral and other fractures, and BMD T-score were significantly and independently associated with vertebral fractures. A multivariate model which included the above predictors had an area under the receiver operating characteristic curve of 0.85, with a 95% confidence interval (CI) of 0.81 to 0.89. A risk factor index was derived from this multivariate model. Using a level of 2 as a cut-off yielded 93% sensitivity (95% CI 87, 96) and 48% specificity (95% CI 69, 83). Assuming a 15% prevalence of vertebral fractures, this cut-off value had a 24% positive and 97% negative predictive value and required VFA scanning of three women, at a cost of $60 (assuming a $20 cost per VFA scan), to detect one woman with vertebral fracture(s). Selecting patients for VFA can be optimized using an index derived from BMD measurement and easily obtained clinical risk factors.
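
    The predictive values quoted above follow from the reported sensitivity and specificity and the assumed 15% prevalence via Bayes' rule; the short check below reproduces the 24% positive and 97% negative predictive values.

```python
# Values quoted in the abstract; prevalence is the assumed 15%.
sens, spec, prev = 0.93, 0.48, 0.15

# Bayes' rule for predictive values.
ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
print(f"positive predictive value ≈ {ppv:.0%}")   # ≈ 24%, as reported
print(f"negative predictive value ≈ {npv:.0%}")   # ≈ 97%, as reported
```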

    Osteopenia: A Diagnostic and Therapeutic Challenge

    We discussed whether we are able to select a subgroup of patients with osteopenia who have a high fracture risk and in whom anti-osteoporotic drug treatment can be advocated. We concluded that in individuals in whom dual-energy x-ray absorptiometry (DXA), performed on the basis of clinical risk factors, shows osteopenia, anti-osteoporotic treatment should be prescribed to those with prevalent vertebral fractures and to patients chronically using glucocorticoids at a dosage of 7.5 mg per day or more. Although recent developments in high-resolution imaging techniques (e.g., peripheral quantitative computed tomography) seem promising, so far they do not provide substantially more reliable information than DXA in the prediction of fractures. We think that more data are urgently needed, since safe and effective drugs are available but there is uncertainty as to which patients with osteopenia these drugs should be prescribed.

    Comparison of four mathematical models to analyze indicator-dilution curves in the coronary circulation

    While several models have proven to yield accurate estimates when measuring cardiac output using indicator dilution, the mono-exponential model has primarily been chosen for deriving coronary blood/plasma volume. In this study, we compared four models for deriving coronary plasma volume using indicator dilution: the mono-exponential, power-law, gamma-variate, and local density random walk (LDRW) models. In anesthetized goats (N = 14), we determined the distribution volume of high molecular weight (2,000 kDa) dextrans. A bolus injection (1.0 ml, 0.65 mg/ml) was given intracoronarily and coronary venous blood samples were taken every 0.5–1.0 s; outflow curves were analyzed using the four aforementioned models. Measurements were done at baseline and during adenosine infusion. Absolute coronary plasma volume estimates varied by ~25% between models, while the relative volume increase during adenosine infusion was similar for all models. The gamma-variate, LDRW, and mono-exponential models resulted in volumes corresponding with the literature, whereas the power-law model seemed to overestimate the coronary plasma volume. The gamma-variate and LDRW models appear to be suitable alternatives to the mono-exponential model for analyzing coronary indicator-dilution curves, particularly since these models are minimally influenced by outliers and do not depend solely on data from the descending slope of the curve.
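
    As a rough illustration of one of the compared approaches, the sketch below fits a mono-exponential to the descending limb of a synthetic indicator-dilution curve and derives a distribution volume from the mean transit time via the Stewart-Hamilton relation V = F · MTT. The curve shape, fitting window, and plasma flow value are assumptions for illustration, not the goat data or the authors' exact fitting procedure.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import curve_fit

# Synthetic coronary venous outflow curve (indicator concentration vs. time);
# shape, noise level, sampling grid and flow are illustrative assumptions.
t = np.arange(0.0, 40.0, 0.5)                        # s (0.5-s sampling)
conc = 0.8 * (t / 6.0) ** 2 * np.exp(-t / 6.0)
conc = conc + np.random.default_rng(1).normal(0.0, 0.005, t.size)

def mono_exp(x, c0, k):
    """Mono-exponential descending limb, used to extrapolate past recirculation."""
    return c0 * np.exp(-k * x)

tail = (t > 12.0) & (t < 25.0)                       # assumed fitting window
(c0, k), _ = curve_fit(mono_exp, t[tail], conc[tail], p0=(1.0, 0.1))

# Replace the late part of the curve by the fitted exponential, then compute
# mean transit time (MTT) and volume via the Stewart-Hamilton relation V = F * MTT.
c_fit = np.where(t <= 12.0, conc, mono_exp(t, c0, k))
mtt = trapezoid(t * c_fit, t) / trapezoid(c_fit, t)  # s
flow = 1.2                                           # assumed coronary plasma flow, ml/s
print(f"MTT ≈ {mtt:.1f} s, distribution volume ≈ {flow * mtt:.1f} ml")
```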

    After-hours colorectal surgery: a risk factor for anastomotic leakage

    __Purpose:__ This study aims to increase knowledge of colorectal anastomotic leakage by performing an incidence study and a risk factor analysis including new potential risk factors in a Dutch tertiary referral center. __Methods:__ All patients who received a primary colorectal anastomosis between 1997 and 2007 were selected by means of operation codes. Patient records were studied for population description and risk factor analysis. __Results:__ In total, 739 patients were included. Anastomotic leakage (AL) occurred in 64 (8.7%) patients, of whom nine (14.1%) died. The median interval between operation and diagnosis was 8 days. The risk of AL was higher for more distally constructed anastomoses (p = 0.019). Univariate analysis showed duration of surgery (p = 0.038), BMI (p = 0.001), time of surgery (p = 0.029), prophylactic drainage (p = 0.006) and time under anesthesia (p = 0.012) to be associated with AL. Multivariate analysis showed BMI greater than 30 kg/m² (p = 0.006; OR 2.6, CI 1.3–5.2) and "after hours" construction of an anastomosis (p = 0.030; OR 2.2, CI 1.1–4.5) to be independent risk factors. __Conclusion:__ BMI greater than 30 kg/m² and "after hours" construction of an anastomosis were independent risk factors for colorectal anastomotic leakage.
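
    For orientation, an odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table as sketched below; the counts are invented, and the ORs reported in the abstract are adjusted estimates from a multivariate model, not this unadjusted calculation.

```python
import math

# Illustrative 2x2 table (made-up counts, not the study data):
# rows: exposure (e.g., after-hours surgery yes/no); columns: leakage yes/no.
a, b = 20, 80      # exposed:   leakage / no leakage
c, d = 44, 595     # unexposed: leakage / no leakage

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)      # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"unadjusted OR = {odds_ratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```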

    Comparative Genomics and Transcriptomics of Propionibacterium acnes

    The anaerobic Gram-positive bacterium Propionibacterium acnes is a human skin commensal that is occasionally associated with inflammatory diseases. Recent work has indicated that evolutionarily distinct lineages of P. acnes play etiologic roles in disease while others are associated with maintenance of skin homeostasis. To shed light on the molecular basis for differential strain properties, we carried out genomic and transcriptomic analysis of distinct P. acnes strains. We sequenced the genome of the P. acnes strain 266, a type I-1a strain. Comparative genome analysis of strain 266 and four other P. acnes strains revealed that overall genome plasticity is relatively low; however, a number of island-like genomic regions, encoding a variety of putative virulence-associated and fitness traits, differ between phylotypes, as judged from PCR analysis of a collection of P. acnes strains. Comparative transcriptome analysis of strains KPA171202 (type I-2) and 266 during exponential growth revealed inter-strain differences in gene expression of transport systems and metabolic pathways. In addition, transcript levels of genes encoding possible virulence factors such as dermatan-sulphate adhesin, polyunsaturated fatty acid isomerase, iron acquisition protein HtaA and lipase GehA were upregulated in strain 266. We also investigated differential gene expression during the exponential and stationary growth phases. Genes encoding components of the energy-conserving respiratory chain as well as secreted and virulence-associated factors were transcribed during the exponential phase, while the stationary growth phase was characterized by upregulation of genes involved in stress responses and amino acid metabolism. Our data highlight the genomic basis for strain diversity and identify, for the first time, the actively transcribed part of the genome, underlining the important role growth status plays in the inflammation-inducing activity of P. acnes. We argue that the disease-causing potential of different P. acnes strains is determined not only by the phylotype-specific genome content but also by variable gene expression.

    Impacts of savanna trees on forage quality for a large African herbivore

    Recently, the cover of large trees in African savannas has declined rapidly due to elephant pressure, frequent fires and charcoal production. The reduction in large trees could have consequences for large herbivores through a change in forage quality. In Tarangire National Park, in Northern Tanzania, we studied the impact of large savanna trees on forage quality for wildebeest by collecting samples of dominant grass species in open grassland and under and around large Acacia tortilis trees. Grasses growing under trees had a much higher forage quality than grasses from the open field, indicated by a more favourable leaf/stem ratio, higher protein concentrations and lower fibre concentrations. Analysing the grass leaf data with a linear programming model indicated that large savanna trees could be essential for the survival of wildebeest, the dominant herbivore in Tarangire. Due to the high fibre content and low nutrient and protein concentrations of grasses from the open field, maximum fibre intake is reached before nutrient requirements are satisfied. All requirements can only be satisfied by combining forage from open grassland with forage from under or around tree canopies. Forage quality was also higher around dead trees than in the open field, so forage quality does not decline immediately after trees die, which explains why negative effects of reduced tree numbers probably go unnoticed at first. In conclusion, our results suggest that continued destruction of large trees could affect future numbers of large herbivores in African savannas, and that better protection of large trees is probably necessary to sustain high animal densities in these ecosystems.
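
    The linear programming argument can be made concrete with a small diet model: choose daily intakes of open-grassland and under-canopy grass that meet a protein requirement without exceeding a maximum fibre intake. All coefficients below are invented for illustration, not the Tarangire measurements.

```python
from scipy.optimize import linprog

# Decision variables: daily dry-matter intake (kg) of open-grassland grass (x0)
# and of grass from under/around tree canopies (x1).
protein = [0.05, 0.12]   # crude protein fraction of each forage (illustrative)
fibre   = [0.75, 0.55]   # fibre fraction of each forage (illustrative)

protein_req = 0.9        # kg protein required per day (illustrative)
fibre_max   = 8.0        # kg fibre per day the gut can process (illustrative)

# Minimize total intake subject to: protein >= requirement, fibre <= maximum,
# with a limited daily availability of under-canopy grass (illustrative).
res = linprog(
    c=[1.0, 1.0],
    A_ub=[[-p for p in protein], fibre],      # -protein.x <= -req ; fibre.x <= max
    b_ub=[-protein_req, fibre_max],
    bounds=[(0, None), (0, 5.0)],
    method="highs",
)
if res.success:
    print(f"open grassland: {res.x[0]:.1f} kg/day, under-canopy: {res.x[1]:.1f} kg/day")
else:
    print("requirements cannot be met with these forages (infeasible)")
```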