
    Phase II trial of debulking surgery and photodynamic therapy for disseminated intraperitoneal tumors

    Background: Photodynamic therapy (PDT) combines a photosensitizer drug, oxygen, and laser light to kill tumor cells on surfaces. This is the initial report of our phase II trial, designed to evaluate the effectiveness of surgical debulking and PDT in carcinomatosis and sarcomatosis. Methods: Fifty-six patients were enrolled between April 1997 and January 2000. Patients were given Photofrin (2.5 mg/kg) intravenously 2 days before tumor-debulking surgery. Laser light was delivered to all peritoneal surfaces. Patients were followed with CT scans and laparoscopy to evaluate responses to treatment. Results: Forty-two patients were adequately debulked at surgery; these comprise the treatment group. There were 14 GI malignancies, 12 ovarian cancers, and 15 sarcomas. Actuarial median survival was 21 months. Median time to recurrence was 3 months (range, 1-21 months). The most common serious toxicities were anemia (38%), liver function test (LFT) abnormalities (26%), and gastrointestinal toxicities (19%); one patient died. Conclusions: Photofrin PDT for carcinomatosis has been successfully administered to 42 patients with acceptable toxicity. The median survival of 21 months exceeds our expectations; however, the relative contribution of surgical resection versus PDT is unknown. Deficiencies in photosensitizer delivery, tissue oxygenation, or laser light distribution leading to recurrences may be addressed through the future use of new photosensitizers.

    A novel approach to simulate gene-environment interactions in complex diseases

    Background: Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite the large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies of their interactions in the epidemiological literature. One reason may be incomplete knowledge of the power of the statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones. Results: We present a mathematical approach that models gene-environment interactions. With this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in the Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aims have been to allow the input of population characteristics through standard epidemiological measures and to implement constraints that keep the simulator's behaviour biologically meaningful. Conclusions: With the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex diseases where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population, and a Monte Carlo process allows for random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of statistical power when designing a study.
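The abstract does not give GENS's exact multi-logistic parameterization, but the core idea of simulating case-control data under a one gene-one environment interaction can be sketched with a plain logistic risk model. All function names, coefficient values, and the additive genotype coding below are illustrative assumptions, not the published GENS implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_case_control(n, maf, env_prev, beta0, beta_g, beta_e, beta_ge):
    """Simulate disease status under a logistic risk model with a
    gene-environment interaction term (all betas are log-odds).

    g: genotype coded additively (0, 1, or 2 risk alleles, Hardy-Weinberg)
    e: binary environmental exposure
    beta_ge: the interaction coefficient; setting it to 0 removes the
             gene-environment interaction entirely.
    """
    g = rng.binomial(2, maf, size=n)        # genotypes
    e = rng.binomial(1, env_prev, size=n)   # exposures
    logit = beta0 + beta_g * g + beta_e * e + beta_ge * g * e
    p = 1.0 / (1.0 + np.exp(-logit))        # individual disease risk
    y = rng.binomial(1, p)                  # Monte Carlo disease status
    return g, e, y

# Hypothetical population: 30% minor allele frequency, 40% exposed,
# modest main effects and a stronger interaction effect.
g, e, y = simulate_case_control(
    n=100_000, maf=0.3, env_prev=0.4,
    beta0=-3.0, beta_g=0.2, beta_e=0.3, beta_ge=0.5)
```

Because the true generating model is fully known, a data set like this can be fed to any candidate statistical method to check whether it recovers the interaction term.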

    Quantifying single nucleotide variant detection sensitivity in exome sequencing

    BACKGROUND: The targeted capture and sequencing of genomic regions has rapidly demonstrated its utility in genetic studies. Inherent in this technology is considerable heterogeneity of target coverage, which is expected to systematically impact our sensitivity to detect genuine polymorphisms. To fully interpret the polymorphisms identified in a genetic study it is often essential both to detect polymorphisms and to understand where, and with what probability, real polymorphisms may have been missed. RESULTS: Using down-sampling of 30 deeply sequenced exomes and a set of gold-standard single nucleotide variant (SNV) genotype calls for each sample, we developed an empirical model relating the read depth at a polymorphic site to the probability of calling the correct genotype at that site. We find that measured sensitivity in SNV detection is substantially worse than that predicted from the naive expectation of sampling from a binomial distribution. This calibrated model allows us to produce single-nucleotide-resolution SNV sensitivity estimates, which can be merged to give summary sensitivity measures for any arbitrary partition of the target sequences (nucleotide, exon, gene, pathway, exome). These metrics are directly comparable between platforms and can be combined between samples to give “power estimates” for an entire study. We estimate that a local read depth of 13X is required to detect the alleles and genotype of a heterozygous SNV 95% of the time, but only 3X for a homozygous SNV. At a mean on-target read depth of 20X, commonly used for rare disease exome sequencing studies, we predict that 5–15% of heterozygous and 1–4% of homozygous SNVs in the targeted regions will be missed. CONCLUSIONS: Non-reference alleles in the heterozygous state have a high chance of being missed when commonly applied read coverage thresholds are used, despite the widely held assumption that there is good polymorphism detection at these coverage levels. Such alleles are likely to be of functional importance in population-based studies of rare diseases, somatic mutations in cancer, and explaining the “missing heritability” of quantitative traits.
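The naive binomial expectation that the authors' calibrated model outperforms can be made concrete: if each read at a heterozygous site carries the alternate allele independently with probability 0.5, the chance of observing enough alternate reads to call the variant is a binomial tail probability. The function below is an illustrative sketch of that idealized model only, not the paper's empirical model; the minimum-alternate-read cutoff is an assumed caller threshold:

```python
from math import comb

def naive_het_sensitivity(depth, min_alt_reads=2, alt_prob=0.5):
    """Naive binomial model: probability that at least `min_alt_reads`
    of `depth` independent reads carry the alternate allele of a
    heterozygous SNV (each read is alt with probability `alt_prob`)."""
    return sum(comb(depth, k) * alt_prob**k * (1 - alt_prob)**(depth - k)
               for k in range(min_alt_reads, depth + 1))

# Under this idealized model, modest depths already look sufficient,
# which is why the empirically measured sensitivity being worse matters.
for d in (4, 8, 13):
    print(d, naive_het_sensitivity(d))
```

Comparing these optimistic tail probabilities with down-sampled empirical call rates, as the study does, exposes how much real-world error modes (mapping bias, base-quality filtering, uneven capture) degrade detection below the binomial ideal.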

    Hahn-Steinthal fracture: a case report

    Isolated fracture of the capitellum is rare. We present clinical and radiological data on a single case: a 31-year-old woman who sustained an isolated Hahn-Steinthal type fracture of the capitellum. It was treated operatively by open reduction and internal fixation using mini-fragment screws. The elbow was immobilized for 4 weeks, and the patient regained a full range of movement 12 weeks postoperatively. We reiterate that anatomical reduction and fixation are the right way to treat this injury.

    Dioxin Toxicity In Vivo Results from an Increase in the Dioxin-Independent Transcriptional Activity of the Aryl Hydrocarbon Receptor

    The aryl hydrocarbon receptor (Ahr) is the nuclear receptor mediating the toxicity of dioxins, widespread and persistent pollutants whose toxic effects include tumor promotion, teratogenesis, wasting syndrome, and chloracne. Elimination of Ahr in mice eliminates dioxin toxicity but also produces adverse effects, some seemingly unrelated to dioxin. Thus the relationship between the toxic and dioxin-independent functions of Ahr is unclear, which hampers the understanding and treatment of dioxin toxicity. Here we develop a Drosophila model to show that dioxin actually increases the in vivo dioxin-independent activity of Ahr. This hyperactivation resembles the effects caused by an increase in the amount of its dimerisation partner, the Ahr nuclear translocator (Arnt), and entails an increased transcriptional potency of Ahr, in addition to the previously described effect on nuclear translocation. Thus the two apparently different functions of Ahr, dioxin-mediated and dioxin-independent, are in fact two different levels (hyperactivated and basal, respectively) of a single function.

    Preventing complicated transseptal puncture with intracardiac echocardiography: case report

    BACKGROUND: Recently, intracardiac echocardiography (ICE) has emerged as a useful tool in electrophysiology laboratories for guiding transseptal left heart catheterizations, avoiding thromboembolic and mechanical complications, and assessing ablation lesion characteristics. Although the value of ICE is well known, it is not a universal tool for achieving uncomplicated access to the left atrium. We present a case in which ICE led to interruption of a transseptal procedure because several risk factors for mechanical complications were revealed. CASE PRESENTATION: A case of a patient with paroxysmal atrial fibrillation and atrial flutter, and distorted intracardiac anatomy, is presented. Intracardiac echocardiography showed a small oval fossa abutting an enlarged aorta anteriorly. A very small distance from the interatrial septum to the left atrial free wall was also seen. The latter two conditions predisposed to a complicated transseptal puncture. According to fluoroscopy the transseptal needle was in a correct position, but the intracardiac echo image showed that it was actually pointing towards the aortic root and, most importantly, that it was virtually impossible to stabilize it in the fossa itself. Based on the intracardiac echo findings, a decision was made to limit the procedure to ablation of the cavotricuspid isthmus and not to proceed further, so as to avoid complications. CONCLUSION: This case report illustrates the usefulness of intracardiac echocardiography in preventing serious or even fatal complications during transseptal procedures when the cardiac anatomy is unusual or distorted. It also helps in understanding the possible mechanisms of mechanical complications in cases where fluoroscopic images are apparently normal.