
    SIGMA: A System for Integrative Genomic Microarray Analysis of Cancer Genomes

    BACKGROUND: The prevalence of high-resolution genome profiling has created a need for integrative analysis of information generated from multiple methodologies and platforms. Although the majority of data in the public domain are gene expression profiles, and expression analysis software is available, the growing number of array CGH studies has enabled integration of high-throughput genomic and gene expression datasets. However, tools for direct mining and analysis of array CGH data are limited; hence, there is a great need for analytical and display software tailored to cross-platform integrative analysis of cancer genomes. RESULTS: We have created a user-friendly Java application to facilitate sophisticated visualization and analysis, such as cross-tumor and cross-platform comparisons. To demonstrate the utility of this software, we assembled array CGH data representing Affymetrix SNP chip, Stanford cDNA array and whole-genome tiling path array platforms for cross-comparison. This cancer genome database contains 267 profiles from commonly used cancer cell lines representing 14 different tissue types. CONCLUSION: In this study we have developed an application for the visualization and analysis of data from high-resolution array CGH platforms that can be adapted for analysis of multiple types of high-throughput genomic datasets. Furthermore, we invite researchers using array CGH technology to deposit both their raw and processed data, as this will be a continually expanding database of cancer genomes. This publicly available resource, the System for Integrative Genomic Microarray Analysis (SIGMA) of cancer genomes, can be accessed at

    Pandemic H1N1 in Canada and the use of evidence in developing public health policies – A policy analysis

    When responding to a novel infectious disease outbreak, policies are set under time constraints and uncertainty, which can limit the ability to control the outbreak and result in unintended consequences, including loss of public confidence. The H1N1 pandemic highlighted challenges in public health decision-making during a public health emergency. Understanding this process, so as to identify barriers and modifiable influences, is important for improving the response to future emergencies. The purpose of this study is to examine the H1N1 pandemic decision-making process in Canada, with an emphasis on the use of evidence for public health decisions. Using semi-structured key informant interviews conducted after the pandemic (July–November 2010) and a document analysis, we examined four highly debated pandemic policies: use of adjuvanted vaccine by pregnant women, vaccine priority groups and sequencing, school closures, and personal protective equipment. Data were analysed for thematic content guided by Lomas' policy decision-making framework, as well as by indicative coding using iterative methods. We interviewed 40 public health officials and scientific advisors across Canada and reviewed 76 pandemic policy documents. Our analysis revealed that pandemic pre-planning resulted in strong beliefs, which defined the decision-making process. Existing ideological perspectives on evidence strongly influenced how information was used, such that the same evidentiary sources were interpreted differently according to ideological perspective. Participants recognized that current models for public health decision-making fail to make explicit the role of scientific evidence in relation to contextual factors. Conflict-avoidance theory explained policy decisions that went against the prevailing evidence. Clarification of roles and responsibilities within the public health system would reduce duplication and maintain credibility. A more transparent and iterative approach to incorporating evidence into public health decision-making, one that reflects the realities of the external pressures present during a public health emergency, is needed.

    An Arthroscopic Device to Assess Articular Cartilage Defects and Treatment with a Hydrogel

    The hydraulic resistance R across osteochondral tissue, especially articular cartilage, decreases with degeneration and erosion. Clinically useful measures to quantify and diagnose the extent of cartilage degeneration and efficacy of repair strategies, especially with regard to pressure maintenance, are still developing. The hypothesis of this study was that hydraulic resistance provides a quantitative measure of osteochondral tissue that could be used to evaluate the state of cartilage damage and repair. The aims were to (1) develop a device to measure R in an arthroscopic setting, (2) determine whether the device could detect differences in R for cartilage, an osteochondral defect, and cartilage treated using a hydrogel ex vivo, and (3) determine how quickly such differences could be discerned. The apparent hydraulic resistance of defect samples was ~35% less than intact cartilage controls, while the resistance of hydrogel-filled groups was not statistically different than controls, suggesting some restoration of fluid pressurization in the defect region by the hydrogel. Differences in hydraulic resistance between control and defect groups were apparent after 4 s. The results indicate that the measurement of R is feasible for rapid and quantitative functional assessment of the extent of osteochondral defects and repair. The arthroscopic compatibility of the device demonstrates the potential for this measurement to be made in a clinical setting
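The abstract does not define R explicitly; in the standard hydraulic analogy (an assumption here, not a statement from the paper), the resistance of a tissue sample is the applied pressure difference divided by the resulting volumetric flow rate:

```latex
R = \frac{\Delta P}{Q} \qquad \left[\mathrm{Pa\,s\,m^{-3}}\right]
```

On this reading, degeneration increases tissue permeability, so for a given applied pressure the flow Q rises and R falls, consistent with the ~35% lower resistance measured for defect samples.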

    Outcome of liver transplantation for hepatitis B: Report of a single center's experience

    Results of liver transplantation (LT) for hepatitis B have improved significantly with the use of hepatitis B immune globulin (HBIG) and/or lamivudine. The aim of this study is to review the long-term outcome of patients who underwent LT for hepatitis B. Records of 41 patients who underwent LT for hepatitis B and survived 3 months or longer post-LT were reviewed. Twenty patients received no immunoprophylaxis or short-term intramuscular HBIG, whereas 21 patients received high-dose intravenous (IV) HBIG. Median post-LT follow-up in these 2 groups was 76 months (range, 4 to 155 months) and 25 months (range, 4 to 68 months), respectively. Hepatitis B recurred in 15 patients (75%) who underwent LT in the pre-HBIG era and in 4 patients (19%) in the post-HBIG era. Cumulative rates of recurrent hepatitis B at 1 and 3 years post-LT were 66% and 77% in the pre-HBIG group versus 20% and 20% in the post-HBIG group (P < .001). Recurrent hepatitis B in the post-HBIG era correlated with an antibody to hepatitis B surface antigen titer of less than 100 IU/L. Nine patients with recurrent hepatitis B received lamivudine for 13 to 49 months (median, 28 months); 6 patients continued to have stable or improved liver disease, whereas 3 patients developed virological breakthrough with slow deterioration of liver disease. Long-term IV HBIG is effective in preventing recurrent hepatitis B. The risk for recurrent hepatitis B is negligible after the first year post-LT. Among patients with no virological breakthrough, lamivudine can stabilize or improve liver disease for up to 4 years in patients with recurrent hepatitis B post-LT.

    Long-rising Type II Supernovae in the Zwicky Transient Facility Census of the Local Universe

    SN 1987A was an unusual hydrogen-rich core-collapse supernova originating from a blue supergiant star. Similar blue supergiant explosions remain a small family of events, broadly characterized by their long rises to peak. The Zwicky Transient Facility (ZTF) Census of the Local Universe (CLU) experiment aims to construct a spectroscopically complete sample of transients occurring in galaxies from the CLU galaxy catalog. We identify 13 long-rising (>40 days) Type II supernovae from the volume-limited CLU experiment during a 3.5 year period from June 2018 to December 2021, approximately doubling the previously known number of these events. We present photometric and spectroscopic data for these 13 events, finding peak r-band absolute magnitudes ranging from -15.6 to -17.5 mag and the tentative detection of Ba II lines in 9 events. Using our CLU sample of events, we derive a long-rising Type II supernova rate of $1.37^{+0.26}_{-0.30}\times10^{-6}$ Mpc$^{-3}$ yr$^{-1}$, approximately 1.4% of the total core-collapse supernova rate. This is the first volumetric rate of these events estimated from a large, systematic, volume-limited experiment. Comment: 32 pages, 17 figures, 5 tables. Submitted to Ap

    LSST Science Book, Version 2.0

    A survey that can cover the sky in optical bands over wide fields to faint magnitudes with a fast cadence will enable many of the exciting science opportunities of the next decade. The Large Synoptic Survey Telescope (LSST) will have an effective aperture of 6.7 meters and an imaging camera with field of view of 9.6 deg^2, and will be devoted to a ten-year imaging survey over 20,000 deg^2 south of +15 deg. Each pointing will be imaged 2000 times with fifteen second exposures in six broad bands from 0.35 to 1.1 microns, to a total point-source depth of r~27.5. The LSST Science Book describes the basic parameters of the LSST hardware, software, and observing plans. The book discusses educational and outreach opportunities, then goes on to describe a broad range of science that LSST will revolutionize: mapping the inner and outer Solar System, stellar populations in the Milky Way and nearby galaxies, the structure of the Milky Way disk and halo and other objects in the Local Volume, transient and variable objects both at low and high redshift, and the properties of normal and active galaxies at low and high redshift. It then turns to far-field cosmological topics, exploring properties of supernovae to z~1, strong and weak lensing, the large-scale distribution of galaxies and baryon oscillations, and how these different probes may be combined to constrain cosmological models and the physics of dark energy.Comment: 596 pages. Also available at full resolution at http://www.lsst.org/lsst/sciboo

    Do Biofilm Formation and Interactions with Human Cells Explain the Clinical Success of Acinetobacter baumannii?

    BACKGROUND: The dramatic increase in antibiotic resistance and the recent manifestation in war trauma patients underscore the threat of Acinetobacter baumannii as a nosocomial pathogen. Despite numerous reports documenting its epidemicity, little is known about the pathogenicity of A. baumannii. The aim of this study was to obtain insight into the factors that might explain the clinical success of A. baumannii. METHODOLOGY/PRINCIPAL FINDINGS: We compared biofilm formation, adherence to human cells, and inflammatory cytokine induction by human cells for a large panel of well-described strains of A. baumannii, and compared these features to those of other, clinically less relevant Acinetobacter species. Results revealed that biofilm formation and adherence to airway epithelial cells varied widely within each species but did not differ among the species. However, airway epithelial cells and cultured human macrophages produced significantly fewer inflammatory cytokines upon exposure to A. baumannii strains than to strains of A. junii, a species that infrequently causes infection. CONCLUSION/SIGNIFICANCE: The induction of a weak inflammatory response may provide a clue to the persistence of A. baumannii in patients

    A computational approach to identify point mutations associated with occult hepatitis B: significant mutations affect coding regions but not regulative elements of HBV

    BACKGROUND: Occult hepatitis B infection (OBI) is characterized by the absence of serum HBsAg and the persistence of HBV-DNA in liver tissue, with low to undetectable serum HBV-DNA. The mechanisms underlying OBI remain to be clarified. To evaluate whether specific point mutations of the HBV genome may be associated with OBI, we applied an approach based on bioinformatics analysis of complete-genome HBV sequences. In addition, we evaluated the feasibility of bioinformatics prediction models for classifying HBV infections as OBI or non-OBI from molecular data. METHODS: 41 OBI and 162 non-OBI complete genome sequences were retrieved from GenBank, aligned, and subjected to univariable analysis with statistical evaluation. The S coding region was also analyzed for stop-codon mutations; S amino acid variability could be evaluated for genotype D only, owing to the small number of available complete-genome OBI sequences from other genotypes. Prediction models were derived by multivariable analysis using logistic regression, rule induction and random forest approaches, with extra-sample error estimated by multiple ten-fold cross-validation (MCV). Models were compared by t-test on the distributions of the area under the receiver operating characteristic curve (AUC) obtained from the MCV runs, each against the best-performing model. RESULTS: Variations at seven nucleotide positions were significantly associated with OBI and occurred in 11 of 41 OBI sequences (26.8%); other mutations likely failed to reach statistical significance because of the small size of the OBI dataset. All variations affected at least one HBV coding region, but none mapped to regulative elements. All viral proteins except X were affected. Stop codons in S, which might account for the absence of serum HBsAg, were not significantly enriched in OBI sequences. In genotype D, amino acid variability in S was higher in OBI than in non-OBI sequences, particularly in the immunodominant region. A random forest model showed the best performance, but no model was satisfactory in terms of specificity, owing to the small OBI sample size; the results are nonetheless promising in the perspective of a broader dataset of complete-genome OBI sequences. CONCLUSIONS: The data suggest that point mutations rarely, if ever, occur in regulative elements of HBV, and that they contribute to OBI by affecting different viral proteins, pointing to heterogeneous mechanisms underlying OBI, including, at least in genotype D, an escape-mutation mechanism due to imperfect immune control. Deriving prediction models from molecular data appears feasible once a larger set of complete-genome OBI sequences becomes available.
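The model comparison above hinges on the AUC statistic. As a minimal, self-contained sketch (toy scores, not the paper's data or its classifiers), AUC can be computed directly from its Mann-Whitney interpretation: the probability that a randomly chosen positive (OBI) sequence receives a higher classifier score than a randomly chosen negative (non-OBI) one, counting ties as half.

```python
from itertools import product

def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs where the positive scores higher,
    with ties counted as 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p, n in product(scores_pos, scores_neg))
    return wins / (len(scores_pos) * len(scores_neg))

# Toy scores for 4 OBI (positive) and 5 non-OBI (negative) sequences.
obi_scores = [0.9, 0.8, 0.6, 0.4]
non_obi_scores = [0.7, 0.5, 0.3, 0.2, 0.1]
print(auc(obi_scores, non_obi_scores))  # 0.85
```

Repeating this over the held-out folds of each MCV run yields the per-model AUC distributions that the study compares by t-test against the best-performing (random forest) model.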

    Solving the Task Variant Allocation Problem in Distributed Robotics

    We consider the problem of assigning software processes (or tasks) to hardware processors in distributed robotics environments. We introduce the notion of a task variant, which supports the adaptation of software to specific hardware configurations. Task variants facilitate the trade-off of functional quality against the requisite capacity and type of target execution processors. We formalise the problem of assigning task variants to processors as a mathematical model that incorporates typical constraints found in robotics applications; the model is a constrained form of a multi-objective, multi-dimensional, multiple-choice knapsack problem. We propose and evaluate three different solution methods for the problem: constraint programming, a constructive greedy heuristic and a local search metaheuristic. Furthermore, we demonstrate the use of task variants in a real instance of a distributed interactive multi-agent navigation system, showing that our best solution method (constraint programming) improves the system's quality of service by an average of 16%, 31% and 56% compared to the local search metaheuristic, the greedy heuristic and a randomised solution, respectively
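As an illustration of the constructive greedy idea (a hypothetical sketch; the task names, capacities and exact selection rule below are invented, not taken from the paper), each task can be assigned the highest-quality variant whose resource cost still fits on some processor:

```python
def greedy_allocate(tasks, processors):
    """Greedy sketch: for each task, pick the (variant, processor) pair
    with the highest functional quality whose cost fits the processor's
    remaining capacity."""
    spare = dict(processors)          # processor -> remaining capacity
    allocation = {}
    for task, variants in tasks.items():
        best = None                   # (quality, cost, processor)
        for quality, cost in variants:
            for proc, cap in spare.items():
                if cost <= cap and (best is None or quality > best[0]):
                    best = (quality, cost, proc)
        if best is None:
            raise ValueError(f"no feasible variant for task {task!r}")
        quality, cost, proc = best
        spare[proc] -= cost
        allocation[task] = (quality, proc)
    return allocation

# Hypothetical example: two tasks, each with a high- and a low-quality
# variant given as (quality, resource cost).
tasks = {"nav": [(0.9, 3), (0.6, 1)], "vision": [(0.8, 2), (0.5, 1)]}
processors = {"p1": 3, "p2": 2}
print(greedy_allocate(tasks, processors))
```

Such a greedy choice is order-dependent and can strand capacity, which is consistent with the paper's finding that constraint programming outperforms the heuristic approaches.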