475 research outputs found

    BioWarehouse: a bioinformatics database warehouse toolkit

    BACKGROUND: This article addresses the problem of interoperation among heterogeneous bioinformatics databases. RESULTS: We introduce BioWarehouse, an open-source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thereby enabling multi-database queries using the Structured Query Language (SQL) and facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases, including ENZYME, KEGG, and BioCyc, as well as the UniProt, GenBank, NCBI Taxonomy, and CMR databases and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities to which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and they are a barrier to metabolic engineering. Complex queries of this type illustrate the value of the data-warehousing approach to bioinformatics research. CONCLUSION: BioWarehouse embodies significant progress on the database integration problem for bioinformatics.
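The enzyme-gap analysis described above amounts to a multi-database anti-join query. A minimal sketch of that query pattern is below; the table and column names are hypothetical simplifications, not BioWarehouse's actual schema, and an in-memory SQLite database stands in for MySQL or Oracle:

```python
import sqlite3

# Hypothetical, simplified stand-ins for warehouse tables: one table of
# EC-numbered enzyme activities, one of proteins with sequences.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE enzyme_activity (ec_number TEXT PRIMARY KEY);
CREATE TABLE protein (id INTEGER PRIMARY KEY, ec_number TEXT, sequence TEXT);
""")
conn.executemany("INSERT INTO enzyme_activity VALUES (?)",
                 [("1.1.1.1",), ("2.7.1.1",), ("4.2.1.20",)])
conn.executemany("INSERT INTO protein VALUES (?, ?, ?)",
                 [(1, "1.1.1.1", "MKT..."), (2, "2.7.1.1", "MAH...")])

# Anti-join: EC-numbered activities with no protein sequence on record.
orphans = conn.execute("""
    SELECT ea.ec_number
    FROM enzyme_activity ea
    LEFT JOIN protein p ON p.ec_number = ea.ec_number
    WHERE p.id IS NULL
""").fetchall()
print(orphans)  # [('4.2.1.20',)]
```

Once the component databases share one schema, the "fraction of activities with no sequence" statistic is a single SQL query rather than a multi-source scripting effort.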

    Health-related quality of life as measured with EQ-5D among populations with and without specific chronic conditions: A population-based survey in Shaanxi province, China

    Introduction: The aim of this study was to examine health-related quality of life (HRQoL) as measured by EQ-5D and to investigate the influence of chronic conditions and other risk factors on HRQoL in a population-based sample from Shaanxi Province, China. Methods: A multi-stage stratified cluster sampling method was used to select subjects. EQ-5D was employed to measure HRQoL. The likelihood that individuals with selected chronic diseases would report any problem in the EQ-5D dimensions was calculated and tested relative to that of each of two reference groups. Multivariable linear regression models were used to investigate factors associated with the EQ VAS. Results: The most frequently reported problems involved pain/discomfort (8.8%) and anxiety/depression (7.6%). Nearly half of the respondents who reported problems in any of the five dimensions were chronic patients. Higher EQ VAS scores were associated with male gender, a higher level of education, employment, younger age, urban residence, access to free medical services, and higher levels of physical activity. Except for anemia, all the selected chronic diseases were associated with lower EQ VAS scores. The three leading risk factors were cerebrovascular disease, cancer, and mental disease. Increases in age, number of chronic conditions, and frequency of physical activity were found to have a gradient effect. Conclusion: The results of the present work add to the knowledge of population health status in this area beyond what is known from mortality and morbidity data. Medical, policy, social, and individual attention should be given to the management of chronic diseases and the improvement of HRQoL. Longitudinal studies must be performed to monitor changes in HRQoL and to permit evaluation of the outcomes of chronic disease intervention programs. © 2013 Tan et al. National Nature Science Foundation (No. 8107239
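The per-dimension problem rates reported above (e.g. 8.8% for pain/discomfort) are simple tallies over the five EQ-5D dimensions. A minimal sketch with synthetic data, not the study's, assuming the standard three-level scoring (1 = no problems, 2, 3):

```python
# Synthetic respondents: each scores the five EQ-5D dimensions 1-3.
DIMENSIONS = ["mobility", "self-care", "usual activities",
              "pain/discomfort", "anxiety/depression"]

respondents = [
    {"mobility": 1, "self-care": 1, "usual activities": 1,
     "pain/discomfort": 2, "anxiety/depression": 1},
    {"mobility": 1, "self-care": 1, "usual activities": 1,
     "pain/discomfort": 1, "anxiety/depression": 2},
    {"mobility": 1, "self-care": 1, "usual activities": 1,
     "pain/discomfort": 1, "anxiety/depression": 1},
    {"mobility": 1, "self-care": 1, "usual activities": 1,
     "pain/discomfort": 2, "anxiety/depression": 1},
]

# Share of respondents reporting any problem (level > 1) per dimension.
share = {d: sum(r[d] > 1 for r in respondents) / len(respondents)
         for d in DIMENSIONS}
print(share["pain/discomfort"])  # 0.5
```

The study's analysis then compares these rates between chronic-disease groups and reference groups; the tally itself is the building block.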

    Application of prolonged microdialysis sampling in carboplatin-treated cancer patients

    Purpose: To better understand the mechanisms underlying (in)sensitivity of tumors to anticancer drugs, assessing intra-tumor drug pharmacokinetics (PKs) could be important. We explored the feasibility of microdialysis in tumor tissue over multiple days in a clinical setting, using carboplatin as a model drug. Methods: Plasma and microdialysate samples from tumor and normal adipose tissue were collected up to 47 h after dosing in eight carboplatin-treated patients with an accessible (sub)cutaneous tumor. Results: Pharmacokinetics were evaluable in tumor tissue in 6/8 patients and in normal adipose tissue in 3/8 patients. Concentration-time curves of unbound platinum in both tissues followed the pattern of the curves in plasma, with tissue-to-plasma exposure ratios ranging from 0.64 to 1.46. Conclusions: Microdialysis can be successfully employed in ambulant patients over multiple days, enabling more detailed study of the tissue PK of anticancer drugs in normal and malignant tissues.
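The tissue-to-plasma exposure ratios quoted above are ratios of areas under the concentration-time curves (AUCs). A minimal sketch with hypothetical concentration values (the sampling times echo the 47 h window, but the numbers are invented, not the study's data):

```python
def auc(times, concs):
    """Trapezoidal area under a concentration-time curve."""
    return sum((t1 - t0) * (c0 + c1) / 2
               for t0, t1, c0, c1 in zip(times, times[1:], concs, concs[1:]))

times  = [0, 1, 2, 4, 8, 24, 47]        # hours after dosing
plasma = [0, 20, 15, 10, 6, 2, 0.5]     # hypothetical unbound Pt concentrations
tumor  = [0, 14, 13, 9, 6, 2.2, 0.6]    # hypothetical microdialysate values

# Exposure ratio: tumor AUC relative to plasma AUC.
ratio = auc(times, tumor) / auc(times, plasma)
print(round(ratio, 2))  # 0.96
```

A ratio near 1 would indicate tumor exposure comparable to plasma, consistent with the 0.64 to 1.46 range reported.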

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation.

    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region, where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured.
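The event-plane extraction named above reduces, in its simplest form, to averaging cos(2(phi - Psi)) over charged tracks, where Psi is the event-plane angle. A toy illustration with synthetic tracks (the event-plane angle is taken as known and resolution corrections, which a real analysis requires, are omitted):

```python
import math
import random

random.seed(0)
psi = 0.3        # event-plane angle (here assumed known exactly)
v2_true = 0.1    # injected anisotropy

def sample_phi():
    """Accept-reject draw from dN/dphi proportional to 1 + 2*v2*cos(2(phi - psi))."""
    while True:
        phi = random.uniform(0.0, 2.0 * math.pi)
        ceiling = 1.0 + 2.0 * v2_true  # maximum of the target density shape
        if random.uniform(0.0, ceiling) < 1.0 + 2.0 * v2_true * math.cos(2.0 * (phi - psi)):
            return phi

# Generate synthetic tracks and estimate v2 as <cos(2(phi - psi))>.
tracks = [sample_phi() for _ in range(20000)]
v2 = sum(math.cos(2.0 * (phi - psi)) for phi in tracks) / len(tracks)
print(f"estimated v2 = {v2:.3f}")  # close to the injected 0.1
```

The estimator recovers the injected anisotropy because the expectation of cos(2(phi - psi)) under the 1 + 2*v2*cos(2(phi - psi)) distribution is exactly v2.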

    Search for new physics with same-sign isolated dilepton events with jets and missing transverse energy

    A search for new physics is performed in events with two same-sign isolated leptons, hadronic jets, and missing transverse energy in the final state. The analysis is based on a data sample corresponding to an integrated luminosity of 4.98 inverse femtobarns produced in pp collisions at a center-of-mass energy of 7 TeV, collected by the CMS experiment at the LHC. This constitutes a factor of 140 increase in integrated luminosity over previously published results. The observed yields agree with the standard model predictions, and thus no evidence for new physics is found. The observations are used to set upper limits on possible new physics contributions and to constrain supersymmetric models. To facilitate the interpretation of the data in a broader range of new physics scenarios, information on the event selection, detector response, and efficiencies is provided. Published in Physical Review Letters.

    Compressed representation of a partially defined integer function over multiple arguments

    In OLAP (OnLine Analytical Processing), data are analysed in an n-dimensional cube. The cube may be represented as a partially defined function over n arguments. Given that the function is often not defined everywhere, we ask: is there a known way of representing the function, or the points at which it is defined, more compactly than the trivial one?
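The trivial baseline the question above is measured against can be sketched as follows: instead of a dense n-dimensional array, store only the defined points in a map keyed by coordinate tuples. The class and the OLAP coordinates below are illustrative, not from the paper:

```python
# Baseline sparse encoding of a partially defined function over n arguments:
# a dict keyed by coordinate tuples holds only the cells that are defined.
class SparseCube:
    def __init__(self):
        self.cells = {}                         # (dim1, ..., dimN) -> value

    def set(self, point, value):
        self.cells[tuple(point)] = value

    def get(self, point, default=None):         # undefined points -> default
        return self.cells.get(tuple(point), default)

    def defined_points(self):
        return len(self.cells)

cube = SparseCube()
cube.set(("2023", "east", "widgets"), 120)      # hypothetical coordinates
cube.set(("2023", "west", "widgets"), 95)

print(cube.get(("2023", "east", "widgets")))    # 120
print(cube.get(("2023", "south", "widgets")))   # None
print(cube.defined_points())                    # 2
```

This costs one key tuple per defined cell; the paper's question is whether the set of defined points, or the function itself, admits a representation more compact than this per-point enumeration.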