
    Minimizing radiographic contrast administration during coronary angiography using a novel contrast reduction system: A multicenter observational study of the DyeVert™ plus contrast reduction system

    Full text link
    Objective: To evaluate contrast media (CM) volume (CMV) saved using the DyeVert™ Plus Contrast Reduction System (DyeVert Plus System, Osprey Medical) in patients undergoing diagnostic coronary angiography (CAG) and/or percutaneous coronary intervention (PCI) procedures performed with manual injections.
    Background: Current guidelines advocate monitoring and minimizing the total volume of CM in chronic kidney disease (CKD) patients undergoing invasive cardiac procedures. The DyeVert Plus System is an FDA-cleared device designed to reduce the CMV delivered during angiography and to permit real-time CMV monitoring.
    Methods: We performed a multicenter, single-arm, observational study. Eligible subjects were ≥ 18 years old with a baseline estimated glomerular filtration rate (eGFR) of 20–60 mL/min/1.73 m². The primary endpoint was the percentage of CMV saved over the total procedure. A secondary objective was to evaluate adverse events (AEs) related to the DyeVert Plus System or to CM use.
    Results: A total of 114 subjects were enrolled at eight centers. Mean age was 72 ± 9 years, 72% were male, and mean body mass index was 29 ± 5. Baseline eGFR was 43 ± 11 mL/min/1.73 m². CAG-only was performed in 65% of cases. One hundred five subjects were evaluable for the primary endpoint. Mean CMV attempted was 112 ± 85 mL (range 22–681) and mean CMV delivered was 67 ± 51 mL (range 12–403), resulting in an overall CMV savings of 40.1 ± 8.8% (95% CI 38.4, 41.8). Acute kidney injury (AKI, an increase in serum creatinine of 0.3 mg/dL from baseline) was reported in 11 cases, with seven occurring in subjects with baseline eGFR < 30, and three AKI events were attributed to CM. AKI rates increased as CMV/eGFR ratios increased.
    Conclusions: These data suggest that DyeVert Plus System use in CKD patients undergoing CAG and/or PCI results in clinically meaningful CMV savings while maintaining image quality.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/149537/1/ccd27935_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/149537/2/ccd27935.pd
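    The percentage saving reported above follows directly from the attempted and delivered volumes; a minimal sketch of the arithmetic (the function name is my own, and since the study averaged per-patient savings, the cohort-mean figure below will not exactly equal the reported 40.1%):

```python
def pct_cmv_saved(attempted_ml: float, delivered_ml: float) -> float:
    """Percent of attempted contrast volume that was not delivered."""
    return 100.0 * (attempted_ml - delivered_ml) / attempted_ml

# Cohort means from the abstract: 112 mL attempted, 67 mL delivered.
print(round(pct_cmv_saved(112, 67), 1))  # 40.2
```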

    Improving road safety knowledge in Africa through crowdsourcing. The African Road Safety Observatory

    Get PDF
    Africa is the worst performing continent in road safety: the fatality rate, 26.6 per 100.000 inhabitants, is almost three times that of Europe's and fatalities per capita are projected to double from 2015 to 2030 (WHO, 2015). This is mainly due to the fact that Emerging Economies are experiencing increases in traffic, for which their traffic systems are not sufficiently prepared. On one hand, there is a significant demand for data and knowledge to be used for road safety-related decision making. On the other hand, there is a substantial lack of a reliable and detailed knowledge on road casualties in terms of the number of road accidents and fatalities occurring and, on the factors, leading to road accidents or affecting their consequences. When official data are poor or missing these could be integrated with other sources. The objective of this paper is to describe the African Road Safety Observatory (African RSO), a participative web portal developed in the field of the "SaferAfrica-Innovating dialogue and problems appraisal for a safer Africa" project, funded by the European Union's Horizon 2020 program. The African RSO combines traditional functions of analyzing and sharing road safety performance data and provide knowledge and information, with the more innovative ones: a Dialogue Platform and the crowdsourcing tool. The Dialogue Platform is dedicated to experts and stakeholders and aims at encouraging and facilitating a constructive engagement and dialogue on road safety in Africa, producing knowledge to inspire road safety funding, policies and interventions in Africa and providing recommendations to update the African Road Safety Action Plan and the African Road Safety Charter. The crowdsourcing tool allows African citizens to report and highlight road safety needs, to share opinions as well as to discuss solutions in their own Countries

    Complete genome sequence of the Medicago microsymbiont Ensifer (Sinorhizobium) medicae strain WSM419

    Get PDF
    Ensifer (Sinorhizobium) medicae is an effective nitrogen-fixing microsymbiont of a diverse range of annual Medicago (medic) species. Strain WSM419 is an aerobic, motile, non-spore-forming, Gram-negative rod isolated from a M. murex root nodule collected in Sardinia, Italy in 1981. WSM419 was manufactured commercially in Australia as an inoculant for annual medics from 1985 to 1993 owing to its nitrogen fixation, saprophytic competence and acid tolerance. Here we describe the basic features of this organism, together with the complete genome sequence and annotation. This is the first report of a complete genome sequence for a microsymbiont of the group of annual medic species adapted to acid soils. We reveal that its genome size is 6,817,576 bp, encoding 6,518 protein-coding genes and 81 RNA-only-encoding genes. The genome contains a chromosome of 3,781,904 bp and three plasmids of 1,570,951 bp, 1,245,408 bp and 219,313 bp. The smallest plasmid is a feature unique to this medic microsymbiont
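    The replicon sizes quoted above are internally consistent: the chromosome plus the three plasmids sum exactly to the reported genome size. A quick check (the plasmid labels are illustrative, not taken from the abstract):

```python
# Replicon sizes for Ensifer medicae WSM419, in bp (from the abstract).
replicons = {
    "chromosome": 3_781_904,
    "plasmid_1": 1_570_951,  # labels are illustrative placeholders
    "plasmid_2": 1_245_408,
    "plasmid_3": 219_313,
}
total = sum(replicons.values())
print(total)  # 6817576 -- matches the reported 6,817,576 bp genome size
```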

    Comparison of techniques for computing shell-model effective operators

    Get PDF
    Different techniques for calculating effective operators within the framework of the shell model, using the same effective interaction and the same excitation spaces, are presented. Starting from the large-basis no-core approach, we compare the time-honored perturbation-expansion approach with a model-space truncation approach. Results for the electric quadrupole and magnetic dipole operators are presented for ^6Li. The convergence trends and the dependence of the effective operators on differing excitation spaces and Pauli Q operators are studied. In addition, the dependence of the electric-quadrupole effective charge on the harmonic-oscillator frequency and the mass number, for A = 5, 6, is investigated in the model-space truncation approach.
    Comment: 18 pages, REVTeX, 4 PostScript figures
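    The core idea of model-space truncation can be illustrated on a toy problem: project the lowest exact eigenstates of a "full-space" Hamiltonian onto a small P space and build an effective operator there that reproduces the exact low-lying spectrum. This is only a sketch of the generic construction (a Bloch-style non-Hermitian effective Hamiltonian), not the paper's actual shell-model machinery:

```python
import numpy as np

# Toy 4x4 "full-space" Hamiltonian; real calculations use huge
# no-core shell-model spaces -- this only illustrates the idea.
H = np.array([[2.0, 1.0, 0.0, 0.0],
              [1.0, 3.0, 1.0, 0.0],
              [0.0, 1.0, 4.0, 1.0],
              [0.0, 0.0, 1.0, 5.0]])

d = 2                         # dimension of the truncated model (P) space
E, V = np.linalg.eigh(H)      # exact eigenpairs, ascending order
P = V[:d, :d]                 # P-space components of the lowest d eigenvectors

# Bloch-style effective Hamiltonian: by construction its spectrum
# inside the P space equals the lowest d exact eigenvalues.
H_eff = P @ np.diag(E[:d]) @ np.linalg.inv(P)

print(np.allclose(np.sort(np.linalg.eigvals(H_eff).real), E[:d]))  # True
```

    The same similarity transform applied to another operator (e.g. a quadrupole matrix) yields its effective counterpart in the truncated space, which is where effective charges come from.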

    The genome and transcriptome of Trichormus sp NMC-1: insights into adaptation to extreme environments on the Qinghai-Tibet Plateau

    Get PDF
    The Qinghai-Tibet Plateau (QTP) has the highest biodiversity of any extreme environment worldwide and provides an ideal natural laboratory for studying adaptive evolution. In this study, we generated a draft genome sequence of the cyanobacterium Trichormus sp. NMC-1 from the QTP and performed whole-transcriptome sequencing under low temperature to investigate the genetic mechanisms by which T. sp. NMC-1 adapted to this specific environment. Its genome sequence was 5.9 Mb with a G+C content of 39.2% and encompassed a total of 5,362 CDS. A phylogenomic tree indicated that this strain belongs to the Trichormus and Anabaena cluster. Genome comparison between T. sp. NMC-1 and six relatives showed that functionally unknown genes occupied a much higher proportion (28.12%) of the T. sp. NMC-1 genome. In addition, the functions of strain-specific, significantly positively selected, and expanded orthogroups, as well as differentially expressed genes involved in signal transduction, cell wall/membrane biogenesis, secondary metabolite biosynthesis, and energy production and conversion, were analyzed to elucidate specific adaptation traits. Further analyses showed that CheY-like genes, extracellular polysaccharides and mycosporine-like amino acids might play major roles in adaptation to harsh environments. Our findings indicate that sophisticated genetic mechanisms are involved in cyanobacterial adaptation to the extreme environment of the QTP
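    The G+C content quoted for the NMC-1 genome is a simple per-base statistic; a minimal sketch of how it is computed (toy sequence, not NMC-1 data):

```python
def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Toy example; the NMC-1 genome itself is 5.9 Mb at 39.2% G+C.
print(round(gc_content("ATGCGCATAT"), 2))  # 0.4
```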

    Average Entropy of a Subsystem from its Average Tsallis Entropy

    Full text link
    In the nonextensive Tsallis scenario, Page's conjecture for the average entropy of a subsystem [Phys. Rev. Lett. 71, 1291 (1993)], as well as its demonstration, is generalized: when a pure quantum system whose Hilbert space dimension is mn is considered, the average Tsallis entropy of an m-dimensional subsystem is obtained. This demonstration is expected to be useful for studying systems where the usual entropy does not give satisfactory results.
    Comment: RevTeX, 6 pages, 2 figures. To appear in Phys. Rev.
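    The quantity being averaged can be sampled numerically: draw a Haar-random pure state on an (mn)-dimensional space, trace out the n-dimensional factor, and evaluate the Tsallis entropy S_q = (1 - Tr ρ^q)/(q - 1) of the reduced state. This is a sketch of the setup, not the paper's analytic derivation:

```python
import numpy as np

def tsallis_entropy(rho: np.ndarray, q: float) -> float:
    """Tsallis entropy S_q = (1 - Tr rho^q) / (q - 1)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return (1.0 - np.sum(evals ** q)) / (q - 1.0)

def random_subsystem_entropy(m: int, n: int, q: float, rng) -> float:
    """S_q of the m-dim reduced state of a random pure state on C^(m*n)."""
    psi = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
    psi /= np.linalg.norm(psi)            # normalized pure state, reshaped m x n
    rho = psi @ psi.conj().T              # reduced density matrix of subsystem A
    return tsallis_entropy(rho, q)

# Sanity check: the maximally mixed qubit has S_2 = (1 - 1/2)/(2 - 1) = 0.5
print(tsallis_entropy(np.eye(2) / 2, q=2.0))  # 0.5
```

    Averaging `random_subsystem_entropy` over many draws approximates the average entropy whose closed form the paper derives.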

    A basis for variational calculations in d dimensions

    Full text link
    In this paper we derive expressions for the matrix elements (\phi_i, H \phi_j) of the Hamiltonian H = -\Delta + \sum_q a(q) r^q in d > 1 dimensions. The basis functions in each angular-momentum subspace are of the form \phi_i(r) = r^{i+1+(t-d)/2} e^{-r^p/2}, i >= 0, p > 0, t > 0. The matrix elements are given in terms of the Gamma function for all d. The significance of the parameters t and p and of the scale s is discussed. Applications to a variety of potentials are presented, including potentials with singular repulsive terms of the form b/r^a, a, b > 0, perturbed Coulomb potentials -D/r + B r + A r^2, and potentials with weak repulsive terms, such as -g r^2 + r^4, g > 0.
    Comment: 22 pages
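    The reason such matrix elements close in terms of the Gamma function is the elementary identity ∫_0^∞ r^a e^{-r^p} dr = Γ((a+1)/p)/p, which every overlap of the basis functions above reduces to. A quick numerical check of that identity (a crude midpoint rule, purely illustrative):

```python
import math

def radial_integral_exact(a: float, p: float) -> float:
    """Closed form Gamma((a+1)/p)/p of the integral int_0^inf r^a e^(-r^p) dr."""
    return math.gamma((a + 1.0) / p) / p

def radial_integral_numeric(a: float, p: float,
                            upper: float = 30.0, n: int = 200_000) -> float:
    """Midpoint-rule approximation of the same integral on [0, upper]."""
    h = upper / n
    total = 0.0
    for k in range(n):
        r = (k + 0.5) * h
        total += (r ** a) * math.exp(-(r ** p))
    return total * h

a, p = 2.5, 2.0
print(abs(radial_integral_numeric(a, p) - radial_integral_exact(a, p)) < 1e-4)  # True
```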

    Evaluating Statistical Methods Using Plasmode Data Sets in the Age of Massive Public Databases: An Illustration Using False Discovery Rates

    Get PDF
    Plasmode is a term coined several years ago to describe data sets that are derived from real data but for which some truth is known. Omic techniques, most especially microarray and genome-wide association studies, have catalyzed a new zeitgeist of data sharing that is making data and data sets publicly available on an unprecedented scale. Coupling such data resources with a science of plasmode use would allow statistical methodologists to vet proposed techniques empirically (as opposed to only theoretically) and with data that are by definition realistic and representative. We illustrate the technique of empirical statistics by considering a common task in analyzing high-dimensional data: the simultaneous testing of hundreds or thousands of hypotheses to determine which, if any, show statistical significance warranting follow-on research. The now-common practice of multiple testing in high-dimensional experiment (HDE) settings has generated new methods for detecting statistically significant results. Although such methods have heretofore been subject to comparative performance analysis using simulated data, simulating data that realistically reflect data from an actual HDE remains a challenge. We describe a simulation procedure using actual data from an HDE in which some truth regarding parameters of interest is known. We use the procedure to compare estimates of the proportion of true null hypotheses, the false discovery rate (FDR), and a local version of FDR obtained from 15 different statistical methods
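    As a concrete example of the multiple-testing machinery under comparison, the classical Benjamini-Hochberg step-up procedure controls the FDR at level α by rejecting the hypotheses with the k smallest p-values, where k is the largest rank satisfying p_(k) ≤ (k/m)α. A minimal sketch (a generic illustration, not one of the 15 methods evaluated in the paper):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected by the BH step-up rule."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k_max = rank              # largest rank passing its threshold
    return sorted(order[:k_max])      # reject everything up to that rank

p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(p, alpha=0.05))  # [0, 1]
```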

    Fast Identification and Removal of Sequence Contamination from Genomic and Metagenomic Datasets

    Get PDF
    High-throughput sequencing technologies have strongly impacted microbiology, providing a rapid and cost-effective way of generating draft genomes and exploring microbial diversity. However, sequences obtained from impure nucleic acid preparations may contain DNA from sources other than the sample. Such sequence contamination is a serious concern for the quality of the data used in downstream analysis, causing misassembly of sequence contigs and erroneous conclusions. Therefore, the removal of sequence contaminants is a necessary step for all sequencing projects. We developed DeconSeq, a robust framework for the rapid, automated identification and removal of sequence contamination in longer-read datasets (150 bp mean read length). DeconSeq is publicly available in standalone and web-based versions. The results can be exported for subsequent analysis, and the databases used for the web-based version are automatically updated on a regular basis. DeconSeq categorizes possible contamination sequences, eliminates redundant hits with higher similarity to non-contaminant genomes, and provides graphical visualizations of the alignment results and classifications. Using DeconSeq, we analyzed possible human DNA contamination in 202 previously published microbial and viral metagenomes and found possible contamination in 145 (72%) of them, with as much as 64% contaminating sequences. This new framework allows scientists to automatically detect and efficiently remove unwanted sequence contamination from their datasets while eliminating critical limitations of current methods. DeconSeq's web interface is simple and user-friendly. The standalone version allows offline analysis and integration into existing data-processing pipelines. DeconSeq's results reveal whether the sequencing experiment has succeeded, whether the correct sample was sequenced, and whether the sample contains any sequence contamination from DNA preparation or the host. In addition, the analysis of 202 metagenomes demonstrated significant contamination of the non-human-associated metagenomes, suggesting that this method is appropriate for screening all metagenomes. DeconSeq is available at http://deconseq.sourceforge.net/
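    DeconSeq itself identifies contaminants by read alignment against reference genomes; as a much simplified stand-in, a k-mer screen against a contaminant reference conveys the basic filtering idea (all sequences, k size, and the hit-fraction threshold below are illustrative assumptions, not DeconSeq's actual parameters):

```python
def build_kmer_index(reference: str, k: int = 8) -> set:
    """All k-mers of a contaminant reference (e.g. a host genome)."""
    return {reference[i:i + k] for i in range(len(reference) - k + 1)}

def decontaminate(reads, index, k=8, max_hit_frac=0.5):
    """Keep reads whose fraction of k-mers present in the contaminant
    index stays below max_hit_frac (threshold is illustrative)."""
    kept = []
    for read in reads:
        kmers = [read[i:i + k] for i in range(len(read) - k + 1)]
        hits = sum(km in index for km in kmers)
        if not kmers or hits / len(kmers) < max_hit_frac:
            kept.append(read)
    return kept

contaminant = "ACGTACGTGGCCAATTCCGGAACGTTACGT"
idx = build_kmer_index(contaminant)
reads = ["ACGTACGTGGCC",      # substring of the contaminant -> removed
         "TTTTGGGGAAAACCCC"]  # unrelated sequence -> kept
print(decontaminate(reads, idx))  # ['TTTTGGGGAAAACCCC']
```

    Real tools use sensitive alignment (DeconSeq uses modified BWA-SW) rather than exact k-mer matching, so they tolerate sequencing errors and divergence that this sketch would miss.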