50 research outputs found

    Electronic resources and web sites: replacing a backend database with Innovative's Electronic Resource Management

    Get PDF
    This is a preprint of an article accepted for publication in the December 2005 issue of "Information Technology and Libraries." In the fall of 2002, Ohio State University, along with the University of Washington, the University of Western Australia, Washington State University, and Glasgow University, entered into a development partnership with Innovative Interfaces, Inc. The goal was to develop a module to manage electronic resources, integrated into Innovative’s Millennium library system. The product, Electronic Resource Management (ERM), became available in 2004 and is based on the work of the Digital Library Federation Electronic Resources Management Initiative. This article focuses on one aspect of ERM, the integration of the module with the Web OPAC, and describes how the Ohio State University Libraries replaced a backend database with ERM to support lists of electronic resources on their web site.

    Integrating and streamlining electronic resources workflows via Innovative’s Electronic Resource Management

    Get PDF
    This is a preprint of an article that has been accepted for publication in The Serials Librarian, v. 47, no. 4. Publisher links: http://www.taylorandfrancisgroup.com/ ; http://www.tandf.co.uk/journals/titles/0361526X.asp. Libraries have been grappling with the management of the growing number of electronic resources, such as e-journals and electronic article indexes, for the last decade, especially since many of these resources became available on the World Wide Web. The integrated library system was not originally designed to accommodate many of these functions. In 2002, Innovative Interfaces, Inc. partnered with several of their customer libraries to develop a module to manage electronic resources based on the work of the Digital Library Federation’s Electronic Resources Management Initiative. The result of this partnership is a module that addresses functions such as tracking trial access, license negotiation, maintenance, and troubleshooting, as well as integration into the online catalog.

    Methods for the analysis of ordinal response data in medical image quality assessment.

    Get PDF
    The assessment of image quality in medical imaging often requires observers to rate images for some metric or detectability task. These subjective results are used in optimisation, radiation dose reduction, or system comparison studies, and may be compared to objective measures from a computer vision algorithm performing the same task. One popular scoring approach is to use a Likert scale, then assign consecutive numbers to the categories. The mean of these response values is then taken and used for comparison with the objective or second subjective response. Agreement is often assessed using correlation coefficients. We highlight a number of weaknesses in this common approach, including inappropriate analyses of ordinal data and the inability to properly account for correlations caused by repeated images or observers. We suggest alternative data collection and analysis techniques, such as amendments to the scale and multilevel proportional odds models. We detail the suitability of each approach depending upon the data structure and demonstrate each method using a medical imaging example. Whilst others have raised some of these issues, we evaluated the entire study from data collection to analysis, suggested sources for software and further reading, and provided a checklist and flowchart for use with any ordinal data. We hope that raised awareness of the limitations of the current approaches will encourage greater method consideration and the utilisation of a more appropriate analysis. More accurate comparisons between measures in medical imaging will lead to a more robust contribution to the imaging literature and ultimately improved patient care.
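    The core weakness the abstract identifies, treating ordinal Likert categories as interval data and averaging them, can be illustrated with a minimal sketch. The ratings below are invented for illustration only; they show how two rating sets with very different category distributions can produce identical means, information that an ordinal analysis (such as the proportional odds models the authors suggest) would retain.

    ```python
    from statistics import mean
    from collections import Counter

    # Hypothetical observer ratings on a 5-point Likert scale (1 = worst, 5 = best).
    ratings_a = [3, 3, 3, 3, 3, 3]  # every observer chose the middle category
    ratings_b = [1, 5, 1, 5, 1, 5]  # observers split between the extremes

    # Treating the ordinal labels as interval data, both sets share the same mean:
    print(mean(ratings_a))  # 3
    print(mean(ratings_b))  # 3

    # Yet their category distributions are entirely different, which is exactly
    # the information a mean-based comparison discards:
    print(Counter(ratings_a))  # Counter({3: 6})
    print(Counter(ratings_b))  # Counter({1: 3, 5: 3})
    ```

    A correlation computed on such means would report agreement between two response patterns that no observer actually shares, which is one reason the authors recommend modelling the category probabilities directly instead.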

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    Get PDF
    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty to thirty year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess.
    Comment: Major update of previous version. This is the reference document for the LBNE science program and current status; Chapters 1, 3, and 9 provide the comprehensive overview. 288 pages, 116 figures.

    Actin-interacting and flagellar proteins in Leishmania spp.: Bioinformatics predictions to functional assignments in phagosome formation

    Get PDF
    Several motile processes are responsible for the movement of proteins into and within the flagellar membrane, but little is known about the process by which specific proteins (whether actin-associated or not) are targeted to protozoan flagellar membranes. Actin is a major cytoskeleton protein, and polymerization and depolymerization of parasite actin and actin-interacting proteins (AIPs) during both motility and host cell entry might be key events for successful infection. For a better understanding of eukaryotic flagellar dynamics, we have surveyed genomes, transcriptomes and proteomes of pathogenic Leishmania spp. to identify pertinent genes/proteins and to build in silico models to properly address their putative roles in trypanosomatid virulence. In a search for AIPs involved in flagellar activities, we applied computational biology and proteomic tools to infer the biological meaning of coronins and Arp2/3, two important elements in phagosome formation after parasite phagocytosis by macrophages. Results presented here provide the first report of Leishmania coronin and Arp2/3 as flagellar proteins that might also be involved in phagosome formation through actin polymerization within the flagellar environment. This issue, which for now remains a bioinformatics-derived inference, is worthy of further in vitro examination.

    KnowledgeBank: powered by DSpace

    No full text
    The University Archives has determined that this item is of continuing value to OSU's history.

    Implementing cloning software in a library environment

    No full text