
    Active OCR: Tightening the Loop in Human Computing for OCR Correction

    We propose a proof-of-concept application that will experiment with the use of active learning and other iterative techniques for the correction of eighteenth-century texts provided by the HathiTrust Digital Library and the 2,231 ECCO text transcriptions released into the public domain by Gale and distributed by the Text Creation Partnership (TCP) and 18thConnect. In an application based on active learning or a similar approach, the user could identify dozens or hundreds of difficult characters that appear in the articles from that same time period, and the system would use this new knowledge to improve optical character recognition (OCR) across the entire corpus. A portion of our efforts will focus on the need to incentivize engagement in tasks of this type, whether they are traditionally crowdsourced or handled through a more active, iterative process like the one we propose. We also intend to examine how exploration of users' preferences can improve their engagement with corpora of materials.
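
    The correction loop sketched above is essentially pool-based active learning: the OCR engine's least-confident character classifications are routed to a human corrector, and the corrections are folded back into the model before the next pass over the corpus. The following Python sketch is a hypothetical illustration of that loop only; the engine interface (recognize, retrain) and the ask_human callback are assumptions made for the example, not components named in the proposal.

    ```python
    # A minimal, hypothetical sketch of an active-learning OCR correction loop.
    # `engine` is assumed to expose recognize(page) -> (glyph, guess, confidence)
    # triples and retrain(labeled_pairs); `ask_human` returns the corrected character.
    def active_ocr_loop(engine, pages, ask_human, rounds=5, batch_size=100):
        labeled = []  # (glyph image, correct character) pairs supplied by users
        for _ in range(rounds):
            # Score every recognized glyph by the engine's confidence.
            candidates = []
            for page in pages:
                for glyph, guess, confidence in engine.recognize(page):
                    candidates.append((confidence, glyph, guess))
            # Select the least-confident glyphs -- the "difficult characters".
            candidates.sort(key=lambda c: c[0])
            batch = candidates[:batch_size]
            # Ask a user to correct them, then retrain on the growing labeled set.
            labeled.extend((glyph, ask_human(glyph, guess)) for _, glyph, guess in batch)
            engine.retrain(labeled)
        return engine
    ```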

    Inactivation of Bacteriophage Φ6 on Tyvek Suit Surfaces by Chemical Disinfection

    The 2014 West Africa Ebola outbreak saw a substantial number of healthcare workers (HCWs) being infected, despite the use of personal protective equipment (PPE). PPE is intended to protect HCWs when caring for patients with Ebola virus disease (EVD), but PPE may play a role in the spread of Ebola in healthcare environments. Before the removal of PPE, chemical disinfection may prevent the transfer of pathogens to HCWs, but the efficacy of common disinfectants against enveloped viruses, such as Ebola, on PPE surfaces is relatively unknown. The purpose of this study is to assess the efficacy of two common disinfectants used in healthcare settings, chlorine bleach (Clorox® bleach) and quaternary ammonium (Micro-Chem Plus®), for inactivation of enveloped viruses on PPE. The virucidal activity of the two disinfectants was tested against bacteriophage Φ6, an enveloped, non-pathogenic surrogate for enveloped viruses, on Tyvek suit surfaces. Virus was dried onto the Tyvek suit surface, exposed to the disinfectants at use-dilution for a contact time of one minute, and the surviving virus was quantified using a double agar layer (DAL) assay. Clorox® bleach and Micro-Chem Plus® produced >3.21 log10 and >4.33 log10 reductions, respectively, in Φ6 infectivity. The results of this study suggest that chlorine bleach and quaternary ammonium are effective in the inactivation of enveloped viruses on Tyvek suit surfaces. Chemical disinfection of PPE should be considered a viable method to reduce the spread of pathogenic, enveloped viruses to HCWs, patients, and other environmental surfaces in healthcare settings.
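
    The reported figures are log10 reductions: the ratio of the titer recovered from an untreated, dried control to the titer surviving disinfection, on a log10 scale; when no plaques are detected, the reduction is reported as a greater-than value against the assay's detection limit. The sketch below illustrates the arithmetic with hypothetical numbers, not the study's raw plaque counts.

    ```python
    import math

    def log10_reduction(control_pfu, treated_pfu, detection_limit=1.0):
        """Log10 reduction in infectivity computed from plaque counts (PFU/mL)."""
        if treated_pfu <= 0:
            # No plaques detected: report a lower bound against the detection limit.
            return f"> {math.log10(control_pfu / detection_limit):.2f}"
        return f"{math.log10(control_pfu / treated_pfu):.2f}"

    # Hypothetical example: 2.14e6 PFU/mL recovered from the dried control and no
    # detectable plaques after a one-minute exposure (detection limit 100 PFU/mL)
    # corresponds to a > 4.33 log10 reduction.
    print(log10_reduction(control_pfu=2.14e6, treated_pfu=0, detection_limit=100))
    ```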

    Salvaging General Jurisdiction: Satisfying Daimler And Proposing A New Framework

    General jurisdiction is slowly being eroded. What was once a well-trodden path used to hale corporate defendants into the courthouse is now increasingly barred or shut. In its most recent general jurisdiction opinion, Daimler AG v. Bauman, the U.S. Supreme Court continued its trend towards divesting general jurisdiction of its utility. This is a mistake. The 21st century’s economy is increasingly complex, and general jurisdiction must evolve with this complexity. Failing to do so allows intricate corporate structures to insulate corporate defendants from the jurisdiction of U.S. courts. Although the theory of personal jurisdiction has come a long way since the landmark decisions in Pennoyer v. Neff and International Shoe v. Washington, it must continue to evolve. Arguably, another catalytic opinion is needed to belatedly nudge general jurisdiction into modernity. This note explores the history of general jurisdiction, provides a means to satisfy the currently rigorous general jurisdiction standard, and proposes a new standard that is more cogent in the modern age. In doing so, Part I of this note explains the theory behind general jurisdiction and how it differs from specific jurisdiction, and Part II describes the history of the Supreme Court’s general jurisdiction jurisprudence since Pennoyer. After examining the theory of general jurisdiction and Supreme Court precedent on the issue, Part III traces the steps of the Daimler case, from the Northern District of California to the Supreme Court, with each court’s nuances highlighted. Part IV then explains the necessary steps plaintiffs must take to satisfy the new rigors of general jurisdiction. Finally, Part V provides a new definition and standard of general jurisdiction, one that will hopefully be more consistent with the original theory of general jurisdiction that was first outlined in Pennoyer and International Shoe.

    An analysis of the genetic implications of maternal and grandmaternal effects in beef cattle selection programs

    A study utilizing 3,220 performance records of Angus calves dropped over a 19-year period from 1957 to 1975 was undertaken to estimate the importance of direct, maternal, and grandmaternal variances and to evaluate their interrelationship as causative factors in creating phenotypic variation in birth weight, gain from birth to weaning, weaning condition, and weaning weight. All of these records were obtained from non-creep-fed calves at the Ames Plantation in Tennessee. The data were adjusted by least squares procedures for the effects of year of birth, season of birth, and age of dam, and the adjusted data were used to calculate the various covariances among relatives. The model for maternal effects utilized six covariances: the individual with itself, paternal half-sibs, maternal half-sibs, full-sibs, dam-offspring, and granddam-offspring. The model for assessing grandmaternal genetic influences utilized, in addition to these six, covariances between cousins, within cousins, and within paternal half-sibs. All of these were equated to their expected biological components: direct genetic variance, maternal genetic variance, grandmaternal genetic variance, the covariances between direct and maternal, between direct and grandmaternal, and between maternal and grandmaternal effects, direct environmental variance, maternal environmental variance, and the covariance between direct and maternal environmental effects. The maternal model yielded positive estimates for all variances, with the direct environmental variance contributing the largest fraction of the total phenotypic variance for all traits except adjusted weaning weight (6.6%). These estimates ranged quite high (up to 83.4% for birth weight); however, the heritability estimates are in line with accepted values for these traits. The estimates of direct variance, ranging from a low of 16.1% for weaning condition to a high of 41.5% for weaning weight, were therefore considered quite reasonable. Estimates of the maternal variance all tended to be low (from 1.4% to 4.5%); however, they were consistently positive. The genetic and environmental covariances between direct and maternal effects were negative, except for the genetic covariance for birth weight (7.4% and 6.1%) and the environmental covariance for weaning condition (16.4% and 17.0%). These negative covariances support the theory of an antagonism between direct and maternal effects for the preweaning and weaning traits. The grandmaternal model showed positive variance estimates for all effects except the maternal environmental variance for adjusted weaning weight and adjusted gain (-1.2 and -1.2). Direct effects ranged from 20.2% for birth weight to 42.8% for weaning weight, while maternal variances accounted for 6% to 16% of the total phenotypic variance. Estimates of the grandmaternal variance fell in the range of 5% to 10%, a clearly detectable contribution to the total phenotypic variance. These grandmaternal genetic variance estimates were all fairly large in magnitude and were thought to play an important role in validating the alternate-generation phenomenon.
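
    The estimation strategy described above, equating observed covariances among classes of relatives to linear combinations of causal components and solving the resulting system, can be illustrated with a short least-squares sketch. The coefficient rows below are standard textbook expectations for a maternal-effects model and are included only to show the mechanics; the thesis's own expectation matrix involves more relative pairs plus the grandmaternal and environmental terms, and the observed covariances here are invented numbers.

    ```python
    import numpy as np

    # Components to estimate: [direct genetic variance, maternal genetic variance,
    #                          direct-maternal genetic covariance]
    # Each row gives the expected contribution of those components to one observed
    # covariance among relatives (illustrative textbook coefficients only).
    expectations = np.array([
        [0.25, 0.00, 0.00],   # paternal half-sibs
        [0.25, 1.00, 1.00],   # maternal half-sibs (genetic part)
        [0.50, 0.50, 1.25],   # dam-offspring (genetic part)
    ])

    # Hypothetical observed covariances, in the same units as the trait variance.
    observed = np.array([12.0, 31.0, 36.0])

    components, *_ = np.linalg.lstsq(expectations, observed, rcond=None)
    print(dict(zip(["direct", "maternal", "direct_x_maternal"], components.round(2))))
    ```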

    Exploring Computer Science: Coding Can Be Fun

    The focus of this project is to adopt a course/curriculum that builds interest in computer science and appeals to high school students through a variety of programs and activities. One of our goals is to increase interest in computer science among high school students, especially minority and female students. Students will learn how to use programming languages, utilize higher-level problem solving, and program robots using the skills they acquire. The course, Exploring Computer Science, gives students the opportunity to learn a variety of computer science topics and have fun doing it.

    RLBWT Tricks

    Until recently, most experts would probably have agreed we cannot backwards-step in constant time with a run-length compressed Burrows-Wheeler Transform (RLBWT), since doing so relies on rank queries on sparse bitvectors and those inherit lower bounds from predecessor queries. At ICALP '21, however, Nishimoto and Tabei described a new, simple and constant-time implementation. For a permutation π, it stores an O(r)-space table -- where r is the number of positions i where either i = 0 or π(i + 1) ≠ π(i) + 1 -- that enables the computation of successive values of π(i) by table look-ups and linear scans. Nishimoto and Tabei showed how to increase the number of rows in the table to bound the length of the linear scans such that the query time for computing π(i) is constant while maintaining O(r) space. In this paper we refine Nishimoto and Tabei's approach, including a time-space tradeoff, and experimentally evaluate different implementations, demonstrating the practicality of part of their result. We show that even without adding rows to the table, in practice we almost always scan only a few entries during queries. We propose a decomposition scheme of the permutation π corresponding to the LF-mapping that allows an improved compression of the data structure, while limiting the query time. We tested our implementation on real-world genomic datasets and found that without compression of the table, backward-stepping is drastically faster than with sparse bitvector implementations but, unfortunately, also uses drastically more space. After compression, backward-stepping is competitive both in time and space with the best existing implementations.
    Comment: 15 pages, 8 figures. New edition with expanded experimental results after poster acceptance at an earlier conference. Section 4 removed, sections added for implementation detail.
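
    To make the table concrete, the sketch below stores one entry per run of π (the positions where i = 0 or π(i) ≠ π(i - 1) + 1) and recovers π(i) as π(start) + (i - start). It is a simplified illustration only: it locates the run with a binary search, whereas the constant-time structure described above avoids that predecessor search by also recording, for each run, roughly where its image lands, so that successive applications of π need only table look-ups and short linear scans.

    ```python
    from bisect import bisect_right

    class RunPermutation:
        """Run-based table for a permutation pi (a simplified sketch)."""

        def __init__(self, pi):
            self.starts = []   # first position of each run of consecutive values
            self.images = []   # pi(start) for each run
            for i, v in enumerate(pi):
                if i == 0 or v != pi[i - 1] + 1:
                    self.starts.append(i)
                    self.images.append(v)

        def apply(self, i):
            # Find the run containing position i; the paper's structure replaces
            # this predecessor search with a stored run pointer and a short scan.
            k = bisect_right(self.starts, i) - 1
            return self.images[k] + (i - self.starts[k])

    # Example: pi = (3, 4, 5, 0, 1, 2) decomposes into two runs, stored as
    # (start 0 -> image 3) and (start 3 -> image 0).
    pi = [3, 4, 5, 0, 1, 2]
    rp = RunPermutation(pi)
    assert [rp.apply(i) for i in range(len(pi))] == pi
    ```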

    The eCommentary Machine Project

    The eCommentary Machine web application ("eComma") enables groups of students, scholars, or general readers to build collaborative commentaries on a text and to search, display, and share those commentaries online.
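
    As a purely hypothetical illustration of the kind of data model such a tool implies (the abstract does not describe eComma's implementation), a collaborative commentary can be represented as comments anchored to character ranges of a shared text and queried by keyword.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Comment:
        author: str
        start: int   # character offset where the annotated span begins
        end: int     # character offset where the annotated span ends
        body: str

    @dataclass
    class Commentary:
        text: str
        comments: list = field(default_factory=list)

        def annotate(self, author, start, end, body):
            self.comments.append(Comment(author, start, end, body))

        def search(self, keyword):
            # Return comments whose body or annotated span mentions the keyword.
            kw = keyword.lower()
            return [c for c in self.comments
                    if kw in c.body.lower() or kw in self.text[c.start:c.end].lower()]

    # Usage: two readers annotate the same passage, then search the commentary.
    doc = Commentary("Shall I compare thee to a summer's day?")
    doc.annotate("reader1", 26, 38, "Seasonal imagery opens the sonnet.")
    doc.annotate("reader2", 0, 7, "The speaker hesitates before comparing.")
    print([c.author for c in doc.search("summer")])
    ```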

    Notes on Recent Cases
