
    A Survey of Human Intestinal Protozoa of Logan City and Vicinity

    Surveys of human intestinal protozoa in the United States have been confined mostly to Eastern sections of the country and to the Pacific coast. Little work has been done concerning these parasites in the western mountain states, and no previous surveys have been made of these organisms in the Intermountain West. It is important that the kinds and numbers of these parasitic protozoa be determined for this locality, and it is only through surveys that the harmful, as well as the commensal, intestinal protozoa can be identified and treated. In 1933, the city of Chicago experienced a general epidemic of amoebic dysentery, believed to be caused by an Endameba histolytica carrier. The seriousness of this epidemic led to the realization that the amoebic dysentery of the tropics could occur in temperate regions. One of the purposes of surveys is to determine the incidence of pathogenic protozoa, with the aim of averting possible epidemics of dysentery, diarrhea, and other minor intestinal disturbances.

    Reliability and performance evaluation of systems containing embedded rule-based expert systems

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
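As a rough illustration of the first two stages, the sketch below builds a toy discrete-time Markov reliability model in which one key expert-system performance parameter, the probability of a correct rule-base decision, drives the transition probabilities, and then sweeps that parameter in a sensitivity-analysis style. The state set, transition structure, and numerical rates are illustrative assumptions, not the model from the paper.

```python
# Minimal sketch (not the paper's actual model): a discrete-time Markov
# reliability model whose transition probabilities depend on a key expert-
# system performance parameter, here the probability p_correct that the
# rule base produces a correct decision when a fault occurs.
import numpy as np

def reliability(p_correct, failure_rate=1e-3, steps=1000):
    """Probability the system is not in the failed state after `steps` steps."""
    # States: 0 = operational, 1 = degraded (fault present, correctly handled),
    #         2 = failed (fault present, mishandled or unrecovered).
    P = np.array([
        [1 - failure_rate, failure_rate * p_correct, failure_rate * (1 - p_correct)],
        [0.0,              1 - failure_rate,         failure_rate],
        [0.0,              0.0,                      1.0],   # failed is absorbing
    ])
    state = np.array([1.0, 0.0, 0.0])        # start fully operational
    for _ in range(steps):
        state = state @ P
    return state[0] + state[1]

# Stage-2-style sensitivity analysis: how reliability varies with p_correct.
for p in (0.90, 0.95, 0.99):
    print(f"p_correct={p:.2f} -> reliability={reliability(p):.4f}")
```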

    SpaceOAR to improve dosimetric outcomes for monotherapy high-dose-rate prostate implantation in a patient with ulcerative colitis.

    High-dose-rate (HDR) brachytherapy is an attractive option for patients receiving definitive radiation therapy for prostate cancer, with decreased overall dose to the pelvis. However, ulcerative colitis increases rectal toxicity risk and may be a contraindication. A synthetic hydrogel, SpaceOAR (Augmentix Inc., Waltham, MA, USA), can facilitate the use of HDR brachytherapy for patients where rectal toxicity is a limiting factor. SpaceOAR gel (13.19 cc) was utilized in a monotherapy HDR prostate treatment with Ir-192 under transrectal ultrasound guidance, with the intention of decreasing rectal dose. The gel was inserted transperineally 18 days prior to the procedure. The HDR brachytherapy procedure was tolerated without incident. All planning constraints were met, and the following dosimetry was achieved: Prostate – V100% = 97.3%, V150% = 35%, V200% = 14.5%; Urethra – V118% = 0%; Rectum – D2 cc = 51.6%, V75% = 0 cc. The rectum-catheter spacing averaged 6-8 mm, compared with an average of 3 mm for our 10 most recent patients without SpaceOAR. SpaceOAR did not hinder or distort ultrasound imaging or increase treatment time. SpaceOAR successfully increases catheter-rectal wall spacing and decreases rectal dose through improved planning capabilities, while decreasing the likelihood of rectal perforation. One application of this tool is presented to mitigate potential toxicities associated with ulcerative colitis. At the five-month, one-week, and one-day follow-ups, the patient reported no bowel issues following HDR brachytherapy.
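For readers unfamiliar with the dose-volume notation quoted above (VX% is the portion of a structure receiving at least X% of the prescription dose; D2 cc is the minimum dose to the hottest 2 cc), the sketch below computes these metrics from a hypothetical per-voxel dose array. The dose values, voxel size, and structure are fabricated for illustration and are not this patient's data.

```python
# Illustrative only: common definitions of dose-volume metrics, computed
# from a made-up per-voxel dose array expressed in % of prescription dose.
import numpy as np

def v_percent(dose_pct, threshold_pct):
    """Percent of the structure volume receiving >= threshold_pct of prescription."""
    return np.mean(dose_pct >= threshold_pct) * 100.0

def d_cc(dose_pct, voxel_cc, volume_cc):
    """Minimum dose (% of prescription) received by the hottest `volume_cc` of tissue."""
    n_voxels = max(1, int(round(volume_cc / voxel_cc)))
    hottest = np.sort(dose_pct)[::-1][:n_voxels]
    return hottest.min()

rng = np.random.default_rng(0)
rectum_dose = rng.uniform(10, 60, size=5000)   # fabricated rectal dose distribution
print("Rectum V75% =", v_percent(rectum_dose, 75.0), "% of volume")
print("Rectum D2cc =", d_cc(rectum_dose, voxel_cc=0.01, volume_cc=2.0), "% of Rx")
```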

    Generation of Cosmological Seed Magnetic Fields from Inflation with Cutoff

    Inflation has the potential to seed the galactic magnetic fields observed today. However, there is an obstacle to the amplification of the quantum fluctuations of the electromagnetic field during inflation: namely, the conformal invariance of electromagnetic theory on a conformally flat underlying geometry. As the existence of a preferred minimal length breaks the conformal invariance of the background geometry, it is plausible that this effect could generate some electromagnetic field amplification. We show that this scenario is equivalent to endowing the photon with a large negative mass during inflation. This effective mass is negligibly small in a radiation- and matter-dominated universe. Depending on the value of the free parameter of the theory, we show that the seed required by the dynamo mechanism can be generated. We also show that this mechanism can produce the requisite galactic magnetic field without resorting to a dynamo mechanism.
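For context, the sketch below records the standard statement of the obstruction and how an effective photon mass evades it. It is a schematic of textbook results, not the paper's cutoff-specific derivation, and m_eff is a placeholder symbol.

```latex
% Schematic background only, not the paper's derivation.
% Maxwell theory is conformally invariant:
S_{\rm EM} = -\frac{1}{4}\int d^4x\,\sqrt{-g}\,F_{\mu\nu}F^{\mu\nu},
\qquad g_{\mu\nu} = a^2(\eta)\,\eta_{\mu\nu},
% so on a conformally flat FRW background each Fourier mode of the gauge
% field obeys the flat-space oscillator equation
\mathcal{A}_k'' + k^2\,\mathcal{A}_k = 0
% and is not amplified by the expansion.  An effective photon mass breaks
% the invariance and modifies the mode equation to, schematically,
\mathcal{A}_k'' + \bigl[\,k^2 + m_{\rm eff}^2\,a^2(\eta)\,\bigr]\mathcal{A}_k = 0,
% so a sufficiently negative m_eff^2 during inflation allows long-wavelength
% modes to grow and later act as seeds for galactic magnetic fields.
```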

    High Performance Attack Estimation in Large-Scale Network Flows

    Network-based attacks are the major threat to security on the Internet. The volume of traffic and the high variability of the attacks place threat detection squarely in the domain of big data. Conventional approaches are mostly based on signatures. While these are relatively inexpensive computationally, they are inflexible and insensitive to small variations in the attack vector. Therefore, we explored the use of machine learning techniques on real flow data. We found that benign traffic could be identified with high accuracy.
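A minimal sketch of the kind of supervised flow classifier this describes is given below. The feature layout, synthetic labels, and the choice of a random forest are assumptions for illustration, not the authors' pipeline.

```python
# Sketch: train a classifier on per-flow features (e.g. duration, bytes,
# packets, distinct ports) to separate benign from attack traffic.
# The data here are synthetic stand-ins for real flow records.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 5))                                   # fabricated flow features
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000) > 0).astype(int)  # synthetic labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "attack"]))
```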

    CATHEDRAL: A Fast and Effective Algorithm to Predict Folds and Domain Boundaries from Multidomain Protein Structures

    We present CATHEDRAL, an iterative protocol for determining the location of previously observed protein folds in novel multidomain protein structures. CATHEDRAL builds on the features of a fast secondary-structure–based method (using graph theory) to locate known folds within a multidomain context and a residue-based, double-dynamic programming algorithm, which is used to align members of the target fold groups against the query protein structure to identify the closest relative and assign domain boundaries. To increase the fidelity of the assignments, a support vector machine is used to provide an optimal scoring scheme. Once a domain is verified, it is excised, and the search protocol is repeated in an iterative fashion until all recognisable domains have been identified. We have performed an initial benchmark of CATHEDRAL against other publicly available structure comparison methods using a consensus dataset of domains derived from the CATH and SCOP domain classifications. CATHEDRAL shows superior performance in fold recognition and alignment accuracy when compared with many equivalent methods. If a novel multidomain structure contains a known fold, CATHEDRAL will locate it in 90% of cases, with <1% false positives. For nearly 80% of assigned domains in a manually validated test set, the boundaries were correctly delineated within a tolerance of ten residues. For the remaining cases, previously classified domains were very remotely related to the query chain so that embellishments to the core of the fold caused significant differences in domain sizes and manual refinement of the boundaries was necessary. To put this performance in context, a well-established sequence method based on hidden Markov models was only able to detect 65% of domains, with 33% of the subsequent boundaries assigned within ten residues. Since, on average, 50% of newly determined protein structures contain more than one domain unit, and typically 90% or more of these domains are already classified in CATH, CATHEDRAL will considerably facilitate the automation of protein structure classification.
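The iterative scan-align-excise protocol can be summarized structurally as below. The component callables stand in for CATHEDRAL's graph-based fold scan, double dynamic programming alignment, SVM scoring, and domain excision; they are placeholders supplied by the caller, not a published API.

```python
# Schematic outline of the iterative scan-align-excise loop described above.
# `scan`, `align`, `score`, and `excise` are caller-supplied stand-ins; `align`
# is assumed to return a dict with "fold_id" and "boundaries" keys.
def assign_domains(query, fold_library, scan, align, score, excise, threshold):
    """Return a list of (fold_id, boundaries) assignments for `query`."""
    assignments = []
    remaining = query
    while True:
        candidates = scan(remaining, fold_library)          # fast SSE/graph-based fold scan
        hits = [align(remaining, c) for c in candidates]    # residue-level DDP alignment
        best = max(hits, key=score, default=None)           # SVM-style confidence score
        if best is None or score(best) < threshold:
            break                                           # nothing recognisable remains
        assignments.append((best["fold_id"], best["boundaries"]))
        remaining = excise(remaining, best["boundaries"])   # cut out the assigned domain, rescan
    return assignments
```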

    Illicit Activity Detection in Large-Scale Dark and Opaque Web Social Networks

    Many online chat applications live in a grey area between the legitimate web and the dark net. The Telegram network in particular can aid criminal activities. Telegram hosts “chats” which consist of varied conversations and advertisements. These chats take place among automated “bots” and human users. Distinguishing legitimate activity from illegitimate activity can aid law enforcement in finding criminals. Social network analysis of Telegram chats presents a difficult problem. Users can change their username or create new accounts, and users involved in criminal activity often do this to obscure their identity. This makes establishing the unique identity behind a given username challenging. Thus we explored classifying users from the language usage in their chat messages. The volume and velocity of Telegram chat data place it well within the domain of big data. Machine learning and natural language processing (NLP) tools are necessary to classify this chat data. We developed NLP tools for classifying users and the chat group to which their messages belong. We found that legitimate and illegitimate chat groups could be classified with high accuracy. We were also able to classify bots, humans, and advertisements within conversations.
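The sketch below shows the general shape of such a message-level classifier (TF-IDF features feeding a linear model). The example messages, labels, and model choice are fabricated assumptions, not the authors' Telegram data or tooling.

```python
# Toy text classifier distinguishing ad-like, bot-like, and human-like messages.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "limited time offer click the link below",      # advertisement-like
    "automated daily rate update usd to btc",       # bot-like
    "hey are you coming to the meetup tonight",     # human-like
    "free followers instant delivery dm now",
    "price bot: current listing updated",
    "thanks, see you tomorrow then",
]
labels = ["ad", "bot", "human", "ad", "bot", "human"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(messages, labels)
print(clf.predict(["click here for free coins", "ok talk to you later"]))
```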

    A retrospective study of circumpubertal cleft lip and palate patients treated in infancy with primary alveolar bone grafting

    Indiana University-Purdue University Indianapolis (IUPUI)
    The Riley Children's Hospital Craniofacial Anomalies Team rigorously follows a treatment protocol developed by Dr. Sheldon Rosenstein for the treatment of cleft lip and palate patients. Rosenstein's protocol incorporates primary bone grafting and alveolar molding appliances for cleft lip and palate patients. While other cleft lip and palate treatment centers utilize alveolar molding appliances, there remains debate concerning the efficacy of primary bone grafting. The principal detraction of primary bone grafting is the concern that such early surgical treatment affects maxillary and craniofacial growth and development. The purpose of this retrospective study was to analyze post-treatment lateral head plates and dental casts of circumpubertal cleft lip and palate patients treated in infancy at Riley Hospital in Indianapolis by the Craniofacial Team following Rosenstein's protocol. The hypothesis was that primary alveolar bone grafting in conjunction with the use of alveolar molding appliances contributes to the early stabilization of the alveolar segments and produces no statistically significant difference in craniofacial development between primary bone grafted patients and nongrafted patients. The dental arch dimensions of the nongrafted patients (control group) consisted of the same data utilized by Moorrees in his study of the dentition of the growing child. The dental arch dimensions of nongrafted cleft patients consisted of the same data utilized by Athanasiou in his study of the dentition of cleft patients treated surgically without bone grafting. Of the eight measurements made by the three examiners, six demonstrated excellent interexaminer agreement, one demonstrated moderate interexaminer agreement, and one demonstrated poor interexaminer agreement. The arch width and length for the grafted group were significantly smaller (p < .05, Student's t-test) than for the normal group in all measures except the mandibular canine width. The arch width and length for the grafted group were not significantly different (p < .05, Student's t-test) from the nongrafted group, except for the maxillary molar width, where the grafted group was smaller than the nongrafted group. The cephalometric values of the Riley group were compared against a nongrafted group, an early primary grafted group, and the Bolton standard values cited in Rosenstein's study. The Bolton standard values were used as the control group. This study found the cephalometric values of the Riley experimental group (treated following Rosenstein's protocol) to show no statistically significant difference (p < .05, Student's t-test) when compared with the cephalometric values of the nongrafted and primary alveolar grafted groups cited in Rosenstein's 1982 study. The cephalometric values of the Riley experimental group were less than the cephalometric values of the nonclefted patients (Bolton standard control group) cited in Rosenstein's 1982 study. Interexaminer agreement ranged from poor to good, with the poorest agreement among the linear values of ANS/PNS and GO/ME. The intraclass correlation coefficient values for SNA, ANB, and SNB ranged from fair to moderate. The Riley cephalometric values were equal to or slightly better than those of Rosenstein's grafted and nongrafted groups. Though smaller than those of the control group, the Riley cephalometric values showed no statistically significant difference (p < .05, Student's t-test) when compared with the same parameters cited in Rosenstein's study.
Although these findings suggest that patients treated following Rosenstein's protocol demonstrate some degree of craniofacial growth attenuation when compared with nonclefted patients (Bolton standard control group), the Riley patients showed no worse growth attenuation than similar patients treated without Rosenstein's protocol for primary alveolar grafting. The hypothesis of this thesis was that Rosenstein's protocol is viable and non-detrimental when compared with other treatment regimens. The results of this study support the hypothesis that Rosenstein's surgical protocol is not a contributing factor in craniofacial growth attenuation among cleft lip and palate patients.
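As a small illustration of the statistical comparison used throughout (a two-sample Student's t-test at p < .05), the sketch below runs such a test on fabricated placeholder measurements; the numbers are not the study's arch-dimension or cephalometric data.

```python
# Toy two-sample Student's t-test of the kind used to compare grafted and
# nongrafted groups; the measurements below are made-up placeholders.
from scipy import stats

grafted    = [33.1, 34.0, 32.5, 33.8, 34.4, 32.9]   # hypothetical arch widths (mm)
nongrafted = [33.5, 34.2, 33.0, 34.1, 34.8, 33.4]

t_stat, p_value = stats.ttest_ind(grafted, nongrafted, equal_var=True)  # Student's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}",
      "-> significant" if p_value < 0.05 else "-> not significant at p < .05")
```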

    [Introduction to] The Cambridge Handbook of Stakeholder Theory

    In the decades since R. Edward Freeman first introduced stakeholder theory, which views firms in terms of their relationships to a broad set of partners, the stakeholder approach has drawn increasing attention as a model for ethical business. Edited by Freeman, alongside other leading scholars in stakeholder theory and strategic management, this handbook provides a comprehensive foundation for study in the field, with eighteen chapters covering some of the most important topics in stakeholder theory written by respected and highly cited experts. The chapters contain an overview of the topic, an examination of the most important research on the topic to date, an evaluation of that research, and suggestions for future directions. Given the pace of new scholarship in the field, this handbook will provide an essential reference on both foundational topics and new applications of stakeholder theory to entrepreneurship, sustainable business, corporate responsibility, and beyond.

    Algorithmic Discovery of Methylation “Hot Spots” in DNA from Lymphoma Patients

    The computational aspects of the problem in this paper involve, firstly, selective mapping of methylated DNA clones according to methylation level and, secondly, extracting motif information from all the mapped elements in the absence of a prior probability distribution. Our novel implementation of algorithms to map and maximize expectation in this setting has generated data that appear to be distinct for each lymphoma subtype examined. A “clone” represents a polymerase chain reaction (PCR) product (on average ~500 bp) which belongs to a microarray of 8544 such sequences preserving CpG-rich islands (CGIs) [1]. Accumulating evidence indicates that cancers, including lymphomas, demonstrate hypermethylation of CGIs, “silencing” an increasing number of tumor suppressor (TS) genes, which can lead to tumorigenesis.
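To make the expectation-maximization step concrete, the sketch below implements a generic, textbook-style EM motif search (one motif occurrence per sequence, uniform background) over toy sequences. It is not the authors' implementation, and the input sequences, motif width, and pseudocounts are illustrative assumptions.

```python
# Generic EM motif finder (OOPS model, uniform background) over toy DNA clones.
import numpy as np

BASES = "ACGT"
IDX = {b: i for i, b in enumerate(BASES)}

def em_motif(seqs, w, iters=50, seed=0):
    """Estimate a w-column position weight matrix by expectation maximization."""
    rng = np.random.default_rng(seed)
    pwm = rng.dirichlet(np.ones(4), size=w)          # random initial motif model, shape (w, 4)
    bg = 0.25                                        # uniform background probability per base
    for _ in range(iters):
        counts = np.full((w, 4), 0.1)                # pseudocounts for the M-step
        for s in seqs:
            enc = np.array([IDX[b] for b in s])
            starts = len(s) - w + 1
            # E-step: posterior probability that the motif starts at each position.
            like = np.array([
                np.prod(pwm[np.arange(w), enc[j:j + w]]) / bg**w
                for j in range(starts)
            ])
            post = like / like.sum()
            # Accumulate expected base counts for the M-step.
            for j, p in enumerate(post):
                counts[np.arange(w), enc[j:j + w]] += p
        pwm = counts / counts.sum(axis=1, keepdims=True)
    return pwm

seqs = ["ACGTACGTTTACGG", "TTACGGACGTACGA", "GGACGTTTACGGAC"]  # toy clone sequences
print(np.round(em_motif(seqs, w=4), 2))
```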