
    A Constraint on Complements in Swahili


    Analysis of dropout learning regarded as ensemble learning

    Deep learning is the state of the art in fields such as visual object recognition and speech recognition. Such learning uses a large number of layers and a huge number of units and connections, so overfitting is a serious problem. Dropout learning was proposed to avoid this problem. Dropout neglects some inputs and hidden units during learning with probability p, and the neglected inputs and hidden units are then combined with the learned network to express the final output. We find that the process of combining the neglected hidden units with the learned network can be regarded as ensemble learning, so we analyze dropout learning from this point of view. Comment: 9 pages, 8 figures, submitted to Conference
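The drop-then-recombine mechanism described in the abstract can be sketched as inverted dropout, where surviving units are rescaled during training so that the combined (full) network can be used unchanged at test time. This is a minimal illustration of the general technique, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, train=True):
    """Drop each unit with probability p during training (inverted dropout):
    surviving activations are scaled by 1/(1 - p), so the expected output
    matches the full network that is used at test time."""
    if not train:
        return x  # test time: use the combined (full) network as-is
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones(100_000)
y = dropout(x, p=0.5)
# In expectation, y.mean() stays close to 1.0 despite half the units being dropped.
```

The rescaling is what makes the test-time network behave like an average over the exponentially many thinned subnetworks, which is the ensemble-learning view the paper analyzes.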

    Hemostatic Agents in Neurosurgery


    Rim Pathway-Mediated Alterations in the Fungal Cell Wall Influence Immune Recognition and Inflammation

    ACKNOWLEDGMENTS We acknowledge Jennifer Lodge, Woei Lam, and Rajendra Upadhya for developing and sharing the chitin and chitosan MTBH assay. We thank Todd Brennan of Duke University for providing MyD88-deficient mice. We acknowledge Neil Gow for providing access to the Dionex HPAEC-PAD instrumentation. We also acknowledge Connie Nichols for critical reading of the manuscript. These experiments were supported by an NIH grant to J.A.A. and F.L.W., Jr. (R01 AI074677). C.M.L.W. was supported by a fellowship provided through the Army Research Office of the Department of Defense (no. W911NF-11-1-0136 f) (F.L.W., Jr.). J.W., L.W., and C.M. were supported by the Wellcome Trust Strategic Award in Medical Mycology and Fungal Immunology (097377) and the MRC Centre for Medical Mycology (MR/N006364/1).
    FUNDING INFORMATION: MRC Centre for Medical Mycology (MR/N006364/1), Carol A. Munro; HHS | NIH | National Institute of Allergy and Infectious Diseases (NIAID) (https://doi.org/10.13039/100000060, R01 AI074677), J. Andrew Alspaugh; Wellcome (https://doi.org/10.13039/100010269, 097377), Carol A. Munro; DOD | United States Army | RDECOM | Army Research Office (ARO) (https://doi.org/10.13039/100000183, W911NF-11-1-0136 f), Chrissy M. Leopold Wager.
    Peer reviewed

    Taylor dispersion of gyrotactic swimming micro-organisms in a linear flow

    The theory of generalized Taylor dispersion for suspensions of Brownian particles is developed to study the dispersion of gyrotactic swimming micro-organisms in a linear shear flow. Such creatures are bottom-heavy and experience a gravitational torque which acts to right them when they are tipped away from the vertical. They also suffer a net viscous torque in the presence of a local vorticity field. The orientation of the cells is intrinsically random but the balance of the two torques results in a bias toward a preferred swimming direction. The micro-organisms are sufficiently large that Brownian motion is negligible but their random swimming across streamlines results in a mean velocity together with diffusion. As an example, we consider the case of vertical shear flow and calculate the diffusion coefficients for a suspension of the alga <i>Chlamydomonas nivalis</i>. This rational derivation is compared with earlier approximations for the diffusivity.
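The torque balance the abstract describes is commonly written in the Pedley–Kessler form; the equation below is a generic sketch of that standard model, not necessarily the exact formulation used in this paper:

```latex
\dot{\mathbf{p}} \;=\; \frac{1}{2B}\bigl[\mathbf{k} - (\mathbf{k}\cdot\mathbf{p})\,\mathbf{p}\bigr] \;+\; \tfrac{1}{2}\,\boldsymbol{\omega}\times\mathbf{p}
```

Here $\mathbf{p}$ is the cell's swimming direction, $\mathbf{k}$ the vertical unit vector, $\boldsymbol{\omega}$ the local vorticity, and $B$ the gyrotactic reorientation time set by the gravitational righting torque. The first term rights the cell toward the vertical, the second tips it with the flow, and their balance produces the biased preferred swimming direction mentioned above.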

    Interdiffusion: A probe of vacancy diffusion in III-V materials

    Copyright 1997 by the American Physical Society.

    Towards a Framework to Elicit and Manage Security and Privacy Requirements from Laws and Regulations

    [Context and motivation] The increasing demand for software systems that process and manage sensitive information has created a need for such systems to comply with relevant laws and regulations, which enforce privacy and other aspects of the stored information. [Question/problem] The task is challenging because the concepts and terminology used in requirements engineering differ largely from those used in the legal domain, and there is a lack of appropriate modelling languages and techniques to support such activities. [Principal ideas/results] The legislation needs to be analysed and aligned with the system requirements. [Contribution] This paper motivates the need for a framework to assist the elicitation and management of security and privacy requirements from relevant legislation, and it briefly presents the foundations of such a framework along with an example.

    Collaborative Deep Learning for Recommender Systems

    Collaborative filtering (CF) is a successful approach commonly used by many recommender systems. Conventional CF-based methods use the ratings given to items by users as the sole source of information for learning to make recommendations. However, the ratings are often very sparse in many applications, causing CF-based methods to degrade significantly in their recommendation performance. To address this sparsity problem, auxiliary information such as item content information may be utilized. Collaborative topic regression (CTR) is an appealing recent method taking this approach, which tightly couples the two components that learn from two different sources of information. Nevertheless, the latent representation learned by CTR may not be very effective when the auxiliary information is very sparse. To address this problem, we generalize recent advances in deep learning from i.i.d. input to non-i.i.d. (CF-based) input and propose in this paper a hierarchical Bayesian model called collaborative deep learning (CDL), which jointly performs deep representation learning for the content information and collaborative filtering for the ratings (feedback) matrix. Extensive experiments on three real-world datasets from different domains show that CDL can significantly advance the state of the art.
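The coupling CDL performs, where each item's latent factor is tied to a learned representation of its content plus an item-specific offset, can be illustrated with a deliberately small numpy sketch. The one-layer linear "encoder", the dimensions, the learning rate, and the toy data are all illustrative assumptions, not the paper's actual SDAE-based architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 20 users, 15 items, 8-dim bag-of-words content per item.
n_users, n_items, n_words, k = 20, 15, 8, 4
content = rng.random((n_items, n_words))
R = rng.random((n_users, n_items))                      # toy "ratings"
mask = (rng.random((n_users, n_items)) < 0.3).astype(float)  # only 30% observed

U = 0.1 * rng.standard_normal((n_users, k))    # user latent factors
W = 0.1 * rng.standard_normal((n_words, k))    # linear content "encoder"
eps = 0.1 * rng.standard_normal((n_items, k))  # item-specific offset

lr, lam = 0.05, 0.01
for _ in range(500):
    V = content @ W + eps               # item factor = content encoding + offset
    err = mask * (R - U @ V.T)          # error on observed entries only
    g = err.T @ U                       # gradient of the fit term w.r.t. V
    U += lr * (err @ V - lam * U)
    W += lr * (content.T @ g - lam * W)  # content gradient flows into the encoder
    eps += lr * (g - lam * eps)

rmse = np.sqrt((mask * (R - U @ (content @ W + eps).T) ** 2).sum() / mask.sum())
```

Because the rating error back-propagates into `W`, items with few or no observed ratings still receive informative factors from their content, which is the point of jointly training the two components rather than fitting them separately.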

    New zebrafish models of neurodegeneration

    In modern biomedicine, the growing need for experimental models that further our understanding of disease conditions and delineate innovative treatments has found in the zebrafish (Danio rerio) a valuable asset for closing the gap between in vitro and in vivo assays. Translating ideas quickly is vital in the field of neurodegeneration, where slowing or preventing the dramatic impact of these diseases on society's welfare is an essential priority. Our research group has pioneered the use of zebrafish in the quest for a faster and better understanding and treatment of neurodegeneration, in concert with, and inspired by, many others who have primed the zebrafish for the study of nervous system disorders and the search for cures. Aware of the many advantages this vertebrate model holds, we present here an update on recent zebrafish models of neurodegeneration, with the goal of stimulating further interest and increasing the number of diseases and applications for which they can be exploited. We do so by citing and commenting on recent breakthroughs made possible by zebrafish, highlighting their benefits for testing therapeutics and dissecting disease mechanisms.

    Economies of Extremes: Lessons from Venture-Capital Decision Making

    An organization's ability to exploit extreme events such as exceptional opportunities depends on its capacity strategy. The venture capital industry illustrates the interplay of expensive capacity and negative externalities from high utilization. The cost of adding a venture capitalist provides a strong incentive to run lean, but such leanness may make it impossible to evaluate all interesting investment opportunities. Using concepts from extreme-value theory, we analyze the trade-off between the costs and benefits arising from an increase in the number of evaluated deals. We ground our analysis in 11 years of archival data from a venture capital firm, representing 3631 deals, the decisions made, the reasons for those decisions, and the decision lead times. The firm identified 20% of arriving deals as worth evaluating during the screening process, but was not able to evaluate approximately 9% of those interesting deals due to a lack of capacity. We show that the value of increasing the number of deals evaluated increases with the tail weight of the distribution of deal values. When the right tail is light, increasing the number of deals evaluated may provide too modest a benefit to justify the cost. When, however, the right tail is heavy, the value of increasing the number of deals is likely to more than compensate for the cost of capacity. Our results provide new insight into the relative value of a chase capacity strategy that emphasizes responsiveness versus a high-utilization heuristic that emphasizes productivity. Our approach can be applied to other search operations such as personnel selection, quality circles seeking to identify root causes, and making employee capacity available for innovation.
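The tail-weight effect can be illustrated with a small Monte Carlo sketch; the distributions, parameters, and sample sizes below are illustrative assumptions, not the firm's data. The expected value of the best deal found grows only modestly with extra evaluation capacity under a light-tailed value distribution, but the gain from doubling capacity keeps growing under a heavy-tailed one:

```python
import numpy as np

rng = np.random.default_rng(2)
TRIALS = 20_000

def expected_best(sampler, n):
    """Monte Carlo estimate of the expected value of the best of n deals."""
    return sampler((TRIALS, n)).max(axis=1).mean()

light = lambda size: rng.exponential(1.0, size)   # light right tail (exponential)
heavy = lambda size: rng.pareto(2.5, size) + 1.0  # heavy right tail (Pareto, alpha = 2.5)

gains_light, gains_heavy = [], []
for n in (10, 20, 40):
    gains_light.append(expected_best(light, 2 * n) - expected_best(light, n))
    gains_heavy.append(expected_best(heavy, 2 * n) - expected_best(heavy, n))

# Light tail: the benefit of doubling capacity stays roughly constant (about ln 2).
# Heavy tail: the benefit of doubling capacity grows with n, so extra evaluation
# capacity is increasingly likely to pay for itself.
```

This mirrors the paper's qualitative conclusion: for a light-tailed deal-value distribution, evaluating more deals yields diminishing relative benefit, while for a heavy-tailed one the marginal value of capacity keeps rising.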