
    Verifying the fully “Laplacianised” posterior Naïve Bayesian approach and more

    Mussa and Glen would like to thank Unilever for financial support, whereas Mussa and Mitchell thank the BBSRC for funding this research through grant BB/I00596X/1. Mitchell thanks the Scottish Universities Life Sciences Alliance (SULSA) for financial support.

    Background: In a recent paper, Mussa, Mitchell and Glen (MMG) mathematically demonstrated that the “Laplacian Corrected Modified Naïve Bayes” (LCMNB) algorithm can be viewed as a variant of the so-called Standard Naïve Bayes (SNB) scheme in which the role played by the absence of compound features in assigning a compound to its appropriate class is ignored. MMG also proffered guidelines regarding the conditions under which this omission may hold. Utilising three data sets, the present paper examines the validity of these guidelines in practice. The paper also extends MMG’s work and introduces a new version of the SNB classifier, “Tapered Naïve Bayes” (TNB). TNB neither discards the role of the absence of a feature out of hand nor considers it fully; hence, TNB encapsulates both SNB and LCMNB.

    Results: LCMNB, SNB and TNB performed differently in classifying 4,658, 5,031 and 1,149 ligands (all chosen from the ChEMBL database) distributed over 31 enzymes, 23 membrane receptors, one ion channel, four transporters and one transcription factor as their target proteins. When the number of features utilised was equal to or smaller than the “optimal” number of features for a given data set, SNB classifiers systematically gave better classification results than LCMNB classifiers. The opposite was true when the number of features employed was markedly larger than the “optimal” number for that data set; nonetheless, even these LCMNB performances were worse than the performance achieved by SNB at the “optimal” number of features. TNB classifiers systematically outperformed both SNB and LCMNB classifiers.

    Conclusions: The classification results obtained in this study concur with the mathematically based guidelines given in MMG’s paper: ignoring the role of the absence of a feature out of hand does not necessarily improve the classification performance of the SNB approach; if anything, it can make it worse. The results also lend support to the rationale on which the TNB algorithm rests: handled judiciously, taking the absence of features into account can enhance (not impair) the discriminatory classification power of the SNB approach.
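
    The relationship among the three classifiers can be made concrete with a small sketch. The code below is a minimal illustration, not the authors’ implementation: it assumes binary fingerprint features and smoothed per-class feature probabilities, and it collapses TNB’s tapering into a single scalar weight `alpha` on the absence term, whereas the paper’s actual tapering scheme may be more elaborate.

    ```python
    import numpy as np

    def naive_bayes_log_scores(x, log_prior, p_feat, alpha=1.0, eps=1e-12):
        """Per-class log scores for one compound with binary features x.

        alpha = 1.0   -> Standard Naive Bayes (absence term fully counted)
        alpha = 0.0   -> LCMNB-style scoring (absence term ignored)
        0 < alpha < 1 -> a tapered compromise, in the spirit of TNB

        x         : (n_features,) 0/1 fingerprint vector
        log_prior : (n_classes,) log P(class)
        p_feat    : (n_classes, n_features) smoothed P(feature = 1 | class)
        """
        p = np.clip(p_feat, eps, 1.0 - eps)
        presence = (x * np.log(p)).sum(axis=1)              # present features
        absence = ((1 - x) * np.log(1.0 - p)).sum(axis=1)   # absent features
        return log_prior + presence + alpha * absence       # argmax gives the class
    ```

    With alpha = 1 the score is the full SNB log posterior; with alpha = 0 the absence contribution is dropped, as in LCMNB; intermediate values interpolate between the two regimes.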

    First Steps Towards an Annotated Database of American English

    This paper reports on one of the first steps in building a very large annotated database of American English. We present and discuss the results of an experiment comparing manual part-of-speech tagging with manual verification and correction of automatic stochastic tagging. The experiment shows that correcting is superior to tagging with respect to speed, consistency and accuracy.

    Deducing linguistic structure from the statistics of large corpora

    Within the last two years, approaches using both stochastic and symbolic techniques have proved adequate to deduce lexical ambiguity resolution rules with an error rate of less than 3-4% when trained on moderat…
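
    As a toy illustration of how far corpus statistics alone go toward resolving lexical ambiguity (a deliberately crude baseline, not one of the techniques surveyed in the paper; all names are illustrative), the sketch below builds a most-frequent-tag lexicon from a hand-tagged corpus.

    ```python
    from collections import Counter, defaultdict

    def train_most_frequent_tag(tagged_corpus):
        """tagged_corpus: iterable of (word, tag) pairs from a hand-tagged corpus."""
        counts = defaultdict(Counter)
        for word, tag in tagged_corpus:
            counts[word.lower()][tag] += 1
        # Resolve each word's ambiguity by its most frequent tag in training data.
        return {w: c.most_common(1)[0][0] for w, c in counts.items()}

    def tag_sentence(words, lexicon, default="NN"):
        # Unknown words fall back to a default open-class tag.
        return [(w, lexicon.get(w.lower(), default)) for w in words]
    ```

    Contextual (e.g. bigram/HMM) stochastic models refine this baseline by conditioning on neighbouring tags rather than word identity alone.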

    "On the Spot": travelling artists and Abolitionism, 1770-1830

    Until recently the visual culture of Atlantic slavery has rarely been critically scrutinised. Yet in the first decades of the nineteenth century slavery was frequently represented by European travelling artists, often in the most graphic, sometimes voyeuristic, detail. This paper examines the work of several itinerant artists, in particular Augustus Earle (1793-1838) and Agostino Brunias (1730-1796), whose very mobility along the edges of empire was part of a much larger circulatory system of exchange (people, goods and ideas) and diplomacy that characterised Europe’s Age of Expansion. It focuses on the role of the travelling artist, and visual culture more generally, in the development of British abolitionism between 1770 and 1830. It discusses the broad circulation of slave imagery within European culture and argues for greater recognition of the role of such imagery in the abolitionist debates that divided Britain. Furthermore, it suggests that the epistemological authority conferred on the travelling artist, the quintessential eyewitness, was key to the rhetorical power of his (rarely her) images. Artists such as Earle viewed the New World as a boundless source of fresh material that could potentially propel them to fame and fortune. Johann Moritz Rugendas (1802-1858), on the other hand, was conscious of contributing to a global scientific mission, a Humboldtian imperative that by the 1820s propelled him and others to travel beyond the traditional itinerary of the Grand Tour. Some artists were implicated in the very fabric of slavery itself, particularly those in the British West Indies such as William Clark (working 1820s) and Richard Bridgens (1785-1846); others, particularly those in Brazil, expressed strong abolitionist sentiments. Fuelled by evangelical zeal to record all aspects of the New World, these artists recognised the importance of representing the harsh realities of slave life. Unlike those in the metropole who depicted slavery (most often in caustic satirical drawings), many travelling artists believed strongly in the evidential value of their images, a value attributed to their global mobility. The paper examines the varied and complex means by which visual culture played a significant and often overlooked role in the political struggles that beset the period.

    Understanding meta-population trends of the Australian fur seal, with insights for adaptive monitoring

    Effective ecosystem-based management requires estimates of abundance and population trends of species of interest. Trend analyses are often limited by sparse or short-term abundance estimates for populations that can be logistically difficult to monitor over time, so it is critical to assess regularly the quality of the metrics in long-term monitoring programs. For a monitoring program to provide meaningful data and remain relevant, it needs to incorporate technological improvements and the changing requirements of stakeholders, while maintaining the integrity of the data. In this paper we critically examine the monitoring program for the Australian fur seal (AFS) Arctocephalus pusillus doriferus as an example of an ad hoc monitoring program, co-ordinated across multiple stakeholders as a range-wide census of live pups in the austral summers of 2002, 2007 and 2013. This five-yearly census, combined with historic counts at individual sites, successfully tracked increasing population trends as signs of population recovery up to 2007. The 2013 census identified the first reduction in AFS pup numbers (14,248 live pups, -4.2% change per annum since 2007); however, we have limited information with which to understand this change. We analyse the trends at breeding colonies and perform a power analysis to critically examine the reliability of those trends. We then assess the gaps in the monitoring program and discuss how we may transition this surveillance-style program to an adaptive monitoring program that can evolve over time and achieve its goals. The census results are used for ecosystem-based modelling for fisheries management and for emergency response planning. The ultimate goal for this program is to obtain the data we need with minimal cost, effort and impact on the fur seals. In conclusion, we identify the importance of power analyses for interpreting trends, the value of regularly assessing long-term monitoring programs, and the need for proper design so that adaptive monitoring principles can be applied.
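
    A power analysis of the kind described above can be sketched by simulation: assume a true annual rate of change, generate noisy census counts, and record how often a simple log-linear trend test detects the decline. Everything below is illustrative; the function name, the normal error model and the 10% survey coefficient of variation are assumptions, not the paper’s analysis.

    ```python
    import numpy as np

    def trend_detection_power(n0=14000, annual_change=-0.042, years=15,
                              cv=0.10, n_sim=2000, seed=0):
        """Simulated power to detect a log-linear trend in yearly pup counts.

        n0            starting count
        annual_change proportional change per year (-0.042 = -4.2 %/yr)
        cv            coefficient of variation of each survey count
        """
        rng = np.random.default_rng(seed)
        t = np.arange(years)
        expected = n0 * (1.0 + annual_change) ** t
        detected = 0
        for _ in range(n_sim):
            counts = rng.normal(expected, cv * expected)
            y = np.log(np.clip(counts, 1.0, None))
            slope, intercept = np.polyfit(t, y, 1)       # log-linear trend fit
            resid = y - (slope * t + intercept)
            se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
            if abs(slope / se) > 1.96:  # approximate two-sided test, alpha ~ 0.05
                detected += 1
        return detected / n_sim
    ```

    Running this for the current design and then for alternatives (more census years, lower count error) indicates how a monitoring program could be adapted to reach a target power.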

    Film as architectural theory

    Publications on architectural theory have predominantly taken the form of text-based books, monographs, and articles. With the rise of transdisciplinary and practice-based research in architecture, new opportunities are opening up for other forms of architectural theory, such as film-based mediums, which promise to expand and alter the convention of the written practice of theory. Two possible types of filmic theory are presented here. One follows the method of ethnographic documentary filmmaking inspired by Sarah Pink; the other follows the line of art house filmmaking inspired by Kathryn Ramey. By using film as a medium for constructing knowledge, new discourses on filmic theory can be opened up. It is argued here that film as architectural theory is part of this new discourse, broadening the audience’s engagement with architecture through not only “readership” but also “viewership”.

    Random Matrix Theories in Quantum Physics: Common Concepts

    We review the development of random-matrix theory (RMT) during the last decade. We emphasize both the theoretical aspects, and the application of the theory to a number of fields. These comprise chaotic and disordered systems, the localization problem, many-body quantum systems, the Calogero-Sutherland model, chiral symmetry breaking in QCD, and quantum gravity in two dimensions. The review is preceded by a brief historical survey of the developments of RMT and of localization theory since their inception. We emphasize the concepts common to the above-mentioned fields as well as the great diversity of RMT. In view of the universality of RMT, we suggest that the current development signals the emergence of a new "statistical mechanics": Stochasticity and general symmetry requirements lead to universal laws not based on dynamical principles.
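
    The universality emphasized here is easy to demonstrate numerically. The sketch below is a standard textbook exercise, not taken from the review: it samples Gaussian Orthogonal Ensemble (GOE) matrices and collects normalised nearest-neighbour eigenvalue spacings, whose histogram closely follows the Wigner surmise p(s) = (π/2) s exp(−πs²/4) independently of microscopic details.

    ```python
    import numpy as np

    def goe_spacings(n=200, n_samples=300, seed=0):
        """Normalised nearest-neighbour eigenvalue spacings from GOE matrices."""
        rng = np.random.default_rng(seed)
        spacings = []
        for _ in range(n_samples):
            a = rng.normal(size=(n, n))
            h = (a + a.T) / 2.0                  # real symmetric (GOE) matrix
            ev = np.linalg.eigvalsh(h)           # sorted eigenvalues
            bulk = ev[n // 4 : 3 * n // 4]       # stay in the bulk of the spectrum
            s = np.diff(bulk)
            spacings.append(s / s.mean())        # crude "unfolding" by mean spacing
        return np.concatenate(spacings)

    # Compare a histogram of goe_spacings() with the Wigner surmise:
    # p(s) = (np.pi / 2) * s * np.exp(-np.pi * s**2 / 4)
    ```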

    Meta-analysis of genome-wide association studies from the CHARGE consortium identifies common variants associated with carotid intima media thickness and plaque

    Carotid intima media thickness (cIMT) and plaque determined by ultrasonography are established measures of subclinical atherosclerosis that each predict future cardiovascular disease events. We conducted a meta-analysis of genome-wide association data in 31,211 participants of European ancestry from nine large studies in the setting of the Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) Consortium. We then sought additional evidence to support our findings among 11,273 individuals using data from seven additional studies. In the combined meta-analysis, we identified three genomic regions associated with common carotid intima media thickness and two different regions associated with the presence of carotid plaque (P < 5 × 10⁻⁸). The associated SNPs mapped in or near genes related to cellular signaling, lipid metabolism and blood pressure homeostasis, and two of the regions were associated with coronary artery disease (P < 0.006) in the Coronary Artery Disease Genome-Wide Replication and Meta-Analysis (CARDIoGRAM) consortium. Our findings may provide new insight into pathways leading to subclinical atherosclerosis and subsequent cardiovascular events.
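
    The per-SNP combination step in a consortium analysis of this kind is conventionally an inverse-variance-weighted fixed-effects meta-analysis. The sketch below shows that arithmetic together with the genome-wide significance check at P < 5 × 10⁻⁸; it is a generic illustration with made-up numbers, not the CHARGE pipeline.

    ```python
    import numpy as np
    from scipy.stats import norm

    def fixed_effects_meta(betas, ses):
        """Inverse-variance-weighted fixed-effects meta-analysis for one SNP.

        betas, ses: per-study effect estimates and their standard errors.
        Returns (pooled beta, pooled SE, two-sided p-value).
        """
        betas = np.asarray(betas, dtype=float)
        w = 1.0 / np.asarray(ses, dtype=float) ** 2   # inverse-variance weights
        beta = (w * betas).sum() / w.sum()
        se = np.sqrt(1.0 / w.sum())
        p = 2.0 * norm.sf(abs(beta / se))             # two-sided normal p-value
        return beta, se, p

    # Hypothetical per-study estimates for one SNP:
    beta, se, p = fixed_effects_meta([0.021, 0.018, 0.025], [0.004, 0.005, 0.006])
    print(p < 5e-8)  # conventional genome-wide significance threshold
    ```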