
    The avian dawn chorus across Great Britain: using new technology to study breeding bird song

    The avian dawn chorus is a period of high song output performed daily around sunrise during the breeding season. Singing at dawn is of such significance to birds that they remain motivated to do so amid the noise of numerous others. Yet, we still do not fully understand why the dawn chorus exists. Technological advances in recording equipment, data storage and sound analysis tools now enable collection and scrutiny of large acoustic datasets, encouraging research on sound-producing organisms and promoting ‘the soundscape’ as an indicator of ecosystem health. Using an unrivalled dataset of dawn chorus recordings collected during this thesis, I explore the chorus throughout Great Britain with the prospect of furthering our understanding and appreciation of this daily event. I first evaluate the performance of four automated signal recognition tools (‘recognisers’) when identifying the singing events of target species during the dawn chorus, and devise a new ensemble approach that improves detection of singing events significantly over each of the recognisers in isolation. I then examine daily variation in the timing and peak of the chorus across the country in response to minimum overnight temperature. I conclude that cooler temperatures result in later chorus onset and peak the following dawn, but that the magnitude of this effect is greater at higher latitude sites with cooler and less variable overnight temperature regimes. Next, I present evidence of competition for acoustic space during the dawn chorus between migratory and resident species possessing similar song traits, and infer that this may lead either to fine-scale temporal partitioning of song, such that each competitor maintains optimal output, or to one competitor yielding. 
Finally, I investigate day-to-day attenuation of song during the leaf-out period from budburst through to full leaf in woodland trees, and establish the potential for climate-driven advances in leaf-out phenology to attenuate song if seasonal singing activity in birds has not advanced to the same degree. I find that gradual attenuation of sound through the leaf-out process is dependent on the height of the receiver, and surmise that current advances in leaf-out phenology are unlikely to have an undue effect on song propagation. This project illustrates the advantage of applying new technology to ecological studies of complex acoustic environments, and highlights areas in need of improvement, which is essential if we are to comprehend and preserve our natural soundscapes.
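The abstract does not specify how the ensemble combines the four recognisers; as a hedged illustration of the general idea, one simple scheme pools time-stamped detections from all recognisers and keeps only events confirmed by a minimum number of them. The function name, vote threshold and clustering tolerance below are assumptions for illustration, not the thesis's actual method:

```python
def ensemble_detections(detections_per_recogniser, min_votes=3, tolerance=1.0):
    """Merge singing-event detections (start times in seconds) from several
    recognisers, keeping only events reported by at least `min_votes` of them.
    Detections within `tolerance` seconds are treated as the same event."""
    # Pool all detections, remembering which recogniser produced each one.
    pooled = sorted(
        (t, i) for i, dets in enumerate(detections_per_recogniser) for t in dets
    )
    events, cluster = [], []
    for t, i in pooled:
        if cluster and t - cluster[-1][0] > tolerance:
            # Close the current cluster: count distinct recognisers that voted.
            if len({r for _, r in cluster}) >= min_votes:
                events.append(sum(s for s, _ in cluster) / len(cluster))
            cluster = []
        cluster.append((t, i))
    if cluster and len({r for _, r in cluster}) >= min_votes:
        events.append(sum(s for s, _ in cluster) / len(cluster))
    return events
```

A singing event reported by three of four recognisers survives, while a lone false positive from a single recogniser is discarded.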

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, and a range of techniques is examined, covering model-based approaches, `programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to `learn' the features of event patterns that constitute normal behaviour, and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise.
Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other, increasing detection rates while lowering false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
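The two complementary strategies, and their hybridisation, can be sketched in outline: a signature component encodes known misuses as rules, and an anomaly component flags deviation from a learned profile of normal behaviour. The rule set, feature names and threshold below are purely illustrative; real misuse-detection systems operate on far richer event streams:

```python
def hybrid_detect(event, signature_rules, normal_profile, threshold=3.0):
    """Flag an event as misuse if it matches a known-misuse signature
    (low false negatives on known cases) or deviates strongly from the
    learned profile of normal behaviour (catches unknown cases)."""
    # Signature component: any matching rule is a confirmed known misuse.
    for name, rule in signature_rules.items():
        if rule(event):
            return ("known-misuse", name)
    # Anomaly component: z-score of each feature against the mean/stddev
    # profile learned from normal traffic.
    for feature, value in event.items():
        mean, std = normal_profile.get(feature, (value, 1.0))
        if std > 0 and abs(value - mean) / std > threshold:
            return ("possible-misuse", feature)
    return ("normal", None)
```

The signature path cannot fire on unseen misuses, and the anomaly path may fire on innocent but unfamiliar behaviour, which is exactly the false-negative/false-positive trade-off described above.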

    Complexity perspectives and investment decisions

    Thesis (MPhil (Information Science))--University of Stellenbosch, 2010. ENGLISH ABSTRACT: This thesis investigates investment theory in the light of complexity theory. Insights from diverse fields contain powerful images, metaphors and ways of thinking that allow one to seek new ways of comprehending the nature of the economy, and therefore the nature of investment and the related issues of uncertainty and decision making. Complexity theory views the economy as a dynamic, continuously adaptive, nonlinear system. This is in contrast to traditional or classical economic theory, which views the economy as a simple, linear, deterministic equilibrium system. This thesis is a conceptual study exploring the implications of a complexity worldview for investment decisions by looking at the nature and characteristics of complexity and then overlaying these on the characteristics of the economy. It is argued that complexity is caused by three elements: the structure of the system, human behaviour and exogenous factors. Thereafter follows an analysis of how investment decisions are made in the light of complexity, illustrated by the investment models of two very successful, yet different investors: Warren Buffett and George Soros. Buffett's model hinges on value. He realises that emergent phenomena driven by the irrational behaviour of investors cause the intrinsic values of shares to differ widely from their perceived values. When the quoted or perceived value is lower than the intrinsic value, it is advisable to purchase, as you then have a margin of safety. Over the long term the market recognises the real value of the share. He tries to ignore the vagaries of the market and to focus on fundamentals. His list of fundamentals includes the franchise value of the company, the quality of management and industry dynamics. George Soros, in contrast, utilises emergent patterns to locate potential investments.
His model is that systems are flawed, human thinking and decision making are flawed, and the interaction of the two leads to perturbations and oscillations. He focuses on trying to understand the flaws in systems and in human behaviour, and on finding some kind of pattern that he can utilise to make a profit. It is shown that both investment models can be understood from a complexity perspective and that these two investors built aspects of complexity into their decision models. AFRIKAANSE OPSOMMING (translated): This thesis investigates investment theory in the light of complexity theory. With the help of metaphors and insights from complexity thinking, new ways are sought of understanding the nature of the market and the investment-related aspects of uncertainty and decision making. The complexity perspective sees the economy as a dynamic, adaptive, non-linear system. This is done by conceptually investigating the implications that complexity holds for investment decisions. The nature and characteristics of complex systems are explained and then applied to the economy. It is argued that complexity is caused by three elements: the structure of the system, human behaviour and exogenous factors. Thereafter the practice of investment decision making is analysed in terms of complexity by examining the investment models of two successful but divergent investors, namely Warren Buffett and George Soros. Buffett's model revolves around value. He sees the irrational behaviour of investors as an emergent phenomenon that leads to a gap between intrinsic and perceived value. His investing is based on the assumption that over the longer term the market recognises the intrinsic value. He therefore ignores short-term fluctuations in the perceived value and focuses on fundamentals, among which are the franchise value of the business, the quality of management, and industry dynamics. Soros's model, by contrast, uses emergent patterns to uncover potential investment opportunities. His model is that systems have inherent contradictions, as do human thinking and decision making; this leads to oscillations and perturbations. His focus is on understanding these perturbations in the system and turning them to his advantage. It is shown how both investment models can be understood from a complexity perspective and that the two investors build such aspects into their investment decisions.

    Speech and neural network dynamics


    Immunology as a metaphor for computational information processing : fact or fiction?

    The biological immune system exhibits powerful information processing capabilities, and is therefore of great interest to the computer scientist. A rapidly expanding research area has attempted to model many of the features inherent in the natural immune system in order to solve complex computational problems. This thesis examines the metaphor in detail, in an effort to understand and capitalise on those features of the metaphor which distinguish it from other existing methodologies. Two problem domains are considered: scheduling and data clustering. It is argued that these domains exhibit similar characteristics to the environment in which the biological immune system operates, and therefore that they are suitable candidates for application of the metaphor. For each problem domain, two distinct models are developed, incorporating a variety of immunological principles. The models are tested on a number of artificial benchmark datasets. The success of the models on the problems considered confirms the utility of the metaphor.
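As a toy illustration of the metaphor (not the thesis's actual models, which the abstract does not specify), a heavily simplified clonal-selection clusterer treats data points as antigens and cluster prototypes as antibodies that are cloned, mutated and selected toward the data; all parameters below are invented for the sketch:

```python
import random

def immune_cluster(antigens, n_antibodies=2, generations=50, clones=5, seed=0):
    """Toy clonal-selection clustering on 1-D data: antibodies act as
    cluster prototypes; each antigen stimulates its nearest antibody,
    which is cloned with hypermutation, and the fittest clone survives."""
    rng = random.Random(seed)
    antibodies = [rng.uniform(min(antigens), max(antigens))
                  for _ in range(n_antibodies)]
    for _ in range(generations):
        for antigen in antigens:
            # Selection: the nearest antibody is most stimulated.
            best = min(range(len(antibodies)),
                       key=lambda i: abs(antibodies[i] - antigen))
            # Clonal expansion with mutation; keep the clone (or parent)
            # with the highest affinity for this antigen.
            pool = [antibodies[best] + rng.gauss(0, 0.5) for _ in range(clones)]
            antibodies[best] = min(pool + [antibodies[best]],
                                   key=lambda a: abs(a - antigen))
    return sorted(antibodies)
```

On two well-separated groups of points, the two antibodies settle near the two cluster centres.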

    Intelligent data mining using artificial neural networks and genetic algorithms : techniques and applications

    Data Mining (DM) refers to the analysis of observational datasets to find relationships and to summarize the data in ways that are both understandable and useful. Many DM techniques exist. Compared with other DM techniques, Intelligent Systems (ISs) based approaches, which include Artificial Neural Networks (ANNs), fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as Genetic Algorithms (GAs), are tolerant of imprecision, uncertainty, partial truth, and approximation. They provide flexible information processing capability for handling real-life situations. This thesis is concerned with the ideas behind the design, implementation, testing and application of a novel ISs based DM technique. The unique contribution of this thesis is in the implementation of a hybrid IS DM technique (Genetic Neural Mathematical Method, GNMM) for solving novel practical problems, the detailed description of this technique, and the illustrations of several applications solved by this novel technique. GNMM consists of three steps: (1) GA-based input variable selection, (2) Multi-Layer Perceptron (MLP) modelling, and (3) mathematical programming based rule extraction. In the first step, GAs are used to evolve an optimal set of MLP inputs. An adaptive method based on the average fitness of successive generations is used to adjust the mutation rate, and hence the exploration/exploitation balance. In addition, GNMM uses the elite group and appearance percentage to minimize the randomness associated with GAs. In the second step, MLP modelling serves as the core DM engine in performing classification/prediction tasks. An Independent Component Analysis (ICA) based weight initialization algorithm is used to determine optimal weights before the commencement of training algorithms. The Levenberg-Marquardt (LM) algorithm is used to achieve a second-order speedup compared to conventional Back-Propagation (BP) training.
In the third step, mathematical programming based rule extraction is not only used to identify the premises of multivariate polynomial rules, but also to explore features from the extracted rules based on data samples associated with each rule. Therefore, the methodology can provide regression rules and features not only in the polyhedrons with data instances, but also in the polyhedrons without data instances. A total of six datasets from environmental and medical disciplines were used as case study applications. These datasets involve the prediction of longitudinal dispersion coefficient, classification of electrocorticography (ECoG)/Electroencephalogram (EEG) data, eye bacteria Multisensor Data Fusion (MDF), and diabetes classification (denoted by Data I through to Data VI). GNMM was applied to all six datasets to explore its effectiveness, but the emphasis differed between datasets. For example, the emphasis of Data I and II was to give a detailed illustration of how GNMM works; Data III and IV aimed to show how to deal with difficult classification problems; the aim of Data V was to illustrate the averaging effect of GNMM; and finally Data VI was concerned with GA parameter selection and benchmarking GNMM against other IS DM techniques such as Adaptive Neuro-Fuzzy Inference System (ANFIS), Evolving Fuzzy Neural Network (EFuNN), Fuzzy ARTMAP, and Cartesian Genetic Programming (CGP). In addition, datasets obtained from published works (i.e. Data II & III) or public domains (i.e. Data VI), where previous results were present in the literature, were also used to benchmark GNMM's effectiveness. As a closely integrated system, GNMM has the merit that it needs little human interaction. With some predefined parameters, such as the GA's crossover probability and the shape of the ANNs' activation functions, GNMM is able to process raw data until human-interpretable rules are extracted.
This is an important feature in terms of practice, as quite often users of a DM system have little or no need to fully understand the internal components of such a system. Through case study applications, it has been shown that the GA-based variable selection stage is capable of filtering out irrelevant and noisy variables and improving the accuracy of the model; making the ANN structure less complex and easier to understand; and reducing the computational complexity and memory requirements. Furthermore, rule extraction ensures that the MLP training results are easily understandable and transferable.
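The first GNMM step, GA-based input variable selection, can be sketched as a bitmask GA with an elite group. This is a generic sketch, not the adaptive-mutation variant described above, and the fitness function merely stands in for MLP performance on the selected inputs:

```python
import random

def ga_select_inputs(n_vars, fitness, pop_size=20, generations=30,
                     mutation_rate=0.05, seed=1):
    """Toy GA for input-variable selection: each chromosome is a bitmask
    over candidate inputs; `fitness` scores a subset (in GNMM this score
    would come from MLP modelling performance)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 4]          # elite group survives intact
        children = list(elite)
        while len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_vars)       # one-point crossover
            child = a[:cut] + b[cut:]
            # Bit-flip mutation balances exploration against exploitation.
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)
```

With a fitness that rewards two informative variables and penalises the rest, the GA converges on a mask that keeps the informative inputs and drops the noise.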

    Cyber security threats and challenges in collaborative mixed-reality

    Collaborative Mixed-Reality (CMR) applications are gaining interest in a wide range of areas including games, social interaction, design and health-care. To date, the vast majority of published work has focused on display technology advancements, software, collaboration architectures and applications. However, the potential security concerns that affect collaborative platforms have received limited research attention. In this position paper, we investigate the challenges posed by cyber-security threats to CMR systems. We focus on how typical network architectures facilitate CMR and how their vulnerabilities can be exploited by attackers, and discuss the potential social, monetary, psychological and other harms that may result from such exploits. The main purpose of this paper is to provoke a discussion on CMR security concerns. We highlight insights from a cyber-security threat modelling perspective and also propose potential directions for research and development toward better mitigation strategies. We present a simple, systematic approach to understanding a CMR attack surface through an abstraction-based reasoning framework to identify potential attack vectors. Using this framework, security analysts, engineers, designers and users alike (stakeholders) can identify potential Indicators of Exposure (IoE) and Indicators of Compromise (IoC). Our framework allows stakeholders to reduce their CMR attack surface as well as understand how Intrusion Detection System (IDS) approaches can be adopted for CMR systems. To demonstrate the validity of our framework, we illustrate several CMR attack surfaces through a set of use-cases. Finally, we present a discussion on future directions this line of research should take.
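Abstraction-based attack-surface reasoning of this kind might be sketched, very loosely, as a cross-product of exposed components and applicable threat categories. The layers, exposure tags and threat lists below are invented for illustration and are not the paper's actual framework:

```python
# Threat categories that apply to each kind of exposure; illustrative only.
THREATS_BY_EXPOSURE = {
    "network": ["eavesdropping", "man-in-the-middle", "denial-of-service"],
    "user-input": ["spoofing", "injection"],
    "shared-state": ["tampering", "information-disclosure"],
}

def enumerate_attack_vectors(layers):
    """layers: mapping of layer name -> list of (component, exposures).
    Each exposure acts as an Indicator of Exposure (IoE); pairing it with
    the threats that apply yields candidate attack vectors."""
    vectors = []
    for layer, components in layers.items():
        for component, exposures in components:
            for exposure in exposures:
                for threat in THREATS_BY_EXPOSURE.get(exposure, []):
                    vectors.append((layer, component, exposure, threat))
    return vectors

# A hypothetical two-layer CMR system description.
cmr_system = {
    "transport": [("session-relay", ["network"])],
    "interaction": [("avatar-sync", ["shared-state", "network"])],
}
```

Enumerating the vectors for such a description gives stakeholders a concrete checklist of exposures to reduce or monitor.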