185 research outputs found

    Modelling atmospheric ozone concentration using machine learning algorithms

    Get PDF
    Air quality monitoring is one of several important tasks carried out in the area of environmental science and engineering. Accordingly, the development of air quality predictive models can be very useful, as such models can provide early warnings of pollution rising to unsatisfactory levels. The literature review conducted within the research context of this thesis revealed that only a limited number of widely used machine learning algorithms have been employed for modelling the concentrations of atmospheric gases such as ozone and nitrogen oxides. Despite this observation, the research and technology area of machine learning has recently advanced significantly with the introduction of ensemble learning techniques, convolutional and deep neural networks, etc. Given these observations, the research presented in this thesis aims to investigate the effective use of ensemble learning algorithms, with optimised algorithmic settings and an appropriate choice of base-layer algorithms, to create effective and efficient models for the prediction and forecasting of, specifically, ground-level ozone (O3). Three main research contributions have been made by this thesis in the application area of modelling O3 concentrations. As the first contribution, the performance of several ensemble learning (homogeneous and heterogeneous) algorithms was investigated and compared with popular and widely used single base learning algorithms. The results showed the impressive improvement in prediction performance obtainable by using meta-learning (Bagging, Stacking, and Voting) algorithms. The performances of the three investigated meta-learning algorithms were similar, giving an average correlation coefficient of 0.91 in prediction accuracy.
    Thus, as a second contribution, feature selection and parameter-based optimisation were carried out in conjunction with the application of Multilayer Perceptron, Support Vector Machine, Random Forest and Bagging based learning techniques, providing significant improvements in prediction accuracy. The third contribution of the research presented in this thesis comprises the univariate and multivariate forecasting of ozone concentrations based on optimised ensemble learning algorithms. The reported results surpass the accuracy levels previously reported for forecasting ozone concentration variations with widely used single base learning algorithms. In summary, the research conducted within this thesis bridges an existing research gap in big data analytics related to environmental pollution modelling, prediction and forecasting, where present research is largely limited to standard learning algorithms such as Artificial Neural Networks and Support Vector Machines, often available within popular commercial software packages.
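    The kind of meta-learning comparison the abstract describes can be sketched as follows. This is a minimal illustration on synthetic data with default settings; the thesis's actual ozone features, datasets and optimised parameters are not reproduced here.

    ```python
    # Sketch: comparing Bagging, Stacking and Voting regressors by the
    # correlation coefficient between predictions and held-out targets.
    # Synthetic data stands in for the O3 measurements.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import (BaggingRegressor, RandomForestRegressor,
                                  StackingRegressor, VotingRegressor)
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVR

    X, y = make_regression(n_samples=500, n_features=8, n_informative=5,
                           noise=10.0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    base = [("svr", SVR()), ("rf", RandomForestRegressor(random_state=0))]
    models = {
        "bagging":  BaggingRegressor(random_state=0),    # homogeneous ensemble
        "stacking": StackingRegressor(estimators=base),  # heterogeneous, learned combiner
        "voting":   VotingRegressor(estimators=base),    # heterogeneous, averaged
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        r = np.corrcoef(y_te, model.predict(X_te))[0, 1]
        print(f"{name}: correlation coefficient = {r:.2f}")
    ```

    On real monitoring data the same loop would be wrapped in feature selection and hyperparameter search, which is the second contribution described above.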

    A Connotative Space for Supporting Movie Affective Recommendation

    Get PDF
    The problem of relating media content to users' affective responses is addressed here. Previous work suggests that a direct mapping of audio-visual properties into the emotion categories elicited by films is rather difficult, owing to the high variability of individual reactions. To reduce the gap between the objective level of video features and the subjective sphere of emotions, we propose to shift the representation towards the connotative properties of movies, in a space inter-subjectively shared among users. Consequently, the connotative space allows affective descriptions of film videos to be defined, related and compared on an equal footing. An extensive test involving a significant number of users watching famous movie scenes suggests that the connotative space can be related to the affective categories of a single user. We apply this finding to reach high performance in meeting a user's emotional preferences.

    Spartan Daily, April 26, 1982

    Get PDF
    Volume 78, Issue 50
    https://scholarworks.sjsu.edu/spartandaily/6893/thumbnail.jp

    A model to assess organisational information privacy maturity against the Protection of Personal Information Act

    Get PDF
    Includes bibliographical references. Reports on information security breaches have risen dramatically over the past five years, with 2014 accounting for some high-profile breaches including Goldman Sachs, Boeing, AT&T, eBay, AOL, American Express and Apple, to name a few. One report estimates that 868,045,823 records have been breached across 4,347 data breaches made public since 2005 (Privacy Rights Clearing House, 2013). The theft of laptops, the loss of unencrypted USB drives, hackers infiltrating servers, and staff deliberately accessing clients' personal information are all regularly reported (Park, 2014; Privacy Rights Clearing House, 2013). With the rise of data breaches in the Information Age, the South African government enacted the long-awaited Protection of Personal Information (PoPI) Bill at the end of 2013. While South Africa has lagged behind other countries in adopting privacy legislation (the European Union issued its Data Protection Directive in 1995), South African legislators have had the opportunity to draft a privacy Act that draws on the most effective elements of other legislation around the world. Although PoPI has been enacted, a commencement date has still to be decided upon by the Presidency. From PoPI's commencement date, organisations will have an additional year to comply with its requirements, before which they should: review the eight conditions for the lawful processing of personal information set out in Chapter three of the Act; understand the type of personal information they process; review staff training on mobile technologies and limit access to personal information; ensure laptops and other mobile devices have passwords and are preferably encrypted; look at the physical security of the premises where personal data is stored or processed; and assess any service providers who process information on their behalf.
    Given the demands PoPI places on organisations, this research aims to develop a prescriptive model providing organisations with the ability to measure their information privacy maturity based on "generally accepted information security practices and procedures" (Protection of Personal Information Act, No. 4 of 2013, sec. 19(3)). Using a design science research methodology, the development process comprises three distinct design cycles: 1) conceptual foundation, 2) legal evaluation and 3) organisational evaluation. The end result is a privacy maturity model that allows organisations to measure their current information privacy maturity against the PoPI Act. This research contributes to the knowledge of how PoPI impacts South African organisations and, in turn, of how organisations can evaluate their current information privacy maturity in respect of the PoPI Act. The examination and use of global best practices and standards as the foundation for the model, and the integration with the PoPI Act, provide for a unique yet standards-based privacy model aiming to deliver practical benefit to South African organisations.

    ALT-C 2011 Abstracts

    Get PDF
    This is a PDF of the abstracts for all the sessions at the 2011 ALT conference. It is designed to be used alongside the online version of the conference programme. It was made public on 1 September, with a "topped and tailed" version made live on 2 September.

    Security in Large-Scale Internet Elections: A Retrospective Analysis of Elections in Estonia, The Netherlands, and Switzerland

    Full text link

    A conceptual study of automatic and semi-automatic quality assurance techniques for ground image processing

    Get PDF
    This report summarizes the results of a study conducted by Engineering and Economics Research (EER), Inc. under NASA Contract Number NAS5-27513. The study involved the development of preliminary concepts for automatic and semi-automatic quality assurance (QA) techniques for ground image processing. A distinction is made between quality assessment and the more comprehensive quality assurance, which includes decision making and system feedback control in response to quality assessment.

    Constructing fail-controlled nodes for distributed systems: a software approach

    Get PDF
    PhD Thesis. Designing and implementing distributed systems which continue to provide specified services in the presence of processing-site and communication failures is a difficult task. To facilitate their development, distributed systems have been built assuming that their underlying hardware components are fail-controlled, i.e. present a well-defined failure mode. However, if conventional hardware cannot provide the assumed failure mode, there is a need to build processing sites, or nodes, and a communication infrastructure that present the fail-controlled behaviour assumed. Coupling a number of redundant processors within a replicated node is a well-known way of constructing fail-controlled nodes. Computation is replicated and executed simultaneously at each processor, and by applying suitable validation techniques to the outputs generated by the processors (e.g. majority voting, comparison), outputs from faulty processors can be prevented from appearing at the application level. One way of constructing replicated nodes is to introduce hardwired mechanisms that couple replicated processors with specialised validation hardware circuits. Processors are tightly synchronised at the clock-cycle level and have their outputs validated by reliable validation hardware. Another approach is to use software mechanisms to perform the synchronisation of processors and the validation of the outputs. The main advantage of hardware-based nodes is the minimal performance overhead incurred. However, the introduction of special circuits may increase the complexity of the design tremendously; further, every new microprocessor architecture requires considerable redesign overhead. Software-based nodes do not present these problems; on the other hand, they introduce much larger performance overheads to the system. In this thesis we investigate alternative ways of constructing efficient fail-controlled, software-based replicated nodes.
    In particular, we present much more efficient order protocols, which are necessary for the implementation of these nodes. Our protocols, unlike others published to date, do not require the processors' physical clocks to be explicitly synchronised. The main contribution of this thesis is the precise definition of the semantics of a software-based fail-silent node, along with its efficient design, implementation and performance evaluation.
    The Brazilian National Research Council (CNPq/Brasil)
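    The output-validation step described above, in which a faulty processor's output is masked by voting over replica outputs, can be sketched as follows. This is an illustrative fragment only, not the thesis's actual node design or order protocols.

    ```python
    # Sketch: strict-majority validation of replica outputs. When no
    # majority exists, the node emits nothing, i.e. it behaves fail-silent.
    from collections import Counter

    def validate(replica_outputs):
        """Return the strict-majority output, or None when the replicas
        disagree without a majority (a detected, silenced failure)."""
        value, count = Counter(replica_outputs).most_common(1)[0]
        return value if count > len(replica_outputs) // 2 else None

    # A triple-replicated node masks one faulty replica:
    print(validate([42, 42, 7]))   # -> 42
    # Total disagreement is detected rather than propagated:
    print(validate([1, 2, 3]))     # -> None
    ```

    In a real node the replicas' outputs must first be made comparable, which is why the order protocols mentioned above, ensuring every replica processes the same inputs in the same order, are a prerequisite for this voting step.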