
    Data augmentation in economic time series: Behavior and improvements in predictions

    The performance of neural networks and statistical models in time series prediction is conditioned by the amount of data available. A lack of observations is one of the main factors limiting how well the underlying patterns and trends can be represented. Using data augmentation techniques based on classical statistical methods and on neural networks, it is possible to generate additional observations and improve the accuracy of the predictions. The particular characteristics of economic time series require that data augmentation techniques do not significantly alter those characteristics, since doing so would degrade the quality of the resulting analysis. This paper analyzes the performance obtained by two data augmentation techniques applied to a time series that is then processed by an ARIMA model and a neural network model to make predictions. The results show a significant improvement in the predictions for the time series augmented by traditional interpolation techniques, which achieves a better fit and correlation with the original series.
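
    To make interpolation-based augmentation concrete, here is a minimal sketch that doubles a series by inserting linear midpoints between consecutive observations; the half-step resampling and the toy growth series are illustrative assumptions, not the paper's exact procedure.

        import numpy as np

        def augment_by_interpolation(series):
            """Double the number of observations by inserting the linear
            midpoint between each pair of consecutive values."""
            series = np.asarray(series, dtype=float)
            midpoints = (series[:-1] + series[1:]) / 2.0
            augmented = np.empty(series.size + midpoints.size)
            augmented[0::2] = series       # original observations
            augmented[1::2] = midpoints    # interpolated observations
            return augmented

        quarterly_growth = [0.4, 0.6, 0.5, 0.9, 1.1]
        print(augment_by_interpolation(quarterly_growth))
        # [0.4  0.5  0.6  0.55 0.5  0.7  0.9  1.   1.1]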

    Human face detection techniques: A comprehensive review and future research directions

    Face detection, an effortless task for humans, is complex to perform on machines. The recent proliferation of computational resources is paving the way for rapid advancement of face detection technology. Many astutely developed algorithms have been proposed to detect faces, yet little heed has been paid to making a comprehensive survey of the available algorithms. This paper provides a fourfold discussion of face detection algorithms. First, we explore a wide variety of available face detection algorithms in five steps: history, working procedure, advantages, limitations, and use in fields other than face detection. Secondly, we include a comparative evaluation among the different algorithms within each method. Thirdly, we provide detailed comparisons among the algorithms to offer an all-inclusive outlook. Lastly, we conclude this study with several promising research directions to pursue. Earlier survey papers on face detection algorithms are limited to technical details and popularly used algorithms. In our study, however, we cover detailed technical explanations of face detection algorithms and various recent sub-branches of neural networks. We present detailed comparisons among the algorithms both overall and within sub-branches, and we provide the strengths and limitations of these algorithms together with a novel literature survey covering their use beyond face detection.
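
    For reference, the classical Viola-Jones detector covered by surveys of this kind ships with OpenCV; the sketch below shows its typical usage, with "portrait.jpg" as a placeholder input.

        import cv2

        # Load OpenCV's pretrained Haar cascade (a Viola-Jones style detector).
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        image = cv2.imread("portrait.jpg")  # placeholder input image
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

        # Multi-scale sliding-window detection; returns (x, y, w, h) boxes.
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("portrait_faces.jpg", image)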

    A Comprehensive Bibliometric Analysis on Social Network Anonymization: Current Approaches and Future Directions

    In recent decades, social network anonymization has become a crucial research field due to its pivotal role in preserving users' privacy. However, the high diversity of approaches introduced in relevant studies poses a challenge to gaining a profound understanding of the field. In response, the current study presents an exhaustive and well-structured bibliometric analysis of the social network anonymization field. To begin, related studies from the period 2007-2022 were collected from the Scopus database and then pre-processed. Following this, VOSviewer was used to visualize the network of authors' keywords. Subsequently, extensive statistical and network analyses were performed to identify the most prominent keywords and trending topics. Additionally, the application of co-word analysis through SciMAT and an Alluvial diagram allowed us to explore the themes of social network anonymization and scrutinize their evolution over time. These analyses culminated in an innovative taxonomy of the existing approaches and an anticipation of potential trends in this domain. To the best of our knowledge, this is the first bibliometric analysis in the social network anonymization field; it offers a deeper understanding of the current state and an insightful roadmap for future research in this domain.
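
    A core step in such co-word analyses is building a keyword co-occurrence network from each paper's keyword list. The sketch below shows the general idea on toy data; the keyword lists are assumptions, and a real study would export the resulting edges to tools like VOSviewer or SciMAT.

        from collections import Counter
        from itertools import combinations

        # Toy stand-in for the keyword lists of the collected papers.
        papers = [
            ["anonymization", "social network", "privacy"],
            ["privacy", "k-anonymity", "social network"],
            ["differential privacy", "social network", "anonymization"],
        ]

        # Count how often each pair of keywords appears in the same paper.
        cooccurrence = Counter()
        for keywords in papers:
            for pair in combinations(sorted(set(keywords)), 2):
                cooccurrence[pair] += 1

        # Strongest edges of the co-word network.
        for (a, b), weight in cooccurrence.most_common(3):
            print(f"{a} -- {b}: {weight}")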

    Parameterizing and Aggregating Activation Functions in Deep Neural Networks

    The nonlinear activation functions applied by each neuron in a neural network are essential for making neural networks powerful representational models. If these are omitted, even deep neural networks reduce to simple linear regression, because a linear combination of linear combinations is still a linear combination. In much of the existing literature on neural networks, just one or two activation functions are selected for the entire network, even though the use of heterogeneous activation functions has been shown to produce superior results in some cases. Even less often employed are activation functions that can adapt their nonlinearities as network parameters, trained along with the standard weights and biases. This dissertation presents a collection of papers that advance the state of heterogeneous and parameterized activation functions. Contributions of this dissertation include three novel parametric activation functions and applications of each, a study evaluating the utility of the parameters in parametric activation functions, an aggregated activation approach to modeling time-series data as an alternative to recurrent neural networks, and an improvement upon existing work that aggregates neuron inputs using products instead of sums.
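
    As a minimal sketch of the general idea, assuming PyTorch: the activation below exposes its nonlinearity as a trainable parameter learned alongside the weights and biases. It is a PReLU-style illustration, not one of the dissertation's three proposed functions.

        import torch
        import torch.nn as nn

        class AdaptiveActivation(nn.Module):
            """Activation whose negative-side slope is a network parameter."""

            def __init__(self, init: float = 0.25):
                super().__init__()
                # 'alpha' is trained jointly with the standard weights and biases.
                self.alpha = nn.Parameter(torch.tensor(init))

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                # Identity for positive inputs; the negative slope adapts
                # during training, letting the nonlinearity change shape.
                return torch.where(x >= 0, x, self.alpha * x)

        # A tiny network mixing heterogeneous activations across layers.
        model = nn.Sequential(
            nn.Linear(8, 16), AdaptiveActivation(),
            nn.Linear(16, 16), nn.Tanh(),
            nn.Linear(16, 1),
        )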

    Intelligent Systems Approach for Classification and Management of Patients with Headache

    Primary headache disorders are among the most common complaints worldwide. The socioeconomic and personal impact of headache disorders is enormous, as they are a leading cause of workplace absence. Consultations for headache are increasing as the population grows, lives longer, and more people live with multiple conditions; however, access to specialist services across the UK is currently inequitable, because the number of trained consultant neurologists in the UK is ten times lower than in other European countries. Additionally, more than two thirds of headache cases presented to primary care are labelled as unspecified headache. Therefore, an alternative pathway to diagnose and manage patients with primary headache could be crucial to reducing the need for specialist assessment and increasing capacity within the current service model. Several recent studies have targeted this issue through the development of clinical decision support systems, which can help non-specialist doctors and general practitioners diagnose patients with primary headache disorders in primary care clinics. However, the majority of these studies followed a rule-based system style, in which the rules were summarised and expressed by a computer engineer. This style carries many downsides, which we discuss later in this dissertation. In this study, we adopt a completely different approach: machine learning is employed for the classification of primary headache disorders, using a dataset of 832 records of patients with primary headaches originating from three medical centres located in Turkey. Three main types of primary headache were derived from the dataset: tension-type headache in both episodic and chronic forms; migraine with and without aura; and trigeminal autonomic cephalalgia, further subdivided into cluster headache, paroxysmal hemicrania, and short-lasting unilateral neuralgiform headache attacks with conjunctival injection and tearing. Six popular machine-learning classifiers, including linear and non-linear ensemble learning methods, in addition to one regression-based procedure, were evaluated for the classification of primary headaches within a supervised learning setting, achieving highest aggregate performance outcomes of AUC 0.923, sensitivity 0.897, and overall classification accuracy of 0.843. This study also introduces the proposed HydroApp system, an M-health based personalised application for the follow-up of patients with long-term conditions such as chronic headache and hydrocephalus. We developed this system under the supervision of headache specialists at Ashford Hospital, London, and neurology experts at the Walton Centre and Alder Hey Hospital, Liverpool. We investigated the acceptance of such an M-health based system via an online questionnaire, in which 86% of paediatric patients and 60% of adult patients expressed interest in using the HydroApp system to manage their conditions. Features and functions offered by the HydroApp system, such as recording headache score, recording general health and well-being, and alerting the treating team, were perceived as very or extremely important from the patients' point of view.

    The study concludes that advances in intelligent systems and M-health applications represent a promising avenue through which to identify alternative solutions, which in turn can increase capacity in the current service model and improve diagnostic capability in the primary headache domain and beyond.
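
    The evaluation protocol described above can be sketched with scikit-learn as follows; the synthetic data and the three classifiers shown stand in for the study's 832-record dataset and its six actual models.

        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_validate

        # Synthetic stand-in for the (unavailable) headache dataset:
        # 832 records, three primary-headache classes.
        X, y = make_classification(n_samples=832, n_features=20, n_classes=3,
                                   n_informative=8, random_state=0)

        classifiers = {
            "logistic regression": LogisticRegression(max_iter=1000),
            "random forest": RandomForestClassifier(random_state=0),
            "gradient boosting": GradientBoostingClassifier(random_state=0),
        }

        for name, clf in classifiers.items():
            scores = cross_validate(clf, X, y, cv=5,
                                    scoring=["accuracy", "roc_auc_ovr"])
            print(f"{name}: accuracy={scores['test_accuracy'].mean():.3f}, "
                  f"AUC={scores['test_roc_auc_ovr'].mean():.3f}")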

    ESSE 2017. Proceedings of the International Conference on Environmental Science and Sustainable Energy

    Environmental science is an interdisciplinary academic field that integrates the physical, biological, and information sciences to study and solve environmental problems. ESSE, the International Conference on Environmental Science and Sustainable Energy, provides a platform for experts, professionals, and researchers to share up-to-date information and stimulate communication with one another. The 2017 edition was held in Suzhou, China, on June 23-25, 2017.

    ML-based data-entry automation and data anomaly detection to support data quality assurance

    Data plays a central role in modern software systems, which are very often powered by machine learning (ML) and used in critical domains of our daily lives, such as finance, health, and transportation. However, the effectiveness of ML-intensive software applications highly depends on the quality of the data. Data quality is affected by data anomalies; data entry errors are one of the main sources of anomalies. The goal of this thesis is to develop approaches to ensure data quality by preventing data entry errors during the form-filling process and by checking the offline data saved in databases. The main contributions of this thesis are:

    1. LAFF, an approach to automatically suggest possible values of categorical fields in data entry forms.
    2. LACQUER, an approach to automatically relax the completeness requirement of data entry forms by deciding when a field should be optional based on the filled fields and historical input instances.
    3. LAFF-AD, an approach to automatically detect data anomalies in categorical columns in offline datasets.

    LAFF and LACQUER focus mainly on preventing data entry errors during the form-filling process. Both approaches can be integrated into data entry applications as efficient and effective strategies to assist the user during the form-filling process. LAFF-AD can be used offline on existing suspicious data to effectively detect anomalies in categorical data. In addition, we performed an extensive evaluation of the three approaches, assessing their effectiveness and efficiency, using real-world datasets.
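
    As a toy illustration of history-based value suggestion for categorical form fields (the general idea behind tools like LAFF, not its actual algorithm), one can rank candidate values by how often they co-occurred with already-filled fields in historical records:

        from collections import Counter, defaultdict

        def build_suggester(history, given_field, target_field):
            """Learn which target values co-occur with each value of a
            previously filled field in historical form instances."""
            table = defaultdict(Counter)
            for record in history:
                table[record[given_field]][record[target_field]] += 1

            def suggest(given_value, k=3):
                return [v for v, _ in table[given_value].most_common(k)]
            return suggest

        # Hypothetical historical inputs for a two-field form.
        history = [
            {"country": "LU", "language": "fr"},
            {"country": "LU", "language": "de"},
            {"country": "LU", "language": "fr"},
            {"country": "IT", "language": "it"},
        ]
        suggest = build_suggester(history, "country", "language")
        print(suggest("LU"))  # ['fr', 'de']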

    Efficient and effective objective image quality assessment metrics

    Acquisition, transmission, and storage of images and videos have increased greatly in recent years. At the same time, there has been an increasing demand for high-quality images and videos to provide a satisfactory quality of experience for viewers. In this respect, high dynamic range (HDR) imaging with higher than 8-bit depth has been an interesting approach for capturing more realistic images and videos. Objective image and video quality assessment plays a significant role in monitoring and enhancing image and video quality in several applications such as image acquisition, image compression, multimedia streaming, image restoration, image enhancement, and displaying. The main contributions of this work are efficient features and similarity maps that can be used to design perceptually consistent image quality assessment tools. In this thesis, perceptually consistent full-reference image quality assessment (FR-IQA) metrics are proposed to assess the quality of natural, synthetic, photo-retouched, and tone-mapped images. In addition, efficient no-reference image quality metrics are proposed to assess JPEG compressed and contrast-distorted images. Finally, we propose a perceptually consistent color-to-gray conversion method, perform a subjective rating, and evaluate existing color-to-gray assessment metrics.

    Existing FR-IQA metrics have two main limitations: their performance is not consistent across different distortions and datasets, and better-performing metrics usually have high complexity. We propose in this thesis an efficient and reliable full-reference image quality evaluator based on new gradient and color similarities. We derive a general deviation pooling formulation and use it to compute a final quality score from the similarity maps. Extensive experimental results verify the high accuracy and consistent performance of the proposed metric on natural, synthetic, and photo-retouched datasets, as well as its low complexity.

    To visualize HDR images on standard low dynamic range (LDR) displays, tone-mapping operators are used to convert HDR into LDR. Given the different bit depths of HDR and LDR, traditional FR-IQA metrics are not able to assess the quality of tone-mapped images. The existing full-reference metric for tone-mapped images, called TMQI, converts both HDR and LDR to an intermediate color space and measures their similarity in the spatial domain. We propose in this thesis a feature-similarity full-reference metric in which the local phase of the HDR image is compared with the local phase of the LDR image. Phase is an important source of information in images, and previous studies have shown that the human visual system responds strongly to points in an image where the phase information is ordered. Experimental results on two available datasets show the very promising performance of the proposed metric.

    No-reference image quality assessment (NR-IQA) metrics are of high interest because in most present and emerging practical real-world applications, the reference signals are not available. In this thesis, we propose two perceptually consistent distortion-specific NR-IQA metrics for JPEG compressed and contrast-distorted images. Based on the edge statistics of JPEG compressed images, we propose an efficient NR-IQA metric for blockiness artifacts that is robust to block size and misalignment. We then consider the quality assessment of contrast-distorted images, a common type of distortion. Higher orders of the Minkowski distance and a power transformation are used to train a low-complexity model that is able to assess contrast distortion with high accuracy. For the first time, the proposed model is also used to classify the type of contrast distortion, which is very useful additional information for image contrast enhancement.

    Beyond its traditional use in the assessment of distortions, objective IQA can be used in other applications, such as the quality assessment of image fusion, color-to-gray image conversion, inpainting, and background subtraction. In the last part of this thesis, a real-time and perceptually consistent color-to-gray image conversion methodology is proposed. The proposed correlation-based method and state-of-the-art methods are compared through subjective and objective evaluation, and a conclusion is drawn on the choice of objective quality assessment metric for color-to-gray image conversion. The conducted subjective ratings can be used in the development of quality assessment metrics for color-to-gray image conversion and to test their performance.
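
    To illustrate the kind of similarity map and pooling involved, here is a minimal sketch of a gradient-similarity FR-IQA score with deviation pooling; the Prewitt kernels, the constant c, and the standard-deviation pooling are illustrative assumptions, not the thesis's exact formulation.

        import numpy as np
        from scipy.ndimage import convolve

        def gradient_magnitude(img):
            # Prewitt operators approximate horizontal/vertical derivatives.
            kx = np.array([[1, 0, -1]] * 3) / 3.0
            gx = convolve(img, kx)
            gy = convolve(img, kx.T)
            return np.hypot(gx, gy)

        def gradient_similarity_score(ref, dist, c=170.0):
            g_ref = gradient_magnitude(ref.astype(float))
            g_dist = gradient_magnitude(dist.astype(float))
            # Per-pixel similarity map in (0, 1]: 1 where gradients agree.
            sim = (2 * g_ref * g_dist + c) / (g_ref ** 2 + g_dist ** 2 + c)
            # Deviation pooling: a lower spread of the similarity map
            # indicates more uniform (hence higher) perceived quality.
            return sim.std()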