4 research outputs found

    Engineering Project Management Modeling Using Artificial Neural Networks

    Performance evaluation of the overall management level of engineering projects is a worthwhile case of study. Exploiting the self-learning, self-adjustment and nonlinear mapping (activation) of inputs to outputs offered by artificial neural networks (ANNs), a performance evaluation model for engineering project management was established. Compared with conventional methods, the influence of human factors is eliminated, which increases the reliability of the evaluation results. Different model structures with different ANN parameters were discussed, and satisfactory results were obtained, giving a new approach to evaluating engineering project management. Keywords: ANN structure, training rate, training time, activation function, performance evaluation
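    A minimal sketch of the kind of model this abstract describes (not the authors' implementation): a small multilayer perceptron that maps management-indicator scores to an overall performance rating. The indicator count, the synthetic training data and the chosen parameters (hidden-layer structure, training rate, training time, activation) are assumptions for illustration only.

        # Sketch: ANN-based performance evaluation of project management (assumed data).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Hypothetical data: 6 management indicators per project, scored 0..1,
        # and a synthetic "expert" overall rating used as the training target.
        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 1.0, size=(40, 6))
        y = X @ np.array([0.25, 0.2, 0.2, 0.15, 0.1, 0.1])

        # Parameters mirroring those the abstract names: ANN structure,
        # training rate, training time and activation function.
        model = MLPRegressor(hidden_layer_sizes=(10,),
                             activation="logistic",   # sigmoid mapping of inputs to outputs
                             learning_rate_init=0.01, # training rate
                             max_iter=2000,           # training time (iterations)
                             random_state=0)
        model.fit(X, y)

        new_project = rng.uniform(0.0, 1.0, size=(1, 6))
        print("predicted performance score:", model.predict(new_project)[0])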

    A taxonomy framework for unsupervised outlier detection techniques for multi-type data sets

    The term "outlier" can generally be defined as an observation that is significantly different from the other values in a data set. The outliers may be instances of error or indicate events. The task of outlier detection aims at identifying such outliers in order to improve the analysis of data and further discover interesting and useful knowledge about unusual events within numerous applications domains. In this paper, we report on contemporary unsupervised outlier detection techniques for multiple types of data sets and provide a comprehensive taxonomy framework and two decision trees to select the most suitable technique based on data set. Furthermore, we highlight the advantages, disadvantages and performance issues of each class of outlier detection techniques under this taxonomy framework

    Multi-Scale Flow Mapping And Spatiotemporal Analysis Of Origin-Destination Mobility Data

    Data on spatial mobility have become increasingly available with the wide use of location-aware technologies such as GPS and smartphones. The analysis of movements is involved in a wide range of domains such as demography, migration, public health, urban studies, transportation and biology. A movement data set consists of a set of moving objects, each having a sequence of sampled locations as the object moves across space. The locations (points) in different trajectories are usually sampled independently, and trajectory data can become very big, such as billions of geotagged tweets, mobile phone records, floating vehicles, or millions of migrants. Movement data can be analyzed to extract a variety of information such as points of interest or hot spots, flow patterns, community structure, and spatial interaction models. However, analyzing and mapping large mobility data and understanding its embedded complex patterns remains a challenging problem due to the massive connections, complex patterns and constrained map space for display.
    My research focuses on the development of scalable and effective computational and visualization approaches to help derive insights from big geographic mobility data, including both origin-destination (OD) data and trajectory data. Specifically, my research contribution has two components: (1) flow clustering and flow mapping of massive flow data, with applications in mapping billions of taxi trips (Chapters 2 and 3); and (2) time series analysis of mobility, with applications in urban event detection (Chapter 4).
    The flow map is the most common approach for visualizing spatial mobility data. However, a flow map quickly becomes illegible as the data size increases, due to the massive intersections and overlapping flows in the limited map space. Constructing flow maps for big mobility data remains a challenging research problem, which demands new approaches for flow pattern extraction and cartographic generalization. I have developed new cartographic generalization approaches to flow mapping, which extract high-level flow patterns from big data through hierarchical flow clustering, kernel-based flow smoothing, and flow abstraction. My approaches represent a significant breakthrough that enables effective flow mapping of big data to discover complex patterns at multiple scales and present a holistic view of high-level information.
    The second area of my research focuses on the time series analysis of urban mobility data, such as taxi trips and geo-social media check-ins, to facilitate scientific understanding of urban dynamics and environments. I have developed new approaches to construct location-based time series from mobility data and decompose each mobility time series into three components, i.e. long-term trend, seasonal periodicity pattern and anomalies, from which urban events, land use types, and changes can be inferred. Specifically, I developed a time series decomposition method for urban event detection, where an event is defined as a time series anomaly deviating significantly from its regular trend and periodicity.
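    A minimal sketch of the event-detection idea described above, not the author's method: build a location-based count time series, decompose it into trend, seasonal and residual components, and flag residuals that deviate strongly from normal variation as candidate events. The hourly counts, the injected spike and the threshold are assumptions for illustration.

        # Sketch: time series decomposition for urban event detection on synthetic counts.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.seasonal import seasonal_decompose

        rng = np.random.default_rng(2)
        hours = pd.date_range("2024-01-01", periods=24 * 28, freq="h")
        daily_cycle = 50 + 30 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
        counts = daily_cycle + rng.normal(0, 5, len(hours))
        counts[300] += 120                       # injected "event": a sudden spike in activity

        series = pd.Series(counts, index=hours)
        parts = seasonal_decompose(series, model="additive", period=24)

        resid = parts.resid.dropna()
        threshold = 4 * resid.std()              # flag residuals far outside normal variation
        events = resid[resid.abs() > threshold]
        print("candidate event times:", list(events.index))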

    Contributions for post processing of wavelet transform with SPIHT ROI coding and application in the transmission of images

    Advisor: Yuzo Iano. Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação. Doctorate in Telecommunications and Telematics; degree: Doctor of Electrical Engineering.
    Lossy image compression is currently an area of great importance, because compression techniques represent an image efficiently and thereby reduce the space required to store it or to transmit it over a communications channel. In particular, the SPIHT (Set Partitioning in Hierarchical Trees) algorithm, widely used in image compression, is simple to implement and can be used in applications where low complexity is required. This study proposes an image compression scheme using a customized storage layout for the DWT (Discrete Wavelet Transform) coefficients, flexible ROI (Region Of Interest) coding, and image compression with the SPIHT algorithm. The application consists of transmitting the corresponding data using turbo coding. The customized DWT storage aims to make better use of memory through the SPIHT algorithm. Generic ROI coding is applied at a high level of the DWT decomposition; at this point, the SPIHT algorithm serves to highlight the regions of interest and transmit them with priority. To keep the processing cost low, the data to be transmitted are encoded with a turbo convolutional scheme, since this scheme is simple to implement on the encoding side. The simulation is implemented in separate, reusable modules. The simulations and analysis show that the proposed scheme reduces both the memory usage and the computational cost of sending images in applications such as satellite image transmission, broadcasting and other media.
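    A minimal sketch of the wavelet front end only, assuming PyWavelets is available: a multi-level 2-D DWT of a placeholder image followed by coefficient thresholding as a crude stand-in for the SPIHT embedded coder. SPIHT itself, ROI prioritization and turbo coding are beyond this short example; the wavelet choice, image data and keep ratio are assumptions.

        # Sketch: 2-D DWT plus coefficient thresholding as a rough analogue of embedded coding.
        import numpy as np
        import pywt

        rng = np.random.default_rng(3)
        image = rng.uniform(0, 255, size=(128, 128))   # placeholder image data

        # Multi-level DWT of the image.
        coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)
        arr, slices = pywt.coeffs_to_array(coeffs)

        # Keep only the largest-magnitude coefficients (a crude stand-in for an
        # embedded bit-plane coder truncated at a target rate).
        keep = 0.05
        thresh = np.quantile(np.abs(arr), 1 - keep)
        arr_compressed = np.where(np.abs(arr) >= thresh, arr, 0.0)

        reconstructed = pywt.waverec2(
            pywt.array_to_coeffs(arr_compressed, slices, output_format="wavedec2"),
            wavelet="bior4.4")
        mse = np.mean((image - reconstructed[:128, :128]) ** 2)
        print(f"kept {keep:.0%} of coefficients, reconstruction MSE = {mse:.2f}")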