
    FastDeepIoT: Towards Understanding and Optimizing Neural Network Execution Time on Mobile and Embedded Devices

    Deep neural networks show great potential as solutions to many sensing application problems, but their excessive resource demand slows down execution time, posing a serious impediment to deployment on low-end devices. To address this challenge, recent literature has focused on compressing neural network size to improve performance. We show that changing neural network size does not proportionally affect performance attributes of interest, such as execution time. Rather, extreme run-time nonlinearities exist over the network configuration space. Hence, we propose a novel framework, called FastDeepIoT, that uncovers the non-linear relation between neural network structure and execution time, then exploits that understanding to find network configurations that significantly improve the trade-off between execution time and accuracy on mobile and embedded devices. FastDeepIoT makes two key contributions. First, it automatically learns an accurate and highly interpretable execution time model for deep neural networks on the target device. This is done without prior knowledge of either the hardware specifications or the detailed implementation of the deep learning library used. Second, FastDeepIoT informs a compression algorithm how to minimize execution time on the profiled device without impacting accuracy. We evaluate FastDeepIoT using three different sensing-related tasks on two mobile devices: Nexus 5 and Galaxy Nexus. FastDeepIoT further reduces neural network execution time by 48% to 78% and energy consumption by 37% to 69% compared with state-of-the-art compression algorithms.
    Comment: Accepted by SenSys '1
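    The core idea of profiling a device and regressing execution time against network configuration can be illustrated with a minimal sketch. This is not FastDeepIoT's actual model (the paper's model is richer and learned per device); it is a toy least-squares fit over hypothetical profiled measurements, assuming execution time is roughly linear in a single cost feature such as multiply-accumulate count.

```python
# Toy sketch of profiling-based execution-time modeling: fit measured latency
# against a configuration-derived cost feature. Numbers below are synthetic,
# not from the FastDeepIoT paper.

def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a*x + b, closed form."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical profile: (multiply-accumulates in millions, measured latency in ms)
profile = [(10, 5.2), (20, 9.8), (40, 19.5), (80, 39.1)]
a, b = fit_linear([p[0] for p in profile], [p[1] for p in profile])

def predict(macs):
    """Predict latency (ms) for an unprofiled configuration."""
    return a * macs + b
```

    A key point of the abstract is that real devices violate this linearity assumption, which is exactly why a single global fit like the one above is insufficient and a structure-aware model is needed.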

    Oil and Gas flow Anomaly Detection on offshore naturally flowing wells using Deep Neural Networks

    Dissertation presented as the partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science. The Oil and Gas industry, as never before, faces multiple challenges. It is criticized as dirty and polluting, hence the growing demand for green alternatives. Nevertheless, the world still relies heavily on hydrocarbons, since they remain the most traditional and stable source of energy, as opposed to the extensively promoted hydro, solar, or wind power. Major operators are challenged to produce oil more efficiently to counteract newly arising energy sources, with a smaller climate footprint and more scrutinized expenditure, while facing high skepticism about the industry's future. It has to become greener, and hence to act in a manner not required previously. While most of the tools used by the hydrocarbon E&P industry are expensive and have been in use for many years, it is paramount for the industry's survival and prosperity to apply predictive maintenance technologies that can foresee potential failures, making production safer, lowering downtime, increasing productivity, and diminishing maintenance costs. Much effort has gone into defining the most accurate and effective predictive methods; however, data scarcity limits the speed and capacity for further experimentation. Whilst it would be highly beneficial for the industry to invest in Artificial Intelligence, this research aims to explore, in depth, the subject of anomaly detection, using the open public data from Petrobras that was developed by experts. For this research, deep learning neural networks, namely recurrent neural networks with LSTM and GRU backbones, were implemented for multi-class classification of undesirable events on naturally flowing wells. Further, several hyperparameter optimization tools were explored, mainly focusing on genetic algorithms as among the most advanced methods for such tasks.
The research concluded that the best performing algorithm used 2 stacked GRU layers with the hyperparameter vector [1, 47, 40, 14], which stands for a timestep of 1, 47 hidden units, 40 epochs, and a batch size of 14, producing an F1 score of 0.97. As the world faces many issues, one of which is the detrimental effect of heavy industry on the environment and, as a result, adverse global climate change, this project is an attempt to contribute to the field of applying Artificial Intelligence in the Oil and Gas industry, with the intention of making it more efficient, transparent, and sustainable.
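    The GRU backbone mentioned above can be sketched with its standard gating equations. The weights and inputs below are toy values for illustration only (the dissertation's model has 47 hidden units and vector-valued states; this scalar version just shows the update, reset, and candidate-state computation that defines a GRU step).

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h_prev, x, w):
    """One standard GRU update for a scalar input and scalar hidden state."""
    z = sigmoid(w["wz"] * x + w["uz"] * h_prev + w["bz"])        # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h_prev + w["br"])        # reset gate
    h_cand = math.tanh(w["wh"] * x + w["uh"] * (r * h_prev) + w["bh"])
    return (1 - z) * h_prev + z * h_cand                         # blended state

# Toy weights (hypothetical, not learned from the well data)
w = {"wz": 0.5, "uz": 0.1, "bz": 0.0,
     "wr": 0.5, "ur": 0.1, "br": 0.0,
     "wh": 1.0, "uh": 0.5, "bh": 0.0}

h = 0.0
for x in [0.2, -0.4, 0.9]:   # a short sequence of sensor readings
    h = gru_step(h, x, w)
```

    In a classifier like the one described, the final hidden state would feed a softmax layer over the undesirable-event classes.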

    Predicting Disease Progression Using Deep Recurrent Neural Networks and Longitudinal Electronic Health Record Data

    Electronic Health Records (EHR) are widely adopted throughout healthcare systems and collect longitudinal data that can be used to describe patient phenotypes. From the underlying data structures used in the EHR, discrete data can be extracted and analyzed to improve patient care and outcomes via tasks such as risk stratification and prospective disease management. Temporality is innate to EHR data, yet traditional classification models are limited in this context by the cross-sectional nature of their training and prediction processes. Finding temporal patterns in EHR is especially important because they encode temporal concepts such as event trends, episodes, cycles, and abnormalities. Previous work has applied temporal neural network models to predict clinical intervention time and mortality in the intensive care unit (ICU), and recurrent neural network (RNN) models to predict multiple types of medical conditions as well as medication use. However, such work has been limited in scope and generalizability beyond its immediate use cases. To extend the relevant knowledge base, this study demonstrates a predictive modeling pipeline that can extract and integrate clinical information from the EHR, construct a feature set, and apply a deep recurrent neural network (DRNN) to model complex time-stamped longitudinal data for monitoring and managing the progression of a disease condition. It utilizes longitudinal data from a pediatric patient cohort diagnosed with Neurofibromatosis Type 1 (NF1), one of the most common neurogenetic disorders, occurring in 1 of every 3,000 births without predilection for race, sex, or ethnicity.
The prediction pipeline differs from other efforts to date that have sought to model NF1 progression in that it involves the analysis of multi-dimensional phenotypes, wherein the DRNN is able to model complex non-linear temporal relationships between event points in the longitudinal data. Such an approach is critical when seeking to transition from traditional evidence-based care models to precision medicine paradigms. Furthermore, our predictive modeling pipeline can be generalized and applied to manage progression and stratify risk in other similarly complex diseases, as it can predict multiple sets of sub-phenotypical features from training on longitudinal event sequences.
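    The first stage of such a pipeline, turning time-stamped EHR events into per-patient ordered sequences that an RNN can consume, can be sketched as below. The patient IDs, timestamps, and event codes are invented for illustration; real pipelines would also map codes to embeddings and handle much longer histories.

```python
from collections import defaultdict

# Hypothetical event records: (patient_id, days_since_first_visit, event_code)
events = [
    ("p1", 0, "visit"), ("p1", 90, "mri"), ("p1", 200, "visit"),
    ("p2", 0, "visit"), ("p2", 30, "optic_glioma"),
]

def build_sequences(records, max_len=4, pad="<pad>"):
    """Group events per patient, order them by timestamp, pad to fixed length."""
    by_patient = defaultdict(list)
    for pid, t, code in records:
        by_patient[pid].append((t, code))
    seqs = {}
    for pid, evs in by_patient.items():
        codes = [code for _, code in sorted(evs)]        # chronological order
        seqs[pid] = (codes + [pad] * max_len)[:max_len]  # right-pad / truncate
    return seqs

seqs = build_sequences(events)
```

    Fixed-length padded sequences like these are what allow batched training of a recurrent model while preserving the chronology that cross-sectional models discard.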

    Production Optimization Indexed to the Market Demand Through Neural Networks

    Connectivity, mobility and real-time data analytics are the prerequisites for a new model of intelligent production management that facilitates communication between machines, people and processes and uses technology as the main driver. Many works in the literature treat maintenance and production management as separate approaches, but there is a link between these areas: maintenance and its actions aim to ensure the smooth operation of equipment and avoid unnecessary downtime in production. With the advent of technology, companies are rushing to solve their problems by adopting technologies that fit the most advanced technological concepts, such as Industry 4.0 and 5.0, which are based on the principle of process automation. This approach brings together database technologies, making it possible to monitor the operation of equipment and to study patterns of data behavior that can warn of possible failures. The present thesis intends to forecast pulp production indexed to stock market value. The forecast is made from the pulp production variables of the presses and the stock exchange variables, supported by artificial intelligence (AI) technologies, with the aim of achieving effective planning. To support efficient production management decisions, algorithms were developed and validated with data from five pulp presses, as well as data from other sources, such as steel production and stock exchanges, which were relevant to validating the robustness of the model. This thesis demonstrated the importance of data processing methods and their great relevance at the model input, since they facilitate the process of training and testing the models. The chosen technologies demonstrated good efficiency and versatility in predicting the values of the equipment variables, also demonstrating robustness and optimized computational processing.
The thesis also presents proposals for future developments, namely the further exploration of these technologies, so that market variables can calibrate production through forecasts supported by those same variables.
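    A common preprocessing step for forecasting tasks like this one is turning each time series (press output, market index) into supervised (window, next value) pairs. The sketch below uses invented numbers and a fixed lookback purely for illustration; the thesis's actual data treatment is not specified at this level of detail.

```python
def make_windows(series, lookback=3):
    """Turn a univariate series into (lookback-window, next-value) pairs."""
    pairs = []
    for i in range(len(series) - lookback):
        pairs.append((series[i:i + lookback], series[i + lookback]))
    return pairs

# Hypothetical daily pulp-press output values
output = [100, 102, 101, 105, 107, 110]
pairs = make_windows(output)
# each pair: (values at t-3..t-1, value at t)
```

    With market variables included, each window would concatenate press and exchange features, letting the model learn the production/market relationship the thesis describes.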