
    Machine learning framework for petrochemical industry applications

    Machine learning has many potentially useful applications in the process industry, for example in process monitoring and control. Continuously accumulating process data, together with recent developments in software and hardware that enable more advanced machine learning, fulfil the prerequisites for developing and deploying process-automation-integrated machine learning applications that improve existing functionalities or even implement artificial intelligence. In this master's thesis, a framework is designed and implemented at a proof-of-concept level to enable easy acquisition of process data for use with modern machine learning libraries, and to enable scalable online deployment of the trained models. The literature part of the thesis concentrates on the current state of and approaches to digital advisory systems for process operators, as a potential application to be developed on top of the machine learning framework. The literature study shows that the approaches behind process operators' decision support tools have shifted from rule-based and knowledge-based methods towards machine learning. However, no standard methods emerge, and most of the use cases are quite application-specific. The developed machine learning framework uses both commercial software and open source components with permissive licenses. Data is acquired over OPC UA and then processed in Python, which is currently almost the de facto standard language in data analytics. A microservice architecture with containerization is used for the online deployment, and in a qualitative evaluation it proved to be a versatile and functional solution.
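
    As a rough illustration of the data-acquisition side described above, the sketch below polls a few OPC UA nodes from Python and collects the readings into a table that common machine learning libraries can consume. It is not the thesis implementation: the python-opcua client library, the endpoint URL, the tag names and the sampling interval are all assumptions made for the example.

        import time
        import pandas as pd
        from opcua import Client   # python-opcua; the asyncua package is a newer alternative

        ENDPOINT = "opc.tcp://automation-server:4840"        # hypothetical server address
        NODE_IDS = ["ns=2;s=Reactor1.Temperature",           # hypothetical tag names
                    "ns=2;s=Reactor1.Pressure"]

        def sample_process_data(n_samples=60, interval_s=1.0):
            """Poll a few OPC UA nodes and return the readings as a pandas DataFrame."""
            client = Client(ENDPOINT)
            client.connect()
            try:
                nodes = {nid: client.get_node(nid) for nid in NODE_IDS}
                rows = []
                for _ in range(n_samples):
                    rows.append({nid: node.get_value() for nid, node in nodes.items()})
                    time.sleep(interval_s)
                return pd.DataFrame(rows)
            finally:
                client.disconnect()

        # df = sample_process_data()
        # df can then be passed to scikit-learn, TensorFlow, etc., and a trained model
        # can be wrapped in a containerized microservice for online deployment.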

    In-Situ Process Monitoring for Metal Additive Manufacturing (AM) Through Acoustic Technique

    Additive Manufacturing (AM) is currently a widely used technology in industries such as aerospace, medical, and consumer products. Previously it was mainly used for prototyping, but it is now equally valuable for commercial product manufacturing. A more profound understanding is still needed to track and identify defects during the AM process in order to ensure higher-quality products with less material waste. Nondestructive testing (NDT) has become an essential form of testing for AM parts, and Acoustic Emission (AE) is one of the most widely used methods for in-situ process monitoring. The AE approach has gained a reputation in NDT as one of the most influential and proven techniques in numerous engineering fields, and material testing through AE has become one of the most popular techniques in AM because of its capability to detect defects and anomalies and to monitor the progress of flaws. Various AE approaches have been investigated for in-situ monitoring of AM products, and the preliminary results are promising but call for further work on data analysis and signal processing. AE monitoring makes it possible to find defects during the fabrication process, so that failure of the AM process can be prevented, or the process conditions can be finely tuned to avoid significant damage or waste of material. In this work, AE data recorded over the Directed Energy Deposition (DED) additive manufacturing process was analyzed with machine learning (ML) algorithms to classify different build conditions. Feature extraction is used to obtain the required data for further processing. Wavelet transformation of the signals is used to acquire the time-frequency spectrum of the AE signals for different process conditions, and image processing with a Convolutional Neural Network (CNN) is used to identify the transformed spectra of different build conditions. The identifiers in the AE signals are correlated to part quality by statistical methods. The results show a promising approach for quality evaluation and process monitoring in AM. In this work, the deposition properties at different process conditions are also assessed by optical microscopy, Scanning Electron Microscopy (SEM), Energy-Dispersive X-ray Spectroscopy (EDS), and nanoindentation.
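
    The following sketch illustrates, under stated assumptions, the signal-processing idea described above: a continuous wavelet transform (PyWavelets) turns an AE waveform into a time-frequency image, and a small convolutional network (Keras) classifies that image into a build condition. The sampling rate, wavelet, number of scales, class count and network layout are illustrative choices, not the authors' actual pipeline.

        import numpy as np
        import pywt
        import tensorflow as tf

        def ae_scalogram(signal, fs=1e6, n_scales=64):
            """Continuous wavelet transform of one AE waveform -> (n_scales, len(signal)) image."""
            scales = np.arange(1, n_scales + 1)
            coeffs, _ = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
            image = np.abs(coeffs)
            return image / (image.max() + 1e-12)     # normalise to [0, 1]

        def build_cnn(input_shape, n_classes=3):
            """Small CNN mapping a scalogram image to a build-condition class."""
            model = tf.keras.Sequential([
                tf.keras.layers.Input(shape=input_shape),
                tf.keras.layers.Conv2D(16, 3, activation="relu"),
                tf.keras.layers.MaxPooling2D(),
                tf.keras.layers.Conv2D(32, 3, activation="relu"),
                tf.keras.layers.GlobalAveragePooling2D(),
                tf.keras.layers.Dense(n_classes, activation="softmax"),
            ])
            model.compile(optimizer="adam",
                          loss="sparse_categorical_crossentropy",
                          metrics=["accuracy"])
            return model

        # x = ae_scalogram(recorded_waveform)[..., np.newaxis]   # add a channel axis
        # model = build_cnn(x.shape)
        # model.fit(...) with labelled scalograms for each build condition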

    Managing smart cities with deepint.net

    In this keynote, the evolution of intelligent computer systems will be examined. The need for human capital will be emphasised, as well as the need to follow one's "gut instinct" in problem-solving. We will look at the benefits of combining information and knowledge to solve complex problems and will examine how knowledge engineering facilitates the integration of different algorithms. Furthermore, we will analyse the importance of complementary technologies such as IoT and Blockchain in the development of intelligent systems. It will be shown how tools like "Deep Intelligence" make it possible to create computer systems efficiently and effectively. "Smart" infrastructures need to incorporate all added-value resources so they can offer useful services to society, while reducing costs, ensuring reliability and improving the quality of life of citizens. The combination of AI with IoT and with blockchain offers a world of possibilities and opportunities.

    Efficient Deployment of DeepTech AI Models in Engineering Solutions

    The blockchain system, which appeared in 2009 together with the virtual currency bitcoin, is a record of digital transactions based on a large database in which all financial operations carried out with the electronic currency are registered. The Blockchain (or chain of blocks) is a shared database that works as a ledger for recording purchase and sale operations or any other transaction. It is the technological basis of the operation of bitcoin, for example. It consists of a set of records in a shared online database in which operations, quantities, dates and participants are registered by means of codes. Because it uses cryptographic keys and is distributed across many computers (people), it offers security advantages against manipulation and fraud. A modification to just one of the copies would be useless; the change would have to be made in all of the copies, because the database is open and public.
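
    A toy sketch of the tamper-evidence idea described above: each block stores the hash of the previous block, so altering any single record breaks the chain and is easy to detect. This is a minimal illustration, not a real blockchain; the block fields and example transactions are made up.

        import hashlib
        import json
        import time

        def block_hash(block):
            """Hash of a block's contents together with its predecessor's hash."""
            payload = json.dumps({"timestamp": block["timestamp"],
                                  "transactions": block["transactions"],
                                  "previous_hash": block["previous_hash"]},
                                 sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

        def make_block(transactions, previous_hash):
            """Create a block recording operations, quantities, dates and participants."""
            block = {"timestamp": time.time(),
                     "transactions": transactions,
                     "previous_hash": previous_hash}
            block["hash"] = block_hash(block)
            return block

        def is_chain_valid(chain):
            """Recompute every hash and check that each block points at its predecessor."""
            for i, block in enumerate(chain):
                if block["hash"] != block_hash(block):
                    return False                     # block contents were altered
                if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
                    return False                     # chain link is broken
            return True

        genesis = make_block(["A pays B 1 BTC"], previous_hash="0" * 64)
        chain = [genesis, make_block(["B pays C 0.5 BTC"], genesis["hash"])]
        print(is_chain_valid(chain))                 # True
        chain[0]["transactions"] = ["A pays B 100 BTC"]
        print(is_chain_valid(chain))                 # False: tampering is detected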

    DeepTech - AI Models in Engineering Solutions

    Artificial Intelligence has revived in the last decade. The need for progress, the growing processing capacity and the low cost of the Cloud have facilitated the development of new, powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and Convolutional Networks is transforming the way we work and is opening new horizons. Thanks to them, we can now analyse data and obtain unimaginable solutions to today's problems. Nevertheless, our success is not entirely based on algorithms; it also comes from our ability to follow our "gut" when choosing the best combination of algorithms for an intelligent artefact. It is about approaching engineering with a lot of knowledge and tact. This involves the use of both connectionist and symbolic systems, and a full understanding of the algorithms used. Moreover, to address today's problems we must work with both historical and real-time data. We must fully comprehend the problem, its time evolution, and the relevance and implications of each piece of data. It is also important to consider development time, costs and the ability to create systems that will interact with their environment, connect with the objects that surround them and manage the data they obtain in a reliable manner.

    The role of the AIoT and deepint.net

    AIoT, also known as the intelligence of things, is a term that refers to the new wave of technology combining two major platforms that are very present in today's market: Artificial Intelligence (AI) and the Internet of Things (IoT). As IoT devices will generate large amounts of data, Artificial Intelligence is going to be functionally necessary to deal with these huge volumes if we are to have any chance of making sense of the data. This whole process will be called connected intelligence. To take this step forward and definitively enter the era of the Intelligence of Things, we will need to bring these cognitive and executive capacities, to a greater or lesser extent, to the objects themselves. To do this, we are going to talk more and more about the concept of Edge Computing, which is nothing more than the ability to process data, analyze situations, evaluate possible scenarios and make decisions on the object itself rather than on a server hundreds or thousands of miles away.
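
    A minimal sketch of the edge-computing idea described above: the device evaluates its own sensor readings locally and contacts a remote server only when something noteworthy happens, instead of streaming every raw sample. The sensor window, threshold and uplink function are hypothetical placeholders.

        import statistics

        ALERT_THRESHOLD = 80.0   # e.g. degrees Celsius; illustrative value

        def read_sensor_window():
            """Stand-in for reading a short window of samples from a local sensor."""
            return [71.2, 73.5, 79.9, 84.1]

        def send_to_cloud(summary):
            """Stand-in for a network call to a far-away server (HTTP, MQTT, ...)."""
            print("uplink:", summary)

        def edge_loop_once():
            samples = read_sensor_window()
            # The decision is taken on the device itself ("at the edge"):
            if max(samples) > ALERT_THRESHOLD:
                send_to_cloud({"event": "over_temperature",
                               "mean": statistics.mean(samples),
                               "peak": max(samples)})
            # Otherwise nothing (or only a compact aggregate) leaves the device.

        edge_loop_once()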

    Intelligent Models in Complex Problem Solving

    Artificial Intelligence has revived in the last decade. The need for progress, the growing processing capacity and the low cost of the Cloud have facilitated the development of new, powerful algorithms. The efficiency of these algorithms in Big Data processing, Deep Learning and Convolutional Networks is transforming the way we work and is opening new horizons.

    The 1993 Goddard Conference on Space Applications of Artificial Intelligence

    This publication comprises the papers presented at the 1993 Goddard Conference on Space Applications of Artificial Intelligence held at the NASA/Goddard Space Flight Center, Greenbelt, MD on May 10-13, 1993. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed.