
    Execution Management System implementation in international retail company

    Project Work presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. Nowadays, the penetration of new technologies and a changing business environment push companies to compete on the efficiency of their processes. A company's operational processes, and their efficiency, are among its main competitive advantages. To save money and strengthen that advantage, a company should continuously improve its processes, and today it can rely not only on dedicated business-improvement frameworks and techniques but also on specific software that explores the company's existing processes by searching and analysing program logs and process trails in its systems. Process mining software can help companies understand the different variants of their existing processes, compare them to reference models, see the actual efficiency and automation rate, understand the root causes of inefficiency, and take the necessary actions to improve the existing processes. However, process mining software is not yet popular in the CIS retail market, and many companies remain sceptical about using this type of software. So far, not every retail business in CIS countries uses process mining software to understand its processes, and implementations of RPA software that substitute robots and AI for human work are even rarer. The aim of the planned work is to design and implement an EMS system and to present a case study of the implementation
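The core discovery step the abstract describes, extracting process variants from system logs, can be illustrated with a minimal sketch. The event-log format and activity names here are hypothetical, not taken from the thesis: each event is a (case id, activity, timestamp) triple, and a variant is a distinct ordered sequence of activities within a case.

```python
from collections import Counter, defaultdict

def variant_counts(event_log):
    """Group events by case id, order them by timestamp, and count
    how often each distinct activity sequence (variant) occurs."""
    traces = defaultdict(list)
    for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
        traces[case_id].append(activity)
    return Counter(tuple(t) for t in traces.values())

# Hypothetical order-handling log with one case skipping the stock check.
log = [
    ("c1", "receive order", 1), ("c1", "check stock", 2), ("c1", "ship", 3),
    ("c2", "receive order", 1), ("c2", "ship", 2),
    ("c3", "receive order", 1), ("c3", "check stock", 2), ("c3", "ship", 3),
]
for variant, n in variant_counts(log).most_common():
    print(n, " -> ".join(variant))
```

Comparing the discovered variants against a reference model then highlights deviations such as the skipped check in case c2.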

    NPC AI System Based on Gameplay Recordings

    A well-optimized Non-Player Character (NPC), as an opponent or a teammate, is a major part of multiplayer games. Most game bots are built upon rigid systems with a limited number of decisions and animations.
Experienced players can distinguish bots from human players and can predict bot movements and strategies. This reduces the quality of the gameplay experience, so multiplayer game players favour playing against human players rather than NPCs. The VR game market and VR gamers are still a small fraction of the game industry, and multiplayer VR games suffer from loss of their player base if the game owners cannot find other players to play with. This study demonstrates the applicability of an Artificial Intelligence (AI) system based on gameplay recordings for a Virtual Reality (VR) First-Person Shooter (FPS) game called Vrena. The subject game has an uncommon way of movement, in which the players use grappling hooks to navigate. To imitate VR players' movements and gestures, an AI system is developed which uses gameplay recordings as navigation data. The system contains three major functionalities: gameplay recording, data refinement, and navigation. The game environment is sliced into cubic sectors to reduce the number of positional states, and gameplay is recorded by time intervals and actions. The produced game logs are segmented into log sections, which are used to create a look-up table. The look-up table is used for navigating the NPC agent, and the decision mechanism follows an approach similar to the state-action-reward concept. The success of the developed tool was tested via a survey, which provided substantial feedback for improving the system
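The pipeline described above, quantizing positions into cubic sectors, building a look-up table from recordings, and picking actions in state-action-reward style, can be sketched as follows. The sector size, action names, and reward values are illustrative assumptions, not details from the thesis.

```python
import random
from collections import defaultdict

SECTOR = 4.0  # cubic sector edge length in world units (an assumed value)

def sector(pos):
    """Quantize a continuous 3D position into a cubic-sector key,
    collapsing many nearby positions into one positional state."""
    return tuple(int(c // SECTOR) for c in pos)

def build_lookup(recordings):
    """Map each observed sector (state) to its recorded (action, reward) pairs."""
    table = defaultdict(list)
    for pos, action, reward in recordings:
        table[sector(pos)].append((action, reward))
    return table

def choose_action(table, pos):
    """Pick the highest-reward recorded action for the agent's current sector."""
    entries = table.get(sector(pos))
    if not entries:
        return None  # no recording covers this sector; fall back to default AI
    return max(entries, key=lambda ar: ar[1])[0]

# Hypothetical recordings: the first two fall into the same cubic sector.
recs = [((1.0, 0.5, 2.0), "hook_left", 0.3),
        ((1.5, 0.9, 2.2), "hook_right", 0.8),
        ((9.0, 0.0, 0.0), "jump", 0.1)]
table = build_lookup(recs)
print(choose_action(table, (1.2, 0.7, 2.1)))  # prints "hook_right"
```

In the real system the table would be built from segmented log sections rather than single events, but the state-to-action mapping works the same way.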

    ICSEA 2022: the seventeenth international conference on software engineering advances

    The Seventeenth International Conference on Software Engineering Advances (ICSEA 2022), held between October 16th and October 20th, 2022, continued a series of events covering a broad spectrum of software-related topics. The conference covered fundamentals on designing, implementing, testing, validating and maintaining various kinds of software. Several tracks were proposed to treat the topics from theory to practice, in terms of methodologies, design, implementation, testing, use cases, tools, and lessons learned. The conference topics covered classical and advanced methodologies, open source, agile software, as well as software deployment and software economics and education. Other advanced aspects are related to practical run-time concerns, such as run-time vulnerability checking, rejuvenation processes, updates, partial or temporary feature deprecation, software deployment and configuration, and on-line software updates. These aspects trigger implications related to patenting, licensing, engineering education, new ways for software adoption and improvement, and ultimately, to software knowledge management. There are many advanced applications requiring robust, safe, and secure software: disaster recovery applications, vehicular systems, biomedical-related software, biometrics-related software, mission-critical software, e-health-related software, and crisis-situation software. These applications require appropriate software engineering techniques, metrics and formalisms, such as software reuse, appropriate software quality metrics, composition and integration, consistency checking, model checking, provers and reasoning. The nature of research in software varies slightly with the specific discipline researchers work in, yet there is much common ground and room for a sharing of best practice, frameworks, tools, languages and methodologies.
Despite the number of experts we have available, little work is done at the meta level, that is, examining how we go about our research and how this process can be improved. There are questions related to the choice of programming language, IDEs, and documentation styles and standards. Reuse can be of great benefit to research projects, yet reuse of prior research projects introduces special problems that need to be mitigated. The research environment is a mix of creativity and systematic approach, which leads to a creative tension that needs to be managed or at least monitored. Much of the coding in any university is undertaken by research students or young researchers. Issues of skills training, development and quality control can have significant effects on an entire department. In an industrial research setting, the environment is not quite that of industry as a whole, nor does it follow the pattern set by the university. The unique approaches and issues of industrial research may hold lessons for researchers in other domains. We take here the opportunity to warmly thank all the members of the ICSEA 2022 technical program committee, as well as all the reviewers. The creation of such a high-quality conference program would not have been possible without their involvement. We also kindly thank all the authors who dedicated much of their time and effort to contribute to ICSEA 2022. We truly believe that, thanks to all these efforts, the final conference program consisted of top-quality contributions. We also thank the members of the ICSEA 2022 organizing committee for their help in handling the logistics of this event. We hope that ICSEA 2022 was a successful international forum for the exchange of ideas and results between academia and industry and for the promotion of progress in software engineering advances

    Digital Twins and Blockchain for IoT Management

    We live in a data-driven world powered by sensors getting data from anywhere at any time. This advancement is possible thanks to the Internet of Things (IoT). IoT embeds common physical objects with heterogeneous sensing, actuating, and communication capabilities to collect data from the environment and people. These objects, generally known as things, exchange data with other things, entities, computational processes, and systems over the internet. Consequently, a web of devices and computational processes emerges involving billions of entities collecting, processing, and sharing data. As a result, we now have an internet of entities/things that process and produce data, an ever-growing volume that can easily exceed petabytes. Therefore, there is a need for novel management approaches to handle the previously unheard-of number of IoT devices, processes, and data streams. This dissertation focuses on solutions for IoT management using decentralized technologies. A massive number of IoT devices interact with software and hardware components and are owned by different people; therefore, there is a need for decentralized management. Blockchain is a capable and promising distributed ledger technology with features to support decentralized systems with large numbers of devices. People should not have to interact with these devices or data streams directly; therefore, there is a need to abstract access to these components. Digital twins are software artifacts that can abstract an object, a process, or a system to enable communication between the physical and digital worlds. Fog/edge computing is an alternative to the cloud for providing services with lower latency. This research uses blockchain technology, digital twins, and fog/edge computing for IoT management. The systems developed in this dissertation enable configuration, self-management, zero-trust management, and data streaming view provisioning from a fog/edge layer.
In this way, this massive number of things and the data they produce are managed through services distributed across nodes close to them, providing access and configuration security and privacy protection
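The combination the dissertation describes, a digital twin abstracting a device's state while a distributed ledger secures its history, can be illustrated with a toy sketch. The class, field names, and hash-chain stand-in for a blockchain ledger are assumptions for illustration, not the dissertation's actual design.

```python
import hashlib
import json

class DigitalTwin:
    """Minimal twin: mirrors a device's last reported state and keeps a
    hash chain of updates, standing in for blockchain ledger entries."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.state = {}
        self.chain = []  # list of (state_hash, prev_hash) records

    def update(self, reading):
        """Apply a new sensor reading and append a chained hash record."""
        self.state.update(reading)
        prev = self.chain[-1][0] if self.chain else "genesis"
        digest = hashlib.sha256(
            (prev + json.dumps(self.state, sort_keys=True)).encode()
        ).hexdigest()
        self.chain.append((digest, prev))

    def verify(self):
        """Check that every record links to its predecessor (tamper evidence)."""
        prev = "genesis"
        for digest, linked in self.chain:
            if linked != prev:
                return False
            prev = digest
        return True

twin = DigitalTwin("sensor-42")       # hypothetical device id
twin.update({"temp_c": 21.5})
twin.update({"temp_c": 22.0, "humidity": 0.4})
print(twin.verify())  # prints True
```

Consumers interact with the twin rather than the device itself, which is the access-abstraction role the abstract assigns to digital twins; a real deployment would anchor the hashes in an actual blockchain instead of a local list.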

    Introducing CARONTE: a Crawler for Adversarial Resources Over Non Trusted Environments

    The monitoring of underground criminal activities is often automated to maximize data collection and to train ML models that automatically adapt data-collection tools to different communities. On the other hand, sophisticated adversaries may adopt crawling-detection capabilities that can significantly jeopardize researchers' opportunities to perform the data collection, for example by putting their accounts under the spotlight and getting them expelled from the community. This is particularly undesirable in prominent, high-profile criminal communities where entry costs are significant (either monetarily or, for example, in background checks or other trust-building mechanisms). This work presents CARONTE, a tool to semi-automatically learn virtually any forum structure for parsing and data extraction, while maintaining a low profile for the data collection and avoiding the requirement of collecting massive datasets to maintain tool scalability. We showcase CARONTE against four underground forum communities and show that, from the adversary's perspective, CARONTE maintains a profile similar to humans, whereas state-of-the-art crawling tools show clearly distinct and easy-to-detect patterns of automated activity
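One simple reason automated crawlers are easy to detect is their constant request cadence. A hedged sketch of the kind of pacing a low-profile crawler might use (this is an illustration of the general idea, not CARONTE's actual algorithm) draws heavy-tailed delays instead of a fixed interval:

```python
import random

def next_delay(rng, mu=1.8, sigma=0.7, floor=1.0):
    """Heavy-tailed inter-request delay in seconds, loosely mimicking
    human reading pauses; a constant interval is what detectors flag.
    mu/sigma/floor are assumed values, not from the paper."""
    return max(floor, rng.lognormvariate(mu, sigma))

def crawl(pages, fetch, rng):
    """Fetch pages sequentially; in real use, sleep next_delay(rng)
    seconds between requests (omitted here so the sketch runs instantly)."""
    return [fetch(url) for url in pages]

rng = random.Random(7)  # seeded for reproducibility of the demo
delays = [round(next_delay(rng), 1) for _ in range(5)]
print(delays)  # varied, human-scale pauses rather than a constant interval
```

Timing is only one signal; navigation order, session length, and page-coverage patterns matter as well, which is why the paper evaluates the whole behavioral profile against human baselines.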

    Recent developments in hole cleaning technology in deviated well bores for geothermal and petroleum

    This paper reviews recent developments in hole cleaning technology and how they can be used to aid efficient hole cleaning in deviated wells. Successful hole cleaning relies upon integrating optimum drilling fluid properties with the best drilling practices. The ability of the drilling fluid to transport the drill cuttings to the surface is determined by several parameters (cutting density, mud weight, hole size, hole angle, fluid rheology, cutting size, rate of penetration, drill pipe eccentricity, drill pipe rotation speed, fluid phase, cutting transport ratio, and cutting bed properties). Efficient hole cleaning of deviated wells is important yet difficult to perform; deviated wells normally use drilling fluid with lower viscosity and gel-building properties than the vertical section. Deviated wells are an important tool for either boosting the return from existing fields or gaining access to new and formerly inaccessible formations. The need for oil and gas has kept increasing with the world's ever-growing energy consumption, despite efforts to switch to more renewable resources. Petroleum products such as coal, gas and oil still account for over 80% of the world's energy production. Increasing world energy demand exceeds the development of renewable technologies, and gaining access to new formations and extracting most of the oil and gas in current formations will be paramount in giving people access to the energy required to keep the world running. The percentage of the world's energy coming from renewable resources has increased and will hopefully keep increasing, but total energy demand, especially from developing countries with growing populations and a rising standard of living, requires more energy than those countries currently consume, with renewables being too expensive, inefficient, or lacking the required infrastructure for implementation.
The paper is a compilation of recent developments and should give the reader insight into the processes most important for efficient hole cleaning in deviated wells. The topic of efficient hole cleaning is complex, and many different parameters are introduced to understand the role of new developments. A basic understanding of these parameters and their interplay with each other is required to follow the innovation in efficient hole cleaning and the automation of more of the hole cleaning process while drilling deviated wells. The paper also uses the information from the collected studies to write a code, based on recent developments, to aid in controlling the right rate of penetration (ROP) during drilling. No independent research was done for this paper; it is based on the research and literature of others
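Two of the parameters the paper names, annular velocity and cutting transport ratio, can be related in a short sketch. The first function uses the common oilfield approximation for annular velocity (flow rate in gpm, diameters in inches, result in ft/min); the slip velocity and well geometry below are hypothetical example values, not data from the paper.

```python
def annular_velocity(flow_gpm, hole_in, pipe_in):
    """Annular velocity in ft/min using the field approximation
    v = 24.5 * Q / (Dh^2 - Dp^2), Q in gpm, diameters in inches."""
    return 24.5 * flow_gpm / (hole_in**2 - pipe_in**2)

def transport_ratio(v_annular, v_slip):
    """Cutting transport ratio: the fraction of annular velocity that
    actually carries cuttings upward after slip. Values near 1 mean
    efficient cleaning; values at or below 0 mean cuttings fall back."""
    return 1.0 - v_slip / v_annular

# Hypothetical case: 8.5-inch hole, 5-inch drill pipe, 600 gpm,
# and an assumed cutting slip velocity of 40 ft/min.
va = annular_velocity(600, 8.5, 5.0)
print(round(va, 1), round(transport_ratio(va, 40.0), 2))
```

A low transport ratio signals that flow rate, mud rheology, or pipe rotation should be adjusted before increasing ROP, which is the kind of trade-off the paper's code aims to support.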

    Orchestration of distributed ingestion and processing of IoT data for fog platforms

    In recent years there has been an extraordinary growth of the Internet of Things (IoT) and its protocols. The increasing diffusion of electronic devices with identification, computing and communication capabilities is laying the ground for the emergence of a highly distributed service and networking environment. This situation implies an increasing demand for advanced IoT data management and processing platforms. Such platforms require support for multiple protocols at the edge for extended connectivity with the objects, but also need to exhibit uniform internal data organization and advanced data processing capabilities to fulfill the demands of the applications and services that consume IoT data. One of the initial approaches to address this demand is the integration between IoT and the Cloud computing paradigm. There are many benefits to integrating IoT with Cloud computing: the IoT generates massive amounts of data, and Cloud computing provides a pathway for that data to travel to its destination. But today's Cloud computing models do not quite fit the volume, variety, and velocity of data that the IoT generates. Among the new technologies emerging around the Internet of Things, the Fog Computing paradigm has become the most relevant. Fog computing was introduced a few years ago in response to challenges posed by many IoT applications, including requirements such as very low latency, real-time operation, large geo-distribution, and mobility. These low-latency, geo-distributed, mobile environments are also covered by the MEC (Mobile Edge Computing) network architecture, which provides an IT service environment and Cloud computing capabilities at the edge of the mobile network, within the Radio Access Network (RAN) and in close proximity to mobile subscribers. Fog computing addresses use cases with requirements far beyond Cloud-only solution capabilities.
The interplay between Cloud and Fog computing is crucial for the evolution of the so-called IoT, but the reach and specification of such interplay is an open problem. This thesis aims to find the right techniques and design decisions to build a scalable distributed system for the IoT under the Fog Computing paradigm to ingest and process data. The final goal is to explore the trade-offs and challenges in the design of a solution from Edge to Cloud to address opportunities that current and future technologies will bring in an integrated way. This thesis describes an architectural approach that addresses some of the technical challenges behind the convergence between IoT, Cloud and Fog, with special focus on bridging the gap between Cloud and Fog. To that end, new models and techniques are introduced in order to explore solutions for IoT environments. This thesis contributes to the architectural proposals for IoT ingestion and data processing by 1) proposing the characterization of a platform for hosting IoT workloads in the Cloud that provides multi-tenant data stream processing capabilities and interfaces over an advanced data-centric technology, including the building of a state-of-the-art infrastructure to evaluate the performance and validate the proposed solution; 2) studying an architectural approach following the Fog paradigm that addresses some of the technical challenges found in the first contribution, the idea being to study an extension of the model that addresses some of the central challenges behind the convergence of Fog and IoT; and 3) designing a distributed and scalable platform to perform IoT operations in a moving data environment.
The idea, after studying data processing in the Cloud and the suitability of the Fog paradigm for solving IoT challenges close to the Edge, is to define the protocols, the interfaces, and the data management needed to ingest and process data in a distributed and orchestrated manner under the Fog Computing paradigm for IoT in a moving data environment
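The edge-side role the thesis describes, ingesting device streams per tenant and processing them at a fog node before forwarding results toward the cloud, can be sketched minimally. The class, windowing policy, and tenant names are illustrative assumptions, not the thesis's actual platform design.

```python
from collections import defaultdict, deque

class FogIngestor:
    """Toy fog-node ingestor: keeps a per-tenant buffer and runs a simple
    windowed aggregation at the edge, emitting only summaries upstream."""

    def __init__(self, window=3):
        self.window = window
        self.queues = defaultdict(deque)

    def ingest(self, tenant, reading):
        """Buffer a reading; when the window fills, aggregate locally and
        return the summary that would be forwarded toward the cloud."""
        q = self.queues[tenant]
        q.append(reading)
        if len(q) == self.window:
            avg = sum(q) / len(q)
            q.clear()
            return {"tenant": tenant, "avg": avg}
        return None  # buffered at the edge, nothing sent upstream

node = FogIngestor(window=3)
out = [node.ingest("t1", v) for v in (10, 12, 14)]
print(out[-1])  # prints {'tenant': 't1', 'avg': 12.0}
```

The point of the sketch is the division of labor: raw readings stay close to the devices, and only windowed aggregates cross the fog-to-cloud boundary, which is where the thesis's protocol and interface definitions come into play.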

    Deep Reinforcement Learning Approaches for Technology Enhanced Learning

    Artificial Intelligence (AI) has advanced significantly in recent years, transforming various industries and domains. Its ability to extract patterns and insights from large volumes of data has revolutionised areas such as image recognition, natural language processing, and autonomous systems. As AI systems become increasingly integrated into daily human life, there is a growing need for meaningful collaboration and mutual engagement between humans and AI, known as Human-AI Collaboration. This collaboration involves combining AI with human workflows to achieve shared objectives. In the current educational landscape, the integration of AI methods in Technology Enhanced Learning (TEL) has become crucial for providing high-quality education and facilitating lifelong learning. Human-AI Collaboration also plays a vital role in TEL, particularly in Intelligent Tutoring Systems (ITS). The COVID-19 pandemic has further emphasised the need for effective educational technologies to support remote learning and bridge the gap between traditional classrooms and online platforms. To maximise the performance of ITS while minimising the input and interaction required from students, it is essential to design collaborative systems that effectively leverage the capabilities of AI and foster effective collaboration between students and ITS. However, there are several challenges that need to be addressed in this context. One challenge is the lack of clear guidance on designing and building user-friendly systems that facilitate collaboration between humans and AI. This challenge is relevant not only to education researchers but also to Human-Computer Interaction (HCI) researchers and developers. Another challenge is the scarcity of interaction data in the early stages of ITS development, which hampers the accurate modelling of students' knowledge states and learning trajectories, known as the cold start problem.
Moreover, the effectiveness of Intelligent Tutoring Systems (ITS) in delivering personalised instruction is hindered by the limitations of existing Knowledge Tracing (KT) models, which often struggle to provide accurate predictions. Therefore, addressing these challenges is crucial for enhancing the collaborative process between humans and AI in the development of ITS. This thesis aims to address these challenges and improve the collaborative process between students and ITS in TEL. It proposes innovative approaches to generate simulated student behavioural data and enhance the performance of KT models. The thesis starts with a comprehensive survey of human-AI collaborative systems, identifying key challenges and opportunities. It then presents a structured framework for the student-ITS collaborative process, providing insights into designing user-friendly and efficient systems. To overcome the challenge of data scarcity in ITS development, the thesis proposes two student modelling approaches: Sim-GAIL and SimStu. SimStu leverages a deep learning method, the Decision Transformer, to simulate student interactions and enhance ITS training. Sim-GAIL utilises a reinforcement learning method, Generative Adversarial Imitation Learning (GAIL), to generate high-fidelity and diverse simulated student behavioural data, addressing the cold start problem in ITS training. Furthermore, the thesis focuses on improving the performance of KT models. It introduces the MLFBKT model, which integrates multiple features and mines latent relations in student interaction data, aiming to improve the accuracy and efficiency of KT models. Additionally, the thesis proposes the LBKT model, which combines the strengths of the BERT model and LSTM to process long sequence data in KT models effectively. Overall, this thesis contributes to the field of Human-AI collaboration in TEL by addressing key challenges and proposing innovative approaches to enhance ITS training and KT model performance. 
The findings have the potential to improve the learning experiences and outcomes of students in educational settings
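Knowledge Tracing, the task the MLFBKT and LBKT models improve upon, is easiest to see in its classic Bayesian form: after each student response, the estimated probability that the skill is mastered is updated. This sketch shows standard Bayesian Knowledge Tracing, a well-known baseline, not any of the thesis's proposed models; the parameter values are illustrative.

```python
def bkt_update(p_know, correct, p_transit=0.1, p_slip=0.1, p_guess=0.2):
    """One Bayesian Knowledge Tracing step: Bayes-update the mastery
    estimate given the response, then apply the learning transition.
    Slip = wrong answer despite mastery; guess = right answer without it."""
    if correct:
        num = p_know * (1 - p_slip)
        den = num + (1 - p_know) * p_guess
    else:
        num = p_know * p_slip
        den = num + (1 - p_know) * (1 - p_guess)
    posterior = num / den
    return posterior + (1 - posterior) * p_transit

# Illustrative trace: prior mastery 0.3, then a sequence of responses.
p = 0.3
for resp in (True, True, False, True):
    p = bkt_update(p, resp)
print(round(p, 3))  # mastery estimate rises despite one wrong answer
```

Deep models such as those based on the Decision Transformer, GAIL, BERT, or LSTM replace this hand-set parameterisation with representations learned from interaction data, which is precisely where the cold start problem the abstract describes bites: without enough real interactions, there is little to learn from, hence the simulated students of Sim-GAIL and SimStu.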