18,280 research outputs found

    Hybrid Approach to Automation, RPA and Machine Learning: a Method for the Human-centered Design of Software Robots

    One of the more prominent trends within Industry 4.0 is the drive to employ Robotic Process Automation (RPA), especially as an element of the Lean approach. Full implementation of RPA is riddled with challenges relating both to the reality of everyday business operations, from SMEs to SSCs and beyond, and to the social effects of a changing job market. Successfully addressing these points requires a solution that adjusts to existing business operations while lowering the negative social impact of the automation process. To achieve these goals we propose a hybrid, human-centered approach to the development of software robots. This design and implementation method combines the Living Lab approach with empowerment through participatory design to kick-start the co-development and co-maintenance of hybrid software robots. Supported by a variety of AI methods and tools, including interactive and collaborative ML in the cloud, these robots transform menial job posts into higher-skilled positions, allowing former employees to stay on as robot co-designers and maintainers, i.e. as co-programmers who supervise the machine learning processes and use tailored high-level RPA Domain Specific Languages (DSLs) to adjust the functioning of the robots and maintain operational flexibility.

    The Role of Big Data Analytics in Industrial Internet of Things

    Big data production in the industrial Internet of Things (IIoT) is evident due to the massive deployment of sensors and Internet of Things (IoT) devices. However, big data processing is challenging due to limited computational, networking and storage resources at the IoT device end. Big data analytics (BDA) is expected to provide operational- and customer-level intelligence in IIoT systems. Although numerous studies on IIoT and BDA exist, only a few have explored the convergence of the two paradigms. In this study, we investigate the recent BDA technologies, algorithms and techniques that can lead to the development of intelligent IIoT systems. We devise a taxonomy by classifying and categorising the literature on the basis of important parameters (e.g. data sources, analytics tools, analytics techniques, requirements, industrial analytics applications and analytics types). We present the frameworks and case studies of various enterprises that have benefited from BDA. We also enumerate the considerable opportunities introduced by BDA in IIoT, and we identify and discuss the indispensable challenges that remain to be addressed as future research directions.

    A Gradient-Aware Search Algorithm for Constrained Markov Decision Processes

    The canonical solution methodology for finite constrained Markov decision processes (CMDPs), where the objective is to maximize the expected infinite-horizon discounted rewards subject to constraints on the expected infinite-horizon discounted costs, is based on linear programming. In this brief, we first prove that the optimization objective in the dual linear program of a finite CMDP is a piecewise-linear convex (PWLC) function with respect to the Lagrange penalty multipliers. Next, we propose a novel two-level Gradient-Aware Search (GAS) algorithm which exploits the PWLC structure to find the optimal state-value function and Lagrange penalty multipliers of a finite CMDP. The proposed algorithm is applied to two stochastic control problems with constraints: robot navigation in a grid world and solar-powered unmanned aerial vehicle (UAV)-based wireless network management. We empirically compare the convergence performance of the proposed GAS algorithm with binary search (BS), Lagrangian primal-dual optimization (PDO), and linear programming (LP). Compared with these benchmark algorithms, the proposed GAS algorithm converges to the optimal solution faster, does not require hyper-parameter tuning, and is not sensitive to the initialization of the Lagrange penalty multiplier. Comment: Submitted as a brief paper to the IEEE TNNLS.
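
The PWLC claim above admits a short self-contained illustration: for a finite CMDP with a single cost constraint, the Lagrangian dual g(lambda) is a maximum over finitely many policies of functions that are affine in lambda, hence piecewise-linear and convex. The sketch below checks this numerically on a tiny hypothetical 2-state, 2-action CMDP; all transition probabilities, rewards, costs and the budget d are made-up numbers for illustration, not taken from the paper.

```python
import numpy as np
from itertools import product

# Hypothetical 2-state, 2-action CMDP (illustrative numbers only).
gamma = 0.9
P = [np.array([[0.8, 0.2], [0.3, 0.7]]),   # P[a][s, s']: transitions under action a
     np.array([[0.1, 0.9], [0.6, 0.4]])]
r = np.array([[1.0, 0.0], [0.5, 2.0]])     # r[s, a]: reward
c = np.array([[0.0, 1.0], [1.0, 0.2]])     # c[s, a]: cost
mu0 = np.array([0.5, 0.5])                 # initial state distribution
d = 3.0                                    # discounted-cost budget

def discounted_value(pi, u):
    """Expected infinite-horizon discounted value of the signal u[s, a]
    under the deterministic policy pi (pi[s] = action)."""
    P_pi = np.stack([P[pi[s]][s] for s in range(2)])
    u_pi = np.array([u[s, pi[s]] for s in range(2)])
    v = np.linalg.solve(np.eye(2) - gamma * P_pi, u_pi)  # (I - gamma*P_pi) v = u_pi
    return mu0 @ v

# One (V_r, V_c) pair per deterministic policy.
policy_values = [(discounted_value(pi, r), discounted_value(pi, c))
                 for pi in product(range(2), repeat=2)]

def dual(lam):
    # Lagrangian dual: a maximum of affine functions of lam, hence PWLC.
    return max(vr - lam * (vc - d) for vr, vc in policy_values)

lams = np.linspace(0.0, 5.0, 201)
g = np.array([dual(l) for l in lams])
# Nonnegative second differences on a uniform grid certify convexity of the samples.
second_diff = g[2:] - 2 * g[1:-1] + g[:-2]
```

The nonnegative second differences are exactly the structure that multiplier searches such as bisection or the gradient-aware search described above can exploit.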

    A Berkeley View of Systems Challenges for AI

    With the increasing commoditization of computer vision, speech recognition and machine translation systems, and the widespread deployment of learning-based back-end technologies such as digital advertising and intelligent infrastructures, AI (Artificial Intelligence) has moved from research labs to production. These changes have been made possible by unprecedented levels of data and computation, by methodological advances in machine learning, by innovations in systems software and architectures, and by the broad accessibility of these technologies. The next generation of AI systems promises to accelerate these developments and to increasingly impact our lives via frequent interactions, making (often mission-critical) decisions on our behalf, often in highly personalized contexts. Realizing this promise, however, raises daunting challenges. In particular, we need AI systems that make timely and safe decisions in unpredictable environments, that are robust against sophisticated adversaries, and that can process ever-increasing amounts of data across organizations and individuals without compromising confidentiality. These challenges will be exacerbated by the end of Moore's Law, which will constrain the amount of data these technologies can store and process. In this paper, we propose several open research directions in systems, architectures, and security that can address these challenges and help unlock AI's potential to improve lives and society. Comment: Berkeley Technical Report.

    Six Key Enablers for Machine Type Communication in 6G

    While 5G is being rolled out in different parts of the globe, a few research groups around the world, such as the Finnish 6G Flagship program, have already started posing the question: What will 6G be? The 6G vision is a data-driven society, enabled by near-instant, unlimited wireless connectivity. Driven by the impetus to provide vertical-specific wireless network solutions, machine type communication, encompassing both its mission-critical and massive-connectivity aspects, is foreseen to be an important cornerstone of 6G development. This article presents an over-arching vision for machine type communication in 6G. In this regard, some relevant performance indicators are first anticipated, followed by a presentation of six key enabling technologies. Comment: 14 pages, five figures, submitted to IEEE Communications Magazine for possible publication.

    Multiuser Computation Offloading and Downloading for Edge Computing with Virtualization

    Mobile-edge computing (MEC) is an emerging technology for enhancing the computational capabilities of mobile devices and reducing their energy consumption by offloading complex computation tasks to nearby servers. Multiuser MEC at servers is widely realized via parallel computing based on virtualization. Due to finite shared I/O resources, interference between virtual machines (VMs), called I/O interference, degrades the computation performance. In this paper, we study the problem of joint radio-and-computation resource allocation (RCRA) in multiuser MEC systems in the presence of I/O interference. Specifically, offloading scheduling algorithms are designed targeting two system performance metrics: sum offloading throughput maximization and sum mobile energy consumption minimization. Their designs are formulated as non-convex mixed-integer programming problems, which account for latency due to offloading, result downloading and parallel computing. A set of low-complexity algorithms is designed based on a decomposition approach, leveraging classic techniques from combinatorial optimization. The resultant algorithms jointly schedule offloading users, control their offloading sizes, and divide time between communication (offloading and downloading) and computation. They are either optimal or close to optimal, as shown by simulation. Comprehensive simulation results demonstrate that accounting for I/O interference endows an offloading controller with robustness against this performance-degrading factor.
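
As a much-simplified illustration of the scheduling idea (not the paper's algorithm, which solves a non-convex mixed-integer program accounting for I/O interference), consider a toy model: users time-share an uplink with a total budget of T seconds, user i offloads at a fixed rate rates[i] bits/s with at most caps[i] bits to send, and interference is ignored. Sum throughput is then maximized by serving users in decreasing rate order. All names and numbers below are hypothetical.

```python
def greedy_offload(rates, caps, T):
    """Toy sum-offloading-throughput maximization: time-share an uplink of
    budget T seconds among users, user i offloading at rates[i] bits/s with
    at most caps[i] bits available. With no I/O interference modeled,
    giving time to the fastest users first is optimal for this toy model."""
    alloc = [0.0] * len(rates)          # seconds granted to each user
    total = 0.0                         # total bits offloaded
    for i in sorted(range(len(rates)), key=lambda i: -rates[i]):
        if T <= 0.0:
            break
        t = min(T, caps[i] / rates[i])  # time needed to drain user i's buffer
        alloc[i] = t
        T -= t
        total += rates[i] * t
    return alloc, total

# User 0 (10 bit/s, 30 bits) drains in 3 s; user 1 then gets the remaining 2 s.
alloc, total = greedy_offload([10.0, 5.0], [30.0, 100.0], 5.0)
# alloc == [3.0, 2.0], total == 40.0
```

The paper's joint design additionally chooses offloading sizes and splits time among offloading, parallel computing and result downloading, all of which this greedy sketch omits.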

    Impact of Artificial Intelligence on Businesses: from Research, Innovation, Market Deployment to Future Shifts in Business Models

    The fast pace of artificial intelligence (AI) and automation is propelling strategists to reshape their business models. This is fostering the integration of AI into business processes, but the consequences of this adoption are underexplored and need attention. This paper focuses on the overall impact of AI on businesses, from research, innovation and market deployment to future shifts in business models. To assess this overall impact, we design a three-dimensional research model based upon Neo-Schumpeterian economics and its three forces, viz. innovation, knowledge, and entrepreneurship. The first dimension deals with research and innovation in AI. In the second dimension, we explore the influence of AI on the global market and the strategic objectives of businesses. Finally, the third dimension examines how AI is shaping business contexts. Additionally, the paper explores the implications of AI for the actors involved, as well as its dark sides. Comment: 38 pages, 10 figures, 3 tables. A part of this work has been presented in DIGITS 201

    Edge Intelligence: The Confluence of Edge Computing and Artificial Intelligence

    Along with the rapid development of communication technologies and the surge in the use of mobile devices, a brand-new computation paradigm, Edge Computing, is surging in popularity. Meanwhile, Artificial Intelligence (AI) applications are thriving thanks to breakthroughs in deep learning and many improvements in hardware architectures. Billions of bytes of data, generated at the network edge, put massive demands on data processing and structural optimization. Thus, there is a strong demand to integrate Edge Computing and AI, which gives birth to Edge Intelligence. In this paper, we divide Edge Intelligence into AI for edge (Intelligence-enabled Edge Computing) and AI on edge (Artificial Intelligence on Edge). The former focuses on providing better solutions to key problems in Edge Computing with the help of popular and effective AI technologies, while the latter studies how to carry out the entire process of building AI models, i.e., model training and inference, on the edge. This paper provides insights into this new inter-disciplinary field from a broader perspective. It discusses the core concepts and the research roadmap, which should provide the necessary background for potential future research initiatives in Edge Intelligence. Comment: 13 pages, 3 figures.

    6G: The Next Frontier

    The current development of 5G networks represents a breakthrough in the design of communication networks, thanks to their ability to provide a single platform enabling a variety of different services, from enhanced mobile broadband communications and automated driving to the Internet-of-Things, with its huge number of connected devices. Nevertheless, looking at the current development of technologies and new services, it is already possible to envision the need to move beyond 5G with a new architecture incorporating new services and technologies. The goal of this paper is to motivate the need to move to a sixth generation (6G) of mobile communication networks, starting from a gap analysis of 5G and predicting a new synthesis of near-future services, such as hologram interfaces, ambient sensing intelligence and a pervasive introduction of artificial intelligence, as well as the incorporation of technologies such as TeraHertz (THz) and Visible Light Communications (VLC), and 3-dimensional coverage. Comment: This paper was submitted to IEEE Vehicular Technologies Magazine on the 7th of January 201

    Blockchain And The Future of the Internet: A Comprehensive Review

    Blockchain is challenging the status quo of the central trust infrastructure currently prevalent in the Internet, moving towards a design principle underscored by decentralization, transparency, and trusted auditability. In ideal terms, blockchain advocates a decentralized, transparent, and more democratic version of the Internet. Being in essence a trusted and decentralized database, blockchain finds applications in fields as varied as the energy sector, forestry, fisheries, mining, material recycling, air pollution monitoring, supply chain management, and their associated operations. In this paper, we present a survey of blockchain-based network applications. Our goal is to cover the evolution of blockchain-based systems that are trying to bring about a renaissance in the existing, mostly centralized, space of network applications. While re-imagining this space with blockchain, we highlight various common challenges, pitfalls, and shortcomings that can occur. Our aim is to make this work a guiding reference for anyone interested in shifting an existing use case towards a blockchain-based solution or in automating one from the ground up. Comment: Under Review in IEEE COMST