133 research outputs found

    A Pilot Program in Internet-of-things with University and Industry Collaboration: Introduction and Lessons Learned

    Internet-of-Things (IoT) is one of the most prominent technological ecosystems and an engine of growth, with an estimated market size of $14 trillion to $33 trillion by 2025 (McKinsey Global Institute). The IoT ecosystem builds on well-established technologies from many fields while adding new and often challenging requirements to existing techniques. For example, many wireless schemes are being redesigned to address battery life and solution-cost issues. At the same time, industry needs to hire and retrain many technical personnel to address these issues and support this evolving ecosystem across many different markets. Engineering students therefore need to be prepared for these new challenges and for the demands of the hiring market. Just as importantly, more experienced technical personnel need to be retrained to understand this evolving ecosystem. In this light, we have taken parallel, symbiotic steps to address these challenges. We have piloted a course in IoT covering the most critical technologies in a typical end-to-end IoT system, including various access technologies, higher-layer protocols and standards, and prominent cloud services. Our industry partner has developed new measurement equipment for more accurate and sensitive measurement of circuit current draw, to assist with power-frugal designs for long battery life. They have also developed a programmable board along with several experiments geared toward IoT applications. Last summer, a small group of graduate students, guided by a senior faculty member, used the IoT board to assess its efficacy for less experienced engineering students. The board and its associated experiments were found to be very useful and a good addition to the program. The experiments are also valuable for continuing education aimed at developing specific skills for IoT system development. The team created an updated and tailored user's manual to better serve the needs of less experienced engineering students and to alleviate the initial frustration of setting up the system. In this paper, we present the experiences of the pilot program and the key enhancements made to the technical manual for a teaching environment. We also present the value that the IoT board and its experiments bring to students learning about the IoT ecosystem.
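
    The abstract above stresses accurate current-draw measurement as the basis for power-frugal designs. As a minimal, purely illustrative sketch of why this matters, the duty-cycle arithmetic below (in Python) estimates battery life from assumed sleep and active currents; none of the numbers come from the pilot program or from the partner's equipment.

        # Minimal sketch of why fine-grained current-draw measurement matters for
        # battery life: a duty-cycled IoT node's average current is dominated by
        # its sleep current. All numbers below are illustrative assumptions.
        battery_mah = 2400.0           # e.g. two AA cells (assumed)
        sleep_current_ma = 0.005       # 5 microamps in deep sleep (assumed)
        active_current_ma = 20.0       # radio + MCU active (assumed)
        active_seconds_per_hour = 2.0  # one short transmission per hour (assumed)

        duty = active_seconds_per_hour / 3600.0
        avg_current_ma = duty * active_current_ma + (1 - duty) * sleep_current_ma

        hours = battery_mah / avg_current_ma
        print(f"Average current: {avg_current_ma * 1000:.1f} uA")
        print(f"Estimated battery life: {hours / 24 / 365:.1f} years")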

    Investment Thesis for International Business Machines Corp.

    The Studies in Applied Finance series is under the general direction of Prof. Steve H. Hanke, Co-Director of the Johns Hopkins Institute for Applied Economics, Global Health, and Study of Business Enterprise ([email protected]), and Dr. Hesam Motlagh ([email protected]), a Fellow at the Institute. This working paper is one in a series on applied financial economics that focuses on company valuations. The authors are mainly students at Johns Hopkins University in Baltimore who have conducted their work at the Institute as undergraduate researchers. The working paper is an in-depth financial analysis of International Business Machines (IBM). Our analysis examines the change in IBM's business from a traditional hardware and software services provider to a cognitive services and cloud platform company. This analysis is combined with our proprietary Discounted Cash Flow (P-DCF) model and a Monte-Carlo simulation, which together yield distributions of probable free cash flows. In addition to these quantitative factors, we examine the compensation plans of IBM executives to assess the degree to which their compensation incentives are aligned with the objective of creating shareholder value.
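
    The abstract does not describe the internals of the P-DCF model, so the following is only a minimal sketch of how a Monte-Carlo simulation can turn uncertain growth and discount-rate assumptions into a distribution of discounted free cash flows. All inputs (base_fcf, growth, wacc, terminal_growth) are illustrative assumptions, not figures from the working paper.

        import numpy as np

        # Monte-Carlo DCF sketch: sample growth and discount-rate paths, project
        # free cash flows, discount them, and report the resulting distribution.
        rng = np.random.default_rng(seed=42)
        n_trials = 10_000
        years = 5

        base_fcf = 12.0          # starting free cash flow, $ billions (assumed)
        growth = rng.normal(0.02, 0.03, size=(n_trials, years))  # annual FCF growth
        wacc = rng.normal(0.08, 0.01, size=n_trials)             # discount rate
        terminal_growth = 0.02                                   # perpetuity growth

        # Project free cash flows path by path and discount them back to today.
        fcf_paths = base_fcf * np.cumprod(1.0 + growth, axis=1)
        discount = (1.0 + wacc[:, None]) ** np.arange(1, years + 1)
        pv_explicit = (fcf_paths / discount).sum(axis=1)

        # Gordon-growth terminal value, discounted from the end of the last year.
        terminal = fcf_paths[:, -1] * (1 + terminal_growth) / (wacc - terminal_growth)
        pv_terminal = terminal / (1.0 + wacc) ** years

        enterprise_value = pv_explicit + pv_terminal
        print(f"Median EV: ${np.median(enterprise_value):.1f}B")
        print(f"5th-95th percentile: ${np.percentile(enterprise_value, 5):.1f}B "
              f"- ${np.percentile(enterprise_value, 95):.1f}B")

    The spread of the resulting enterprise-value distribution is what a probability-based valuation reports instead of a single point estimate.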

    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry, and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and the transformational innovation of technologies. However, its nature and fundamental challenges have not been fully recognized, and a methodology of its own has not yet been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing, and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing, and management? What is the relationship between big data and the scientific paradigm? What is the nature and fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing. Comment: 59 pages

    Microservice Transition and its Granularity Problem: A Systematic Mapping Study

    Microservices have gained wide recognition and acceptance in the software industry as an emerging architectural style for autonomic, scalable, and more reliable computing. The transition to microservices has been highly motivated by the need to better align technical design decisions with the value potential of architectures. Despite the popularity of microservices, research still lacks a disciplined understanding of the transition and a consensus on the principles and activities underlying the "micro-ing" of architectures. In this paper, we report on a systematic mapping study that consolidates the various views, approaches, and activities that commonly assist in the transition to microservices. The study aims to provide a better understanding of the transition; it also contributes a working definition of the transition and of the technical activities underlying it. We term the transition and the technical activities leading to microservice architectures microservitization. We then shed light on a fundamental problem of microservitization: microservice granularity and reasoning about its adaptation as a first-class entity. This study reviews the state of the art and practice related to reasoning about microservice granularity, covering the modelling approaches, aspects considered, guidelines, and processes used in that reasoning. It also identifies opportunities for future research and development on reasoning about microservice granularity. Comment: 36 pages including references, 6 figures, and 3 tables
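
    The study surveys how granularity adaptation can be reasoned about as a first-class concern; the toy Python sketch below only illustrates the general shape of such reasoning. The metrics, thresholds, and split/merge/keep rule are invented for illustration and are not drawn from the mapping study or from any approach it reviews.

        from dataclasses import dataclass

        # Hypothetical illustration of granularity reasoning as a first-class
        # concern: a service's runtime and design metrics drive a split/merge
        # decision. All metrics and thresholds are invented for this sketch.
        @dataclass
        class ServiceMetrics:
            avg_latency_ms: float      # observed request latency
            cpu_utilisation: float     # 0.0 - 1.0
            internal_coupling: float   # share of calls staying inside the service
            team_size: int             # people maintaining the service

        def granularity_decision(m: ServiceMetrics) -> str:
            """Return 'split', 'merge', or 'keep' for a candidate microservice."""
            overloaded = m.avg_latency_ms > 200 or m.cpu_utilisation > 0.8
            too_broad = m.internal_coupling < 0.5 or m.team_size > 9
            too_fine = m.internal_coupling > 0.9 and m.team_size <= 2

            if overloaded and too_broad:
                return "split"   # decompose along the weakly coupled seams
            if too_fine and not overloaded:
                return "merge"   # consolidate chatty, tightly coupled fragments
            return "keep"

        print(granularity_decision(
            ServiceMetrics(avg_latency_ms=250, cpu_utilisation=0.85,
                           internal_coupling=0.4, team_size=12)))  # -> "split"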

    Middleware Architecture for Sensing as a Service

    The Internet of Things (IoT) is a concept that envisions the world as a smart space in which physical objects embedded with sensors, actuators, and network connectivity can communicate and react to their surroundings. Recent advancements in information and communication technologies make it possible to turn the IoT vision into reality. However, IoT devices and the consumers of their data can be owned by different entities, which makes IoT data sharing a real challenge. Sensing as a Service is a concept influenced by the cloud computing term "Every Thing as a Service"; it enables sensor data sharing. A Sensing as a Service middleware allows IoT applications to access data generated by sensing devices owned by other entities, and charges the applications for the amount of sensor data they use. This thesis addresses the architectural design of a cloud-based Sensing as a Service middleware. The middleware enables sensor owners to sell their sensor data over the Internet, while IoT applications collect and analyze sensor data through the middleware API. We propose multitenancy algorithms for middleware resource management. In addition, we propose a SQL-like language that IoT applications can use for sensing service discovery and sensor stream analytics. The evaluation of the middleware implementation shows the effectiveness of the algorithms.
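
    The abstract does not give the syntax of the proposed SQL-like language or of the middleware API, so the sketch below is a hypothetical illustration of what service discovery and stream analytics through such a middleware might look like from an IoT application. The endpoint URL, API key, JSON envelope, and query syntax are all invented for this example.

        import json
        import urllib.request

        # Hypothetical Sensing-as-a-Service interaction. The endpoint, API key,
        # and SQL-like syntax are placeholders; the thesis's actual middleware
        # API and query language may differ.
        MIDDLEWARE_URL = "https://sensing-middleware.example.com/query"  # assumed
        API_KEY = "demo-key"                                             # assumed

        # Discover temperature sensors in a region and aggregate their streams.
        query = """
        SELECT AVG(temperature) AS avg_temp, sensor_id
        FROM sensors
        WHERE type = 'temperature' AND region = 'campus-north'
        WINDOW 5 MINUTES
        GROUP BY sensor_id
        """

        request = urllib.request.Request(
            MIDDLEWARE_URL,
            data=json.dumps({"query": query}).encode("utf-8"),
            headers={"Content-Type": "application/json",
                     "Authorization": f"Bearer {API_KEY}"},
        )

        # Applications are billed for the sensor data their queries consume.
        with urllib.request.urlopen(request) as response:
            for row in json.load(response)["rows"]:
                print(row["sensor_id"], row["avg_temp"])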

    Digital Twins and Blockchain for IoT Management

    We live in a data-driven world powered by sensors collecting data anywhere, at any time. This advancement is possible thanks to the Internet of Things (IoT). IoT embeds common physical objects with heterogeneous sensing, actuating, and communication capabilities to collect data from the environment and from people. These objects, generally known as things, exchange data with other things, entities, computational processes, and systems over the Internet. Consequently, a web of devices and computational processes emerges, involving billions of entities collecting, processing, and sharing data. The result is an internet of entities/things that process and produce data in ever-growing volumes that can easily exceed petabytes. There is therefore a need for novel management approaches to handle this previously unheard-of number of IoT devices, processes, and data streams. This dissertation focuses on solutions for IoT management using decentralized technologies. A massive number of IoT devices interact with software and hardware components and are owned by different people, so decentralized management is needed. Blockchain is a capable and promising distributed ledger technology with features that support decentralized systems with large numbers of devices. People should not have to interact with these devices or data streams directly, so access to these components must be abstracted. Digital twins are software artifacts that can abstract an object, a process, or a system to enable communication between the physical and digital worlds. Fog/edge computing is an alternative to the cloud that provides services with lower latency. This research uses blockchain technology, digital twins, and fog/edge computing for IoT management. The systems developed in this dissertation enable configuration, self-management, zero-trust management, and data-streaming view provisioning from a fog/edge layer. In this way, the massive number of things and the data they produce are managed through services distributed across nodes close to them, providing access and configuration security as well as privacy protection.
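
    The abstract does not detail the dissertation's implementation, so the following is a minimal sketch of the general idea only: a digital twin mirrors a device's reported state, and every configuration change is appended to a hash-chained, append-only log that stands in for the blockchain ledger. The class names and fields (Ledger, DigitalTwin, report, configure) are invented for illustration.

        import hashlib
        import json
        import time

        class Ledger:
            """Append-only log where each entry commits to the previous one."""
            def __init__(self):
                self.entries = []

            def append(self, payload: dict) -> None:
                prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
                body = {"payload": payload, "prev": prev_hash, "ts": time.time()}
                digest = hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest()
                self.entries.append({**body, "hash": digest})

        class DigitalTwin:
            """Software artifact standing in for a physical IoT device."""
            def __init__(self, device_id: str, ledger: Ledger):
                self.device_id = device_id
                self.state = {}    # last reported sensor readings
                self.config = {}   # desired device configuration
                self.ledger = ledger

            def report(self, readings: dict) -> None:
                self.state.update(readings)  # mirror the physical device

            def configure(self, **settings) -> None:
                self.config.update(settings)
                self.ledger.append({"device": self.device_id, "config": settings})

        ledger = Ledger()
        twin = DigitalTwin("thermostat-42", ledger)
        twin.report({"temperature": 21.5})
        twin.configure(sampling_interval_s=60)
        print(len(ledger.entries), ledger.entries[0]["hash"][:12])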