
    Dependability of the NFV Orchestrator: State of the Art and Research Challenges

    © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The introduction of network function virtualisation (NFV) represents a significant change in networking technology, which may create new opportunities in terms of cost efficiency, operations, and service provisioning. Although not explicitly stated as an objective, the dependability of the services provided using this technology should be at least as good as that of conventional solutions. Logical centralisation, off-the-shelf computing platforms, and increased system complexity represent new dependability challenges relative to the state of the art. The core function of the network, with respect to failure and service management, is orchestration. The failure or misoperation of the NFV orchestrator (NFVO) will have huge network-wide consequences. At the same time, the NFVO is vulnerable to overload and design faults. Thus, the objective of this paper is to give a tutorial on the dependability challenges of the NFVO, and to give insight into the required future research. This paper provides the necessary background information, reviews the available literature, outlines proposed solutions, and identifies some design and research problems that must be addressed.

    Agile management and interoperability testing of SDN/NFV-enriched 5G core networks

    In the fifth generation (5G) era, the radio internet protocol capacity is expected to reach 20 Gb/s per sector, and ultralarge content traffic will travel across a faster wireless/wireline access network and packet core network. Moreover, the massive and mission-critical Internet of Things is the main differentiator of 5G services. These types of real-time and large-bandwidth-consuming services require a radio latency of less than 1 ms and an end-to-end latency of less than a few milliseconds. By distributing 5G core nodes closer to cell sites, the backhaul traffic volume and latency can be significantly reduced by having mobile devices download content immediately from a closer content server. In this paper, we propose a novel solution based on software-defined network and network function virtualization technologies in order to achieve agile management of 5G core network functionalities, with a proof-of-concept implementation targeted for the PyeongChang Winter Olympics, and describe the results of interoperability testing between the two core networks.

    Doctor of Philosophy

    Monitoring and remediation of environmental contaminants (biological and chemical) form the crux of global water resource management. There is an extant need to develop point-of-use, low-power, low-cost tools that can address this problem effectively with minimal environmental impact. Nanotechnology and microfluidics have made enormous advances during the past decade in the area of biosensing and environmental remediation. The "marriage" of these two technologies can effectively address some of the above-mentioned needs [1]. In this dissertation, nanomaterials were used in conjunction with microfluidic techniques to detect and degrade biological and chemical pollutants. In the first project, a point-of-use sensor was developed for detection of trichloroethylene (TCE) from water. A self-organizing nanotubular titanium dioxide (TNA) synthesized by electrochemical anodization and functionalized with photocatalytically deposited platinum (Pt/TNA) was applied to the detection. The morphology and crystallinity of the Pt/TNA sensor were characterized using field emission scanning electron microscopy, energy dispersive X-ray spectroscopy, and X-ray diffraction. The sensor could detect TCE in concentrations ranging from 10 to 1000 ppm. The room-temperature operation capability of the sensor makes it less power-intensive, so it can potentially be incorporated into a field-based sensor. In the second part, TNA synthesized on a foil was incorporated into a flow-based microfluidic format and applied to degradation of a model pollutant, methylene blue. The system was demonstrated to have enhanced photocatalytic performance at higher flow rates (50-200 ”L/min) over the same microfluidic format with TiO2 nanoparticulate (commercial P25) catalyst. The microfluidic format with TNA catalyst was able to achieve 82% fractional conversion of 18 mM methylene blue, in comparison to 55% in the case of the TiO2 nanoparticulate layer, at a flow rate of 200 ”L/min.
The microfluidic device was fabricated using non-cleanroom-based methods, making it suitable for economical large-scale manufacture. A computational model of the microfluidic format was developed in COMSOL MultiphysicsÂź finite element software to evaluate the effect of diffusion coefficient and rate constant on the photocatalytic performance. To further enhance the photocatalytic performance of the microfluidic device, TNA synthesized on a mesh was used as the catalyst. The new system was shown to have enhanced photocatalytic performance in comparison to TNA on a foil. The device was then employed in the inactivation of E. coli O157:H7 at different flow rates and light intensities (100, 50, 20, 10 mW/cm2). In the second project, a protocol for ultra-sensitive indirect electrochemical detection of E. coli O157:H7 was reported. The protocol uses antibody-functionalized primary (magnetic) beads for capture and polyguanine (polyG) oligonucleotide-functionalized secondary (polystyrene) beads as an electrochemical tag. The method was able to detect concentrations of E. coli O157:H7 down to 3 CFU/100 mL (S/N=3). We also demonstrate the use of the protocol for detection of E. coli O157:H7 seeded in wastewater effluent samples.

    A hybrid e-learning framework: Process-based, semantically-enriched and service-oriented

    Despite the recent innovations in e-Learning, much development is needed to ensure a better learning experience for everyone and to bridge the research gap in the current state-of-the-art e-Learning artefacts. Contemporary e-learning artefacts possess various limitations, as follows. First, they offer inadequate variations of adaptivity, since their recommendations are limited to e-learning resources, peers or communities. Second, they are often overwhelmed with technology at the expense of proper pedagogy and the learning theories underpinning e-learning practices. Third, they do not comprehensively capture e-learning experiences, as their focus shifts to e-learning activities instead of e-learning processes. In reality, learning is a complex process that includes various activities and interactions between different roles to achieve certain goals in a continuously evolving environment. Fourth, they tend more towards legacy systems and lack agility and flexibility in their structure and design. To respond to the above limitations, this research aims at investigating the effectiveness of combining three advanced technologies (i.e., Business Process Modelling and Enactment, Semantics, and Service-Oriented Computing (SOC)) with learning pedagogy in order to enhance the e-learner experience. The key design artefact of this research is the development of the HeLPS e-Learning Framework: a Hybrid e-Learning Framework that is Process-based, Semantically-enriched and Service-Oriented-enabled. In this framework, a generic e-learning process has been developed bottom-up, based on surveying a wide range of e-learning models (i.e., practical artefacts) and their underpinning pedagogies/concepts (i.e., theories), and then forming a generic e-learning process. Furthermore, an e-Learning Meta-Model has been developed in order to capture the semantics of the e-learning domain and its processes.
Such processes have been formally modelled and dynamically enacted using a service-oriented enabled architecture. This framework has been evaluated using a concern-based evaluation employing both static and dynamic approaches. The HeLPS e-Learning Framework, along with its components, has been evaluated by applying a data-driven approach and an artificially-constructed case study to check its effectiveness in capturing the semantics, enriching e-learning processes and deriving services that can enhance the e-learner experience. Results revealed the effectiveness of combining the above-mentioned technologies in order to enhance the e-learner experience, and further research directions have been suggested. This research contributes to enhancing the e-learner experience by making e-learning artefacts driven by pedagogy and informed by the latest technologies. One major novel contribution of this research is the introduction of a layered architectural framework (i.e., HeLPS) that combines business process modelling and enactment, semantics and SOC together. Another novel contribution is the adoption of the process-based approach in the e-learning domain through identifying these processes and developing a generic business process model from a set of related e-learning business process models that have the same goals and associated objectives. A third key contribution is the development of the e-Learning Meta-Model, which captures a high-abstraction view of the learning domain and encapsulates various domain rules using the Semantic Web Rule Language. A fourth contribution is the promotion of service-orientation in e-learning through developing a semantically-enriched approach to identify and discover web services from e-learning business process models.
Fifth, the e-Learner Experience Model (eLEM) and the e-Learning Capability Maturity Model (eLCMM) have been developed, where the former aims at identifying and quantifying the e-learner experience and the latter represents a well-defined evolutionary plateau towards achieving a mature e-learning process from a technological perspective. Both models have been combined with a newly developed data-driven Validation and Verification Model to develop a Concern-based Evaluation Approach for e-Learning artefacts, which is considered another contribution.

    Integrated Electrochemical Biosensors for Detection of Waterborne Pathogens in Low-Resource Settings

    More than 783 million people worldwide are currently without access to clean and safe water. Approximately 1 in 5 cases of mortality due to waterborne diseases involve children, and over 1.5 million cases of waterborne disease occur every year. In the developing world, this makes waterborne diseases the second highest cause of mortality. Such cases of waterborne disease are thought to be caused by poor sanitation, poor water infrastructure, poor public knowledge, and the lack of suitable water monitoring systems. Conventional laboratory-based techniques are inadequate for effective on-site water quality monitoring purposes, due to their need for excessive equipment, their operational complexity, their lack of affordability, and their long sample-collection-to-data-analysis times. In this review, we discuss the conventional techniques used in modern-day water quality testing. We discuss the future challenges of water quality testing in the developing world and how conventional techniques fall short of these challenges. Finally, we discuss the development of electrochemical biosensors and current research on the integration of these devices with microfluidic components to develop truly integrated, portable, simple-to-use and cost-effective devices for use by local environmental agencies, NGOs, and local communities in low-resource settings.

    Logistics Lessons Learned in NASA Space Flight

    The Vision for Space Exploration sets out a number of goals, involving both strategic and tactical objectives. These include returning the Space Shuttle to flight, completing the International Space Station, and conducting human expeditions to the Moon by 2020. Each of these goals has profound logistics implications. In the consideration of these objectives, a need for a study on NASA logistics lessons learned was recognized. The study endeavors to identify both the needs of space exploration and the challenges in the development of past logistics architectures, as well as in the design of space systems. This study may also be appropriately applied as guidance in the development of an integrated logistics architecture for future human missions to the Moon and Mars. This report first summarizes current logistics practices for the Space Shuttle Program (SSP) and the International Space Station (ISS) and examines the practices of manifesting, stowage, inventory tracking, waste disposal, and return logistics. The key finding of this examination is that while the current practices do have many positive aspects, there are also several shortcomings. These shortcomings include a high level of excess complexity, redundancy of information and the lack of a common database, and a large human-in-the-loop component. Later sections of this report describe the methodology and results of our work to systematically gather logistics lessons learned from past and current human spaceflight programs, as well as to validate these lessons through a survey of the opinions of current space logisticians. To consider the perspectives on logistics lessons, we searched several sources within NASA, including organizations with direct and indirect connections with the system flow in mission planning. We utilized crew debriefs, the John Commonsense lessons repository for the JSC Mission Operations Directorate, and the Skylab Lessons Learned.
Additionally, we searched the public version of the Lessons Learned Information System (LLIS) and verified that we received the same result using the internal version of LLIS for our logistics lesson searches. In conducting the research, information from multiple databases was consolidated into a single spreadsheet of 300 lessons learned. Keywords were applied for the purpose of sorting and evaluation. Once the lessons had been compiled, an analysis of the resulting data was performed: first sorting it by keyword, then finding duplication and root cause, and finally sorting by root cause. The data was then distilled into the top 7 lessons learned across programs, centers, and activities.
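    The consolidate-tag-deduplicate-rank workflow described in that abstract can be sketched in a few lines of Python. This is an illustrative sketch only; the record fields, keywords, and root causes below are hypothetical stand-ins, not data from the actual study.

    ```python
    from collections import Counter

    # Hypothetical consolidated records: each lesson carries keyword tags
    # and an assigned root cause, mirroring the spreadsheet described above.
    lessons = [
        {"id": 1, "keywords": ["inventory"], "root_cause": "manual tracking"},
        {"id": 2, "keywords": ["stowage"], "root_cause": "manual tracking"},
        {"id": 3, "keywords": ["manifesting"], "root_cause": "no common database"},
        {"id": 4, "keywords": ["inventory"], "root_cause": "manual tracking"},
        {"id": 5, "keywords": ["waste disposal"], "root_cause": "excess complexity"},
    ]

    # Sort by keyword to group related lessons and surface duplicates.
    by_keyword = sorted(lessons, key=lambda l: l["keywords"][0])

    # Count how many lessons share each root cause, then distill the
    # most frequent causes (here capped at 7, as in the report).
    cause_counts = Counter(l["root_cause"] for l in lessons)
    top_causes = [cause for cause, _ in cause_counts.most_common(7)]
    ```

    With the sample data above, "manual tracking" appears most often and therefore ranks first; the same two passes (group, then rank by frequency) scale directly to the 300-lesson spreadsheet the report describes.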

    Centralized Conferencing in the IP Multimedia Subsystem: from theory to practice

    In this paper we present a conferencing architecture compliant with the IP Multimedia Subsystem (IMS) specification. To this end, we embrace a practical approach, describing an actual implementation of an open source centralized video-conferencing system, called CONFIANCE, capable of offering an advanced communication experience to end-users through the effective exploitation of mechanisms like session management and floor control. CONFIANCE has been designed to be fully compliant with the latest standard proposals coming from both the IETF and the 3GPP, and can be considered an outstanding example of a real-time application built on the grounds paved by the SIP protocol. We discuss both the design of the overall conferencing framework and the most important issues we had to face during the implementation phase.

    Consortium for Robotics and Unmanned Systems Education and Research (CRUSER) 2019 Annual Report

    Prepared for: Dr. Brian Bingham, CRUSER Director. The Naval Postgraduate School (NPS) Consortium for Robotics and Unmanned Systems Education and Research (CRUSER) provides a collaborative environment and community of interest for the advancement of unmanned systems (UxS) education and research endeavors across the Navy (USN), Marine Corps (USMC) and Department of Defense (DoD). CRUSER is a Secretary of the Navy (SECNAV) initiative to build an inclusive community of interest on the application of unmanned systems (UxS) in military and naval operations. This 2019 annual report summarizes CRUSER activities in its eighth year of operations and highlights future plans. Deputy Undersecretary of the Navy (PPOI); Office of Naval Research (ONR). Approved for public release; distribution is unlimited.

