
    Towards a New Architectural Framework – The Nth Stratum Concept

    Current architectures and solutions are about to reach the limits of sustainable development. Over the years, many new requirements have emerged, and observations point to an ever-increasing diversity in applications, services, devices, and types of networks at the edge and in the access, while the infrastructures for internetworking, connectivity, and management remain largely the same. A new paradigm is needed that can sustain a high pace of innovation in all the different parts and aspects of a communication system while keeping the costs of deployment and maintenance down. This new paradigm has to embrace current trends towards increased heterogeneity, but on the other hand provide support for co-existence and interoperability between the various alternative solutions residing within a global communication system. This paper presents a new architectural framework, called the Nth Stratum concept, which takes a holistic approach to tackling these new needs and requirements on a future communication system.

    On the Way to a Theory for Network Architectures

    Abstract. The design of the future Internet faces real challenges in network architecture. Attempts to resolve the issues related to naming/addressing, middleboxes, QoS-security-mobility interactions, cross-layer design, and inter-domain operation usually lead to endless debates. This is because the structure of the Internet has become too complex and has evolved through the addition of new functions that are not always compatible with the existing ones. To analyze the architecture of the Internet rigorously, it is necessary to understand in depth the composition of functionalities within a protocol and between protocols. This paper presents a study on the composition of network functionalities and highlights future directions towards a theory for network architectures, one that includes the principles network architectures should follow to ensure the normal operation of their member functions, detect all possible conflicts between them, and identify impossibilities.
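    The theory itself is not reproduced in this abstract, but the core idea of checking whether composed functions remain compatible can be sketched abstractly. The following minimal sketch is an illustration only (the property model and every name in it are assumptions, not the paper's formalism): each network function declares the properties it requires, provides, and cannot coexist with, and a composition is checked against the pooled provisions.

        from dataclasses import dataclass, field

        @dataclass
        class NetFunction:
            """A network function modeled only by the properties it needs and asserts."""
            name: str
            requires: set = field(default_factory=set)
            provides: set = field(default_factory=set)
            forbids: set = field(default_factory=set)   # properties it cannot coexist with

        def check_composition(functions):
            """Return human-readable conflicts found in a composed set of functions."""
            provided = set().union(*(f.provides for f in functions))
            conflicts = []
            for f in functions:
                for p in f.requires - provided:
                    conflicts.append(f"{f.name} requires '{p}' but nothing provides it")
                for p in f.forbids & provided:
                    conflicts.append(f"{f.name} is incompatible with provided '{p}'")
            return conflicts

        # Classic example: NAT breaks the end-to-end address integrity that IPsec AH assumes
        ip    = NetFunction("IP", provides={"e2e-address-integrity"})
        nat   = NetFunction("NAT", provides={"address-sharing"},
                            forbids={"e2e-address-integrity"})
        ipsec = NetFunction("IPsec-AH", requires={"e2e-address-integrity"})
        print(check_composition([ip, nat, ipsec]))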

    Architectural Principles and Elements of In-Network Management

    Recent endeavors in addressing the challenges of the current and future Internet pursue a clean-slate design methodology. At the same time, it is argued that the Internet is unlikely to be changed in one fell swoop and that its next generation requires an evolutionary design approach. Recognizing both positions, we claim that cleanness and evolution are not mutually exclusive, but rather complementary and indispensable properties for sustainable management in the future Internet. In this paper we propose the in-network management (INM) paradigm, which adopts a clean-slate design approach to the management of future communication networks, brought about by evolutionary design principles. The proposed paradigm builds on embedded management capabilities to address the intrinsic nature of, and hence close relationship between, the network and its management. At the same time, INM assists in the gradual adoption of embedded self-managing processes to progressively achieve adequate and practical degrees of INM. We demonstrate how INM can be exploited in current and future network management by applying it to P2P networks.
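    The abstract does not detail the embedded management capabilities, but a standard building block for decentralized monitoring in P2P overlays, and a plausible illustration of management logic living inside the network, is gossip-based aggregation. The sketch below is illustrative only; the function names and the push-pull averaging scheme are assumptions, not the paper's design.

        import random

        def gossip_average(values, neighbors, rounds=50, seed=0):
            """Push-pull gossip: every node's estimate converges to the global mean.

            values:    dict node -> local metric (e.g., load or queue length)
            neighbors: dict node -> list of overlay neighbors
            """
            rng = random.Random(seed)
            est = dict(values)
            for _ in range(rounds):
                for node in list(est):
                    peer = rng.choice(neighbors[node])
                    avg = (est[node] + est[peer]) / 2.0   # mass-conserving averaging step
                    est[node] = est[peer] = avg
            return est

        # Example: four nodes on a ring with local loads 10, 20, 30, 40;
        # every local estimate converges towards the global mean of 25.
        ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
        print(gossip_average({0: 10, 1: 20, 2: 30, 3: 40}, ring))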

    AHP analysis of key success factors for enterprise transformation: from the viewpoint of the project management process

    Nowadays, project management, rather than routine operations, has become the main operating mode of enterprises, and many discussions of using project management to support enterprise transformation can be found in the existing literature. However, existing frameworks apply only a few stages of the project management process and therefore cannot provide an effective and comprehensive solution for an enterprise in crisis. In view of this, this study explores enterprise transformation from the perspective of the project management process, proposing a more coherent and complete transformation plan; it also suggests a fresh approach to this research topic. The study found that the two phases of "planning" and "execution" are the most important for enterprise transformation. An enterprise should therefore be bold but cautious when formulating its transformation plan, and the plan should be executed precisely during implementation. More importantly, this study connects the project management process with the enterprise transformation planning process, so that enterprise transformation can be carried out more smoothly and with a better probability of success.
    Keywords: Enterprise transformation; Project management process; Analytical hierarchy process (AHP).
    JEL: C52; L25; M14.
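    The study's factor hierarchy and judgment data are not given in the abstract, but the AHP machinery it relies on is standard: experts fill a pairwise comparison matrix on Saaty's reciprocal scale, the priority weights are the normalized principal eigenvector, and a consistency ratio guards against contradictory judgments. A minimal sketch follows; the matrix values and phase names are invented for illustration, not taken from the study.

        import numpy as np

        def ahp_weights(pairwise):
            """Priority weights and consistency ratio for an AHP pairwise matrix."""
            A = np.asarray(pairwise, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)                  # principal eigenvalue index
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                                 # normalized priority vector
            ci = (eigvals[k].real - n) / (n - 1)         # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index (excerpt)
            return w, ci / ri                            # weights, consistency ratio (< 0.1 is acceptable)

        # Illustrative 3x3 judgment matrix over hypothetical phases:
        # planning vs execution vs monitoring, on the reciprocal 1-9 scale
        M = [[1,   2,   5],
             [1/2, 1,   3],
             [1/5, 1/3, 1]]
        w, cr = ahp_weights(M)
        print(w, cr)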

    Data Quality Management in ETL Process under Resource Constraints

    Currently, access to data is necessary for many companies, particularly those engaged in marketing, to make decisions that will improve the quality of their services and businesses. They frequently find the knowledge they need in several sources and in a variety of formats. Out of a commitment to the quality of the information offered to data consumers, a system is implemented to consolidate all these data sources for analysis and decision-making. This study addresses the evaluation of data quality (DQ) in an ETL process developed to support a marketing data management platform. More specifically, it addresses the problem of evaluating the quality of high-volume data. DQ assessment at high ingestion rates is beyond the scope of this study, which focuses on data quality assessment when vertical or horizontal scaling of the ingestion system is limited. We also analyze the use of the developed model on real data to assess the improvement in the quality of the data in the ETL. The methodology consisted of studying each feature that characterizes high-quality data and analyzing its impact on an ETL concerned with voluminous data. We propose algorithms for a more generalizable DQ assessment during integration. We conducted a practical implementation study of the proposed criteria and characteristics to evaluate their impact on the data collected throughout the process of data Extraction, Transformation, and Loading. We present a quality assessment framework that models the necessary parts of the process, including data sources, metrics characterizing data quality, the data destination, and the analysis and performance of the algorithms used in the assessment. The practical ETL implementation in this research is based on a Directed Acyclic Graph (DAG) model, whose main purpose is to extract, transform, and transmit data from this first service to the rest of the Marketing Data Management Platform infrastructure, which is considered the end user. The evaluation is based on algorithms that take source data as input, combined with predefined properties encompassing the expected result of the ETL transformation, to produce the evaluation result. The evaluation findings may be used to support or contradict the quality standards, and in the event of a DQ failure, decisions are made to improve and enhance the data. We suggest including data checks at the very end of the ETL data manipulation process, as well as a model for data volume reduction using algorithms intended to make the procedure more generic and enable quick review. The quality of the data evaluated during the test is a statistical representation of the ingested dataset, which provides an accurate profile and enables user applications to retrieve high-quality data without delay. The main contributions of this thesis are: i) the development of an ETL service in a Marketing Data Management Platform, and ii) an examination of data reduction models with a view to assessing data quality.
    Chapter 1 presents a literature review and describes the basic concepts and their definitions in related research, including sampling, ETL, data quality, and Big Data. Chapter 2 presents how the ETL system fits into the framework of the data management platform and how the entire architecture is modelled. Chapter 3 presents the experimental findings, obtained using various types of real data; the performance over time and the effect of the stratified sample are depicted in graphs. The closing part presents the conclusions of this thesis and discusses prospective research.
    Table of contents (page numbers omitted):
    Notes
    INTRODUCTION
        1.1 ETL
            1.1.1 ETL definitions
            1.1.2 ETL tools review
        1.2 Data quality
            1.2.1 Data quality dimensions
            1.2.2 Data quality objectives in the context of ETL
            1.2.3 ISO data quality standards
        1.3 TPC-DI benchmark
        1.4 Big data
            1.4.1 Vs and Big Data
            1.4.2 Batch processing and Big Data
        1.5 Sampling for big data
            1.5.1 A taxonomy for Big Data sampling techniques
    CHAPTER 2. SYSTEM MODELING
        2.1 ETL model
        2.2 System architecture overview
            2.2.1 Metadata store
            2.2.2 Horizontal autoscaling environment
            2.2.3 Workflow runner
    CHAPTER 3. MEASUREMENT RESULTS
        3.1 Estimation of the population mean
        3.2 Performance evaluation of stratified random sampling for DQ assessment
    CHAPTER 4. LABOUR PROTECTION AND SAFETY IN EMERGENCY
        4.1 Introduction
        4.2 Need for guidelines
            4.2.1 Software quality
            4.2.2 Static analysis
            4.2.3 Automated static analysis tools
        4.3 Universal standards
        4.4 Challenges in safety critical systems
        4.5 Similarities between different standards
        4.6 Conclusion to safety
    CONCLUSIONS
    BIBLIOGRAPHY
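    The thesis's reduction algorithms are not reproduced in the abstract, but the stratified sampling idea it evaluates can be sketched: instead of validating every ingested record, draw a proportional sample from each stratum (for example, per data source) and run the quality metrics on the sample. The illustration below is a sketch under assumed inputs; the column names and the single completeness metric are placeholders, not the thesis's metrics.

        import random
        from collections import defaultdict

        def stratified_sample(records, stratum_key, fraction, seed=0):
            """Proportional stratified random sample of `records` (list of dicts)."""
            rng = random.Random(seed)
            strata = defaultdict(list)
            for r in records:
                strata[stratum_key(r)].append(r)
            sample = []
            for rows in strata.values():
                k = max(1, round(len(rows) * fraction))   # keep every stratum represented
                sample.extend(rng.sample(rows, k))
            return sample

        def completeness(records, field):
            """Fraction of records with a non-null value in `field` (one DQ metric)."""
            return sum(r.get(field) is not None for r in records) / len(records)

        # Hypothetical usage inside an ETL step: estimate DQ on a 1% sample
        records = [{"source": f"s{i % 3}", "email": None if i % 7 == 0 else "x"}
                   for i in range(100_000)]
        sample = stratified_sample(records, lambda r: r["source"], fraction=0.01)
        print(completeness(sample, "email"), "vs full:", completeness(records, "email"))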

    Design, implementation and experimental evaluation of a network-slicing aware mobile protocol stack

    International mention in the doctoral degree (Mención Internacional en el título de doctor). With the arrival of new generation mobile networks, we currently observe a paradigm shift where monolithic network functions running on dedicated hardware are re-implemented as software pieces that can be virtualized on general-purpose hardware platforms. This paradigm shift rests on the softwarization of network functions and the adoption of virtualization techniques. Network Function Virtualization (NFV) comprises the softwarization of network elements and the virtualization of these components. It brings multiple advantages: (i) flexibility, allowing easy management of the virtual network functions (VNFs) (deploy, start, stop or update); (ii) efficiency, as resources can be consumed adequately thanks to the increased flexibility of the network infrastructure; and (iii) reduced costs, due to the ability to share hardware resources. Multiple challenges must be addressed to effectively leverage all these benefits. Network Function Virtualization envisioned the concept of the virtual network, resulting in a key enabler of 5G network flexibility: Network Slicing. This new paradigm represents a new way to operate mobile networks, where the underlying infrastructure is "sliced" into logically separated networks that can be customized to the specific needs of the tenant. This approach also enables the ability to instantiate VNFs at different locations of the infrastructure, choosing their optimal placement based on parameters such as the requirements of the service traversing the slice or the available resources. This decision process is called orchestration and involves all the VNFs within the same network slice. The orchestrator is the entity in charge of managing network slices. Hands-on experiments on network slicing are essential to understand its benefits and limits, and to validate design and deployment choices. While some network slicing prototypes have been built for Radio Access Networks (RANs), leveraging the wide availability of radio hardware and open-source software, there is currently no open-source suite for end-to-end network slicing available to the research community. Similarly, orchestration mechanisms must be evaluated to properly validate theoretical solutions addressing diverse aspects such as resource assignment or service composition. This thesis contributes to the study of the evolution of mobile networks with regard to their softwarization and cloudification. We identify software patterns for network function virtualization, including the definition of a novel mobile architecture that decomposes the virtualization architecture by splitting functionality into atomic functions. We then design, implement and evaluate an open-source network slicing implementation. Our results show per-slice customization without paying a price in terms of performance, while also providing a slicing implementation to the research community. Moreover, we propose a framework to flexibly re-orchestrate a virtualized network, allowing on-the-fly re-orchestration without disrupting ongoing services. This framework can greatly improve performance under changing conditions. We evaluate the resulting performance in a realistic network slicing setup, showing the feasibility and advantages of flexible re-orchestration.
    Lastly, following the re-design of network functions envisioned during the study of the evolution of mobile networks, we present a novel pipeline architecture specifically engineered for 4G/5G physical layers virtualized over clouds. The proposed design pursues two objectives: resiliency against unpredictable computing, and parallelization to increase efficiency on multi-core clouds. To this end, we employ techniques such as tight deadline control, jitter-absorbing buffers, predictive Hybrid Automatic Repeat Request, and congestion control. Our experimental results show that our cloud-native approach attains over 95% of the theoretical spectrum efficiency in hostile environments where state-of-the-art architectures collapse.
    This work has been supported by IMDEA Networks Institute. Doctoral Programme in Telematic Engineering, Universidad Carlos III de Madrid. Committee: Chair: Francisco Valera Pintor; Secretary: Vincenzo Sciancalepore; Member: Xenofon Fouka
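    The pipeline itself is not specified in the abstract, but two of the named techniques, jitter-absorbing buffers and tight deadline control, can be illustrated generically: work is held briefly to absorb arrival jitter, and anything that would miss its processing deadline is dropped rather than allowed to stall the subframes behind it. The sketch below is a toy model under those assumptions, not the thesis's implementation.

        def run_deadline_pipeline(jobs, budget_ms, jitter_ms):
            """Release jobs at a buffered cadence; drop any job that misses its deadline.

            jobs:      list of (arrival_ms, processing_ms) tuples, e.g. decoded subframes
            budget_ms: per-job deadline budget (e.g. a HARQ response budget)
            jitter_ms: hold time used to absorb arrival jitter before release
            """
            delivered, dropped = [], []
            for arrival, proc in jobs:
                release = arrival + jitter_ms          # buffer absorbs arrival jitter
                finish = release + proc                # processing starts after release
                if finish <= arrival + budget_ms:      # tight deadline control
                    delivered.append((arrival, finish))
                else:
                    dropped.append(arrival)            # late work is dropped, not queued
            return delivered, dropped

        # Example: 1 ms subframes, 3 ms budget, occasional slow jobs on a noisy cloud
        jobs = [(t, 1.0 if t % 5 else 4.0) for t in range(10)]
        done, late = run_deadline_pipeline(jobs, budget_ms=3.0, jitter_ms=0.5)
        print(len(done), "delivered,", len(late), "dropped")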

    Algorithms for Large-Scale Sparse Tensor Factorization

    University of Minnesota Ph.D. dissertation. April 2019. Major: Computer Science. Advisor: George Karypis. 1 computer file (PDF); xiv, 153 pages.
    Tensor factorization is a technique for analyzing data that features interactions of data along three or more axes, or modes. Many fields such as retail, health analytics, and cybersecurity utilize tensor factorization to gain useful insights and make better decisions. The tensors that arise in these domains are increasingly large, sparse, and high dimensional. Factoring these tensors is computationally expensive, if not infeasible. The ubiquity of multi-core processors and large-scale clusters motivates the development of scalable parallel algorithms to facilitate these computations. However, sparse tensor factorizations often achieve only a small fraction of potential performance due to challenges including data-dependent parallelism and memory accesses, high memory consumption, and frequent fine-grained synchronizations among compute cores. This thesis presents a collection of algorithms for factoring sparse tensors on modern parallel architectures. This work focuses on developing algorithms that are scalable while being memory- and operation-efficient. We address a number of challenges across various forms of tensor factorization and emphasize results on large, real-world datasets.
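    The abstract does not name a specific factorization, but the computational core of sparse tensor factorization methods such as CP-ALS is the matricized tensor times Khatri-Rao product (MTTKRP), which touches only the stored nonzeros. The following is a minimal reference formulation for a 3-mode tensor in COO form, assuming the standard alternating least squares updates; it is a sketch, not the thesis's optimized kernels.

        import numpy as np

        def cp_als_sparse(coords, vals, shape, rank, iters=20, seed=0):
            """Rank-`rank` CP decomposition of a 3-mode sparse tensor in COO form.

            coords: tuple of three int index arrays (one per mode)
            vals:   array of the corresponding nonzero values
            """
            rng = np.random.default_rng(seed)
            A = [rng.standard_normal((n, rank)) for n in shape]
            for _ in range(iters):
                for m in range(3):
                    o1, o2 = (d for d in range(3) if d != m)
                    # MTTKRP over nonzeros only: M[i] += v * (A_o1[j] * A_o2[k])
                    M = np.zeros((shape[m], rank))
                    np.add.at(M, coords[m],
                              vals[:, None] * A[o1][coords[o1]] * A[o2][coords[o2]])
                    # Least-squares update via the Gram matrices of the other factors
                    G = (A[o1].T @ A[o1]) * (A[o2].T @ A[o2])
                    A[m] = M @ np.linalg.pinv(G)
            return A

        # Tiny example: 5 nonzeros of a 4x4x4 tensor factored with rank-2 factors
        coords = (np.array([0, 1, 2, 3, 0]), np.array([1, 2, 3, 0, 0]),
                  np.array([2, 3, 0, 1, 1]))
        factors = cp_als_sparse(coords, np.ones(5), (4, 4, 4), rank=2)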

    Key distribution technique for IPTV services with support for admission control and user-defined groups

    Doctoral thesis. Electrical and Computer Engineering. Faculdade de Engenharia, Universidade do Porto. 200

    Oral register in the biblical libretto: towards a biblical poetic

    With the publication of A. B. Lord's The Singer of Tales in 1960, students of the ancient literatures of the Hebrew Bible, like their colleagues in Old English, medieval French, and Old Icelandic, were intrigued by the possibility that the corpus they studied reflected the work of composers in an oral tradition. Biblicists began to think in terms of bards who composed their literature extemporaneously, without the aid of writing, through the fresh manipulation of traditional patterns in language and content. Continuing and refining the work of his teacher Milman Parry, Albert Lord had suggested that such an oral compositional process lay behind the elegant and complex epics in classical Greek that are attributed to Homer.