707 research outputs found

    DMFSGD: A Decentralized Matrix Factorization Algorithm for Network Distance Prediction

    The knowledge of end-to-end network distances is essential to many Internet applications. As active probing of all pairwise distances is infeasible in large-scale networks, a natural idea is to measure a few pairs and to predict the others without actually measuring them. This paper formulates the distance prediction problem as matrix completion, where the unknown entries of an incomplete matrix of pairwise distances are to be predicted. The problem is solvable because strong correlations among network distances exist and cause the constructed distance matrix to be low-rank. The new formulation circumvents the well-known drawbacks of existing approaches based on Euclidean embedding. A new algorithm, called Decentralized Matrix Factorization by Stochastic Gradient Descent (DMFSGD), is proposed to solve the network distance prediction problem. By letting network nodes exchange messages with each other, the algorithm is fully decentralized and only requires each node to collect and process local measurements, with neither explicit matrix constructions nor special nodes such as landmarks and central servers. In addition, we comprehensively compare matrix factorization and Euclidean embedding to demonstrate the suitability of the former for network distance prediction. We further study the incorporation of a robust loss function and of non-negativity constraints. Extensive experiments on various publicly available datasets of network delays show not only the scalability and accuracy of our approach but also its usability in real Internet applications. Comment: submitted to IEEE/ACM Transactions on Networking on Nov. 201
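    The core idea described above can be sketched as follows. This is a minimal illustrative toy, not the paper's exact formulation: it assumes the distance matrix D is approximated as X Yᵀ with low-rank factors, each node holding only its own rows and updating them from local probe measurements; all names, rates and the regularization scheme here are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 50, 5                  # number of nodes, assumed rank
    lr, reg = 0.01, 0.01          # learning rate, L2 regularization (illustrative)

    X = rng.random((n, r))        # row factor: one vector per node
    Y = rng.random((n, r))        # column factor: one vector per node

    # Synthetic low-rank "distance" matrix standing in for real measurements
    A = rng.random((n, r))
    D = A @ A.T

    def sgd_update(i, j, d_ij):
        """Nodes i and j update their local factors from one probed distance."""
        err = d_ij - X[i] @ Y[j]              # prediction error on this pair
        X[i] += lr * (err * Y[j] - reg * X[i])
        Y[j] += lr * (err * X[i] - reg * Y[j])

    # Each step, a random pair probes its distance; no global matrix is built
    for _ in range(200_000):
        i, j = rng.integers(0, n, size=2)
        sgd_update(i, j, D[i, j])

    mean_abs_err = float(np.mean(np.abs(X @ Y.T - D)))
    ```

    The decentralization comes from the update touching only the two rows owned by the probing pair, so each node can run it on local measurements alone.
    
    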

    Case study in Six Sigma methodology: manufacturing quality improvement and guidance for managers

    This article discusses the successful implementation of Six Sigma methodology in a high-precision and critical process in the manufacture of automotive products. The Six Sigma define–measure–analyse–improve–control approach resulted in a reduction of tolerance-related problems and improved the first-pass yield from 85% to 99.4%. Data were collected on all possible causes, and regression analysis, hypothesis testing, Taguchi methods, classification and regression trees, etc. were used to analyse the data and draw conclusions. Implementation of Six Sigma methodology had a significant financial impact on the profitability of the company: an approximate saving of US$70,000 per annum was reported, in addition to the customer-facing benefits of improved quality on returns and sales. The project also allowed the company to learn useful lessons that will guide future Six Sigma activities.

    The regulation of telecommunication in the United Kingdom of Great Britain & Northern Ireland

    This paper reviews the application of national antitrust law and the implementation of the European Union's telecommunications directives to the markets in the United Kingdom, against the declared policy objective of raising national competitiveness. It illustrates the complexity of the systems that have been created over three decades, with interlocking regulatory, self-regulatory, judicial and appellate bodies interacting with the parliamentary systems to form a regulatory state. Where markets have failed, or were thought likely to fail, the state at different levels (UK, national and municipal) has supported studies and subsidized the provision of broadband Internet access. The regulator, using its sectoral antitrust powers, agreed functional separation with British Telecom, transferring the enduring bottleneck of local access to a separate subsidiary. While the UK describes itself as a regulatory leader, this claim is difficult to evaluate, given the number and frequency of changes, and seems very difficult to substantiate. Keywords: Governance, Competitiveness, Regulatory state, Great Britain, United Kingdom

    Notes on notions around operational research


    A behavioural analysis of the adoption and use of interactive computer systems by senior managers

    The purpose of this research has been to make a contribution to knowledge about those processes and phenomena which influence the use of computer-based decision systems by senior managers for their own decision activities. In the course of the thesis, research questions are addressed which relate to the nature of the role of the directly accessed computer in the working life of the top manager, and especially to the factors which influence computer adoption and use. A review of relevant literature enabled gaps in existing knowledge about senior managerial computer use to be identified, and indicated the potential value of exploratory research. A programme of interviews was devised and executed which enabled the exploration of the research problem across a sample of senior managers from private and public organizations. It is felt that the methodology of performing intra- and inter-organizational comparisons among computer-exposed managers was fundamental to achieving new insights into managerial behaviours. Following qualitative and quantitative analysis of the research data, a dynamic behavioural model of the computer adoption process in large organizations is proposed, together with a description of salient behavioural features at key points in the process. This theoretical model contributes to an understanding of the nature and circumstances of the senior managerial behaviours associated with direct computer use.

    ProXcache: A new cache deployment strategy in information-centric networks for mitigating path and content redundancy

    One of the promising paradigms for resource sharing while maintaining the basic Internet semantics is Information-Centric Networking (ICN). ICN's distinction from the current Internet is its ability to refer to contents by name, partly dissociating from the host-to-host practice of Internet Protocol addresses. Moreover, content caching in ICN is the major mechanism for achieving content networking and reducing the amount of server access. The current caching practice in ICN, Leave Copy Everywhere (LCE), generates problems of over-deposition of contents known as content redundancy, as well as path redundancy, lower cache-hit rates in heterogeneous networks and lower content diversity. This study proposes a new cache deployment strategy, referred to as ProXcache, which acquires node relationships using the hyperedge concept of hypergraphs for cache positioning. The study formulates these relationships through path and distance approximation to mitigate content and path redundancy, and adopted the Design Research Methodology approach to achieve the stated research objectives. ProXcache was investigated by simulation on the Abilene, GEANT and DTelekom network topologies against the LCE and ProbCache caching strategies, with the Zipf distribution used to vary content categorization. The results show that overall content and path redundancy are minimized, with fewer caching operations: six depositions per request, compared to nine and nineteen for ProbCache and LCE respectively. ProXcache yields a better content diversity ratio of 80%, against 20% and 49% for LCE and ProbCache respectively, as the cache sizes are varied. ProXcache also improves the cache-hit ratio through proxy positions. These results have significant implications for the development of ICN towards better management of contents in the Future Internet.
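    The evaluation setup mentioned above — Zipf-distributed content requests driving a caching strategy and yielding a cache-hit ratio — can be sketched in a few lines. This is an illustrative toy, not the study's simulator: the LRU-evicting cache, the parameter names and the workload sizes are all assumptions, and LCE's cache-everything behaviour is approximated at a single node.

    ```python
    import random
    from collections import OrderedDict

    def zipf_weights(n_contents, alpha):
        """Unnormalized Zipf popularity: content of rank k has weight 1 / k**alpha."""
        return [1.0 / (k ** alpha) for k in range(1, n_contents + 1)]

    def draw_requests(n_contents, alpha, n_requests, seed=42):
        """Sample a request trace where a few top-ranked contents dominate."""
        rng = random.Random(seed)
        return rng.choices(range(n_contents),
                           weights=zipf_weights(n_contents, alpha),
                           k=n_requests)

    def cache_hit_ratio(requests, cache_size):
        """Single LCE-style node: cache every miss, evict least-recently used."""
        cache, hits = OrderedDict(), 0
        for c in requests:
            if c in cache:
                hits += 1
                cache.move_to_end(c)          # refresh recency on a hit
            else:
                cache[c] = True
                if len(cache) > cache_size:
                    cache.popitem(last=False)  # evict the LRU entry
        return hits / len(requests)

    reqs = draw_requests(n_contents=1000, alpha=0.8, n_requests=20_000)
    ratio = cache_hit_ratio(reqs, cache_size=50)
    ```

    Because the Zipf skew concentrates requests on a few popular contents, even a cache holding 5% of the catalogue achieves a hit ratio far above the 5% a uniform workload would give — which is why caching studies like the one above vary the Zipf parameter when comparing strategies.
    
    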

    Developing Manufacturing System Platforms
