
    The Motivation, Architecture and Demonstration of Ultralight Network Testbed

    In this paper we describe progress in the NSF-funded UltraLight project and a recent demonstration of UltraLight technologies at SuperComputing 2005 (SC|05). The goal of the UltraLight project is to help meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused approach. UltraLight adopts a new approach to networking: instead of treating the network traditionally, as a static, unchanging and unmanaged set of inter-computer links, we are developing and using it as a dynamic, configurable, and closely monitored resource that is managed end-to-end. Thus we are constructing a next-generation global system able to meet the data processing, distribution, access and analysis needs of the particle physics community. In this paper we present the motivation for, and an overview of, the UltraLight project. We then cover early results in the various working areas of the project. The remainder of the paper describes our experience with the UltraLight network architecture, kernel setup, application tuning and configuration used during the bandwidth challenge event at SC|05. During this challenge, we achieved a record-breaking aggregate data rate in excess of 150 Gbps while moving physics datasets between many sites interconnected by the UltraLight backbone network. The exercise highlighted the benefits of UltraLight's research and development efforts, which are enabling new and advanced methods of distributed scientific data analysis.
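    Since the abstract highlights application tuning for wide-area transfers, the following is a minimal, hypothetical sketch (not the project's actual configuration) of how an application might size its TCP socket buffers from the bandwidth-delay product of a long fat path; the 10 Gbps capacity and 100 ms round-trip time are illustrative assumptions.

    ```python
    import socket

    def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> int:
        """Bandwidth-delay product: bytes in flight needed to keep the path full."""
        return int(bandwidth_bps * rtt_s / 8)

    # Illustrative assumptions: a 10 Gbps path with 100 ms round-trip time.
    BANDWIDTH_BPS = 10e9
    RTT_S = 0.100
    buf = bdp_bytes(BANDWIDTH_BPS, RTT_S)  # 125,000,000 bytes (~125 MB)

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Request send/receive buffers large enough to cover the BDP; the kernel
    # clamps these to its configured maxima (net.core.wmem_max / rmem_max).
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, buf)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, buf)
    print(f"Requested {buf / 1e6:.0f} MB socket buffers for the transfer")
    ```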

    The Design and Demonstration of the Ultralight Testbed

    In this paper we present the motivation, the design, and a recent demonstration of the UltraLight testbed at SC|05. The goal of the UltraLight testbed is to help meet the data-intensive computing challenges of the next generation of particle physics experiments with a comprehensive, network-focused approach. UltraLight adopts a new approach to networking: instead of treating the network traditionally, as a static, unchanging and unmanaged set of inter-computer links, we are developing and using it as a dynamic, configurable, and closely monitored resource that is managed end-to-end. To achieve this goal we are constructing a next-generation global system able to meet the data processing, distribution, access and analysis needs of the particle physics community. In this paper we first present early results in the various working areas of the project. We then describe our experience with the network architecture, kernel setup, application tuning and configuration used during the bandwidth challenge event at SC|05. During this challenge, we achieved a record-breaking aggregate data rate in excess of 150 Gbps while moving physics datasets between many Grid computing sites.
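    This version of the paper likewise covers the kernel setup used for the bandwidth challenge; below is a hedged sketch of the kind of Linux sysctl tuning commonly applied for high bandwidth-delay-product TCP paths. The specific keys are standard kernel parameters, but the values and the congestion-control choice are illustrative assumptions, not the settings reported in the paper.

    ```python
    import subprocess

    # Illustrative kernel settings for high bandwidth-delay-product TCP paths;
    # the exact values used on the testbed hosts are not reproduced here.
    TUNING = {
        "net.core.rmem_max": 134217728,               # max receive buffer (128 MB)
        "net.core.wmem_max": 134217728,               # max send buffer (128 MB)
        "net.ipv4.tcp_rmem": "4096 87380 134217728",  # min / default / max receive
        "net.ipv4.tcp_wmem": "4096 65536 134217728",  # min / default / max send
        "net.ipv4.tcp_congestion_control": "cubic",   # assumed; stacks of that era varied
    }

    def apply_tuning(settings: dict) -> None:
        """Apply each sysctl key/value pair (requires root privileges)."""
        for key, value in settings.items():
            subprocess.run(["sysctl", "-w", f"{key}={value}"], check=True)

    if __name__ == "__main__":
        apply_tuning(TUNING)
    ```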

    Filling some black holes: modeling the connection between urbanization, infrastructure, and global service intensity in 112 metropolitan regions across the world

    This empirical article combines insights from previous research on the level of knowledge-intensive services in metropolitan areas with the aim of developing an understanding of the spatial structure of the global service economy. We use a stepwise regression model with GaWC's measure of globalized service provisioning as the dependent variable and a range of independent variables capturing population, infrastructure, urban primacy, and national regulation. The discussion of the results focuses on model parameters as well as the meaning of outliers, and is used to explore some avenues for future research.
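    As an illustration of the stepwise-regression approach described above (not the authors' actual code or data), a greedy forward selection can be sketched as follows; the variable names such as population or airport_capacity are hypothetical stand-ins for the paper's indicators.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    def forward_stepwise(df: pd.DataFrame, target: str, candidates: list[str]) -> list[str]:
        """Greedy forward selection: repeatedly add the predictor that most
        improves adjusted R^2, stopping when no candidate helps."""
        selected: list[str] = []
        remaining = list(candidates)
        best_adj_r2 = -np.inf
        while remaining:
            scores = []
            for var in remaining:
                X = sm.add_constant(df[selected + [var]])
                fit = sm.OLS(df[target], X).fit()
                scores.append((fit.rsquared_adj, var))
            top_score, top_var = max(scores)
            if top_score <= best_adj_r2:
                break  # no remaining variable improves the model
            best_adj_r2 = top_score
            selected.append(top_var)
            remaining.remove(top_var)
        return selected

    # Hypothetical usage with invented column names:
    # df = pd.read_csv("metro_regions.csv")
    # predictors = ["population", "airport_capacity", "internet_bandwidth",
    #               "urban_primacy", "regulation_index"]
    # print(forward_stepwise(df, "gawc_service_score", predictors))
    ```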

    Filling some black holes: modeling the connection between urbanization, infrastructure, and global service intensity

    This empirical article combines insights from previous research on the level of knowledge-intensive services in metropolitan areas with the aim of developing an understanding of the spatial structure of the global service economy. We use a stepwise regression model with the Globalization and World Cities (GaWC) research network's measure of globalized service provisioning as the dependent variable and a range of independent variables capturing population, infrastructure, urban primacy, and national regulation. The discussion of the results focuses on model parameters as well as the meaning of outliers, and is used to explore some avenues for future research.
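    Because the discussion in both versions turns on the meaning of outliers, here is a small hypothetical sketch of how such outliers might be flagged from a fitted model using standardized residuals; the threshold and column names are assumptions, not taken from the paper.

    ```python
    import pandas as pd
    import statsmodels.api as sm

    def flag_outliers(df: pd.DataFrame, target: str, predictors: list[str],
                      threshold: float = 2.0) -> pd.DataFrame:
        """Fit an OLS model and return the regions whose standardized residuals
        exceed the threshold, i.e. cities far more or less globally connected
        than the model predicts."""
        X = sm.add_constant(df[predictors])
        fit = sm.OLS(df[target], X).fit()
        influence = fit.get_influence()
        df = df.assign(std_resid=influence.resid_studentized_internal)
        return df[df["std_resid"].abs() > threshold]

    # Hypothetical usage:
    # outliers = flag_outliers(df, "gawc_service_score",
    #                          ["population", "airport_capacity", "regulation_index"])
    # print(outliers[["city", "std_resid"]])
    ```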

    The Archives Unleashed Project: Technology, Process, and Community to Improve Scholarly Access to Web Archives

    The Archives Unleashed project aims to improve scholarly access to web archives through a multi-pronged strategy involving tool creation, process modeling, and community building, all proceeding concurrently in mutually reinforcing efforts. As we near the end of our initially conceived three-year project, we report on our progress and share lessons learned along the way. The main contribution articulated in this paper is a process model that decomposes scholarly inquiries into four main activities: filter, extract, aggregate, and visualize. Based on the insight that these activities can be disaggregated across time, space, and tools, it is possible to generate "derivative products", using our Archives Unleashed Toolkit, that serve as useful starting points for scholarly inquiry. Scholars can download these products from the Archives Unleashed Cloud and manipulate them just like any other dataset, thus providing access to web archives without requiring any specialized knowledge. Over the past few years, our platform has processed over a thousand different collections from over two hundred users, totaling around 300 terabytes of web archives. This research was supported by the Andrew W. Mellon Foundation, the Social Sciences and Humanities Research Council of Canada, as well as Start Smart Labs, Compute Canada, the University of Waterloo, and York University. We would like to thank Jeremy Wiebe, Ryan Deschamps, and Gursimran Singh for their contributions.
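    To make the filter, extract, aggregate, visualize process model concrete, here is a toy sketch over plain Python records standing in for archived web pages. It deliberately does not use the Archives Unleashed Toolkit's actual API; the record fields and contents are invented for illustration only.

    ```python
    from collections import Counter

    # Toy records standing in for crawled pages; a real derivative product would
    # come from the Archives Unleashed Toolkit, whose API is not reproduced here.
    records = [
        {"crawl_date": "20190401", "domain": "example.org", "text": "budget report city budget"},
        {"crawl_date": "20190402", "domain": "example.org", "text": "election results city council"},
        {"crawl_date": "20190402", "domain": "other.net",   "text": "unrelated content"},
    ]

    # Filter: keep only pages from the domain of interest.
    filtered = [r for r in records if r["domain"] == "example.org"]

    # Extract: pull out the tokens of each page's text.
    tokens = [word for r in filtered for word in r["text"].split()]

    # Aggregate: count term frequencies across the filtered collection.
    counts = Counter(tokens)

    # Visualize: print the top terms (a stand-in for a chart or word cloud).
    for term, n in counts.most_common(5):
        print(f"{term}: {n}")
    ```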

    From Social Data Mining to Forecasting Socio-Economic Crisis

    Socio-economic data mining has great potential for gaining a better understanding of problems that our economy and society are facing, such as financial instability, shortages of resources, or conflicts. Without large-scale data mining, progress in these areas seems hard or impossible. Therefore, a suitable, distributed data mining infrastructure and research centers should be built in Europe. It also appears appropriate to build a network of Crisis Observatories. These can be imagined as laboratories devoted to gathering and processing enormous volumes of data on both natural systems, such as the Earth and its ecosystem, and human techno-socio-economic systems, so as to gain early warnings of impending events. Reality mining provides the chance to adapt more quickly and more accurately to changing situations. Further opportunities arise from individually customized services, which, however, should be provided in a privacy-respecting way. This requires the development of novel ICT (such as a self-organizing Web), but most likely new legal regulations and suitable institutions as well. As long as such regulations are lacking on a worldwide scale, it is in the public interest that scientists explore what can be done with the huge amounts of data available. Big data do have the potential to change or even threaten democratic societies. The same applies to sudden and large-scale failures of ICT systems. Therefore, dealing with data must be done with a large degree of responsibility and care. Self-interests of individuals, companies or institutions have limits where the public interest is affected, and public interest is not a sufficient justification to violate the human rights of individuals. Privacy is a high good, as is confidentiality, and damaging it would have serious side effects for society.
    Comment: 65 pages, 1 figure, Visioneer White Paper, see http://www.visioneer.ethz.c
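    The Crisis Observatories sketched here are meant to turn streams of data into early warnings; as one very reduced, hypothetical illustration of that idea, the snippet below flags points in a time series that deviate strongly from a trailing baseline. The series, window, and threshold are invented.

    ```python
    import numpy as np

    def early_warnings(series: np.ndarray, window: int = 30, z_threshold: float = 3.0) -> list[int]:
        """Return indices where a value deviates from the trailing mean by more
        than z_threshold trailing standard deviations (a crude early-warning flag)."""
        flags = []
        for i in range(window, len(series)):
            history = series[i - window:i]
            mu, sigma = history.mean(), history.std()
            if sigma > 0 and abs(series[i] - mu) > z_threshold * sigma:
                flags.append(i)
        return flags

    # Invented example: a mostly calm indicator with one sudden spike.
    rng = np.random.default_rng(0)
    indicator = rng.normal(0.0, 1.0, 200)
    indicator[150] += 10.0
    print(early_warnings(indicator))  # expected to include index 150
    ```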

    An Empirical Analysis and Evaluation of Internet Robustness

    The study of network robustness is a critical tool for understanding complex interconnected systems such as the Internet, which, due to digitalization, faces an increasing prevalence of cyberattacks. Robustness is the ability of a network to maintain its basic functionality even when some of its components, in this case nodes or edges, fail. Despite the importance of the Internet in the global economic system, empirical analyses of the global pattern of Internet traffic carried over backbone connections, which can be modeled as an interconnected network of nodes and edges between which bandwidth flows, are rare. Hence, in this thesis I use metrics based on graph properties of network models to evaluate the robustness of the backbone network, supported by international cybersecurity ratings. These ratings are adapted from the Global Cybersecurity Index, which measures countries' commitments to cybersecurity and ranks countries based on their cybersecurity strategies. The empirical analysis follows a three-step process: first mapping the Internet as a network of networks, then analysing the various networks and country profiles, and finally assessing each regional network's robustness. Using TeleGeography and ITU data, the results show that regions whose countries have higher cybersecurity ratings have more robust networks than regions whose countries have lower cybersecurity ratings.
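    As a hedged illustration of the graph-based robustness metrics this thesis describes (not its actual code or the TeleGeography data), the sketch below removes nodes from a network in descending degree order and tracks the size of the largest connected component, a standard robustness curve; the synthetic graph is a stand-in for a regional backbone network.

    ```python
    import networkx as nx

    def robustness_curve(G: nx.Graph) -> list[float]:
        """Remove nodes in descending initial-degree order (targeted attack) and
        record the fraction of nodes remaining in the largest connected component."""
        H = G.copy()
        n = H.number_of_nodes()
        curve = []
        order = sorted(H.degree, key=lambda kv: kv[1], reverse=True)
        for node, _ in order:
            H.remove_node(node)
            if H.number_of_nodes() == 0:
                curve.append(0.0)
                continue
            giant = max(nx.connected_components(H), key=len)
            curve.append(len(giant) / n)
        return curve

    # Toy stand-in for a regional backbone graph; real input would come from
    # TeleGeography-style link data rather than a synthetic model.
    G = nx.barabasi_albert_graph(100, 2, seed=42)
    print(robustness_curve(G)[:10])
    ```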

    Network Kriging

    Network service providers and customers are often concerned with aggregate performance measures that span multiple network paths. Unfortunately, forming such network-wide measures can be difficult because of the scale involved: the number of paths grows too rapidly with the number of endpoints to make exhaustive measurement practical. As a result, it is of interest to explore the feasibility of methods that dramatically reduce the number of paths measured in such situations while maintaining acceptable accuracy. We cast the problem as one of statistical prediction, in the spirit of the so-called 'kriging' problem in spatial statistics, and show that end-to-end network properties may be accurately predicted in many cases using a surprisingly small set of carefully chosen paths. More precisely, we formulate a general framework for the prediction problem, propose a class of linear predictors for standard quantities of interest (e.g., averages, totals, differences) and show that linear algebraic methods of subset selection may be used to effectively choose which paths to measure. We characterize the performance of the resulting methods, both analytically and numerically. The success of our methods derives from the low effective rank of routing matrices as encountered in practice, which appears to be a new observation in its own right with potentially broad implications for network measurement.
    Comment: 16 pages, 9 figures, single-space
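    To make the path-prediction idea concrete, here is a small hypothetical sketch in the spirit of the paper, not the authors' implementation: path delays are generated from per-link delays through a routing matrix, a subset of paths is "measured", and the remaining paths are predicted by least squares. The random measured subset is an assumption; the paper instead selects paths via subset selection on the routing matrix.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical routing matrix: 40 paths over 15 links, each path using a few links.
    n_paths, n_links = 40, 15
    G = (rng.random((n_paths, n_links)) < 0.3).astype(float)

    # True per-link delays and the resulting end-to-end path delays.
    link_delay = rng.exponential(scale=5.0, size=n_links)
    path_delay = G @ link_delay

    # "Measure" only a subset of paths (chosen at random here for simplicity).
    measured = rng.choice(n_paths, size=12, replace=False)
    G_meas, y_meas = G[measured], path_delay[measured]

    # Least-squares estimate of link delays from the measured paths, then
    # prediction of every path; the low effective rank of G is what makes a
    # small measured subset informative about the unmeasured paths.
    link_hat, *_ = np.linalg.lstsq(G_meas, y_meas, rcond=None)
    pred = G @ link_hat

    err = np.abs(pred - path_delay).mean()
    print(f"mean absolute prediction error over all {n_paths} paths: {err:.3f}")
    ```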