
    MeshfreeFlowNet: A Physics-Constrained Deep Continuous Space-Time Super-Resolution Framework

    We propose MeshfreeFlowNet, a novel deep-learning-based super-resolution framework that generates continuous (grid-free) spatio-temporal solutions from low-resolution inputs. While computationally efficient, MeshfreeFlowNet accurately recovers the fine-scale quantities of interest. MeshfreeFlowNet allows for: (i) the output to be sampled at any spatio-temporal resolution, (ii) a set of Partial Differential Equation (PDE) constraints to be imposed, and (iii) training on fixed-size inputs over arbitrarily sized spatio-temporal domains, owing to its fully convolutional encoder. We empirically study the performance of MeshfreeFlowNet on the task of super-resolution of turbulent flows in the Rayleigh-Bénard convection problem. Across a diverse set of evaluation metrics, we show that MeshfreeFlowNet significantly outperforms existing baselines. Furthermore, we provide a large-scale implementation of MeshfreeFlowNet and show that it scales efficiently across large clusters, achieving 96.80% scaling efficiency on up to 128 GPUs and a training time of less than 4 minutes.
    Comment: Supplementary Video: https://youtu.be/mjqwPch9gDo. Accepted to SC2
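    As a rough illustration of the PDE-constraint idea mentioned in the abstract (not the authors' implementation; the toy decoder, loss weighting, and the choice of equation are assumptions), one can penalize the residual of a governing equation at randomly sampled continuous query points using automatic differentiation. The sketch below imposes an incompressibility constraint, du/dx + dv/dy = 0, as a soft loss:

        # Hypothetical sketch: a PDE constraint (incompressibility) imposed as a
        # soft loss on a continuous space-time decoder f(x, y, t) -> (u, v).
        # The tiny MLP is an illustrative stand-in, not the paper's architecture.
        import torch
        import torch.nn as nn

        decoder = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 2))

        def pde_residual_loss(decoder, n_samples=1024):
            # Random continuous (x, y, t) query points; requires_grad lets autograd
            # compute spatial derivatives of the decoded velocity field.
            pts = torch.rand(n_samples, 3, requires_grad=True)
            u, v = decoder(pts).unbind(dim=1)
            du = torch.autograd.grad(u.sum(), pts, create_graph=True)[0]
            dv = torch.autograd.grad(v.sum(), pts, create_graph=True)[0]
            divergence = du[:, 0] + dv[:, 1]   # du/dx + dv/dy
            return (divergence ** 2).mean()

        # total_loss = mse_on_high_res_samples + lambda_pde * pde_residual_loss(decoder)

    In this pattern the equation residual acts as a regularizer alongside the ordinary regression loss on high-resolution samples.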

    Collaboration in sensor network research: an in-depth longitudinal analysis of assortative mixing patterns

    Many investigations of scientific collaboration are based on statistical analyses of large networks constructed from bibliographic repositories. These investigations often rely on a wealth of bibliographic data but very little or no other information about the individuals in the network, and thus fail to illustrate the broader social and academic landscape in which collaboration takes place. In this article, we perform an in-depth longitudinal analysis of a relatively small network of scientific collaboration (N = 291) constructed from the bibliographic record of a research center involved in the development and application of sensor network and wireless technologies. We perform a preliminary analysis of selected structural properties of the network, computing its range, configuration, and topology. We then support our preliminary statistical analysis with an in-depth temporal investigation of the assortative mixing of selected node characteristics, unveiling the researchers' propensity to collaborate preferentially with others who have a similar academic profile. Our qualitative analysis of mixing patterns offers clues as to the nature of the scientific community being modeled in relation to its organizational, disciplinary, institutional, and international arrangements of collaboration.
    Comment: Scientometrics (In press
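    To make the notion of assortative mixing concrete (a generic illustration with networkx, not the article's data or code; the node names and the "role" attribute are made up), one can measure whether collaborators tend to share a characteristic:

        # Minimal illustration of assortative mixing in a co-authorship graph.
        # A coefficient near +1 means researchers mostly collaborate with others
        # of the same type; near -1 means they mostly mix across types.
        import networkx as nx

        G = nx.Graph()
        G.add_nodes_from([
            ("alice", {"role": "faculty"}), ("bob",  {"role": "student"}),
            ("carol", {"role": "faculty"}), ("dave", {"role": "student"}),
        ])
        G.add_edges_from([("alice", "carol"), ("alice", "bob"), ("bob", "dave")])

        print(nx.degree_assortativity_coefficient(G))             # mixing by degree
        print(nx.attribute_assortativity_coefficient(G, "role"))  # mixing by attribute

    Repeating such measurements over time slices of the network yields the kind of longitudinal mixing-pattern analysis the abstract describes.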

    The State of Network Neutrality Regulation

    The Network Neutrality (NN) debate refers to the battle over the design of a regulatory framework for preserving the Internet as a public network and open innovation platform. Fueled by concerns that broadband access service providers might abuse network management to discriminate against third-party providers (e.g., content or application providers), policymakers have struggled to design rules that would protect the Internet from unreasonable network management practices. In this article, we provide an overview of the history of the debate in the U.S. and the EU and highlight the challenges that will confront network engineers designing and operating networks as the debate continues to evolve.
    BMBF, 16DII111, joint project: Weizenbaum-Institut für die vernetzte Gesellschaft - Das Deutsche Internet-Institut; subproject: Wissenschaftszentrum Berlin für Sozialforschung (WZB)
    EC/H2020/679158/EU/Resolving the Tussle in the Internet: Mapping, Architecture, and Policy Making/ResolutioNe

    Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes

    I argue that data becomes temporarily interesting by itself to some self-improving, but computationally limited, subjective observer once he learns to predict or compress the data in a better way, thus making it subjectively simpler and more beautiful. Curiosity is the desire to create or discover more non-random, non-arbitrary, regular data that is novel and surprising not in the traditional sense of Boltzmann and Shannon, but in the sense that it allows for compression progress because its regularity was not yet known. This drive maximizes interestingness, the first derivative of subjective beauty or compressibility, that is, the steepness of the learning curve. It motivates exploring infants, pure mathematicians, composers, artists, dancers, comedians, yourself, and (since 1990) artificial systems.
    Comment: 35 pages, 3 figures, based on KES 2008 keynote and ALT 2007 / DS 2007 joint invited lectur
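    A toy way to read "interestingness as the first derivative of compressibility" (purely illustrative; the essay's compressor is an adaptive, learning model, whereas the zlib compression levels here merely stand in for the observer's compressor before and after an improvement):

        # Intrinsic reward = bits saved when an improved compressor re-encodes
        # the same observation history, i.e. compression progress.
        import zlib

        def code_length_bits(history: bytes, level: int) -> int:
            return 8 * len(zlib.compress(history, level))

        def compression_progress(history: bytes, old_level: int, new_level: int) -> int:
            # Positive values mean the history just became subjectively simpler,
            # i.e. temporarily interesting to the observer.
            return code_length_bits(history, old_level) - code_length_bits(history, new_level)

        history = bytes(range(256)) * 64   # a regular observation stream
        print(compression_progress(history, old_level=1, new_level=9))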