Design Index for Deep Neural Networks
In this paper, we propose a Deep Neural Networks (DNN) Design Index to aid a DNN designer during the design phase. We study the design of DNNs from model-specific and data-specific perspectives, with a focus on three performance metrics: training time, training error, and validation error. We use a simple example to illustrate the significance of the DNN design index. To validate it, we calculate the design indices for four benchmark problems. This is an elementary work aimed at setting a direction for creating design indices pertaining to deep learning.
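The abstract names three metrics but does not give the index formula. The following is a minimal sketch of one plausible way such an index could combine them; the min-max normalization, equal weights, and bounds are illustrative assumptions, not the paper's method.

```python
# Hypothetical design index: a weighted sum of normalized metrics.
# Lower is better for all three inputs, so a lower index is better.
# Weights and normalization bounds below are illustrative assumptions.

def design_index(train_time, train_err, val_err, bounds,
                 weights=(1 / 3, 1 / 3, 1 / 3)):
    """bounds: {"time": (lo, hi), "train": (lo, hi), "val": (lo, hi)}"""
    def norm(x, lo, hi):
        return (x - lo) / (hi - lo) if hi > lo else 0.0

    wt, we, wv = weights
    return (wt * norm(train_time, *bounds["time"])
            + we * norm(train_err, *bounds["train"])
            + wv * norm(val_err, *bounds["val"]))

# Comparing two candidate designs under the same bounds:
bounds = {"time": (0, 3600), "train": (0.0, 1.0), "val": (0.0, 1.0)}
a = design_index(1200, 0.05, 0.08, bounds)  # fast, moderate errors
b = design_index(2400, 0.02, 0.12, bounds)  # slower, overfits more
```

With equal weights, design `a` scores lower (better) than `b`: its longer validation error is outweighed by its much shorter training time.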
A Visual Analytics Framework for Reviewing Streaming Performance Data
Understanding and tuning the performance of extreme-scale parallel computing
systems demands a streaming approach due to the computational cost of applying
offline algorithms to vast amounts of performance log data. Analyzing large
streaming data is challenging because the rate of receiving data and limited
time to comprehend data make it difficult for the analysts to sufficiently
examine the data without missing important changes or patterns. To support
streaming data analysis, we introduce a visual analytics framework comprising
three modules: data management, analysis, and interactive visualization. The
data management module collects various computing and communication performance
metrics from the monitored system using streaming data processing techniques
and feeds the data to the other two modules. The analysis module automatically
identifies important changes and patterns at the required latency. In
particular, we introduce a set of online and progressive analysis methods for
not only controlling the computational costs but also helping analysts better
follow the critical aspects of the analysis results. Finally, the interactive
visualization module provides the analysts with a coherent view of the changes
and patterns in the continuously captured performance data. Through a
multi-faceted case study on performance analysis of parallel discrete-event
simulation, we demonstrate the effectiveness of our framework for identifying
bottlenecks and locating outliers.
Comment: This is the author's preprint version that will be published in Proceedings of IEEE Pacific Visualization Symposium, 202
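The analysis module is described as using online and progressive methods to flag important changes at a required latency. The sketch below is not the paper's algorithm; it is a minimal online change detector (Welford's running mean/variance plus a z-score threshold) meant only to show the streaming, constant-memory shape such a module takes.

```python
# Minimal online anomaly flagging for a stream of performance metrics.
# Assumption: a simple z-score rule stands in for the paper's (unspecified
# here) online/progressive analysis methods.
import math

class OnlineChangeDetector:
    def __init__(self, z_threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's algorithm)
        self.z_threshold = z_threshold

    def update(self, x):
        """Feed one sample; return True if it deviates sharply from history."""
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_threshold
        else:
            anomalous = False  # not enough history yet
        # Welford update: O(1) time and memory per sample.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

det = OnlineChangeDetector()
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 9.8, 10.0, 25.0]
flags = [det.update(x) for x in stream]  # only the final spike is flagged
```

The constant per-sample cost is what makes this style of analysis viable where offline algorithms over the full log are too expensive.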
Country Concepts and the Rational Actor Trap: Limitations to Strategic Management of International NGOs
Growing criticism of inefficient development aid demanded new planning instruments from donors, including international NGOs (INGOs). A reorientation from isolated project planning towards holistic country concepts, together with the increasing rationality of a result-orientated planning process, was seen as the answer. However, whether these country concepts, newly introduced by major INGOs as well, have increased the efficiency of development cooperation is open to question. Firstly, there have been counteracting external factors, such as the globalization of the aid business, that demanded structural changes in the composition of INGO portfolios towards growing short-term humanitarian aid; this was hardly compatible with the requirements of medium-term country planning. Secondly, the underlying vision of rationality as a remedy for the major ills of development aid was in itself a fallacy. A major change in the methodology of planning, closely connected with a shift of emphasis in the approach to development cooperation, away from project planning and service delivery towards supporting the socio-cultural and political environment of the recipient communities, demands a reorientation of aid management: the most urgent change needed is on the part of donors, away from the blinkers of result-orientated planning towards participative organizational cultures of learning.
Market Exchange and the Rule of Law: Confidence in Predictability
Law and economics is a significant field of analysis in legal studies and in economics, although there have been a number of controversies about how best to understand the relationship between economic relations and the regulatory role of law. Rather than surveying this field, criticizing various theories, and engaging in the dispute between different perspectives on the relationship between the two, in this article I take an approach rooted neither in mainstream economics nor in formal legal philosophy. Rather, drawing on a recent well-rounded statement of behavioural economics and a synthesis of previous work on the narrative of the rule of law, I seek to explore how and why contemporary capitalism seems to have become so tied up with the rule of law, and what this might tell us more generally about the role of law in market relations. This analysis goes beyond the relatively commonplace observation that capitalism requires property rights, contract law and market institutionalisation to function, to ask ‘what exactly is it about the rule of law that seems so necessary to establishing and maintaining market exchange(s)?’
Efficient execution of Time Warp programs on heterogeneous, NOW platforms
Ph.D. Richard M. Fujimoto
Large-Scale TCP Models Using Optimistic Parallel Simulation
Internet data traffic is doubling each year, yet bandwidth does not appear to be growing as fast as expected, and thus shortfalls in available bandwidth, particularly at the “last mile”, may result. To address these bandwidth allocation and congestion problems, researchers are proposing new overlay networks that provide a high quality of service and a near lossless guarantee. However, the central question raised by these new services is what impact will they have in the large? To address these and other network engineering research questions, high-performance simulation tools are required. However, to date, optimistic techniques have been viewed as operating outside of the performance envelope for Internet protocols, such as TCP, OSPF and BGP. In this paper, we dispel those views and demonstrate that optimistic protocols are able to efficiently simulate large-scale TCP scenarios for realistic network topologies using a single Hyper-Threaded computing system costing less than $7,000 USD. For our real-world topology, we use the core AT&T US network. Our optimistic simulator yields extremely high efficiency, and many of our performance runs produce zero rollbacks. Our compact modeling framework reduces the amount of memory required per TCP connection; our memory overhead per connection for one of our largest experimental network topologies was 2.6 KB. That value comprised all events used to model TCP packets, TCP connection state and routing information.
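The 2.6 KB-per-connection figure is the abstract's key capacity claim. A back-of-the-envelope check shows what it implies at scale; the connection count below is illustrative, not from the paper.

```python
# Scaling the abstract's reported ~2.6 KB of overhead per TCP connection
# (events, connection state, routing info). The one-million-connection
# scenario is an illustrative assumption.
KB = 1024

def total_memory_gb(connections, per_conn_kb=2.6):
    return connections * per_conn_kb * KB / (1024 ** 3)

one_million = total_memory_gb(1_000_000)  # roughly 2.5 GB of overhead
```

At that overhead, a million simulated connections fit comfortably in the memory of a single commodity machine, which is consistent with the paper's single-system claim.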
An Abstract Internet Topology Model for Simulating Peer-to-Peer Content Distribution
In recent years, many researchers have run simulations of the Internet. The Internet’s inherent heterogeneity and constantly changing nature make it difficult to construct a realistic, yet computationally feasible model. In the construction of any model, one must take into consideration flexibility, accuracy, required resources, execution time, and realism. In this paper, we discuss the methodology and creation of a model used to simulate Internet content distribution, and the rationale behind its design. In particular, we are interested in modeling the in-home consumer broadband Internet, while preserving geographical market relationships. In our performance study, our simulations achieve substantial speedups and require a fraction of the memory of other models, without sacrificing the accuracy of our findings. Specifically, our piece-level model achieves the accuracy of a packet-level model, while requiring the processing of 40 times fewer events.
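The 40x event reduction follows from the model's granularity: a piece-level model schedules one event per content piece, while a packet-level model schedules one per packet of that piece. The sketch below makes that arithmetic explicit; the piece and packet sizes are illustrative assumptions (the abstract does not give them), chosen only because a 60 KB piece over 1500-byte packets happens to reproduce a factor of 40.

```python
# Event-count ratio between packet-level and piece-level simulation.
# Piece and packet sizes are assumptions for illustration.
import math

def event_reduction(piece_bytes, packet_bytes):
    """Packet-level events per single piece-level event."""
    return math.ceil(piece_bytes / packet_bytes)

factor = event_reduction(60_000, 1500)  # one parameterization giving 40x
```

The real reduction depends on piece size, MTU, and which protocol events each model retains, so the factor is model-specific rather than universal.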