42,881 research outputs found
Undermining and Strengthening Social Networks through Network Modification
Social networks have well documented effects at the individual and aggregate
level. Consequently, it is often useful to understand how an attempt to
influence a network will change its structure and thereby achieve other
goals. We develop a framework for network modification that allows for
arbitrary objective functions, types of modification (e.g. edge weight
addition, edge weight removal, node removal, and covariate value change), and
recovery mechanisms (i.e. how a network responds to interventions). The
framework outlined in this paper both helps to situate the existing work on
network interventions and opens up many new possibilities for intervening
in networks. In particular, we use two case studies to highlight the potential
impact of empirically calibrating the objective function and network recovery
mechanisms as well as showing how interventions beyond node removal can be
optimised. First, we simulate an optimal removal of nodes from the Noordin
terrorist network in order to reduce the expected number of attacks (based on
empirically predicting the terrorist collaboration network from multiple types
of network ties). Second, we simulate optimally strengthening ties within
entrepreneurial ecosystems in six developing countries. In both cases we
estimate ERGM models to simulate how a network will endogenously evolve after
intervention.
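The abstract's node-removal intervention can be illustrated with a minimal, generic sketch. This is not the authors' ERGM-based pipeline: it is a plain greedy optimisation over an arbitrary objective function, with a toy graph and objective (size of the largest connected component) invented for illustration.

```python
def largest_component(adj):
    """Size of the largest connected component of an undirected graph
    given as {node: set(neighbours)} -- a crude proxy objective for a
    network's capacity to coordinate."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best

def remove_node(adj, n):
    """Return a copy of the graph with node n (and its edges) removed."""
    return {u: {v for v in nbrs if v != n} for u, nbrs in adj.items() if u != n}

def greedy_removal(adj, objective, k):
    """Remove k nodes, each time picking the node whose removal most
    reduces the objective function."""
    removed = []
    for _ in range(k):
        best = min(adj, key=lambda n: objective(remove_node(adj, n)))
        adj = remove_node(adj, best)
        removed.append(best)
    return removed, adj

# Toy network: two clusters joined through a broker node 'c'.
G = {'a': {'b', 'c'}, 'b': {'a', 'c'}, 'c': {'a', 'b', 'd'},
     'd': {'c', 'e'}, 'e': {'d', 'f'}, 'f': {'e'}}
removed, G2 = greedy_removal(G, largest_component, k=1)
```

In this toy case the greedy step removes the broker node, splitting the network; the framework in the abstract additionally lets the network respond and recover after each intervention, which this sketch omits.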
Automatic Prediction Of Small Group Performance In Information Sharing Tasks
In this paper, we describe a novel approach, based on Markov jump processes,
to model small group conversational dynamics and to predict small group
performance. More precisely, we estimate conversational events such as
turn-taking, backchannels, and turn-transitions at the micro-level (1-minute windows)
and then we bridge the micro-level behavior and the macro-level performance. We
tested our approach with a cooperative task, the Information Sharing task, and
we verified the relevance of micro-level interaction dynamics in determining
good group performance (e.g. higher speaking turn rate and more balanced
participation among group members).
Comment: Presented at Collective Intelligence conference, 2012 (arXiv:1204.2991)
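A Markov jump process of the kind the abstract describes can be sampled directly: exponential holding time in each state, then a jump drawn in proportion to the outgoing rates. The states and rate values below are illustrative stand-ins, not parameters estimated from the paper's corpus.

```python
import random

# Conversational states and transition rates (per second) between them.
STATES = ["A_speaks", "B_speaks", "silence"]
RATES = {
    "A_speaks": {"B_speaks": 0.10, "silence": 0.05},
    "B_speaks": {"A_speaks": 0.10, "silence": 0.05},
    "silence":  {"A_speaks": 0.20, "B_speaks": 0.20},
}

def simulate_mjp(start, horizon, rng):
    """Sample one trajectory of a continuous-time Markov jump process
    up to `horizon` seconds; returns a list of (time, new_state) events."""
    t, state, events = 0.0, start, []
    while True:
        total_rate = sum(RATES[state].values())
        t += rng.expovariate(total_rate)   # exponential holding time
        if t >= horizon:
            return events
        r, acc = rng.random() * total_rate, 0.0
        for nxt, rate in RATES[state].items():
            acc += rate
            if r <= acc:
                state = nxt
                break
        events.append((t, state))

rng = random.Random(0)
events = simulate_mjp("silence", horizon=60.0, rng=rng)  # one 1-minute window
turn_switches = sum(1 for _, s in events if s in ("A_speaks", "B_speaks"))
```

Counting events per 1-minute window, as above for turn switches, yields the micro-level features that the paper then links to macro-level group performance.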
Predicting Intermediate Storage Performance for Workflow Applications
Configuring a storage system to better serve an application is a challenging
task complicated by a multidimensional, discrete configuration space and the
high cost of space exploration (e.g., by running the application with different
storage configurations). To enable selecting the best configuration in a
reasonable time, we design an end-to-end performance prediction mechanism that
estimates the turn-around time of an application using a storage system under a
given configuration. This approach focuses on a generic object-based storage
system design, supports exploring the impact of optimizations targeting
workflow applications (e.g., various data placement schemes) in addition to
other, more traditional, configuration knobs (e.g., stripe size or replication
level), and models the system operation at data-chunk and control message
level.
This paper presents our experience to date with designing and using this
prediction mechanism. We evaluate this mechanism using micro- as well as
synthetic benchmarks mimicking real workflow applications, and a real
application. A preliminary evaluation shows that we are on track to
meet our objectives: the mechanism can scale to model a workflow application run on an
entire cluster while offering an over 200x speedup factor (normalized by
resource) compared to running the actual application, and can achieve, in the
limited number of scenarios we study, a prediction accuracy that enables
identifying the best storage system configuration.
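The selection step the abstract describes, searching a multidimensional, discrete configuration space with a cheap predictor instead of real runs, can be sketched as an exhaustive search over a toy cost model. The knob names follow the abstract (stripe size, replication level, placement scheme), but the cost formula is invented for illustration, not the paper's chunk- and message-level model.

```python
from itertools import product

# Hypothetical discrete configuration space.
CONFIG_SPACE = {
    "stripe_size_mb": [1, 4, 16, 64],
    "replication": [1, 2, 3],
    "placement": ["local", "striped"],
}

def predicted_turnaround(cfg, data_gb=100.0):
    """Toy analytic cost model (illustrative only): larger stripes cut
    per-chunk control-message overhead, replication multiplies write
    volume, and local placement avoids network transfer between stages."""
    chunks = data_gb * 1024 / cfg["stripe_size_mb"]
    overhead = 0.002 * chunks                       # control messages
    write = 0.5 * data_gb * cfg["replication"]      # replicated writes
    transfer = 0.0 if cfg["placement"] == "local" else 0.2 * data_gb
    return overhead + write + transfer

def best_config(space, cost):
    """Exhaustively evaluate the predictor over the whole configuration
    space and return the configuration with the lowest predicted time."""
    keys = list(space)
    configs = (dict(zip(keys, vals)) for vals in product(*space.values()))
    return min(configs, key=cost)

best = best_config(CONFIG_SPACE, predicted_turnaround)
```

With 24 configurations the search is trivial; the point of the paper's predictor is that each `cost` evaluation is orders of magnitude cheaper than actually running the application under that configuration.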
Human Computation and Convergence
Humans are the most effective integrators and producers of information,
directly and through the use of information-processing inventions. As these
inventions become increasingly sophisticated, the substantive role of humans in
processing information will tend toward capabilities that derive from our most
complex cognitive processes, e.g., abstraction, creativity, and applied world
knowledge. Through the advancement of human computation - methods that leverage
the respective strengths of humans and machines in distributed
information-processing systems - formerly discrete processes will combine
synergistically into increasingly integrated and complex information processing
systems. These new, collective systems will exhibit an unprecedented degree of
predictive accuracy in modeling physical and techno-social processes, and may
ultimately coalesce into a single unified predictive organism, with the
capacity to address society's most wicked problems and achieve planetary
homeostasis.
Comment: Pre-publication draft of chapter. 24 pages, 3 figures; added
references to pages 1 and 3, and corrected typos
Early Turn-taking Prediction with Spiking Neural Networks for Human Robot Collaboration
Turn-taking is essential to the structure of human teamwork. Humans are
typically aware of team members' intention to keep or relinquish their turn
before a turn switch, where the responsibility of working on a shared task is
shifted. Future co-robots are also expected to provide such competence. To that
end, this paper proposes the Cognitive Turn-taking Model (CTTM), which
leverages cognitive models (i.e., Spiking Neural Network) to achieve early
turn-taking prediction. The CTTM framework can process multimodal human
communication cues (both implicit and explicit) and predict human turn-taking
intentions in an early stage. The proposed framework is tested on a simulated
surgical procedure, where a robotic scrub nurse predicts the surgeon's
turn-taking intention. It was found that the proposed CTTM framework
outperforms the state-of-the-art turn-taking prediction algorithms by a large
margin. It also outperforms humans when presented with partial observations of
communication cues (i.e., less than 40% of full actions). This early prediction
capability enables robots to initiate turn-taking actions at an early stage,
which facilitates collaboration and increases overall efficiency.
Comment: Submitted to IEEE International Conference on Robotics and Automation
(ICRA) 201
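The early-prediction behaviour described above can be illustrated with the basic building block of spiking networks, a leaky integrate-and-fire neuron. This is only a sketch of the mechanism, with made-up parameters, not the paper's CTTM: evidence from a stream of communication cues is integrated until a spike (the turn-switch prediction) fires before the full action has been observed.

```python
def lif_first_spike(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Minimal leaky integrate-and-fire neuron: leaky integration of a
    sequence of input currents; returns the index of the first time step
    at which the membrane potential crosses threshold, or None."""
    v = 0.0
    for t, current in enumerate(inputs):
        v += dt * (-v / tau + current)   # leak plus input drive
        if v >= threshold:
            return t
    return None

# A cue stream that ramps up as a turn switch becomes likely; the neuron
# crosses threshold early, before the cue sequence has finished.
cues = [0.05, 0.1, 0.2, 0.3, 0.4, 0.5, 0.5, 0.5]
t_fire = lif_first_spike(cues)
```

Firing at an intermediate step, as here, is the single-neuron analogue of predicting a turn switch from a partial observation of the action.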
From Social Simulation to Integrative System Design
As the recent financial crisis showed, there is today a strong need to gain
an "ecological perspective" on all relevant interactions in
socio-economic-techno-environmental systems. For this, we suggest setting up a
network of Centers for integrative systems design, which shall be able to run
all potentially relevant scenarios, identify causality chains, explore feedback
and cascading effects for a number of model variants, and determine the
reliability of their implications (given the validity of the underlying
models). They will be able to detect possible negative side effects of policy
decisions, before they occur. The Centers belonging to this network of
Integrative Systems Design Centers would be focused on a particular field, but
they would be part of an attempt to eventually cover all relevant areas of
society and economy and integrate them within a "Living Earth Simulator". The
results of all research activities of such Centers would be turned into
informative input for political Decision Arenas. For example, Crisis
Observatories (for financial instabilities, shortages of resources,
environmental change, conflict, spreading of diseases, etc.) would be connected
with such Decision Arenas for the purpose of visualization, in order to make
complex interdependencies understandable to scientists, decision-makers, and
the general public.
Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c
Simulating intertwined design processes that have similar structures: A case study of a small company that creates made-to-order fashion products
The authors use simulation to analyse the resource-driven dependencies between concurrent processes used to create customised products in a company. Such processes are uncertain and unique according to the design changes required. However, they have similar structures. For simulation, a level of abstraction is chosen such that all possible processes are represented by the same activity network. Differences between processes are determined by the customisations that they implement. The approach is illustrated through application to a small business that creates customised fashion products. We suggest that similar techniques could be applied to study intertwined design processes in more complex domains.
The case study was carried out as part of Considerate Design for Personalised
Fashion, funded by the EPSRC/AHRC Design in the 21st Century programme. The
context of a multi-project environment was analysed as part of the EU Framework 7
CONVERGE project CP-FP 228746-2. Post-print
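The idea that all processes share one activity network, with customisations only changing activity parameters, can be sketched with a critical-path computation over a small made-to-order DAG. The activities and durations below are invented for illustration; the paper's model additionally handles resource contention between concurrent orders, which this sketch omits.

```python
# Shared activity network: activity -> (duration, prerequisites).
ACTIVITIES = {
    "design":  (3, []),
    "pattern": (2, ["design"]),
    "source":  (4, ["design"]),
    "cut":     (1, ["pattern", "source"]),
    "sew":     (5, ["cut"]),
    "finish":  (1, ["sew"]),
}

def completion_time(activities):
    """Process completion time = length of the critical (longest) path
    through the activity DAG, computed by memoised recursion."""
    finish = {}
    def done(a):
        if a not in finish:
            dur, prereqs = activities[a]
            finish[a] = dur + max((done(p) for p in prereqs), default=0)
        return finish[a]
    return max(done(a) for a in activities)

base = completion_time(ACTIVITIES)
# A customisation is modelled as the same network with changed
# durations; extra sewing work extends the critical path.
custom = dict(ACTIVITIES, sew=(8, ["cut"]))
```

Every order runs through the same network; only the duration tuple attached to each activity differs, which is the single-abstraction-level idea the abstract describes.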
GEANT4: a simulation toolkit
Abstract Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics. PACS: 07.05.Tp; 13; 2
Jets plus Missing Energy with an Autofocus
Jets plus missing transverse energy is one of the main search channels for
new physics at the LHC. A major limitation lies in our understanding of QCD
backgrounds. Using jet merging we can describe the number of jets in typical
background channels in terms of a staircase scaling, including theory
uncertainties. The scaling parameter depends on the particles in the final
state and on the cuts applied. Measuring the staircase scaling will allow us to
also predict the effective mass for Standard Model backgrounds. Based on both
observables we propose an analysis strategy avoiding model specific cuts which
returns information about the color charge and the mass scale of the underlying
new physics.
Comment: 15 pages, 9 figures, 3 tables
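Staircase scaling means the ratio of successive exclusive jet multiplicities is approximately constant, N(n+1)/N(n) = R, so log N(n) falls on a straight line in n. A minimal sketch of extracting R and extrapolating, with invented event counts rather than measured LHC data:

```python
import math

# Illustrative exclusive n-jet event counts (not real data); staircase
# scaling predicts N(n+1)/N(n) ~ R for a constant ratio R.
jet_counts = {2: 100000, 3: 52000, 4: 26500, 5: 13800}

def fit_staircase_ratio(counts):
    """Least-squares fit of log N(n) = a + n*log(R): a straight line in
    (n, log N) whose slope is log R; returns R."""
    xs = list(counts)
    ys = [math.log(counts[n]) for n in xs]
    m = len(xs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.exp(slope)

R = fit_staircase_ratio(jet_counts)
# Extrapolate the staircase one step to predict the 6-jet rate.
predicted_6jet = jet_counts[5] * R
```

Once R is measured in a background channel, the same geometric fall-off predicts higher jet multiplicities, which is what lets the staircase constrain Standard Model backgrounds without model-specific cuts.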