Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters
Demands on the disaster response capacity of the European Union are likely to
increase, as the impacts of disasters continue to grow both in size and
frequency. This has resulted in intensive research on issues concerning
spatially-explicit information and modelling and their multiple sources of
uncertainty. Geospatial support is one of the forms of assistance frequently
required by emergency response centres along with hazard forecast and event
management assessment. Robust modelling of natural hazards requires dynamic
simulations under an array of multiple inputs from different sources.
Uncertainty is associated with meteorological forecasts and with the calibration
of model parameters. Software uncertainty also derives from the data
transformation models (D-TM) needed for predicting hazard behaviour and its
consequences. On the other hand, social contributions have recently been
recognized as valuable in raw-data collection and mapping efforts traditionally
dominated by professional organizations. Here an architecture overview is
proposed for adaptive and robust modelling of natural hazards, following the
Semantic Array Programming paradigm to also include the distributed array of
social contributors called Citizen Sensor in a semantically-enhanced strategy
for D-TM modelling. The modelling architecture adopts a multicriteria
approach for assessing the array of potential impacts, with qualitative rapid
assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema
complementing more traditional and accurate a-posteriori assessment. We
discuss the computational aspect of environmental risk modelling using
array-based parallel paradigms on High Performance Computing (HPC) platforms,
in order to introduce the implications of urgency into these systems
(Urgent-HPC).
Comment: 12 pages, 1 figure, 1 text box, presented at the 3rd Conference of
Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay
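The POLFC schema mentioned above re-plans whenever new observations arrive, instead of committing to a single open-loop plan, and the Semantic Array Programming paradigm attaches semantic checks to array inputs. The following is a minimal illustrative sketch, not the paper's implementation; the toy hazard model, the semantic constraint, and the cost weights are all hypothetical:

```python
# Illustrative sketch of a Partial Open Loop Feedback Control (POLFC) loop for
# a hazard model: at each step the controller plans over the data available so
# far, applies the best first action, then re-plans when new data arrives.
# All names and numbers here are hypothetical examples.

def check_semantics(field):
    """Semantic array check: rainfall intensities must be finite and >= 0."""
    if any(x < 0 or x != x for x in field):  # x != x catches NaN
        raise ValueError("rainfall field violates semantic constraints")
    return field

def simulate_hazard(rain, mitigation):
    """Toy data-transformation model (D-TM): impact grows with rainfall
    and shrinks with the chosen mitigation level."""
    return sum(rain) * (1.0 - mitigation)

def polfc_step(observed_rain, candidate_actions):
    """Plan open-loop over the observations seen so far and return the
    cheapest action (impact plus a hypothetical mitigation cost of 10*a)."""
    check_semantics(observed_rain)
    return min(candidate_actions,
               key=lambda a: simulate_hazard(observed_rain, a) + 10.0 * a)

# New forecast data arrives between steps; the chosen action can change.
forecast_updates = [[2.0, 3.5], [2.0, 3.5, 8.0]]
for rain in forecast_updates:
    action = polfc_step(rain, candidate_actions=[0.0, 0.2, 0.5])
    print(f"observations={rain} -> mitigation level {action}")
```

The semantic check is the key point of the array-programming style: every D-TM input is validated against its declared meaning before the simulation consumes it.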
A review of modular strategies and architecture within manufacturing operations
This paper reviews existing modularity and modularization literature within manufacturing operations. Its purpose is to examine the tools, techniques, and concepts relating to modular production, to draw together key issues currently dominating the literature, to assess managerial implications associated with the emerging modular paradigm, and to present an agenda for future research directions. The review is based on journal papers included in the ABI/Inform electronic database and other noteworthy research published as part of significant research programmes. The research methodology concerns reviewing existing literature to identify key modular concepts, to determine modular developments, and to present a review of significant contributions to the field. The findings indicate that the modular paradigm is being adopted in a number of manufacturing organizations. As a result, a range of conceptual tools, techniques, and frameworks has emerged, and the field of modular enquiry is in the process of codifying the modular lexicon and developing appropriate modular strategies commensurate with the needs of manufacturers. Modular strategies and modular architecture were identified as two key issues currently dominating the modular landscape. Based on this review, the present authors suggest that future research needs to focus on the development and subsequent standardization of interface protocols, cross-brand module use, supply chain power, transparency, and trust. This is the first review of the modular landscape and as such provides insights into, first, the development of modularization and, second, issues relating to designing modular products and modular supply chains.
A strategic approach to making sense of the “wicked” problem of ERM
Purpose – The purpose of this paper is to provide an approach to viewing the “wicked” problem of electronic records management (ERM), using the Cynefin framework, a sense-making tool. It re-conceptualises the ERM challenge by understanding the nature of the people issues. This supports decision making about the most appropriate tactics to adopt to effect positive change.
Design/methodology/approach – Cynefin was used to synthesise qualitative data from an empirical research project that investigated strategies and tactics for improving ERM.
Findings – ERM may be thought of as a dynamic, complex challenge but, viewed through the Cynefin framework, many issues are not complex; they are simple or complicated and can be addressed using best or good practice. The truly complex issues need a different approach, described as emergent practice. Cynefin provides a different lens through which to view, make sense of and re-perceive the ERM challenge and offers a strategic approach to accelerating change.
Research limitations/implications – Since Cynefin has been applied to one data set, the findings are transferable, not generalisable. They, and/or the approach, can be used to further test the propositions.
Practical implications – The resultant ERM framework provides a practical example for information and records managers to exploit or use as a starting point to explore the situation in particular organisational contexts. It could also be used in other practical, teaching and/or research-related records contexts.
Originality/value – This paper provides a new strategic approach to addressing the wicked problem of ERM, which is applicable for any organisational context.
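The Cynefin triage this abstract describes, routing simple and complicated issues to best or good practice and reserving emergent practice for truly complex ones, can be summarised as a small routing table. This sketch only illustrates the framework's domains; the example issues and response wording are hypothetical, not drawn from the paper's data set:

```python
# Hypothetical illustration of Cynefin-based triage for ERM issues: each
# Cynefin domain maps to a different decision-making pattern and practice.

CYNEFIN_RESPONSE = {
    "simple":      "apply best practice (sense - categorise - respond)",
    "complicated": "apply good practice (sense - analyse - respond)",
    "complex":     "use emergent practice (probe - sense - respond)",
    "chaotic":     "act first to stabilise (act - sense - respond)",
}

def triage(issue, domain):
    """Route an ERM issue to a practice according to its Cynefin domain."""
    if domain not in CYNEFIN_RESPONSE:
        raise ValueError(f"unknown Cynefin domain: {domain}")
    return f"{issue}: {CYNEFIN_RESPONSE[domain]}"

print(triage("naming conventions for shared drives", "simple"))
print(triage("changing user filing behaviour", "complex"))
```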
A Case Study on Formal Verification of Self-Adaptive Behaviors in a Decentralized System
Self-adaptation is a promising approach to manage the complexity of modern
software systems. A self-adaptive system is able to adapt autonomously to
internal dynamics and changing conditions in the environment to achieve
particular quality goals. Our particular interest is in decentralized
self-adaptive systems, in which central control of adaptation is not an option.
One important challenge in self-adaptive systems, in particular those with
decentralized control of adaptation, is to provide guarantees about the
intended runtime qualities. In this paper, we present a case study in which we
use model checking to verify behavioral properties of a decentralized
self-adaptive system. Concretely, we contribute with a formalized architecture
model of a decentralized traffic monitoring system and prove a number of
self-adaptation properties for flexibility and robustness. To model the main
processes in the system we use timed automata, and for the specification of the
required properties we use timed computation tree logic. We use the Uppaal tool
to specify the system and verify the flexibility and robustness properties.
Comment: In Proceedings FOCLASA 2012, arXiv:1208.432
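The bounded-response guarantees that the paper verifies with Uppaal can be illustrated, in spirit, by checking a timed trace. The sketch below is plain Python, not Uppaal or TCTL syntax; the event names and the 5-time-unit deadline are hypothetical stand-ins for the paper's flexibility and robustness properties:

```python
# A minimal analogue of a timed bounded-response property: whenever a
# monitoring agent fails, the remaining agents must reorganise within a
# bounded delay. A trace is a list of (time, event) pairs.

def reorganises_within(trace, deadline):
    """Return True iff every 'failure' event in the trace is followed by a
    'reorganised' event no more than `deadline` time units later."""
    pending = None  # timestamp of an unanswered failure, if any
    for time, event in trace:
        if event == "failure":
            pending = time
        elif event == "reorganised" and pending is not None:
            if time - pending > deadline:
                return False  # reorganisation came too late
            pending = None
    return pending is None  # a trailing unanswered failure also fails

trace = [(0, "start"), (3, "failure"), (6, "reorganised"),
         (9, "failure"), (12, "reorganised")]
print(reorganises_within(trace, deadline=5))  # True: both answered in time
```

A model checker proves this kind of property over all reachable behaviours of the timed automata, not just one trace; the trace check only conveys what the property asserts.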
Simulation modelling and visualisation: toolkits for building artificial worlds
Simulation users at all levels make heavy use of compute resources to drive computational
simulations for widely varying application areas of research, using different simulation
paradigms. Simulations are implemented in many software forms, ranging from highly standardised
and general models that run in proprietary software packages to ad hoc hand-crafted
simulation codes for very specific applications. Visualisation of the workings or results of a
simulation is another highly valuable capability for simulation developers and practitioners.
There are many different software libraries and methods available for creating a visualisation
layer for simulations, and it is often a difficult and time-consuming process to assemble a
toolkit of these libraries and other resources that best suits a particular simulation model. We
present here a break-down of the main simulation paradigms, and discuss differing toolkits and
approaches that different researchers have taken to tackle coupled simulation and visualisation
in each paradigm.
Transdisciplinarity seen through Information, Communication, Computation, (Inter-)Action and Cognition
Similar to oil that acted as a basic raw material and key driving force of
industrial society, information acts as a raw material and principal mover of
knowledge society in the knowledge production, propagation and application. New
developments in information processing and information communication
technologies allow increasingly complex and accurate descriptions,
representations and models, which are often multi-parameter, multi-perspective,
multi-level and multidimensional. This leads to the necessity of collaborative
work between different domains with corresponding specialist competences,
sciences and research traditions. We present several major transdisciplinary
unification projects for information and knowledge, which proceed at the
descriptive and logical levels and at the level of generative mechanisms. A
parallel process of boundary crossing and transdisciplinary activity is under
way in the applied domains. Technological artifacts are becoming increasingly complex and their
design is strongly user-centered, which brings in not only the function and
various technological qualities but also other aspects including aesthetics, user
experience, ethics and sustainability with social and environmental dimensions.
When integrating knowledge from a variety of fields, with contributions from
different groups of stakeholders, numerous challenges are met in establishing
a common view and a common course of action. In this context, information is our
environment, and informational ecology determines both epistemology and spaces
for action. We present some insights into the current state of the art of
transdisciplinary theory and practice of information studies and informatics.
We depict different facets of transdisciplinarity as we see it from our
different research fields that include information studies, computability,
human-computer interaction, multi-operating-systems environments and
philosophy.
Comment: Chapter in the forthcoming book Information Studies and the Quest for
Transdisciplinarity (World Scientific), Mark Burgin and Wolfgang Hofkirchner,
Editors