Informatics Research Institute (IRIS) July 2004 newsletter
This summer period has been rich in presence- and dissemination-related activities. Several important conferences, which enjoyed strong international participation and success, were organized by IRIS academics in Salford. These include NLDB04, CRIS 2004 and the LTSN workshop. A substantial number of research projects have also been secured from national as well as European funding sources. All of these activities reinforce the leading position that IRIS currently enjoys in the field of Informatics. This newsletter gives an overview of the research activities that took place during this reporting period. It is hoped that this will help trigger further collaboration with existing and future colleagues from academia, research and industry to work together towards addressing the many societal and technological challenges engendered by the information age
Human factors and the WWW : making sense of URLs
We present a study of how WWW users 'make sense' of URLs. Experiments were used to investigate users' capacity to employ the URL as a surrogate for the resource to which it refers. The results show that users can infer useful information from URLs, but that such improvisation has shortcomings as a navigation aid
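As a rough illustration of the surrogate idea, the sketch below (plain Python, not the study's own materials) splits a URL into the surface components, host labels and path segments, from which a reader might infer organisation, country and topic; the example URL is invented:

```python
from urllib.parse import urlparse

def url_cues(url):
    # Split a URL into the surface components a reader might
    # use to guess what the resource is about.
    parts = urlparse(url)
    return {
        "host_labels": parts.netloc.split("."),
        "path_segments": [s for s in parts.path.split("/") if s],
        "filename": parts.path.rsplit("/", 1)[-1],
    }

cues = url_cues("http://www.salford.ac.uk/iris/research/url-study.html")
print(cues["host_labels"])    # ['www', 'salford', 'ac', 'uk']
print(cues["path_segments"])  # ['iris', 'research', 'url-study.html']
```

Here the 'ac.uk' labels suggest a UK academic site and the path segments hint at the topic, the kind of inference the study's participants were asked to make.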
Review of research in feature-based design
Research in feature-based design is reviewed. Feature-based design is regarded as a key factor towards CAD/CAPP integration from a process planning point of view. From a design point of view, feature-based design offers possibilities for supporting the design process better than current CAD systems do. The evolution of feature definitions is briefly discussed. Features and their role in the design process and as representatives of design-objects and design-object knowledge are discussed. The main research issues related to feature-based design are outlined. These are: feature representation, features and tolerances, feature validation, multiple viewpoints towards features, features and standardization, and features and languages. An overview of some academic feature-based design systems is provided. Future research issues in feature-based design are outlined. The conclusion is that feature-based design is still in its infancy, and that more research is needed for a better support of the design process and better integration with manufacturing, although major advances have already been made
The Performability Manager
The authors describe the performability manager, a distributed system component that contributes to a more effective and efficient use of system components and prevents quality of service (QoS) degradation. The performability manager dynamically reconfigures distributed systems whenever needed, to recover from failures and to permit the system to evolve over time and include new functionality. Large systems require dynamic reconfiguration to support dynamic change without shutting down the complete system. A distributed system monitor is needed to verify QoS. Monitoring a distributed system is difficult because of synchronization problems and minor differences in clock speeds. The authors describe the functionality and the operation of the performability manager (both informally and formally). Throughout the paper they illustrate the approach by an example distributed application: an ANSAware-based number translation service (NTS), from the intelligent networks (IN) area
The development and application of economic valuation techniques and their use in environmental policy - A survey
This paper is concerned with how to introduce monetary valuation into public decision-making, an issue closely related to introducing rational procedures into public decision-making (Pearce, 2001). All public decision-making involves choice. To economists, rational choice means making the 'best' use of available resources, i.e. choosing the option with the lowest opportunity cost, the lowest value to be sacrificed. The costs and benefits of any project should therefore be weighed as well as compared to the costs and benefits of alternative projects. This implies that all impacts of these projects need to be expressed in the same unit to make comparison possible, and money seems the most obvious numéraire. We discuss some of the most popular economic valuation techniques and their potential role in public decision-making. Given the high cost and time needed to perform original valuation studies, and decision-makers' limited familiarity with these techniques, we recommend that the Flemish Administration primarily invests in performing high-quality transfer studies.

Keywords: Valuation, Cost-benefit Analysis, Travel Cost Method, Contingent Valuation Method
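As a toy illustration of the opportunity-cost logic described above, once all impacts are monetised the options can be ranked by net benefit; the projects and figures below are entirely hypothetical:

```python
# Hypothetical figures (million EUR), purely for illustration:
# choosing between options once all impacts are in one unit.
projects = {
    "wetland_restoration": {"benefits": 5.2, "costs": 3.1},
    "flood_barrier":       {"benefits": 4.8, "costs": 3.9},
    "do_nothing":          {"benefits": 0.0, "costs": 0.0},
}

def net_benefit(p):
    # Monetised benefits minus monetised costs.
    return p["benefits"] - p["costs"]

best = max(projects, key=lambda name: net_benefit(projects[name]))
print(best)  # wetland_restoration
```

The point is only that a common numéraire makes the comparison mechanical; the hard part, which the survey addresses, is producing the monetised figures in the first place.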
Architecture of Environmental Risk Modelling: for a faster and more robust response to natural disasters
Demands on the disaster response capacity of the European Union are likely to
increase, as the impacts of disasters continue to grow both in size and
frequency. This has resulted in intensive research on issues concerning
spatially-explicit information and modelling and their multiple sources of
uncertainty. Geospatial support is one of the forms of assistance frequently
required by emergency response centres along with hazard forecast and event
management assessment. Robust modelling of natural hazards requires dynamic
simulations under an array of multiple inputs from different sources.
Uncertainty is associated with meteorological forecast and calibration of the
model parameters. Software uncertainty also derives from the data
transformation models (D-TM) needed for predicting hazard behaviour and its
consequences. On the other hand, social contributions have recently been
recognized as valuable in raw-data collection and mapping efforts traditionally
dominated by professional organizations. Here an architecture overview is
proposed for adaptive and robust modelling of natural hazards, following the
Semantic Array Programming paradigm to also include the distributed array of
social contributors called Citizen Sensor in a semantically-enhanced strategy
for D-TM modelling. The modelling architecture proposes a multicriteria
approach for assessing the array of potential impacts with qualitative rapid
assessment methods based on a Partial Open Loop Feedback Control (POLFC) schema
and complementing more traditional and accurate a-posteriori assessment. We
discuss the computational aspect of environmental risk modelling using
array-based parallel paradigms on High Performance Computing (HPC) platforms,
so that the implications of urgency can be introduced into the system
(Urgent-HPC).

Comment: 12 pages, 1 figure, 1 text box, presented at the 3rd Conference of Computational Interdisciplinary Sciences (CCIS 2014), Asuncion, Paraguay
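As an illustrative sketch only (not the paper's D-TM code), a weighted-sum aggregation over an array of impact criteria is one minimal form such a multicriteria rapid assessment could take; the scenarios, criteria and weights below are invented:

```python
# Rows: candidate hazard scenarios; columns: normalised impact
# criteria (e.g. exposed population, infrastructure, environment).
impacts = [
    [0.9, 0.2, 0.5],   # scenario A
    [0.4, 0.8, 0.6],   # scenario B
]
weights = [0.5, 0.3, 0.2]  # hypothetical criterion weights, sum to 1

# Weighted-sum score per scenario.
scores = [sum(w * x for w, x in zip(weights, row)) for row in impacts]

# Rank scenarios by descending score (scenario A scores highest here).
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

In the architecture described above, such qualitative rapid scores would feed the POLFC loop as a fast a-priori estimate, complementing the slower, more accurate a-posteriori assessment.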
An approach to modeling database activity
Results in the field of data modeling currently suffer from many of the same ills that plagued data management systems in the late 1960s. Advanced semantic modeling systems such as the Semantic Data Model and the Relational Model/Tasmania are extremely complex to understand as well as somewhat ad hoc in design. Such systems capture only static snapshots of activity in the world being modeled. On the other hand, behavioral models that do attempt to model system dynamics typically provide less overall modeling power than comprehensive semantic models. Further, the specifications of behavior that can be expressed with such models are themselves static snapshots which are not integrated with other database objects.

This work describes one approach for capturing dynamic relationships by distilling the concepts found in semantic and behavioral data models into a small number of flexible constructs. The resulting Prototype Activity Modeling System (PAMS) captures the containment, feedback, operational, and state dependency roles of entities in the world being modeled. Further, these definitions of database activity are captured as database objects (rather than as a schema) so as to allow dynamic manipulation of entity roles.

The key concept of the approach is the bundle: a purposefully designed extension of time-proven relational database modeling concepts which includes support for presentation ordering and complex Cartesian aggregations. By applying the basic nested bundle principle, it is possible to obtain complex hierarchies of static structural information. The static templates so constructed, when used with a non-procedural query language and the value nomination principle, which reduces relations to scalar values when necessary, provide a conventional database modeling system for applications.
By extending these templates with the non-procedural thunk principle, which embeds query specifications within object definitions, dependencies within the application can cause the apparent contents of the database description to change. When further extended with the activity monitoring principle, which records the interaction between the application and its environment, these dynamic templates can account for changes outside the scope of the application
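The thunk idea, an object that stores a query specification rather than a stored value, can be sketched in a few lines of Python (a hedged illustration of the general technique, not PAMS itself):

```python
# A 'thunk' stores a query (a callable) instead of a value, so the
# apparent contents of the record track the current database state.
class Thunk:
    def __init__(self, query):
        self.query = query          # one-argument callable over the db

    def value(self, db):
        return self.query(db)       # re-evaluated on every access

db = {"orders": [120, 80, 200]}

# The record embeds a query specification, not a snapshot.
record = {"name": "totals",
          "order_total": Thunk(lambda d: sum(d["orders"]))}

print(record["order_total"].value(db))  # 400
db["orders"].append(50)
print(record["order_total"].value(db))  # 450, no update to the record itself
```

The record never changes, yet its apparent content does, which is the behaviour the thunk principle is meant to capture.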