D3.2 Cost Concept Model and Gateway Specification
This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various user communities and builds upon our understanding of the requirements, drivers, obstacles and objectives that various stakeholder groups have relating to digital curation. Ultimately, this concept model should provide a critical input to the development and refinement of cost models, and help to ensure that the curation and preservation solutions and services that will inevitably arise from the commercial sector as 'supply' respond to a much better understood 'demand' for cost-effective and relevant tools. To meet acknowledged gaps in current provision, a nested model of curation which addresses both costs and benefits is provided. The goal of this task was not to create a single, functionally implementable cost modelling application, but rather to design a model based on common concepts and to develop a generic gateway specification that can be used by future model developers, service and solution providers, and by researchers in follow-up research and development projects.
The Framework includes:
⢠A Cost Concept Modelâwhich defines the core concepts that should be included in curation costs models;<p></p>
⢠An Implementation Guideâfor the cost concept model that provides guidance and proposes questions that should be considered when developing new cost models and refining existing cost models;<p></p>
⢠A Gateway Specification Templateâwhich provides standard metadata for each of the core cost concepts and is intended for use by future model developers, model users, and service and solution providers to promote interoperability;<p></p>
⢠A Nested Model for Digital Curationâthat visualises the core concepts, demonstrates how they interact and places them into context visually by linking them to A Cost and Benefit Model for Curation.<p></p>
This Framework provides guidance for data collection and associated calculations in an operational context, but it will also provide a critical foundation for more strategic thinking around curation, such as the Economic Sustainability Reference Model (ESRM).
Where appropriate, definitions of terms are provided, recommendations are made, and examples from existing models are used to illustrate the principles of the Framework.
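To make the idea of the Gateway Specification Template more concrete, the sketch below shows one way standard metadata for a single cost concept could be represented in code. It is a minimal illustration only; the field names (concept, definition, cost_drivers, unit, accounting_category) are assumptions for this example, not the 4C project's actual schema.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class CostConceptRecord:
    """Illustrative metadata record for one core cost concept.

    Field names are assumptions for this sketch; the actual Gateway
    Specification Template defines its own metadata elements.
    """
    concept: str                       # e.g. "Ingest"
    definition: str                    # human-readable definition of the concept
    cost_drivers: List[str] = field(default_factory=list)  # factors that influence cost
    unit: str = "EUR"                  # currency or other accounting unit
    accounting_category: str = "OPEX"  # e.g. capital vs. operating expenditure


# A model developer could expose records like this so that two cost models
# can be compared concept by concept.
ingest = CostConceptRecord(
    concept="Ingest",
    definition="Activities required to accept digital assets into the archive",
    cost_drivers=["volume of submissions", "level of metadata creation"],
)
print(ingest)
```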
An ES process framework for understanding the strategic decision making process of ES implementations
Enterprise systems (ES) implementations are regarded as costly, time- and resource-consuming, and they have a great impact on the organization in terms of the risks they involve and the opportunities they provide. The steering committee (SC) is the group of individuals responsible for making strategic decisions throughout the ES implementation lifecycle. It is evident from recent studies that there is a relationship between the decision-making process and ES implementation success. One of the key elements that contribute to the success of ES implementations is a quick decision-making process (Brown and Vessey, 1999; Gupta, 2000; Parr et al., 1999). This study addresses the strategic decision-making process of the SC through its focus on four research questions: (1) How can the strategic decision-making process in the implementation of ES be better understood during each phase of the ES implementation lifecycle? (2) What is the process by which the SC makes strategic decisions? (3) How are fast decisions made? and (4) How does decision speed link to the success of ES implementation? Process models of ES implementation will provide a framework to investigate the strategic decision-making process during each phase of the ES implementation lifecycle. Patterns in the decision-making process will be explored using strategic choice models. This study develops a research model that focuses on the decision-making process of the steering committee to explore these research questions. It concludes by identifying contributions to both IS research and business practitioners.
Aspect-Oriented Programming
Aspect-oriented programming (AOP) is a promising idea that can improve the quality of software by reducing the problem of code tangling and improving the separation of concerns. At ECOOP'97, the first AOP workshop brought together a number of researchers interested in aspect-orientation. At ECOOP'98, during the second AOP workshop, the participants reported on progress in some research topics and raised more issues that were then discussed further.

This year, the ideas and concepts of AOP have spread and been adopted more widely and, accordingly, the workshop received many submissions covering areas from the design and application of aspects to the design and implementation of aspect languages.
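As a minimal illustration of the separation of concerns that AOP aims for, the Python sketch below factors a cross-cutting logging concern out of the business logic and applies it with a decorator. This is only an analogy to aspect weaving (AspectJ-style pointcuts and advice are much richer); the function names here are invented for the example.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)


def logged(func):
    """A tiny 'aspect': cross-cutting logging advice applied around a function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        logging.info("entering %s", func.__name__)
        try:
            return func(*args, **kwargs)
        finally:
            logging.info("leaving %s", func.__name__)
    return wrapper


@logged  # the logging concern is applied here, not tangled into the body
def transfer(amount: float) -> float:
    # core business logic stays free of logging code
    return amount * 0.99


transfer(100.0)
```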
Resilient Critical Infrastructure Management using Service Oriented Architecture
The SERSCIS project aims to support the use of interconnected systems of services in Critical Infrastructure (CI) applications. The problem of system interconnectedness is aptly demonstrated by 'Airport Collaborative Decision Making' (ACDM). Failure or underperformance of any of the interlinked ICT systems may compromise the ability of airports to plan their use of resources to sustain high levels of air traffic, or to provide accurate aircraft movement forecasts to the wider European air traffic management systems. The proposed solution is to introduce further SERSCIS ICT components to manage dependability and interdependency. These use semantic models of the critical infrastructure, including its ICT services, to identify faults and potential risks and to increase human awareness of them. Semantics allows information and services to be described in such a way that makes them understandable to computers. Thus, when a failure (or a threat of failure) is detected, SERSCIS components can take action to manage the consequences, including changing the interdependency relationships between services. In some cases, the components will be able to take action autonomously, e.g. to manage 'local' issues such as the allocation of CPU time to maintain service performance, or the selection of services where there are redundant sources available. In other cases the components will alert human operators so they can take action instead. The goal of this paper is to describe a Service Oriented Architecture (SOA) that can be used to address the management of ICT components and interdependencies in critical infrastructure systems.

Index Terms: resilience; QoS; SOA; critical infrastructure; SLA
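The kind of autonomous reaction described above, switching to a redundant service when a dependency fails or underperforms, can be sketched in a few lines. The sketch below is a generic illustration under assumed conditions, not the SERSCIS architecture itself, and the service names are hypothetical.

```python
from typing import Callable, Optional, Sequence


def call_with_failover(sources: Sequence[Callable[[], dict]]) -> dict:
    """Try each redundant service source in turn; return the first healthy result.

    In a real SOA deployment the health signal would come from SLA/QoS
    monitoring rather than from exceptions alone.
    """
    last_error: Optional[Exception] = None
    for source in sources:
        try:
            return source()
        except Exception as exc:  # treat any failure as 'service unavailable'
            last_error = exc
    raise RuntimeError("all redundant sources failed") from last_error


# Hypothetical redundant providers of aircraft movement forecasts.
def primary_forecast() -> dict:
    raise TimeoutError("primary forecast service not responding")


def backup_forecast() -> dict:
    return {"flight": "AB123", "eta": "12:45"}


print(call_with_failover([primary_forecast, backup_forecast]))
```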
Policy Enforcement with Proactive Libraries
Software libraries implement APIs that deliver reusable functionalities. To correctly use these functionalities, software applications must satisfy certain correctness policies, for instance policies about the order in which some API methods can be invoked and about the values that can be used for their parameters. If
these policies are violated, applications may produce misbehaviors and failures
at runtime. Although this problem is general, applications that incorrectly use
API methods are more frequent in certain contexts. For instance, Android
provides a rich and rapidly evolving set of APIs that might be used incorrectly
by app developers who often implement and publish faulty apps in the
marketplaces. To mitigate this problem, we introduce the novel notion of
proactive library, which augments classic libraries with the capability of
proactively detecting and healing misuses at run- time. Proactive libraries
blend libraries with multiple proactive modules that collect data, check the
correctness policies of the libraries, and heal executions as soon as the
violation of a correctness policy is detected. The proactive modules can be
activated or deactivated at runtime by the users and can be implemented without
requiring any change to the original library and any knowledge about the
applications that may use the library. We evaluated proactive libraries in the
context of the Android ecosystem. Results show that proactive libraries can
automatically overcome several problems related to bad resource usage at the cost of a small overhead.
Comment: O. Riganelli, D. Micucci and L. Mariani, "Policy Enforcement with Proactive Libraries," 2017 IEEE/ACM 12th International Symposium on Software Engineering for Adaptive and Self-Managing Systems (SEAMS), Buenos Aires, Argentina, 2017, pp. 182-19
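To give a flavour of what a proactive module could look like, the sketch below wraps a hypothetical resource API, checks a simple ordering policy (acquire before use, release when done) and 'heals' violations on the caller's behalf. It is a generic illustration of the idea under assumed APIs, not the authors' implementation; the Resource class is invented for the example.

```python
class Resource:
    """Hypothetical library class with an acquire/use/release protocol."""

    def __init__(self):
        self.acquired = False

    def acquire(self):
        self.acquired = True

    def use(self):
        if not self.acquired:
            raise RuntimeError("used before acquire")
        return "data"

    def release(self):
        self.acquired = False


class ProactiveResource:
    """Proactive wrapper: checks the usage policy and heals violations."""

    def __init__(self, resource: Resource):
        self._resource = resource

    def use(self):
        if not self._resource.acquired:
            # policy violation detected: heal by acquiring before use
            self._resource.acquire()
        return self._resource.use()

    def __del__(self):
        if self._resource.acquired:
            # heal a 'forgotten release' instead of leaking the resource
            self._resource.release()


r = ProactiveResource(Resource())
print(r.use())  # would have failed without the proactive check
```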
The LIFE2 final project report
Executive summary: The first phase of LIFE (Lifecycle Information For E-Literature) made a major contribution to understanding the long-term costs of digital preservation, an essential step in helping institutions plan for the future. The LIFE work models the digital lifecycle and calculates the costs of preserving digital information for future years. Organisations can apply this process in order to understand costs and plan effectively for the preservation of their digital collections.
The second phase of the LIFE Project, LIFE2, has refined the LIFE Model, adding three new exemplar Case Studies to build further upon LIFE1. LIFE2 is an 18-month JISC-funded project between UCL (University College London) and The British Library (BL), supported by the LIBER Access and Preservation Divisions. LIFE2 began in March 2007 and completed in August 2008.
The LIFE approach has been validated by a full independent economic review and has
successfully produced an updated lifecycle costing model (LIFE Model v2) and digital
preservation costing model (GPM v1.1). The LIFE Model has been tested with three further
Case Studies including institutional repositories (SHERPA-LEAP), digital preservation
services (SHERPA DP) and a comparison of analogue and digital collections (British Library
Newspapers). These Case Studies were useful for scenario building and have fed back into
both the LIFE Model and the LIFE Methodology.
The experiences of implementing the Case Studies indicated that enhancements made to the
LIFE Methodology, Model and associated tools have simplified the costing process. Mapping
a specific lifecycle to the LIFE Model isn't always a straightforward process. The revised and
more detailed Model has reduced ambiguity. The costing templates, which were refined
throughout the process of developing the Case Studies, ensure clear articulation of both
working and cost figures, and facilitate comparative analysis between different lifecycles.
The LIFE work has been successfully disseminated throughout the digital preservation and
HE communities. Early adopters of the work include the Royal Danish Library, the State Archives and the State and University Library in Denmark, as well as the LIFE2 Project partners.
Furthermore, interest in the LIFE work has not been limited to these sectors, with interest in
LIFE expressed by local government, records offices, and private industry. LIFE has also
provided input into the LC-JISC Blue Ribbon Task Force on the Economic Sustainability of
Digital Preservation.
Moving forward, our ability to cost the digital preservation lifecycle will require further
investment in costing tools and models. Developments in estimative models will be needed to
support planning activities, both at a collection management level and at a later preservation
planning level once a collection has been acquired. In order to support these developments a
greater volume of raw cost data will be required to inform and test new cost models. This
volume of data cannot be supported via the Case Study approach, and the LIFE team would suggest that a software tool would provide the volume of costing data necessary to support a truly accurate predictive model.
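As a rough illustration of the lifecycle-costing idea described above, the sketch below sums per-stage annual costs over a planning horizon with a simple inflation adjustment. The stage names and figures are invented for the example and do not reproduce the LIFE Model's actual stages or formulae.

```python
# Illustrative lifecycle costing: sum per-stage costs over a number of years.
# Stage names and figures are examples only, not the LIFE Model's own values.

annual_stage_costs = {        # GBP per year for one collection (hypothetical)
    "acquisition": 5000,
    "ingest": 3000,
    "metadata": 2000,
    "bit-stream preservation": 1500,
    "content preservation": 2500,
    "access": 1000,
}


def lifecycle_cost(stage_costs: dict, years: int, inflation: float = 0.02) -> float:
    """Total cost of keeping a collection for `years`, with simple inflation."""
    total = 0.0
    for year in range(years):
        total += sum(stage_costs.values()) * (1 + inflation) ** year
    return total


print(f"10-year cost: £{lifecycle_cost(annual_stage_costs, 10):,.0f}")
```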
Designing a consulting services architecture model
During my years of experience in the technology industry, it has become obvious that standard processes and methodologies within the engineering discipline are at a mature state. The realization, though, is that software engineering specifically lags behind. Most software engineering methodologies that I have studied focus on the mission of software development. It is this realization and the need for structure that led me to review existing methodologies used within my company's software services organization. The definition of what a successful software services methodology entails is rather limited. This report will provide a history of existing software engineering methodologies that I have studied, describe an initial services method that was being developed within my organization, develop a new model that addresses previous shortcomings and identify additional components required to further define a strong software services-oriented delivery methodology.