Digital Ecosystems: Ecosystem-Oriented Architectures
We view Digital Ecosystems to be the digital counterparts of biological
ecosystems. Here, we are concerned with the creation of these Digital
Ecosystems, exploiting the self-organising properties of biological ecosystems
to evolve high-level software applications. Therefore, we created the Digital
Ecosystem, a novel optimisation technique inspired by biological ecosystems,
where the optimisation works at two levels: a first optimisation, migration of
agents which are distributed in a decentralised peer-to-peer network, operating
continuously in time; this process feeds a second optimisation based on
evolutionary computing that operates locally on single peers and is aimed at
finding solutions to satisfy locally relevant constraints. The Digital
Ecosystem was then measured experimentally through simulations, with measures
originating from theoretical ecology, evaluating its likeness to biological
ecosystems. This included its responsiveness to requests for applications from
the user base, as a measure of the ecological succession (ecosystem maturity).
Overall, we have advanced the understanding of Digital Ecosystems, creating
Ecosystem-Oriented Architectures where the word ecosystem is more than just a
metaphor.
Comment: 39 pages, 26 figures, journal
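The two-level optimisation described in the abstract can be caricatured in a few lines. This is a sketch under our own assumptions, not the authors' implementation: the bit-string agents, the per-peer "locally relevant constraints" encoded as a bias vector, the uniform random migration policy, and the (mu + lambda)-style local loop are all illustrative choices.

```python
import random

# Hypothetical sketch: each "agent" is a bit string; fitness is local to a peer.

def local_fitness(agent, peer_bias):
    # A peer prefers agents whose bits match its locally relevant constraints.
    return sum(1 for a, b in zip(agent, peer_bias) if a == b)

def evolve_locally(pool, peer_bias, generations=20):
    # Second-level optimisation: a minimal evolutionary loop on a single peer.
    for _ in range(generations):
        parent = max(pool, key=lambda a: local_fitness(a, peer_bias))
        child = [bit ^ (random.random() < 0.1) for bit in parent]  # bit-flip mutation
        pool.append(child)
        pool.sort(key=lambda a: local_fitness(a, peer_bias), reverse=True)
        del pool[10:]  # keep only the 10 fittest agents
    return pool

def migrate(peers):
    # First-level optimisation: agents drift between peers of the network.
    for src in peers:
        dst = random.choice(peers)
        if dst is not src and src["pool"]:
            dst["pool"].append(random.choice(src["pool"]))

random.seed(0)
peers = [{"bias": [random.randint(0, 1) for _ in range(16)],
          "pool": [[random.randint(0, 1) for _ in range(16)] for _ in range(10)]}
         for _ in range(4)]

for _ in range(30):  # continuous-time migration, caricatured here as rounds
    migrate(peers)
    for p in peers:
        evolve_locally(p["pool"], p["bias"], generations=5)

best = max(local_fitness(a, peers[0]["bias"]) for a in peers[0]["pool"])
print(best)  # typically at or near the bit-string length of 16
```

Migration keeps locally useful genetic material circulating between peers, while each peer's evolutionary loop adapts that material to its own constraints; that division of labour is the point the two-level description is making.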
An Empirical Study of a Software Maintenance Process
This paper describes how a process support tool is used to collect metrics about a major upgrade to our own electronic retail system. An incremental prototyping lifecycle is adopted in which each increment is categorised by an effort type and a project component. Effort types are Acquire, Build, Comprehend and Design, and span all phases of development. Project components include data models and process models (expressed in an OO modelling language and a process algebra respectively), C++ classes and function templates, and build components such as source files and data files. This categorisation is independent of incremental prototyping and equally applicable to other software lifecycles. The process support tool (PWI) is responsible for ensuring consistency between the models and the C++ source. It also supports the interaction between multiple developers and multiple metric-collectors. The first two releases of the retailing software are available for ftp from oracle.ecs.soton.ac.uk in directory pub/peter. Readers are invited to use the software and apply their own metrics as appropriate. We would be interested to correspond with anyone who does so.
Uptake of BIM and IPD within the UK AEC Industry: the evolving role of the architectural technologist
Building Information Modelling is not only a tool, but also the process of creation, maintenance, distribution and co-ordination of an integrated database that collaboratively stores 2D and 3D information, with embedded physical and functional data, within a project-building model. The uptake of BIM within the UK Architecture, Engineering and Construction (AEC) industry has been slow since the 1980s, but over recent years adoption has increased. The increasingly collaborative nature of BIM, external data sharing techniques and progressively complex building design create a requirement for design teams to coordinate and communicate more effectively to achieve project goals. To manage this collaboration, new or evolved job roles may emerge. This research examined the current use of BIM, Integrated Project Delivery (IPD) and collaborative working in the UK AEC industry, and the job roles that have evolved or been created to cater for them. In semi-structured interviews, interviewees indicated that while several of the key enablers of IPD were being used, IPD itself had not been fully adopted. BIM was being used with some success, but improvements could be made. New job roles such as BIM Engineer and BIM Coordinator had appeared in the industry, and there was evidence that the Architectural Technologist (AT) role is evolving into a more multidisciplinary one; this reflects similar findings of recent research.
Evolutionary improvement of programs
Most applications of genetic programming (GP) involve the creation of an entirely new function, program or expression to solve a specific problem. In this paper, we propose a new approach that applies GP to improve existing software by optimizing its non-functional properties such as execution time, memory usage, or power consumption. In general, satisfying non-functional requirements is a difficult task, achieved in part by optimizing compilers. However, modern compilers are not always able to produce semantically equivalent alternatives that optimize non-functional properties, even when such alternatives are known to exist: this is usually due to the limited local nature of such optimizations. In this paper, we discuss how best to combine and extend the existing evolutionary methods of GP, multiobjective optimization, and coevolution in order to improve existing software. Given as input the implementation of a function, we attempt to evolve a semantically equivalent version, in this case optimized to reduce execution time subject to a given probability distribution of inputs. We demonstrate, on eight example functions, that our framework is able to produce non-obvious optimizations that compilers are not yet able to generate. We employ a coevolved population of test cases to encourage the preservation of the function's semantics. We exploit the original program both through seeding of the population in order to focus the search, and as an oracle for testing purposes. As well as discussing the issues that arise when attempting to improve software, we employ a rigorous experimental method to provide interesting and practical insights that suggest how to address these issues.
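A heavily simplified sketch of the loop this abstract describes: seed the population from the original program, use the original as an oracle against a test set, and select on semantic equivalence first and cost second. Everything here is our own toy stand-in, not the paper's framework: the "programs" are tiny expression trees, cost is node count rather than measured execution time, and the test set is fixed rather than coevolved.

```python
import random

# Toy stand-in for "improving" a program: evolve an expression tree that is
# semantically equivalent to the seed but cheaper (fewer nodes, our cost proxy).

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def size(tree):
    return 1 if not isinstance(tree, tuple) else 1 + size(tree[1]) + size(tree[2])

def mutate(tree):
    if isinstance(tree, tuple) and random.random() < 0.3:
        # Replace this subtree with a random leaf or a simple random node.
        return random.choice(["x", random.randint(0, 3),
                              (random.choice("+*"), "x", random.randint(0, 3))])
    if isinstance(tree, tuple):
        op, left, right = tree
        return (op, mutate(left), mutate(right))
    return tree

random.seed(1)
seed = ("+", ("+", "x", "x"), 0)        # original program: f(x) = x + x + 0
tests = [random.randint(-50, 50) for _ in range(20)]  # stand-in for coevolved tests
oracle = {x: evaluate(seed, x) for x in tests}        # original program as oracle

def fitness(tree):
    errors = sum(evaluate(tree, x) != oracle[x] for x in tests)
    return (errors, size(tree))         # semantics first, cost second

pop = [seed] * 20                        # seeding the population focuses the search
for _ in range(200):
    pop = sorted(pop, key=fitness)[:10]  # keep the 10 best (elitism preserves the seed)
    pop += [mutate(random.choice(pop)) for _ in range(10)]

best = min(pop, key=fitness)
print(best, fitness(best))  # semantically equivalent and often smaller, e.g. ('*', 'x', 2)
```

Because the seed is always in the elite set, the search can never do worse than the original program; any improvement found is an equivalent tree with lower cost, which mirrors the paper's use of seeding and an oracle to keep the search semantics-preserving.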
Deferred Action: Theoretical model of process architecture design for emergent business processes
E-business modelling and e-business systems development assume fixed company resources,
structures, and business processes. Empirical and theoretical evidence suggests that company resources
and structures are emergent rather than fixed. Planning business activity in emergent contexts requires
flexible e-business models based on better management theories and models. This paper builds and
proposes a theoretical model of e-business systems capable of catering for emergent factors that affect
business processes. Drawing on the development of theories of the 'action and design' class, the Theory of
Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible
process architecture is presented by identifying its core components and their relationships, and then
illustrated with exemplar flexible process architectures capable of responding to emergent factors.
Managerial implications of the model are considered and the model's generic applicability is discussed.
A practical approach to goal modelling for time-constrained projects
Goal modelling is a well-known rigorous method for analysing
problem rationale and developing requirements. Under the pressures typical of time-constrained projects, its benefits are not accessible: the effort and time needed to create the graph are prohibitive, and reading the results can be difficult owing to the effects of crosscutting concerns. Here we introduce an adaptation of KAOS to meet the needs of rapid turnaround and clarity. The main aim is to help stakeholders gain insight into the larger issues that might be overlooked if they make a premature start on implementation. The method emphasises the use of obstacles, accepts under-refined goals, and has
new methods for managing crosscutting concerns and strategic decision making. It is expected to be of value to agile as well as traditional processes.
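The combination of obstacles and deliberately under-refined goals can be pictured with a minimal goal graph. This is purely our illustration of the idea, not the paper's method: the goal names, the dictionary representation, and the flagging rule are all invented for the example.

```python
# Minimal goal-graph sketch in the spirit of a lightweight KAOS adaptation:
# goals refine into subgoals, obstacles attach to goals, and under-refined
# goals are left as leaves and flagged rather than blocking progress.

goals = {
    "take payments online": ["authorise card", "record order"],
    "authorise card": [],              # deliberately under-refined
    "record order": ["write to ledger"],
    "write to ledger": [],
}
obstacles = {"authorise card": ["gateway unavailable"]}

def needs_attention(goals, obstacles):
    # Leaves that have known obstacles but no mitigating subgoals are the
    # "larger issues" worth surfacing before implementation starts.
    return [g for g, subs in goals.items() if not subs and g in obstacles]

print(needs_attention(goals, obstacles))  # ['authorise card']
```

The point of accepting under-refined goals is visible here: "authorise card" stays in the graph as a flagged risk rather than forcing a full refinement before the team can move on.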
Traceability for Model Driven, Software Product Line Engineering
Traceability is an important challenge for software organizations. This is true for traditional software development, and even more so in newer approaches that introduce a greater variety of artefacts, such as Model Driven development or Software Product Lines. In this paper we look at some aspects of the interaction of traceability, Model Driven development and Software Product Lines.
Contribution structures
The invisibility of the individuals and groups that gave rise to requirements artifacts has
been identified as a primary reason for the persistence of requirements traceability
problems. This paper presents an approach, based on modelling the dynamic contribution
structures underlying requirements artifacts, which addresses this issue. We show how
these structures can be defined, using information about the agents who have contributed
to artifact production, in conjunction with details of the numerous traceability relations
that hold within and between artifacts themselves. We describe a scheme, derived from
work in sociolinguistics, which can be used to indicate the capacities in which agents
contribute. We then show how this information can be used to infer details about the
social roles and commitments of agents with respect to their various contributions and to
each other. We further propose a categorisation for artifact-based traceability relations
and illustrate how they impinge on the identification and definition of these structures.
Finally, we outline how this approach can be implemented and supported by tools,
explain the means by which requirements change can be accommodated in the
corresponding contribution structures, and demonstrate the potential it provides for
"personnel-based" requirements traceability.
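The kind of record this approach implies might look like the following sketch. The capacity labels (loosely echoing the sociolinguistic scheme the abstract mentions), the agent and artifact names, and the single `derived_from` relation are all our own illustrative assumptions, not the paper's scheme.

```python
from dataclasses import dataclass, field

# Illustrative sketch: agents contribute to requirements artifacts in distinct
# capacities, and artifacts are linked by artifact-based traceability relations.

@dataclass
class Contribution:
    agent: str
    capacity: str          # e.g. "principal", "author", "documentor"

@dataclass
class Artifact:
    name: str
    contributions: list = field(default_factory=list)
    derived_from: list = field(default_factory=list)   # artifact-based trace links

def contributors(artifact, capacity=None, seen=None):
    """Personnel-based traceability: walk trace links, collecting agents."""
    seen = seen or set()
    if artifact.name in seen:
        return set()
    seen.add(artifact.name)
    agents = {c.agent for c in artifact.contributions
              if capacity is None or c.capacity == capacity}
    for parent in artifact.derived_from:
        agents |= contributors(parent, capacity, seen)
    return agents

interview = Artifact("interview-notes", [Contribution("alice", "principal"),
                                         Contribution("bob", "documentor")])
spec = Artifact("req-spec", [Contribution("carol", "author")], [interview])

print(sorted(contributors(spec)))               # ['alice', 'bob', 'carol']
print(sorted(contributors(spec, "principal")))  # ['alice']
```

Following the trace links from the specification back to its sources recovers the people behind it, and filtering by capacity answers questions like "who originated this requirement?" rather than merely "which document did it come from?".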
Simulation, no problem, of course we offer this service! (observations on firms who have worked to make this true)
The paper focuses on the practical experiences of a number of professional firms striving to use simulation to deliver information of value to their clients. It exposes issues such as limitations in existing working practices and the mismatch between the language routinely used by facilitators and trainees, as well as their differing expectations. The paper also discusses the differences observed between firms that implemented simulation incrementally within their practices and firms that wished to "jump in at the deep end". Lastly, it addresses the dilemma of how to move simulation tools successfully into the already busy schedules and overloaded programmes of design practices.