Reusability in software engineering
This paper surveys recent work concerning reusability in software engineering. The current directions in software reusability are discussed, and the two major approaches, reusable building blocks and reusable patterns, are studied. An extensive bibliography, parts of which are annotated, is included.
Using the DiaSpec design language and compiler to develop robotics systems
A Sense/Compute/Control (SCC) application is one that interacts with the
physical environment. Such applications are pervasive in domains such as
building automation, assisted living, and autonomic computing. Developing an
SCC application is complex because: (1) the implementation must address both
the interaction with the environment and the application logic; (2) any
evolution in the environment must be reflected in the implementation of the
application; (3) correctness is essential, as effects on the physical
environment can have irreversible consequences. The SCC architectural pattern
and the DiaSpec domain-specific design language propose a framework to guide
the design of such applications. From a design description in DiaSpec, the
DiaSpec compiler is capable of generating a programming framework that guides
the developer in implementing the design and that provides runtime support. In
this paper, we report on an experiment using DiaSpec (both the design language
and compiler) to develop a standard robotics application. We discuss the
benefits and problems of using DiaSpec in a robotics setting and present some
changes that would make DiaSpec a better framework in this setting.

Comment: DSLRob'11: Domain-Specific Languages and models for ROBotic systems (2011)
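The Sense/Compute/Control decomposition described in the abstract can be sketched minimally. This is an illustrative sketch of the architectural pattern only, with hypothetical names; it is not the framework the DiaSpec compiler generates.

```python
from dataclasses import dataclass

# Hypothetical SCC sketch: a sensing layer publishes readings, a
# compute layer derives a decision, and a control layer is the only
# layer that acts on the physical environment.

@dataclass
class Reading:
    room: str
    lux: int

def sense() -> Reading:
    # A real system would poll hardware here; stubbed for illustration.
    return Reading(room="lab", lux=120)

def compute(reading: Reading, threshold: int = 300) -> bool:
    # Compute layer: decide whether artificial lighting is needed.
    return reading.lux < threshold

def control(lights_on: bool) -> str:
    # Control layer: translate the decision into an actuation command.
    return "lights:on" if lights_on else "lights:off"

if __name__ == "__main__":
    print(control(compute(sense())))  # -> lights:on
```

Keeping the three layers separate is what lets the environment model evolve without touching the application logic, which is the concern the abstract raises.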
A Parsing Scheme for Finding the Design Pattern and Reducing the Development Cost of Reusable Object Oriented Software
Because of the importance of object oriented methodologies, research into new measures for object oriented system development is receiving increased focus. Most metrics need to identify the interactions between objects and modules in order to derive a useful and influential software measure, one that attracts software developers, designers and researchers. In this paper, new interactions are defined for object oriented systems. Using these interactions, a parser is developed to analyze the existing architecture of the software. Within the design model, design classes must collaborate with one another. However, collaboration should be kept to an acceptable minimum, i.e. good design practice introduces low coupling. If a design model is highly coupled, the system is difficult to implement, to test and to maintain over time. When enhancing software, we need to introduce or remove modules, and in that case coupling is the most important factor to consider, because unnecessary coupling may make the system unstable and may reduce the system's performance. So low coupling is thought to be a desirable goal in software construction, leading to better values for external software qualities such as maintainability, reusability and so on. To test this hypothesis, a good measure of class coupling is needed. In this paper, based on a developed tool called Design Analyzer, we propose a methodology to reuse an existing system with the objective of enhancing an existing object oriented system while keeping the coupling as low as possible.

Comment: 15 pages
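A class-coupling measure of the kind the abstract calls for can be sketched by parsing source code and counting, for each class, the distinct other classes it instantiates. This is a rough CBO-style proxy under assumed rules, not the metric computed by the paper's Design Analyzer tool.

```python
import ast
from collections import defaultdict

# Simplified coupling count: for each class in a module, count the
# distinct other classes it instantiates by name. Illustrative only;
# real coupling metrics also track inheritance, parameters, etc.

SRC = """
class Engine: ...
class Wheel: ...
class Car:
    def __init__(self):
        self.engine = Engine()
        self.wheels = [Wheel() for _ in range(4)]
"""

def coupling(source: str) -> dict:
    tree = ast.parse(source)
    classes = {n.name for n in ast.walk(tree) if isinstance(n, ast.ClassDef)}
    uses = defaultdict(set)
    for cls in ast.walk(tree):
        if not isinstance(cls, ast.ClassDef):
            continue
        for node in ast.walk(cls):
            if (isinstance(node, ast.Call)
                    and isinstance(node.func, ast.Name)
                    and node.func.id in classes
                    and node.func.id != cls.name):
                uses[cls.name].add(node.func.id)
    return {c: len(u) for c, u in uses.items()}

print(coupling(SRC))  # -> {'Car': 2}
```

A class whose count grows during an enhancement is a candidate for refactoring, which matches the abstract's goal of keeping coupling low while reusing an existing system.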
A computer-based product classification and component detection for demanufacturing processes
This is an Author's Accepted Manuscript of an article published in International Journal of Computer Integrated Manufacturing, 24(10), 900-914, 2011 [copyright Taylor & Francis], available online at: http://www.tandfonline.com/10.1080/0951192X.2011.579169.

The aim of this paper is to propose a novel computer-based product classification, component detection and tracking approach for demanufacturing and disassembly processes. This is achieved by introducing a series of automated and sequential steps of product scanning, component identification, image analysis and sorting, leading to the development of a bill of material (BOM). The produced BOM can then be associated with the relevant disassembly/demanufacture provisos. The proposed integrated image sorting and product classification (ISPC) approach can be considered a step forward in the automation of demanufacturing activities. The ISPC model proposed in this paper utilises and builds on the state-of-the-art technology and current body of research in computer-integrated demanufacturing and remanufacturing (CIDR). An appraisal of the latest research material and the factors that inhibit CIDR methods in practice is presented. A novel solution for the integration of imaging and material identification techniques to overcome some of the existing shortcomings of automated recycling processes is proposed in this paper. The proposed product scanning and component detection ISPC software consists of four distinct modules: the repertory database, the search engine, the product-attributes updater, and the image sorting and classification algorithm. The software framework that integrates the four components is presented in this paper. Finally, an overall assessment of applying ISPC at various stages of CIDR processes concludes the article.

University of Ibadan MacArthur Foundation Grant
Continuous Improvement Through Knowledge-Guided Analysis in Experience Feedback
Continuous improvement in industrial processes is increasingly a key element of competitiveness for industrial systems. The management of experience feedback in this framework is designed to build, analyze and facilitate knowledge sharing among the problem solving practitioners of an organization in order to improve process and product achievement. During Problem Solving Processes, the intellectual investment of experts is often considerable and the opportunities for expert knowledge exploitation are numerous: decision making, problem solving under uncertainty, and expert configuration. In this paper, our contribution relates to the structuring of a cognitive experience feedback framework, which allows a flexible exploitation of expert knowledge during Problem Solving Processes and the reuse of such collected experience. To that purpose, the proposed approach uses the general principles of root cause analysis for identifying the root causes of problems or events, the conceptual graphs formalism for the semantic conceptualization of the domain vocabulary, and the Transferable Belief Model for the fusion of information from different sources. The underlying formal reasoning mechanisms (logic-based semantics) in conceptual graphs enable intelligent information retrieval for the effective exploitation of lessons learned from past projects. An example illustrates the application of the proposed formalization of experience feedback processes in the transport industry sector.
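The information-fusion step the abstract mentions can be illustrated with the unnormalized conjunctive combination rule of the Transferable Belief Model, which, unlike Dempster's normalized rule, lets mass accumulate on the empty set as a measure of conflict. The frame of discernment and the numbers below are illustrative, not taken from the paper.

```python
from itertools import product

# Unnormalized conjunctive combination of two belief mass functions
# (Transferable Belief Model): mass on the empty set represents
# conflict between sources. Focal sets are frozensets.

def conjunctive(m1: dict, m2: dict) -> dict:
    out = {}
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        k = a & b  # intersection of the two focal sets
        out[k] = out.get(k, 0.0) + x * y
    return out

# Two hypothetical experts weighing root causes {machine, material}:
m1 = {frozenset({"machine"}): 0.7,
      frozenset({"machine", "material"}): 0.3}
m2 = {frozenset({"material"}): 0.6,
      frozenset({"machine", "material"}): 0.4}

m12 = conjunctive(m1, m2)
print(m12[frozenset()])             # conflict mass, approx 0.42
print(m12[frozenset({"machine"})])  # approx 0.28
```

A high empty-set mass signals that the experts' evidence disagrees, which is exactly the kind of diagnostic a problem-solving process can feed back into its analysis.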
A Model-Driven Engineering Approach for ROS using Ontological Semantics
This paper presents a novel ontology-driven software engineering approach for
the development of industrial robotics control software. It introduces the
ReApp architecture that synthesizes model-driven engineering with semantic
technologies to facilitate the development and reuse of ROS-based components
and applications. In ReApp, we show how different ontological classification
systems for hardware, software, and capabilities help developers in discovering
suitable software components for their tasks and in applying them correctly.
The proposed model-driven tooling enables developers to work at higher
abstraction levels and fosters automatic code generation. It is underpinned by
ontologies to minimize discontinuities in the development workflow, with an
integrated development environment presenting a seamless interface to the user.
First results show the viability and synergy of the selected approach when
searching for or developing software with reuse in mind.

Comment: Presented at DSLRob 2015 (arXiv:1601.00877). Stefan Zander, Georg Heppner, Georg Neugschwandtner, Ramez Awad, Marc Essinger and Nadia Ahmed: A Model-Driven Engineering Approach for ROS using Ontological Semantics
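The capability-based component discovery the abstract describes can be sketched as a simple query over components that advertise their capabilities. The component names and capability labels here are hypothetical and merely in the spirit of ReApp's ontological classification, which in practice uses full semantic technologies rather than flat sets.

```python
# Hypothetical sketch of capability-based discovery: each component
# advertises a set of capability labels; a query returns components
# whose capabilities cover the requested set. Names are illustrative.

COMPONENTS = {
    "laser_driver":   {"sense.range"},
    "gripper_ctrl":   {"act.grasp"},
    "pick_and_place": {"act.grasp", "plan.motion"},
}

def discover(required: set) -> list:
    # A component matches if it offers every required capability.
    return sorted(name for name, caps in COMPONENTS.items()
                  if required <= caps)

print(discover({"act.grasp"}))  # -> ['gripper_ctrl', 'pick_and_place']
print(discover({"sense.range"}))  # -> ['laser_driver']
```

An ontology adds what this sketch lacks: subsumption reasoning, so that a query for a general capability also finds components advertising a more specific one.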
Beyond Reuse Distance Analysis: Dynamic Analysis for Characterization of Data Locality Potential
Emerging computer architectures will feature drastically decreased flops/byte
(ratio of peak processing rate to memory bandwidth) as highlighted by recent
studies on Exascale architectural trends. Further, flops are getting cheaper
while the energy cost of data movement is increasingly dominant. The
understanding and characterization of data locality properties of computations
is critical in order to guide efforts to enhance data locality. Reuse distance
analysis of memory address traces is a valuable tool to perform data locality
characterization of programs. A single reuse distance analysis can be used to
estimate the number of cache misses in a fully associative LRU cache of any
size, thereby providing estimates on the minimum bandwidth requirements at
different levels of the memory hierarchy to avoid being bandwidth bound.
However, such an analysis only holds for the particular execution order that
produced the trace. It cannot estimate potential improvement in data locality
through dependence preserving transformations that change the execution
schedule of the operations in the computation. In this article, we develop a
novel dynamic analysis approach to characterize the inherent locality
properties of a computation and thereby assess the potential for data locality
enhancement via dependence preserving transformations. The execution trace of a
code is analyzed to extract a computational directed acyclic graph (CDAG) of
the data dependences. The CDAG is then partitioned into convex subsets, and the
convex partitioning is used to reorder the operations in the execution trace to
enhance data locality. The approach enables us to go beyond reuse distance
analysis of a single specific order of execution of the operations of a
computation in characterization of its data locality properties. It can serve a
valuable role in identifying promising code regions for manual transformation,
as well as assessing the effectiveness of compiler transformations for data
locality enhancement. We demonstrate the effectiveness of the approach using a
number of benchmarks, including case studies where the potential shown by the
analysis is exploited to achieve lower data movement costs and better
performance.

Comment: Transactions on Architecture and Code Optimization (2014)
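The core property the abstract relies on, that reuse distance alone determines hits in a fully associative LRU cache of any size, can be shown with a naive stack-distance sketch. This is the textbook O(n^2) algorithm for illustration; production reuse distance tools use tree-based structures for large traces.

```python
import math

# Reuse distance of an access = number of distinct addresses touched
# since the previous access to the same address (inf on first use).
# In a fully associative LRU cache of capacity C, an access misses
# exactly when its reuse distance is >= C.

def reuse_distances(trace):
    stack = []   # LRU stack: most recently used address is last
    dists = []
    for addr in trace:
        if addr in stack:
            i = stack.index(addr)
            dists.append(len(stack) - 1 - i)  # distinct addrs above it
            stack.pop(i)
        else:
            dists.append(math.inf)            # cold (first-use) access
        stack.append(addr)
    return dists

def lru_misses(trace, capacity):
    return sum(1 for d in reuse_distances(trace) if d >= capacity)

trace = ["a", "b", "c", "a", "b", "c"]
print(reuse_distances(trace))  # [inf, inf, inf, 2, 2, 2]
print(lru_misses(trace, 2))    # -> 6: every access misses
print(lru_misses(trace, 3))    # -> 3: only the cold misses remain
```

One pass over the trace thus yields miss counts for every cache size at once, which is the "single analysis" property the abstract mentions; what it cannot do, and what the paper's CDAG-based approach addresses, is account for dependence-preserving reorderings of the trace.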
An Empirical Study of a Software Maintenance Process
This paper describes how a process support tool is used to collect metrics about a major upgrade to our own electronic retail system. An incremental prototyping lifecycle is adopted in which each increment is categorised by an effort type and a project component. Effort types are Acquire, Build, Comprehend and Design, and span all phases of development. Project components include data models and process models, expressed in an OO modelling language and a process algebra respectively, as well as C++ classes and function templates, and build components including source files and data files. This categorisation is independent of incremental prototyping and equally applicable to other software lifecycles. The process support tool (PWI) is responsible for ensuring consistency between the models and the C++ source. It also supports the interaction between multiple developers and multiple metric collectors. The first two releases of the retailing software are available for ftp from oracle.ecs.soton.ac.uk in directory pub/peter. Readers are invited to use the software and apply their own metrics as appropriate. We would be interested to correspond with anyone who does so.