Business rules based legacy system evolution towards service-oriented architecture.
Enterprises can be empowered to live up to the potential of becoming dynamic, agile and real-time. Service orientation is emerging from the amalgamation of a number of key business, technology and cultural developments. Three essential trends in particular are coming together to create a new revolutionary breed of enterprise, the service-oriented enterprise (SOE): (1) the continuous performance management of the enterprise; (2) the emergence of business process management; and (3) advances in the standards-based service-oriented infrastructures.
This thesis focuses on this emerging three-layered architecture: a service-oriented architecture framework at its foundation, a process layer that brings technology and business together, and a corporate performance layer that continually monitors and improves the performance indicators of global enterprises. This architecture provides a novel business context in which to apply the important technical idea of service orientation, moving it from an interesting tool for engineers to a vehicle for business managers to fundamentally improve their businesses.
Software Reuse in Cardiology Related Medical Database Using K-Means Clustering Technique
Reuse-based software technology treats software as something designed for the purpose of reuse: existing software is used to build new software. A metric is a quantitative indicator of an attribute of an item. Reusability is the likelihood that a segment of source code can be used again, with little or no modification, to add new functionality. Much research has been proposed on reusability for reducing code, domain, requirements and design effort, but very little work is reported on software reuse in the medical domain. An attempt is made to bridge this gap using the concepts of clustering and classifying data based on distance measures. In this paper a cardiology database is considered for study. The developed model is intended to help doctors or paramedics determine a patient's stage of cardiac disease, deduce the required medicines in seconds, and propose them to the patient. The k-means clustering algorithm is used to measure reusability.
Comment: 5 pages. arXiv admin note: text overlap with arXiv:1212.031
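To make the clustering step concrete, here is a minimal k-means sketch in pure Python in the spirit of the approach the abstract describes. The (age, cholesterol) records, the choice of k = 2, and the first-k-points initialization are illustrative assumptions, not data or choices from the paper.

```python
import math

def kmeans(points, k, iters=100):
    # Deterministic initialization: the first k records serve as centroids
    # (an illustrative choice; many initialization schemes exist).
    centroids = list(points[:k])
    for _ in range(iters):
        # Assignment step: each record joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        new_centroids = [
            tuple(sum(d) / len(cl) for d in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    return centroids, clusters

# Toy records with two visually obvious risk groups.
records = [(35, 180), (38, 190), (36, 185), (62, 260), (65, 270), (64, 255)]
centroids, clusters = kmeans(records, k=2)
```

On this toy input the two recovered clusters separate the low-risk and high-risk records, which is the kind of grouping the paper's model would then map to disease levels.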
Modular-topology optimization of structures and mechanisms with free material design and clustering
Topology optimization of modular structures and mechanisms enables balancing
the performance of automatically-generated individualized designs, as required
by Industry 4.0, with enhanced sustainability by means of component reuse. For
optimal modular design, two key questions must be answered: (i) what should the
topology of individual modules be like and (ii) how should modules be arranged
at the product scale? We address these challenges by proposing a bi-level
sequential strategy that combines free material design, clustering techniques,
and topology optimization. First, using free material optimization enhanced
with post-processing for checkerboard suppression, we determine the
distribution of elasticity tensors at the product scale. To extract the
sought-after modular arrangement, we partition the obtained elasticity tensors
with a novel deterministic clustering algorithm and interpret its outputs
within Wang tiling formalism. Finally, we design interiors of individual
modules by solving a single-scale topology optimization problem with the design
space reduced by modular mapping, conveniently starting from an initial guess
provided by free material optimization. We illustrate these developments with
three benchmarks, first covering compliance minimization of modular structures
and then, for the first time, the design of non-periodic compliant modular
mechanisms. Furthermore, we design a set of modules reusable in an inverter and
in gripper mechanisms, which ultimately paves the way towards the rational
design of modular architectured (meta)materials.
Comment: 30 pages
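The paper's own deterministic clustering algorithm is not reproduced here, but the partitioning step it performs can be sketched with a generic "leader" clustering, which is deterministic for a fixed input order: each element-wise material descriptor joins the first cluster whose representative is within a tolerance, otherwise it opens a new cluster. The 2-component descriptors standing in for flattened elasticity tensors and the tolerance value are illustrative assumptions.

```python
import math

def leader_clustering(vectors, tol):
    """Assign each descriptor to the first representative within `tol`,
    otherwise open a new cluster. Returns (representatives, labels)."""
    reps, labels = [], []
    for v in vectors:
        for i, r in enumerate(reps):
            if math.dist(v, r) <= tol:
                labels.append(i)
                break
        else:  # no representative close enough: start a new module type
            reps.append(v)
            labels.append(len(reps) - 1)
    return reps, labels

# Toy descriptors standing in for flattened elasticity tensors.
descriptors = [(1.0, 0.0), (1.05, 0.02), (0.2, 0.9), (0.21, 0.88), (1.02, 0.01)]
reps, labels = leader_clustering(descriptors, tol=0.1)
```

Each resulting cluster label corresponds to one module type, whose interior topology is then designed only once and reused wherever that label appears in the product-scale arrangement.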
NOESIS: A Framework for Complex Network Data Analysis
Network data mining has attracted a lot of attention since a large number of real-world problems have to deal with complex
network data. In this paper, we present NOESIS, an open-source framework for network-based data mining. NOESIS features a
large number of techniques and methods for the analysis of structural network properties, network visualization, community
detection, link scoring, and link prediction. The proposed framework has been designed following solid design principles and
exploits parallel computing using structured parallel programming. NOESIS also provides a stand-alone graphical user interface
that makes advanced analysis techniques available to users without prior programming experience. This framework is
available under a BSD open-source software license.
The NOESIS project was partially supported by the Spanish Ministry of Economy and the European Regional Development Fund (FEDER), under grant TIN2012-36951, and the Spanish Ministry of Education under the program "Ayudas para contratos predoctorales para la formación de doctores 2013" (predoctoral grant BES-2013-064699).
Using Structural and Semantic Information to Identify Software Components
Component-Based Software Engineering (CBSE) seeks to promote software reuse by incorporating existing software modules into the development process. However, such reusable components are not immediately available, and building them is costly and time consuming. As an alternative, extraction from pre-existing OO software can be considered. In this work, we evaluate two community detection algorithms for the task of software component identification. Treating 'components' as 'communities', the aim is to evaluate how independent, yet cohesive, the components are when extracted by structurally informed algorithms. We analyze 412 Java systems and evaluate the cohesion of the extracted communities using four document representation techniques. The evaluation aims to find which algorithm extracts the most semantically cohesive, yet well-separated, communities. The results show good performance for both algorithms; however, each has its own strengths. Leiden extracts less cohesive but better separated and better clustered components that depend more on similar ones. Infomap, on the other hand, creates more cohesive, slightly overlapping clusters that are less likely to depend on other semantically similar components.
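The semantic side of such an evaluation can be sketched as follows: score a candidate component (a community of classes) by the average pairwise cosine similarity of simple bag-of-words vectors built from its class identifiers. The class names and the term-vector scheme below are illustrative assumptions, not the paper's exact representation techniques.

```python
from collections import Counter
from itertools import combinations
import math

def cosine(a, b):
    """Cosine similarity of two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cohesion(docs):
    """Mean pairwise cosine similarity across a community's term vectors."""
    vecs = [Counter(d.lower().split()) for d in docs]
    pairs = list(combinations(vecs, 2))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)

# Identifiers split into terms, e.g. PaymentService -> "payment service".
billing = ["payment service", "payment gateway", "invoice payment"]
mixed = ["payment service", "image decoder", "thread pool"]
# The billing community scores higher than the unrelated grouping.
```

A community detection algorithm that groups the billing classes together would thus be rewarded by this cohesion measure, while a grouping that mixes unrelated subsystems would score near zero.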
Multi-layer Architecture For Storing Visual Data Based on WCF and Microsoft SQL Server Database
In this paper we present a novel architecture for storing visual data.
Effective storing, browsing and searching collections of images is one of the
most important challenges of computer science. The design of architecture for
storing such data requires a set of tools and frameworks such as SQL database
management systems and service-oriented frameworks. The proposed solution is
based on a multi-layer architecture, which allows any component to be replaced
without recompiling the others. The approach contains five components, i.e.
Model, Base Engine, Concrete Engine, CBIR service and Presentation. They are
based on two well-known design patterns: Dependency Injection and Inversion of
Control. For experimental purposes we implemented the SURF local interest point
detector as a feature extractor and k-means clustering as an indexer. The
presented architecture is intended for content-based retrieval system
simulation purposes as well as for real-world CBIR tasks.
Comment: Accepted for the 14th International Conference on Artificial
Intelligence and Soft Computing, ICAISC, June 14-18, 2015, Zakopane, Poland
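The Dependency Injection idea behind such an architecture can be sketched briefly: the engine depends on an abstract feature-extractor interface and receives a concrete implementation from outside, so a detector like SURF could be swapped for another without touching the rest of the system. The class and method names below are illustrative stand-ins, not the paper's actual components.

```python
from abc import ABC, abstractmethod

class FeatureExtractor(ABC):
    @abstractmethod
    def extract(self, image): ...

class MeanIntensityExtractor(FeatureExtractor):
    """Toy stand-in for SURF: one global descriptor per image."""
    def extract(self, image):
        flat = [px for row in image for px in row]
        return [sum(flat) / len(flat)]

class RetrievalEngine:
    """'Concrete Engine' layer: receives its extractor via constructor
    injection instead of instantiating it internally."""
    def __init__(self, extractor: FeatureExtractor):
        self.extractor = extractor
        self.index = {}

    def add(self, name, image):
        self.index[name] = self.extractor.extract(image)

    def query(self, image):
        # Return the indexed image whose descriptor is nearest to the query's.
        q = self.extractor.extract(image)
        return min(self.index, key=lambda n: abs(self.index[n][0] - q[0]))

engine = RetrievalEngine(MeanIntensityExtractor())
engine.add("dark", [[10, 20], [30, 40]])
engine.add("bright", [[200, 210], [220, 230]])
```

Because `RetrievalEngine` never names a concrete extractor, replacing `MeanIntensityExtractor` with any other `FeatureExtractor` implementation requires no change, and no recompilation, of the engine layer, which is the property the abstract claims for its five components.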