Towards implementing integrated building product libraries
Electronic product catalogues and brochures are gaining
popularity but there is little agreement on content, format and
searching methods. This limits their usability and integration with
existing construction software tools. This paper examines a product-modelling
approach to delivering building product information and
describes a proposed multi-tier client-server environment. ISO/STEP
and IAI/IFC building product models are considered to facilitate
representation, exchange and sharing of product information. The
proposed architecture incorporates scalability with middleware
components that would provide single or few points of entry to
integrated product information. This paper is part of a research
project, which builds on the results of related projects including
ConstructIT Strategy, PROCAT-GEN, Active Catalog, COMBINE and ARROW,
towards implementing the required software components.
Development of an online collaborative working environment for design and manufacturing
This research develops a novel collaborative working environment (CWE) for design and manufacturing using advanced Web/Internet technologies such as Web Services, Grid Services and other related software tools/packages. To achieve this, the following research modules were developed by the author. A service-oriented framework for computer-aided design, which acts as an online collaboration system, was developed using the latest Web Service technology. The concept of Service-Oriented Architecture has been implemented in the framework. Users anywhere in the world can join the design process from their PCs, regardless of the operating system they are using. The service-oriented system can pass through firewalls and support multiple users owing to the characteristics of Web Services, and its loosely coupled structure makes the system very easy to update. Another module of the CWE addresses the software-sharing problem that arises when the platform is used by several geographically dispersed users or organisations. A software package bank system was developed, which applied the service-oriented approach and successfully solved traditional problems in this field. Building on the outcomes mentioned above, the research finally developed a more powerful infrastructure using Grid Services, a further development of Grid computing and Web Services that is considered to be the most important future solution for the Internet.
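As a minimal illustration of the Web Service message exchange described above (this is not code from the thesis; the operation and parameter names are invented), a SOAP-style XML envelope, the kind of plain-XML-over-HTTP payload that passes through firewalls, can be built and parsed with only the standard library:

```python
# Illustrative sketch, not the thesis implementation: a minimal SOAP-style
# envelope shows how Web Service calls travel as plain XML, which is why
# service-oriented systems can cross firewalls over ordinary HTTP.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation: str, params: dict) -> bytes:
    """Wrap an operation call in a SOAP-style envelope."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)  # hypothetical operation name
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(env)

def parse_request(payload: bytes):
    """Recover the operation name and parameters from an envelope."""
    env = ET.fromstring(payload)
    op = env.find(f"{{{SOAP_NS}}}Body")[0]
    return op.tag, {child.tag: child.text for child in op}

request = build_request("UpdateDesign", {"partId": "A-100", "radius": "2.5"})
operation, params = parse_request(request)
```

Because client and server agree only on the XML contract, not on each other's types, either side can be replaced independently; this is the loose coupling the abstract credits for easy updates.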
Development of an integrated product information management system
This thesis reports on a research project undertaken over a four-year period investigating
and developing a software framework and application for integrating and managing
building product information for construction engineering. The research involved
an extensive literature review, observation of industry practices and interviews with
construction industry practitioners and systems implementers to determine how best to
represent and present product information to support the construction process.
Applicable product models for information representation were reviewed and evaluated
to determine present suitability. The IFC product model was found to be the most
applicable. Investigations of technologies supporting the product model led to the
development of a software tool, the IFC Assembly Viewer, which aided further
investigations into the suitability of the product model (in its current state) for the
exchange and sharing of product information. A software framework, or reusable
software design and application, called PROduct Information Management System
(PROMIS), was developed based on a non-standard product model but with flexibility
to work with the IFC product model when sufficiently mature. The software comprises
three subsystems, namely ProductWeb, ModelManager.NET and Product/Project
Service (or P2Service). The key features of this system were shared project databases,
parametric product specification, integration of product information sources, and
application interaction and integration through interface components. PROMIS was
applied to and tested with a modular construction business for the management of
product information and for integration of product and project information through the
design and construction (production) process.
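The parametric product specification feature listed among PROMIS's key capabilities can be sketched roughly as follows; the `Parameter` and `ProductType` names and the wall-panel example are hypothetical illustrations, not taken from the thesis:

```python
# Hypothetical sketch of parametric product specification: a product type
# declares its parameters with units and allowed ranges, and a concrete
# specification is created by supplying values validated against those ranges.
from dataclasses import dataclass

@dataclass
class Parameter:
    name: str
    unit: str
    minimum: float
    maximum: float

class ProductType:
    def __init__(self, name, parameters):
        self.name = name
        self.parameters = {p.name: p for p in parameters}

    def specify(self, **values):
        """Create a concrete specification, checking each value's range."""
        spec = {}
        for name, value in values.items():
            p = self.parameters[name]
            if not (p.minimum <= value <= p.maximum):
                raise ValueError(
                    f"{name}={value} outside [{p.minimum}, {p.maximum}] {p.unit}")
            spec[name] = value
        return spec

# Invented example product: a modular wall panel with two parameters.
wall_panel = ProductType("WallPanel", [
    Parameter("width", "mm", 600, 3600),
    Parameter("thickness", "mm", 100, 300),
])
spec = wall_panel.specify(width=1200, thickness=150)
```

Keeping the parameter schema separate from individual specifications is one plausible way a shared project database could hold many product variants without duplicating type information.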
Forum Session at the First International Conference on Service Oriented Computing (ICSOC03)
The First International Conference on Service Oriented Computing (ICSOC) was held in Trento, December 15-18, 2003. The focus of the conference ---Service Oriented Computing (SOC)--- is the new emerging paradigm for distributed computing and e-business processing that has evolved from object-oriented and component computing to enable building agile networks of collaborating business applications distributed within and across organizational boundaries. Of the 181 papers submitted to the ICSOC conference, 10 were selected for the forum session which took place on December 16th, 2003. The papers were chosen based on their technical quality, originality, relevance to SOC, and for being best suited to a poster presentation or a demonstration. This technical report contains the 10 papers presented during the forum session at the ICSOC conference. In particular, the last two papers in the report were submitted as industrial papers.
Adaptive object management for distributed systems
This thesis describes an architecture supporting the management of pluggable software components and evaluates it against the requirement for an enterprise integration platform for the manufacturing and petrochemical industries. In a distributed environment, we need mechanisms to manage objects and their interactions. At the least, we must be able to create objects in different processes on different nodes; we must be able to link them together so that they can pass messages to each other across the network; and we must deliver their messages in a timely and reliable manner. Object-based environments which support these services already exist, for example ANSAware (ANSA, 1989), DEC's ObjectBroker (ACA, 1992) and Iona's Orbix (Orbix, 1994). Yet such environments provide limited support for composing applications from pluggable components. Pluggability is the ability to install and configure a component into an environment dynamically when the component is used, without specifying static dependencies between components when they are produced. Pluggability is supported to a degree by dynamic binding: components may be programmed to import references to other components and to explore their interfaces at runtime, without using static type dependencies. Yet this overloads the component with the responsibility to explore bindings. What is still generally missing is an efficient general-purpose binding model for managing bindings between independently produced components. In addition, existing environments provide no clear strategy for dealing with fine-grained objects. The overhead of runtime binding and remote messaging will severely reduce performance where there are many objects with complex patterns of interaction. We need an adaptive approach to managing configurations of pluggable components according to the needs and constraints of the environment.
Management is made difficult by embedding bindings in component implementations and by relying on strong typing as the only means of verifying and validating bindings. To solve these problems we have built a set of configuration tools on top of an existing distributed support environment. Specification tools facilitate the construction of independent pluggable components. Visual composition tools facilitate the configuration of components into applications and the verification of composite behaviours. A configuration model is constructed which maintains the environmental state. Adaptive management is made possible by changing the management policy according to this state. Such policy changes affect the location of objects, their bindings, and the choice of messaging system.
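A rough sketch of the binding model described above (all names are invented, not taken from the thesis): a registry installs components without static dependencies between them, and binds an interface to a component only at use time, under a management policy that could be changed adaptively:

```python
# Illustrative sketch of pluggable components with runtime binding. Components
# are installed against interface names; clients never import each other's
# types, so bindings can be managed (and policies changed) by the environment.
class Registry:
    def __init__(self):
        self._components = {}

    def install(self, interface, factory):
        """Install a pluggable component, with no static dependency on users."""
        self._components[interface] = factory

    def bind(self, interface, policy="local"):
        """Resolve an interface at use time; the policy stands in for adaptive
        choices such as co-locating fine-grained objects vs. remote messaging."""
        component = self._components[interface]()
        if policy == "local":
            return component
        raise NotImplementedError("remote binding not sketched here")

class ConsoleLogger:
    """An example component; any object with a matching interface would do."""
    def log(self, msg):
        return f"LOG: {msg}"

registry = Registry()
registry.install("Logger", ConsoleLogger)   # plugged in at configuration time
logger = registry.bind("Logger")            # bound only when used
message = logger.log("object created")
```

Moving the binding decision out of the component and into the registry is what lets a configuration model relocate objects or change messaging strategy without touching component code.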
Integrating legacy mainframe systems: architectural issues and solutions
For more than 30 years, mainframe computers have been the backbone of computing systems throughout the world. Even today it is estimated that some 80% of the world's data is held on such machines. However, new business requirements and pressure from evolving technologies, such as the Internet, are pushing these existing systems to their limits and they are reaching breaking point. The banking and financial sectors in particular have relied on mainframes the longest to do their business, and as a result it is they that feel these pressures the most.
In recent years there have been various solutions for enabling a re-engineering of these legacy systems. It quickly became clear that rewriting them completely was not possible, so various integration strategies emerged.
Out of these new integration strategies, the CORBA standard from the Object Management Group emerged as the strongest, providing a standards-based solution that enabled mainframe applications to become peers in a distributed computing environment.
However, the requirements did not stop there. The mainframe systems were reliable, secure, scalable and fast, so any integration strategy had to ensure that the new distributed systems did not lose any of these benefits. Various patterns or general solutions to the problem of meeting these requirements have arisen and this research looks at applying some of these patterns to mainframe based CORBA applications.
The purpose of this research is to examine some of the issues involved in making mainframe-based legacy applications interoperate with newer object-oriented technologies.
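One wrapping pattern of the kind discussed above can be sketched as an object-oriented facade over a fixed-width legacy transaction; the record layout, names and figures here are invented for illustration, not drawn from the research:

```python
# Hypothetical sketch: a legacy mainframe-style transaction (fixed-width
# records in, fixed-width records out) is wrapped behind an object-oriented
# facade, so it can join a distributed system without being rewritten.
def legacy_get_balance(record: str) -> str:
    """Stands in for a mainframe routine: 8-char account in, account plus
    a zero-padded balance (in pence) out. Entirely invented behaviour."""
    account = record[:8]
    return f"{account}00012550"

class AccountService:
    """OO facade that hides the legacy record formats from modern clients."""
    def balance(self, account_id: str) -> float:
        record = account_id.ljust(8)          # marshal into the fixed layout
        reply = legacy_get_balance(record)
        return int(reply[8:]) / 100.0         # unmarshal pence into pounds

service = AccountService()
balance = service.balance("ACC00001")
```

In a CORBA setting the facade's interface would be declared in IDL and the marshalling handled by the ORB; the point of the sketch is only that the legacy routine itself stays untouched.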
An investigation into the implementation issues and challenges of service oriented architecture
Much has been published about semantic web services as the solution to interoperability challenges within the Service Oriented Architecture (SOA) framework. The aim of this dissertation was to find out whether the introduction of a semantic layer into the SOA infrastructure actually solves these challenges. To establish that the challenges exist, a traditional web service built on XML technology was developed, first to understand the technology behind web services and secondly to demonstrate the limitations of the original SOA framework, especially in the areas of automatic service discovery and automatic service composition. To investigate how the semantic layer could address these limitations, a semantic web service was then developed, exploring the tools and models available for building semantic web services and the challenges that could arise from adding the semantic layer to the SOA infrastructure. These two applications were evaluated and compared in terms of their capabilities and underlying technologies to determine whether semantic web services can truly solve the interoperability challenges within the SOA infrastructure. Because semantic web services are built using ontologies, they have well-described interfaces that allow automatic web service discovery and invocation, and it was found that they can indeed address the interoperability challenges in the SOA framework. However, a number of challenges could impede the development of semantic SOA; these are discussed in the dissertation. Finally, the dissertation concludes by highlighting areas in which this research could be extended.
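The discovery gap the dissertation investigates can be illustrated with a toy registry (all service names and concept labels below are invented): keyword lookup fails unless providers happened to choose the right terms, whereas matching on ontology-style input/output concepts succeeds automatically:

```python
# Illustrative contrast between keyword-based (UDDI-style) discovery and
# semantic discovery over typed service descriptions. Data is invented.
services = [
    {"name": "CurrencySvc", "keywords": ["money", "rates"],
     "inputs": {"amount": "Currency", "target": "CurrencyCode"},
     "outputs": {"converted": "Currency"}},
    {"name": "WeatherSvc", "keywords": ["forecast"],
     "inputs": {"city": "Location"},
     "outputs": {"report": "WeatherReport"}},
]

def keyword_discover(registry, term):
    """Traditional lookup: works only if the provider chose this keyword."""
    return [s["name"] for s in registry if term in s["keywords"]]

def semantic_discover(registry, available_inputs, wanted_output):
    """Match on the concepts a service consumes and produces."""
    return [s["name"] for s in registry
            if set(s["inputs"].values()) <= set(available_inputs.values())
            and wanted_output in s["outputs"].values()]

by_keyword = keyword_discover(services, "convert")   # finds nothing
by_meaning = semantic_discover(
    services, {"amount": "Currency", "target": "CurrencyCode"}, "Currency")
```

Real semantic web services describe these concepts in ontology languages such as OWL, with reasoning over concept subsumption rather than the set comparison used here; the sketch only shows why machine-readable semantics enable automation that keywords cannot.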
Current Trends and New Challenges of Databases and Web Applications for Systems Driven Biological Research
The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure, including databases and Web applications. Several solutions have been proposed to meet these expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand the different technologies and approaches: having familiarised themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases and Web applications, describe their roles in the integration of biological data, and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research.
Internet-based solutions to support distributed manufacturing
With globalisation and constant changes in the marketplace, enterprises are adapting to face new challenges. Strategic corporate alliances to share knowledge, expertise and resources therefore represent an advantage in an increasingly competitive world. This has led to the integration of companies, customers, suppliers and partners using networked environments. This thesis presents three novel solutions in the tooling area, developed for Seco Tools Ltd, UK. These approaches implement a proposed distributed computing architecture using Internet technologies to assist geographically dispersed tooling engineers in process planning tasks. The systems are summarised as follows. TTS is a Web-based system to support engineers and technical staff in the task of providing technical advice to clients. Seco sales engineers access the system from remote machining sites and submit, retrieve and update the required tooling data located in databases at the company headquarters. The communication platform used for this system provides an effective mechanism to share information nationwide. The system implements efficient methods, such as data relaxation techniques, confidence scores and importance levels of attributes, to help the user find the closest solutions when specific requirements are not fully matched in the database. Cluster-F has been developed to assist engineers and clients in the assessment of cutting parameters for the tooling process. In this approach the Internet acts as a vehicle to transport the data between users and the database. Cluster-F is a knowledge-discovery (KD) approach that makes use of clustering and fuzzy set techniques. The novel proposal in this system is the implementation of fuzzy set concepts to obtain the proximity matrix that guides the classification of the data. Hierarchical clustering methods are then applied to these data to link the closest objects. A general KD methodology applying rough set concepts is also proposed in this research.
This covers aspects of data redundancy, identification of relevant attributes, detection of data inconsistency, and generation of knowledge rules. R-sets, the third proposed solution, has been developed using this KD methodology. This system evaluates the variables of the tooling database to analyse known and unknown relationships in the data generated after the execution of technical trials. The aim is to discover cause-effect patterns from selected attributes contained in the database. A fourth system, DBManager, was also developed to administer the system users' accounts, the sales engineers' accounts and the tool-trial monitoring process. It supports the implementation of the proposed distributed architecture and the maintenance of users' accounts for the access restrictions of the systems running under this architecture.
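The Cluster-F pipeline outlined above, a proximity matrix derived from a fuzzy-style similarity followed by hierarchical clustering of the closest objects, can be illustrated in a few lines of pure Python; the similarity function and trial data below are simple stand-ins, not the thesis's actual fuzzy-set formulation:

```python
# Illustrative sketch: pairwise fuzzy-style proximities feed a single-linkage
# agglomerative clustering that merges the closest objects. Data is invented.
def similarity(a, b):
    """Proximity in (0, 1]: 1.0 means identical parameter vectors."""
    return 1.0 / (1.0 + sum(abs(x - y) for x, y in zip(a, b)))

def single_linkage(items, threshold):
    """Merge clusters while the closest pair is at least `threshold` similar."""
    clusters = [[i] for i in range(len(items))]
    while len(clusters) > 1:
        # Find the most similar pair of objects lying in different clusters.
        score, a, b = max(
            ((similarity(items[i], items[j]), a, b)
             for a, ca in enumerate(clusters) for b, cb in enumerate(clusters)
             if a < b for i in ca for j in cb),
            key=lambda t: t[0])
        if score < threshold:
            break
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters

# Hypothetical cutting parameters (e.g. speed, feed) for four tool trials.
trials = [(1.0, 1.0), (1.1, 1.0), (5.0, 5.0), (5.2, 5.1)]
groups = single_linkage(trials, threshold=0.5)
```

With this data the first two trials group together and the last two group together, mirroring how similar machining cases would be linked so that a close past solution can be offered when an exact match is absent.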