15 research outputs found

    Automated Energy Distribution and Reliability System Status Report

    Internet-based solutions to support distributed manufacturing

    With globalisation and constant changes in the marketplace, enterprises are adapting themselves to face new challenges. Strategic corporate alliances to share knowledge, expertise and resources therefore represent an advantage in an increasingly competitive world. This has led to the integration of companies, customers, suppliers and partners using networked environments. This thesis presents three novel solutions in the tooling area, developed for Seco Tools Ltd, UK. These approaches implement a proposed distributed computing architecture using Internet technologies to assist geographically dispersed tooling engineers in process planning tasks. The systems are summarised as follows. TTS is a Web-based system to support engineers and technical staff in the task of providing technical advice to clients. Seco sales engineers access the system from remote machining sites and submit, retrieve and update the required tooling data located in databases at the company headquarters. The communication platform used for this system provides an effective mechanism to share information nationwide. The system implements efficient methods, such as data relaxation techniques, confidence scores and importance levels of attributes, to help the user find the closest solutions when specific requirements are not fully matched in the database. Cluster-F has been developed to assist engineers and clients in the assessment of cutting parameters for the tooling process. In this approach the Internet acts as a vehicle to transport the data between users and the database. Cluster-F is a knowledge discovery (KD) approach that makes use of clustering and fuzzy set techniques. The novel proposal in this system is the use of fuzzy set concepts to obtain the proximity matrix that guides the classification of the data; hierarchical clustering methods are then applied to link the closest objects. A general KD methodology applying rough set concepts is also proposed in this research.
    This covers aspects of data redundancy, identification of relevant attributes, detection of data inconsistency, and generation of knowledge rules. R-sets, the third proposed solution, has been developed using this KD methodology. This system evaluates the variables of the tooling database to analyse known and unknown relationships in the data generated after the execution of technical trials. The aim is to discover cause-effect patterns from selected attributes contained in the database. A fourth system, DBManager, was also developed; it was conceived to administer the system user accounts, sales engineers' accounts and the monitoring of tool-trial data. This supports the implementation of the proposed distributed architecture and the maintenance of user accounts for the access restrictions of the systems running under this architecture.
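
    The Cluster-F approach described above can be illustrated with a minimal sketch: build a fuzzy proximity matrix over normalised records of cutting parameters, then apply agglomerative (single-linkage) clustering to link the closest objects. The max-min similarity measure and the sample trial data are illustrative assumptions, not the actual formulas or data from the thesis.

```python
# Hedged sketch of fuzzy-proximity clustering (illustrative, not the
# thesis's actual method): a fuzzy max-min similarity builds the
# proximity matrix, then single-linkage merging groups close records.

def fuzzy_similarity(a, b):
    """Fuzzy max-min similarity of two normalised attribute vectors."""
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

def proximity_matrix(records):
    """Pairwise fuzzy similarities between all records."""
    n = len(records)
    return [[fuzzy_similarity(records[i], records[j]) for j in range(n)]
            for i in range(n)]

def single_linkage(records, threshold):
    """Agglomeratively merge clusters while the best inter-cluster
    (single-linkage) similarity stays above the threshold."""
    sim = proximity_matrix(records)
    clusters = [{i} for i in range(len(records))]
    while True:
        best, pair = threshold, None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                s = max(sim[a][b] for a in clusters[i] for b in clusters[j])
                if s > best:
                    best, pair = s, (i, j)
        if pair is None:
            return clusters
        i, j = pair
        clusters[i] |= clusters.pop(j)

# Hypothetical normalised (speed, feed, depth-of-cut) values for six trials.
trials = [(0.9, 0.8, 0.7), (0.85, 0.82, 0.72), (0.2, 0.3, 0.1),
          (0.25, 0.28, 0.15), (0.5, 0.55, 0.5), (0.52, 0.5, 0.48)]
print(single_linkage(trials, 0.8))
```

    With this similarity and threshold, the six hypothetical trials fall into three groups of near-identical parameter settings; records that fail to match exactly are still placed with their closest neighbours, which is the behaviour the abstract attributes to the proximity-matrix step.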

    Architecture and implementation of online communities

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999. By Philip Greenspun. Includes bibliographical references.

    Merge replication in Microsoft's SQL server 7.0

    A methodology for the distributed and collaborative management of engineering knowledge

    The problems of collaborative engineering design and management at the conceptual stage in a large network of dissimilar enterprises were investigated. This issue in engineering design is a result of the supply-chain and virtual enterprise (VE) oriented industry, which demands faster time to market and accurate cost/manufacturing analysis from conception. Current tools and techniques do not completely fulfil this requirement due to a lack of coherent inter-enterprise collaboration and a dearth of manufacturing knowledge available at the concept stage. Client-server and peer-to-peer systems were tested for communication, as well as various techniques for knowledge management and propagation, including Product Lifecycle Management (PLM) and expert systems. As a result of system testing and an extensive literature review, several novel techniques were proposed and tested to improve the coherent management of knowledge and enable inter-enterprise collaboration. The techniques were trialled on two engineering project examples. An automotive Tier-1 supplier, which designs products whose components are subcontracted to a large supply chain and assembled for an Original Equipment Manufacturer (OEM), was used as a test scenario. The utility of the system for integrating large VEs into a coherent project with unified specifications was demonstrated in a simple example, and the problems associated with engineering document management were overcome via re-usable, configurable, object-oriented ontologies propagated throughout the VE, imposing a coherent nomenclature and engineering product definition. All knowledge within the system maintains links from specification, concept, design and testing through to manufacturing stages, aiding the participating enterprises in maintaining their knowledge and experience for future projects.
    This potentially speeds the process of innovation by enabling companies to concentrate on value-added aspects of designs whilst 'bread-and-butter' expertise is reused. The second example, a manufacturer of rapid-construction steel bridges, demonstrated the manufacturing dimension of the methodology, where the early stage of design and the generation of new concepts by reusing existing manufacturing knowledge bases were demonstrated. The solution consisted of a decentralised super-peer network architecture to establish and maintain communications between enterprises in a VE. The enterprises are able to share knowledge in a common format and nomenclature via the building-block shareable super-ontology, which can be tailored on a project-by-project basis whilst maintaining the common nomenclature of the 'super-ontology', eliminating knowledge interpretation issues. The two-tier architecture developed as part of the solution glues together the peer-to-peer and super-ontologies to form a coherent system for internal knowledge management and product development as well as external virtual enterprise product development and knowledge management. In conclusion, the methodology developed for collaboration and knowledge management was shown to be more appropriate than PLM technology for smaller enterprises collaborating in a large Virtual Enterprise, in terms of usability, configurability, system cost and individual control over intellectual property rights.
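
    The two-tier ontology idea described above can be sketched minimally: a shared super-ontology fixes the common nomenclature for the whole VE, and each project ontology may add building-block terms but never redefine inherited ones, which is what eliminates interpretation clashes. The class names and example terms are illustrative assumptions, not the thesis's actual ontology schema.

```python
# Hedged sketch of the super-ontology / project-ontology split
# (illustrative, not the thesis's implementation).

class SuperOntology:
    """Common nomenclature shared by every enterprise in the VE."""
    def __init__(self, terms):
        self.terms = dict(terms)  # term -> definition

class ProjectOntology:
    """Per-project extension that may add terms but cannot redefine
    any term inherited from the super-ontology."""
    def __init__(self, parent, extensions):
        clashes = set(extensions) & set(parent.terms)
        if clashes:
            raise ValueError(f"cannot redefine super-ontology terms: {clashes}")
        self.parent = parent
        self.terms = dict(extensions)

    def lookup(self, term):
        # Project-local terms first, then the shared nomenclature.
        return self.terms.get(term, self.parent.terms.get(term))

# Hypothetical terms for a bridge-manufacturing project in the VE.
shared = SuperOntology({"assembly": "top-level product structure",
                        "trial": "physical test of a design iteration"})
bridge_project = ProjectOntology(shared, {"span": "distance between supports"})
print(bridge_project.lookup("trial"))
```

    Because redefinition raises an error at construction time, every enterprise resolving "trial" gets the same definition regardless of which project ontology it queries through; only genuinely new, project-specific vocabulary varies per project.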

    DSpace 6.0 Manual

    Third International Symposium on Space Mission Operations and Ground Data Systems, part 1

    Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The papers focus on improvements in the efficiency, effectiveness, productivity, and quality of data acquisition, ground systems, and mission operations. New technology, techniques, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations.

    DSpace 5.x Documentation

    DSpace is an open source software platform that enables organisations to: capture and describe digital material using a submission workflow module, or a variety of programmatic ingest options; distribute an organisation's digital assets over the web through a search and retrieval system; and preserve digital assets over the long term. This system documentation includes a functional overview of the system, which is a good introduction to the capabilities of the system and should be readable by non-technical folk. Everyone should read this section first because it introduces some terminology used throughout the rest of the documentation. For people actually running a DSpace service, there is an installation guide, and sections on configuration and the directory structure. Finally, for those interested in the details of how DSpace works, and those potentially interested in modifying the code for their own purposes, there is a detailed architecture and design section.

    European Information Technology Observatory 1996
