50,825 research outputs found

    Impliance: A Next Generation Information Management Appliance

    Full text link
    ably successful in building a large market and adapting to the changes of the last three decades, its impact on the broader market of information management is surprisingly limited. If we were to design an information management system from scratch, based upon today's requirements and hardware capabilities, would it look anything like today's database systems?" In this paper, we introduce Impliance, a next-generation information management system consisting of hardware and software components integrated to form an easy-to-administer appliance that can store, retrieve, and analyze all types of structured, semi-structured, and unstructured information. We first summarize the trends that will shape information management for the foreseeable future. Those trends imply three major requirements for Impliance: (1) to be able to store, manage, and uniformly query all data, not just structured records; (2) to be able to scale out as the volume of this data grows; and (3) to be simple and robust in operation. We then describe four key ideas that are uniquely combined in Impliance to address these requirements, namely the ideas of: (a) integrating software and off-the-shelf hardware into a generic information appliance; (b) automatically discovering, organizing, and managing all data - unstructured as well as structured - in a uniform way; (c) achieving scale-out by exploiting simple, massively parallel processing; and (d) virtualizing compute and storage resources to unify, simplify, and streamline the management of Impliance. Impliance is an ambitious, long-term effort to define simpler, more robust, and more scalable information systems for tomorrow's enterprises.
    Comment: This article is published under a Creative Commons License Agreement (http://creativecommons.org/licenses/by/2.5/). You may copy, distribute, display, and perform the work, make derivative works, and make commercial use of the work, but you must attribute the work to the author and CIDR 2007, 3rd Biennial Conference on Innovative Data Systems Research (CIDR), January 7-10, 2007, Asilomar, California, USA.
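    The first requirement (uniform storage and querying of structured and unstructured data) and idea (c) (scale-out through simple, massively parallel processing) can be pictured with a small sketch. The Python below is purely illustrative and hypothetical: Item, matches, and scan are invented names, not part of Impliance. It only shows the shape of one predicate applied uniformly to structured fields and free text, fanned out over data partitions.

        # Hypothetical sketch, not the Impliance system: one predicate is applied
        # uniformly to structured fields and free text, and the scan is fanned out
        # over data partitions with simple data parallelism.
        from dataclasses import dataclass, field
        from multiprocessing import Pool
        from typing import Any

        @dataclass
        class Item:
            """A uniform wrapper: structured fields and unstructured text side by side."""
            fields: dict[str, Any] = field(default_factory=dict)  # structured part
            text: str = ""                                        # unstructured part

        def matches(item: Item, keyword: str) -> bool:
            # The same keyword predicate covers both kinds of content.
            kw = keyword.lower()
            in_fields = any(kw in str(v).lower() for v in item.fields.values())
            return in_fields or kw in item.text.lower()

        def query_partition(args: tuple[list[Item], str]) -> list[Item]:
            partition, keyword = args
            return [it for it in partition if matches(it, keyword)]

        def scan(partitions: list[list[Item]], keyword: str) -> list[Item]:
            """Run the same scan on every partition in parallel, then merge."""
            with Pool() as pool:
                parts = pool.map(query_partition, [(p, keyword) for p in partitions])
            return [it for part in parts for it in part]

        if __name__ == "__main__":
            data = [
                [Item({"type": "invoice", "customer": "Acme"}, "Payment overdue, please escalate."),
                 Item({"type": "email"}, "Acme asked about the appliance roadmap.")],
                [Item({"type": "report", "topic": "storage"}, "Scale-out test results attached.")],
            ]
            for hit in scan(data, "acme"):
                print(hit.fields, "|", hit.text)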

    Considerations for a design and operations knowledge support system for Space Station Freedom

    Get PDF
    Engineering and operations of modern engineered systems depend critically upon detailed design and operations knowledge that is accurate and authoritative. A design and operations knowledge support system (DOKSS) is a modern computer-based information system providing knowledge about the creation, evolution, and growth of an engineered system. The purpose of a DOKSS is to provide convenient and effective access to this multifaceted information. The complexity of Space Station Freedom's (SSF's) systems, elements, interfaces, and organizations makes convenient access to design knowledge especially important compared to simpler systems. The life-cycle length of 30 or more years adds a new dimension to space operations, maintenance, and evolution. Provided here is a review and discussion of design knowledge support systems to be delivered and operated as a critical part of the engineered system. A concept of a DOKSS for SSF is presented, followed by a detailed discussion of a DOKSS for the Lyndon B. Johnson Space Center and Work Package-2 portions of SSF.

    Ship product modelling

    Get PDF
    This paper is a fundamental review of ship product modeling techniques with a focus on determining the state of the art, identifying any shortcomings, and proposing future directions. The review addresses ship product data representations, product modeling techniques and integration issues, and life-phase issues. The most significant development has been the construction of the ship Standard for the Exchange of Product Data (STEP) application protocols. However, difficulty has been observed with the general uptake of the standards, in particular in their application to legacy systems, often resulting in embellishments to the standards that limit the ability to further exchange the product data. The EXPRESS modeling language is increasingly being superseded by the Extensible Markup Language (XML) as a method to map the STEP data, due to XML's wider support throughout the information technology industry and its more obvious structure and hierarchy. The associated XML files are, however, larger than those produced using the EXPRESS language and make further demands on the already considerable storage required for the ship product model. Seamless integration between legacy applications appears to be difficult to achieve using current technologies, which often rely on manual interaction for the translation of files. The paper concludes with a discussion of future directions that aim to either solve or alleviate these issues.
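    The trade-off noted between EXPRESS-based exchange files and XML mappings can be made concrete with a toy example. Everything in the Python sketch below is hypothetical (a made-up PLATE entity, not drawn from any ship STEP application protocol or schema); it only shows how wrapping each value in a named, self-describing XML element adds clarity while growing the file relative to a compact, position-based ISO 10303-21-style record.

        # Illustrative only: a made-up "plate" entity written first as a compact
        # ISO 10303-21 (Part 21)-style record and then as XML, to show the extra
        # size that named, self-describing elements bring.
        import xml.etree.ElementTree as ET

        plate = {"id": "PL-042", "material": "AH36", "thickness_mm": 12.5, "area_m2": 8.4}

        # Compact, position-based record in the spirit of an EXPRESS-backed exchange file.
        step_line = "#100=PLATE('{id}','{material}',{thickness_mm},{area_m2});".format(**plate)

        # The same data as XML: every value sits inside a named element.
        root = ET.Element("Plate", id=plate["id"])
        for key in ("material", "thickness_mm", "area_m2"):
            ET.SubElement(root, key).text = str(plate[key])
        xml_line = ET.tostring(root, encoding="unicode")

        print(step_line)
        print(xml_line)
        print(f"Part 21-style: {len(step_line)} chars, XML: {len(xml_line)} chars")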

    HOMEBOTS: Intelligent Decentralized Services for Energy Management

    Get PDF
    The deregulation of the European energy market, combined with emerging advanced capabilities of information technology, provides strategic opportunities for new knowledge-oriented services on the power grid. HOMEBOTS is the name we have coined for one of these innovative services: decentralized power load management at the customer side, automatically carried out by a 'society' of interactive household, industrial, and utility equipment. They act as independent intelligent agents that communicate and negotiate in a computational market economy. The knowledge and competence aspects of this application are discussed, using an improved version of task analysis according to the COMMONKADS knowledge methodology. Illustrated by simulation results, we indicate how customer knowledge can be mobilized to achieve joint goals of cost and energy savings. General implications for knowledge creation and its management are discussed.
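    The computational-market idea can be sketched in a few lines. The names below (DeviceAgent, clear_market) are hypothetical and the single-round greedy clearing is a deliberate simplification, not the HOMEBOTS negotiation protocol or its COMMONKADS task model; the sketch only illustrates how independent bids from household devices can decide which loads run when grid capacity is scarce.

        # Hypothetical sketch of a computational market for load management; not the
        # actual HOMEBOTS protocol. Devices bid for scarce grid capacity and a greedy
        # single-round clearing grants capacity to the highest-value bids first.
        from dataclasses import dataclass

        @dataclass
        class DeviceAgent:
            name: str
            demand_kw: float
            bid_per_kw: float  # how much the device "values" running right now

        def clear_market(agents: list[DeviceAgent], capacity_kw: float) -> dict[str, float]:
            """Allocate capacity to the highest bidders until it runs out."""
            allocation: dict[str, float] = {}
            remaining = capacity_kw
            for agent in sorted(agents, key=lambda a: a.bid_per_kw, reverse=True):
                granted = min(agent.demand_kw, remaining)
                allocation[agent.name] = granted
                remaining -= granted
                if remaining <= 0:
                    break
            return allocation

        if __name__ == "__main__":
            fleet = [
                DeviceAgent("heat_pump", demand_kw=3.0, bid_per_kw=0.30),
                DeviceAgent("ev_charger", demand_kw=7.0, bid_per_kw=0.10),
                DeviceAgent("dishwasher", demand_kw=1.5, bid_per_kw=0.05),
            ]
            # With 6 kW available: heat pump fully served, EV charger curtailed to 3 kW,
            # dishwasher deferred to a later round.
            print(clear_market(fleet, capacity_kw=6.0))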

    Proceedings of International Workshop "Global Computing: Programming Environments, Languages, Security and Analysis of Systems"

    Get PDF
    According to the IST/FET proactive initiative on GLOBAL COMPUTING, the goal is to obtain techniques (models, frameworks, methods, algorithms) for constructing systems that are flexible, dependable, secure, robust, and efficient. The dominant concerns are not those of representing and manipulating data efficiently but rather those of handling the co-ordination and interaction, security, reliability, robustness, failure modes, and control of risk of the entities in the system and the overall design, description, and performance of the system itself. Completely different paradigms of computer science may have to be developed to tackle these issues effectively. The research should concentrate on systems having the following characteristics:
    • The systems are composed of autonomous computational entities whose activity is not centrally controlled, either because global control is impossible or impractical, or because the entities are created or controlled by different owners.
    • The computational entities are mobile, due to the movement of the physical platforms or movement of the entities from one platform to another.
    • The configuration varies over time. For instance, the system is open to the introduction of new computational entities and likewise their deletion. The behaviour of the entities may vary over time.
    • The systems operate with incomplete information about the environment. For instance, information becomes rapidly out of date, and mobility requires information about the environment to be discovered.
    The ultimate goal of the research action is to provide a solid scientific foundation for the design of such systems, and to lay the groundwork for achieving effective principles for building and analysing such systems. This workshop covers the aspects related to languages and programming environments as well as analysis of systems and resources, involving 9 projects (AGILE, DART, DEGAS, MIKADO, MRG, MYTHS, PEPITO, PROFUNDIS, SECURE) out of the 13 funded under the initiative. One year after the start of the projects, the goal of the workshop is to take stock of the state of the art on the topics covered by the two clusters related to programming environments and analysis of systems, and to devise strategies and new ideas to profitably continue the research effort towards the overall objective of the initiative. We acknowledge the Dipartimento di Informatica and Tlc of the University of Trento, the Comune di Rovereto, and the project DEGAS for partially funding the event, and the Events and Meetings Office of the University of Trento for the valuable collaboration.

    A Practical Guide to Integrating Reproductive Health and HIV/AIDS into Grant Proposals to the Global Fund

    Get PDF
    Integrating RH and HIV can greatly contribute to mitigating the AIDS pandemic by reducing unintended pregnancy; preventing perinatal transmission; expanding to more target groups; reducing gender-based violence; meeting the needs of people living with HIV; and providing our youth with the knowledge and services they need. Whether to integrate, how to integrate, and exactly what to integrate will depend on a country's epidemiological profile, policies, and program structures. Experience with implementation of integration initiatives in countries around the world shows that scale-up and sustainability require attention to policy and program operations issues. This document, with links to a range of resources, will help CCMs, civil society organizations, and others developing proposals for the Global Fund that contribute to preventing HIV and mitigating the effects of the AIDS pandemic through programs that link and integrate RH and HIV/AIDS.