1,321,748 research outputs found

    System data structures for on-line distributed data base management system

    Described herein are the data structures used in implementing a distributed data base management system (DBMS) for the Mirror Fusion Test Facility (MFTF), part of the Mirror Fusion Energy Program at the Lawrence Livermore National Laboratory. The hardware and software frameworks within which the DBMS has been developed are first described, followed by a brief look at the motivation and fundamental design goals of the system. The structures are then given in detail.

    Information system for administrating and distributing color images through internet

    The information system for administrating and distributing color images through the Internet ensures the consistent replication of color images, their storage in an on-line database, and their predictable distribution by means of a digitally distributed flow based on the Windows platform and POD (Print On Demand) technology. The consistent replication of color images, independently of the parameters of the processing equipment and of the features of the programs composing the technological flow, is ensured by the standard color management system defined by the ICC (International Color Consortium), which is integrated into the Windows operating system and the POD technology. The latter minimizes the noticeable differences between the colors captured, displayed, or printed by various replication equipment and/or edited by various graphical applications. The system's integrated web application ensures the uploading of color images into an on-line database and their administration and distribution among users via the Internet. To preserve the data expressed by the color images during their transfer along a digitally distributed flow, the software application includes an original tool ensuring the accurate replication of colors on computer displays and when printing them on various color printers or presses. For development and use, this application employs a hardware platform based on PC support and a competitive software platform based on the Windows operating system, the .NET development environment, and the C# programming language. This information system is beneficial for creators and users of color images, since the success of printed or on-line (Internet) publications depends on the sizeable, predictable, and accurate replication of the colors employed for the visual expression of information in every field of activity of modern society.
The information system introduced here enables all interested persons to access the information in the on-line database, whenever they want and wherever they are, by means of the digital infrastructure, computer networks, and modern communication technologies.
Keywords: color management, data bank, web application, Internet, POD, color space, PCS, CMM, CMS, ICC, CIEXYZ, CIEL*a*b*, RGB, CMYK, ICC profile.
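The ICC architecture the abstract relies on relates device colors through a device-independent profile connection space (PCS) such as CIEXYZ. As a minimal, self-contained sketch of that idea — using the published sRGB/D65 constants from IEC 61966-2-1, not the paper's own software:

```python
def srgb_to_xyz(r, g, b):
    """Convert 8-bit sRGB components to CIEXYZ (D65 white point)."""
    def linearize(c):
        # IEC 61966-2-1 sRGB transfer curve: undo the display gamma.
        c = c / 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Standard sRGB-to-XYZ matrix (rows sum so that white has Y = 1.0).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    return x, y, z
```

In a full ICC workflow this device-to-PCS step is driven by a source profile, and a second PCS-to-device step (via the destination profile) completes the conversion; the sketch shows only the first half for the fixed sRGB case.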

    Distributed multimedia systems

    A distributed multimedia system (DMS) is an integrated communication, computing, and information system that enables the processing, management, delivery, and presentation of synchronized multimedia information with quality-of-service guarantees. Multimedia information may include discrete media data, such as text, data, and images, and continuous media data, such as video and audio. Such a system enhances human communications by exploiting both visual and aural senses and provides the ultimate flexibility in work and entertainment, allowing one to collaborate with remote participants, view movies on demand, access on-line digital libraries from the desktop, and so forth. In this paper, we present a technical survey of a DMS. We give an overview of distributed multimedia systems, examine the fundamental concept of digital media, identify the applications, and survey the important enabling technologies.

    On-line data archives

    ©2001 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE.
Digital libraries and other large archives of electronically retrievable and manipulable material are becoming widespread in both commercial and scientific arenas. Advances in networking technologies have led to a greater proliferation of wide-area distributed data warehousing, with associated data management challenges. We review tools and technologies for supporting distributed on-line data archives and explain our key concept of active data archives, in which data can be processed on demand before delivery. We are developing wide-area data warehousing software infrastructure for geographically distributed archives of large scientific data sets, such as satellite image data, that are stored hierarchically on disk arrays and tape silos and are accessed by a variety of scientific and decision support applications. Interoperability is a major issue for distributed data archives and requires standards for server interfaces and metadata. We review present activities and our contributions in developing such standards for different application areas.
K. Hawick, P. Coddington, H. James, C. Patte

    Promoting good health and welfare in European organic laying hens

    Egg production in line with organic principles includes outdoor access, preferential use of preventative measures and alternative treatment methods, a 100% organic diet from 2012 onwards, and consistent use of non-beak-trimmed birds. This proposal focuses on the main challenges for organic laying-hen farms regarding disease management, adverse animal welfare, and negative impacts on the environment. Parasite infestation levels, as well as the prevalence of major health and welfare problems such as feather pecking, cannibalism, and keel bone and foot lesions, are affected by a combination of housing and management factors, e.g. with respect to feeding, hygiene, genotype, or therapeutic treatments. The design and management of the range influence how well and how evenly it is used by the hens, and the extent to which nutrients accumulate in the surrounding environment. By adopting an epidemiological approach, major risk factors for diseases and for negative welfare and environmental impacts will be identified. 107 flocks distributed over 8 countries will be included in an observational study with a cross-sectional design. Flocks will be visited twice, at specified age periods, during two seasons. Housing, management, and animal-based data will be recorded through interviews, direct measurements, or farm documentation. Recommendations will be formulated based on analyses carried out in four specific work packages. These recommendations will help organic egg producers to further develop bird health and welfare according to organic principles, and to enhance economic competitiveness through improved bird health and performance.

    Economics of Information Processing in Operations Organizations

    This paper studies a fundamental management question: how does information economics affect the organization of management? We view management hierarchies as tree-like structures designed to minimize real and opportunity costs related to information processing and decision making. “Line” production activities stand at the end nodes of a hierarchy tree. Data from these bottom nodes are processed and distributed to higher-level nodes that combine information from the lower nodes. The question we ask is: “how do the real and opportunity costs of information processing affect the tree?” We solve for the optimal tree, which includes the links and the capacity at each of the nodes. Models are formulated on two underlying premises: complexity costs arise from processing different types of data, and queuing effects due to data-arrival and processing uncertainties create delay, which is an opportunity cost.
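The two premises — capacity cost at each node plus queueing delay as an opportunity cost — can be made concrete with a toy cost function. The sketch below is an illustrative simplification, not the paper's model: each internal node is treated as an M/M/1 queue (expected delay 1/(μ − λ)), leaves carry raw data rates, and every node forwards its full stream upward.

```python
def total_cost(node, capacity_price=1.0, delay_price=1.0):
    """Return (throughput, cost) for the subtree rooted at node.

    A node is a dict: leaves carry a raw data rate {"lam": ...};
    internal nodes carry a service rate {"mu": ..., "children": [...]}.
    Cost at an internal node = capacity cost (price * mu) plus the
    opportunity cost of M/M/1 queueing delay, 1 / (mu - lam).
    """
    children = node.get("children", [])
    if not children:
        return node["lam"], 0.0

    lam, cost = 0.0, 0.0
    for child in children:
        c_lam, c_cost = total_cost(child, capacity_price, delay_price)
        lam += c_lam          # arrivals = combined children throughput
        cost += c_cost        # accumulate subtree costs

    mu = node["mu"]
    assert mu > lam, "node is overloaded (queue unstable)"
    cost += capacity_price * mu + delay_price / (mu - lam)
    return lam, cost

# Two leaves each generating 1 unit of data, one root with capacity 3:
tree = {"mu": 3.0, "children": [{"lam": 1.0}, {"lam": 1.0}]}
```

Optimizing the paper's tree then amounts to searching over tree shapes and node capacities to minimize this kind of total; the sketch only shows how a fixed candidate tree would be costed.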

    Distribution Policies for Datalog

    Modern data management systems extensively use parallelism to speed up query processing over massive volumes of data. This trend has inspired a rich line of research on how to formally reason about the parallel complexity of join computation. In this paper, we go beyond joins and study the parallel evaluation of recursive queries. We introduce a novel framework to reason about multi-round evaluation of Datalog programs, which combines implicit predicate restriction with distribution policies to allow expressing a combination of data-parallel and query-parallel evaluation strategies. Using our framework, we reason about key properties of distributed Datalog evaluation, including parallel-correctness of the evaluation strategy, disjointness of the computation effort, and bounds on the number of communication rounds.
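The notions of a distribution policy and parallel-correctness can be illustrated with a simplified, single-round case (an assumed toy setup, not the paper's formal framework): hash-partition facts on the shared join key, compute the join locally at each server, and check that the union of local results equals the global join.

```python
NUM_SERVERS = 3

def policy(key):
    """Distribution policy: route a fact to one server by its join key."""
    return hash(key) % NUM_SERVERS

def distribute(facts, key_index):
    """Partition facts across servers according to the policy."""
    servers = [[] for _ in range(NUM_SERVERS)]
    for fact in facts:
        servers[policy(fact[key_index])].append(fact)
    return servers

def join(r_facts, s_facts):
    """Equi-join R(a, b) with S(b, c) on the shared variable b."""
    return {(a, b, c) for (a, b) in r_facts for (b2, c) in s_facts if b == b2}

R = [(1, "x"), (2, "y"), (3, "x")]
S = [("x", 10), ("y", 20)]

# Hash R on its second column and S on its first (both carry the key b),
# so matching facts always land on the same server.
r_parts = distribute(R, 1)
s_parts = distribute(S, 0)
parallel = set().union(*(join(r, s) for r, s in zip(r_parts, s_parts)))
assert parallel == join(R, S)  # parallel-correct under this policy
```

Because each key value is routed to exactly one server, every output tuple is produced at exactly one site — the disjointness property — while recursive Datalog programs additionally require reasoning about how facts are re-communicated across rounds, which is where the paper's multi-round framework comes in.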

    The organisational effects of installing a distributed processing system

    Bibliography: pages 238-248.
Since its introduction to business in 1952, computerised data processing has undergone a number of substantial changes, in both the hardware and the techniques that are used. The introduction of miniaturisation, and the resultant lowering of the costs of circuitry, has led to the widespread use of mini- and micro-computers. There has also been a large increase in the use of communication facilities. Initially, almost all organisations centralised their computer facilities at the Head Office, and systems were run in batch mode. The need to service the requirements of remote users was met by installing on-line facilities and providing unintelligent terminals to those users. Alternatively, stand-alone computers were installed at the remote locations. However, the requirements of businesses for centralised reporting and control led to the need to install processing units at the user sites and to connect those computers, via communications links, to a computer facility located at the Head Office. In this way distributed data processing evolved. The provision of this type of processing mode has important implications for the organisation in such areas as costs, staffing, planning, control, and systems design. This thesis therefore investigates the current (1980) trends in relation to distributed processing. It specifically examines developments in hardware, software, and data communications. It assesses the criteria that should be considered by an organisation in choosing between centralisation and distribution of its processing facilities. Through a field study, both successful and unsuccessful distributed installations are examined. Conclusions are then drawn and recommendations made, to provide management with working guidelines when assessing the feasibility and practicality of distributed processing for its organisation.
The findings of the study are appropriate both for general management and DP management with only centralised computing experience, and for individuals offering professional computer consultancy services to existing or potential users.