
    JCOMM/IODE Expert Team on Data Management Practices (ETDMP), first session, hosted by Flanders Marine Institute, Oostende, Belgium, 15-18 September 2003

    The 1st Session of the JCOMM/IODE Expert Team on Data Management Practices (ETDMP) was held in Oostende, Belgium, between 15 and 18 September 2003. The ETDMP members discussed the requirements for end-to-end data management (E2EDM); existing and planned data management mechanisms and practices; cooperation with other programmes and expert teams; the strategy for the development of E2EDM; and future cooperation with the Ocean Information Technology (OIT) Pilot Project. The Group agreed on an Action Plan for the intersessional period based on three pilot projects identified by the sessional working groups: metadata management; data assembly, quality control and quality assurance; and the development of an E2EDM Prototype.

    HUDDL for description and archive of hydrographic binary data

    Many of the attempts to introduce a universal hydrographic binary data format have failed or have been only partially successful. In essence, this is because such formats either have to simplify the data to such an extent that they only support the lowest common subset of all the formats covered, or they attempt to be a superset of all formats and quickly become cumbersome. Neither choice works well in practice. This paper presents a different approach: a standardized description of (past, present, and future) data formats using the Hydrographic Universal Data Description Language (HUDDL), a descriptive language implemented using the Extensible Markup Language (XML). That is, XML is used to provide a structural and physical description of a data format, rather than the content of a particular file. Done correctly, this opens the possibility of automatically generating both multi-language data parsers and documentation for format specifications based on their HUDDL descriptions, as well as providing easy version control of both. This solution also provides a powerful approach for archiving a structural description of data along with the data, so that binary data will remain easy to access in the future. Intending to provide a relatively low-effort solution for indexing the wide range of existing formats, we suggest the creation of a catalogue of format descriptions, each of them capturing the logical and physical specifications for a given data format (with its subsequent upgrades). A C/C++ parser code generator is used as an example prototype of one of the possible advantages of adopting such a hydrographic data format catalogue.
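
    As a hedged illustration of the idea rather than the actual HUDDL schema, the minimal sketch below shows how a small XML structural description of a hypothetical binary record could drive an automatically generated reader; the element names, field layout, and type mapping are assumptions made for this example only.

```python
# Minimal sketch of the idea behind HUDDL: a structural description of a
# binary format, kept in XML, drives an automatically generated parser.
# All element names, field names, and types below are hypothetical and do
# not reproduce the actual HUDDL schema.
import struct
import xml.etree.ElementTree as ET

DESCRIPTION = """
<format name="toy_sounding" endian="little">
  <field name="timestamp" type="uint32"/>
  <field name="latitude"  type="double"/>
  <field name="longitude" type="double"/>
  <field name="depth"     type="float"/>
</format>
"""

# Map the declared field types onto struct format codes.
TYPE_CODES = {"uint32": "I", "double": "d", "float": "f"}

def build_reader(xml_text):
    """Generate a record reader from the XML structural description."""
    root = ET.fromstring(xml_text)
    prefix = "<" if root.get("endian") == "little" else ">"
    fields = root.findall("field")
    names = [f.get("name") for f in fields]
    record = struct.Struct(prefix + "".join(TYPE_CODES[f.get("type")] for f in fields))

    def read(buffer, offset=0):
        return dict(zip(names, record.unpack_from(buffer, offset)))

    return read, record.size

if __name__ == "__main__":
    read, size = build_reader(DESCRIPTION)
    sample = struct.pack("<Iddf", 1700000000, 51.23, 2.92, 12.5)
    print(size, read(sample))  # 24 {'timestamp': 1700000000, 'latitude': 51.23, ...}
```

    A real description language would also have to cover variable-length fields, nested blocks, and the documentation generation the paper describes; this sketch only shows the generation of a fixed-layout reader from a declarative description.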

    The Colour of Ocean Data: International Symposium on oceanographic data and information management, with special attention to biological data. Brussels, Belgium, 25-27 November 2002: book of abstracts

    Ocean data management plays a crucial role in global as well as local matters. The Intergovernmental Oceanographic Commission, with its network of National Oceanographic Data Centres, and the International Council of Scientific Unions, with its World Data Centres, have played a major catalysing role in establishing the existing ocean data management practices. No one can think of data management without thinking of information technology. New developments in computer hard- and software force us to continually rethink the way we manage ocean data. One of the major challenges in this is to try to close the gap between the haves and the have-nots, and to assist scientists in less fortunate countries to manage oceanographic data flows in a suitable and timely fashion. So far the major emphasis has been on the standardisation and exchange of physical oceanographic data in open ocean conditions. But the colour of ocean data is changing. The ‘blue’ ocean sciences are increasingly interested in including geological, chemical and biological data. Moreover, the shallow sea areas are getting more and more attention as highly productive biological areas that need to be seen in close association with the deep seas. How to fill the gap in widely accepted standards for data structures that can serve both the deep ‘blue’ and the shallow ‘green’ biological data management is a major issue that has to be addressed. And there is more: data have to be turned into information. In the context of ocean data management, scientists, data managers and decision makers are all very much dependent on each other. Decision makers will stimulate research topics with policy priority and hence guide researchers. Scientists need to provide data managers with reliable, quality-controlled data in such a way that the latter can translate them and make them available for the decision makers. But do they speak the same ‘language’? Are they happy with the access they have to the data? And if not, can they learn from each other’s expectations and experience? The objective of this symposium is to harmonize ocean colours and languages and to create a forum for data managers, scientists and decision makers with a major interest in oceanography, open to everyone interested in ocean data management.

    Dataset metadata


    Data documentation & metadata


    Implementation and Deployment of a Library of the High-level Application Programming Interfaces (SemSorGrid4Env)

    The high-level API service is designed to support rapid development of thin web applications and mashups beyond the state of the art in GIS, while maintaining compatibility with existing tools and expectations. It provides a fully configurable API, while maintaining a separation of concerns between domain experts, service administrators and mashup developers. It adheres to REST and Linked Data principles, and provides a novel bridge between standards-based (OGC O&M) and Semantic Web approaches. This document discusses the background motivations for the HLAPI (including experience gained from previously implemented versions), before moving on to specific details of the final implementation, including configuration and deployment instructions, as well as a full tutorial to assist mashup developers in using the exposed observation data.
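
    As a hedged sketch of how a thin client might consume a REST-style, Linked Data-friendly observation API of this kind, the snippet below issues a plain HTTP GET and walks a JSON response; the endpoint URL, query parameters, and field names are invented for illustration and do not reproduce the deployed SemSorGrid4Env service.

```python
# Hedged sketch of a thin client for a REST-style observation API in the
# spirit of the high-level API described above. The endpoint, parameters,
# and JSON field names are hypothetical assumptions for this example.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://example.org/hlapi/observations"  # hypothetical endpoint

def latest_observations(feature_of_interest, property_name, limit=10):
    """Fetch recent observations for one feature and one observed property."""
    query = urllib.parse.urlencode({
        "feature": feature_of_interest,   # e.g. a station URI
        "property": property_name,        # e.g. "wave_height"
        "limit": limit,
    })
    with urllib.request.urlopen(f"{BASE_URL}?{query}") as response:
        payload = json.load(response)
    # Assume an O&M-like shape: a list of observations with time and result.
    return [(obs["resultTime"], obs["result"]) for obs in payload["observations"]]

if __name__ == "__main__":
    for when, value in latest_observations("http://example.org/id/station/42",
                                           "wave_height"):
        print(when, value)
```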

    Huddl: the Hydrographic Universal Data Description Language

    Since many of the attempts to introduce a universal hydrographic data format have failed or have been only partially successful, a different approach is proposed. Our solution is the Hydrographic Universal Data Description Language (HUDDL), a descriptive XML-based language that permits the creation of a standardized description of (past, present, and future) data formats, and allows for applications like HUDDLER, a compiler that automatically creates drivers for data access and manipulation. HUDDL also represents a powerful solution for archiving data along with their structural description, as well as for cataloguing existing format specifications and their version control. HUDDL is intended to be an open, community-led initiative to simplify the issues involved in hydrographic data access.
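
    To illustrate the archiving idea mentioned above (keeping the structural description next to the binary payload so the data remain readable later), here is a minimal sketch that bundles the two files with a small manifest; the manifest layout and file names are assumptions for this example and are not part of the HUDDL specification.

```python
# Hedged sketch of archiving a binary dataset together with its structural
# description. The manifest layout is an illustrative assumption, not part
# of HUDDL or HUDDLER.
import hashlib
import json
from pathlib import Path

def archive(dataset: Path, description: Path, out_dir: Path) -> Path:
    """Copy the data and its format description into out_dir with a manifest."""
    out_dir.mkdir(parents=True, exist_ok=True)
    manifest = {}
    for role, src in (("data", dataset), ("format-description", description)):
        payload = src.read_bytes()
        (out_dir / src.name).write_bytes(payload)
        manifest[role] = {
            "file": src.name,
            "sha256": hashlib.sha256(payload).hexdigest(),
        }
    manifest_path = out_dir / "manifest.json"
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest_path
```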

    Proceedings of the 2004 ONR Decision-Support Workshop Series: Interoperability

    In August of 1998 the Collaborative Agent Design Research Center (CADRC) of the California Polytechnic State University in San Luis Obispo (Cal Poly) approached Dr. Phillip Abraham of the Office of Naval Research (ONR) with the proposal for an annual workshop focusing on emerging concepts in decision-support systems for military applications. The proposal was considered timely by the ONR Logistics Program Office for at least two reasons. First, rapid advances in information systems technology over the past decade had produced distributed collaborative computer-assistance capabilities with profound potential for providing meaningful support to military decision makers. Indeed, some systems based on these new capabilities, such as the Integrated Marine Multi-Agent Command and Control System (IMMACCS) and the Integrated Computerized Deployment System (ICODES), had already reached the field-testing and final product stages, respectively. Second, over the past two decades the US Navy and Marine Corps had been increasingly challenged by missions demanding the rapid deployment of forces into hostile or devastated territories with minimum or non-existent indigenous support capabilities. Under these conditions Marine Corps forces had to rely mostly, if not entirely, on sea-based support and sustainment operations. Particularly today, operational strategies such as Operational Maneuver From The Sea (OMFTS) and Sea To Objective Maneuver (STOM) are very much in need of intelligent, near real-time and adaptive decision-support tools to assist military commanders and their staff under conditions of rapid change and overwhelming data loads. In the light of these developments the Logistics Program Office of ONR considered it timely to provide an annual forum for the interchange of ideas, needs and concepts that would address the decision-support requirements and opportunities in combined Navy and Marine Corps sea-based warfare and humanitarian relief operations. The first ONR Workshop was held April 20-22, 1999 at the Embassy Suites Hotel in San Luis Obispo, California. It focused on advances in technology, with particular emphasis on an emerging family of powerful computer-based tools, and concluded that the most able members of this family of tools appear to be computer-based agents that are capable of communicating within a virtual environment of the real world. From 2001 onward the venue of the Workshop moved from the West Coast to Washington, and in 2003 the sponsorship was taken over by ONR’s Littoral Combat/Power Projection (FNC) Program Office (Program Manager: Mr. Barry Blumenthal). Themes and keynote speakers of past Workshops have included: 1999: ‘Collaborative Decision Making Tools’, with Vadm Jerry Tuttle (USN Ret.), LtGen Paul Van Riper (USMC Ret.), Radm Leland Kollmorgen (USN Ret.) and Dr. Gary Klein (Klein Associates); 2000: ‘The Human-Computer Partnership in Decision-Support’, with Dr. Ronald DeMarco (Associate Technical Director, ONR), Radm Charles Munns, Col Robert Schmidle and Col Ray Cole (USMC Ret.); 2001: ‘Continuing the Revolution in Military Affairs’, with Mr. Andrew Marshall (Director, Office of Net Assessment, OSD) and Radm Jay M. Cohen (Chief of Naval Research, ONR); 2002: ‘Transformation ...’, with Vadm Jerry Tuttle (USN Ret.) and Steve Cooper (CIO, Office of Homeland Security); 2003: ‘Developing the New Infostructure’, with Richard P. Lee (Assistant Deputy Under Secretary, OSD) and Michael O’Neil (Boeing); 2004: ‘Interoperability’, with MajGen Bradley M. Lott (USMC), Deputy Commanding General, Marine Corps Combat Development Command, and Donald Diggs, Director, C2 Policy, OASD (NII)

    Service Based Marketplace for Applications

    The Grid has revolutionized the way computations are done on the Internet. Access to remote computational resources and the ad hoc creation of virtual organizations across administrative domains open new opportunities on the Grid. The newly developed web services based Open Grid Services Architecture makes the Grid more accessible by allowing it to be constructed from distinct, platform-independent components. Together they provide an environment for application sharing (or trading), collaboration and access to remote data repositories. The application marketplace is a natural extension of this application sharing environment. The marketplace addresses the fact that the existing infrastructure is still incomplete without provisions for publishing and discovering applications and resources, including the application descriptors that must be moved between the market participants. This work demonstrates a web service instance-based infrastructure, the application market, which allows the sellers, i.e. the application and CPU providers, to publish their applications for the users to find and use. The application market uses a portal architecture built on top of Globus Toolkit 3.0 that interacts with the providers and the users. The market services provide distinct interfaces that allow providers to advertise applications and users to select, configure, and run these applications. The applications themselves are modeled as stateful objects represented using XML, which can be exchanged between the providers and users when required. The marketplace, through its interfaces, effectively hides the compute resource and application complexity, thus allowing end users to explore and use unfamiliar applications with ease.
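
    As a hedged sketch of the kind of XML application descriptor such a marketplace might exchange between providers and users, the snippet below builds a descriptor on the provider side and reads it back on the user side; the element names, fields, and helper functions are invented for illustration and are not the actual market service schema or a Globus Toolkit 3.0 API.

```python
# Hedged sketch of an application descriptor exchanged as XML between market
# participants, in the spirit of the marketplace described above. Element
# names and helpers are illustrative assumptions only.
import xml.etree.ElementTree as ET

def make_descriptor(name, version, provider, entry_point, parameters):
    """Provider side: describe an application offered on the marketplace."""
    app = ET.Element("application", {"name": name, "version": version})
    ET.SubElement(app, "provider").text = provider
    ET.SubElement(app, "entryPoint").text = entry_point
    params = ET.SubElement(app, "parameters")
    for pname, ptype in parameters.items():
        ET.SubElement(params, "parameter", {"name": pname, "type": ptype})
    return ET.tostring(app, encoding="unicode")

def read_descriptor(xml_text):
    """User side: recover enough information to configure and run the app."""
    app = ET.fromstring(xml_text)
    return {
        "name": app.get("name"),
        "version": app.get("version"),
        "provider": app.findtext("provider"),
        "entry_point": app.findtext("entryPoint"),
        "parameters": {p.get("name"): p.get("type")
                       for p in app.find("parameters")},
    }

if __name__ == "__main__":
    xml_text = make_descriptor("wave-model", "1.2", "example-compute-provider",
                               "/opt/apps/wave-model/run.sh",
                               {"grid_resolution": "float", "duration_hours": "int"})
    print(read_descriptor(xml_text))
```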