9,391 research outputs found

    Mediaflux: a data management platform for collaborative research

    No full text
    Mediaflux™, from Arcitecta, enables the discovery, sharing and collaborative evolution of knowledge. It provides flexible capture and storage of arbitrary scientific and research data. Jason will look at how e-Science customers, such as the Howard Florey Institute for Neuroscience and the Australian Animal Health Laboratories at the CSIRO, utilize Mediaflux™ for collaboration and research. He will conclude with a discussion of the interoperability issues for distributed research involving heterogeneous data repositories. Mediaflux™ is a digital asset management and application platform that can be rapidly applied to problem domains requiring capture, characterization and dissemination of digital information and data. It supports flexible XML metadata and metadata evolution, transcoding, workflow, replication, federated repositories and includes a high performance hybrid object-XML database and an embedded DICOM server. Mediaflux™ has been commercially deployed since 2003.

    Data DNA: The Next Generation of Statistical Metadata

    Get PDF
    Describes the components of a complete statistical metadata system and suggests ways to create and structure metadata for better access and understanding of data sets by diverse users
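    A minimal sketch in Python of the kind of per-variable record such a statistical metadata system might manage; the field names (label, value codes, missing-value codes, universe, source question) are illustrative assumptions, not the paper's actual schema.

        from dataclasses import dataclass, field

        @dataclass
        class VariableMetadata:
            """Illustrative record for one variable in a statistical data set.

            The field names are assumptions for this sketch, not a published schema.
            """
            name: str                       # machine-readable variable name
            label: str                      # human-readable description
            data_type: str                  # e.g. "integer", "decimal", "code"
            value_labels: dict = field(default_factory=dict)    # code -> meaning
            missing_values: list = field(default_factory=list)  # sentinel codes
            universe: str = ""              # population the variable describes
            source_question: str = ""       # survey question or derivation rule

        income = VariableMetadata(
            name="HHINC",
            label="Total household income, previous calendar year",
            data_type="integer",
            missing_values=[-9, -8],
            universe="All responding households",
        )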

    Pathways: Augmenting interoperability across scholarly repositories

    Full text link
    In the emerging eScience environment, repositories of papers, datasets, software, etc., should be the foundation of a global and natively-digital scholarly communications system. The current infrastructure falls far short of this goal. Cross-repository interoperability must be augmented to support the many workflows and value-chains involved in scholarly communication. This will not be achieved through the promotion of a single repository architecture or content representation, but instead requires an interoperability framework to connect the many heterogeneous systems that will exist. We present a simple data model and service architecture that augments repository interoperability to enable scholarly value-chains to be implemented. We describe an experiment that demonstrates how the proposed infrastructure can be deployed to implement the workflow involved in the creation of an overlay journal over several different repository systems (Fedora, aDORe, DSpace and arXiv). Comment: 18 pages. Accepted for the International Journal on Digital Libraries special issue on Digital Libraries and eScience.
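    As a rough illustration of what a repository-neutral data model can look like, the Python sketch below assumes a simplified entity with an identifier, typed datastreams and lineage links; it is a plausible reading of the abstract, not the authors' exact specification.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Datastream:
            media_type: str    # e.g. "application/pdf"
            location: str      # URI where the bytes can be obtained

        @dataclass
        class Entity:
            """Repository-neutral surrogate for a digital object (assumed fields)."""
            identifier: str                  # globally unique ID, e.g. a DOI or info: URI
            repository: str                  # originating system: Fedora, aDORe, DSpace, arXiv, ...
            semantic_type: str               # e.g. "preprint", "dataset", "software"
            datastreams: List[Datastream] = field(default_factory=list)
            lineage: List[str] = field(default_factory=list)  # identifiers of source entities

        def build_overlay_issue(selected: List[Entity]) -> List[str]:
            """Toy overlay-journal step: an issue is an ordered list of entity
            identifiers drawn from heterogeneous repositories."""
            return [e.identifier for e in selected]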

    Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Get PDF
    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges--management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu
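    The workflow idea can be illustrated with a toy directed graph of processing steps executed in dependency order; the step names and the plain-Python scheduler below are assumptions for illustration, not the LONI Pipeline's actual interfaces.

        from collections import deque

        # Hypothetical protocol: each step maps to the steps it depends on.
        workflow = {
            "convert_dicom":     [],
            "skull_strip":       ["convert_dicom"],
            "register_to_atlas": ["skull_strip"],
            "segment_tissue":    ["register_to_atlas"],
            "compute_volumes":   ["segment_tissue"],
        }

        def execution_order(graph):
            """Order steps so each runs after its dependencies (Kahn's algorithm)."""
            indegree = {step: len(deps) for step, deps in graph.items()}
            dependents = {step: [] for step in graph}
            for step, deps in graph.items():
                for dep in deps:
                    dependents[dep].append(step)
            ready = deque(step for step, n in indegree.items() if n == 0)
            order = []
            while ready:
                step = ready.popleft()
                order.append(step)
                for nxt in dependents[step]:
                    indegree[nxt] -= 1
                    if indegree[nxt] == 0:
                        ready.append(nxt)
            return order

        print(execution_order(workflow))
        # ['convert_dicom', 'skull_strip', 'register_to_atlas', 'segment_tissue', 'compute_volumes']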

    An Architecture for Information Commerce Systems

    Get PDF
    The increasing use of the Internet in business and commerce has created a number of new business opportunities and the need for supporting models and platforms. One of these opportunities is information commerce (i-commerce), a special case of e-commerce focused on the purchase and sale of information as a commodity. In this paper we present an architecture for i-commerce systems, using OPELIX (Open Personalized Electronic Information Commerce System) [11] as an example. OPELIX provides an open information commerce platform that enables enterprises to produce, sell, deliver, and manage information products and related services over the Internet. We focus on the notion of an information marketplace, a virtual location that enables i-commerce, describe the business and domain model for an information marketplace, and discuss the role of intermediaries in this environment. The domain model is used as the basis for the software architecture of the OPELIX system. We discuss the characteristics of the OPELIX architecture and compare our approach to related work in the field.
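    A bare-bones Python sketch of the marketplace and intermediary roles described above; the class and method names are assumptions for illustration, not the OPELIX API.

        from dataclasses import dataclass

        @dataclass
        class InformationProduct:
            product_id: str
            provider: str
            price: float
            description: str

        class Marketplace:
            """Virtual location where information products are offered and sold."""
            def __init__(self):
                self.catalogue = {}

            def list_product(self, product: InformationProduct):
                self.catalogue[product.product_id] = product

            def purchase(self, product_id: str, customer: str) -> str:
                product = self.catalogue[product_id]
                # Payment and delivery handling are out of scope for this sketch.
                return f"{customer} bought '{product.description}' from {product.provider}"

        class Intermediary:
            """Adds value (e.g. repackaging or markup) between provider and customer."""
            def resell(self, product: InformationProduct, markup: float) -> InformationProduct:
                return InformationProduct(
                    product_id=product.product_id + "-resold",
                    provider="intermediary",
                    price=product.price * (1 + markup),
                    description=product.description,
                )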

    The multi-faceted use of the OAI-PMH in the LANL Repository

    Get PDF
    This paper focuses on the multifaceted use of the OAI-PMH in a repository architecture designed to store digital assets at the Research Library of the Los Alamos National Laboratory (LANL), and to make the stored assets available in a uniform way to various downstream applications. In the architecture, the MPEG-21 Digital Item Declaration Language is used as the XML-based format to represent complex digital objects. Upon ingestion, these objects are stored in a multitude of autonomous OAI-PMH repositories. An OAI-PMH compliant Repository Index keeps track of the creation and location of all those repositories, whereas an Identifier Resolver keeps track of the location of individual objects. An OAI-PMH Federator is introduced as a single point of access for downstream harvesters. It hides the complexity of the environment from those harvesters, and allows them to obtain transformations of stored objects. While the proposed architecture is described in the context of the LANL library, the paper also touches on its more general applicability.
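    For readers unfamiliar with the protocol layer, an OAI-PMH harvester is essentially an HTTP client that follows resumption tokens; in the Python sketch below the endpoint URL and the didl metadata prefix are assumptions, while the verbs and parameters are standard OAI-PMH.

        from urllib.parse import urlencode
        from urllib.request import urlopen
        import xml.etree.ElementTree as ET

        BASE_URL = "https://repository.example.org/oai"        # hypothetical endpoint
        OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"       # standard OAI-PMH namespace

        def list_records(metadata_prefix="oai_dc", resumption_token=None):
            """Issue one ListRecords request; return (records, next resumptionToken)."""
            params = ({"verb": "ListRecords", "resumptionToken": resumption_token}
                      if resumption_token
                      else {"verb": "ListRecords", "metadataPrefix": metadata_prefix})
            with urlopen(BASE_URL + "?" + urlencode(params)) as response:
                tree = ET.parse(response)
            records = tree.findall(f".//{OAI_NS}record")
            token = tree.find(f".//{OAI_NS}resumptionToken")
            return records, (token.text if token is not None and token.text else None)

        # Harvest everything, following resumption tokens as the protocol requires;
        # the 'didl' prefix assumes the repository exposes MPEG-21 DIDL packaging.
        records, token = list_records(metadata_prefix="didl")
        while token:
            more, token = list_records(resumption_token=token)
            records.extend(more)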

    THE OPTIMIZATION OF THE INTERNAL AND EXTERNAL REPORTING IN FINANCIAL ACCOUNTING: ADOPTING XBRL INTERNATIONAL STANDARD

    Get PDF
    More and more enterprises, especially listed companies, have adopted new accounting norms and regulations (IFRS or US GAAP, Basel II and, in perspective, SURFI), showing interest in publishing financial reports in a standard format able to considerably improve their communication, data collection in the receiving units, and the control and analysis of financial information. When switching to the new accounting rules specified in international or regional standards and norms, regulatory and control bodies recommend the XBRL format for financial reporting, with recognition of the regional jurisdiction. Our paper reviews the literature, presents the XBRL-specific elements and proposes possible solutions for the internal and external financial reporting of an enterprise. Finally, it concludes on the benefits of adopting XBRL at the national level in a potential XBRL Romania project.
    Keywords: accounting norms, financial reporting, XBRL, taxonomy, XBRL jurisdiction.
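    To make the format concrete, the Python sketch below assembles a minimal XBRL-style instance with one reporting context, one monetary unit and one tagged fact; the ro taxonomy namespace, element name and identifier scheme are hypothetical placeholders, not an actual Romanian taxonomy.

        import xml.etree.ElementTree as ET

        # The xbrli and iso4217 namespaces are the standard XBRL 2.1 ones;
        # the 'ro' taxonomy namespace is a made-up placeholder.
        NS = {
            "xbrli": "http://www.xbrl.org/2003/instance",
            "iso4217": "http://www.xbrl.org/2003/iso4217",
            "ro": "http://example.org/taxonomy/ro-gaap",
        }
        for prefix, uri in NS.items():
            ET.register_namespace(prefix, uri)

        root = ET.Element(f"{{{NS['xbrli']}}}xbrl")

        # Reporting context: which entity, over which period.
        context = ET.SubElement(root, f"{{{NS['xbrli']}}}context", id="FY2023")
        entity = ET.SubElement(context, f"{{{NS['xbrli']}}}entity")
        ET.SubElement(entity, f"{{{NS['xbrli']}}}identifier",
                      scheme="http://example.org/cui").text = "RO1234567"
        period = ET.SubElement(context, f"{{{NS['xbrli']}}}period")
        ET.SubElement(period, f"{{{NS['xbrli']}}}startDate").text = "2023-01-01"
        ET.SubElement(period, f"{{{NS['xbrli']}}}endDate").text = "2023-12-31"

        # Monetary unit expressed as an ISO 4217 currency code.
        unit = ET.SubElement(root, f"{{{NS['xbrli']}}}unit", id="RON")
        ET.SubElement(unit, f"{{{NS['xbrli']}}}measure").text = "iso4217:RON"

        # One reported fact, tagged with a (hypothetical) taxonomy element.
        fact = ET.SubElement(root, f"{{{NS['ro']}}}Revenue",
                             contextRef="FY2023", unitRef="RON", decimals="0")
        fact.text = "1500000"

        print(ET.tostring(root, encoding="unicode"))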