Extraction and parsing of herbarium specimen data: Exploring the use of the Dublin core application profile framework
Herbaria around the world house millions of plant specimens; botanists and other researchers value these resources as ingredients in biodiversity research. Even when the specimen sheets are digitized and made available online, the critical information stored on the sheet about the specimen is not in a usable (i.e., machine-processable) form. This paper describes a current research and development project that is designing and testing high-throughput workflows combining machine and human processes to extract and parse the specimen label data. The primary focus of the paper is the metadata needs for the workflow and the creation of structured metadata records describing the plant specimens. In the project, we are exploring the use of the new Dublin Core Metadata Initiative framework for application profiles. First articulated as the Singapore Framework for Dublin Core Application Profiles in 2007, this framework is still in its infancy. Its promise of maximum interoperability, of documenting metadata use for maximum reusability, and of supporting metadata applications that conform to Web architectural principles provides the incentive to explore the framework and to add implementation experience with it.
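The idea of an application profile, as used above, can be sketched as a small set of required metadata terms plus a validation step. The following is a minimal illustration only; the term choices and field values are hypothetical, not the project's actual profile.

```python
# Sketch: a structured record for a specimen label, using Dublin Core
# terms. REQUIRED_TERMS stands in for a Description Set Profile; the
# specific terms chosen here are illustrative assumptions.

REQUIRED_TERMS = {"dc:title", "dc:creator", "dc:date", "dc:coverage"}

def validate_record(record):
    """Return the set of required terms missing from a record."""
    return REQUIRED_TERMS - record.keys()

specimen = {
    "dc:title": "Quercus alba L.",           # scientific name from the label
    "dc:creator": "J. Smith",                # collector
    "dc:date": "1932-06-14",                 # collection date
    "dc:coverage": "Franklin County, Ohio",  # collection locality
}

missing = validate_record(specimen)
print(sorted(missing))  # an empty list means the record satisfies the profile
```

A real profile would also constrain value syntax (e.g., dates) and vocabulary, but the required-terms check captures the basic conformance idea.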
Transformation Report: The missing Standard for Data Exchange
Data exchange with STEP (ISO 10303) is state of the art, but it remains a fundamental problem to guarantee a given quality of service to integrated operational and informational applications. STEP defines descriptive methods, data specifications, implementation resources, and conformance testing, but nothing documents how the data is processed: a success report on the data mapped from the source to the target tool is missing. In this paper we introduce a Transformation Report for documenting the data transformation from the source to the target tool. With this report, the trustworthiness of the received data can be significantly improved by documenting data loss and semantic and syntactic errors. With the information in the report it should be possible to infer the proper values for rules that fix the data after it has been determined to be incorrect, or to find a suitable data integration strategy for a target tool or repository. The intention of the paper is to suggest a standardised Transformation Report that can be automatically processed and that contains all the information needed for an automated reconciliation process.
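A machine-processable report of the kind proposed above might look like the sketch below. The field names and JSON serialization are assumptions for illustration, not part of the suggested standard.

```python
# Sketch of a machine-processable Transformation Report: it records what
# was read and written, plus syntactic and semantic errors, so a target
# tool can reconcile the data automatically. Field names are illustrative.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class TransformationReport:
    source_tool: str
    target_tool: str
    entities_read: int
    entities_written: int
    syntactic_errors: list = field(default_factory=list)
    semantic_errors: list = field(default_factory=list)

    @property
    def data_loss(self):
        # entities present in the source but absent from the target
        return self.entities_read - self.entities_written

    def to_json(self):
        d = asdict(self)
        d["data_loss"] = self.data_loss
        return json.dumps(d, indent=2)

report = TransformationReport("CAD-A", "CAD-B", 120, 118,
                              semantic_errors=["unit mismatch on length"])
print(report.to_json())
```

Serializing to a structured format such as JSON is what makes the report automatically processable, as the paper's standardisation goal requires.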
The Volcker Rule: A Legal Analysis
This report provides an introduction to the Volcker Rule, which is the regulatory regime imposed upon banking institutions and their affiliates under Section 619 of the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 (P.L. 111-203). The Volcker Rule is designed to prohibit "banking entities" from engaging in all forms of "proprietary trading" (i.e., making investments for their own "trading accounts"), activities that former Federal Reserve Chairman Paul A. Volcker often condemned as contrary to conventional banking practices and a potential risk to financial stability. The statutory language provides only general outlines of prohibited activities and exceptions. Through it, however, Congress has empowered five federal financial regulators with authority to conduct coordinated rulemakings to fill in the details and complete the difficult task of crafting regulations to identify prohibited activities, while continuing to permit activities considered essential to the safety and soundness of banking institutions or to the maintenance of strong capital markets. In December 2014, more than two years after enactment of the law, coordinated implementing regulations were issued by the Office of the Comptroller of the Currency (OCC), the Federal Deposit Insurance Corporation (FDIC), the Board of Governors of the Federal Reserve System (FRB), the Securities and Exchange Commission (SEC), and the Commodity Futures Trading Commission (CFTC).
The Rule is premised on a two-pronged central core restricting activities by "banking entities", a term that includes all FDIC-insured bank and thrift institutions; all bank, thrift, or financial holding companies; all foreign banking operations with certain types of presence in the United States; and all affiliates and subsidiaries of any of these entities. Specifically, the Rule broadly prohibits banking entities from engaging in "proprietary trading" and from making investments in or having relationships with hedge and similar "covered funds" that are exempt from registering with the CFTC as commodity pool operators or with the SEC under the Investment Advisers Act. The Rule couples its broad prohibitions with numerous exclusions and by designating myriad activities as permissible so long as various terms and conditions are met, unless they otherwise would involve or result in a material conflict of interest or a material exposure to high-risk assets or high-risk trading strategies, or would pose a threat to the safety and soundness of the banking entity or to the financial stability of the United States.
The exceptions to the ban on proprietary trading include underwriting by securities underwriters; market-making "designed not to exceed the reasonably expected near term demands of clients"; trading in government securities; fiduciary activities; insurance company portfolio investments; and risk-mitigating hedging activities. The ban on investing in and owning "covered funds" exempts certain types of funds, under specified conditions, and permits de minimis investment in any such fund up to 3% of the outstanding ownership interests of the fund, with an aggregate cap on the total ownership interest in "covered funds" of 3% of the banking entity's core capital.
To prevent evasion, the Rule has extensive requirements mandating comprehensive compliance programs that include ongoing management involvement, precise metrics measuring risk assessment, verification and documentation of any activities conducted under one of the Rule's exceptions or exclusions, and recurring reports and assessments. Full compliance is required by July 21, 2015, subject to the possibility that further extensions may be provided by the regulators. In the case of investments involving "illiquid funds" subject to contractual provisions seriously impacting their marketability or sale, full divestiture might not be required until July 21, 2022.
Improving NRM Investment through a policy performance lens
Choosing a mechanism to encourage landholders to change their land management in order to deliver environmental outcomes is a complicated process. Careful instrument selection may count for little if uptake and adoption are insufficient to meet performance targets. Similarly, investors may require assurance that the proposed investment will deliver the stated goals. In order to reduce the uptake uncertainty facing policy makers, we evaluate and describe several possible methods to guide and frame adoption targets. We conclude that referring to past adoption experience with a wide range of mechanisms offers the best approach to setting feasible adoption targets for future mechanisms; we call this approach adoption points of reference. The approach is tested by application to mechanisms focused on delivering water quality improvements in Great Barrier Reef (GBR) catchments. We conclude that the points-of-reference approach is appropriate and useful, but should be supported by processes designed to incorporate the impact of heterogeneity and local knowledge and by an emphasis on improving the accuracy of future data.
Keywords: adoption targets, NRM investment, reasonable assurance, water quality
Documenting numerical experiments in support of the Coupled Model Intercomparison Project Phase 6 (CMIP6)
Numerical simulation, and in particular simulation of the earth system, relies on contributions from diverse communities, from those who develop models to those involved in devising, executing, and analysing numerical experiments. Often these people work in different institutions and may be working with significant separation in time (particularly analysts, who may be working on data produced years earlier), and they typically communicate via published information (whether journal papers, technical notes, or websites). The complexity of the models, experiments, and methodologies, along with the diversity (and sometimes inexact nature) of information sources, can easily lead to misinterpretation of what was actually intended or done. In this paper we introduce a taxonomy of terms for more clearly defining numerical experiments, put it in the context of previous work on experimental ontologies, and describe how we have used it to document the experiments of the sixth phase of the Coupled Model Intercomparison Project (CMIP6). We describe how, through iteration with a range of CMIP6 stakeholders, we rationalized multiple sources of information and improved the clarity of experimental definitions. We demonstrate how this process has added value to CMIP6 itself by (a) helping those devising experiments to be clear about their goals and their implementation, (b) making it easier for those executing experiments to know what is intended, (c) exposing interrelationships between experiments, and (d) making it clearer for third parties (data users) to understand the CMIP6 experiments. We conclude with some lessons learnt and how these may be applied to future CMIP phases as well as other modelling campaigns.
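The goals of an experiment taxonomy, in particular exposing interrelationships between experiments, can be illustrated with a small data structure. The class and field names below are assumptions for illustration, not the CMIP6 vocabulary itself.

```python
# Sketch: experiment definitions with explicit forcing constraints and a
# parent relationship (e.g. the control run an experiment branches from).
# Constraints inherited along the parent chain are made explicit, which
# is one way misinterpretation between communities can be reduced.
from dataclasses import dataclass, field

@dataclass
class Experiment:
    name: str
    description: str
    forcing_constraints: list = field(default_factory=list)
    parent: "Experiment" = None  # experiment this one branches from, if any

    def all_constraints(self):
        """Collect constraints inherited along the parent chain."""
        inherited = self.parent.all_constraints() if self.parent else []
        return inherited + self.forcing_constraints

picontrol = Experiment("piControl", "pre-industrial control",
                       ["1850 forcings held fixed"])
historical = Experiment("historical", "1850-2014 simulation",
                        ["observed forcings"], parent=picontrol)
print(historical.all_constraints())
```

Walking the parent chain is what makes the interrelationships queryable rather than implicit in prose documentation.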
Software Testing and Documenting Automation
This article describes some approaches to the problem of automating testing and documentation in information systems with a graphical user interface. A combination of data mining methods and the theory of finite state machines is used for test automation. Automated creation of software documentation is based on metadata in the documented system; the metadata is built on a graph model. The described approaches improve the performance and quality of the testing and documentation processes.
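The finite-state-machine approach to GUI test automation can be sketched as follows. The states, events, and transition table are illustrative assumptions; the article does not specify a concrete model.

```python
# Sketch: a GUI modelled as a finite state machine whose transition table
# can be walked automatically to generate and check test sequences.

TRANSITIONS = {
    ("login", "submit_valid"): "dashboard",
    ("login", "submit_invalid"): "login",
    ("dashboard", "open_settings"): "settings",
    ("settings", "close"): "dashboard",
}

def run_sequence(start, events):
    """Drive the model through a list of events, failing on undefined ones."""
    state = start
    for event in events:
        key = (state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"no transition for {event!r} in state {state!r}")
        state = TRANSITIONS[key]
    return state

# A generated test case: log in, visit settings, return to the dashboard.
print(run_sequence("login", ["submit_valid", "open_settings", "close"]))
```

A test generator would enumerate paths through the same table; an event undefined in the current state signals either a test-model gap or a GUI defect.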