
    F1000 recommendations as a new data source for research evaluation: A comparison with citations

    F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective it could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications receive on average 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for the differences between recommendations and citations in assessing the impact of publications.
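    As a rough illustration of the kind of comparison described above, the following Python sketch computes a rank correlation between per-publication recommendation counts and citation counts. The data are invented for the example; this is not the paper's dataset or code.

```python
# Hypothetical illustration: rank correlation between F1000
# recommendations and citations for a small made-up sample.
# (Not the paper's data; all numbers are invented.)
from scipy.stats import spearmanr

# Each tuple: (F1000 recommendations, Web of Science citations)
publications = [
    (0, 12), (1, 45), (0, 3), (2, 80), (0, 7),
    (1, 20), (3, 150), (0, 0), (1, 9), (0, 30),
]

recs = [p[0] for p in publications]
cites = [p[1] for p in publications]

rho, p_value = spearmanr(recs, cites)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```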

    Querying Schemas With Access Restrictions

    We study verification of systems whose transitions consist of accesses to a Web-based data source. An access is a lookup on a relation within a relational database, fixing values for a set of positions in the relation. For example, a transition can represent access to a Web form, where the user is restricted to filling in values for a particular set of fields. We look at verifying properties of a schema describing the possible accesses of such a system. We present a language in which one can describe the properties of an access path and also specify additional restrictions on accesses that are enforced by the schema. Our main property language, AccLTL, is based on a first-order extension of linear-time temporal logic, interpreting access paths as sequences of relational structures. We also present a lower-level automaton model, A-automata, into which AccLTL specifications can be compiled. We show that AccLTL and A-automata can express static analysis problems related to "querying with limited access patterns" that have been studied in the database literature in the past, such as whether an access is relevant to answering a query, and whether two queries are equivalent in the accessible data they can return. We prove decidability and complexity results for several restrictions and variants of AccLTL, and explain which properties of paths can be expressed in each restriction.
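    As a loose illustration of the access-path idea (not the paper's formalism), the sketch below models an access as a lookup that fixes values for some positions of a relation and checks a simple LTL-style "eventually" property over a finite path. All relation names and values are hypothetical.

```python
# Toy model of an access path: each access fixes values for some
# positions of a relation and records the tuples the lookup returned.
# The check below is a simple LTL "F predicate" over a finite path.
from dataclasses import dataclass

@dataclass
class Access:
    relation: str    # relation being looked up
    bindings: dict   # position -> value fixed by the access
    results: list    # tuples returned by the lookup

def eventually(path, predicate):
    """LTL 'eventually' over a finite access path."""
    return any(predicate(a) for a in path)

# Hypothetical path: two web-form lookups on invented relations.
path = [
    Access("Flights", {0: "OSL"}, [("OSL", "AMS"), ("OSL", "CDG")]),
    Access("Hotels", {1: "AMS"}, [("H1", "AMS")]),
]

# "Some access on Flights binds position 0" -- a toy AccLTL-style property.
print(eventually(path, lambda a: a.relation == "Flights" and 0 in a.bindings))
```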

    Probabilistic methods for wind turbine blades

    A key purpose of the European Energy Research Alliance (EERA) is to elevate cooperation between national research institutes to a new level: from ad-hoc participation in joint projects to collectively planning and implementing joint strategic research programmes. The RES directive and the SET Plan require a high rate of deployment of wind energy, onshore and offshore, in the European Union's member states, which poses a major challenge for research in the two priority areas: Integration and Offshore. Wind energy was therefore identified at an early stage as an area for a joint research programme in which the key players are the national wind energy research institutes, while universities are encouraged to participate in the activities.

    Samplers and Extractors for Unbounded Functions

    Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f from {0,1}^m to the real numbers such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error epsilon and failure probability delta are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
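    The sketch below illustrates only the averaging-sampler guarantee that the paper strengthens: estimate E[f(U_m)] by averaging f over t sample points. It draws the points i.i.d., which spends far more random bits than the explicit constructions in the paper; it is a definitional toy, not the construction itself.

```python
# Minimal sketch of the *averaging sampler* guarantee: approximate
# E[f(U_m)] by averaging f over t sample points. Drawing the points
# i.i.d. (as here) is the naive, randomness-expensive baseline.
import random

def naive_averaging_sampler(f, m, t, seed):
    """Return (1/t) * sum f(z_i) for t random m-bit strings z_i."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(t):
        z = rng.getrandbits(m)  # one m-bit sample point
        total += f(z)
    return total / t

# Example light-tailed f: number of set bits, centered to mean 0.
m = 32
f = lambda z: bin(z).count("1") - m / 2  # mean 0 under U_m

estimate = naive_averaging_sampler(f, m, t=10_000, seed=0)
print(f"estimated mean = {estimate:.3f} (true mean = 0)")
```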

    Reliability-based assessment procedures for existing concrete structures

    A feasibility study of reliability theory as a tool for assessing the present safety and residual service life of damaged concrete structures has been performed, with the aim of finding a transparent methodology for the assessment procedure. It is concluded that the current guidelines are open to interpretation and that the variation in the results obtained regarding structural safety is too great to be acceptable. Interpretations by the engineer are also involved when deterministic methods are used, but probabilistic methods are more sensitive to the assumptions made, and the differences in the results will therefore be greater. From a literature survey it is concluded that residual service life predictions should not be expected to be valid for more than 10 to 15 years, due to the large variability of the variables involved in the analysis. Based on these conclusions, predictive models that are suitable for the inclusion of new data, and methods for incorporating new data, are proposed. Experience from the fields of medical statistics and robotics suggests that linear regression models are well suited for this type of continuously updated monitoring. Two test cases were studied: a concrete dam and a railway bridge. From the dam case, it was concluded that the safety philosophy in the deterministic dam-specific assessment guidelines needs further development. Probabilistic descriptions of important variables, such as ice loads and friction coefficients, are needed if reliability theory is to be used for assessment purposes. During the study of the railway bridge it became clear that model uncertainties for the different failure mechanisms used in concrete design are lacking. If Bayesian updating is to be used as a tool for incorporating test data on concrete strength into the reliability analysis, a priori information must be established. A need for a probabilistic description of the hardening process of concrete was identified for the purpose of establishing this a priori information. This description can also be used for a qualitative assessment of the concrete: if there is a large discrepancy between the predicted and the measured value, the concrete should be investigated for deterioration due to, for example, internal frost damage or alkali-silica reactions. Reliability theory is well suited for the assessment process, since features such as sensitivity analysis provide good decision support on matters concerning both safety and service life predictions.
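    The Bayesian updating step mentioned above can be sketched with a conjugate normal model: a prior on the mean in-situ concrete strength (the a priori information) is combined with core-test results to give a posterior. The numbers and the known-variance assumption below are hypothetical, chosen only to show the mechanics.

```python
# Hedged sketch of Bayesian updating of mean concrete strength with a
# conjugate normal-normal model (known measurement variance). All
# numbers are hypothetical; a real assessment follows the guideline.
import math

# A priori information, e.g. from a hardening-process description:
prior_mean = 38.0  # MPa, prior mean of in-situ strength
prior_sd = 5.0     # MPa, prior standard deviation

# Core-test results from the structure (hypothetical):
tests = [33.5, 36.0, 34.8, 35.5]
meas_sd = 3.0      # MPa, assumed known measurement scatter

n = len(tests)
sample_mean = sum(tests) / n

# Conjugate update: precisions (1/variance) add, means are
# precision-weighted.
prior_prec = 1.0 / prior_sd**2
data_prec = n / meas_sd**2
post_var = 1.0 / (prior_prec + data_prec)
post_mean = post_var * (prior_prec * prior_mean + data_prec * sample_mean)

print(f"posterior mean strength = {post_mean:.1f} MPa, "
      f"sd = {math.sqrt(post_var):.1f} MPa")
```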