
    F1000 recommendations as a new data source for research evaluation: A comparison with citations

    F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective F1000 could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications receive on average 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There is a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak; in particular, it is weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for the differences between recommendations and citations in assessing the impact of publications.
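
    As a rough sketch of the kind of comparison reported above, the snippet below computes a Spearman rank correlation between per-publication recommendation counts and citation counts. The data here is hypothetical; the paper derives its counts by linking the F1000 and Web of Science databases.

```python
# Hypothetical per-publication data: (F1000 recommendations, citations).
from scipy.stats import spearmanr

publications = [
    (0, 12), (1, 40), (0, 3), (2, 55), (0, 8),
    (1, 10), (0, 0), (3, 120), (0, 21), (1, 5),
]

recs = [p[0] for p in publications]
cites = [p[1] for p in publications]

# Spearman is rank-based, so it is robust to the heavy skew of both counts.
rho, pvalue = spearmanr(recs, cites)
print(f"Spearman rho = {rho:.2f} (p = {pvalue:.3f})")
```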

    Hardness Amplification of Optimization Problems

    In this paper, we prove a general hardness amplification scheme for optimization problems based on the technique of direct products. We say that an optimization problem Π is direct product feasible if it is possible to efficiently aggregate any k instances of Π and form one large instance of Π such that, given an optimal feasible solution to the larger instance, we can efficiently find optimal feasible solutions to all the k smaller instances. Given a direct product feasible optimization problem Π, our hardness amplification theorem may be informally stated as follows: if there is a distribution D over instances of Π of size n such that every randomized algorithm running in time t(n) fails to solve Π on a 1/α(n) fraction of inputs sampled from D, then, assuming certain relationships between α(n) and t(n), there is a distribution D' over instances of Π of size O(n·α(n)) such that every randomized algorithm running in time t(n)/poly(α(n)) fails to solve Π on a 99/100 fraction of inputs sampled from D'. As a consequence of the above theorem, we show hardness amplification for problems in various classes: NP-hard problems such as Max-Clique, Knapsack, and Max-SAT; problems in P such as Longest Common Subsequence, Edit Distance, and Matrix Multiplication; and even problems in TFNP such as Factoring and computing a Nash equilibrium.
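
    The notion of direct product feasibility can be made concrete with a toy example. The sketch below aggregates k Max-SAT instances by renaming their variables apart; since the sub-instances then share no variables, an optimal assignment to the aggregate restricts to an optimal assignment of each sub-instance. This illustrates the definition only, not the paper's amplification construction.

```python
# Toy illustration of direct product feasibility for Max-SAT.
# A CNF instance is a list of clauses; a literal is a nonzero int (+v or -v).

def aggregate(instances):
    """Combine k CNF instances into one over disjoint variable blocks."""
    big, offsets, offset = [], [], 0
    for clauses in instances:
        offsets.append(offset)
        nvars = max(abs(l) for c in clauses for l in c)
        # Shift every variable in this block past all earlier blocks.
        big.extend([[l + offset if l > 0 else l - offset for l in c]
                    for c in clauses])
        offset += nvars
    return big, offsets

def split_assignment(assignment, instances, offsets):
    """Recover per-instance assignments from a solution to the aggregate."""
    parts = []
    for clauses, off in zip(instances, offsets):
        nvars = max(abs(l) for c in clauses for l in c)
        parts.append({v: assignment[v + off] for v in range(1, nvars + 1)})
    return parts

# Two tiny instances; an optimal assignment of the aggregate decomposes.
big, offs = aggregate([[[1, -2]], [[1]]])
print(big)                                       # [[1, -2], [3]]
print(split_assignment({1: True, 2: False, 3: True}, [[[1, -2]], [[1]]], offs))
```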

    Active classification with comparison queries

    We study an extension of active learning in which the learning algorithm may ask the annotator to compare the distances of two examples from the boundary of their label-class. For example, in a recommendation system application (say, for restaurants), the annotator may be asked whether she liked or disliked a specific restaurant (a label query), or which of two restaurants she liked more (a comparison query). We focus on the class of half spaces, and show that under natural assumptions, such as large margin or bounded bit-description of the input examples, it is possible to reveal all the labels of a sample of size n using approximately O(log n) queries. This implies an exponential improvement over classical active learning, where only label queries are allowed. We complement these results by showing that if any of these assumptions is removed then, in the worst case, Ω(n) queries are required. Our results follow from a new general framework of active learning with additional queries. We identify a combinatorial dimension, called the inference dimension, that captures the query complexity when each additional query is determined by O(1) examples (such as comparison queries, each of which is determined by the two compared examples). Our results for half spaces follow by bounding the inference dimension in the cases discussed above.
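
    To make the two query types concrete, the sketch below simulates an annotator who knows a halfspace through the origin and answers label and comparison queries about a pool of examples. The paper's O(log n)-query algorithm and the inference-dimension machinery are not reproduced here; all names are illustrative.

```python
# Minimal sketch of the query model: the normal vector w is hidden from
# the learner and known only to the simulated annotator.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=5)                  # hidden halfspace normal
X = rng.normal(size=(100, 5))           # pool of unlabeled examples

def label_query(i):
    """Annotator reveals the label of example i."""
    return int(np.sign(X[i] @ w))

def comparison_query(i, j):
    """Annotator reveals which of i, j lies closer to the boundary."""
    return i if abs(X[i] @ w) < abs(X[j] @ w) else j

# Example interaction: compare two examples, then ask for one label.
closer = comparison_query(0, 1)
print("closer to boundary:", closer, "label:", label_query(closer))
```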

    Samplers and Extractors for Unbounded Functions

    Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f from {0,1}^m to the real numbers such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error epsilon and failure probability delta are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
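
    For intuition about mean estimation for unbounded functions, the sketch below uses the classical median-of-means estimator, which tolerates subgaussian (and heavier) tails. This is a statistical analogue using truly random samples, not the paper's explicit derandomized construction.

```python
# Median of means: split samples into groups, average each group, take
# the median of the group means. Outlier-heavy draws corrupt at most a
# few groups, so the median stays close to the true mean.
import random
import statistics

def median_of_means(f, sample, groups=9):
    """Estimate the mean of f over a sample, robust to heavy tails."""
    k = len(sample) // groups
    means = [statistics.fmean(f(x) for x in sample[g * k:(g + 1) * k])
             for g in range(groups)]
    return statistics.median(means)

# Example: estimate E[f(U_m)] for an unbounded f of a uniform bit string.
m = 16
draws = [random.getrandbits(m) for _ in range(9000)]
f = lambda x: bin(x).count("1") - m / 2   # centered, subgaussian tails
print(median_of_means(f, draws))          # should be close to 0
```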

    Reliability-Based Design of Reinforced Concrete Raft Footings Using Finite Element Method

    In this study, a FORTRAN-based reliability-based design program was developed for the design of raft footings based on the ultimate and serviceability design requirements of BS8110 (1997). The well-known analysis of a plate on an elastic foundation using the displacement method of analysis was used in conjunction with the design point method. The design point method was adopted for designing to a pre-determined safety level, T. An example of the design of a raft footing is included to demonstrate the simplicity of the procedure. Among other findings, there is a saving of about 64% of the longitudinal reinforcement applied at the column face using the proposed method compared with the BS8110 design method. The depth of footing required using the proposed procedure was found to be 47% lower than in the deterministic method using BS8110. Further, designing to a target safety index of 3.0 was found to be cheaper than designing to a target safety index of 4.0 for the same loading, material, and geometrical properties of the footing. It is therefore concluded that the proposed procedure is quite suitable for application.
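
    The idea behind designing to a target safety index can be illustrated with the simplest case: a linear limit state g = R - S with independent normal resistance R and load S, for which the Hasofer-Lind safety index has a closed form. The paper couples the design point method with a finite element plate-on-elastic-foundation model; the numbers below are hypothetical.

```python
# Hasofer-Lind safety index for the linear limit state g = R - S with
# independent normal R (resistance) and S (load).
import math

def safety_index(mu_R, sigma_R, mu_S, sigma_S):
    """beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
    return (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)

# Hypothetical means and standard deviations (e.g., in kN).
beta = safety_index(mu_R=500.0, sigma_R=50.0, mu_S=300.0, sigma_S=40.0)
print(f"beta = {beta:.2f}")   # compare against a target such as 3.0 or 4.0
```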

    Dynamic Complexity of Parity Exists Queries

    Given a graph whose nodes may be coloured red, the parity of the number of red nodes can easily be maintained with first-order update rules in the dynamic complexity framework DynFO of Patnaik and Immerman. Can this be generalised to other, or even all, queries that are definable in first-order logic extended by parity quantifiers? We consider the query that asks whether the number of nodes that have an edge to a red node is odd. Already this simple query, with quantifier structure parity-exists, is a major roadblock for dynamically capturing extensions of first-order logic. We show that this query cannot be maintained with quantifier-free first-order update rules, and that variants of it induce a hierarchy for such update rules with respect to the arity of the maintained auxiliary relations. Towards maintaining the query with full first-order update rules, we show that degree-restricted variants can be maintained.
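
    The easy case mentioned in the first sentence can be sketched directly: the parity of the number of red nodes is maintainable with a single auxiliary bit and constant work per colour toggle. The parity-exists query (is the number of nodes with a red neighbour odd?) is precisely the case that resists such simple maintenance.

```python
# Maintaining the parity of the number of red nodes under colour toggles.
class RedParity:
    def __init__(self):
        self.red = set()
        self.parity = 0          # auxiliary bit: parity of |red|

    def toggle(self, node):
        """Flip the colour of a node; the parity bit flips either way."""
        if node in self.red:
            self.red.remove(node)
        else:
            self.red.add(node)
        self.parity ^= 1

d = RedParity()
for v in (1, 2, 3, 2):
    d.toggle(v)
print(d.parity)   # nodes 1 and 3 are red -> parity 0
```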

    The IceCube Realtime Alert System

    Although high-energy astrophysical neutrinos were detected in 2013, their origin is still unknown. Aiming for the identification of an electromagnetic counterpart of a rapidly fading source, we have implemented a realtime analysis framework for the IceCube neutrino observatory. Several analyses selecting neutrinos of astrophysical origin are now operating in realtime at the detector site in Antarctica and are producing alerts to the community to enable rapid follow-up observations. The goal of these observations is to locate the astrophysical objects responsible for the neutrino signals. This paper highlights the infrastructure in place both at the South Pole detector site and at IceCube facilities in the north that has enabled this fast follow-up program to be developed. Additionally, this paper presents the first realtime analyses to be activated within this framework, highlights their sensitivities to astrophysical neutrinos and background event rates, and presents an outlook for future discoveries.
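
    As a loose illustration of the realtime selection idea, the sketch below filters a stream of reconstructed events on a "signalness" score and emits a JSON alert payload for each passing event. The field names, threshold, and payload format are hypothetical; the actual IceCube selections and alert channels are described in the paper.

```python
# Hypothetical realtime filter: pass events whose signalness score
# (estimated probability of astrophysical origin) exceeds a cut.
import json
import time

SIGNALNESS_CUT = 0.5   # hypothetical threshold

def alert_stream(events):
    """Yield JSON alerts for events likely to be astrophysical."""
    for ev in events:
        if ev["signalness"] >= SIGNALNESS_CUT:
            yield json.dumps({
                "time": ev["time"],
                "ra_deg": ev["ra_deg"],
                "dec_deg": ev["dec_deg"],
                "signalness": ev["signalness"],
            })

events = [
    {"time": time.time(), "ra_deg": 77.4, "dec_deg": 5.7, "signalness": 0.56},
    {"time": time.time(), "ra_deg": 120.1, "dec_deg": -12.0, "signalness": 0.12},
]
for alert in alert_stream(events):
    print(alert)
```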