
    F1000 recommendations as a new data source for research evaluation: A comparison with citations

    F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective F1000 could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications receive on average 1.30 recommendations, and over 90% of recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations. More research is needed to identify the main reasons for the differences between recommendations and citations in assessing the impact of publications.
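
    To make the kind of comparison described above concrete, here is a minimal Python sketch that computes a Spearman rank correlation between per-publication recommendation counts and citation counts. The data values are invented for illustration; the paper's actual analysis links the full F1000 database to the Web of Science.

        # Rank correlation between F1000 recommendation counts and citation
        # counts per publication. All values below are hypothetical.
        from scipy.stats import spearmanr

        # (recommendations, citations) per publication -- invented data
        records = [
            (0, 12), (1, 45), (0, 3), (2, 110), (0, 8),
            (1, 30), (0, 15), (3, 220), (0, 1), (1, 60),
        ]
        recommendations = [r for r, _ in records]
        citations = [c for _, c in records]

        rho, p_value = spearmanr(recommendations, citations)
        print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

    A rank-based measure such as Spearman's rho is a natural choice here because bibliometric counts are heavily skewed.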

    Permanent and live load model for probabilistic structural fire analysis: a review

    Probabilistic analysis is receiving increased attention from fire engineers, assessment bodies and researchers. It is, however, often unclear which probabilistic models are appropriate for the analysis. For example, in probabilistic structural fire engineering, the models used to describe the permanent and live load differ widely between studies. Through a literature review, it is observed that these diverging load models largely derive from the same underlying datasets and basic methodologies, and that the differences can be attributed mainly to specific assumptions in different background papers, which have become consolidated through repeated use in application studies by different researchers. Taking this uncovered background information into account, consolidated probabilistic load models are proposed.
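
    As a rough sketch of what such a consolidated probabilistic load model can look like, the snippet below samples a permanent load from a normal distribution and a live load from a Gumbel distribution, a common convention in the structural reliability literature. The distributions and all parameter values are generic assumptions for illustration, not the models proposed in this review.

        # Sketch of a probabilistic load model: permanent load G ~ Normal,
        # live load Q ~ Gumbel. All parameters are hypothetical.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Permanent load: mean equal to nominal, ~10% coefficient of variation.
        g_nominal = 5.0  # kN/m^2, hypothetical
        G = rng.normal(loc=g_nominal, scale=0.10 * g_nominal, size=n)

        # Live load: Gumbel, with mean 0.6 * nominal and a 35% coefficient of
        # variation (illustrative values); moments converted to Gumbel params.
        q_nominal = 3.0  # kN/m^2, hypothetical
        mean_q, std_q = 0.6 * q_nominal, 0.35 * 0.6 * q_nominal
        scale = np.sqrt(6.0) / np.pi * std_q
        loc = mean_q - 0.5772 * scale
        Q = rng.gumbel(loc=loc, scale=scale, size=n)

        total = G + Q
        print(f"mean total load = {total.mean():.2f} kN/m^2, "
              f"95th percentile = {np.percentile(total, 95):.2f} kN/m^2")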

    Reliability and risk acceptance criteria for civil engineering structures

    The specification of risk and reliability acceptance criteria is a key issue in the reliability verification of new and existing structures. Current target reliability levels in standards show considerable scatter. A critical review of risk acceptance approaches to societal, economic and environmental risk indicates that an optimal design strategy is mostly dominated by economic aspects, while human safety aspects need to be verified only in special cases. It is recommended to specify the target levels considering economic optimisation and the marginal life-saving costs principle, as both approaches take into account the failure consequences and the costs of safety measures.
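
    A minimal sketch of the economic optimisation idea mentioned above: choose the target reliability index beta that minimises total expected cost, i.e. construction cost plus probability of failure times failure cost. The cost model and all constants below are invented for illustration.

        # Target reliability by economic optimisation: minimise
        # C(beta) = C0 + c1 * beta + Phi(-beta) * Cf, where Phi(-beta) is the
        # failure probability. All constants are hypothetical.
        import numpy as np
        from scipy.stats import norm

        C0 = 100.0   # base construction cost
        c1 = 2.0     # marginal cost of extra safety per unit of beta
        Cf = 1.0e6   # cost of failure, including consequences

        betas = np.linspace(2.0, 6.0, 401)
        total_cost = C0 + c1 * betas + norm.cdf(-betas) * Cf

        beta_opt = betas[np.argmin(total_cost)]
        print(f"economically optimal target reliability: beta = {beta_opt:.2f}")

    In the spirit of the abstract, the marginal life-saving costs principle would then serve as a separate check that the economically optimal target also provides acceptable human safety.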

    The Two Media Literacies: A Cultural Studies Perspective

    This analysis identifies the media saturation of people’s lives as a reason for developing programs that teach media literacy. It argues that the basic or foundational disciplines found in cultural studies, such as applied semiotics, psychoanalytic theory, sociological theory, and Marxist analysis, are the proper way to teach media criticism and media literacy (see Berger, A.A., Media Analysis Techniques, 6th edition, 2019). The methods by themselves are not adequate, which means that teaching media literacy also involves providing exercises and learning games that show students how to apply the theories they learn to their analyses of media texts (see Berger, A.A., Games and Activities for Media, Communication, and Cultural Studies Students, 2004). Finally, it is suggested that media literacy should be taught at all educational levels.

    Computing a Nonnegative Matrix Factorization -- Provably

    In the Nonnegative Matrix Factorization (NMF) problem we are given an n × m nonnegative matrix M and an integer r > 0. Our goal is to express M as AW, where A and W are nonnegative matrices of size n × r and r × m respectively. In some applications, it makes sense to ask instead for the product AW to approximate M, i.e. to (approximately) minimize ‖M − AW‖_F, where ‖·‖_F denotes the Frobenius norm; we refer to this as Approximate NMF. This problem has a rich history spanning quantum mechanics, probability theory, data analysis, polyhedral combinatorics, communication complexity, demography, chemometrics, etc. In the past decade NMF has become enormously popular in machine learning, where A and W are computed using a variety of local search heuristics. Vavasis proved that this problem is NP-complete. We initiate a study of when this problem is solvable in polynomial time: 1. We give a polynomial-time algorithm for exact and approximate NMF for every constant r. Indeed NMF is most interesting in applications precisely when r is small. 2. We complement this with a hardness result: if exact NMF can be solved in time (nm)^{o(r)}, then 3-SAT has a sub-exponential time algorithm. This rules out substantial improvements to the above algorithm. 3. We give an algorithm that runs in time polynomial in n, m and r under the separability condition identified by Donoho and Stodden in 2003. The algorithm may be practical since it is simple and noise tolerant (under benign assumptions). Separability is believed to hold in many practical settings. To the best of our knowledge, this last result is the first example of a polynomial-time algorithm that provably works under a non-trivial condition on the input, and we believe that this will be an interesting and important direction for future work.
    Comment: 29 pages, 3 figures
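
    The abstract contrasts provable algorithms with the local search heuristics that dominate practice. As a concrete illustration of the latter, here is a minimal Python sketch of the classic Lee-Seung multiplicative-update heuristic; this is a generic heuristic, not the paper's provable algorithm, and all names and parameters below are illustrative.

        # Local-search heuristic for Approximate NMF (Lee-Seung multiplicative
        # updates) -- one of the heuristics the abstract alludes to, not the
        # paper's polynomial-time algorithm.
        import numpy as np

        def nmf_multiplicative(M, r, iters=200, eps=1e-9, seed=0):
            """Approximately factor nonnegative M (n x m) as A (n x r) @ W (r x m)."""
            rng = np.random.default_rng(seed)
            n, m = M.shape
            A = rng.random((n, r))
            W = rng.random((r, m))
            for _ in range(iters):
                # Each update keeps the factors nonnegative and does not
                # increase the Frobenius error ||M - A W||_F.
                W *= (A.T @ M) / (A.T @ A @ W + eps)
                A *= (M @ W.T) / (A @ W @ W.T + eps)
            return A, W

        # Tiny usage example on a random nonnegative matrix.
        M = np.random.default_rng(1).random((6, 5))
        A, W = nmf_multiplicative(M, r=2)
        print("residual:", np.linalg.norm(M - A @ W))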

    Social Anomia against the Backdrop of Misinformation/Disinformation: a Cognitive Approach to the Multivalent Data in Cyberspace

    The present study is an attempt to problematize the multivalent data in cyberspace through the lens of Wittgenstein’s analytic philosophy of language. Adopting this linguistic-philosophy approach is aimed at exploring the dichotomous question of whether cyberspace is a possibility for social power or whether it is a contributory cause of communicative discontinuity and hence a possibility for social anomia. The central argument is that within cyberspace there are three languages at work, each with a degree of semiotic power: pictorial, verbal and mathematical. Since none of them is based on a one-to-one correspondence between signifiers and signifieds, cyberspace users are in practice led to misinformation and disinformation instead of information. This situation creates an epistemic chasm in their real lives, because the finite mind is not able to grasp the infinite reality of cyberspace’s multivalent data. Accordingly, cyberspace, with its abundance of misinformation and disinformation, leads toward a kind of mental disorder. This constitutes the real power of social media in creating socio-political turmoil and anomia.

    Reliability-based assessment procedures for existing concrete structures

    A feasibility study of reliability theory as a tool for assessing the present safety and residual service life of damaged concrete structures has been performed in order to find a transparent methodology for the assessment procedure. It is concluded that the current guidelines are open to interpretation and that the variation in the results obtained regarding structural safety is too great to be acceptable. Interpretation by the engineer is also involved when deterministic methods are used, but probabilistic methods are more sensitive to the assumptions made, and the differences in the results will therefore be greater. From a literature survey it is concluded that residual service life predictions should not be expected to be valid for more than 10 to 15 years, due to the large variability of the variables involved in the analysis. Based on these conclusions, predictive models that are suitable for the inclusion of new data, and methods for the incorporation of new data, are proposed. Experience from the fields of medical statistics and robotics suggests that linear regression models are well suited for this type of updated monitoring. Two test cases were studied: a concrete dam and a railway bridge. From the dam case, it was concluded that the safety philosophy in the deterministic, dam-specific assessment guidelines needs further development. Probabilistic descriptions of important variables, such as ice loads and friction coefficients, are needed if reliability theory is to be used for assessment purposes. During the study of the railway bridge it became clear that model uncertainties for the different failure mechanisms used in concrete design are lacking. If Bayesian updating is to be used as a tool for incorporating test data on concrete strength into the reliability analysis, a priori information must be established. A need for a probabilistic description of the hardening process of concrete was identified for the purpose of establishing this a priori information. Such a description can also be used as a qualitative assessment of the concrete: if there is a large discrepancy between the predicted and measured values, the concrete should be investigated for deterioration due to, for example, internal frost or alkali-silica reactions. Reliability theory is well suited to the assessment process, since features such as sensitivity analysis give good decision support for matters concerning both safety and service life predictions.
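
    A minimal sketch of the Bayesian updating step discussed above, under a simplifying assumption not made in the abstract: concrete strength is modelled as normal with known standard deviation, so a conjugate normal prior on the mean strength can be updated in closed form with core test results. All numbers are hypothetical.

        # Conjugate Bayesian updating of mean concrete strength with test data.
        # Assumes strength ~ Normal(mu, sigma^2) with sigma known and a
        # Normal prior on mu; all values are illustrative.
        import numpy as np

        # A priori information (e.g. from design documents or hardening models).
        mu0, tau0 = 35.0, 4.0   # prior mean / prior std of mean strength (MPa)
        sigma = 5.0             # assumed known in-situ variability (MPa)

        # Core test results from the existing structure (hypothetical).
        tests = np.array([30.5, 33.2, 31.8, 34.0])
        n, xbar = len(tests), tests.mean()

        # Closed-form posterior for the mean of a normal with known variance.
        post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
        post_mean = post_var * (mu0 / tau0**2 + n * xbar / sigma**2)

        print(f"posterior mean strength = {post_mean:.1f} MPa "
              f"(posterior std = {np.sqrt(post_var):.1f} MPa)")

    A large gap between the prior prediction and the measured strengths would, as the abstract suggests, flag the concrete for further investigation.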