9 research outputs found

    Scholarly use of social media and altmetrics: a review of the literature

    Full text link
    Social media has become integrated into the fabric of the scholarly communication system in fundamental ways: principally through scholarly use of social media platforms and the promotion of new indicators based on interactions with these platforms. Research and scholarship in this area have accelerated since the coining of, and subsequent advocacy for, altmetrics (that is, research indicators based on social media activity). This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, covering the various functions these platforms serve in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation in the scholarly communication system.
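
    The social media indicators surveyed in this review are typically harvested programmatically from aggregators. As a rough illustration only (not drawn from the review itself), the sketch below queries Altmetric's public per-DOI details endpoint; the endpoint path and the response field names are assumptions based on common usage and should be verified against the current API documentation.

        # Minimal sketch: fetch altmetric indicators for a single DOI.
        # Assumes the public endpoint https://api.altmetric.com/v1/doi/<doi>;
        # the response field names below are assumptions and may differ.
        import requests

        def fetch_altmetrics(doi: str) -> dict:
            resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
            if resp.status_code == 404:   # no tracked attention for this DOI
                return {}
            resp.raise_for_status()
            data = resp.json()
            return {
                "score": data.get("score"),                       # composite attention score
                "tweeters": data.get("cited_by_tweeters_count"),  # distinct social media accounts
                "news": data.get("cited_by_msm_count"),           # mainstream media mentions
                "posts": data.get("cited_by_posts_count"),        # total tracked posts
            }

        if __name__ == "__main__":
            print(fetch_altmetrics("10.1038/nature12373"))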

    Analysis of Emerging Reputation and Funding Mechanisms in the Context of Open Science 2.0

    Get PDF
    This report covers the outcomes of two studies funded by JRC IPTS to explore emerging drivers for Open Science 2.0. In general, Open Science 2.0 is associated with themes such as open access to scientific outputs, open data, citizen science and open peer evaluation systems. This study, however, focused on less explored themes, namely alternative funding mechanisms for scientific research and emerging reputation mechanisms for scholars resulting from Web 2.0 platforms and applications. Both have been shown to provide significant new opportunities for researchers to disseminate, share, explore and collaborate with other researchers, but it remains to be seen whether they will be able to bring about more disruptive change in how science and research systems function in the future. They could well do so, especially if related changes being considered by the European Commission on ‘Science 2.0: Science in Transition’ are taken into account. JRC.J.3 - Information Society

    (E‑) Valuative Metrics as a Contested Field: A Comparative Analysis of the Altmetrics‑ and the Leiden Manifesto

    Get PDF
    This article comparatively analyzes two manifestos in the field of quantitative science evaluation, the Altmetrics Manifesto (AM) and the Leiden Manifesto (LM). It employs perspectives from the Sociology of (E-)Valuation to make sense of highly visible critiques that organize the current discourse. Four motifs can be reconstructed from the manifestos’ valuation strategies. The AM criticizes the confinedness of established evaluation practices and argues for an expansion of quantitative research evaluation. The LM denounces the proliferation of ill-applied research metrics and calls for an enclosure of metric research assessment. These motifs are organized diametrically: the two manifestos represent opposed positions in a critical discourse on (e-)valuative metrics. They manifest quantitative science evaluation as a contested field. Peer Reviewed

    The state of altmetrics: a tenth anniversary celebration

    Get PDF
    Altmetric’s mission is to help others understand the influence of research online. We collate what people are saying about published research in sources such as the mainstream media, policy documents, social networks, blogs, and other scholarly and non-scholarly forums to provide a more robust picture of the influence and reach of scholarly work. Altmetric works with some of the biggest publishers, funders, businesses and institutions around the world to deliver this data in an accessible and reliable format.
    Contents:
    Altmetrics, Ten Years Later, Euan Adie (Altmetric (founder) & Overton)
    Reflections on Altmetrics, Gemma Derrick (University of Lancaster), Fereshteh Didegah (Karolinska Institutet & Simon Fraser University), Paul Groth (University of Amsterdam), Cameron Neylon (Curtin University), Jason Priem (Our Research), Shenmeng Xu (University of North Carolina at Chapel Hill), Zohreh Zahedi (Leiden University)
    Worldwide Awareness and Use of Altmetrics, Yin-Leng Theng (Nanyang Technological University)
    Leveraging Machine Learning on Altmetrics Big Data, Saeed-Ul Hassan (Information Technology University), Naif R. Aljohani (King Abdulaziz University), Timothy D. Bowman (Wayne State University)
    Altmetrics as Social-Spatial Sensors, Vanash M. Patel (West Hertfordshire Hospitals NHS Trust), Robin Haunschild (Max Planck Institute for Solid State Research), Lutz Bornmann (Administrative Headquarters of the Max Planck Society)
    Altmetric’s Fable of the Hare and the Tortoise, Mike Taylor (Digital Science)
    The Future of Altmetrics: A Community Vision, Liesa Ross (Altmetric), Stacy Konkiel (Altmetric)
    https://digitalcommons.unl.edu/scholcom/170
    Merit, Expertise and Measurement

    E-visibility of environmental sciences researchers at the University of South Africa

    Get PDF
    Abstract: Research e-visibility in theory enables a researcher to establish and maintain a digital research portfolio utilising various research e-profiles on a number of online research communities and platforms. E-visibility embodies the online presence of the researcher and their research, the researcher’s discoverability via research e-profiles, and the accessibility of research output on online research communities. The rationale for this study is founded on the premise that enhancing the e-visibility of a researcher will increase the researcher’s research and societal impact. The development of an e-visibility strategy for the School of Environmental Sciences (SES) at the University of South Africa (Unisa) would be instrumental in enhancing the e-visibility of its researchers. This study aims to establish guidelines for the development of an e-visibility strategy for SES researchers at Unisa as part of research support via the Library services. Altmetric and bibliometric data for the SES researchers were collected over the study period (December 2014 to December 2017), and e-visibility surveys were conducted at the beginning of the study (December 2014) and at the end of the study (April 2017) as part of a longitudinal e-visibility study. The data were analysed using statistical methods to ascertain: 1) the SES researchers’ e-visibility status, 2) the SES researchers’ perceptions about e-visibility, 3) the correlations (relationships) between altmetrics sourced from academic social networking tools and bibliometrics derived from citation resources, and 4) the e-visibility practices and actions that increase research and societal impact. The results reflected a total increase in online presence, discoverability and accessibility, indicating an overall increase in the actual and perceived e-visibility of the SES researchers. The survey conducted at the end of the study found that 73% of the SES researchers indicated that their e-visibility had increased through an enhanced online presence, 69% were more discoverable, and 76% of their research output was more accessible after applying what they had learnt during the e-visibility awareness and training...Ph.D. (Information Management)
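
    The altmetric-bibliometric correlations referred to above are usually computed with non-parametric statistics, because both count distributions are heavily skewed. The following is a minimal sketch of such a calculation using invented counts, not the actual SES data or the specific methods of this study.

        # Sketch: Spearman rank correlation between an altmetric indicator and
        # citation counts for a small set of outputs. Numbers are illustrative only.
        from scipy.stats import spearmanr

        mendeley_readers = [12, 0, 45, 7, 3, 88, 19, 2]   # altmetric side (e.g. readership)
        citations        = [4, 1, 30, 2, 0, 51, 10, 1]    # bibliometric side (e.g. WoS citations)

        rho, p_value = spearmanr(mendeley_readers, citations)
        print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")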

    Identifying the Invisible Impact of Scholarly Publications: A Multi-Disciplinary Analysis Using Altmetrics

    Get PDF
    A thesis submitted in partial fulfilment of the requirements of the University of Wolverhampton for the degree of Doctor of Philosophy. The field of ‘altmetrics’ is concerned with alternative metrics for the impact of research publications using social web data. Empirical studies are needed, however, to assess the validity of altmetrics from different perspectives. This thesis partly fills this gap by exploring the suitability and reliability of two altmetrics resources: Mendeley, a social reference manager website, and Faculty of 1000 (F1000), a post-publication peer review platform. The thesis explores the correlations between the new metrics and citations at the level of articles for several disciplines and investigates the contexts in which the new metrics can be useful for research evaluation across different fields. Low and medium correlations were found between Mendeley readership counts and citations for Social Sciences, Humanities, Medicine, Physics, Chemistry and Engineering articles from the Web of Science (WoS), suggesting that Mendeley data may reflect different aspects of research impact. A comparison between information flows based on Mendeley bookmarking data and cross-disciplinary citation analysis for social sciences and humanities disciplines revealed substantial similarities and some differences. This suggests that Mendeley readership data could be used to help identify knowledge transfer between scientific disciplines, especially for people who read but do not author articles, as well as providing evidence of impact at an earlier stage than is possible with citation counts. The majority of Mendeley readers for Clinical Medicine, Engineering and Technology, Social Science, Physics and Chemistry papers were PhD students and postdocs. The highest correlations between citations and Mendeley readership counts were for the types of Mendeley users that often author academic papers, suggesting that academics bookmark papers in Mendeley for reasons related to scientific publishing. In order to identify the extent to which Mendeley bookmarking counts reflect readership, and to establish the motivations for bookmarking scientific papers in Mendeley, a large-scale survey was conducted; it found that 83% of Mendeley users read more than half of the papers in their personal libraries. The main reasons for bookmarking papers were citing in future publications, using in professional activities, citing in a thesis, and using in teaching and assignments. Thus, Mendeley bookmarking counts can potentially indicate the readership impact of research papers that have educational value for non-author users inside academia, or the impact of research papers on practice for readers outside academia.
    This thesis also examines the relationship between article types (i.e., “New Finding”, “Confirmation”, “Clinical Trial”, “Technical Advance”, “Changes to Clinical Practice”, “Review”, “Refutation”, “Novel Drug Target”), citation counts and F1000 Article Factors (FFa). In seven out of nine cases, there were no significant differences between article types in terms of rankings based on citation counts and FFa scores. Nevertheless, citation counts and FFa scores were significantly different for articles tagged “New Finding” or “Changes to Clinical Practice”. This means that F1000 could be used in research evaluation exercises when the importance of practical findings needs to be recognised. Furthermore, since the majority of the studied articles were reviewed in their year of publication, F1000 could also be useful for quick evaluations.
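
    The article-type comparisons described above (rankings by citation counts versus FFa scores) are typically tested with non-parametric methods. The sketch below shows one such comparison on invented citation counts; it illustrates the general technique and is not the thesis's actual analysis or data.

        # Sketch: test whether two F1000 article types differ in citation counts
        # using a Mann-Whitney U test. The counts are invented for illustration.
        from scipy.stats import mannwhitneyu

        citations_new_finding = [15, 3, 22, 8, 41, 5, 19]   # hypothetical "New Finding" papers
        citations_review      = [60, 35, 48, 72, 29, 55]    # hypothetical "Review" papers

        stat, p = mannwhitneyu(citations_new_finding, citations_review, alternative="two-sided")
        print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")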

    B!SON: A Tool for Open Access Journal Recommendation

    Get PDF
    Finding a suitable open access journal in which to publish scientific work is a complex task: researchers have to navigate a constantly growing number of journals, institutional agreements with publishers, funders’ conditions and the risk of predatory publishers. To help with these challenges, we introduce a web-based journal recommendation system called B!SON. It is developed on the basis of a systematic requirements analysis, is built on open data, gives publisher-independent recommendations and works across domains. It suggests open access journals based on the title, abstract and references provided by the user. The recommendation quality has been evaluated using a large test set of 10,000 articles. Development by two German scientific libraries ensures the longevity of the project.
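
    B!SON's internal pipeline is not reproduced here, but the general idea of recommending journals from a title and abstract can be illustrated with a simple text-similarity sketch. The journal names and texts below are invented, and the TF-IDF approach is only a stand-in for whatever B!SON actually uses.

        # Sketch of a content-based journal recommender: rank candidate journals by
        # cosine similarity between their aggregated article text and the user's
        # manuscript. Illustrative only; not B!SON's actual method or data.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        journal_texts = {
            "Journal of Open Metrics": "altmetrics citation analysis scholarly communication impact",
            "Data Science Letters": "machine learning big data neural networks classification",
            "Library & Information Studies": "open access repositories libraries metadata discovery",
        }
        manuscript = "Using altmetrics and citation data to study scholarly communication"

        names = list(journal_texts)
        vectorizer = TfidfVectorizer()
        matrix = vectorizer.fit_transform(list(journal_texts.values()) + [manuscript])
        scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()

        for name, score in sorted(zip(names, scores), key=lambda pair: -pair[1]):
            print(f"{score:.3f}  {name}")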