
    Genesis of Altmetrics or Article-level Metrics for Measuring Efficacy of Scholarly Communications: Current Perspectives

    Article-level metrics (ALMs), or altmetrics, have recently become a trendsetter for measuring the impact of scientific publications and their social outreach to intended audiences. Popular social networks such as Facebook, Twitter, and LinkedIn, and social bookmarking services such as Mendeley and CiteULike, are now widely used for communicating research to larger transnational audiences. In 2012, the San Francisco Declaration on Research Assessment was signed by scientific and research communities across the world. The declaration favours ALMs, or altmetrics, over the traditional but flawed journal impact factor (JIF)-based assessment of career scientists. The JIF does not consider impact or influence beyond citation counts, and those counts are drawn only from Thomson Reuters' Web of Science database. Furthermore, the JIF is an indicator of the journal, not of an individual published paper. Altmetrics have therefore become an alternative metric for assessing the performance of individual scientists and their scholarly publications. This paper traces the genesis of altmetrics in measuring the efficacy of scholarly communications and highlights the available altmetric tools, and the social platforms linked to them, that are widely used to derive altmetric scores for scholarly publications. It argues that institutions and policy makers should pay more attention to altmetrics-based indicators for evaluation purposes, but cautions that proper safeguards and validation are needed before their adoption.

    The pros and cons of the use of altmetrics in research assessment

    © 2020 The Authors. Published by Levi Library Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher’s website: http://doi.org/10.29024/sar.10
    Many indicators derived from the web have been proposed to supplement citation-based indicators in support of research assessments. These indicators, often called altmetrics, are available commercially from Altmetric.com and Elsevier’s Plum Analytics or can be collected directly. These organisations can also deliver altmetrics to support institutional self-evaluations. The potential advantages of altmetrics for research evaluation are that they may reflect important non-academic impacts and may appear before citations when an article is published, thus providing earlier impact evidence. Their disadvantages often include susceptibility to gaming, data sparsity, and difficulties in translating the evidence into specific types of impact. Despite these limitations, altmetrics have been widely adopted by publishers, apparently to give authors, editors and readers insights into the level of interest in recently published articles. This article summarises the evidence for and against extending the adoption of altmetrics to research evaluations. It argues that whilst systematically gathered altmetrics are inappropriate for important formal research evaluations, they can play a role in some other contexts. They can be informative when evaluating research units that rarely produce journal articles, when seeking to identify evidence of novel types of impact during institutional or other self-evaluations, and when selected by individuals or groups to support narrative-based non-academic claims. In addition, Mendeley reader counts are uniquely valuable as early (mainly) scholarly impact indicators to replace citations when gaming is not possible and early impact evidence is needed. Organisations using alternative indicators need to recruit or develop in-house expertise to ensure that they are not misused, however.

    The Many Publics of Science: Using Altmetrics to Identify Common Communication Channels by Scientific field

    Altmetrics have led to new quantitative studies of science through social media interactions. However, there are no models of science communication that respond to the multiplicity of non-academic channels. Using the 3653 authors with the highest volume of altmetric mentions of their publications (2016-2020) across the main channels (Twitter, News, Facebook, Wikipedia, Blog, Policy documents, and Peer reviews), the study analyzed where the audiences of each discipline are located. The results show both the general patterns and the specificities of these new communication models and the differences between areas. These findings are useful for the development of science communication policies and strategies.
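
    As a rough illustration of this kind of channel-by-discipline aggregation (this is not the authors' code; the input file mentions.csv and its column names are assumptions made purely for the sketch), one might compute the share of each channel within each discipline as follows:

    import pandas as pd

    # Hypothetical input: one row per (author, channel) with the number of mentions
    # of that author's 2016-2020 publications; "field" is the author's discipline.
    mentions = pd.read_csv("mentions.csv")  # columns: author, field, channel, mentions

    # Share of each channel within each discipline: where does each field's audience sit?
    by_field = mentions.pivot_table(index="field", columns="channel",
                                    values="mentions", aggfunc="sum", fill_value=0)
    channel_share = by_field.div(by_field.sum(axis=1), axis=0)

    print(channel_share.round(2))  # rows = disciplines, columns = Twitter, News, Blog, ...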

    Altmetrics for Digital Libraries: Concepts, Applications, Evaluation, and Recommendations

    The volume of scientific literature is rapidly increasing, which has led to researchers being overloaded by the number of articles available for reading and having difficulty estimating their quality and relevance (e.g., with respect to their research interests). Library portals, in these circumstances, are becoming more relevant by providing quality indicators that can help researchers during the research discovery process. Several evaluation methods (e.g., citations, the Journal Impact Factor, and peer review) have been used and suggested by library portals to help researchers filter out the articles relevant to their needs (e.g., articles that have received many citations). However, these methods have been criticized, and a number of weaknesses have been identified and discussed. For example, citations usually take a long time to appear, and some important articles can remain uncited. With the growing presence of social media, new alternative indicators, known as “altmetrics,” have emerged and been proposed as complementary indicators to traditional measures (i.e., bibliometrics). They can help to identify the online attention received by articles, which might act as a further indicator for research assessment. One often mentioned advantage of these alternative indicators is that they appear much faster than citations. A large number of studies have explored altmetrics for different disciplines, but few studies have reported on altmetrics in the fields of Economics and Business Studies, and no studies so far have analyzed altmetrics within these disciplines with respect to libraries and information overload. Thus, this thesis explores opportunities for introducing altmetrics as a new method for filtering relevant articles (in library portals) within the Economics and Business Studies literature. To achieve this objective, we worked on four main aspects of investigating altmetrics and altmetrics data, the results of which can be used to fill the gap in this field of research. (1) We first highlight to what extent altmetric information from the two altmetric providers Mendeley and Altmetric.com is present within the journals of Economics and Business Studies. Based on the coverage, we demonstrate that altmetrics data are sparse in these disciplines, and that when considering altmetrics data for real-world applications (e.g., in libraries), higher aggregation levels, such as the journal level, can overcome this sparsity well. (2) We compute and discuss the correlations between citations and different types and sources of altmetrics on the article and journal levels. We show that Mendeley counts are positively and strongly correlated with citation counts on both the article and journal levels, whereas other indicators such as Twitter counts and the Altmetric Attention Score are significantly correlated only on the journal level. Given these correlations, we suggest Mendeley counts as an alternative indicator to citations for Economics and Business Studies journals and articles. (3) In conjunction with the findings related to altmetrics in Economics and Business Studies journals, we discuss three use cases derived from three ZBW personas in terms of altmetrics. We investigate the use of altmetrics data for potential users with interests in new trends, social media platforms and journal rankings. (4) We investigate the behavior of economic researchers through a survey, exploring the usefulness of different journal-level altmetrics when researchers decide which article to read. According to the user evaluation results, altmetrics are not well known or understood by the economics community. However, this does not mean that these indicators are not helpful to economists at all; rather, it raises the question of how to introduce altmetrics to the economics community in the right way and with which characteristics (e.g., as visible numbers attached to library records or behind the library’s relevance ranking system). Considering the aforementioned findings, the thesis suggests several forms of presenting altmetric information in library portals, using EconBiz as the proof of concept, with the intention of assisting both researchers and libraries in identifying relevant journals or articles (e.g., highly mentioned online and recently published) and in coping with information overload.
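
    A minimal sketch of the kind of correlation analysis described in point (2), i.e., Spearman correlations between citation counts and altmetric counts on the article level and on the journal level. This is not the thesis code; the input file articles.csv and its column names (journal, citations, mendeley, tweets) are assumptions used purely for illustration.

    import pandas as pd
    from scipy.stats import spearmanr

    articles = pd.read_csv("articles.csv")  # columns: journal, citations, mendeley, tweets

    # Article-level correlation between Mendeley reader counts and citations
    rho_art, p_art = spearmanr(articles["mendeley"], articles["citations"])

    # Journal-level correlation: aggregate the counts per journal first
    journals = articles.groupby("journal")[["mendeley", "citations", "tweets"]].sum()
    rho_jnl, p_jnl = spearmanr(journals["mendeley"], journals["citations"])

    print(f"article level: rho={rho_art:.2f} (p={p_art:.3f})")
    print(f"journal level: rho={rho_jnl:.2f} (p={p_jnl:.3f})")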

    Altmetrics and Open Access

    Altmetrics, in contrast to traditional metrics, measure the societal impact that research outputs have on the public in general, using social media platforms as their primary data sources. In this study, differences in Altmetric Attention Scores between open and closed access articles of German research institutions in the field of natural sciences were analyzed. For this investigation, data from the years 2013 to 2017 were gathered from Web of Science, Altmetric.com and Unpaywall. Results indicated that articles published in open access gain higher Altmetric Attention Scores than articles behind subscription paywalls, although the difference was not statistically significant. Research outputs published in gold open access had the highest scores, followed by articles in green and then hybrid open access. Furthermore, articles by publishers with higher percentages of open access content gained higher Altmetric Attention Scores than articles distributed by those with medium or low percentages. A future study could include additional databases as well as data from the years to come. Moreover, a comparable study for the humanities would be conceivable, including other document types such as books or contributions to anthologies.
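
    As a rough illustration of this kind of open versus closed access comparison (the abstract does not name the statistical test the authors used), the sketch below contrasts the Altmetric Attention Scores of the two groups with a Mann-Whitney U test, a common choice for heavily skewed count data. The file attention_scores.csv and its column names are hypothetical.

    import pandas as pd
    from scipy.stats import mannwhitneyu

    # Hypothetical input: one row per article with its OA status and Altmetric Attention Score.
    df = pd.read_csv("attention_scores.csv")  # columns: doi, oa_status, attention_score

    open_scores = df.loc[df["oa_status"] != "closed", "attention_score"]
    closed_scores = df.loc[df["oa_status"] == "closed", "attention_score"]

    # Two-sided rank-based test of whether the two groups differ
    stat, p_value = mannwhitneyu(open_scores, closed_scores, alternative="two-sided")
    print(f"median OA: {open_scores.median():.1f}, "
          f"median closed: {closed_scores.median():.1f}, p={p_value:.3f}")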

    Altmetrics: Metrics beyond traditional citations

    Altmetrics is a movement that aims to capture new and previously invisible types of impact of scholarly publications on social web platforms such as news sites, Wikipedia, blogs, microblogs, social bookmarking tools and online reference managers. For the present work, the authors used the subscription-based online aggregator Altmetric.com, which helps in exploring and collecting the social attention scores of research output globally across different platforms. Data for 1266 journals were collected on four parameters: platforms, mention types, Twitter demographics, and department. First, data were collected to analyze how many platforms mentioned the research output of these journals, together with the corresponding altmetric mention scores; this was followed by data collection per mention type (e.g., Facebook, news story, Twitter) with the associated social attention scores. The third parameter was Twitter demographics, covering 207 countries in terms of posts and profiles. The final collection analyzed the Altmetric Attention Scores received by the selected departments. In this way, data were collected according to the objectives, making the study relevant and result oriented.
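
    A minimal sketch of pulling per-platform mention counts for a single article from the public Altmetric.com details endpoint, the kind of per-DOI lookup that underlies such aggregated data collection. The DOI below is a placeholder, the study's subscription key is omitted, and the response field names shown are those the public endpoint commonly returns; consult the current Altmetric API documentation before relying on them.

    import requests

    doi = "10.1000/example-doi"  # placeholder DOI
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

    if resp.status_code == 200:
        record = resp.json()
        print("Altmetric Attention Score:", record.get("score"))
        print("Tweeters:", record.get("cited_by_tweeters_count", 0))
        print("Facebook walls:", record.get("cited_by_fbwalls_count", 0))
        print("News outlets:", record.get("cited_by_msm_count", 0))
    elif resp.status_code == 404:
        print("No altmetric attention recorded for this DOI.")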

    Social media metrics for new research evaluation

    This chapter approaches, from both a theoretical and a practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics for scientific evaluation. We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics as well as principles and recommendations for the consideration and application of current (and potentially new) metrics in research evaluation.
    Comment: Forthcoming in Glanzel, W., Moed, H.F., Schmoch, U., Thelwall, M. (2018). Springer Handbook of Science and Technology Indicators. Springer.

    Altmetrics for Digital Libraries

    The volume of scientific literature is increasing, and researchers have difficulties in estimating the quality and relevance of articles. Library portals, therefore, are becoming more relevant by using quality indicators to help researchers during the research process. With the growing presence of social media, altmetrics have been proposed as complementary indicators to traditional measures. Altmetrics can help to identify online attention and can appear much faster than citations. This study explores altmetrics for filtering relevant articles (in library portals) within the Economics and Business Studies literature. Firstly, it highlights the presence of altmetrics from Mendeley and Altmetric.com for journals in these disciplines. It then presents correlations between citations and altmetrics on the article and journal levels, suggesting Mendeley counts as an alternative indicator to citations. Afterward, it investigates the use of altmetrics data for potential users with interests in new trends, social media platforms, and journal rankings. Lastly, it explores the behavior of economic researchers through a survey on the usefulness of different altmetrics. Based on the findings of this study, several forms of presenting altmetrics in library portals are discussed, using EconBiz as the proof of concept, to assist both researchers and libraries in identifying relevant journals or articles and in coping with information overload.
