Genesis of Altmetrics or Article-level Metrics for Measuring Efficacy of Scholarly Communications: Current Perspectives
Article-level metrics (ALMs), or altmetrics, have become a recent trendsetter for measuring the impact of scientific publications and their social outreach to intended audiences. Popular social networks such as Facebook, Twitter, and LinkedIn and social bookmarking services such as Mendeley and CiteULike are now widely used for communicating research to larger transnational audiences. In 2012, the San Francisco Declaration on Research Assessment was signed by scientific and research communities across the world. The declaration gives preference to ALMs, or altmetrics, over the traditional but flawed journal impact factor (JIF)-based assessment of career scientists. The JIF does not consider impact or influence beyond citation counts, and those counts reflect only Thomson Reuters' Web of Science database. Furthermore, the JIF is an indicator of the journal, not of any individual published paper. Altmetrics have therefore emerged as an alternative metric for assessing the performance of individual scientists and their scholarly publications. This paper traces the genesis of altmetrics in measuring the efficacy of scholarly communications and highlights the available altmetric tools, and the social platforms linked to them, that are widely used to derive altmetric scores for scholarly publications. The paper argues that institutions and policy makers should pay more attention to altmetrics-based indicators for evaluation purposes, but cautions that proper safeguards and validation are needed before their adoption.
Scholarly Metrics Baseline: A Survey of Faculty Knowledge, Use, and Opinion About Scholarly Metrics
This article presents the results of a faculty survey conducted at the University of Vermont during the 2014-2015 academic year. The survey asked faculty about their familiarity with scholarly metrics, their metric-seeking and help-seeking habits, and the role of metrics in their department's tenure and promotion process. It also gathered faculty opinions on how well scholarly metrics reflect the importance of scholarly work and how faculty feel about administrators gathering institutional scholarly metric information. The results point to the necessity of understanding the campus landscape of faculty knowledge, opinion, and use of scholarly metrics before engaging faculty in further discussions about quantifying the impact of their scholarly work.
Utilising content marketing metrics and social networks for academic visibility
There are numerous assumptions about research evaluation in terms of the quality and relevance of academic contributions. Researchers are becoming increasingly acquainted with bibliometric indicators, including citation analysis, the impact factor, the h-index, webometrics, and academic social networking sites. In this light, this chapter reviews these concepts and considers relevant theoretical underpinnings related to the content marketing of scholars. It critically evaluates previous papers on the subject of academic reputation and discusses individual researchers' personal branding. It also explains how metrics are currently used to rank the academic standing of journals as well as higher education institutions. In a nutshell, this chapter suggests that scholarly impact depends on a number of factors, including the accessibility of publications, peer review of academic work, and social networking among scholars.
Measuring Social Media Activity of Scientific Literature: An Exhaustive Comparison of Scopus and Novel Altmetrics Big Data
This paper measures the social media activity of 15 broad scientific disciplines indexed in the Scopus database using Altmetric.com data. First, the presence of Altmetric.com data in the Scopus database is investigated, overall and across disciplines. Second, the correlation between bibliometric and altmetric indices is examined using Spearman correlation. Third, a zero-truncated negative binomial model is used to determine the association of various factors with increasing or decreasing citations. Lastly, the effectiveness of altmetric indices in identifying publications with high citation impact is evaluated using the Area Under the Curve (AUC), an application of the receiver operating characteristic. Results indicate a rapid increase in the presence of Altmetric.com data in the Scopus database, from 10.19% in 2011 to 20.46% in 2015. In the zero-truncated negative binomial model, blog count appears to be the most important factor, increasing the number of citations by 38.6% in the field of Health Professions and Nursing, followed by Twitter count, which increases the number of citations by 8% in the field of Physics and Astronomy. Interestingly, both blog count and Twitter count are associated with increased citations across all fields. While there is only a weak positive correlation between bibliometric and altmetric indices, the results show that altmetric indices can be a good discriminator of highly cited publications, with an encouraging AUC of 0.725 between highly cited publications and total altmetric count. Overall, the findings suggest that altmetrics can help distinguish highly cited publications.
Comment: 34 pages, 3 figures, 15 tables
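The AUC reported in this abstract has a direct probabilistic reading: it is the chance that a randomly chosen highly cited publication carries a higher total altmetric count than a randomly chosen other publication. A minimal sketch of that computation in plain Python, using invented scores purely for illustration (not the paper's data):

```python
def auc(pos_scores, neg_scores):
    """Mann-Whitney form of the AUC: the probability that a score from
    the positive class (highly cited papers) exceeds a score from the
    negative class, counting ties as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical total altmetric counts, purely for illustration.
highly_cited = [42, 17, 9, 30]
other_papers = [3, 0, 12, 1, 7]
print(auc(highly_cited, other_papers))  # prints 0.95
```

An AUC of 0.5 would mean the altmetric count carries no discriminating signal; the paper's 0.725 sits usefully above that baseline.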
Are methodological quality and completeness of reporting associated with citation-based measures of publication impact? A secondary analysis of a systematic review of dementia biomarker studies
Objective: To determine whether methodological and reporting quality are associated with surrogate measures of publication impact in the field of dementia biomarker studies.
Methods: We assessed dementia biomarker studies included in a previous systematic review in terms of methodological and reporting quality using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) and Standards for Reporting of Diagnostic Accuracy (STARD), respectively. We extracted additional study and journal-related data from each publication to account for factors shown to be associated with impact in previous research. We explored associations between potential determinants and measures of publication impact in univariable and stepwise multivariable linear regression analyses.
Outcome measures: We aimed to collect data on four measures of publication impact: two traditional measures (average number of citations per year and the 5-year impact factor of the publishing journal) and two alternative measures (the Altmetric Attention Score and counts of electronic downloads).
Results: The systematic review included 142 studies. Due to limited data, Altmetric Attention Scores and electronic downloads were excluded from the analysis, leaving traditional metrics as the only analysed outcome measures. We found no relationship between QUADAS and traditional metrics. Citation rates were independently associated with 5-year journal impact factor (β=0.42; p<0.001), journal subject area (β=0.39; p<0.001), number of years since publication (β=-0.29; p<0.001) and STARD (β=0.13; p<0.05). Independent determinants of 5-year journal impact factor were citation rates (β=0.45; p<0.001), statement on conflict of interest (β=0.22; p<0.01) and baseline sample size (β=0.15; p<0.05).
Conclusions: Citation rates and 5-year journal impact factor appear to measure different dimensions of impact. Citation rates were weakly associated with completeness of reporting, while neither traditional metric was related to methodological rigour. Our results suggest that high publication usage and a prestigious journal outlet are no guarantee of quality, and readers should critically appraise all papers regardless of presumed impact.
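The standardized β coefficients quoted in this abstract express associations in standard-deviation units; in a univariable linear regression, the standardized β equals the Pearson correlation between predictor and outcome. A small sketch in plain Python (variable names and data are invented for illustration, not taken from the study):

```python
import math

def standardized_beta(x, y):
    """Standardized slope of a univariable linear regression of y on x,
    which equals the Pearson correlation coefficient of x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (sx * sy)

# Invented example: 5-year journal impact factors vs. citations per year.
impact_factor = [1.2, 2.5, 3.1, 4.8, 6.0]
citations_per_year = [0.8, 2.0, 2.9, 5.1, 5.7]
print(round(standardized_beta(impact_factor, citations_per_year), 2))  # prints 0.99
```

In the multivariable (stepwise) setting the abstract describes, each β is instead the coefficient after adjusting for the other retained predictors, so it no longer reduces to a simple correlation.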