    Why do papers have many Mendeley readers but few Scopus-indexed citations and vice versa?

    Counts of citations to academic articles are widely used as indicators of their scholarly impact. In addition, alternative indicators derived from social websites have been proposed to cover some of the shortcomings of citation counts. The most promising such indicator is the count of readers of an article in the social reference sharing site Mendeley. Although Mendeley reader counts tend to correlate strongly and positively with citation counts within scientific fields, an understanding of the causes of citation-reader anomalies is needed before Mendeley reader counts can be used with confidence as indicators. In response, this article proposes a list of reasons for anomalies based upon an analysis of articles that are highly cited but have few Mendeley readers, or vice versa. The results show that there are both technical and legitimate reasons for differences, with the latter including communities that use research but do not cite it in Scopus-indexed publications or do not use Mendeley. The results also suggest that the lower of the two values (citation counts, reader counts) tends to underestimate the impact of an article, so taking the maximum is a reasonable strategy for a combined impact indicator.
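
    The max-based combination recommended above is simple to operationalise. The following is a minimal Python sketch; the DOIs and counts are invented for illustration and are not data from the study.

```python
# Minimal sketch of the combined indicator suggested above: take the
# larger of the two counts, since the lower value tends to underestimate
# an article's impact. DOIs and counts are invented for illustration.
articles = [
    {"doi": "10.1000/example-1", "citations": 42, "readers": 7},
    {"doi": "10.1000/example-2", "citations": 3, "readers": 55},
]

for article in articles:
    article["combined_impact"] = max(article["citations"], article["readers"])
    print(article["doi"], article["combined_impact"])
```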

    Differences between journals and years in the proportions of students, researchers and faculty registering Mendeley articles

    This article contains two investigations into Mendeley reader counts using the same dataset. Mendeley reader counts provide evidence of early scholarly impact for journal articles, but reflect the reading of a relatively young subset of all researchers. To investigate whether this age bias is constant or varies by narrow field and publication year, this article compares the proportions of student, researcher and faculty readers for articles published 1996-2016 in 36 large monodisciplinary journals. In these journals, undergraduates recorded the newest research and faculty the oldest, with large differences between journals. The existence of substantial differences in the composition of readers between related fields points to the need for caution when using Mendeley readers as substitutes for citations in broad fields. The second investigation shows, with the same data, that there are substantial differences between narrow fields in the time taken for Scopus citations to become as numerous as Mendeley readers. Thus, even narrow field differences can affect the relative value of Mendeley reader counts compared to citation counts.
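
    The catch-up time examined in the second investigation can be computed simply once yearly citation counts are available per article. The following is an illustrative Python sketch under that assumption, not the authors' code.

```python
# Illustrative sketch (not the authors' code): given yearly Scopus
# citation counts for an article and its current Mendeley reader count,
# find how many years it takes cumulative citations to match readers.
def years_until_citations_catch_up(yearly_citations, reader_count):
    cumulative = 0
    for year, count in enumerate(yearly_citations, start=1):
        cumulative += count
        if cumulative >= reader_count:
            return year
    return None  # citations never caught up in the observed window

# Example: 4 + 9 + 12 citations over three years vs 20 Mendeley readers.
print(years_until_citations_catch_up([4, 9, 12], reader_count=20))  # -> 3
```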

    COVID-19 publications: Database coverage, citations, readers, tweets, news, Facebook walls, Reddit posts

    © 2020 The Authors. Published by MIT Press. This is an open access article available under a Creative Commons licence. The published version can be accessed at the following link on the publisher's website: https://doi.org/10.1162/qss_a_00066. The COVID-19 pandemic requires a fast response from researchers to help address biological, medical and public health issues in order to minimize its impact. In this rapidly evolving context, scholars, professionals and the public may need to quickly identify important new studies. In response, this paper assesses the coverage of scholarly databases and impact indicators during 21 March to 18 April 2020. The rapidly increasing volume of research is most accessible through Dimensions, and less so through Scopus, the Web of Science, and PubMed. Google Scholar's results included many false matches. A few COVID-19 papers from the 21,395 in Dimensions were already highly cited, with substantial news and social media attention. For this topic, in contrast to previous studies, there seems to be a high degree of convergence between articles shared on the social web and citation counts, at least in the short term. In particular, articles that are extensively tweeted on the day they are first indexed are likely to be highly read and relatively highly cited three weeks later. Researchers needing wide-scope literature searches (rather than health-focused PubMed or medRxiv searches) should start with Dimensions (or Google Scholar) and can use tweet and Mendeley reader counts as indicators of likely importance.
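
    The tweet-to-citation relationship described above is the kind of association a rank correlation would capture. Below is a hedged Python sketch; the scipy dependency and all counts are assumptions, not the paper's data.

```python
# Hedged sketch of the kind of relationship reported above: rank-correlate
# tweet counts on the day of first indexing with citation counts three
# weeks later. The scipy dependency and all values are assumptions.
from scipy.stats import spearmanr

day_one_tweets = [120, 3, 45, 0, 210, 18]
citations_after_3_weeks = [15, 0, 6, 1, 22, 3]

rho, p_value = spearmanr(day_one_tweets, citations_after_3_weeks)
print(f"Spearman rho = {rho:.3f} (p = {p_value:.3f})")
```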

    Early Mendeley readers correlate with later citation counts

    This is the accepted manuscript of an article published by Springer in Scientometrics on 26/03/2018, available online at https://doi.org/10.1007/s11192-018-2715-9. The accepted version of the publication may differ from the final published version. Counts of the number of readers registered in the social reference manager Mendeley have been proposed as an early impact indicator for journal articles. Although previous research has shown that Mendeley reader counts for articles tend to have a strong positive correlation with synchronous citation counts after a few years, no previous studies have compared early Mendeley reader counts with later citation counts. In response, this first diachronic analysis compares reader counts within a month of publication with citation counts after 20 months for ten fields. There were moderate or strong correlations in eight out of ten fields, with the two exceptions being the smallest categories (n=18 and n=36), which had wide confidence intervals. The correlations are higher than the correlations between later citations and early citations, showing that Mendeley reader counts are more useful early impact indicators than citation counts.
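
    A diachronic comparison of this kind can be sketched as follows; all numbers are invented, and scipy's spearmanr stands in for whatever correlation procedure the authors used.

```python
# Illustrative version of the diachronic comparison described above
# (all numbers invented): do early Mendeley readers predict citations
# after 20 months better than early citations do?
from scipy.stats import spearmanr

early_readers = [30, 2, 11, 55, 7, 19, 0, 42]
early_citations = [1, 0, 0, 3, 0, 1, 0, 2]
later_citations = [25, 1, 9, 40, 4, 14, 0, 33]

rho_readers, _ = spearmanr(early_readers, later_citations)
rho_early_cites, _ = spearmanr(early_citations, later_citations)
print(f"early readers   vs later citations: rho = {rho_readers:.3f}")
print(f"early citations vs later citations: rho = {rho_early_cites:.3f}")
```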

    Are Mendeley Reader Counts Useful Impact Indicators in all Fields?

    Reader counts from the social reference sharing site Mendeley are known to be valuable for early research evaluation. They have strong correlations with citation counts for journal articles but appear about a year earlier. There are disciplinary differences in the value of Mendeley reader counts, but systematic evidence at the level of narrow fields is needed to reveal their extent. In response, this article compares Mendeley reader counts with Scopus citation counts for journal articles from 2012 in 325 narrow Scopus fields. Despite strong positive correlations in most fields, averaging 0.671, the correlations in some fields are as weak as 0.255. Technical reasons explain most of the weaker correlations, suggesting that the underlying relationship is almost always strong. The exceptions are caused by unusually high educational or professional use, or by topics of interest within countries that avoid Mendeley. The findings suggest that, if care is taken, Mendeley reader counts can be used as early citation impact evidence in almost all fields, and as related impact evidence in some of the remainder. As an additional application of the results, cross-checking with Mendeley data can be used to identify indexing anomalies in citation databases.
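
    The per-field design described above, correlating within each narrow field and then averaging, can be sketched as follows. Field names and counts are invented for illustration.

```python
# Sketch of the per-field analysis described above (field names and
# counts invented): correlate within each narrow field separately,
# then average the coefficients rather than pooling all articles.
from statistics import mean
from scipy.stats import spearmanr

fields = {
    "Oncology": ([10, 4, 7, 25, 2, 13], [40, 12, 30, 80, 9, 41]),
    "Philosophy": ([3, 0, 5, 1, 8, 2], [2, 1, 9, 0, 12, 4]),
}

rhos = []
for field, (citations, readers) in fields.items():
    rho, _ = spearmanr(citations, readers)
    rhos.append(rho)
    print(f"{field}: rho = {rho:.3f}")

print(f"mean correlation across fields = {mean(rhos):.3f}")
```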

    Can Microsoft Academic be used for citation analysis of preprint archives? The case of the Social Science Research Network

    This is the accepted manuscript of an article published by Springer in Scientometrics on 07/03/2018, available online at https://doi.org/10.1007/s11192-018-2704-z. The accepted version of the publication may differ from the final published version. Preprint archives play an important scholarly communication role within some fields. The impact of archives and individual preprints is difficult to analyse because online repositories are not indexed by the Web of Science or Scopus. In response, this article assesses whether the new Microsoft Academic can be used for citation analysis of preprint archives, focusing on the Social Science Research Network (SSRN). Although Microsoft Academic seems to index SSRN comprehensively, it groups only a small fraction of SSRN papers into an easily retrievable set, and that set varies in character over time, making any field normalisation or citation comparison untrustworthy. A brief parallel analysis of arXiv suggests that similar results would occur for other online repositories. Systematic analyses of preprint archives are nevertheless possible with Microsoft Academic when complete lists of archive publications are available from other sources, because of its promising coverage and citation results.

    Devising a Resilience Rating System For Charities & The Non-Profit Sector

    One of the sectoral issues that COVID-19 has shone a light on is that, whilst social investors, grant funders and sector support organisations acquire detailed data about the activities they have commissioned individually, they do not have access to a similar level of data about the wider sectors in which they operate. This report was written in the first four weeks of lockdown in 2020 and develops a framework for assessing the financial resilience of Third Sector Organisations.

    A Path Toward the Use of Trail Users’ Tweets to Assess Effectiveness of the Environmental Stewardship Scheme: An Exploratory Analysis of the Pennine Way National Trail

    Large and unofficial data sets, for instance those gathered from social media, are increasingly being used in geographical research and explored as decision support tools for policy development. Social media data have the potential to provide new insight into phenomena about which there is little information from conventional sources. Within this context, this paper explores the potential of social media data to evaluate the aesthetic management of landscape. Specifically, this project utilises the perceptions of visitors to the Pennine Way National Trail, which passes through land managed under the Environmental Stewardship Scheme (ESS). The method analyses sentiment in trail users' public Twitter messages (tweets) with the aim of assessing the extent to which the ESS maintains landscape character within the trail corridor. The method demonstrates the importance of filtering social media data to convert it into useful information. After filtering, the results are based on 161 messages directly related to the trail. Although small, this sample illustrates the potential for social media to be used as a cheap and increasingly abundant source of information. We suggest that social media data in this context should be seen as a resource that can complement, rather than replace, conventional data sources such as questionnaires and interviews. Furthermore, we provide guidance on how social media could be effectively used by conservation bodies, such as Natural England, which are charged with the management of areas of environmental value worldwide.
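
    The filter-then-score pipeline described above can be illustrated with a toy example. The keywords and the tiny sentiment lexicon below are assumptions for demonstration, not the study's actual method or data.

```python
# Minimal sketch of a filter-then-score pipeline of the kind described
# above. The keywords and the tiny sentiment lexicon are toy assumptions,
# not the study's actual method or data.
TRAIL_KEYWORDS = {"pennine way", "pennineway"}
POSITIVE_WORDS = {"beautiful", "stunning", "lovely", "great"}
NEGATIVE_WORDS = {"eroded", "boggy", "litter", "awful"}

def relates_to_trail(tweet: str) -> bool:
    """Keep only tweets that mention the trail."""
    return any(keyword in tweet.lower() for keyword in TRAIL_KEYWORDS)

def sentiment_score(tweet: str) -> int:
    """Positive minus negative word matches: > 0 means positive tone."""
    words = set(tweet.lower().split())
    return len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)

tweets = [
    "Stunning views walking the Pennine Way today!",
    "Traffic on the M6 is awful.",                      # filtered out
    "Pennine Way path badly eroded near Kinder Scout.",
]

for tweet in filter(relates_to_trail, tweets):
    print(sentiment_score(tweet), tweet)
```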

    Do ResearchGate Scores create ghost academic reputations?

    The academic social network site ResearchGate (RG) has its own indicator, the RG Score, for its members. The high profile of the site means that the RG Score may be used for recruitment, promotion and other tasks for which researchers are evaluated. In response, this study investigates whether it is reasonable to employ the RG Score as evidence of scholarly reputation. Three different author samples were investigated: an outlier sample of 104 authors with high values; a Nobel sample of 73 Nobel winners from Medicine and Physiology, Chemistry, Physics and Economics (1975 to 2015); and a longitudinal sample with weekly data on 4 authors with different RG Scores. The results suggest that high RG Scores are built primarily from activity related to asking and answering questions on the site. In particular, it seems impossible to get a high RG Score solely through publications. Within RG it is possible to distinguish between (passive) academics who interact little on the site and active platform users, who can obtain high RG Scores by engaging with others inside the site (questions, answers, social networks with influential researchers). Thus, RG Scores should not be mistaken for academic reputation indicators. Alberto Martin-Martin enjoys a four-year doctoral fellowship (FPU2013/05863) granted by the Ministerio de Educacion, Cultura, y Deporte (Spain). Enrique Orduna-Malea holds a postdoctoral fellowship (PAID-10-14) from the Polytechnic University of Valencia (Spain). Orduña Malea, E.; Martín-Martín, A.; Thelwall, M.; Delgado-López-Cózar, E. (2017). Do ResearchGate Scores create ghost academic reputations? Scientometrics, 112(1), 443-460. https://doi.org/10.1007/s11192-017-2396-9

    U.S. academic libraries: understanding their web presence and their relationship with economic indicators

    The final publication is available at Springer via http://dx.doi.org/10.1007/s11192-013-1001-0. The main goal of this research is to analyze the web structure and performance of units and services belonging to U.S. academic libraries in order to check their suitability for webometric studies. Our objectives include studying their possible correlation with economic data and assessing their use for complementary evaluation purposes. We conducted a survey of library homepages, institutional repositories, digital collections, and online catalogs (a total of 374 URLs) belonging to the 100 U.S. universities with the highest total expenditures on academic libraries, according to data provided by the National Center for Education Statistics. Several data points were taken and analyzed, including web variables (page count, external links, and visits) and economic variables (total expenditures, expenditures on printed and electronic books, and physical visits). The results indicate that the variety of URL syntaxes is wide, diverse and complex, which misrepresents academic libraries' web resources and reduces the accuracy of web analysis. Moreover, institutional and web data indicators are not highly correlated. The best results are obtained by correlating total library expenditures with URL mentions measured by Google (r = 0.546) and with visits measured by Compete (r = 0.573). Because the correlation values obtained are not highly significant, we expect such correlations to increase if users can avoid linkage problems (due to the complexity of URLs) and gain direct access to log files (for more accurate data about visits). Orduña Malea, E.; Regazzi, J. J. (2014). U.S. academic libraries: understanding their web presence and their relationship with economic indicators. Scientometrics, 98(1), 315-336. https://doi.org/10.1007/s11192-013-1001-0
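
    The correlation step reported above is a plain Pearson calculation. Below is a hedged Python sketch with six invented data points, not the study's figures for the 100 universities.

```python
# Hedged sketch of the correlation step reported above; the six data
# points are invented, not the study's figures for 100 universities.
from scipy.stats import pearsonr

total_expenditures = [45.2, 30.1, 28.7, 22.4, 19.9, 15.3]      # $ millions
url_mentions = [5200, 3100, 4000, 1800, 2500, 900]             # via Google
estimated_visits = [91000, 40000, 52000, 30000, 36000, 12000]  # via Compete

r_mentions, _ = pearsonr(total_expenditures, url_mentions)
r_visits, _ = pearsonr(total_expenditures, estimated_visits)
print(f"expenditures vs URL mentions: r = {r_mentions:.3f}")
print(f"expenditures vs visits:       r = {r_visits:.3f}")
```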