
    The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model

    How to quantify the impact of a researcher's or an institution's body of work is a matter of increasing importance to scientists, funding agencies, and hiring committees. The use of bibliometric indicators, such as the h-index or the Journal Impact Factor, has become widespread despite their known limitations. We argue that most existing bibliometric indicators are inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we pursue a principled approach to the development of an indicator to quantify the scientific impact of both individual researchers and research institutions, grounded in the functional form of the distribution of the asymptotic number of citations. We validate our approach using the publication records of 1,283 researchers from seven scientific and engineering disciplines and the chemistry departments at the 106 U.S. research institutions classified as "very high research activity". Our approach has three distinct advantages. First, it accurately captures the overall scientific impact of researchers at all career stages, as measured by asymptotic citation counts. Second, unlike other measures, our indicator is resistant to manipulation and rewards publication quality over quantity. Third, our approach captures the time-evolution of the scientific impact of research institutions. Comment: 20 pages, 11 figures, 3 tables
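    As an illustration of the model named in this abstract, the sketch below fits a discrete lognormal distribution to a set of citation counts by maximum likelihood. The data are synthetic stand-ins and all parameter choices are assumptions for illustration; this is not the paper's actual estimation procedure or data set.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Synthetic citation counts standing in for one researcher's publications.
citations = np.floor(rng.lognormal(mean=2.0, sigma=1.2, size=500)).astype(int)

def neg_log_likelihood(params, counts, n_max=100_000):
    """Negative log-likelihood of a discrete lognormal:
    P(n) proportional to exp(-(ln(n+1) - mu)^2 / (2 sigma^2)), n = 0, 1, 2, ...
    normalized over the support 0..n_max."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    support = np.arange(0, n_max + 1)
    log_w = -((np.log(support + 1) - mu) ** 2) / (2 * sigma**2)
    # Log of the normalization constant via log-sum-exp for stability.
    log_z = np.log(np.exp(log_w - log_w.max()).sum()) + log_w.max()
    log_p = -((np.log(counts + 1) - mu) ** 2) / (2 * sigma**2) - log_z
    return -log_p.sum()

res = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(citations,),
               method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```

    With the two fitted parameters in hand, researchers or departments can be compared by their estimated location parameter rather than by raw counts, which is the kind of distribution-based indicator the abstract argues for.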

    Applied Evaluative Informetrics: Part 1

    This manuscript is a preprint version of Part 1 (General Introduction and Synopsis) of the book Applied Evaluative Informetrics, to be published by Springer in the summer of 2017. This book presents an introduction to the field of applied evaluative informetrics, and is written for interested scholars and students from all domains of science and scholarship. It sketches the field's history, recent achievements, and its potential and limits. It explains the notion of multi-dimensional research performance, and discusses the pros and cons of 28 citation-, patent-, reputation- and altmetrics-based indicators. In addition, it presents quantitative research assessment as an evaluation science, and focuses on the role of extra-informetric factors in the development of indicators, and on the policy context of their application. It also discusses the way forward, both for users and for developers of informetric tools. Comment: The posted version is a preprint (author copy) of Part 1 (General Introduction and Synopsis) of the book Applied Evaluative Informetrics, to be published by Springer in the summer of 2017

    Do ResearchGate Scores create ghost academic reputations?

    The academic social network site ResearchGate (RG) has its own indicator, the RG Score, for its members. The high-profile nature of the site means that the RG Score may be used for recruitment, promotion and other tasks for which researchers are evaluated. In response, this study investigates whether it is reasonable to employ the RG Score as evidence of scholarly reputation. For this, three different author samples were investigated. An outlier sample includes 104 authors with high values. A Nobel sample comprises 73 Nobel winners from Medicine and Physiology, Chemistry, Physics and Economics (from 1975 to 2015). A longitudinal sample includes weekly data on 4 authors with different RG Scores. The results suggest that high RG Scores are built primarily from activity related to asking and answering questions in the site. In particular, it seems impossible to get a high RG Score solely through publications. Within RG it is possible to distinguish between (passive) academics that interact little in the site and active platform users, who can get high RG Scores through engaging with others inside the site (questions, answers, social networks with influential researchers). Thus, RG Scores should not be mistaken for academic reputation indicators.

    Alberto Martin-Martin enjoys a four-year doctoral fellowship (FPU2013/05863) granted by the Ministerio de Educacion, Cultura, y Deporte (Spain). Enrique Orduna-Malea holds a postdoctoral fellowship (PAID-10-14) from the Polytechnic University of Valencia (Spain).

    Orduña-Malea, E., Martín-Martín, A., Thelwall, M., & Delgado-López-Cózar, E. (2017). Do ResearchGate Scores create ghost academic reputations? Scientometrics, 112(1), 443–460. https://doi.org/10.1007/s11192-017-2396-9

    Do academics doubt their own research?

    When do experts doubt or question their own previously published research, and why? An online survey was designed and distributed across academic staff and postgraduate research students at different universities in Great Britain. Respondents (n = 202–244) identified the likelihoods of six (quasi-)hypothetical occurrences causing them to doubt or question work they have published in peer-reviewed journals: two objective and two semi-objective citation-based metrics, plus two semi-objective metrics based on verbalised reactions. This study finds only limited support for the idea that authors of primary research would agree with judgements made by others about their research based on these metrics. The occurrence most likely to cause respondents to doubt or question their previously published research was where the majority of citing studies suggested mistakes in their work. In a multivariate context, only age and nationality are significant determinants of doubt beyond average likelihoods. Understanding and acknowledging what makes authors of primary research doubt their own research could increase the validity of the judgements passed on that research.

    Motivations for self-archiving on an academic social networking site: A study on ResearchGate

    © 2019 ASIS&T. This study investigates motivations for self-archiving research items on academic social networking sites (ASNSs). A model of these motivations was developed based on two existing motivation models: motivation for self-archiving in academia and motivations for information sharing in social media. The proposed model is composed of 18 factors drawn from personal, social, professional, and external contexts: enjoyment, personal/professional gain, reputation, learning, self-efficacy, altruism, reciprocity, trust, community interest, social engagement, publicity, accessibility, self-archiving culture, influence of external actors, credibility, system stability, copyright concerns, and additional time and effort. Two hundred and twenty-six ResearchGate users participated in the survey. Accessibility was the most highly rated factor, followed by altruism, reciprocity, trust, self-efficacy, reputation, publicity, and others. Personal, social, and professional factors were also highly rated, while external factors were rated relatively low. Motivations were correlated with one another, demonstrating that RG motivations for self-archiving could increase or decrease based on several factors in combination with motivations from the personal, social, professional, and external contexts. We believe the findings from this study can increase our understanding of users' motivations in sharing their research and provide useful implications for the development and improvement of ASNS services, thereby attracting more active users.