
    Scientometric Analysis of the Scientometric Literature

    Using bibliographic records from the Social Science Citation Index, Science Citation Index, and Arts & Humanities Citation Index, this paper gives a comprehensive view of the evolution of the field of Scientometrics based on its literature published from 1980 to 2009. This is a descriptive survey using scientometric indicators. Findings revealed that, of 691 articles in the field of Scientometrics, 183 (26.48%) were written during 1980 to 2009 by the top ten authors; some of these articles were produced in collaboration and some by single authors. Geographical analysis indicated that the field had evolved considerably in different regions of the world. The Hungarian Academy of Sciences, with 40 records (5.71%), was the most productive institution in the field. Furthermore, chronological analysis showed that scientific production in the field of Scientometrics increased slowly from 1980 to 2009. The overwhelming majority of documents were in English, and the international journal Scientometrics was the most prolific journal in the field. 67.87% of the literature was published in the area of Library and Information Science.

    Revealing the online network between university and industry: the case of Turkey

    The present paper explores the relationship between the Turkish academic and industry systems by mapping their relationships through web indicators. We used the top 100 Turkish universities and the top 10 Turkish companies in 10 industrial sectors in order to observe the performance of web impact indicators. The total page count metric was obtained through Google Turkey, and the pure link metrics were gathered from Open Site Explorer. The indicators obtained for both web presence and web visibility showed significant differences between the group of academic institutions and the group of companies within the Turkish web space. However, this study is exploratory and should be replicated with a larger sample of both Turkish universities and companies in each sector. Likewise, a longitudinal rather than cross-sectional study would eliminate or smooth fluctuations of web data (especially URL mentions) as a more adequate understanding of the relations between Turkish institutions and their web impact is reached.
    Orduña Malea, E.; Aytac, S. (2015). Revealing the online network between university and industry: the case of Turkey. Scientometrics, 105(3), 1849-1866. doi:10.1007/s11192-015-1596-4

    Crossing the academic ocean? Judit Bar-Ilan's oeuvre on search engines studies

    The main objective of this work is to analyse Judit Bar-Ilan's contributions to search engine studies. To do this, two complementary approaches have been carried out. First, a systematic literature review of 47 publications authored or co-authored by Judit and devoted to this topic. Second, an interdisciplinarity analysis based on the cited references (publications cited by Judit) and citing documents (publications that cite Judit's work) through Scopus. The systematic literature review unravels an immense number of search engines studied (43) and indicators measured (especially technical precision, overlap, and fluctuation over time). In addition, an evolution over the years is detected, from descriptive statistical studies towards empirical user studies with a mixture of quantitative and qualitative methods. The interdisciplinarity analysis evidences that a significant portion of Judit's oeuvre was intellectually founded on computer science, achieving a significant, but not exclusive, impact on library and information science.
    Orduña-Malea, E. (2020). Crossing the academic ocean? Judit Bar-Ilan's oeuvre on search engines studies. Scientometrics, 123(3), 1317-1340. https://doi.org/10.1007/s11192-020-03450-4
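Two of the recurring indicators summarized above, overlap between the result sets of two engines and fluctuation of one engine's results over time, can be sketched with a small computation. This is a hypothetical illustration: the URL lists are invented, and Jaccard overlap is one common way such set overlap is operationalized, not necessarily the exact formula used in each of the reviewed studies.

```python
def overlap(a, b):
    """Jaccard overlap of two sets of result URLs: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Invented result lists for one query
engine1 = ["u1", "u2", "u3", "u4"]   # results from engine A
engine2 = ["u3", "u4", "u5"]         # results from engine B
week1   = ["u1", "u2", "u3"]         # engine A, first crawl
week2   = ["u2", "u3", "u6"]         # engine A, a later crawl

engine_overlap = overlap(engine1, engine2)  # engines agree on 2 of 5 distinct URLs
fluctuation = 1 - overlap(week1, week2)     # share of distinct URLs that changed
```

Low overlap between engines and high fluctuation over time were, per the abstract, typical findings of these studies.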

    Trends in Russian research output indexed in Scopus and Web of Science

    Trends are analysed in the annual number of documents published by Russian institutions and indexed in Scopus and Web of Science, with special attention to the period starting in 2013, when Project 5-100 was launched by the Russian Government. Numbers are broken down by document type, publication language, type of source, research discipline, country, and source. It is concluded that Russian publication counts depend strongly on the database used and on changes in database coverage, and that one should be cautious when using indicators derived from WoS, and especially from Scopus, as tools for measuring the research performance and international orientation of the Russian science system.
    Comment: Author copy of a manuscript accepted for publication in the journal Scientometrics, May 201

    Novel Approaches to the Development and Application of Informetric and Scientometric Tools Special Issue of Journal of Data and Information Science on ISSI2019 Conference-Part II

    This is the second part of the Journal of Data and Information Science (JDIS) Special Issue on ISSI 2019, the 17th International Conference on Scientometrics and Informetrics (ISSI2019), held in Rome on 2–5 September 2019. It includes an additional 10 selected posters presented during the conference and largely expanded by the authors afterwards. The papers included in this volume have been grouped into three broad themes: Indicators & Databases (4 papers); Social Context, Innovation, and Policy (3 papers); Application Domains (3 papers).

    Self-defined information indices: application to the case of university rankings

    University rankings are now relevant decision-making tools for both institutional and private purposes in the management of higher education and research. However, they are often computed only for a small set of institutions, using sophisticated parameters. In this paper we present a new and simple algorithm that approximates these indices using standard bibliometric variables, such as the number of citations of the scientific output of universities and the number of articles per quartile. To demonstrate the technique, some results for the ARWU index are presented. From a technical point of view, our technique, which follows a standard machine learning scheme, is based on the interpolation of two classical extrapolation formulas for Lipschitz functions defined on metric spaces, the so-called McShane and Whitney formulae. In the model, the elements of the metric space are the universities, distances are measured using data that can be extracted from the InCites database, and the Lipschitz function is the ARWU index.
    The third and fourth authors gratefully acknowledge the support of the Ministerio de Ciencia, Innovación y Universidades (Spain), Agencia Estatal de Investigación, and FEDER, under Grant MTM2016-77054-C2-1-P. The first author gratefully acknowledges the support of the Cátedra de Transparencia y Gestión de Datos, Universitat Politècnica de València y Generalitat Valenciana, Spain.
    Ferrer Sapena, A.; Erdogan, E.; Jiménez-Fernández, E.; Sánchez Pérez, E. A.; Peset Mancebo, M. F. (2020). Self-defined information indices: application to the case of university rankings. Scientometrics, 124(3), 2443-2456. https://doi.org/10.1007/s11192-020-03575-6
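The McShane and Whitney formulae mentioned in the abstract have a short closed form: for a function f known on a sample S with Lipschitz constant K, the McShane (lower) extension at a new point x is max over s in S of f(s) - K·d(x, s), and the Whitney (upper) extension is min over s in S of f(s) + K·d(x, s). The following sketch illustrates the general technique only, with an invented two-dimensional toy dataset and a midpoint interpolation of the two formulae; the paper's actual model uses InCites variables and the ARWU index.

```python
import math

def lipschitz_constant(points, values, dist):
    """Smallest K consistent with the sample: max of |f(a) - f(b)| / d(a, b)."""
    K = 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = dist(points[i], points[j])
            if d > 0:
                K = max(K, abs(values[i] - values[j]) / d)
    return K

def mcshane(x, points, values, K, dist):
    """Lower (McShane) extension: max over the sample of f(s) - K*d(x, s)."""
    return max(v - K * dist(x, s) for s, v in zip(points, values))

def whitney(x, points, values, K, dist):
    """Upper (Whitney) extension: min over the sample of f(s) + K*d(x, s)."""
    return min(v + K * dist(x, s) for s, v in zip(points, values))

# Invented toy data: three "universities" as 2-D bibliometric vectors
# (e.g. normalized citations, Q1 share) with made-up index values.
dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
sample = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
index = [10.0, 20.0, 30.0]

K = lipschitz_constant(sample, index, dist)
x = (0.5, 0.5)  # a hypothetical unranked university
lower = mcshane(x, sample, index, K, dist)
upper = whitney(x, sample, index, K, dist)
estimate = (lower + upper) / 2  # midpoint interpolation of the two extensions
```

Any extension of f that preserves the Lipschitz constant lies between the McShane and Whitney values, so their interpolation is a natural point estimate for the index of an institution outside the ranked sample.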

    Analytical Study of the Most Cited International Research Journals of Library and Information Science

    The focus of this paper is on the research productivity, and related parameters, of the top twenty international journals of Library and Information Science from 2015 to 2019. The analysis of the data revealed that the Journal of the Association for Information Science and Technology (JASIST) was the most cited journal, followed by Scientometrics and the Journal of Informetrics, during the period studied. The year 2015 was the most productive year, with 45% of the citations from the top twenty LIS journals. Google Scholar showed the highest average citation rate, followed by Scopus. "The sharing economy" was the most cited research paper with 2391 citations, followed by "The journal coverage of Web of Science and Scopus" with 688 citations.