46 research outputs found

    Inter-field nonlinear transformation of journal impact indicators: The case of the h-index

    Full text link
    [EN] Impact indices used for the joint evaluation of research items coming from different scientific fields must be comparable. Often a linear transformation (a normalization or another basic operation) is considered to be enough to translate them into a unified setting in which all the fields are adequately treated. In this paper it is shown that this is not always true. Attention is centered on the case of the h-index: it is proved that it cannot be translated by means of direct normalization while preserving its genuine meaning. Using the universality of the citation distribution, it is shown that a slight variant of the h-index is necessary for this notion to produce comparable values when applied to different scientific fields. A complete example concerning a group of top scientists is shown.
    The first author was supported by Ministerio de Economía, Industria y Competitividad under Research Grant CSO2015-65594-C2-1R y 2R (MINECO/FEDER, UE). The second author was supported by Ministerio de Economía, Industria y Competitividad and FEDER under Research Grant MTM2016-77054-C2-1-P.
    Ferrer Sapena, A.; Sánchez Pérez, E. A. (2019). Inter-field nonlinear transformation of journal impact indicators: The case of the h-index. Journal of Interdisciplinary Mathematics, 22(2), 177-199. https://doi.org/10.1080/09720502.2019.1616913
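    The h-index can be computed directly from a citation record. The short sketch below (with hypothetical citation counts) also illustrates the abstract's point that a direct linear normalization fails: dividing every citation count in a field by a constant does not divide the h-index by that constant.

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
    return h

# Hypothetical records: field B is cited about four times less than field A.
field_a = [40, 30, 20, 10, 4, 2]
field_b = [c // 4 for c in field_a]   # [10, 7, 5, 2, 1, 0]

print(h_index(field_a))  # 4
print(h_index(field_b))  # 3, not 4/4 = 1: h is not linear in the citations
```

    Dividing the citations by 4 only lowers h from 4 to 3, so a simple rescaling of citation counts cannot serve as an inter-field normalization of the h-index itself.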

    Vector-valued impact measures and generation of specific indexes for research assessment

    Full text link
    A mathematical structure for defining multi-valued bibliometric indices is provided, with the aim of measuring the impact of general sources of information other than articles and journals (for example, repositories of datasets). The aim of the model is to use several scalar indices at the same time to measure the impact of a given source of information; that is, we construct vector-valued indices. We use the properties of these vector-valued indices to give a global answer to the problem of finding the optimal scalar index for measuring a particular aspect of the impact of an information source, depending on the criterion we want to fix for the evaluation of this impact. The main restrictions of our model are that (1) it uses finite sets of scalar impact indices (altmetrics), and (2) these indices are assumed to be additive. The optimization procedure for finding the best tool for a fixed criterion is also presented. In particular, we show how to create an impact measure completely adapted to the policy of a specific research institution.
    Calabuig, J. M.; Ferrer Sapena, A.; Sánchez Pérez, E. A. (2016). Vector-valued impact measures and generation of specific indexes for research assessment. Scientometrics, 108(3), 1425-1443. https://doi.org/10.1007/s11192-016-2039-6
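    As a rough illustration of the construction described above (hypothetical data, and a least-squares fitting criterion of our own choosing rather than the paper's exact optimization procedure), one can bundle several additive scalar indices into a vector-valued index and then derive an institution-specific scalar index as a nonnegative weighted combination:

```python
import numpy as np

# Columns: three hypothetical scalar indices (e.g. citations, downloads,
# social mentions); rows: five information sources being evaluated.
X = np.array([
    [0.9, 0.2, 0.1],
    [0.4, 0.8, 0.3],
    [0.1, 0.1, 0.9],
    [0.7, 0.5, 0.2],
    [0.3, 0.9, 0.6],
])

# The vector-valued index maps each source to its vector of scalar values.
vector_index = {f"source_{i}": row for i, row in enumerate(X)}

# Hypothetical target scores expressing the institution's own criterion.
target = np.array([0.8, 0.6, 0.3, 0.7, 0.6])

# Fit weights by least squares and clip to keep them nonnegative: the
# weighted sum is then a scalar index adapted to that criterion.
w, *_ = np.linalg.lstsq(X, target, rcond=None)
w = np.clip(w, 0.0, None)
scalar_index = X @ w
```

    The additivity assumption in the abstract is what makes a weighted sum of the components a legitimate scalar index.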

    Correlation between impact factor and public availability of published research data in information science & library science journals

    Get PDF
    Paper presented at the 15th International Society of Scientometrics and Informetrics Conference, held in Istanbul (Turkey) from 29 June to 4 July 2015.
    Scientists continuously generate research data, but only a small part of them is published. If these data were accessible and reusable, researchers could examine them and generate new knowledge. Currently, the barriers to data sharing are being phased out, and public research organizations are demanding ever more insistently that publications resulting from publicly funded projects, and the data that support them, should be published openly (Savage & Vickers, 2009). The purpose of this work is: a) to analyse policies concerning open availability of raw research data in journals in Information Science & Library Science (ISLS); and b) to determine whether there is a correlation between the impact factor and the policies of these journals concerning storage and reuse of scientific data.
    This work has benefited from assistance by the National R+D+I of the Ministry of Economy and Competitiveness of the Spanish Government (CSO2012-39632-C02-01) and the Prometeo Program for excellent research groups of the Generalitat Valenciana (GVPROMETEO2013-041).
    Peer Reviewed

    Open data, big data: where do we go?

    Get PDF
    The "open data" movement is becoming more popular in different contexts (public administration, science and companies), so it is important to know its scope. The term has generated many expectations related to transparency at all levels, data reuse and free access. This review discusses some of the expectations generated by this movement.

    Information Management in Healthcare and Environment: Towards an Automatic System for Fake News Detection

    Full text link
    [EN] Comments and information appearing on the internet and on different social media sway opinion concerning potential remedies for diagnosing and curing diseases. In many cases this has an impact on citizens' health, and it affects medical professionals, who find themselves having to defend their diagnoses, as well as the treatments they propose, against ill-informed patients. The propagation of these opinions follows the same pattern as the dissemination of fake news about other important topics, such as the environment, via social media networks, which we use as a testing ground for checking our procedure. In this article, we present an algorithm to analyse the behaviour of users of Twitter, the most important social network with respect to this issue, as well as a dynamic knowledge graph construction method based on information gathered from Twitter and other open data sources such as web pages. To show our methodology, we present a concrete example of how the associated graph structure of the tweets related to World Environment Day 2019 is used to develop a heuristic analysis of the validity of the information. The proposed analytical scheme is based on the interaction between the computer tool (a database implemented with Neo4j) and the analyst, who must ask the right questions of the tool, making it possible to follow the thread of any doubtful data. We also show how this method can be used and present some methodological guidelines on how our system could allow, in the future, the automation of these procedures towards an autonomous algorithm for the detection of false news on the internet related to health.
    The first-named and fourth-named authors were supported by the Spanish Ministry for Science, Innovation and Universities, the Spanish State Research Agency and the European Regional Development Fund under Research Grant CSO2015-65594-C2-1R y 2R, and by the Cátedra de Transparencia y Gestión de Datos UPV/GVA. The third-named author was supported by the Spanish Ministry for Science, Innovation and Universities, the Spanish State Research Agency and the European Regional Development Fund under Research Grant MTM2016-77054-C2-1-P.
    Lara-Navarra, P.; Falciani, H.; Sánchez Pérez, E. A.; Ferrer Sapena, A. (2020). Information Management in Healthcare and Environment: Towards an Automatic System for Fake News Detection. International Journal of Environmental Research and Public Health (Online), 17(3), 1-12. https://doi.org/10.3390/ijerph17031066
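    The article stores its graph in Neo4j; the toy sketch below uses Python's networkx instead, with hypothetical users, tweets and one cited web source, to show the kind of question an analyst would ask of such a graph: how many independent accounts actually stand behind a circulating claim.

```python
import networkx as nx

# Hypothetical knowledge graph: users post or retweet tweets, and tweets
# cite an external source making a dubious health claim.
G = nx.DiGraph()
G.add_edge("user_a", "tweet_1", kind="POSTED")
G.add_edge("user_b", "tweet_1", kind="RETWEETED")
G.add_edge("user_c", "tweet_1", kind="RETWEETED")
G.add_edge("tweet_1", "example.org/remedy", kind="CITES")
G.add_edge("user_b", "tweet_2", kind="POSTED")
G.add_edge("tweet_2", "example.org/remedy", kind="CITES")

def supporting_users(g, source):
    """Distinct accounts behind the tweets that cite a given source."""
    tweets = [u for u, _, d in g.in_edges(source, data=True)
              if d["kind"] == "CITES"]
    return {u for t in tweets for u, _, _ in g.in_edges(t, data=True)}

print(supporting_users(G, "example.org/remedy"))
```

    A claim whose whole spread traces back to very few distinct accounts is a natural candidate for the "doubtful data" the analyst then follows up by hand.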

    Index spaces and standard indices in metric modelling

    Get PDF
    We analyze the basic structure of certain metric models, which are constituted by an index I acting on a metric space (D, d) representing a relevant property of the elements of D. We call such a structure (D, d, I) an index space and define on it normalization and consistency constants that measure to what extent I is compatible with the metric d. The "best" indices are those whose constants equal 1 (standard indices), and we show an approximation method for other indices using them. With the help of Lipschitz extensions, we show how to apply these tools: a new model for the triage process in the emergency department of a hospital is presented.
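    A rough numerical sketch of the compatibility idea (the definition here is our own simplification; the paper's normalization and consistency constants are defined in this spirit but may differ in detail): on a finite index space one can measure how far I is from being 1-Lipschitz with respect to d by taking the largest ratio |I(x) - I(y)| / d(x, y) over distinct pairs.

```python
from itertools import combinations

# Hypothetical finite index space: four elements, an index I on them, and a
# metric d (here, distance between their positions in the list D).
D = ["p1", "p2", "p3", "p4"]
I = {"p1": 0.1, "p2": 0.4, "p3": 0.7, "p4": 1.0}

def d(x, y):
    return abs(D.index(x) - D.index(y))

# Lipschitz-type ratio measuring how compatible I is with d.
ratios = [abs(I[x] - I[y]) / d(x, y) for x, y in combinations(D, 2)]
lipschitz_constant = max(ratios)
print(lipschitz_constant)  # close to 0.3 for this data
```

    An index whose constant is exactly 1 would play the role of a "standard" index in the abstract's terminology.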

    Design Trend Forecasting by Combining Conceptual Analysis and Semantic Projections: New Tools for Open Innovation

    Full text link
    [EN] In this paper, we describe a new trend analysis and forecasting method (Deflexor), which is intended to help inform decisions in almost any field of human social activity, including, for example, business, art and design. As a result of the combination of conceptual analysis, fuzzy mathematics and some new reinforcement learning methods, we propose an automatic procedure based on Big Data that provides an assessment of the evolution of design trends. The resulting tool can be used to study general trends in any field, depending on the data sets used, while allowing the evaluation of the future acceptance of a particular design product, becoming in this way a new instrument for Open Innovation. The mathematical characterization of what a semantic projection is, together with the use of the theory of Lipschitz functions in metric spaces, provides a broad-spectrum predictive tool. Although the results depend on the data sets used, the periods of updating and the sources of general information, our model allows for the creation of specific tools for trend analysis in particular fields that are adaptable to different environments.
    This research was funded by Istituto Europeo di Design and Generalitat Valenciana, Cátedra de Transparencia y Gestión de Datos, Universitat Politècnica de València (PID2019-105708RB-C21 (MICIU/FEDER, UE)).
    Manetti, A.; Ferrer Sapena, A.; Sánchez Pérez, E. A.; Lara-Navarra, P. (2021). Design Trend Forecasting by Combining Conceptual Analysis and Semantic Projections: New Tools for Open Innovation. Journal of Open Innovation: Technology, Market, and Complexity, 7(1), 1-26. https://doi.org/10.3390/joitmc7010092
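    The Lipschitz-function machinery mentioned in the abstract can be made concrete with a standard construction, the McShane extension, which extends scores known on a few sample points to the whole metric space while keeping the same Lipschitz constant. The sketch below uses hypothetical design features and acceptance scores; the paper's semantic projections are more elaborate.

```python
# Hypothetical samples: 2-D feature vectors of designs -> acceptance score.
samples = {(0.0, 0.0): 0.2, (1.0, 0.0): 0.8, (0.0, 1.0): 0.5}

def d(x, y):
    """Euclidean distance between feature vectors."""
    return ((x[0] - y[0]) ** 2 + (x[1] - y[1]) ** 2) ** 0.5

# Smallest Lipschitz constant of the sampled scores.
L = max(abs(samples[a] - samples[b]) / d(a, b)
        for a in samples for b in samples if a != b)

def extend(x):
    """McShane extension: F(x) = min over samples y of f(y) + L * d(x, y)."""
    return min(f + L * d(x, y) for y, f in samples.items())

print(extend((0.5, 0.5)))  # predicted score for an unseen design
```

    The extension agrees with the known scores at the sample points and stays L-Lipschitz everywhere, which is what makes it usable as a broad-spectrum predictor.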

    Applications of blockchain technology in scientific documentation: current situation and perspectives

    Full text link
    [EN] Blockchain technology is being introduced in all areas and procedures of verification, identification and recording in the virtual world. Some of its applications are simple translations of methods already developed on other technical supports, but most of them provide new utilities that can change the technological context of scientific documentation. In this article we present an updated panorama of the initiatives based on blockchain for the management of scientific and technological documentation, highlighting those aspects that make this new technology a differentiated and genuinely novel tool. A critical analysis with theoretical and practical considerations on its potential use completes the work.
    Funded by the Spanish Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación and FEDER. E. A. Sánchez-Pérez: project MTM2016-77054-C2-1-P. A. Ferrer-Sapena: project CSO2015-65594-C2-1R y 2R (MINECO/FEDER, UE).
    Ferrer Sapena, A.; Sánchez Pérez, E. A. (2019). Applications of blockchain technology in scientific documentation: current situation and perspectives. El profesional de la información, 28(2), 1-11. https://doi.org/10.3145/epi.2019.mar.10
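    The verification-and-recording role that the article attributes to blockchain rests on hash chaining. The minimal sketch below (not the system of any initiative surveyed in the article) registers hypothetical document records so that any later alteration invalidates the chain:

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Create a record whose hash covers its payload and its predecessor."""
    block = {"prev": prev_hash, "payload": payload}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    """Check the linkage and recompute every hash."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != prev["hash"]:
            return False
    return all(
        b["hash"] == hashlib.sha256(json.dumps(
            {"prev": b["prev"], "payload": b["payload"]},
            sort_keys=True).encode()).hexdigest()
        for b in chain)

# Hypothetical registrations of two scientific documents.
genesis = make_block("0" * 64, {"doc": "dataset-v1", "author": "A. Author"})
second = make_block(genesis["hash"], {"doc": "preprint-v1", "author": "A. Author"})
print(chain_is_valid([genesis, second]))  # True
```

    Because each record's hash covers the previous record's hash, silently editing an already-registered document breaks every later link, which is the property the surveyed initiatives exploit for scientific documentation.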

    Special characteristics of the dissemination of mathematics research

    Get PDF
    [EN] We present and discuss some differential aspects of scientific publications in pure mathematics, in relation to other sciences, that make this discipline a special case from the point of view of bibliometrics, information science, and the productivity assessment of mathematics researchers.
    Sánchez Pérez, E. A.; Ferrer Sapena, A. (2015). Características especiales de la difusión científica de matemáticas. Anuario ThinkEPI, 9, 196-200. https://doi.org/10.3145/thinkepi.2015.45

    Correlation between impact factor and public availability of published research data in Information Science and Library Science journals

    Full text link
    The final publication is available at Springer via http://dx.doi.org/10.1007/s11192-016-1868-7
    [EN] Scientists continuously generate research data, but only a small part of them are published. If these data were accessible and reusable, researchers could examine them and generate new knowledge. Our purpose is to determine whether there is a relationship between the impact factor and the policies concerning open availability of raw research data in journals of the Information Science and Library Science (ISLS) subject category of the Web of Science database. We reviewed the policies related to public availability of papers and data sharing in the 85 journals included in the ISLS category of the Journal Citation Reports in 2012, and analysed the relationship between public availability of published data and the impact factor of the journals through different statistical tests. The variable "statement of complementary material" was accepted in 50 % of the journals; 65 % of the journals support "reuse"; 67 % specified "storage in thematic or institutional repositories"; and "publication of the manuscript in a website" was accepted in 69 % of the journals. In short, 50 % of the journals include the possibility to deposit data as supplementary material, and more than 60 % accept reuse, storage in repositories and publication in websites. There is a clear positive relationship between being a top journal in the impact factor ranking of the JCR and having an open policy.
    This work has benefited from assistance by the National R+D+I of the Ministry of Economy and Competitiveness of the Spanish Government (CSO2012-39632-C02).
    Aleixandre-Benavent, R.; Moreno-Solano, L.; Ferrer Sapena, A.; Sánchez Pérez, E. A. (2016). Correlation between impact factor and public availability of published research data in Information Science and Library Science journals. Scientometrics, 107(1), 1-13. https://doi.org/10.1007/s11192-016-1868-7
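    The association the study reports, between being in the top half of the impact factor ranking and having an open-data policy, is the kind of relationship usually tested on a 2x2 contingency table. The sketch below uses hypothetical counts (not the paper's data) and a hand-rolled Pearson chi-square statistic:

```python
# Hypothetical 2x2 contingency table: journals split by impact-factor half
# and by whether they have an open-data policy.
#                open policy   no policy
table = [[30, 12],   # top-IF half
         [18, 25]]   # bottom-IF half

def chi_square(t):
    """Pearson chi-square statistic for a 2x2 table (1 degree of freedom)."""
    row = [sum(r) for r in t]
    col = [sum(c) for c in zip(*t)]
    n = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            stat += (t[i][j] - expected) ** 2 / expected
    return stat

print(round(chi_square(table), 2))  # compare against 3.84 (5 % level, 1 dof)
```

    A statistic above the 3.84 critical value would reject independence at the 5 % level, i.e. support an association between impact factor rank and open policy.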