
    Editors and Reviewers Acknowledgement, 3(1), January-June, 2019

    The Journal of Medical Research and Innovation would like to thank everyone who helped us review and edit the articles. As a small token of appreciation, we list here, in random order, the names of all the editors and reviewers who edited or reviewed articles for the January 2019 issue. Editors: Varshil Mehta, Shakti Goel, Sojib Bin Zaman. Reviewers (the list includes reviewers of both accepted and rejected papers): Shyam Vora, Ankit Nayak, Hemant Chouhan, Ruby Aikat, Pravin Padalkar, Nishu Tyagi, Rajesh Sharawat, Harsha Makwana, Chinmay Jani, Nishtha Agarwal, Jitendra Singh, Vishal Kamra, Sakshi Shandilya, Dyuti Mittal, Mehrdad Ghorbanlou, SSSN Rajasekhar, Sharmin Majumder, Raihan Khan, Ishpreet Biji, Rahul Kotia.

    Letter from the Editor

    In this issue, Tatiana Cevallos describes her journey from Ecuador to the United States, her journey of faith development, and how those journeys influence her work as a teacher educator at a Christian Institution of Higher Education (IHE). Geoff Beech explores the intersection of Christian belief with secular constructs and philosophies, examining how Christian teacher educators navigate these intersections with confidence and grace. Marion Shields and David Bolton report the findings of a five-year study revealing the attitudes of teacher candidates at an Australian Christian IHE toward students with disabilities. In addition to these three pieces, our editorial team asked two authors of past essays that have been well received by our readership over the years to provide updates to their original essays. Nyaradzo Mvududu examines the command of Jesus that we love others and its implications for working with a culturally diverse student population. David Anderson examines the notion of servant leadership from a Christian point of view.

    Business Meeting Report (Secretary's and Treasurer's Report)


    Peer Review Analyze: A Novel Benchmark Resource for Computational Analysis of Peer Reviews

    Peer review is at the heart of scholarly communication and the cornerstone of scientific publishing. However, academia often criticizes the peer-review system as non-transparent, biased, and arbitrary, a flawed process at the heart of science, leading researchers to question its reliability and quality. These problems may also stem from the scarcity of studies of peer-review texts, which are often withheld under proprietary and confidentiality clauses. Peer-review texts could serve as a rich source for Natural Language Processing (NLP) research on understanding the scholarly communication landscape, and thereby help build systems towards mitigating those pertinent problems. In this work, we present a first-of-its-kind multi-layered dataset of 1,199 open peer-review texts manually annotated at the sentence level (~17k sentences) across four layers, viz. Paper Section Correspondence, Paper Aspect Category, Review Functionality, and Review Significance. Given a text written by the reviewer, we annotate: which sections (e.g., Methodology, Experiments) and which aspects (e.g., Originality/Novelty, Empirical/Theoretical Soundness) of the paper the review text corresponds to, what role the review text plays (e.g., appreciation, criticism, summary), and the importance of the review statement (major, minor, general) within the review. We also annotate the reviewer's sentiment (positive, negative, neutral) for the first two layers to judge the reviewer's perspective on the different sections and aspects of the paper. We further introduce four novel tasks with this dataset, which could serve as indicators of the exhaustiveness of a peer review and as a step towards the automatic judgment of review quality. We also present baseline experiments and results for the different tasks for further investigation. We believe our dataset will provide a benchmark experimental testbed for automated systems that leverage current state-of-the-art NLP techniques to address different issues with peer-review quality, thereby ushering in increased transparency and trust in the holy grail of scientific research validation.
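    The four annotation layers described above can be sketched as a simple record type. This is a hypothetical illustration of the layered structure, not the dataset's actual schema; all field and label names below are illustrative assumptions based on the abstract.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ReviewSentenceAnnotation:
        """One sentence of a peer review, annotated across four layers
        (hypothetical sketch; field names are assumptions)."""
        sentence: str
        paper_sections: List[str]      # layer 1: e.g. ["Methodology", "Experiments"]
        aspect_categories: List[str]   # layer 2: e.g. ["Originality/Novelty"]
        review_functionality: str      # layer 3: e.g. "appreciation", "criticism", "summary"
        review_significance: str       # layer 4: "major", "minor", or "general"
        # Sentiment is annotated only for the first two layers.
        section_sentiment: str = "neutral"
        aspect_sentiment: str = "neutral"

    # Example annotated sentence from a hypothetical review:
    ann = ReviewSentenceAnnotation(
        sentence="The ablation study omits a key baseline.",
        paper_sections=["Experiments"],
        aspect_categories=["Empirical/Theoretical Soundness"],
        review_functionality="criticism",
        review_significance="major",
        aspect_sentiment="negative",
    )
    ```

    A structure like this makes the paper's four proposed tasks natural to frame as per-sentence classification problems over each field.
    
    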

    Improving peer-review by developing reviewers’ feedback literacy

    There is a need to train journal peer reviewers to provide professional, constructive, and actionable feedback: i.e., to develop their feedback literacy. Journals and publishers can improve the way they support peer reviewers' feedback literacy by raising awareness and providing guidance and exemplars of good practice. Existing online peer-review training resources developed by major publishers focus only on the content of feedback but neglect its socio-emotional aspect. Resources to develop peer reviewers' feedback literacy should be formulated by adopting knowledge-based, skills-based, and community-based approaches.

    Transparency of reporting practices in quantitative field studies: The transparency sweet spot for article citations

    Intuitively, there would appear to be a direct positive link between the transparency with which research procedures are reported and their appreciation (and citation) within the academic community. It is therefore not surprising that several guidelines exist that demand the reporting of specific features to ensure transparency in quantitative field studies. Unfortunately, it is currently far from clear which of these features actually get reported, and how this affects the articles' citations. To rectify this, we review 200 quantitative field studies published in five major journals in the field of management research over a period of 20 years (1997–2016). Our results reveal significant gaps in the transparent reporting of even the most basic features. They also show that transparent reporting is productive only up to a certain degree, after which more transparent articles get cited less, pointing to a 'transparency sweet spot' that can be achieved by reporting mindfully.

    To walk on the Penrose stairs of science

    How we have learnt to overcome our shortcomings, face our fears, and grow as competent researchers in the harsh academic world. Published in Behavioural and Social Sciences at Nature Research.

    Editorial and Notes from the Editor

