7 research outputs found

    A Comparative Study of International and American Study Abroad Students’ Expectations and Experiences with Host Countries in Selected Institutions of Higher Education

    Get PDF
    This was a comparative study of international and American study abroad students’ expectations of and experiences with their host countries. The rationale for this study was to acquire a deeper understanding of the different experiences of students who study abroad and to determine whether their expectations of the host country affect their experiences. An opportunity sample of American study abroad and international students was selected from the United States student population, and their expectations and experiences of the host country were compared. The study addressed six research questions using a mixed-methods approach. The principal instrument for the investigation was the Cross-Cultural Participant Questionnaire, administered online. Hypotheses associated with the research questions were analyzed using independent-samples t-tests and paired-samples t-tests at an alpha level of .05, and the results were described using descriptive statistics. The open-ended questions were analyzed using established qualitative techniques. The survey was completed by 421 respondents, comprising 155 international students, 252 American study abroad students, and 14 respondents of unknown status labeled as other. The results identified language fluency, building relationships with host nationals, learning about a new culture, and personal change as significant expectations of the students. Overall, the students reported being satisfied with the services provided. International students were slightly more satisfied with access to support services than the American study abroad students. American study abroad students’ experiences matched their expectations of study abroad more closely than was the case for international students.
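    The two tests named in the abstract can be sketched as follows. This is a minimal illustration using hypothetical expectation/experience scores (the real questionnaire data are not shown in the abstract); it is not the study's analysis code.

    ```python
    from scipy import stats

    # Independent-samples t-test: compares two unrelated groups
    # (e.g. international vs. American study abroad students).
    intl_scores = [4.2, 3.8, 4.5, 4.0, 3.9]       # hypothetical ratings
    american_scores = [3.6, 4.1, 3.7, 3.5, 3.8]   # hypothetical ratings
    t_ind, p_ind = stats.ttest_ind(intl_scores, american_scores)

    # Paired-samples t-test: compares two measurements of the same
    # respondents (e.g. expectation vs. experience ratings).
    expectations = [4.5, 4.0, 4.8, 3.9, 4.2]
    experiences = [4.1, 3.7, 4.6, 4.0, 3.8]
    t_pair, p_pair = stats.ttest_rel(expectations, experiences)

    # Reject the null hypothesis when p < .05, the alpha level the study used.
    print(p_ind < 0.05, p_pair < 0.05)
    ```

    The independent-samples test is appropriate when the two groups contain different people, while the paired test exploits the fact that both scores come from the same respondent.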

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Get PDF
    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that cover a variety of research fields such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research. Peer reviewed.
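    One of the baselines the benchmark evaluates, Term Frequency-Inverse Document Frequency, can be sketched as TF-IDF vectors compared by cosine similarity. The three toy abstracts below are placeholders, not RELISH data, and this is not the consortium's implementation.

    ```python
    import math
    from collections import Counter

    docs = [
        "protein folding in the cell",
        "protein structure prediction and folding",
        "climate change and ocean temperature",
    ]

    def tf_idf_vectors(docs):
        """Weight each term by its frequency in the document times the
        log-inverse of how many documents contain it."""
        tokenized = [d.split() for d in docs]
        n = len(docs)
        df = Counter(t for doc in tokenized for t in set(doc))  # document frequency
        vectors = []
        for doc in tokenized:
            tf = Counter(doc)
            vectors.append({t: tf[t] * math.log(n / df[t]) for t in tf})
        return vectors

    def cosine(u, v):
        """Cosine similarity between two sparse term-weight vectors."""
        dot = sum(w * v.get(t, 0.0) for t, w in u.items())
        norm = math.sqrt(sum(w * w for w in u.values())) * \
               math.sqrt(sum(w * w for w in v.values()))
        return dot / norm if norm else 0.0

    vecs = tf_idf_vectors(docs)
    # The two protein documents share weighted terms; the climate document does not,
    # so a seed article about protein folding would rank document 1 above document 2.
    print(cosine(vecs[0], vecs[1]), cosine(vecs[0], vecs[2]))
    ```

    Okapi BM25 refines this scheme with term-frequency saturation and document-length normalization, which is one reason the abstract reports that the baselines retrieve distinct sets of relevant articles.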

    A long Atlantic in a wider world

    No full text

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    No full text

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    No full text

    Annual Selected Bibliography

    No full text