Looks at the South African Department of Education’s new recommendations for the evaluation of higher education research in South Africa, examining two primary aspects: the use of pre-compiled journal lists from overseas, and the apparent reliance on peer review as a guarantee of quality. While acknowledging that these are tried and tested standards of quality assessment, the authors argue that disciplinary differences between the experimental sciences – such as physics or chemistry – and other disciplines make these measures difficult to apply across the spectrum. They present an analysis of library and information science publications in the chosen lists and point to the weakness of the selection of titles in this discipline. In addition, researchers from South Africa and the developing world face extra difficulties in securing publication in premier international library and information science journals. The authors conclude by calling for additional evaluation measures within an integrated system.