4 research outputs found

    The implications of a newly developed oral test in Business English: Are we heading in the right direction?

    Get PDF
    The paper presents an overview of the testing and assessment standardization process at the Language Centre of Masaryk University. It exemplifies the process by analysing the two-year development of the C1 Business English oral test administered to students at the Faculty of Economics and Administration (FEA), which resulted in a completely new testing procedure. The transition from the original teacher–student interview format to a monological discourse followed by a peer-to-peer discussion, with the roles of interlocutor and rater split between two teachers who use analytic rating scales to evaluate performance, is described, along with its implications for the validity and reliability of the assessment. Students’ perception of the test’s importance is also examined. The second part analyses a feedback questionnaire completed by students who took the test in Spring 2014. A preliminary look at the merit of these efforts indicates a noticeable improvement in the quality, reliability, validity and prestige of the oral test.

    Assessing speaking at C1 level business English at Masaryk University Brno

    No full text
    This paper deals with the standardisation process involved in specifying a new format for the Business English spoken test at the Faculty of Economics and Administration of Masaryk University, which was previously based on assessing the integrated skills of reading and speaking. With growing awareness of the CEFR levels, a need to enhance the fairness, reliability and validity of testing arose and was embraced by the English language department team, which initiated a series of discussions resulting in an overhaul of the oral test format. The major changes are a radical reduction of instruction input, a switch from a teacher–student interview to a peer-to-peer discussion pattern, thereby splitting the roles of interlocutor and rater, and the use of analytical rating scales. A year after the implementation of this system, another round of discussions was held to adjust the new format so as to elicit more autonomous, higher-standard language production. Recordings of real live tests have been collected for benchmarking purposes and to evaluate the outcome of the whole process. All signs indicate that the change was for the better, with the new format more capable of eliciting the desired language and teachers’ ratings more reliable and consistent.