Source: Asia-Pacific Journal of Teacher Education, Vol. 41, No. 4, 441–456, 2013
(Reviewed by the Portal Team)
This paper reports findings from the first phase of an outcome-based innovation at one higher education institution in Hong Kong. Specifically, the research seeks to:
(1) confirm the properties of a survey instrument designed specifically to explore an outcomes model of course implementation;
(2) report preliminary findings regarding students’ course perceptions.
Method
The article reported on a pilot study carried out at the Hong Kong Institute of Education.
The participants were 89 enrolled first-year students and three instructors.
Students were divided into three groups with one instructor per group.
All three groups had English as the mode of instruction.
The SEOBLS version 1 survey was administered to all three groups simultaneously at the end of the course.
The SEOBLS version 1 course evaluation instrument was designed to address three areas: course intended learning outcomes, teaching and learning activities, and assessment tasks.
Findings
Regarding the first aim, confirming the properties of the instrument, the two statistical analyses (Rasch analysis and factor analysis) identified both strengths of the SEOBLS questionnaire and areas needing improvement.
Despite proceeding from different assumptions, both Rasch and factor analyses clearly indicated that two scales were needed to meaningfully understand student evaluations of the course.
The two approaches generally agreed on which items belonged to which dimension, indicating that a single summed total score would be inadequate for understanding student responses.
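The authors' actual analyses are not reproduced in the review, but a minimal Python sketch with simulated Likert-style data and scikit-learn's FactorAnalysis may help illustrate the logic; the item counts, item groupings, and loading values below are illustrative assumptions, not the SEOBLS data.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students, n_items = 89, 12  # sample size mirrors the study; item count is assumed

# Simulate responses driven by two latent traits plus noise
# (a stand-in for, e.g., perceptions of learning outcomes and
# activities versus perceptions of assessment tasks).
latent = rng.normal(size=(n_students, 2))
loadings = np.zeros((2, n_items))
loadings[0, :6] = 0.8   # items 1-6 load on the first factor
loadings[1, 6:] = 0.8   # items 7-12 load on the second factor
responses = latent @ loadings + rng.normal(scale=0.5, size=(n_students, n_items))

# Fit a two-factor model with varimax rotation and inspect the loadings.
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(responses)
print(np.round(fa.components_.T, 2))
# A clean two-block loading pattern of this kind is the evidence that
# motivates reporting two subscale scores rather than one summed total.

In the study itself, the Rasch analysis provided converging evidence for the two-scale structure from a different set of measurement assumptions.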
Furthermore, these students did not experience the outcome-based education (OBE) course as a radical departure from a “regular” course.
Students seemed, for the most part, satisfied with course quality in the areas predicted to show the greatest OBE impact, yet they did not appear to consider the course easier or harder than other courses they had taken.
Hence, the principal preliminary finding was that students did not perceive the OBE course as significantly different from their other courses.
The authors argue that students may not have been able to tell the difference between this OBE-designed course and the other courses they took in the same semester because they were not made aware of the difference.
The instructors did not explain to the students that OBE was being trialled, and hence denied them the opportunity to become directly aware of the innovation.
Thus, although significant changes were made to the design and implementation of the course, students who had not been prepared to make informed critiques of design and implementation may not have noticed, reported, or reflected on differences in course characteristics.
Conclusions
The authors conclude that a preliminary but viable premise may be drawn from these findings: in implementing an OBE initiative in a teacher education program, students must be explicitly informed of OBE principles and their intended impact on planning and implementation.
A further preliminary finding was that the SEOBLS, with some enhancement, may provide a viable framework for informing course improvement from the students’ perspective.
However, the lack of transparency in making students aware of the OBE innovation may have impaired their ability to fully judge the planning and changes that OBE represented in this course.
Transparency is also important in a more general sense for evaluating curricular change in teacher education and, more broadly, in tertiary education.
Designing greater transparency into an OBE process, and more generally into any curricular change process, would seem an important step in enhancing the value and quality of student evaluation.