Sunday 18 March 2018

On the uncertain value of course evaluations

I don't think I've written about it before, but I no longer work at the KTH School of Computer Science and Communication. That school has been merged with two other schools (ICT and Electrical Engineering), and as of the first of January I work at the much larger (and presumably better) School of Electrical Engineering and Computer Science (EECS). The new school held its first "Teachers' day" this past week and I was asked to attend since my colleague Elina couldn't.

I understand it might be hard to plan, but my critique is that the programme for these kinds of events is shaped more by the ideas of someone else (higher up in the hierarchy) about what I "need to know" than by teachers' ideas about what we together need to discuss, know and do to become better teachers and improve the quality of our courses (and the educational programmes they are part of). For further critique, see my blog posts from back in 2011: "The folly of our Education Assessment Exercises", "Top-down vs. bottom-up" and "The paradox of planning".

The day was, as is usually the case, characterised by a solid top-down perspective and by chunks of one-way communication about various visions (KTH Vision 2027), plans (development plan, operational plan) and administrative systems (LADOK3) that are "good for us to know about". I'm not saying it's not important, but I'm also not saying it's not boring. I am, however, saying that it doesn't really help me solve any of the problems I encounter in my everyday life as a university teacher.

I will here mainly discuss one aspect of the day, namely the discussion we had about securing the quality of our university courses and of our educational programmes at KTH. This, for some reason, immediately slid into a discussion about course analyses - as if they were the same thing. The speaker started by saying that he "is a fan of course analyses". I thought that was a very strange statement, as I'm personally a fan of improving the quality of my courses. Unlike the speaker and most others who adopt a top-down perspective, I am very careful not to mix up the means (improving the frequency and quality of course analyses) with the ends (improving the quality of teaching, of courses and of students' learning outcomes). A course analysis could be a useful tool for some stakeholders and for some purposes, but it could also be the case that other tools are more suitable (for other purposes and for other stakeholders). We should of course keep an open mind as to what the most suitable means (tools) are if the goal is to increase the quality of our courses rather than the "quality" of the course analysis as a tool.

I in fact don't know what would constitute a "high-quality course analysis" unless it resulted in a better course - but I'm sure someone who wants to "get an overview" of an educational programme and "needs" course analyses to do that could help specify it. They would of course fool themselves if they were gullible enough to imagine that "a good course analysis" is equivalent to "a good course", and it's furthermore a futile task; it seems misguided to try to get an "overview" of an educational programme by turning to a bunch of papers rather than by talking to the teachers and the students. Course analyses should instead be seen as an instrument of power and an attempt to create the mythical quality of "transparency". This particular quality should be equated with other ritual, talismanic, fetishistic, occult terms that are hailed as indexes of "quality" and "excellence" in higher education today. Course analyses for each course in the educational programme might be useful if you are responsible for the programme, but they might simultaneously not be particularly useful for the teachers and the students in that programme.

So I thought the whole discussion was misdirected from the get-go, since the brunt of the talk concerned how we might get people (e.g. teachers - the people sitting in the room) to always carry through and report course analysis results. This has obviously been problematic at times, or we wouldn't even have bothered to talk about it at this event.

My main problem is that the idea of improving courses by making the instrument better, or by encouraging/forcing each teacher to follow through with it, builds on a flawed idea of where quality comes from. Imagine a restaurant with several cooks in the kitchen. The menu is uneven; there are both highlights and courses that are not particularly good. The owner then imagines that the best way to improve the customers' experience is to throw out the menus and get new ones with nicer fonts, a nicer layout and images of plates that look delicious. A better alternative would be to ask the cooks what their visions are and support them, or to fire the worst cook in the kitchen - instead of mixing up means and ends (form and function). Are we putting the cart before the horse when we believe that we can improve our courses by having better questions on our course evaluations, or by making sure a course evaluation is perfunctorily generated for each course each year? I hope the answer is obvious.

My "radical" suggestion is that we instead systematically should ask teachers what support they need to improve their courses. This should be combined with special attention being paid to courses that for some reasons just don't work (it's of course almost always the same courses/teachers that are deemed problematic year after year).

My top suggestion for how to do this on a practical level is to:
- encourage (not force) teachers to hand in course evaluations (consisting of the students' evaluation of the course and the teacher's own analysis of the course).
- hand that course evaluation off to a "pedagogical expert" who gets paid for two hours of his/her time to 1) read the course evaluation and 2) read the teacher's suggestion for a course-related topic he/she wants to discuss.
- reward the teacher with a one-hour meeting ("consultation") where the focus is on some part of the course (examination, group work, peer learning) that the teacher wants to discuss, improve or get inspired to rethink by learning what others do (at other schools or at other universities).

I've suggested this on and off for more than 10 years. Everyone thinks it's a good (or a great) idea, but then we always seem to fall back on the unimaginative one-size-fits-all idea of improving education through more comprehensive course evaluations. It's like commissioning an official inquiry that should deliver a report three years from now (there's no time to lose!) on how to reshape higher education so as to encourage creativity and increase the speed at which we adapt to a fast-changing world.

I also suggested that a "gripe session" could be an interesting alternative to course evaluations and I wrote a long, analytical blog post about that back in 2011. I believe that a gripe session is a better alternative for the teacher and for the students but it might result in fewer student evaluations and might decrease the necessity for teachers to' write course analysis. So would gripe sessions be a "problem" or a "solution" to a problem? Perhaps the course analysis could be replaced by (in my case) a gripe session in combination with a couple of reflective blog posts about a course as I'm teaching it or just after it's finished (like thisthis or this recent blog post).

I also suggested that the "program-integrating course" we have in our educational programme is an excellent source of information for understanding which courses students for various reasons find problematic. As apart from course evaluations, close to 100% of our students write a blurb about each course they have taken during the last quarter and they write about the courses for each other. This is a much more comprehensive (everybody writes) and most probably also a more honest evaluation of each course so why not go for that instead of the clunky student evaluations that hardly ever garners many answers at all?

We all know the reason. Such alternative ways of working with the quality of our courses are harder to standardise and harder to point at when we are "audited". So let's at least be honest; this has more to do with covering our asses and less to do with improving the quality of our education. Please see anthropologist David Graeber's influential 2013 text "On the Phenomenon of Bullshit Jobs". It will be published as a book in May this year as "Bullshit Jobs: A Theory". From the promotional text:

"Does your job make a meaningful contribution to the world? In the spring of 2013, David Graeber asked this question in a playful, provocative essay titled "On the Phenomenon of Bullshit Jobs." It went viral. After a million online views in seventeen different languages, people all over the world are still debating the answer."

Do course evaluations make a meaningful contribution to increasing the quality of university courses? Perhaps. Is the work effort proportional to the benefits? Uuuh, perhaps, but that's hard to say since teachers hardly ever get any feedback on course evaluations and often suspect no one ever reads them. So are there other ways to improve the quality of university courses? Surely there are! Actually doing something about the 3-5-10-20% worst courses would be a good start. Supporting those who like to teach, to help them do an even better job, would be another.

Here's another dirt-simple way to increase the quality of courses: have a pedagogical calendar/checklist for all KTH courses with entries like: this needs to happen one year before the course starts, one month before, one week before, when the course starts, one week after the course starts and three weeks after the course has finished. Such a calendar would be especially useful for new faculty. I suggested we should develop one when I started to work at KTH 15 years ago and was told I had permission to create such a calendar in my own free time.
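To make the idea concrete, here is a minimal sketch of what such a calendar could look like as a small script. The milestones and offsets below are hypothetical placeholders of my own; the real entries would of course have to be worked out by experienced teachers and pedagogical developers:

```python
from datetime import date, timedelta

# Hypothetical milestones (anchor, offset in days, task) - placeholders
# only; the real list would come from experienced teachers, not from me.
MILESTONES = [
    ("start", -365, "Plan major changes, book rooms, order course literature"),
    ("start", -30, "Publish the schedule and reading list, set up the course web"),
    ("start", -7, "Send a welcome e-mail and check student registrations"),
    ("start", 0, "First lecture: present goals, examination and expectations"),
    ("start", 7, "Run an early temperature check with the students"),
    ("end", 21, "Write and hand in the course analysis/evaluation"),
]

def checklist(start: date, end: date) -> list[tuple[date, str]]:
    """Expand the milestones into dated to-do items for one course round."""
    anchors = {"start": start, "end": end}
    return sorted((anchors[a] + timedelta(days=d), task) for a, d, task in MILESTONES)

# Example: a course running 16 January - 16 March 2018.
for due, task in checklist(date(2018, 1, 16), date(2018, 3, 16)):
    print(due.isoformat(), "-", task)
```

Feed the dates of every course round into something like this and you get a personal to-do list per course; hook it up to e-mail reminders and new faculty would never have to guess what needs to happen when.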

Meeting colleagues, old and new, is always nice. That is one of the main benefits of attending an event like the Teachers' day. I had an interesting group discussion about "Educational quality and university resources" where we among other things discussed the depressing question "How could we streamline our activities so as to lower the resource requirements while maintaining the quality of education?". Do note that the question assumes that it is possible to do both at the same time, i.e. to maintain (or even increase) quality while decreasing resource use (for example by spending less time with each student during each course). That seems like an amazing balancing act, to say the least.

Our group discussed how we could compete with "Google University" in 2025. Here are some of our suggestions:
- Our niche will not primarily be to teach but to assess and authenticate knowledge and educational programmes and to award degrees (that industry trusts).
- We should cater to the students' personal experience, offering opportunities for students to network and grow as persons by:
   - Maximising what we do best (whatever that is): the personal meeting, (personal) feedback etc. Automate what can be automated to free up time so the teacher can spend more time with the students.
   - Look a bit closer at & take a hint from what sects, (religious) cults, and other tight-knit communities (Hell’s Angels etc.) do to rope people in.
   - Sell more and better merchandise.
   - Maximise the student experience and student life; encourage/support the student Quarneval, student theatre (spex), the KTH job fair "Armada", the reception for new students (“nollning”), the different student "sections" etc.

The main question to ask is: what is easy and inexpensive for us (KTH) to do that is simultaneously highly valued by our students? We should obviously maximise on this (whatever it is) and hope it can't be automated.
