Pandemic Pedagogy – Beyond essays and exams: changing the rules of the assessment game

This post is part of History UK’s Pandemic Pedagogy project. For more about the initiative, follow HUK’s blog and Twitter feed.


Assessment, carrots and sticks

‘Assessment is an integral part of instruction, as it determines whether or not the goals of education are being met.’ (Edutopia, 2008)

The centrality of assessment to learning in higher education is rarely questioned. Experience has taught many of us that nothing motivates students quite like a looming deadline or an upcoming exam. Students channel their energies into activities that determine final results – a strong motivating factor. And educational theory stresses that well-designed assessments can encourage students to engage in deep learning (Briggs, 2015). But overemphasis on grades can lead to problems.

We sometimes imagine that students see exams and essays as important because these forms of assessment encourage them to engage deeply and to hone their skills. But it is much more likely that we are encouraging an instrumental approach to education. Students jump through the hoops that we have put there in order to secure the grades that they want – they learn in spite of our approach to assessment, not because of it. If deeper learning takes place, it is pretty much an accidental side effect of that process.

Some students are very good at playing the exam-essay game that we have devised for them, although there have long been indications that exams favour certain demographic groups over others (Brookings Institution, 2017). We can do better, devising assessment regimes that include all students. Indeed, there are some exciting examples of innovative assessment in history out there already (e.g. see Lucinda Matthews-Jones’ 2019 blog post on creative assessments and the #unessay; Chris Jones’s 2018 post talks about using the approach in an early American history course too). The shift to online assessments is an opportunity for more of us to embrace more innovative forms of assessment that will engage students in meaningful learning, develop their skills, knowledge and identity as historians, and better prepare them for future study and/or the world of work.

Moving online: emergency (traditional) assessment 

I’m well aware that this call-to-arms may not be to everyone’s taste and I’m certainly not suggesting that the essay and the exam have no pedagogic value (well, I’m not really sure about the exam, but it’s not a hill I’m willing to die on). What I’m saying is that our students will be better nourished by a more varied and interesting assessment diet that blends traditional and innovative forms. 

Of course, traditional forms of assessment can easily be reconfigured so that they can be completed and administered online. Indeed, aside from the odd visit to the library, most essays are now researched, written and submitted digitally anyway. Exams are a bit more problematic, but the ‘emergency’ phase of online instruction from March to June 2020 showed that even they can be done online, as take-home tests (the fancy name is ‘time-constrained assignments’) or using online tools such as ProctorU. Essays and exams can therefore move online fairly unproblematically. We can do more, though.

Engaging constructively with the digital world

There are already cases of historians developing innovative assessments that promote meaningful learning about the digital world. A good example is Charles West’s MA module in medieval history at the University of Sheffield, in which students learn about how information is presented on Wikipedia, before researching and writing (or improving) pages on topics that they have studied (West, 2018). Over the past few years, increasing numbers of historians have recognised the value of engaging students in learning that involves exploring how knowledge is presented and constructed online and even contributing to digital knowledge creation within the discipline. 

This represents something of a step-change, because until fairly recently many course handbooks and introductory lectures began with dire warnings to students about the consequences of using websites such as Wikipedia as sources.

[Image: ‘no Wikipedia’ warning sign. Source: http://medias.liberation.fr/photo/592891-arton2684.jpg?modified_at=1200590185&width=960]

It has always struck me as unrealistic to expect students not to use such a readily available source, and in some sense dishonest (as a student I often found Wikipedia a useful starting point when moving into unfamiliar territory, and I continue to do so). A more productive way forward is to devise activities that develop students’ digital literacies (for more see Doug Belshaw’s #neverendingthesis; plus some quick hints and tips), enabling them to filter ‘good’ from substandard online sources. Library staff have played a big role in many institutions and are experts we should turn to for support and as collaborators in this endeavour (see the 2017 statement from the International Federation of Library Associations and Institutions).

The key point here is that online sources are not ‘inappropriate’ or ‘unacademic’ in and of themselves. Rather, it is important to educate students in how to engage with and make use of the internet productively. Given that source evaluation and analysis is what historians do, it seems to me that it shouldn’t be too difficult to figure out how to do this. 

Building in the digital world: a new way forward?   

But perhaps we can go even further, beyond assessments that cultivate students’ digital capabilities (i.e. making them better at doing stuff online and navigating the digital world), and engage them with assessments that ask them to actively create things online. Historians have been using blogs and wikis in their teaching for well over a decade (Russell Olwell’s 2008 blog offers reflections from an early adopter of blogging). Several of my colleagues at the University of Lincoln have experimented successfully with using blogs to present the results of student work to the public (e.g. on Jade Shepherd’s Mad or Bad? module).

But what happens if we ask students to create and curate online resources and spaces for themselves? Even deeper engagement and learning, perhaps. A good example here is Arthur Burns’ module, At the Court of King George III, at King’s College London, which has students present the results of their research into the Georgian Papers using Xerte, an open educational resource.

Such projects require students to think about audiences beyond the marker and their fellow students. (In the case of essays and exams, not even their peers will get to see what students have done.) The focus shifts to encouraging students to consider how to present information engagingly, how to incorporate visual and other media in their work, and how to write in different registers (i.e. it’s still developing writing skills, just not in the 2-3,000-word-essay format). Innovative online assessments thus open up opportunities for students to engage with audiences beyond the academy and to develop a far wider range of relevant disciplinary and career skills. 

Such opportunities for students to engage in knowledge production for consumption beyond the academy are no less ‘rigorous’ than essays or exams. In fact, they rely on many of the same research and writing skills as traditional forms of assessment. Better yet, they also encourage a new awareness of how and why knowledge artifacts turn out the way they do. The more creatively we engage with the digital landscape, and the more actively and systematically we explore its pedagogic potential, the better our chances of supporting our students to become capable and highly-skilled historians in the present and future.


My previous post as part of the Pandemic Pedagogy project was on designing learning. Next week, I’ll be writing one about feedback.

Posted in active learning, digital history, digital literacy, E-learning, essay writing, exams, lockdown, online learning, pandemic, student as producer, Student research, students

Pandemic Pedagogy – Redesigning for online teaching, or Why learning objectives aren’t a waste of time

This post is part of History UK’s Pandemic Pedagogy project. For more about the initiative, follow HUK’s blog and Twitter feed.


In this post I want to spend a little bit of time thinking in quite general terms about the process for turning face-to-face teaching into an online or blended format. I’ll suggest that rethinking learning objectives with the concept of ‘constructive alignment’ in mind might be one way of helping us to think through this process. I’ll then outline some ideas about how this might be put into practice for those redesigning their teaching for 2020-21.

So, the problem (pandemic aside). One of the main problems that faces many lecturers as they start to rethink their teaching for online or blended delivery is that the way educationalists and ed-tech specialists (= the people we’re looking to for support and advice right now) speak about teaching and learning isn’t always very well aligned with how academics think about it.

Learning outcomes are fundamental to educationalists – they are supposed to drive learning, to inform assessment and to structure how courses actually run. They operate at the level of the programme, the module and, in some instances, individual teaching sessions. However, unfortunately, the language of learning outcomes is alienating for many academics.*

Despite these issues (and I’m certainly not suggesting that the fault here lies entirely with educationalists), I’ll now suggest that reimagining learning outcomes within the framework of ‘constructive alignment’ could help us with the redesign process as we move our teaching online.

Alignment is all

First, I’ve found it helpful to think about LOs in relation to the concept of ‘constructive alignment’. For a short and helpful introduction to constructive alignment, watch this YouTube video:

Constructive alignment is based on the sensible premise that students learn best when all of the elements of the teaching that they receive are synchronised – when the learning activities, objectives and assessments are pointing in the same direction.** Now, it may seem that this is blindingly obvious, but it’s surprising how often this doesn’t happen, for whatever reason. A good example of misalignment might be using an exam to assess a module whose learning outcomes require that students develop their research or writing skills.

I think of it as the triple-A approach to designing learning. AAA: Aims (= LOs), Activities (= what you and the students do), and Assessments (and feedback = how you evaluate what they have done and help them to improve) should all be in alignment.

So, first tip as you’re considering what to change when shifting online – have a good think about whether all of these elements are actually lined up with one another. If anything isn’t in line, then think about changing that first of all.

Language

Second, it’s important to understand that learning outcomes are (or should be) really just a statement of what you want students to get out of your module, your seminar (online or otherwise) or whatever it is that you’re asking them to do. Try to make LOs specific, not generic edu-speak, and write them in language that the students can understand (e.g. what does ‘critical thinking’ actually mean, in practice?). If you don’t really know what the LOs mean, then how can you expect the students to?

Second tip: ideally, rewrite them as you refocus for online delivery and make them meaningful for teacher and student; if that’s not possible because of institutional administrative processes (the forms had to be filled in 3 years ago to make a change for next semester) or whatever, then provide explanatory notes for your students, or make a little video to explain what the generic LOs really mean. Maybe don’t even mention LOs, but just tell them what they will get out of it. 

Less is more

I’m struck by how long the lists of LOs are on some modules (even on relatively low credit-bearing courses). If your LOs look like you’re trying to turn an 18-year-old into a professional historian in 10 weeks, then you’ve probably got too many. You and the students need to be able to focus on what is really important. For a 15-credit module, can you realistically expect anyone to master (rather than touch upon) more than 3 (or maybe 4) key take-aways in any kind of depth?

Tip no. 3: ideally, pare the LOs back to represent what you really think is important for the module; use this to help you focus for online teaching. Again, if institutional structures don’t allow this, then do it informally. 

Some other key (possibly slightly repetitive) considerations

  • WORK BACKWARDS – Think about what you want the students to get out of any session/ block of interactions/ module (= your Aims/ LOs) and then consider whether Activities and Assessments are helping you to achieve those goals.
  • BREAK IT DOWN – Focus on individual sessions as well as the course as a whole, because the overall design can otherwise become detached from the actual delivery and, to some extent, the assessment. In online teaching this may become even more of an issue.
  • INPUTS AS WELL AS OUTPUTS – Put yourself in your students’ position as you think about all of this. How will they (not you, or you when you were a student, but they) react to and experience what you’re asking them to do?
  • TESTING, TESTING, TESTING – Related to the last point, test everything A LOT, and do so from a student perspective. If possible, get a colleague to look through and offer constructive criticism.
  • THINK BEYOND ACTIVITIES – In my opinion, activities include learning resources (e.g. reading lists, handbooks, handouts, etc.) and support mechanisms (e.g. office hours, personal tutoring). Think about how these support (or not) the overall construction of the module.

I hope that this might be helpful to at least some of you as you think about redesigning learning for online delivery. Basically, we’re stuck with learning outcomes, they can be helpful, so why not just use them to our advantage!?


*Criticisms might include any or all of the following: (1) that they encourage a mechanistic/ instrumentalist approach among students and teachers; (2) that they are underpinned by quality-driven box-ticking processes; (3) that they are so jargonistic and abstract that they don’t have much to do with the actual discipline that’s being taught; (4) that they are about generic skills (= careers and employability) rather than disciplinary practice; (5) that they are so detached from the actual practice of teaching and learning as to be meaningless (= e.g. they operate at the administrative level of the module rather than the pedagogic level of the classroom).

**Sidebar: for me, constructive alignment is also a nod to the idea that students learn best when they are actively engaged – i.e. doing stuff, including reading and thinking – rather than passive consumers of educational experiences.

Posted in constructive alignment, learning design, learning outcomes

Reflections on a survey of History students’ experiences of lockdown learning

Last month, alongside our survey of staff experiences of teaching during lockdown, we surveyed UG and PGT students in the School of History and Heritage at the University of Lincoln. I mentioned this a couple of weeks ago in History UK’s #PandemicPedagogy Twitter chat.

 

I’ve had a bit of time this morning to process a summary of the results that was kindly put together by my excellent colleagues in the School, Michele Vescovi and Giustina Monti. Nguyen Grace, of the Lincoln Academy of Learning and Teaching, also did an analysis, which has fed into this summary. Thanks to them all!

We had 117 responses from students across the School, from all programmes, bar one. Most responses (71) were from the BA in History, which is the biggest programme by far. Overall, this represents a response rate of about 16.5% of the total UG and PGT students in the School. This may seem like quite a low number, but given that term had finished and many students won’t have been checking their uni emails, I think it does give us some limited indication of how the student body as a whole experienced lockdown learning (which I’ve just seen does have its own hashtag! – #LockdownLearning).
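As a quick back-of-the-envelope check of the figures above (a sketch only: the 16.5% rate is presumably rounded, so the implied totals are approximate):

```python
# Rough sanity check of the survey figures quoted above.
# The 16.5% response rate is assumed to be a rounded figure,
# so the implied totals are only approximate.
responses = 117        # student responses received
response_rate = 0.165  # reported response rate (~16.5%)

# Implied size of the UG and PGT student body in the School
implied_total = round(responses / response_rate)
print(implied_total)   # roughly 700 students

# Share of responses from the BA in History, the largest programme
ba_history = 71
share = round(ba_history / responses * 100)
print(share)           # roughly 60% of all responses
```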

Here are some key points I took away from looking over the survey and the summaries, with thoughts about their implications for future teaching under post-lockdown conditions:

  • VARIETY – unsurprisingly students (even within the same programmes, year groups, modules, seminars) had very different experiences of learning during lockdown, in both a positive and a negative sense. No one-size-fits-all solution will work for the next semester so we’re going to have to be flexible and adopt a range of approaches/ tools.
  • SYNCHRONOUS T&L – most students were enthusiastic about tools that supported ‘live’ interaction with tutors (this aligns rather nicely with the results of the staff survey), such as Blackboard Collaborate and, to a lesser extent, MS Teams, though there was also some praise for structured asynchronous tools, such as discussion boards and Talis Elevate. It will be important, even for courses that are delivered entirely online (or in a blended way), to maintain this sense of ‘presence’ and interaction with students.
  • CONTENT DELIVERY – linked to their love for ‘live’ teaching, many students liked Panopto as it gave them (a sense of) direct connection to lecturers, and flexible access to content, which were understandably important during such a disrupted period. Given that the consensus seems to be that online lectures are best delivered in short bursts (no more than 20 mins), there may be some work to do here in recalibrating student (and teacher) expectations and practices.
  • ACCESSIBILITY (also a concern on the staff survey) – it was clear that access to kit, working space, time, ‘head-space’, etc. were all big issues that affected students’ experience of lockdown learning.* Much needs to be done by institutions to support students in overcoming these accessibility challenges. Learning materials and activities will have to be similarly accessible in a number of senses (again, flexibility and asynchronicity may help here).
  • COMMUNICATION – this was probably the key point that came out of the survey. Students were well aware that this was an unprecedented situation and appreciated the efforts that had been made to support them, but most criticisms related to (a lack of) communication and/or could probably have been mitigated by clearer communication from the University, the School and/or the tutor. Coordinating clear and consistent (and definitely not contradictory) communication strategies at all levels will be important as we prepare for the new semester and actually get on with teaching.
  • CONSISTENCY – consistency of experience was a key issue for a number of students too, and ensuring that there is rough equivalency in this regard across (and even within) modules will be important.

Now, I don’t think any of this is particularly mind-blowing, but it does reinforce a lot of what I’ve been thinking (and reading) over the past few weeks. There will be a lot of work to do in order to ensure that students have equitable and effective access to opportunities as we move beyond the lockdown learning phase.

A final point – after reviewing both the staff and student surveys, reflecting on my own experience, and doing a bit more reading around the topic, I’m less convinced that the emergency phase of teaching was actually the success it was depicted as at the time or since (anywhere – I’m not just talking about my home institution). Institutional narratives that paint this period as one of largely successful ‘emergency’ adaptation aren’t very useful in the longer term, even if they may have helped to maintain morale at the time.

We (or at least most of us) certainly got through it and (many of) the students got through it and in that sense it was successful. Many of us learnt a lot, which is good. That’s it.

But it’s clear to me that quite a lot of students and colleagues found it really difficult, if not impossible. The sooner we admit that, reflect on what we’ve learnt and feed that reflection into planning for the next semester, the better. Luckily, there’s a lot of great work going on right now in that direction, including History UK’s #PandemicPedagogy initiative, as well as work by the Royal Historical Society and the Institute of Historical Research. Collectively, there’s a lot of good practice out there that can be harnessed to meet the challenges we face as we move beyond lockdown learning.

 

(*Unsurprisingly, most students (over 90%) used laptops to access online learning, often in combination with PCs, mobile phones, and/or tablets.)

Posted in asynchronous, Lincoln, lockdown, Reflections, research, students, survey, synchronous

Making online teaching more accessible: 5 useful resources

Over the past couple of weeks, I’ve come across a number of useful (and generally short) resources on making online teaching and learning more accessible and equitable. Here are 5 that I’ve found particularly helpful. They range from suggestions about small tweaks that can be made at the level of the individual instructor/ class, to more strategic/ systemic considerations. I hope that you find them useful!

Posted in Accessibility, equity, online learning, resources, support

Results of survey of online teaching in History during lockdown

In a Twitter post last week, I mentioned a survey that we conducted in the School of History and Heritage at the University of Lincoln of staff experiences of teaching in the lockdown context (i.e. a short turnaround move to 100% online delivery).

As I said last week, many more responses came in from colleagues across the School than I had expected, from all subject areas and from temporary as well as permanent staff. Some of the feedback was institution- or department-specific, but much of it seems to resonate with accounts of experiences in the discipline (and the sector) more broadly that I’ve seen over the past few weeks. So, I wanted to share a summary of the findings and my overall impressions. It was a fairly rough-and-ready survey, but it has helped us to think about the challenges that we face in the next academic year.

Here are a few key points that emerged from the survey, with my musings:

  1. SUCCESSES. Many reported adapting successfully to the shift online, though not without some hiccups and usually involving A LOT of work. There was some innovative use of the tech that we’ve got available at Lincoln. We’re hoping to encourage those who had a good experience to write up case studies or do short presentations to share good practice. So, pretty positive overall.
  2. STUDENT ADAPTATION (OR NOT). There was a sense that students generally adapted well, if they were able to engage, including some who didn’t really get involved in F2F sessions; conversely, some who usually participated well in F2F teaching didn’t do so once teaching shifted online. Some of these instances related to issues of access (see below) and structure/ timetabling.
  3. SUPPORTING STUDENTS. Colleagues were concerned about the impact of the immediate shift (and perhaps longer-term changes to delivery patterns) on their ability to support struggling/ disengaged students.
  4. INDUCTION AND (RE)ORIENTATION. When asked about 2020-21 delivery, there was particular concern relating to the induction of new students (first years, but also new PG students). Beyond orientation, induction and socialisation, the issue of the reorientation of our existing students also loomed large.
  5. ACCESS. Access issues were reported for a significant number of students. This ranged from access to the necessary kit and wifi connection to physical workspace and time (when caring and/or family commitments may conflict with the university schedule).
  6. THE F2F FACE OFF. There was a clear tension between synchronous and asynchronous instruction. Although the colleagues who seem to have had the best experiences balanced these two modes of delivery (= blended learning), the survey results suggest that the technologies that best mapped onto F2F delivery were generally favoured. Interestingly, the student survey seems to replicate these findings (though we’ve not yet finished analysing that), with students generally expressing a preference for tools like Panopto and Blackboard Collaborate Ultra, although some asynchronous tools such as Talis Elevate did receive positive comments.
  7. TIME. As you’ll see from the word clouds below, time seems to have been an issue: there was a swift turnaround to online delivery, and the new modes of delivery seem to have been more time-consuming. There was also a sense that preparing and delivering teaching in the 2020-21 academic year would take up more time than usual.

Rather than walk you through each of the questions in detail (I don’t have time anyway!), I generated some word clouds, indicating the frequency of words in responses to questions on the survey. I think these give a decent impression of the kinds of things that were on colleagues’ minds as they filled the survey in during the early part of May. I’ve added my own very brief summary of responses in italics after each question.
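For anyone curious, word clouds like these are essentially word-frequency counts of the free-text answers. A minimal sketch of that kind of analysis, using made-up responses rather than the actual survey data, might look like this:

```python
from collections import Counter
import re

# Hypothetical free-text survey answers (not the real survey data)
answers = [
    "Stressful at first but ultimately successful",
    "Very stressful, and a lot of extra work",
    "Successful overall, though more time-consuming than usual",
]

# Lower-case, tokenise, and drop very short words before counting
words = re.findall(r"[a-z']+", " ".join(answers).lower())
counts = Counter(w for w in words if len(w) > 3)

# The most frequent words are what a word cloud renders largest
print(counts.most_common(3))
```

Word-cloud generators (such as the `wordcloud` Python package) do essentially this counting step before sizing and laying out the words.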

Describe your experience of the transition to online learning (briefly). stressful but successful (mostly)

Word cloud - transition to online teaching

Which approaches to online teaching and learning have you found most useful? whatever worked! 

Word cloud for which approaches to online learning worked for you

Which of these tools worked best for you? generally, ones that supported the efficient transmission of information to students (whether synchronous or asynchronous)

Which specific elements of your modules are best suited to online delivery? overwhelmingly, seminars and tasks that would be done in small group sessions (e.g. answering questions, preparing for assessments, socialising student groups)

word cloud - which specific elements question

How do you think that your students coped with the shift to online teaching, learning and assessment? well, in general, but there were problems

Word cloud on student adaptation to online learning

What are your absolute priorities if time for face-to-face teaching is limited at the start of the 2020-21 academic year? preparing for assessments, inducting and orientating students

word cloud - priorities if time for face to face learning limited

What are your main concerns about the delivery of teaching in 2020-21? disengaged/ vulnerable students being left behind (+ a distant second, having enough time!)

main concerns about delivery of teaching next year

Any comments (and questions) on this welcome – I’d be interested to know how it matches with experiences in other institutions.

As I said above, we’re currently looking through the student survey results (over 100 responses) and hope to be able to share some analysis in the next week or two.

Posted in Accessibility, asynchronous, Lincoln, pandemic, research, survey, synchronous