An interview with Katherine Fennelly – Digital Mapping of the 18th and 19th-century British Landscape

Unfortunately, it’s taken me a while to get the second interview in this series written up. In any case, I’m very pleased to be able to share my discussions with Dr Katherine Fennelly, an historical archaeologist at the University of Sheffield, about her use of digital mapping technologies when teaching in the School of History and Heritage at the University of Lincoln. The activity had a notable impact on students’ visual literacy and their ability to engage with a wider range of sources in written work.


JW: Could you provide a bit of background about the module on which you were getting students to make things digitally and to share the results of what they’d done?

KF: The module was about 18th- and 19th-century improvement. So it was about urban landscape improvement in the British Isles from about 1750 until 1850. And it was supposed to cover the agricultural revolution, to include things like enclosure in the landscape and also rapid urbanisation. The idea was to get students to use historic maps as much as they would use written sources in order to talk about how the landscape changed rapidly over time. So really, it was about introducing students to historic map progressions, which in archaeology is very much a professional skill. So, I was getting history students to think about using professional skills that are used in the heritage industry and to think about sources beyond the written word. It was about applying some of the things that they’d do when analysing writing to examining figures. One of the main motivations was to develop students’ skills and awareness of these different types of sources and how to use them. There was an employability aspect to it, too: teaching history students the professional skill of historic map progression, which they might use if they ever go and work in, for instance, an historic environment record office or an archaeological consultancy. So, it was about developing professional skills that they could use beyond university, in an assessment that extended their existing skills in text-based source analysis to maps and images.

JW: What year group were the students?

KF: They were second-year, second semester. Most of them would have looked at material culture in at least one first year module, but most of what they’ve done in the past is text-based. At the time they would have been formulating their dissertation plans and I was interested in getting them to think about using resources for the dissertation projects that they might not have thought of before.

JW: Thinking more about the specifics of what you asked them to do, what did they make and how did they make it?

KF: They made maps and figures to illustrate a short essay. They used resources like Digimap, the National Library of Scotland’s mapping database, and a few other things, like the Ordnance Survey and Street Map, to create two comparative maps that could be either overlaid on each other or on modern Ordnance Survey maps. So, it was about them making and annotating the maps, using a variety of online resources. The main thing they used was Digimap to create one base map, and then they had to make a different map of the same area using a different resource. It was about getting them to think about how to source historic maps from somewhere other than the Ordnance Survey.

First, they had to pick their own area. Then they had to use the maps to answer questions such as how processes of enclosure or urbanisation or transport infrastructure changed a particular village or town. They had to go and find it and then create a map or two maps that could be compared to show how there was change over time. So one of the maps could have been from an earlier period and the other from a later period. There had to be at least one change in the maps that they could annotate in order to signpost the change. Alongside the maps, they had to submit a 500-word map progression description explaining the differences between them and the processes of change that accounted for those differences (so, the assignment included two maps as figures to illustrate their 500-word analysis). They had to engage in historical analysis, by using the maps to identify changes and then to pick them to pieces, as they would with a standard (text-based) source analysis, explaining and describing the differences between the maps through reference to scholarship, as they would do in class. Reading about things like enclosure and urbanisation was supposed to support their reading of the map source.

Image: Example of a map progression

JW: Did they work together or individually? How did they share the results of their work and who did they share them with?

KF: In class, they worked together. We did workshops where they were learning how to use the resources, especially Digimap. I encouraged them to work together to try to find places that interested them, but they submitted their own individual source analysis. So, when it came to actually doing the assessment, which was really the only way that they shared the results, they did so individually. I asked them to discuss it in class to check that they were confident with and interested in the material. So, it was mostly class-based.

JW: Did you have to use computer labs or was it done in a kind of standard seminar room or other kind of space?

KF: We used a standard seminar room. If I was doing it again, I would do it in a computer lab because I did encourage the students to bring their own machines if they wanted to. But the workshop where I told them how to use Digimap was mostly paper based. I gave them a handout to show them all of the steps and then I walked them through the different steps on the screen. And the idea with using something like Digimap is that you can use a tablet or a mobile phone to do it if you want. So I encouraged the students to use a variety of mobile devices. But if I was doing it again or if anyone else was doing this, I would encourage them to do it in a computer lab because it’s definitely the most straightforward way of explaining to the students how the technology works. After that, they could play with it on their mobile devices. 

JW: How did you assess their work?

KF: As I said, there was the 500-word source analysis, which I marked like any other essay, judging whether or not they had referenced scholarship, had picked out relevant elements on the maps that they’d selected, and whether or not they had engaged with the maps.

JW: Did you find them using the skills of map analysis in other assessments? 

KF: I encouraged them to make use of as many of the resources that we’d used in class as possible when they were doing their final essay. I think that maybe half the class did so to some degree, even if it was just to use a figure other than a map to illustrate their essay. So, they were thinking beyond just written sources by that stage, which was one of my aims in devising the assessment.

JW: What were the key challenges for engaging the students with this kind of activity?

KF: The assessment. I asked them to use two different mapping sources. It was getting them to think beyond Digimap to identify other sources that would enable them to engage with historic maps. Digimap is a very intuitive and easy platform for annotating historic maps but it’s not the only one. I wasn’t expecting students to learn GIS, but I was expecting them to use things like the annotation features in Microsoft Word, even if just to illustrate the changes on the maps themselves. So, the biggest challenge I found was getting the students to use a wider range of resources because Digimap is so easy to use. It was kind of a curse and a blessing that all of them seemed to ‘get’ Digimap. They all technically answered the question, but getting them to use different resources would have developed their skills in historic map progression and their analytical work.

JW: So, how would you get them to do that in future?

KF: In terms of assessment, I guess I would be a lot stricter. Doing more work on different tools for accessing and analysing historic maps would be useful. One of the things that I would develop in relation to this approach to assessment would be a dedicated class or a lab in which students would follow step-by-step the exercises on the screen for a range of tools beyond Digimap, such as QGIS. So, introducing them to the tools that would be necessary to build their own historic maps, including Geographical Information System databases if they wanted to. That would be the next step: getting them to use their own skills in order to develop maps of their own rather than just using what’s available in Digimap.

I only did that particular module once, but I used what I learnt from it on another module, Digital Heritage, which was also a second-year, second-semester module. I had to think about using online mapping resources like Digimap. So setting up a lab was something that I did for the next two years. It gives students a chance to mess up in class so they can help each other and they can ask questions. And they’re a lot more relaxed about asking questions when you’re doing it on the screen with them. So, giving them a kind of space to play and experiment.

JW: What do you think that the students got out of doing this exercise and the subsequent kind of iterations of the exercise on a different module?

KF: I think the students, the History students in particular, gained a sense of how to use different sources: different archival sources and different data sets in particular. They were exposed to a different way of looking at sources that they wouldn’t necessarily have approached before. I know that some students went on to apply mapping or Geographical Information Systems in their dissertation projects. So, it helped them to think spatially about the past and a few of them applied that to their dissertation projects as they progressed in their studies, which I think was a really positive takeaway.

JW: And what did you get out of doing it?

KF: I gained some quite valuable skills in lab instruction because what I do is mostly kind of, I guess, Arts and Humanities based. This is a good pedagogical skill to develop. It was also really interesting and I learnt a lot about the different places that the students were working on in their own case studies and that made for much more interesting reading than a standard assignment. If I’d just given them a list of case studies or places to look at, it probably would have been more focussed from the start, but also much less interesting for the students and for me. A lot of the students, for instance, picked the places that they were from or the places they were familiar with. And that meant that I got a sense of who they were and what they were interested in. This may have been motivating for them because they chose the topic rather than being given a list of resources or questions by me. It showed them that they can use their own skills in a practical way rather than me just telling them this is an interesting landscape, go and look at it. They were really tasked with going and finding what was interesting in their landscape.

As I said, a few of them used areas they were from; so the villages or places that they knew about. And they talked about how they weren’t aware, for instance, that the road that they lived on was as new as it was. I think that the English landscape tends to look quite old to younger eyes, and applying some of the skills that they were learning about how the landscape has changed rapidly in the last 300 years, and seeing how that’s worked on the page, gave them a sense of how industrialised the rural landscape really is.

JW: Would you do it again? And, if so, what advice would you give to someone else who is going to do it?

KF: I would encourage anyone who is going to use a resource like Digimap for assessment or for teaching to make sure that you walk the students through the process in class. Digimap is relatively intuitive, but if a student is not familiar with using things like illustration or graphics software, there are things, like what a particular icon does, that students might not grasp readily without help. I think it’s essential to walk the students through the process, to take the time and do a workshop with them rather than just telling them to use certain resources. I would also encourage anyone doing this to encourage students to share their findings: so, setting up a class blog or maybe tweeting about it and sharing some of the maps that they’re making. These are really underused resources and I think it would be really good for students to see how much interest their findings can inspire. Platforms like Twitter or blogs can generate a broad readership and I think that students would be surprised at how interesting their work might be to local historians or local communities.

JW: My final question is about how you think this approach might be applicable in the current pandemic context. Would it be possible to do this as an online activity that the students do remotely? And how might that work?

KF: I might even just do it on Blackboard Collaborate Ultra (rather than doing it in a classroom), because I think that sharing a screen while the students are already on a computer might even be easier than trying to do it in a computer lab. They can see exactly what it is that you’re looking at. And I think it’s important to make use of some of these resources now because physical archives are closed. An undergraduate history student, for instance, can’t go into an archive. They can’t go into an historic environment record office. So, being made aware of the different resources that are available to them online and getting walked through workshops and labs on how to use those resources would be really valuable for them at the moment. And I think that from a disciplinary perspective it’s going to be really interesting in the future as more and more people use these kinds of online resources. Developing that kind of digital literacy early in their academic career might actually be quite valuable, I think. It would also prove useful for employment or for further study.

Links and further resources:


An interview with Charles West – Using Wikipedia to Teach Medieval History and Digital Literacy

The Making Digital History project is particularly concerned with approaches to teaching history online that involve students in constructing things for themselves (including their own knowledge and understanding via more ‘traditional’ text-based approaches) in digital spaces and sharing the results of their endeavours beyond their tutors and peers.

I thought it would be interesting to gather some insights from colleagues who have developed expertise in this area. So, over the next few months, I’ll be releasing a series of ‘interviews’ with historians who have, in different ways, been asking students to ‘make’ history online.

Image: Dr Charles West

I was delighted that Dr Charles West, Reader in Medieval History at the University of Sheffield, agreed to be the first interviewee and must apologise that it’s taken me so long to get it online.

Charles has done a lot of work in engaging students with Wikipedia over the past few years. In the following interview, he discusses his approach, what the students have gotten out of it and what he’s learnt.

At the foot of this page, I’ll share a few additional resources for those of you who might be interested in trying this out for yourselves.


BACKGROUND

  • Can you give me a bit of background about the module/ project? What motivated you to do it?

This module’s for MA students in the Department of History in Sheffield; it’s been running in different guises since 2018. The students are trained to edit Wikipedia, and we read and discuss some of the growing research literature about the encyclopedia. They make edits to a page of their own choosing, but preferably relating to the Middle Ages. And then they write an essay reflecting on their experience, and relating it back to some of the research literature. You can see last year’s module guide here, which details the weekly readings.

My motives for setting up the module were threefold. First, I wanted to improve Wikipedia’s coverage of the Middle Ages, as public outreach as well as to support my own undergraduate students. Since they’re going to use it anyway, why not make it better for them, and for everyone else at the same time? Secondly, I wanted to give our students the opportunity to apply their specialist knowledge in an authentic, meaningful and grounded way. And finally, I wanted them to reflect on how the internet age is changing the conditions of knowledge production, distribution, and consumption.


MAKING DIGITAL HISTORY

  • What do your students ‘make’? How do they make it? Do they work together or individually? Do they share the results of their work? If so, how and with whom? How do you assess their work?

This module is all about making and sharing with the entire world online, as well as reflecting on the process, and what it means for historians today. The students edit Wikipedia directly. Sometimes they create new pages from scratch (for instance on the Somniale Danielis, Bishop Adventius of Metz or Bernard the Pilgrim), but most often they work on pages that already exist (for instance the Battle of Tlatelolco or the Council of Hertford). So far the module has improved around twenty-five different pages, which between them have been read over three million times(!) since the module first ran.

Image: A medieval scribe with a laptop

The students focus on their own individual pages, since I encourage them to link their edits to their own interests and expertise, relating to their undergrad or MA dissertations for instance. That doesn’t mean it’s not collaborative working, just that they are collaborating with other (anonymous) Wikipedia editors online rather than each other. Of course it’d be perfectly possible for the class to work jointly on a single page or set of pages. That would have some pay-offs. But it’d limit their capacity to reflect on the differences between their deep knowledge of a topic, acquired in an academic environment, and its representation on Wikipedia, the encyclopedia that anyone can edit.

And that is the main ‘learning outcome’ I’m looking for. That’s why the module assessment isn’t based on the Wikipedia pages the students have edited, but on a reflective essay. For me, it’s not enough just to feed the machine, and to assess how good students are at following the conventions for doing so. Ultimately I want them to think about the conceptual implications of a free, comprehensive online encyclopedia that to some extent ‘truths the internet’, from the perspective of an expert who’s personally contributed to it. Really this is a module about the function and transmission of different forms of historical knowledge in the twenty-first century.


DEVELOPMENT

  • What were the key challenges? Has your approach evolved over time? If so, why/ how?

There haven’t been as many challenges as you might imagine! I had to persuade my Department to let me run the module, which was the first of its kind in the University (and as far as I know the first in any UK History department), but that was pretty straightforward. We had to set up new marking criteria for reflective essay writing but again that wasn’t too hard, thanks to the generous help of colleagues.

Occasionally a student’s edits are reverted by protective or defensive Wikipedia editors. It’s a risk you run when working with a public page ‘in the wild’, which isn’t under anyone’s direct control. It can knock the student’s confidence. But I can reassure them that really this is just grist for the mill for their essay. That’s an added advantage of assessing not the page edits, but the reflections on them: it makes the assessment bullet-proof, so to speak.

The module has changed a fair bit over the years. In its first incarnation it was twin-track, teaching specific historical content (on the theme of medieval exemption, which I was working on at the time) alongside the Wikipedia component. The idea was that students would find some topic relating to medieval exemption, drawing on our seminar discussions, and then work on a related Wikipedia page. However, I found this structure unsatisfactory. It pulled the students in different directions, and didn’t allow them to use their own historical expertise freely. Ultimately I think modules like this need to have one focus, not two. So, having grown in confidence myself, I changed it to focus more squarely on Wikipedia and the wider digital turn. I still encourage the edits to be on a medieval topic, as a service to the field I work in, but it’s not really essential to the nature of the module any more.


TAKEAWAYS

  • What do you think that the students get out of it? What have you got out of doing it? What advice do you have for anyone else wanting to do a similar project?

Students like showing their friends and relatives (and perhaps future employers) that their work is being read by people across the world on a daily basis, being spoken aloud in people’s homes by Amazon’s Alexa, and indeed being consulted by other students in Sheffield and globally. That’s got an immediacy and real-world impact that artificial exercises or wikis in a VLE can’t match. Some students have become prolific Wikipedia editors in their spare time. A few finish the module more cynical about Wikipedia, and its role in underpinning the Silicon Valley knowledge economy. That’s fair enough: after all, as I point out, students enrolled on the module are adding value, in an infinitesimal way, to Google and Amazon, and they are even paying the university for the privilege of doing so! But it opens their eyes to a way the world works that they might not have considered before. By the end of the module, they are calling for more research on how academic history, public history and Wikipedia all intersect (happily colleagues such as Mike Horswell are on the case).

I enjoy teaching the course too. It’s great to see students acquiring new skills in digital literacy, which might open doors for them in future. It’s great to graduate a new generation of Wikipedia editors, public historians with the know-how and confidence to engage actively with this huge knowledge resource which, for all its flaws and compromises, is a powerful instrument for the democratisation of knowledge. And it’s satisfying to see pages on medieval history being updated at a scale that wouldn’t be practical for a lone researcher, helping to bridge some of the gaps between universities and the wider world, and helping to address some of the diversity problems in Wikipedia that scholars such as Victoria Leonard have identified. It’s fun to teach a module that’s also an ongoing public history project.

If you’re thinking about running a similar module, I’d say – go for it! For me, it works best at a postgraduate level, since I’m interested in the juxtaposition between a student’s existing historical expertise and the forms of knowledge represented by the encyclopedia, and that’s most straightforward when the students have already completed a History degree. That’s the place in the curriculum where my colleague Matt Graham ran it at Dundee, too. But Ed Roberts at Kent successfully integrated Wikipedia into his teaching with History undergraduates, just pitched a bit differently, and that’s something I’m weighing up doing too. Matt, Ed and I have all found Wikimedia UK to be invaluable allies and supporters, helpful beyond expectation. You’ll also find a wide range of advice and encouragement on Twitter. So I’d encourage you to take the Wikipedia plunge – I haven’t regretted it yet!


If you want to know more about using Wikipedia in your teaching, here are some useful links: 


Pandemic Pedagogy – Beyond essays and exams: changing the rules of the assessment game

This post is part of History UK’s Pandemic Pedagogy project. For more about the initiative, follow HUK’s blog and Twitter feed.


Assessment, carrots and sticks

‘Assessment is an integral part of instruction, as it determines whether or not the goals of education are being met.’ (Edutopia, 2008)

The centrality of assessment to learning in higher education is rarely questioned. Experience has taught many of us that nothing motivates students quite like a looming deadline or an upcoming exam. Students channel their energies into activities that determine final results – a strong motivating factor. And educational theory stresses that well-designed assessments can encourage students to engage in deep learning (Briggs, 2015). But overemphasis on grades can lead to problems.

We sometimes imagine that students see exams and essays as important because these assessments encourage them to engage deeply and to hone their skills. But it is much more likely that we are encouraging an instrumental approach to education. Students jump through the hoops that we have put there in order to secure the grades that they want – they learn in spite of our approach to assessment, not because of it. If deeper learning takes place, it is pretty much an accidental side effect of that process.

Some students are very good at playing the exam-essay game that we have devised for them, although there have long been indications that exams favour certain demographic groups over others (Brookings Institution, 2017). We can do better, devising assessment regimes that include all students. Indeed, there are some exciting examples of innovative assessment in history out there already (e.g. see Lucinda Matthews Jones’ 2019 blog post on creative assessments and the #unessay; Chris Jones’s 2018 post talks about using the approach in an early American history course too). The shift to online assessments is an opportunity for more of us to embrace more innovative forms of assessment that will engage students in meaningful learning, develop their skills, knowledge and identity as historians, and better prepare them for future study and/or the world of work.

Moving online: emergency (traditional) assessment 

I’m well aware that this call-to-arms may not be to everyone’s taste and I’m certainly not suggesting that the essay and the exam have no pedagogic value (well, I’m not really sure about the exam, but it’s not a hill I’m willing to die on). What I’m saying is that our students will be better nourished by a more varied and interesting assessment diet that blends traditional and innovative forms. 

Of course, traditional forms of assessment can easily be reconfigured so that they can be completed and administered online. Indeed, aside from the odd visit to the library, the majority of essays are largely researched, written and submitted digitally nowadays. Exams are a bit more problematic, but the ‘emergency’ phase of online instruction from March to June 2020 showed that even they can be done online, as take-home tests (the fancy name is ‘time-constrained assignments’) or using online tools such as ProctorU. Essays and exams can therefore be moved online relatively straightforwardly. We can do more, though.

Engaging constructively with the digital world

There are already cases of historians developing innovative assessments that promote meaningful learning about the digital world. A good example is Charles West’s MA module in medieval history at the University of Sheffield, in which students learn about how information is presented on Wikipedia, before researching and writing (or improving) pages on topics that they have studied (West, 2018). Over the past few years, increasing numbers of historians have recognised the value of engaging students in learning that involves exploring how knowledge is presented and constructed online and even contributing to digital knowledge creation within the discipline. 

This represents something of a step-change because, until fairly recently, many course handbooks and introductory lectures began with dire warnings to students about the consequences of using websites such as Wikipedia as sources.

Image: ‘No Wikipedia’

Image source: http://medias.liberation.fr/photo/592891-arton2684.jpg?modified_at=1200590185&width=960

It has always struck me as unrealistic, and in some sense dishonest, to expect students not to use such a readily available source (as a student I often found it a useful starting point when moving into unfamiliar territory, and continue to do so). A more productive way forward is to devise activities that develop students’ digital literacies (for more see Doug Belshaw’s #neverendingthesis; plus some quick hints and tips), to enable them to filter ‘good’ from substandard online sources. Library staff have played a big role in many institutions and are experts we should turn to for support and as collaborators in this endeavour (see the 2017 statement from the International Federation of Library Associations and Institutions).

The key point here is that online sources are not ‘inappropriate’ or ‘unacademic’ in and of themselves. Rather, it is important to educate students in how to engage with and make use of the internet productively. Given that source evaluation and analysis is what historians do, it seems to me that it shouldn’t be too difficult to figure out how to do this. 

Building in the digital world: a new way forward?   

But perhaps we can go even further, beyond assessments that cultivate students’ digital capabilities (i.e. making them better at doing stuff online and navigating the digital world) and engage them with assessments that ask them to actively create things online. Historians have been using blogs and wikis in their teaching for well over a decade (Russell Olwell’s 2008 blog offers reflections from an early adopter of blogging). Several of my colleagues at the University of Lincoln have experimented successfully with using blogs to present the results of student work to the public (e.g. on Jade Shepherd’s Mad or Bad? module).

But what happens if we ask students to create and curate online resources and spaces for themselves? Even deeper engagement and learning, perhaps. A good example here is Arthur Burns’ module, At the Court of King George III, at King’s College London, which has students present the results of their research into the Georgian Papers using Xerte, an open-source tool for creating online learning resources.

Such projects require students to think about audiences beyond the marker and their fellow students. (In the case of essays and exams, not even their peers will get to see what students have done.) The focus shifts to encouraging students to consider how to present information engagingly, how to incorporate visual and other media in their work, and how to write in different registers (i.e. it’s still developing writing skills, just not in the 2-3,000-word-essay format). Innovative online assessments thus open up opportunities for students to engage with audiences beyond the academy and to develop a far wider range of relevant disciplinary and career skills. 

Such opportunities for students to engage in knowledge production for consumption beyond the academy are not any less ‘rigorous’ than essays or exams. In fact, they rely on many of the same research and writing skills as traditional forms of assessment. Better yet, they also encourage a new awareness of how and why knowledge artefacts turn out the way they do. The more creatively we engage with the digital landscape, and the more actively and systematically we explore its pedagogic potential, the better our chances of supporting our students to become capable and highly-skilled historians in the present and future.


My previous post as part of the Pandemic Pedagogy project was on designing learning. Next week, I’ll be writing one about feedback.


Pandemic Pedagogy – Redesigning for online teaching, or Why learning objectives aren’t a waste of time

This post is part of History UK’s Pandemic Pedagogy project. For more about the initiative, follow HUK’s blog and Twitter feed.


In this post I want to spend a little bit of time thinking in quite general terms about the process for turning face-to-face teaching into an online or blended format. I’ll suggest that rethinking learning objectives with the concept of ‘constructive alignment’ in mind might be one way of helping us to think through this process. I’ll then outline some ideas about how this might be put into practice for those redesigning their teaching for 2020-21.

So, the problem (pandemic aside). One of the main problems that faces many lecturers as they start to rethink their teaching for online or blended delivery is that the way educationalists and ed-tech specialists (= the people we’re looking to for support and advice right now) speak about teaching and learning isn’t always very well aligned with how academics think about it.

Learning outcomes are fundamental to educationalists – they are supposed to drive learning, to inform assessment and to structure how courses actually run. They operate at the level of the programme, the module and, in some instances, individual teaching sessions. However, unfortunately, the language of learning outcomes is alienating for many academics.*

Despite these issues (and I’m certainly not suggesting that the fault here lies entirely with educationalists), I’ll now suggest that reimagining learning outcomes within the framework of ‘constructive alignment’ could help us with the redesign process as we move our teaching online.

Alignment is all

First, I’ve found it helpful to think about LOs in relation to the concept of ‘constructive alignment’. For a short and helpful introduction to constructive alignment, watch this YouTube video:

Constructive alignment is based on the sensible premise that students learn best when all of the elements of the teaching that they receive are synchronised – that the learning activities, objectives and assessments are pointing in the same direction.** Now, it may seem that this is blindingly obvious, but it’s surprising how often this doesn’t happen, for whatever reason. A good example of misalignment might be using an exam to assess a module on which the learning outcome requires that students develop research skills or their writing skills.

I think of it as the triple-A approach to designing learning. AAA: Aims (= LOs), Activities (= what you and the students do), and Assessments (and feedback = how you evaluate what they have done and help them to improve) should all be in alignment.

So, first tip as you’re considering what to change when shifting online – have a good think about whether all of these elements are actually lined up with one another. If anything isn’t in line, then think about changing that first of all.

Language

Second, it’s important to understand that learning outcomes are (or should be) really just a statement of what you want students to get out of your module, your seminar (online or otherwise) or whatever it is that you’re asking them to do. Try to make LOs specific, not generic edu-speak, and write them in language that the students can understand (e.g. what does ‘critical thinking’ actually mean, in practice?). If you don’t really know what the LOs mean, then how can you expect the students to do so?

Second tip: ideally, rewrite them as you refocus for online delivery and make them meaningful for teacher and student; if that’s not possible because of institutional administrative processes (the forms had to be filled in 3 years ago to make a change for next semester) or whatever, then provide explanatory notes for your students, or make a little video to explain what the generic LOs really mean. Maybe don’t even mention LOs, but just tell them what they will get out of it. 

Less is more

I’m struck by how long the lists of LOs are on some modules (even on relatively low credit-bearing courses). If your LOs look like you’re trying to turn an 18-year-old into a professional historian in 10 weeks, then you’ve probably got too many. You and the students need to be able to focus on what is really important. For a 15-credit module, can you realistically expect anyone to master (rather than touch upon) more than 3 (or maybe 4) key take-aways in any kind of depth?

Tip no. 3: ideally, pare the LOs back to represent what you really think is important for the module; use this to help you focus for online teaching. Again, if institutional structures don’t allow this, then do it informally. 

Some other key (possibly slightly repetitive) considerations

  • WORK BACKWARDS – Think about what you want the students to get out of any session/ block of interactions/ module (= your Aims/ LOs) and then consider whether Activities and Assessments are helping you to achieve those goals.
  • BREAK IT DOWN – Focus on individual sessions as well as the course as a whole, because alignment can otherwise become detached from the actual delivery and, to some extent, the assessment. In online teaching this may become even more of an issue.
  • INPUTS AS WELL AS OUTPUTS – Put yourself in your students’ position as you think about all of this. How will they (not you, or you when you were a student, but them) react to/ experience what you’re asking them to do?
  • TESTING, TESTING, TESTING – Related to the last point, test everything A LOT, and do so from a student perspective. If possible, get a colleague to look through and offer constructive criticism.
  • THINK BEYOND ACTIVITIES – In my opinion, activities include learning resources (e.g. reading lists, handbooks, handouts, etc.) and support mechanisms (e.g. office hours, personal tutoring). Think about how these support (or not) the overall construction of the module.

I hope that this might be helpful to at least some of you as you think about redesigning learning for online delivery. Basically, we’re stuck with learning outcomes, and they can be helpful, so why not just use them to our advantage!?


*Criticisms might include any or all of the following: (1) that they encourage a mechanistic/ instrumentalist approach among students and teachers; (2) that they are underpinned by quality-driven box-ticking processes; (3) that they are so jargonistic and abstract that they don’t have much to do with the actual discipline that’s being taught; (4) that they are about generic skills (= careers and employability) rather than disciplinary practice; (5) that they are so detached from the actual practice of teaching and learning as to be meaningless (= e.g. they operate at the administrative level of the module rather than the pedagogic level of the classroom).

**Sidebar: for me, constructive alignment is also a nod to the idea that students learn best when they are actively engaged – i.e. doing stuff, including reading and thinking – rather than passive consumers of educational experiences.


Reflections on a survey of History students’ experiences of lockdown learning

Last month, alongside our survey of staff experiences of teaching during lockdown, we surveyed UG and PGT students in the School of History and Heritage at the University of Lincoln. I mentioned this a couple of weeks ago in History UK’s #PandemicPedagogy Twitter chat.

 

I’ve had a bit of time this morning to process a summary of the results that was kindly put together by my excellent colleagues in the School, Michele Vescovi and Giustina Monti. Nguyen Grace, of the Lincoln Academy of Learning and Teaching, also did an analysis, which has fed into this summary. Thanks to them all!

We had 117 responses from students across the School, from all programmes, bar one. Most responses (71) were from the BA in History, which is the biggest programme by far. Overall, this represents a response rate of about 16.5% of the total UG and PGT students in the School. This may seem like quite a low number, but given that term had finished and many students won’t have been checking their uni emails, I think it does give us some limited indication of how the student body as a whole experienced lockdown learning (which I’ve just seen does have its own hashtag! – #LockdownLearning).

Here are some key points I took away from looking over the survey and the summaries, with thoughts about their implications for future teaching under post-lockdown conditions:

  • VARIETY – unsurprisingly, students (even within the same programmes, year groups, modules, seminars) had very different experiences of learning during lockdown, in both a positive and a negative sense. No one-size-fits-all solution will work for the next semester, so we’re going to have to be flexible and adopt a range of approaches/ tools.
  • SYNCHRONOUS T&L – most students were enthusiastic about tools that supported ‘live’ interaction with tutors (this aligns rather nicely with the results of the staff survey), such as Blackboard Collaborate and, to a lesser extent, MS Teams, though there was also some praise for structured asynchronous tools, such as discussion boards and Talis Elevate. It will be important, even for courses that are delivered entirely online (or in a blended way), to maintain this sense of ‘presence’ and interaction with students.
  • CONTENT DELIVERY – linked to their love for ‘live’ teaching, many students liked Panopto as it gave them (a sense of) direct connection to lecturers, and flexible access to content, which were understandably important during such a disrupted period. Given that the consensus seems to be that online lectures are best delivered in short bursts (no more than 20 mins), there may be some work to do here in recalibrating student (and teacher) expectations and practices.
  • ACCESSIBILITY (also a concern on the staff survey) – it was clear that access to kit, working space, time, ‘head-space’, etc. were all big issues that affected students’ experience of lockdown learning.* Much needs to be done by institutions to support students in overcoming these accessibility challenges. Learning materials and activities will have to be similarly accessible in a number of senses (again, flexibility and asynchronicity may help here).
  • COMMUNICATION – this was probably the key point that came out of the survey. Students were well aware that this was an unprecedented situation and appreciated the efforts that had been made to support them, but most criticisms related to (lack of) communication and/or could probably have been mitigated by clearer communication from the University, the School and/or the tutor. Coordinating clear and consistent (and definitely not contradictory) communication strategies at all levels will be important as we prepare for the new semester and actually get on with teaching.
  • CONSISTENCY – consistency of experience was a key issue for a number of students too, and ensuring that there is rough equivalency in this regard across (and even within) modules will be important.

Now, I don’t think any of this is particularly mind-blowing, but it does reinforce a lot of what I’ve been thinking (and reading) over the past few weeks. There will be a lot of work to do in order to ensure that students have equitable and effective access to opportunities as we move beyond the lockdown learning phase.

A final point – after reviewing both the staff and student surveys, reflecting on my own experience, and doing a bit more reading around the topic, I’m less convinced that the emergency phase of teaching was actually the success that was depicted at the time or since (anywhere – I’m not just talking about my home institution). Institutional narratives that paint this period as one of largely successful ’emergency’ adaptation aren’t very useful in the longer term, even if they may have helped to maintain morale at the time.

We (or at least most of us) certainly got through it and (many of) the students got through it and in that sense it was successful. Many of us learnt a lot, which is good. That’s it.

But it’s clear to me that quite a lot of students and colleagues found it really difficult, if not impossible. The sooner we admit that, reflect on what we’ve learnt and feed that reflection into planning for the next semester, the better. Luckily, there’s a lot of great work going on right now in that direction, including History UK’s #PandemicPedagogy initiative, as well as work by the Royal Historical Society and the Institute of Historical Research. Collectively, there’s a lot of good practice out there, and if it can be harnessed, it will help us to meet the challenges we face as we move beyond lockdown learning.

 

(*Unsurprisingly, most students (over 90%) used laptops to access online learning, often in combination with PCs, mobile phones, and/or tablets.)
