Tuesday, 17 April 2018

Large Group Teaching

This session on 17.04.18 formed part of a series of events run by the Leicester Learning Institute (LLI) in 2017/18 focused on large group teaching. Volko Straub and Alan Cann attended, and participants worked in interdisciplinary groups to explore how small group teaching strategies can be developed for large cohorts and to design approaches to use with students.

The session was divided into a series of short activities:

  • Data gathering
  • Idea generation
  • Design
  • Sharing
  • Detailed planning

The intention is that each participant came away with a fairly small-scale implementation to introduce into their teaching next session, and that the ideas will be supported and followed up by the LLI as time progresses.

Friday, 13 April 2018

Active learning and student engagement: A MOOC Approach

On 12.04.18, Shaun Heaphy and Alan Cann attended the LLI event Active learning and student engagement: A MOOC Approach, run by Nicola Gretton and Rachael Tunstall.

The session started with a discussion of Leicester's contributions to the FutureLearn platform. To date, 77,000 people have started a Leicester MOOC (Massive Open Online Course), but as with all MOOCs the drop-off rate is steep:

  • 50% of starters complete 50% of the course
  • 34% of starters complete 90% of the course
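
Taken at face value, and assuming those percentages apply uniformly to the full cohort (my assumption, not a figure from the session), the headcounts work out roughly as follows:

    # Back-of-envelope retention funnel from the figures quoted above
    # (assumes the percentages apply uniformly to all 77,000 starters).
    starters = 77_000
    print(f"{int(starters * 0.50):,} learners reach the halfway point")  # 38,500
    print(f"{int(starters * 0.34):,} learners reach the 90% mark")       # 26,180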

However, this is a rich source of data about online engagement which can be translated into campus-based courses:

  • Likes: variety; brevity (multimedia length, writing for screen-based reading); strong structure/narrative.
  • Dislikes: audio without video; tasks without focus.

Evidence of the impact of MOOCs at Leicester:
https://www2.le.ac.uk/offices/lli/developing-learning-and-teaching/enhance/strategies/evidence-of-the-impact-of-moocs-at-leicester

After the introduction, participants carried out workshop activities on engagement, based on six active learning approaches:

  1. Co-operative & Collaborative Learning
  2. Peer Learning
  3. Games & playful learning
  4. Enquiry-based learning – self-directed enquiry or research by the student
  5. Inquiry-based learning – problem-based learning
  6. Conversational learning



Monday, 19 March 2018

Thinking About Group Work

It's a bit mechanical in places, but this new evidence-based teaching guide on group work from CBE Life Sciences Education has some useful features, in particular the interactive online chart, which is a good place to start when thinking about group-work assessments:

[screenshot: the guide's interactive chart]

Wilson, K.J., Brickman, P., & Brame, C.J. (2018). Group Work. CBE Life Sci Educ, 17: fe1. doi:10.1187/cbe.17-12-0258
Abstract: Science, technology, engineering, and mathematics faculty are increasingly incorporating both formal and informal group work in their courses. Implementing group work can be improved by an understanding of the extensive body of educational research studies on this topic. This essay describes an online, evidence-based teaching guide published by CBE—Life Sciences Education (LSE). The guide provides a tour of research studies and resources related to group work (including many articles from LSE). Instructors who are new to group work, as well as instructors who have experienced difficulties in implementing group work, may value the condensed summaries of key research findings. These summaries are organized by teaching challenges, and actionable advice is provided in a checklist for instructors. Education researchers may value the inclusion of empirical studies, key reviews, and meta-analyses of group-work studies. In addition to describing key features of the guide, this essay also identifies areas in which further empirical studies are warranted.


Wednesday, 21 February 2018

Digital by default? People may be our most valuable asset

Commentary:
I often read education research papers and discard them because the conclusions claimed are not fully supported by the data – either the experimental design is poor and open to interpretation (it's difficult to design good education experiments, for a range of reasons including ethics), or the statistics simply don't support the argument. I feel this paper is worthy of discussion, though, because it contains a rigorous statistical analysis and a thoughtful discussion of possible confounding factors, and because, although it does not necessarily apply to the University of Leicester or the School of Biological Sciences, it does raise some interesting issues to consider.


Guest, R., Rohde, N., Selvanathan, S., & Soesmanto, T. (2018). Student satisfaction and online teaching. Assessment & Evaluation in Higher Education, 1-10. https://doi.org/10.1080/02602938.2018.1433815
Abstract: This paper presents an analysis of the response of student satisfaction scores to online delivery for courses at a large Australian university. Taking data from 2653 courses delivered since 2011 we employ a difference-in-differences estimator approach to evaluate the impact of a transition from face-to-face learning to online learning of courses on student satisfaction. We estimate that, on a five-point scale, conversion to online learning lowers course satisfaction by about 0.2 points and instructor satisfaction by about 0.15 points. These correspond to shifts relative to the underlying distributions of about 25–30% of a standard deviation. Some implications of the (slight) relative unpopularity of online learning are discussed.
Authors’ Conclusion:
"Predictions have often been presented that transition to online delivery would likely reduce course and teaching satisfaction. This paper, employing a difference-in-difference technique on a large set of data taken from 2653 courses at a large, multi-campus Australian university, has presented an analysis of student satisfaction scores of courses and their instructors for online and face-to-face modes of delivery. We have found that instructors of online courses are less popular than face-to-face instructors, and that converting a course from face-to-face to online seem to diminish student satisfaction with that course. The effect sizes are meaningful (around 25–30% of a standard deviation) but hardly staggering. It seems that online teaching is less well received by students, but hardly overwhelmingly so, conditional upon the assumption that our satisfaction scores are meaningfully comparable across delivery modes. A conservative interpretation of our findings is that, while it is still in its infancy, online instruction is probably considered to be a less satisfying learning experience for students. Educational administrators should weigh this moderate disadvantage to online education against some of its well-known advantages."


But if you're not completely convinced by this paper, here's another one:

Webb, O. J., & Cotton, D. (2018). Early withdrawal from higher education: A focus on academic experiences. Teaching in Higher Education 06 Feb 2018
http://doi.org/10.1080/13562517.2018.1437130
Abstract: Early withdrawal from higher education (HE) programmes can be detrimental for the students and institutions involved. Quantitative research has often concentrated on demographic and social antecedents (e.g. gender, prior education). Other factors may be more open to intervention e.g. students’ academic experiences in HE. Using data from an institutional survey (N = 1170), logistic regression tested a range of academic experiences, regarding their relationship to contemplation of withdrawal (‘COW’: a recognised marker for actual withdrawal). COW was associated with student perceptions of low one-to-one contact with staff; non-traditional delivery methods; low peer-interaction; and high assessment load. Interestingly, COW was not associated with overall contact hours, large classes, or personal tutoring. The contributing factors explained 5.1%–8.6% of variance in COW, suggesting they may be meaningful levers for optimising retention. The paper discusses links to existing literature, future research directions, and applied implications for institutions.
Authors’ Conclusion:
"Controlling for known demographic predictors, several factors were associated with contemplation of withdrawal, which was itself a significant predictor of actual withdrawal. These were:
  • Students’ perception that one-to-one contact with teaching staff was low.
  • Students’ reports that lectures were not the main teaching format.
  • Students’ perception that opportunities to interact with fellow students were low.
  • Students’ reports that the volume of assessment was excessive."
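
For readers unfamiliar with the technique, logistic regression models the probability of a binary outcome – here, contemplation of withdrawal – from a set of predictors. A minimal sketch on synthetic data (the variable names, coefficients and data are my own illustration, not the authors' dataset or code):

    # Minimal logistic-regression sketch of a COW-style analysis
    # (synthetic data and hypothetical variable names).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1170  # survey size reported in the abstract

    # Hypothetical predictors on a 1-5 scale: perceived one-to-one contact,
    # peer interaction, and assessment load.
    X = rng.integers(1, 6, size=(n, 3)).astype(float)

    # Synthetic outcome: low contact/interaction and high load raise P(COW).
    logits = -1.0 - 0.4 * X[:, 0] - 0.3 * X[:, 1] + 0.5 * X[:, 2]
    cow = rng.random(n) < 1 / (1 + np.exp(-logits))

    model = LogisticRegression().fit(X, cow)
    print(dict(zip(["contact", "peer_interaction", "assessment_load"],
                   model.coef_[0].round(2))))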




A.J. Cann





Thursday, 15 February 2018

Rubrics North of the Border

https://upload.wikimedia.org/wikipedia/commons/8/83/Rubric.jpg
Earlier this week I was in Edinburgh to give a seminar on GradeMark and feedback. While I was there I had an interesting conversation with Paul McLaughlin (Associate Director of Teaching) about using rubrics to speed up marking and the delivery of feedback. Paul has used GradeMark rubrics extensively with large class sizes and passed on a number of useful tips.

As anyone who has ever tried to write a GradeMark rubric knows, the format is rigid and rather limiting, since it demands a square matrix, e.g. 4x4 or 5x5, which isn't always what is required. Paul's first tip is that it is possible to write much more flexible assessment schemes by padding the matrix with dummy marks. For example, you may wish to grade four criteria of a written submission using five grades (each with associated feedback), but to grade an oral presentation which forms part of the same assessment, you may need a differently shaped matrix, e.g. three criteria (clarity, content and responses to questions) with six or eight grade descriptors. It is possible to construct such a matrix, and therefore to generate a mark for a multipart assessment, by filling the unused grade cells with the same mark duplicated several times and leaving the feedback section blank, since it won't be used. If you've never used GradeMark rubrics, this description may be as clear as mud, but if you've ever tried to write a rubric which covers the assessment scheme you want rather than the rigid one GradeMark forces on you, it's a lifesaver, and it also makes it much easier to adapt existing marking schemes to a rubric.
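
To make the padding idea concrete, here is a minimal sketch of such a matrix in Python (the criterion names and marks are hypothetical, not a real GradeMark export): one 7 x 8 matrix covers both parts, with the written criteria's last three cells simply duplicating the lowest mark and carrying blank feedback.

    # Hypothetical padded rubric: 4 written criteria (5 real grades) plus
    # 3 oral criteria (8 grades), padded into one rectangular 7 x 8 matrix.
    written = {criterion: [10, 8, 6, 4, 2, 2, 2, 2]   # columns 6-8 are padding
               for criterion in ("Aims", "Analysis", "Conclusions", "References")}
    oral = {criterion: [8, 7, 6, 5, 4, 3, 2, 1]       # all 8 columns used
            for criterion in ("Clarity", "Content", "Responses to questions")}
    rubric = {**written, **oral}                      # every row has 8 cells

    # Mark a multipart assessment by picking one column per criterion.
    chosen = {"Aims": 0, "Analysis": 1, "Conclusions": 0, "References": 2,
              "Clarity": 1, "Content": 0, "Responses to questions": 3}
    print(sum(rubric[c][col] for c, col in chosen.items()))  # 54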


Paul's second tip involves structuring QuickMarks to speed up the marking of assignments. For a well structured written submission such as a lab report, Paul writes a series of QuickMarks which follow the structure of the assessment. For example, this might be something like:
01.1 Introduction Feedback comment (Excellent…)
01.2 Introduction Feedback comment (Very good…)
01.3 Introduction Feedback comment (Good…)
01.4 Introduction Feedback comment (Satisfactory…)
01.5 Introduction Feedback comment (Weak…)
02.1 Methods Feedback comment (Excellent…)
etc.
By numbering the QuickMarks in this way, it is easy to work through them in sequence, and this greatly reduces the time spent typing repetitive Bubble or other inline comments. The feedback is rounded out by a more individualized comment in the GradeMark Text Box.
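
As an entirely hypothetical illustration, a few lines of Python can generate a numbered QuickMark set in this style (the section names and grade bands are my own, not Paul's actual set):

    # Generate numbered QuickMark titles in the style described above
    # (hypothetical section names and grade bands).
    sections = ["Introduction", "Methods", "Results", "Discussion"]
    grades = ["Excellent", "Very good", "Good", "Satisfactory", "Weak"]

    for s, section in enumerate(sections, start=1):
        for g, grade in enumerate(grades, start=1):
            print(f"{s:02d}.{g} {section} Feedback comment ({grade}...)")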


Lastly, if you've ever sat down to write a rubric from scratch, you may, like me, have run head-on into the lack-of-inspiration brick wall. Paul has made extensive use of the large collection of rubrics available online at: https://www.rcampus.com/indexrubric.cfm
It's unlikely that you'll find a rubric which fits your exact needs, but this resource is a great place to start, and by adapting a pre-existing scheme you can save a lot of time.


Alan Cann






Thursday, 1 February 2018

Stages of SoTL - Reading the Literature

As we heard in Anne Tierney's talk, the first stage of SoTL (the Scholarship of Teaching and Learning) is to develop a theoretical basis for your work. In part, this comes from reflection, but mostly it comes from reading and engaging with (including contributing to) the education research literature - standing on the shoulders of giants.
But reading the education research literature is not easy. For a start, where do you look? PubMed won't help you with this. A good place to start is, of course, to talk to other people and ask for recommendations, but more generally, the default database for PedR (pedagogic research) is Google Scholar. Here's an example:

[screenshot: a Google Scholar search]

If you're completely in the dark, citations will give you some clues about what you might read:

[screenshot: citation counts in Google Scholar results]

And help you find it:

[screenshot: Google Scholar links to the full text]

After that, talk to someone and discuss your ideas, or ask them to read your draft conference abstract, grant application, or manuscript for suggestions. Approach a colleague directly or ask for help via: bs-sotl@lists.le.ac.uk

Wednesday, 17 January 2018

Students’ emotional responses to feedback

It's not just what you say, it's the way that you say it...

There is a pre-existing research literature on students’ emotional responses to academic feedback, and like previous articles, this new paper indicates both how strong those reactions can be and how much they influence whether feedback is actually absorbed. Our (Alan Cann, Tessa Webb, Robin Green, Caroline Smith) current TEPF-funded project focuses on how to write better feedback by closing the gaps in understanding between students and staff. Over the next few weeks we'll be running student-staff workshops to explore this issue and to devise strategies for better communication and, therefore, more effective feedback. You may be invited to participate in one of these workshops, but if you're particularly keen to be there, book your place now by emailing one of the project team.


Tracii Ryan & Michael Henderson (2017) Feeling feedback: students’
emotional responses to educator feedback, Assessment & Evaluation in Higher Education, DOI:
10.1080/02602938.2017.1416456

Abstract: Assessment feedback allows students to obtain valuable information about how they can improve their future performance and learning strategies. However, research indicates that students are more likely to reject or ignore comments if they evoke negative emotional responses. Despite the importance of this issue, there is a lack of research exploring if certain types of students are more likely to experience negative emotional responses than others. This study builds on extant qualitative studies through a quantitative examination of two previously identified student variables: different citizenship backgrounds (domestic and international) and different grade expectations (higher or lower than expected). The participants were 4514 students from two Australian universities. Analysis of survey data revealed that, regardless of language experience, international students were more likely than domestic students to find feedback comments to be discouraging, upsetting and too critical. Students who received grades lower than they expected on a particular assessment task were more likely than students who received grades higher than they expected to feel sad, shameful and angry as a result of the feedback comments. This paper concludes with several recommendations, including the need to modify assessment feedback practices in order to be sensitive to different student cohorts.