Friday, 4 May 2018

Journal of Learning and Teaching in Higher Education

The first issue of the University of Leicester Journal of Learning and Teaching in Higher Education has been published. In this issue:

  • Editorial for Volume 1, Issue 1: Reimagining a Journal. Frances Deepwell
  • Reading Between the Lines. Matthew J Coombes
  • Active Learning in Physics, Astronomy and Engineering with NASA’s General Mission Analysis Tool. Nigel Paul Bannister
  • The benefits of sustained undergraduate inter-programme collaborations between international partners. Cheryl P Hurkett, Sarah L Symons, Sarah N Gretton, Chad T Harvey, Phillipa E Lock, Dylan P Williams, Derek J Raine
  • Student-contributed podcasts to support transition to higher education. Palitha Edirisingha, Robert Cane, Mengjie Jiang, Chris Cane
  • Exploring Complex Learning Spaces. Philip Wood, Paul Warwick
  • Reinterpreting 'Exploring Complex Learning Spaces' (Wood & Warwick, 2018). Shaimaa Ragab Abdelkarim, Tabitha Watson, Alsahira Alkhayer
  • Games as Education in the United Kingdom. Alison Harvey
  • More than skills: What can approaches to Digital Literacies learn from Academic Literacies? Stephen Walker, Alexandra Patel
  • Critical reflections on staff-student partnership and ‘re-interpreting’ journal submissions. Lorna Mary Cork
  • Reinterpreting ‘Critical reflections on staff-student partnership and ‘re-interpreting’ journal submissions’ by Cork (2018). Najima Mohamed, Nicola Blacklaws, Zainab Mustafa
  • Book review: Precarious Workers Brigade (2017) Training for Exploitation? Politicising Employability & Reclaiming Education. Stephen Rooney


Wednesday, 2 May 2018

Student Experience Conference 02.05.2018

Several staff from the School of Biological Sciences attended the University Student Experience Conference.

Introduction by Professor Jon Scott, Pro-Vice Chancellor Student Experience
In his introduction Jon discussed transitions throughout the student experience and the national regulatory climate.
Keywords for a positive student experience:
Expectation - Incorporation - Inclusivity - Belonging - Learning - Resilience - Support - Continuation - Progression - Employment

Focus On: Students’ Union Strategy
Gareth Oughton, Chief Executive, and Mollie Henstock, Wellbeing Officer
The Your Union Your Voice survey was discussed, along with the new three-year development plan.

This was followed by workshop sessions on:

  • Continuous Improvement, Hasan Ebrahim and Amna Shehzadi, Continuous Improvement team
  • Student panel discussion, Current University of Leicester students
  • Data and power business intelligence, Becky Johnson, Director of Strategic Planning and Performance

Afternoon sessions:

Focus On: Employability and Graduate Outcomes

Richard Wilcock, Deputy Director of the Career Development Service
Richard discussed supporting graduate success through academic programmes.

Stephen Isherwood, Chief Executive, Association of Graduate Recruiters
Stephen discussed current graduate employment trends.


Focus On: Education Excellence, Graham Wynn, Director, Education Excellence
Graham discussed TEF Metrics and the launch of the University Education Excellence programme.


The conference finished with a Q&A session.

Tuesday, 17 April 2018

Large Group Teaching

This session on 17.04.18 formed part of a series of events run by the Leicester Learning Institute in 2017/18 focused on large group teaching. Volko Straub and Alan Cann attended, and participants worked in interdisciplinary groups to explore how small group teaching strategies can be developed for large cohorts and to design approaches to use with students.

The session was divided into a series of short activities:

  • Data gathering
  • Idea generation
  • Design
  • Sharing
  • Detailed planning

The intention is that each participant came away with a fairly small-scale implementation that they will introduce into their teaching next session, and that the ideas will be supported and followed up by the LLI as time progresses.

Friday, 13 April 2018

Active learning and student engagement: A MOOC Approach

On 12.04.18 Shaun Heaphy and Alan Cann attended the LLI event 'Active learning and student engagement: A MOOC Approach', run by Nicola Gretton and Rachael Tunstall (LLI).

The session started with discussion of Leicester's contributions to the FutureLearn platform. To date, 77,000 people have started a Leicester MOOC (Massive Open Online Course), but as with all MOOCs the drop-off rate is steep:

  • 50% of starters complete 50% of the course
  • 34% of starters complete 90% of the course

However, this is a rich source of data about online engagement which can be translated into campus-based courses.
  • Likes: Variety, brevity (multimedia length, writing for screen based reading), strong structure/narrative.
  • Dislikes: Audio without video, tasks without focus.

Evidence of the impact of MOOCs at Leicester:
https://www2.le.ac.uk/offices/lli/developing-learning-and-teaching/enhance/strategies/evidence-of-the-impact-of-moocs-at-leicester

After the introduction participants carried out workshop activities on engagement, based on six active learning approaches:

  1. Co-operative & Collaborative Learning
  2. Peer Learning
  3. Games & playful learning
  4. Enquiry-based learning – self-directed enquiry of research by the student
  5. Inquiry-based learning – problem-based learning
  6. Conversational learning



Monday, 19 March 2018

Thinking About Group Work

It's a bit mechanical in places, but this new evidence-based teaching guide on group work from CBE Life Sciences Education has some useful features, in particular the interactive online chart, which is a good place to start when thinking about group work assessments:




Wilson, K.J., Brickman, P., & Brame, C.J. (2018). Group Work. CBE Life Sci Educ, 17: fe1. doi:10.1187/cbe.17-12-0258
Science, technology, engineering, and mathematics faculty are increasingly incorporating both formal and informal group work in their courses. Implementing group work can be improved by an understanding of the extensive body of educational research studies on this topic. This essay describes an online, evidence-based teaching guide published by CBE—Life Sciences Education (LSE). The guide provides a tour of research studies and resources related to group work (including many articles from LSE). Instructors who are new to group work, as well as instructors who have experienced difficulties in implementing group work, may value the condensed summaries of key research findings. These summaries are organized by teaching challenges, and actionable advice is provided in a checklist for instructors. Education researchers may value the inclusion of empirical studies, key reviews, and meta-analyses of group-work studies. In addition to describing key features of the guide, this essay also identifies areas in which further empirical studies are warranted.


Wednesday, 21 February 2018

Digital by default? People may be our most valuable asset

Commentary:
I often read education research papers and discard them because the conclusions claimed are not fully supported by the data – either the experimental design is poor and open to interpretation (it's difficult to design good education experiments for a range of reasons, including ethics), or the statistics simply don't support the argument. I feel this paper is worthy of discussion though because it contains a rigorous statistical analysis and a thoughtful discussion of possible confounding factors, and because although it does not necessarily apply to the University of Leicester or the School of Biological Sciences, it does raise some interesting issues to consider.


Guest, R., Rohde, N., Selvanathan, S., & Soesmanto, T. (2018). Student satisfaction and online teaching. Assessment & Evaluation in Higher Education, 1-10. https://doi.org/10.1080/02602938.2018.1433815
Abstract: This paper presents an analysis of the response of student satisfaction scores to online delivery for courses at a large Australian university. Taking data from 2653 courses delivered since 2011 we employ a difference-in-differences estimator approach to evaluate the impact of a transition from face-to-face learning to online learning of courses on student satisfaction. We estimate that, on a five-point scale, conversion to online learning lowers course satisfaction by about 0.2 points and instructor satisfaction by about 0.15 points. These correspond to shifts relative to the underlying distributions of about 25–30% of a standard deviation. Some implications of the (slight) relative unpopularity of online learning are discussed.
Authors' Conclusion:
"Predictions have often been presented that transition to online delivery would likely reduce course and teaching satisfaction. This paper, employing a difference-in-difference technique on a large set of data taken from 2653 courses at a large, multi-campus Australian university, has presented an analysis of student satisfaction scores of courses and their instructors for online and face-to-face modes of delivery. We have found that instructors of online courses are less popular than face-to-face instructors, and that converting a course from face-to-face to online seem to diminish student satisfaction with that course. The effect sizes are meaningful (around 25–30% of a standard deviation) but hardly staggering. It seems that online teaching is less well received by students, but hardly overwhelmingly so, conditional upon the assumption that our satisfaction scores are meaningfully comparable across delivery modes. A conservative interpretation of our findings is that, while it is still in its infancy, online instruction is probably considered to be a less satisfying learning experience for students. Educational administrators should weigh this moderate disadvantage to online education against some of its well-known advantages."


But if you're not completely convinced by this paper, here's another one:

Webb, O. J., & Cotton, D. (2018). Early withdrawal from higher education: A focus on academic experiences. Teaching in Higher Education 06 Feb 2018
http://doi.org/10.1080/13562517.2018.1437130
Abstract: Early withdrawal from higher education (HE) programmes can be detrimental for the students and institutions involved. Quantitative research has often concentrated on demographic and social antecedents (e.g. gender, prior education). Other factors may be more open to intervention e.g. students’ academic experiences in HE. Using data from an institutional survey (N = 1170), logistic regression tested a range of academic experiences, regarding their relationship to contemplation of withdrawal (‘COW’: a recognised marker for actual withdrawal). COW was associated with student perceptions of low one-to-one contact with staff; non-traditional delivery methods; low peer-interaction; and high assessment load. Interestingly, COW was not associated with overall contact hours, large classes, or personal tutoring. The contributing factors explained 5.1%–8.6% of variance in COW, suggesting they may be meaningful levers for optimising retention. The paper discusses links to existing literature, future research directions, and applied implications for institutions.
Authors' Conclusion:
"Controlling for known demographic predictors, several factors were associated with contemplation of withdrawal, which was itself a significant predictor of actual withdrawal. These were:
  • Students’ perception that one-to-one contact with teaching staff was low.
  • Students’ reports that lectures were not the main teaching format.
  • Students’ perception that opportunities to interact with fellow students were low.
  • Students’ reports that the volume of assessment was excessive."
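
To make the method concrete, here is a minimal sketch of the kind of logistic regression the abstract describes, predicting contemplation of withdrawal (COW, coded 0/1) from students' reported academic experiences. The predictor names and toy data are my own illustrative assumptions, not the authors' survey items or results:

# Minimal logistic-regression sketch with invented data (illustrative only;
# not the authors' survey items, coding or results).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [low one-to-one contact, low peer interaction, excessive assessment load]
X = np.array([
    [0, 0, 0], [1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 0, 1],
    [0, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1], [0, 0, 0],
])
y = np.array([0, 1, 0, 1, 0, 0, 1, 0, 1, 0])  # 1 = contemplated withdrawal

model = LogisticRegression().fit(X, y)
# Exponentiated coefficients are odds ratios; values above 1 suggest the
# experience is associated with higher odds of contemplating withdrawal.
print(np.round(np.exp(model.coef_[0]), 2))

In the paper the same logic is applied to survey responses from 1,170 students with demographic predictors controlled for; the toy example above only shows the structure of the model, not its findings.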




A.J. Cann





Thursday, 15 February 2018

Rubrics North of the Border

Earlier this week I was in Edinburgh to give a seminar on GradeMark and feedback. While I was there I had an interesting conversation with Paul McLaughlin (Associate Director of Teaching) about using rubrics to speed up marking and the delivery of feedback. Paul has used GradeMark rubrics extensively with large class sizes and passed on a number of useful tips.

As anyone who has ever tried to write a GradeMark rubric knows, the format is rigid and rather limiting since it demands a square matrix, e.g. 4x4 or 5x5, which isn't always what is required. Paul's first tip is that it is possible to write much more flexible assessment schemes by padding the matrix with dummy marks. For example, you may wish to grade four criteria of a written submission using five grades (each with associated feedback), but to grade an oral presentation which forms part of the same assessment, you may need a different shaped matrix, e.g. three criteria (clarity, content and responses to questions) with six or eight grade descriptors. It is possible to construct such a matrix, and therefore to generate a mark for a multipart assessment, by padding out the matrix to fill the unused grades: simply duplicate the same mark several times and leave the feedback section blank, since it won't be used. If you've never used GradeMark rubrics, this description may be as clear as mud, but if you've ever tried to write a rubric which covers the assessment scheme you want rather than the rigid one GradeMark forces on you, it's a life saver, and it also makes it much easier to adapt existing marking schemes to a rubric.
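
If it helps to picture the trick, here is a purely illustrative sketch in Python (nothing to do with GradeMark's actual data format or interface) of how a mixed assessment can be forced into one rectangular matrix by padding the shorter rows with duplicated marks:

# Illustrative sketch only: four written criteria with five grade bands and
# three oral criteria with eight grade bands, packed into one fixed-width
# matrix by repeating the lowest mark in the unused cells (feedback left blank).
N_COLUMNS = 8  # width set by the component with the most grade descriptors

def pad_row(marks, width):
    # Repeat the final (lowest) mark to fill the unused cells.
    return marks + [marks[-1]] * (width - len(marks))

written_marks = [5, 4, 3, 2, 1]        # Excellent ... Weak
oral_marks = [8, 7, 6, 5, 4, 3, 2, 1]  # eight grade descriptors

rubric = {
    "Introduction": pad_row(written_marks, N_COLUMNS),
    "Methods":      pad_row(written_marks, N_COLUMNS),
    "Results":      pad_row(written_marks, N_COLUMNS),
    "Discussion":   pad_row(written_marks, N_COLUMNS),
    "Clarity":      oral_marks,
    "Content":      oral_marks,
    "Responses":    oral_marks,
}
for criterion, row in rubric.items():
    print(criterion, row)

The duplicated cells carry repeat marks and blank feedback, so they exist only to satisfy the tool's insistence on a rectangular grid; the real marking scheme keeps its intended shape.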


Paul's second tip involves structuring QuickMarks to speed up the marking of assignments. For a well structured written submission such as a lab report, Paul writes a series of QuickMarks which follow the structure of the assessment. For example, this might be something like:
01.1 Introduction Feedback comment (Excellent…)
01.2 Introduction Feedback comment (Very good…)
01.3 Introduction Feedback comment (Good…)
01.4 Introduction Feedback comment (Satisfactory…)
01.5 Introduction Feedback comment (Weak…)
02.1 Methods Feedback comment (Excellent…)
etc.
By numbering the QuickMarks in this way it is easy to work through them in sequence and this greatly reduces the time spent typing repetitive Bubble or other inline comments. The feedback is rounded out by a more individualized comment in the GradeMark Text Box.


Lastly, if you've ever sat down to write a rubric from scratch, you may, like me, have run head-on into the lack-of-inspiration brick wall. Paul has made extensive use of the large collection of rubrics available online at: https://www.rcampus.com/indexrubric.cfm
It's unlikely that you'll find a rubric which fits your exact needs, but this resource is a great place to start, and by adapting a pre-existing scheme you can save a lot of time.


Alan Cann