Assessment & Feedback – from reluctance to emotional response

At the recent JISC Assessment & Feedback programme meeting, I ran a session with the Strand B projects in which we revisited some questions we first discussed a year ago. Thus, instead of ‘What do you want to happen as a result of your project’s assessment and feedback innovation?’ we talked about what has happened. And, rather than ‘How will you know the intended outcomes have been achieved?’ we discussed the indicators and evidence that projects have actually gathered over the last year. These are particularly relevant given that Strand B projects are all about Evidence and Evaluation of assessment and feedback related innovations.

The questions were really just to get us started, although the Strand B project teams are such a keen group they didn’t need much encouragement! In fact, we had a very open discussion, and what emerged were some of the issues and benefits of evaluating large-scale changes in assessment and feedback using technology, as well as some interesting findings.

All the project teams want to gather a balanced view of the changes being implemented within their institutions, but many had difficulty collecting data from ‘reluctant users’. In other words, individuals who are reluctant to use a given technology can also be difficult to involve in the evaluation process. This is by no means unique to this context, or to evaluation. Indeed, some projects found that reluctant users also tended to be less likely to take up training opportunities, something that might only be picked up later, when difficulties with using the technology arose. This really underlines Ros Smith’s reflections from the programme meeting on the need to open a dialogue with course teams, so that implementing these kinds of changes is as much about working with people and cultures as with technology. Being ready to capture the views of those who are having difficulties, or offering a light-touch evaluation alternative for reluctant users, might help provide a more balanced stakeholder perspective.

For some projects, the evaluation process itself had provided the push for lecturers to engage with online assessment and feedback tools. In one case, a lecturer who had previously noted that ‘my students don’t want me to use this approach’ took part in a focus group. During this, the lecturer heard directly from students that they did want to use online tools for assessment. Needless to say, the project team were delighted that the lecturer went on to trial the tools.

Effective staff training was also picked up as essential, particularly since the way lecturers communicate the use of tools to students influences student uptake and use. This led on to discussions about the importance of training students too, and how evaluation activity can help in understanding how well students interpret feedback – essentially, ensuring that students gain the most from the feedback process itself and are not having difficulties with the tools used to support it.

What surprised a number of projects was how the evaluations had picked up strong emotional reactions to assessment and feedback from both students and staff. There is a wider literature that looks at assessment as an ‘emotional practice’ (Steinberg, 2008), and this is underpinned by studies into the links between learning identities, power and social relationships (such as Higgins, 2000). While the Strand B projects might not have set out to study emotional reactions, it seems there will be some interesting findings in this area.

The importance of relationships was also reflected in findings of a mismatch between students and lecturers in terms of perceptions of the intimacy afforded by online and hard copy assessment and feedback. Staff felt closer to students, and more in a dialogue with them, when marking hard copy; they wanted to sign or add a personal note to a physical piece of paper. Students, by contrast, felt more able to engage in a dialogue online, perhaps because this was felt to be less intimidating.

During the meeting we also discussed the methods and tools projects have been using for their evaluations, but that will be the subject of another blog post.

*Amended from a post on the Inspire Research blog*

A view of the assessment and feedback landscape

The Assessment and Feedback programme has recently published a report which synthesises the baseline reviews of 8 institutional change projects. The purpose of the baseline process was to gain a picture of the current state of play of assessment and feedback processes and practices within these organisations, and to provide a starting point against which to evaluate the effectiveness of technology interventions in bringing about change.

Collectively these reports paint a picture of a sector facing some significant issues, many of which will come as no surprise. For example:
• practice remains largely resistant to change despite pockets of excellence
• responsibility for assessment and feedback is highly devolved within institutions and there are considerable variations in practice
• formal documentation does not usually reflect the reality of assessment and feedback practices
• workload and time constraints mean that academic staff have little space to discover new approaches and innovate
• assessment approaches often do not reflect the reality of the workplace and the way professionals undertake formative development during their careers
• despite significant investment in the technical infrastructure to support assessment and feedback, resource efficiencies are not being delivered due to localised variations in underlying processes

These are just some of the findings of the report which goes into more detail around key themes such as: assessment and feedback strategy and policy; education principles as a framework for change; approaches to and types of assessment and feedback; employability; learner engagement; supporting processes (e.g. quality, submission and marking); staff development; and accessibility.

In terms of addressing these issues, “it is to be hoped that the assessment and feedback programme, by supporting some examples of large-scale change and by developing effective channels for sharing lessons learned, will serve to build roads and bridges between these two provinces. There is a sense that the sector is approaching something of a tipping point where a combination of the various pressures for change and the growing body of evidence for the benefits of new approaches will deliver the momentum for a significant shift.”

Thanks to Dr Gill Ferrell for producing the synthesis report.

Assessment and Feedback – new JISC programme underway

The JISC Assessment and Feedback programme was launched in October and projects are now well underway. The programme is focused on supporting large-scale, technology-enabled changes in assessment and feedback practice, with the aim of enhancing the learning and teaching process and delivering efficiencies and quality improvements. There are three programme strands, with projects of different lengths running over the next three years:

Strand A is focused on institutional change: 8 projects will redesign assessment and feedback practices, making best use of technology to deliver significant change at programme, school or institutional level.
(See Rowin Young’s blog about the strand).

Evidence and Evaluation is the focus of Strand B. These 8 projects will evaluate assessment and feedback related innovations which are already underway in a faculty or institution, and report on lessons for the sector.
(See Rowin’s blog about the strand).

The four projects in Strand C will package a technology innovation in assessment and feedback for re-use (with associated processes and practice), and support its transfer to two or more named external institutions.
(See Rowin’s blog about the strand).

You can keep up with the programme through this blog (don’t forget to subscribe to the feeds). All the project blogs are aggregated in Netvibes.

You can also keep up to date with the programme on Twitter via the #jiscassess hashtag, and we are running some open webinars – see the JISC e-Learning webinar calendar for details.