Assessment and feedback tool development lessons

Wilbert Kraan from Jisc CETIS reflects on common findings which have emerged from Strand C of the Assessment and Feedback programme. These have been the more ‘techy’ projects in the programme, as they’ve taken open source tools and adapted them for use beyond the organisations in which they were originally developed. Read more…

Assessment & Feedback – from reluctance to emotional response

At the recent JISC Assessment & Feedback programme meeting, I ran a session with the Strand B projects in which we revisited some questions we first discussed a year ago. Thus, instead of ‘What do you want to happen as a result of your project’s assessment and feedback innovation?’ we talked about what has happened. And, rather than ‘How will you know the intended outcomes have been achieved?’ we discussed the indicators and evidence that projects have actually gathered over the last year. These are particularly relevant given that Strand B projects are all about Evidence and Evaluation of assessment and feedback related innovations.

The questions were really just to get us started, although the Strand B project teams are such a keen group they didn’t need much encouragement! In fact, we had a very open discussion, and what emerged were some of the issues and benefits of evaluating large-scale changes in assessment and feedback using technology, as well as some interesting findings.

All the project teams want to gather a balanced view of the changes being implemented within their institutions, but many had issues with collecting data from ‘reluctant users’. In other words, individuals who are reluctant to use a given technology can also be difficult to involve in the evaluation process. This is by no means unique to this context, or to evaluation. Indeed, some projects found that reluctant users also tended to be less likely to take up training opportunities, something that might only be picked up later, when difficulties with using the technology arose. This reinforces Ros Smith’s reflections from the programme meeting on the need to open a dialogue with course teams, so that implementing these kinds of changes is as much about working with people and cultures as with technology. Being ready to capture the views of those who are having difficulties, or offering a light-touch evaluation alternative for reluctant users, might provide a more balanced stakeholder perspective.

For some projects, the evaluation process itself provided the push for lecturers to engage with online assessment and feedback tools. In one case, a lecturer who had previously noted that ‘my students don’t want me to use this approach’ took part in a focus group, during which the lecturer heard directly from students that they did want to use online tools for assessment. Needless to say, the project team were delighted when the lecturer went on to trial the tools.

Effective training of staff was also picked up as essential, particularly as the way lecturers communicate the use of tools to students influences student uptake and use. This led on to discussions about the importance of training students, and how evaluation activity can help in understanding how well students interpret feedback. Essentially, this is about ensuring that students gain the most from the feedback process itself and do not have difficulties with the tools used to support it.

What surprised a number of projects was how the evaluations had picked up strong emotional reactions to assessment and feedback both from students and staff. There is a wider literature that looks at “Assessment as an ’emotional practice’” (Steinberg, 2008) and this is underpinned by studies into the links between learning identities, power and social relationships (such as this paper by Higgins, 2000). While the Strand B projects might not have set out to study emotional reactions, it seems there will be some interesting findings in this area.

The importance of relationships was also reflected in findings of a mismatch between students’ and lecturers’ perceptions of the intimacy afforded by online and hard-copy assessment and feedback. Staff felt closer to students, and more in dialogue with them, when marking hard copy; they wanted to sign or add a personal note to a physical piece of paper. Students, by contrast, felt more able to engage in a dialogue online, perhaps because this felt less intimidating.

During the meeting we also discussed the methods and tools projects have been using for their evaluations, but that will be the subject of another blog post.

*Amended from a post on the Inspire Research blog*

Reflections on Assessment & Feedback Programme Meeting: October 2012

“What you’re supposed to do when you don’t like a thing is change it. If you can’t change it, change the way you think about it.” So said Maya Angelou, black American poet and author, referring, I guess, to intractable issues of prejudice and racism.

But these words ring equally true in any challenging situation. And for many further and higher education institutions, assessment and feedback are high up on the list of such challenges.

The JISC Assessment and Feedback Programme meeting on 17 October in Birmingham brought together projects from all three strands of the programme. Their combined presentations in the market place session clearly underlined a need for change. Making effective use of administrators’ and academics’ time, improving students’ response to feedback and doing so at scale were some of the difficult issues the projects are addressing.

Their accounts provided an insight into the difficulties of effecting change, even when all agree it’s needed. After all, assessment touches all bases, from stakeholder perceptions to curriculum design and administrative functions, and traditional practices are deeply embedded.

But the most enduring impression from our day in Birmingham was that lasting institution-wide improvements to assessment and feedback are beginning to take shape. However, achieving a step change in assessment and feedback means first making changes to the way we think and talk about them.

Technology, as always, provides the catalyst, but there are few technology-mediated solutions that do not require a supported approach to change. Take, for example, e-submission and e-marking, aspects of assessment that are new to many academic staff.

These clearly offer benefits in terms of effective curriculum delivery as well as efficiency gains. Several project teams reported positive outcomes from their work on technology-supported assessment management: “Reports from GradeMark are helping tutors identify trends in strengths and weaknesses in student work which also has informed Curriculum Design in two modules” (EBEAM); “The move to online submission, marking and feedback has produced efficiency savings and more effective feedback” (E-AFFECT).

Nonetheless, introducing such approaches on a wide scale involves people and cultures as much as technology. Without dialogue with course and programme teams about their individual assessment practices and needs, transformation can prove elusive: “One size does not fit all – you need a thorough analysis of needs. A roll out of assessment management tools is not about the tech, it’s about people, processes and clear benefits matched to need” (OCME). A similar message came from project teams evaluating and implementing electronic voting systems and developing standards-based question banks for e-assessment.

So changing the way things are done clearly involves changing the discourse as well as the tools of assessment. This important first step can be supported by staff development resources from the Viewpoints project in the Institutional Approaches to Curriculum Design programme. The projects in the Assessment and Feedback programme are telling us that more such resources and guidance will soon be on their way!

You can follow the work of projects in the JISC Assessment and Feedback programme, which completes in 2013, on their blogs. You can also view their emerging outputs on the Design Studio.

Ros Smith (Synthesis consultant to the JISC Assessment and Feedback programme)

 

Assessment and feedback: developing a vision for technology-enhanced practices and processes

Birmingham was a hive of activity last week as the Assessment and Feedback programme came together to work towards a shared vision around this theme. It had been a year since all the projects met face-to-face, so the aim was to provide opportunities for networking and sharing project outputs and outcomes, as well as considering issues such as strategies for managing and sustaining change, evaluation approaches and technical developments.

The day started with a ‘market place’ activity where projects shared their ‘wares’, i.e. specific outputs and resources that would be of use to other institutions. The ‘buyers’ were asked to consider what would be useful to them and what action they would take as a result of engaging with these resources (see the Design Studio). The institutional projects (strand A) also produced some short videos giving an update on the development of their work.

This was followed by an activity which explored what projects felt they had to contribute to a set of key emergent ‘Transforming Assessment and Feedback’ themes which are being developed in the Design Studio. These point to current understanding and resources around topics such as peer assessment and assessment for learning and the gaps this programme hopes to fill in these areas. It helped to surface and distinguish between what projects can tentatively or confidently say about these areas at this point in time.

We rounded off the visioning exercise at the end of the day with a ‘World Café’ inspired approach, looking at the vision through four different lenses: learners’, practitioners’, employers’ and institutions’ perspectives. Here is an example of how some of the discussion mapped out:

Sustaining and embedding change was also a separate topic of discussion for the institutional projects (strand A). Prof David Nicol revisited the area of educational principles as discourse, which he has discussed previously with the project teams. The focus this time was very much on principles in practice – showing how principles-based strategies can bring about institutional change in assessment and feedback practice, as demonstrated by the Viewpoints project at the University of Ulster. The project, led by Alan Masson, developed a tool based on the REAP principles and the Hybrid Learning Model that engages curriculum teams in a conversation around effective assessment and feedback strategies. The principles (written on cards) are not directive but act simply as a tool for dialogue, decision-making and action planning. (See also the Viewpoints evaluation, which shows the benefits of this approach.)

A view of the assessment and feedback landscape

The Assessment and Feedback programme has recently published a report which synthesises the baseline reviews of 8 institutional change projects. The purpose of the baseline process was to gain a picture of the current state of play of assessment and feedback processes and practices within these organisations, and to provide a starting point against which to evaluate the effectiveness of technology interventions in bringing about change.

Collectively these reports paint a picture of a sector facing some significant issues, most of which will come as no surprise. For example:
• practice remains largely resistant to change, despite pockets of excellence
• responsibility for assessment and feedback is highly devolved within institutions, and there are considerable variations in practice
• formal documentation does not usually reflect the reality of assessment and feedback practices
• workload and time constraints mean that academic staff have little space to discover new approaches and innovate
• assessment approaches often do not reflect the reality of the workplace and the way professionals undertake formative development during their careers
• despite significant investment in the technical infrastructure to support assessment and feedback, resource efficiencies are not being delivered due to localised variations in underlying processes

These are just some of the findings of the report, which goes into more detail around key themes such as: assessment and feedback strategy and policy; education principles as a framework for change; approaches to and types of assessment and feedback; employability; learner engagement; supporting processes (e.g. quality, submission and marking); staff development; and accessibility.

In terms of addressing these issues, the report concludes: “it is to be hoped that the assessment and feedback programme, by supporting some examples of large-scale change and by developing effective channels for sharing lessons learned, will serve to build roads and bridges between these two provinces. There is a sense that the sector is approaching something of a tipping point where a combination of the various pressures for change and the growing body of evidence for the benefits of new approaches will deliver the momentum for a significant shift.”

Thanks to Dr Gill Ferrell for producing the synthesis report.

Transforming assessment and feedback resources

This week sees the launch of a set of new pages in the JISC Design Studio around Transforming Assessment and Feedback. These now form a hub for existing and emergent work in this area where, under a series of themes, you can explore what we currently know about enhancing assessment and feedback practice with technology, find links to resources and keep up to date with outputs from the Assessment and Feedback and other current JISC programmes. This is a dynamic set of resources that will be updated as the programme progresses.

Seeking Feedback as well as researching the topic

Throughout the life of the Assessment & Feedback programme, the JISC support & synthesis team will be looking at baseline summaries, interim reports, blogs, websites and a host of other outputs from the projects, and summarising them for the wider sector.

As a starting point for this synthesis activity we have developed an outline ‘framework’ that identifies some of the questions we are hoping to answer through the programme. We have decided to publish this outline in order that people with an interest in the subject can feed back to us on the draft questions. We know that the projects involved in the programme will have a particular interest in this but we hope also to gain perspectives from the wider community in order to ensure that we are tackling the key issues and addressing the questions people most want answered.

We also welcome your thoughts on what types of evidence you would find most compelling and what kind of outputs/resources (and in what formats) you would like to see from this programme.

The draft synthesis framework is available from the Design Studio here and you can use the comment facility on the page to send us your feedback (you will need to be logged in to do this). We will continue to review the approach throughout the life of the programme but we would welcome comments on this first draft by the end of January 2012.

Image CC BY-SA treehouse1977

Looking forward to events in 2012

Several JISC e-Learning programmes started in late 2011, including the Assessment & Feedback and Digital Literacies programmes. As we’ve already flagged up, building on the success of the Digital Visitors and Residents online session, we’ve got several free events coming up which may be of interest:

Outcomes of ALLE JISC Digital Literacies project
Lyn Greaves, Thames Valley University
(14:00 GMT, Friday 20th January 2012)

Making Assessment Count project
Peter Chatterton and Gunther Saunders, University of Westminster
(13:00 GMT, Friday 3rd February 2012)

e-Portfolios to support assessment and feedback
Emma Purnell and Geoff Rebbeck, University of Wolverhampton
(13:00 GMT, Friday 17th February 2012)

Digitally Enhanced Patchwork Text Assessment (DePTA) project
Caroline Macangelo, CDEPP
(13:00 GMT, Friday 24th February 2012)

Keep an eye out for further details of these events – or better yet, subscribe for free updates using RSS or email!

Assessment and Feedback – new JISC programme underway

The JISC Assessment and Feedback programme was launched in October and projects are now well underway. The programme is focused on supporting large-scale, technology-supported changes in assessment and feedback practice, with the aim of enhancing learning and teaching and delivering efficiencies and quality improvements. There are three programme strands, with projects of different lengths running over the next three years:

Strand A is focused on institutional change: 8 projects will redesign assessment and feedback practices, making best use of technology to deliver significant change at programme, school or institutional level.
(See Rowin Young’s blog about the strand).

Evidence and Evaluation is the focus of strand B. These 8 projects will evaluate assessment and feedback related innovations which are already underway in a faculty or institution, and report on lessons for the sector.
Rowin’s blog

The four projects in strand C will package a technology innovation in assessment and feedback for re-use (with associated processes and practice), and support its transfer to two or more named external institutions.
Rowin’s blog

You can keep up with the programme through this blog (don’t forget to subscribe to the feeds). All the project blogs are aggregated in Netvibes.

You can also keep up to date with the programme through Twitter #jiscassess and we are also running some open webinars – see the JISC e-Learning webinar calendar for details.