IMPORTANT Move to https://elearning.jiscinvolve.org

Please note that this blog has been merged with https://elearning.jiscinvolve.org and will no longer be updated. All content from this site has been transferred to the new location, so please update your bookmarks. This is part of rationalising our communications across the Jisc e-Learning programme.

Developing digital literacies webinars

We are holding a series of webinars to disseminate and discuss the work of the Developing digital literacies projects. Each webinar will last up to an hour and will discuss an aspect of the programme’s work. The webinars are free and open to all, but please sign up to let us know you’re coming, and to get the right link to join. (If you’re not available on these dates, note that we will be recording the webinars, and making them available from here.)

The following webinars are confirmed:

Current issues and approaches in developing digital literacy
Tuesday 12 February, 13.00-14.00
This webinar will discuss what digital literacies are and why it is important for universities and colleges to develop the digital literacies of their students and staff. We will look at some of the issues to consider when planning an institutional approach to developing digital literacies, and projects from Jisc’s Developing Digital Literacies programme will highlight some of the approaches that they have found effective in their own contexts.
Sign up for this webinar

Rising to the digital literacy challenge in further education
Thursday 28 February, 11.00-12.00
This webinar, organised in conjunction with the Jisc regional support centres, offers three different perspectives on how to address staff development needs so that digital literacies become embedded across the organisation. The two Developing digital literacies projects based in FE colleges, WORDLE and PADDLE, will be speaking at the event.
Sign up for this webinar (note: sign-up is via the RSC page for this event).

Implementing the UKPSF in the digital university
Wednesday 17 April, 13.00-14.00
This webinar offers a guide to implementing the UK Professional Standards Framework (UKPSF) in the digital university. We look at how postgraduate certificate in teaching and learning in higher education (PGCertHE) courses and CPD processes are adapting to digital technologies, both in their design and operation and in the educational practices for which they prepare staff. We introduce a new wiki of case studies of technology-informed practice, indexed against the UKPSF areas of activity, core knowledge and values.
Sign up for this webinar

Other webinars will be added to this page as the dates are confirmed, including:

Assessment and feedback tool development lessons

Wilbert Kraan from Jisc CETIS reflects on common findings which have emerged from Strand C of the Assessment and Feedback programme. These have been the more ‘techy’ projects in the programme as they’ve taken open source tools and adapted them for use beyond the organisations they were originally developed in. Read more….

Assessment & Feedback – from reluctance to emotional response

At the recent JISC Assessment & Feedback programme meeting, I ran a session with the Strand B projects in which we revisited some questions we first discussed a year ago. Thus, instead of ‘What do you want to happen as a result of your project’s assessment and feedback innovation?’ we talked about what has happened. And, rather than ‘How will you know the intended outcomes have been achieved?’ we discussed the indicators and evidence that projects have actually gathered over the last year. These are particularly relevant given that Strand B projects are all about Evidence and Evaluation of assessment and feedback related innovations.

The questions were really just to get us started, although the Strand B project teams are such a keen group they didn’t need much encouragement! In fact, we had a very open discussion, and what emerged were some of the issues and benefits of evaluating large-scale changes in assessment and feedback using technology, as well as some interesting findings.

All the project teams want to gather a balanced view of the changes being implemented within their institutions, but many had issues with collecting data from 'reluctant users': individuals who are reluctant to use a given technology can also be difficult to involve in the evaluation process. This is by no means unique to this context, or to evaluation. Indeed, some projects found that reluctant users also tended to be less likely to take up training opportunities, something that might only be picked up later, when difficulties with using the technology arose. This reinforces Ros Smith's reflections from the programme meeting on the need to open a dialogue with course teams, so that implementing these kinds of changes is as much about working with people and cultures as with technology. Being ready to capture the views of those who are having difficulties, or offering a light-touch evaluation alternative for reluctant users, might provide a more balanced stakeholder perspective.

For some projects, the evaluation process itself had provided the push for lecturers to engage with online assessment and feedback tools. In one case, a lecturer who had previously noted that 'my students don't want me to use this approach' took part in a focus group. During this, the lecturer heard directly from students that they did want to use online tools for assessment. Needless to say, the project team were delighted that the lecturer went on to trial the tools.

Effective staff training was also picked up as essential, particularly since the way lecturers communicate the use of tools to students influences student uptake and use. This led on to discussions about the importance of training students, and how evaluation activity can help in understanding how well students interpret feedback: essentially, ensuring that students gain the most from the feedback process itself and are not hindered by the tools used to support it.

What surprised a number of projects was how the evaluations had picked up strong emotional reactions to assessment and feedback both from students and staff. There is a wider literature that looks at “Assessment as an ’emotional practice’” (Steinberg, 2008) and this is underpinned by studies into the links between learning identities, power and social relationships (such as this paper by Higgins, 2000). While the Strand B projects might not have set out to study emotional reactions, it seems there will be some interesting findings in this area.

The importance of relationships was also reflected in findings of a mismatch between students' and lecturers' perceptions of the intimacy afforded by online and hard-copy assessment and feedback. Staff felt closer to students, and more in a dialogue with them, when marking hard copy; they wanted to sign or add a personal note to a physical piece of paper. Students, on the other hand, felt more able to engage in a dialogue online, perhaps because this felt less intimidating.

During the meeting we also discussed the methods and tools projects have been using for their evaluations, but that will be the subject of another blog post.

*Amended from a post on the Inspire Research blog*

Reflections on Assessment & Feedback Programme Meeting: October 2012

“What you’re supposed to do when you don’t like a thing is change it. If you can’t change it, change the way you think about it.” So said Maya Angelou, black American poet and author, referring, I guess, to intractable issues of prejudice and racism.

But these words ring equally true in any challenging situation. And for many further and higher education institutions, assessment and feedback are high up on the list of such challenges.

The JISC Assessment and Feedback Programme meeting on 17 October in Birmingham brought together projects from all three strands of the programme. Their combined presentations in the market place session clearly underlined a need for change. Making effective use of administrators’ and academics’ time, improving students’ response to feedback and doing so at scale were some of the difficult issues the projects are addressing.

Their accounts provided an insight into the difficulties of effecting change, even when all agree it’s needed. After all, assessment touches all bases, from stakeholder perceptions to curriculum design and administrative functions, and traditional practices are deeply embedded.

But the most enduring impression from our day in Birmingham was that lasting institution-wide improvements to assessment and feedback are beginning to take shape. However, achieving a step change in assessment and feedback means first making changes to the way we think and talk about them.

Technology, as always, provides the catalyst, but there are few technology-mediated solutions that do not require a supported approach to change. Take, for example, e-submission and e-marking, aspects of assessment that are new to many academic staff.

These clearly offer benefits in terms of effective curriculum delivery as well as efficiency gains. Several project teams reported positive outcomes from their work on technology-supported assessment management. EBEAM reported: "Reports from GradeMark are helping tutors identify trends in strengths and weaknesses in student work which also has informed Curriculum Design in two modules." E-AFFECT noted: "The move to online submission, marking and feedback has produced efficiency savings and more effective feedback."

Nonetheless, introducing such approaches on a wide scale involves people and cultures as much as technology. Without dialogue with course and programme teams about their individual assessment practices and needs, transformation can prove elusive. As the OCME project put it: "One size does not fit all – you need a thorough analysis of needs. A roll out of assessment management tools is not about the tech, it's about people, processes and clear benefits matched to need." A similar message came from project teams evaluating and implementing electronic voting systems and developing standards-based question banks for e-assessment.

So changing the way things are done clearly involves changing the discourse as well as the tools of assessment. This important first step can be supported by staff development resources from the Viewpoints project in the Institutional Approaches to Curriculum Design programme. The projects in the Assessment and Feedback programme are telling us that more such resources and guidance will soon be on their way!

You can follow the work of projects in the JISC Assessment and Feedback programme, which completes in 2013, on their blogs. You can also view their emerging outputs on the Design Studio.

Ros Smith (Synthesis consultant to the JISC Assessment and Feedback programme)


Assessment and feedback: developing a vision for technology-enhanced practices and processes

Birmingham was a hive of activity last week as the Assessment and Feedback programme came together to work towards a shared vision around this theme. It had been a year since all the projects met face-to-face, so the aim was to provide opportunities for networking and sharing project outputs and outcomes, as well as considering issues such as strategies for managing and sustaining change, evaluation approaches and technical developments.

The day started with a 'market place' activity where projects shared their 'wares', i.e. specific outputs and resources that would be of use to other institutions. The 'buyers' were asked to consider what would be useful to them and what action they would take as a result of engaging with these resources (see the Design Studio). The institutional projects (strand A) also produced some short videos updating us on the development of their work.

This was followed by an activity which explored what projects felt they had to contribute to a set of key emergent 'Transforming Assessment and Feedback' themes being developed in the Design Studio. These point to current understanding and resources around topics such as peer assessment and assessment for learning, and to the gaps this programme hopes to fill in these areas. The activity helped to surface what projects can say about these areas at this point in time, and to distinguish between tentative and confident claims.

We rounded off the visioning exercise at the end of the day with a 'World Café' inspired approach, looking at this vision through four different lenses: learners', practitioners', employers' and institutions' perspectives. Here is an example of how some of the discussion mapped out:

Sustaining and embedding change was also a separate topic of discussion for the institutional projects (strand A). Prof David Nicol revisited the area of educational principles as discourse, which he has discussed previously with the project teams. The focus this time was very much on principles in practice – showing how principles-based strategies can bring about institutional change in assessment and feedback practice, as demonstrated by the Viewpoints project at the University of Ulster. The project, led by Alan Masson, developed a tool based on the REAP principles and the Hybrid Learning Model which engages curriculum teams in a conversation around effective assessment and feedback strategies. The principles (written on cards) are not directive but act simply as a tool for dialogue, decision-making and action planning. (See also the Viewpoints evaluation, which shows the benefits of this approach.)

Developing Digital Literacies programme meeting

We had a busy day on Tuesday at the Developing Digital Literacies programme meeting, looking at the wealth of resources projects and associations are producing, and trying to plan ahead for how these will work together as a programme output.

It didn't get off to the best of starts: to lose one morning presenter may be regarded as a misfortune; to lose both looks like carelessness. Helen, our synthesis consultant, was lost in transit, and Jay, our evaluation consultant, was stuck in traffic. Thankfully Jay arrived before Myles and I had finished our introductions, and was able to step in and put a useful marketing slant on some of the work the programme support team have been doing around project outputs and messages: how to get these messages heard and how they help institutions address challenges. Slides

The main activity of the morning was an hour-long 'trade fair', at which the projects and professional associations involved in the programme displayed two of their outputs and shopped for others' outputs which were useful to them. Everyone had plenty of interesting outputs to show, and a real interest in others' work, and the activity generated a good buzz, as well as some useful collaborations. This was my first experience of working with lots of professional associations within an innovation programme, and I found their outputs, approaches and insights from their members really useful. I was only sorry I didn't have a chance to get round to talk to all of the projects and associations.

I was interested to see what the panel discussion on digital literacy frameworks would offer: I'm normally very suspicious of any project that says it's building a 'framework', as the term can cover a multitude of sins. However, I found the discussion of these digital literacy-related frameworks for professional development really useful. It was interesting to see that not many of the projects were using such frameworks, but those that had were finding them useful as a starting point for discussions. The panellists all seemed to take a pragmatic view of frameworks, and there was general agreement with David Baume (SEDA), who stressed that the usefulness of frameworks lies in their use as climbing frames – take the bits that interest you and use them to get you where you want, rather than following them slavishly. I've certainly found such frameworks useful in getting my head round the digital literacy work, though as a couple of delegates warned, we need to be careful that they don't perpetuate an over-homogenised view of the area – or the mistaken assumption of a common language or common practice where in fact these don't exist.

After lunch, delegates worked on the 'promise' and pack of resources the programme is making in four (or five) key areas: employability; self-assessment and self-development materials; the digitally literate organisation (and digitally literate senior management, which may or may not be the same thing); and tools for teaching and curriculum teams. The detailed outcomes are still on flip-chart paper and post-its, and will feature in a future post, but generally project outputs seemed to meet the promise in these areas fairly well, except for employability, where more work is needed to think through what the messages are and what sort of outputs are most relevant. We also need to engage relevant professional organisations. Slides from the afternoon session.

Helen has updated the Design Studio pages to reflect the outputs coming out of the projects; see in particular staff development materials; materials designed for students; and organisational development materials.

Thanks very much to Dr Bex Lewis for creating the story of the day using Storify.

Another remembrance of summer

In another of our series of blog posts reflecting on the SEDA summer school, Denise Sweeney finds that the experience has given her insights and tactics to take back to her day job, and introduced her to a network of like-minded people. Denise’s blog post

SEDA summer school reflections

I’ve been pulling together more blog posts from the people JISC sponsored to attend the SEDA summer school.

For Barbara Newland, the summer school was a useful prompt to consider how the blended learning policies she draws up impact on individual academics and learning technologists. She also found time to complete a draft project plan inspired by the summer school, which she will discuss with colleagues back at the ranch. Barbara’s blog post

Jane Secker found the summer school particularly useful, and produced a great series of blog posts about the event. Highlights for her were: a sense of professional identity as a learning technologist and a greater understanding of her role as a change agent; greater appreciation of all the things that need to be in place to translate a strategy to meaningful action on the ground; and lots of personal learning points. Jane’s blog posts

Jane O’Neill found much of the content and format useful, and came away with some useful questions to keep teams focussed on the final outcomes of projects: What would it look like if we were successful? What would people be doing differently? What would the impact be on the students? Jane’s blog post

Daniel Clark chose to focus on staff digital literacies, and used his experiences at the SEDA summer school to inform the development of the University of Kent’s e-Learning Summer School, to give staff a chance to share effective practice with technology and learn in context, in a supportive and hands-on environment. Daniel’s blog post

Overall, recurrent themes among the summer school bloggers were the usefulness of techniques such as action learning sets, reflection on their own practice, plans to incorporate changes into the staff and educational development they run, and a greater appreciation of change management and the impact of change on individuals. Ideas about whether 'digital' is really different (our own Lawrie Phipps) and on the benefits and challenges of open practice (Lindsay Jordan, DIAL project) also seem to have resonated.

Three wheels on my wagon, and other tales from the SEDA summer school

Our next summer school blog post is from Jakki Sheridan-Ross, who will dispel any illusions you may have had about the SEDA summer school being a bit of a jolly. I advise a cup of tea and a deep breath before launching into her post, which reflects on the whole (exhausting) summer school experience, and how powerful certain group activities and peer discussion can be. Her work is focussed on how hard it is to stop people in universities reinventing the wheel, especially when they’re not sure how many wheels they need…

Jakki’s blog post on the SEDA Digital Literacy Summer School 2012.
