Summer school reflections

As part of the Developing Digital Literacies programme, JISC part-funded 12 scholarship places at the 2012 SEDA summer school, ‘Academic Development for the Digital University’. The people who took up these scholarships have been blogging their reflections from the event. Emma King, Learning and Development Advisor at the University of Warwick, reflects in her blog post on the importance of having confidence in your convictions when delivering staff development events.

Managing Change

Following conversations with representatives of a number of projects in the months since the Course Data programme began, it has become increasingly apparent that there is a recognised need for additional support and information around change management.

Change is at the core of so much project work, and if it is not managed appropriately it can have major implications for the success or otherwise of a project.

This week we held a workshop on Managing Change for Course Data and other projects, led by John Burke, a Senior Adviser at JISC infoNet.

A number of Course Data colleagues (some of whom also have Transformation project roles) participated in the workshop, which took the form of a mix of theory and interactive elements. The main interactive element was a computer simulation depicting the realities of implementing change in an education environment.

One of the first activities we took part in involved identifying, from our own experiences, both positive and negative approaches to and outcomes of change and change management. Dozens of post-its were filled in and stuck to the H forms on the wall, with smiley faces representing happy/good experiences and sad faces the unhappier/bad ones. Broadly, the responses could be summarised as follows:

Bad experiences of change management described by the workshop participants were the result of: poor communication and consultation; changes happening too fast; increased pressure and workload for those affected; a lack of training to support the change; the use of jargon, which can confuse and alienate and ultimately lead to feelings of disempowerment; change for its own sake (a lack of strategic vision); and anything impacting negatively on staff morale.


Good and bad experiences of change management presented here on an H form


And the good: a clear vision, so that even if you disagree with it you understand why it’s happening; good communication, with someone happy to answer questions; structured change; obvious improvements, such as bottlenecks removed and more appropriate and efficient systems resulting from the change; creative approaches; fresh challenges and new opportunities arising from the change; and time for reflection during the process.

John gave us an overview of change theories, including drivers and strategies for change, and outlined some tools and approaches that can be used to help manage change successfully. Much of the content of the workshop relates to the Change Management infoKit, which provides further details on the theories and tools that have been tried and tested within the sector.

The impact of change should not be underestimated: John spoke about the parallels between the emotions of bereavement and the emotions that change can involve for some people.

The second half of the workshop was largely based around a computer simulation activity. Participants worked in pairs on a scenario which, whilst fictional, was not too far from the reality of managing change within a real educational environment, including having to be delivered within a set timeframe.

This element of the workshop helped participants to consider the challenges involved in how stakeholders within an organisation can react to change, to consider the cultural aspects of work environments and the nurturing of relationships, and to think about routes to influencing change. As the introduction to the exercise explained, ‘changing the way people think and behave in organisations is not a simple task and often requires a combination of different tactics to be used at the right time with the right people’. Furthermore, in this exercise as in real life, it is important to: consider the context in which you are working; review and understand the different initiatives available; develop a strategy to guide you through the project, be it top-down or bottom-up; be resilient, as things won’t always happen as quickly as you’d like, and often not as planned; and stay focused on the goal, reviewing and evaluating progress at regular intervals.


Delegates worked in pairs on the Educhallenge computer simulation



The simulation offers the user the chance to use ‘interventions’ to influence the progress of the project; it is important to nurture an understanding of the individual people involved in order to get the best possible outcomes. Working on the simulation in pairs gave delegates the chance to talk through the potential implications and consequences of making certain decisions.

On a side note, I talked to John about the reason for using pairs on the simulation. His experience has shown that when people work on the simulation by themselves they are more likely to treat it as a game and take it less seriously, and when he has run the exercise in groups of three, one team member often feels that their voice is not being heard in the decision-making process.


Decisions were made in pairs and trainer John was on hand for additional advice

At the end of the session we used an instant feedback approach (to be followed by a more formal post-event feedback survey) to find out what delegates thought about the day and the types of approaches that had been introduced. Using pictures and post-its can be a very valuable way of getting an instant idea of what has been useful to participants: what they really like, could use now, could use later, and what isn’t relevant to them.


Instant feedback using pictures - trolley for ideas that participants will take back to their projects, shelf for ideas for the future, what to think about, what to discard in the litter bin, what was loved (heart) and what can be put 'in the pocket' for instant use.

The 'Shopper feedback' illustration filled with post-its at the Change Management event


Feedback on the exercise, and indeed the workshop, has so far been very good, and participants have already suggested ways in which some of the tools and techniques can be used within project activities.

A collection of change management resources (including links to some of the tools and techniques used and the Change Management infoKit) has been compiled and may prove useful to anyone dealing with change within projects and, more generally, organisations.

Thanks to everyone for participating so enthusiastically and to John for leading the day.

Assessment and feedback: Where are we now?

One year on for projects in the JISC Assessment and Feedback programme

Five projects from the JISC Assessment and Feedback programme took part in the 2012 International Blended Learning Conference at the University of Hertfordshire on 13-14 June, when representatives from Collaborate (University of Exeter), iTEAM and EEVS (University of Hertfordshire), FASTECH (Bath Spa University and the University of Winchester) and FAST (Cornwall College) presented and sought feedback on their emerging work.

Four of the projects combined in an interactive workshop entitled ‘Aligning assessment and feedback with 21st century needs’ to discuss their evolving work. Themes of sustainability and authenticity of assessment practices, plus the challenges presented by engaging stakeholders such as employers, staff and learners, featured strongly in the ensuing discussions, as has been the case for other projects in the programme. See the baseline report for the programme: A view of the Assessment and Feedback Landscape (Ferrell, 2012).

Sustainability for many was a key area of interest. Ensuring outcomes are embedded in institutional practice has been one approach taken – for example, providing differentiated information and support on the effective use of electronic voting systems (EVS) and QuestionMark™ Perception™ at the University of Hertfordshire. However, sustainability also involved responding effectively and meaningfully to the views of all those involved. A challenge raised by iTEAM was ensuring the authenticity of the learner voice. Is the involvement of student representatives and the student union sufficient?

Another area for debate was the meaning and relevance to different stakeholders of ‘authentic assessment’. The Collaborate project proposed the term ‘work-integrated assessment’ as an alternative to the ill-defined concept of ‘authentic’ in relation to assessment, highlighting the need for a common language that is meaningful and acceptable to all stakeholders.

The FAST project engaged employers in discussions about the choice of technologies, but noted that appropriate digital (and non-digital) literacy skills were key to the successful transition of learners into the workplace. Employers did not expect learners to be trained in technologies specific to their working environment, but did expect entrants to be skilful and appropriate in their use of standard software and their communication with others.

Technology-enhanced formats could generate quality assurance issues, as emerged in the discussion led by the FASTECH project. Ensuring parity in the marking of tech-supported formats and traditional essays called for inventive approaches to writing assignments and marking criteria. The involvement of the many others on whom assessment and feedback innovation has an impact – from administrative teams, learners, external examiners and moderators to quality assurance teams – was an equally vital consideration. The Collaborate project’s star model provided a useful road map for cross-institutional collaboration.

(Collaborate project model of cross-institutional collaboration)

The opening day of the conference included a further presentation by Amanda Jefferies and Marija Cubric of the EEVS project of the emerging findings of a year-long evaluation of the University of Hertfordshire’s roll-out of electronic voting systems, adding to a rich and thought-provoking contribution by JISC teams to the International Blended Learning Conference.

We are now looking forward to ALT-C 2012 (Theme: A Confrontation with Reality) in September in Manchester, where projects from the Evidence and Evaluation strand of the programme will be taking part in a symposium on ‘Effectiveness and Efficiency in Assessment and Feedback: Can technology really deliver both?’

You can follow the work of projects in the JISC Assessment and Feedback programme on their blogs and find out more about their emerging outputs on the Design Studio.

(Ros Smith, Assessment and Feedback programme synthesis team)

New JISC Developing Digital Literacies briefing paper

JISC Developing Digital Literacies briefing paper

The JISC e-Learning team are pleased to announce a new briefing paper as part of the JISC Developing Digital Literacies programme. This is available in PDF format and can be downloaded from the following link:

Many thanks to Sarah Payton (@notyap) for her efforts in pulling this together.

A view of the assessment and feedback landscape

The Assessment and Feedback programme has recently published a report which synthesises the baseline reviews of 8 institutional change projects. The purpose of the baseline process was to gain a picture of the current state-of-play of assessment and feedback processes and practices within these organisations and to provide a starting point against which to evaluate the effectiveness of technology interventions in bringing about change.

Collectively these reports paint a picture of a sector facing some significant issues, many of which will come as no surprise. For example:
• practice remains largely resistant to change despite pockets of excellence;
• responsibility for assessment and feedback is highly devolved within institutions and there are considerable variations in practice;
• formal documentation does not usually reflect the reality of assessment and feedback practices;
• workload and time constraints mean that academic staff have little space to discover new approaches and innovate;
• assessment approaches often do not reflect the reality of the workplace and the way professionals undertake formative development during their careers;
• despite significant investment in the technical infrastructure to support assessment and feedback, resource efficiencies are not being delivered due to localised variations in underlying processes.

These are just some of the findings of the report which goes into more detail around key themes such as: assessment and feedback strategy and policy; education principles as a framework for change; approaches to and types of assessment and feedback; employability; learner engagement; supporting processes (e.g. quality, submission and marking); staff development; and accessibility.
In terms of addressing these issues, “it is to be hoped that the assessment and feedback programme, by supporting some examples of large-scale change and by developing effective channels for sharing lessons learned will serve to build roads and bridges between these two provinces. There is a sense that the sector is approaching something of a tipping point where a combination of the various pressures for change and the growing body of evidence for the benefits of new approaches will deliver the momentum for a significant shift.”

Thanks to Dr Gill Ferrell for producing the synthesis report.

Crystallising Evaluation Designs – A Reality Check for Developing Digital Literacies

by Jay Dempster, JISC Evaluation Associate

DL evaluation visuals

The JISC Developing Digital Literacies Programme team has been supporting its institutional projects to design and undertake a holistic evaluation. Projects are thinking critically about the nature, types and value of evidence as early indicators of change and several projects now have some useful visual representations of their project’s sphere of influence and evaluation strategy. A summary produced this month is now available from the JISC Design Studio at:

Structuring & mapping evaluation

The point is to reach a stage in designing the evaluation where we can clearly articulate and plan an integrated methodology that informs and drives a project towards its intended outcomes. Projects that have achieved a clearly structured evaluation strategy have:

  1. defined the purpose and outputs of each component;
  2. considered their stakeholder interests & involvement at each stage;
  3. identified (often in consultation with stakeholders) some early indicators of progress towards intended outcomes as well as potential measures of impact/success;
  4. selected appropriate methods/timing for gathering and analysing data;
  5. integrated ways to capture unintended/unexpected outcomes;
  6. identified opportunities to act upon emerging findings (e.g. report/consult/revise), as well as to disseminate final outcomes.

Iterative, adaptive methodologies for evaluation are not easy, yet they are a good fit for these kinds of complex change initiatives. Approaches projects are taking in developing digital literacies across institutions include:

What is meant by ‘evidence’?

Building into the evaluation ways to capture evidence both from explicit, formal data-gathering activities with stakeholders and from informal, reflective practices on the project’s day-to-day activities can offer a continuous development & review cycle that is immensely beneficial to building an evidence base.

However, it can be unclear to projects what is meant by ‘evidence’ in the context of multi-directional interactions and diverse stakeholder interests. We have first considered who the evidence is aimed at and, second, sought to clarify its specific value to them.

This is where evaluation can feed into dissemination, and vice versa, both being based upon an acute awareness of one’s target audience (direct and indirect beneficiaries/stakeholders) and leading to an appropriate and effective “message to market match” for dissemination.

In the recent evaluation support webinar for projects, we asked participants to consider the extent to which they can rehearse their ‘evidence interpretation’ BEFORE they collect it, for instance by exploring:

Who are your different stakeholders and what are they most likely to be interested in?

What questions or concerns might they have?

What form of information/evidence is most likely to suit their needs?

An evaluation reality-check

We prefaced this with an ‘evaluation reality-check’ questionnaire, which proved a useful tool both for projects’ self-reflection and for the support team to capture a snapshot of where projects are with the overall design of their evaluations. What can we learn from these collective strategies, and how useful is the data being collected?

By sharing and discussing their evaluation strategies, projects are developing a collective sense of how they are identifying, using and refining their indicators and measures for the development of digital literacies in relation to intended aims. We are also conscious of the need to build in mechanisms for capturing unexpected outcomes.

Through representing evaluation designs visually and reflecting on useful indicators and measures of change, we are seeing greater clarity in how projects are implementing their evaluation plans. Working with the grain of those very processes they aim to support for developing digital literacies in their target groups, our intention is that:

[VIDEO] JISC Developing Digital Literacies projects’ updates


At last week’s JISC Developing Digital Literacies programme meeting, funded projects were asked to produce a short video showing their progress to date.

These were excellent, as can be seen by viewing the video playlist above! Project blogs can be found at

(note that Greenwich’s video will be included in the playlist shortly)

[RECORDING] A History of Digital Literacy in the UK and EU


Last week we were delighted to welcome Tabetha Newman and Sarah Payton to run a free, public webinar as part of the JISC Developing Digital Literacies programme.

In addition to being able to watch the recording within Blackboard Collaborate, we’ve also exported the webinar as video and audio files along with the chat transcript. You can find these below:

Speaker icon Listen to recording

Blackboard icon Watch recording (Blackboard Collaborate)

PDF icon Read chat transcript

Bonus: You may find this blog post by Tabetha about digital literacy in the EU helpful.

[RECORDING] Mozilla and web literacies

Last Friday we were privileged to have Erin Knight and Michelle Levesque from Mozilla run a free, public webinar as part of the JISC Developing Digital Literacies programme.

In addition to being able to watch the recording within Blackboard Collaborate, we’ve also exported the webinar as video and audio files along with the chat transcript. Please do let us know if you find them useful!

Speaker icon Listen to recording

Blackboard icon Watch recording (Blackboard Collaborate)

YouTube Watch recording (YouTube)

PDF icon Read chat transcript

Update: Erin Knight has added her reflections (and answers to some questions that came up) on her blog.

Transforming assessment and feedback resources

This week sees the launch of a set of new pages in the JISC Design Studio around Transforming Assessment and Feedback. These now form a hub for existing and emergent work in this area where, under a series of themes, you can explore what we currently know about enhancing assessment and feedback practice with technology, find links to resources and keep up to date with outputs from the Assessment and Feedback and other current JISC programmes. This is a dynamic set of resources that will be updated as the programme progresses.
