IMPORTANT: Move to https://elearning.jiscinvolve.org

Please note that this blog has been merged with https://elearning.jiscinvolve.org and will no longer be updated. All content within this site has been transferred, so please update your bookmarks. This is part of rationalising our communications across the Jisc e-Learning programme.

Managing Change

Following conversations with representatives of a number of projects in the months since the Course Data programme began, it has become increasingly apparent that there is a recognised need for additional support and information around change management.

Change is at the core of so much project work, and if it is not managed appropriately it can have major implications for the success or failure of a project.

This week we held a Managing Change workshop for Course Data and other projects, led by John Burke, a Senior Adviser at JISC infoNet.

A number of Course Data colleagues (some of whom also have Transformation project roles) participated in the workshop, which took the form of a mix of theory and interactive elements. The main interactive element was a computer simulation depicting the realities of implementing change in an education environment.

One of the first activities we took part in involved identifying, from our own experiences, both positive and negative approaches to, and outcomes of, change and change management. Dozens of post-its were filled in and stuck to the H forms on the wall, with smiley faces representing happy/good experiences and sad faces for unhappy/bad experiences. Broadly, the responses could be summarised as follows:

The bad experiences of change management reported by workshop participants were the result of: poor communication and consultation; changes happening too fast; increased pressure and workload for those affected; a lack of training to support the change; jargon that confuses and alienates and can ultimately lead to feelings of disempowerment; change for its own sake (a lack of strategic vision); and anything that impacts negatively on staff morale.

 

Good and bad experiences of change management presented here on an H form

 

And the good… a clear vision, so that even if you disagree with the change you understand why it's happening; good communication, with someone happy to answer questions; structured change; obvious improvements, such as bottlenecks removed and more appropriate and efficient systems resulting from the change; creative approaches; fresh challenges and new opportunities resulting from the change; and time for reflection during the process.

John gave us an overview of change theories, including drivers and strategies for change, and outlined some tools and approaches that can help manage change successfully. Much of the content of the workshop relates to the Change Management infoKit, which provides further detail on the theories and tools that have been tried and tested within the sector.

The impact of change should not be underestimated: John spoke about the parallels between the emotions of bereavement and the emotions that change can involve for some people.

The second half of the workshop was largely based around a computer simulation activity. Participants worked in pairs on a scenario which, whilst fictional, was not far removed from the reality of managing change within a real educational environment, including the need to deliver within a set timeframe.

This element of the workshop helped participants to consider how stakeholders within an organisation can react to change, to reflect on the cultural aspects of work environments and the nurturing of relationships, and to think about routes to influencing change. As the introduction to the exercise explained, ‘changing the way people think and behave in organisations is not a simple task and often requires a combination of different tactics to be used at the right time with the right people’. Furthermore, in this exercise as in real life, it is important to consider the context in which you are working; review and understand the different initiatives available; develop a strategy to guide you through the project, be it top-down or bottom-up; be resilient, since things won't always happen as quickly as you'd like and often not as planned; and stay focused on the goal, reviewing and evaluating progress at regular intervals.

 

Delegates worked in pairs on the Educhallenge computer simulation


 

The simulation offers the user the chance to use ‘interventions’ to influence the progress of the project; it is important to nurture an understanding of the individuals involved in the project in order to get the best possible outcomes. Working on the simulation in pairs gave delegates the chance to talk through the potential implications and consequences of making certain decisions.

On a side note, I asked John why pairs were used for the simulation. He explained that, in his experience, people who work on the simulation by themselves are more likely to treat it as a game and take it less seriously, while in groups of three one team member often feels that their voice is not heard in the decision-making process.

 

Decisions were made in pairs and trainer John was on hand for additional advice

At the end of the session we used an instant feedback approach (to be followed by a more formal post-event feedback survey) to find out what delegates thought about the day and the types of approaches that had been introduced. Using pictures and post-its can be a very valuable way of getting an instant sense of what has been useful to participants: what they really like, what they could use now, what they could use later and what isn't relevant to them.

 

Instant feedback using pictures - trolley for ideas that participants will take back to their projects, shelf for ideas for the future, what to think about, what to discard in the litter bin, what was loved (heart) and what can be put 'in the pocket' for instant use.

The 'Shopper feedback' illustration filled with post-its at the Change Management event

 

Feedback on the exercise, and indeed the workshop, has so far been very good, and participants have already suggested ways in which some of the tools and techniques can be used within project activities.

A collection of Change Management related resources (including links to some of the tools and techniques used and the Change Management infoKit) has been compiled and may prove useful to anyone dealing with change within projects and organisations more generally.

Thanks to everyone for participating so enthusiastically and to John for leading the day.

Assessment and feedback: Where are we now?

One year on for projects in the JISC Assessment and Feedback programme

Five projects from the JISC Assessment and Feedback programme took part in the 2012 International Blended Learning Conference at the University of Hertfordshire on 13-14 June, when representatives from Collaborate (University of Exeter), iTEAM and EEVS (University of Hertfordshire), FASTECH (Bath Spa University and the University of Winchester) and FAST (Cornwall College) presented and sought feedback on their emerging work.

Four of the projects combined in an interactive workshop entitled ‘Aligning assessment and feedback with 21st century needs’ to discuss their evolving work. Themes of sustainability and authenticity of assessment practices, plus the challenges presented by engaging stakeholders such as employers, staff and learners, featured strongly in the ensuing discussions, as has been the case for other projects in the programme. (See the baseline report for the programme: A view of the Assessment and Feedback Landscape, Ferrell, 2012.)

Sustainability for many was a key area of interest. Ensuring outcomes are embedded in institutional practice has been one approach taken – for example, providing differentiated information and support on the effective use of electronic voting systems (EVS) and QuestionMark™ Perception™ at the University of Hertfordshire. However, sustainability also means responding effectively and meaningfully to the views of everyone involved. A challenge raised by iTEAM was ensuring the authenticity of the learner voice: is the involvement of student representatives and the student union sufficient?

Another area for debate was the meaning and relevance to different stakeholders of ‘authentic assessment’. The Collaborate project proposed the term ‘work-integrated assessment’ as an alternative to the ill-defined concept of ‘authentic’ in relation to assessment, highlighting the need for a common language that is meaningful and acceptable to all stakeholders.

The FAST project engaged employers in discussions about the choice of technologies, but noted that appropriate digital (and non-digital) literacy skills were key to learners' successful transition into the workplace. Employers did not expect learners to be trained in technologies specific to their working environment, but did expect entrants to use standard software skilfully and appropriately and to communicate effectively with others.

Technology-enhanced formats could generate quality assurance issues, as emerged in the discussion led by the FASTECH project. Ensuring parity in the marking of tech-supported formats and traditional essays called for inventive approaches to writing assignments and marking criteria. The involvement of the many others on whom assessment and feedback innovation has an impact – from administrative teams, learners, external examiners and moderators to quality assurance teams – was an equally vital consideration. The Collaborate project’s star model provided a useful road map for cross-institutional collaboration.

(Collaborate project model of cross-institutional collaboration)

The opening day of the conference also included a presentation by Amanda Jefferies and Marija Cubric of the EEVS project on the emerging findings of a year-long evaluation of the University of Hertfordshire’s roll-out of electronic voting systems, adding to a rich and thought-provoking contribution by JISC teams to the International Blended Learning Conference.

We are now looking forward to ALT-C 2012 (theme: A Confrontation with Reality) in September in Manchester, where projects from the Evidence and Evaluation strand of the programme will take part in a symposium on ‘Effectiveness and Efficiency in Assessment and Feedback: Can technology really deliver both?’

You can follow the work of projects in the JISC Assessment and Feedback programme on their blogs and find out more about their emerging outputs on the Design Studio.

(Ros Smith, Assessment and Feedback programme synthesis team)

Tracks in the snow: finding and making sense of the evidence for institutional transformation

As we trudge into the New Year many projects are getting to grips with developing a baseline and deciding how they will measure the transformational impact of their work. It therefore seems seasonal to revisit a session at last year’s ALT-C conference where we explored these issues.

The ‘Tracks in the Snow’ post on the Curriculum Design and Delivery programme blog summarises some theoretical models that are of relevance and takes a look at how some large scale projects have approached measuring transformation. Hopefully it will provide some food for thought as you start out on this journey.

Finally, if you’re still in holiday mode, remember that Panto season isn’t over yet! You can still watch the video stream of JISC and the Beanstalk from UCISA CISG. The session covers the themes of Cloud Computing, service-oriented architecture (SOA), Product Modularisation, Shared Services and enterprise architecture (EA). The second half of the session (from about 20 mins) looks at some of the experiences of process improvement from the JISC Flexible Service Delivery programme. You can also go direct to the Improving Organisational Efficiency suite of infoKits to find more tools, tips and case studies on process improvement.

Image from Flickr by Andrei!. Licensed under CC