Over the past year, the Royal Society of Edinburgh (RSE) has been on a journey to improve evaluation strategies across all teams and aspects of the organisation. By extension, the RSE Young Academy of Scotland (YAS), an organisation of 135 leading young professionals working in a wide variety of sectors of Scottish society, founded by the RSE in 2011, is increasing its impact evaluation efforts too. As Young Academy Officer, I am responsible for coordinating with YAS members and Hannah Ormston, the RSE Impact Officer, to ensure all YAS activities, from plenary meetings to public outreach projects, have impact evaluation built into them. This is a project I have been working on over the past year.
A Slow Start
As I have learned over the past year, “ensuring all YAS activities have impact evaluation built into them” is much easier said than done. It’s very easy to get swept up in the day-to-day work of supporting an active and busy organisation and forget to set outcomes and indicators at a project’s outset. Without forward planning, I admit I have occasionally found myself in the middle of a project having devised no methods for collecting evidence, or for reporting on it once collected.
There have been times in the past year when a lack of forethought caused my evaluation efforts to yield less useful evidence than I had hoped. For instance, at a YAS plenary meeting last summer, I attempted to use an interactive Mentimeter smartphone survey to evaluate Young Academy members’ satisfaction with their membership and collect information on their future aspirations for YAS. Because I did not plan this evaluation activity far enough in advance, I found myself unable to schedule any dedicated time during the meeting for members to participate, and ended up asking members to complete it during registration. Understandably, our members were more interested in networking and simply having a morning coffee at that point, and less than half of the attendees completed the survey. While the results were somewhat useful in identifying a few new ideas, they did not provide very compelling evidence of members’ satisfaction.
Learning from Mistakes
This is not to say that the method I used in this disappointing evidence-collection effort was at fault; Mentimeter is a powerful tool for collecting data of all kinds, and when presented to the right audience with thoughtful planning, it’s an interactive and fun way to engage participants. At another membership meeting later that year, I employed a Mentimeter survey much more successfully, making far better use of the software’s potential.
On this occasion, I worked with the YAS co-chairs to design a horizon-scanning session which asked Young Academy members to share areas of the YAS strategic plan they believed YAS delivered well, and which areas needed more development.
The session was led by a YAS co-chair, who examined the (anonymised) responses as they came in and tailored the plenary discussion to the feedback received. Because the activity was built into a meeting session, every YAS member in attendance participated. The results yielded valuable insight into how members feel about our work programme. Furthermore, the survey provides a framework for a benchmarking exercise that I hope to repeat every year.
While advanced and interactive tools like Mentimeter are a fun and invigorating way to evaluate activities, sometimes simplicity is best; in the past year, some of my greatest evaluation successes have come from the simplest feedback tool of all: the paper survey. One of these instances was in November, when YAS and the RSE collaborated on a public event to mark the centenary of Armistice Day. This event, an ‘In Conversation’ style author event honouring WWI nurses, brought in about 45 members of the public.
The audience for this event was diverse in age, and many attendees had not interacted with the RSE before. Because we did not know whether all audience members had a smartphone they were comfortable using, we decided against a Mentimeter survey. Similarly, an emailed survey created with a tool such as SurveyMonkey did not seem appropriate, as many attendees were not ‘RSE regulars’ and may not have appreciated a follow-up email.
With Hannah’s help, I decided a very brief and simple paper survey was the best way to collect evidence about our desired outcome, ‘attendees have more knowledge and appreciation of the contribution of nurses to the WWI effort.’ We designed an A5 form which asked attendees to give the event a numerical score on a scale of one (poor) to six (excellent), and tell us one thing they enjoyed, and one thing that might have improved the event.
Over 75% of participants filled out this survey, and the feedback they provided was encouraging and helpful. While we were delighted to see 85% of respondents give the event an ‘excellent’ score, the most persuasive evidence of impact came in the comments. Several audience members said they ‘learned a lot.’ One participant elaborated: ‘I am studying nursing and I found it interesting [to learn] about how times have changed and the experience [of] these women.’ Comments such as these are invaluable in demonstrating that we achieved our desired outcome.
Over the last year, the YAS work programme has rapidly expanded, and staff and members are working hard to ensure that its impact evaluation efforts increase with its activities. This May, we will launch a project to create a Scottish Charter for Responsible Debate, a significant stream of work that we hope will have impact throughout Scotland. It is likely that one online activity or one paper survey will not be enough to demonstrate the reach of this programme and the changes it achieves. However, our work with Evaluation Support Scotland and with Hannah has taught us that good evaluation reporting contains many kinds of evidence to paint a detailed picture of progress made. Hopefully our efforts to set measurable outcomes and plan for evidence collection will pay off as we embark on this, and other, exciting new projects.
The RSE is pleased to be an Inspiring Impact Champion. As part of our commitment to further understand the difference we make as an organisation, the RSE Impact Champions – a group of RSE staff – will blog regularly throughout 2019-20, sharing their evaluation stories, challenges and successes.