Show me the Impact! A Practical Online Tool (EDIN Impact Analysis Tool) for Evidencing the Impact of Academic Development Work

Author Details: Clare Gormley, Teaching Enhancement Unit, Dublin City University; Margaret Keane, Centre for Teaching and Learning, Maynooth University; Íde O’Sullivan, Centre for Transformative Learning, University of Limerick; Claire McAvinia, Learning, Teaching & Technology Centre, TU Dublin; Mary Fitzpatrick, Centre for Transformative Learning, University of Limerick

Corresponding Author: Clare Gormley

Abstract

Academic developers in Ireland and internationally are increasingly being called upon to demonstrate impact. In 2019, the Committee of the Educational Developers in Ireland Network (EDIN) therefore agreed, as a collective community of practice, to consider appropriate evaluation tools and metrics to demonstrate the impact of academic development. To this end, EDIN members were invited to contribute to a two-part workshop in order to (i) discuss and define impact in the context of our work as academic developers, and (ii) identify how to respond to demands to demonstrate the impact of such activities in the Irish Higher Education (HE) context. Once the community had collectively defined what constitutes impact in Irish HE, the outcomes of the first workshop informed the second, in which participants focused on how to evidence impact effectively. By the end of this second workshop, the development process for a credible approach to evidencing impact was mapped out.

This chapter outlines the learning from both workshops and focuses specifically on the key outcome which was the design of the EDIN Impact Analysis Tool. This tool was developed to help anyone with an interest in educational/academic development consider how and where their work has impact. It draws on the work of Bamber (2013) to help ‘evidence the value’ of learning and teaching development activities/projects/interventions by offering examples of evidence in practice.

This online tool is presented in a step-by-step, interactive format to enable users to reflect on, evaluate, or think generally about the impact of an activity. It includes a series of curated resources to allow deeper exploration of topics, and it allows the user to export their work to Microsoft Word for future reference. The chapter outlines the development of the resource, from concept and review of literature through to technical development, implementation, and initial feedback.

Keywords Impact, Analysis, Educational Development, Academic Development, Online Tool

Introduction

Academic developers (sometimes called educational developers or faculty developers) have a remit to work with staff to lead and support the improvement of student learning (Popovic and Baume, 2016). The overall purpose of the role is to develop the capabilities of academic staff and improve educational methods and processes. Academic developers are typically based in centralised Teaching and Learning units, working primarily with staff (rather than students) who wish to enhance or develop some aspect of their practice through professional learning of some form.

Academic developers (ADs) are increasingly being called upon to demonstrate the impact of their work on their institutions, on academic practice, and most importantly, on students’ learning (Sutherland and Hall, 2018). This chapter explores the emergence of this focus on evidencing the impact of the work of academic developers. It describes and reflects on a year-long project (2019-2020) to engage with impact by the Educational Developers in Ireland Network (EDIN). The chapter begins with a brief survey of recent work in relation to impact. Next, the specific work of EDIN over the period of one academic year is described. This leads into a presentation and discussion of the development of an online tool (EDIN Impact Analysis Tool) to support academic developers with the planning of evaluative research to provide them with evidence of the impact of their work. Finally, we present an evaluation of this work to date, and our proposed future directions.

This chapter explains how EDIN members focused on impact, how they sought to define it, and how they responded to demands to demonstrate the impact of their practice. Before presenting this work, it is important to examine the extent to which impact has been defined and explored in academic development practice, as reported in the research literature.

A review of literature over the past decade reveals an emergent set of concerns around the impact of educational development. The increasing pressure to demonstrate impact emerges clearly (Gray and Radloff, 2008; Sutherland and Hall, 2018; Bamber, 2013), as do the challenges this presents. These challenges are manifold (Winter et al., 2017). The specific meaning of impact in this context has been shown to be highly problematic (Jones et al., 2017; Winter et al., 2017). Gray and Radloff (2008, p. 99) cite numerous synonyms for impact arising from their literature review, including ‘change’, ‘results’, ‘success’ and ‘contribution’. The place of informal networking and informal development of academic practice is another issue, as are the numerous influences or confounding variables in any measure of impact on practice (Hoessler et al., 2015). Methodological difficulties have been discussed as academic developers have used wide-ranging approaches to evaluate the effectiveness of their work (Hoessler et al., 2015; Spowart et al., 2019; Sutherland and Hall, 2018; Winter et al., 2017).

Much previous research has concentrated on the impact of accredited programmes for academic professional development (Chalmers and Gardiner, 2015; Spowart et al., 2019). While this is very valuable, it fragments the evidence base for demonstrating the impact of the work of academic developers overall (Spowart et al., 2019). Examples exist where researchers have attempted to measure impact beyond accredited programmes, including in informal settings (Houston and Hood, 2017; Hoessler et al., 2015). However, a proliferation of methodologies and frameworks has complicated this process (Winter et al., 2017), and some established writers express frustration at the series of case studies generated without a greater sense of overall effect (McNaught, 2020). Furthermore, institutional cultures, and particularly the emphasis on research in the institution, have been identified as important influences on, and mediators of, impact (Houston and Hood, 2017).

Academic development has also been explored as a politically ambiguous province of the university, seen by some as the means to effect change and empower those teaching and their students but also critiqued as managerialist and ineffectual (Roxå and Mårtensson, 2016). Examining the impact of our work with effective and robust methodologies, and for clearly defined purposes, is called for in almost all of the work reviewed above, and again most recently by the National Forum for the Enhancement of Teaching and Learning (National Forum, 2019:1) in Ireland:

Understanding the nature of impact in teaching and learning, and how it occurs, is a key first step in ensuring that resources and efforts invested by those in the higher education community result in positive changes to learning, practice, culture, structures and/or policy.

Many related studies already published in the literature refer to the impact on teaching and learning of specific initiatives such as accredited teaching and learning programmes, interventions to support early career academics, supervisors or others. In many cases these originate in funded project work with reports and research papers forming deliverables. This provides evidence to funders within or outside the institution on the value of the work being undertaken. In the case of accredited programmes, it is sometimes possible to demonstrate positive effects for individuals and their students some years later (Hanratty, 2018).

A set of issues emerges from this brief review of literature. These issues include: the problematic nature of the term impact; the eclectic mix of methodologies and cases presented; whether impact relates to accredited programmes or academic development work in general; the contextual and cultural factors influencing academic development within institutions and how evaluative studies are communicated and received; and the nature of an evidence base constructed by diverse groups of developers in their specific settings, with different interpretations of what to measure and how. On the one hand, we see the strength and diversity of academic development practice reflected in literature. On the other, we see a mix of outcomes and findings that could be considered lacking in ‘hard’ evidence of impact by those making senior management decisions. ‘Hard’ evidence in this case could be interpreted as evidence that is quantifiable and/or metrics-based. Addressing this potential vulnerability was a key concern of the EDIN network.

Evidencing impact: the view of the EDIN community

Having identified impact as a key thematic area for exploration, which would drive EDIN’s activities for 2019-2020, the Committee set about identifying a dynamic and interactive approach to investigating it, so that the membership would lead the conversation on how best to respond to calls to demonstrate the impact of their academic development work on their institutions, on academic practice and, most importantly, on students’ learning. To this end, it was agreed that two facilitated, interactive workshops would be the most appropriate means of engaging academic/educational developers in the burgeoning national conversation. The initial aims of this two-part workshop were: (i) to discuss and define impact in the context of our work as academic developers, and (ii) to identify how we can respond to the demand to demonstrate the impact of our activities in the Irish HE context.

The first workshop, entitled Academic Development – Creating and Demonstrating Impact, took place in January 2019 and constituted a facilitated session where EDIN members discussed the issue, sharing ideas of what constitutes impact and how they demonstrate it. Dr Marion Palmer, an experienced facilitator, led this discussion. In addition, appropriate evaluation and metrics to measure the impact of academic development were considered, as were appropriate channels for disseminating work in order to raise awareness of impact. An invitation to participate in the workshop was emailed to all EDIN members, a list of more than 100 people from publicly funded, independent and private colleges across Ireland. The workshop was attended by 17 EDIN members from 8 institutions across the sector. All took part in the lively discussions and contributed to the outcomes of the workshop. While there was good representation from across the sector, it should be stated that the outcomes of the workshop, and the resources which ensued, are not formally designed to be a representative sample of the views of those working in academic development in Ireland.

To scaffold the discussion, an introductory presentation explored the meaning of impact and contextualised it within higher education (HE) in Ireland, most notably within the National Strategy for Higher Education to 2030 (DES, 2011) and the HEA’s Draft Mission-based Performance Compact (HEA, 2018). This served as the starting point for a series of participant exercises. The first exercise, a prompt to reflect on the past 30 years by ‘imagining higher education in 1989’, generated an interesting discussion of the current context for higher education. The second task was to discuss the meaning of impact, while the third considered the impact of academic development by engaging participants in a discussion of why they had chosen to work in the field. The discussion demonstrated that the impact of academic development is primarily on teaching and on changing the behaviour of those who teach. While the impact of academic developers’ work is recognised as being primarily on teaching and teaching staff, a Wordle collating contributors’ post-it notes from these two activities also shows that academic developers recognise that students are ultimately at the heart of that work (Figure 1). There was a clear sense that academic development was about ‘making things better and increasing professionalism’ (Palmer, 2019:4); nonetheless, the impact of academic development was perceived as being difficult to measure. Practical suggestions for demonstrating impact focused on evidence and rigour, and the challenge of reporting was highlighted as problematic. In conclusion to the day, it was suggested that EDIN should develop a resource to assist academic developers in evidencing the impact of their work. It was agreed that this goal should be advanced through member workshops, which subsequently led to the design of the second workshop.
Feedback from the first workshop was very positive, with participants highlighting that the discussion had made them think about what is meant by impact and how best to evidence that impact.

Figure 1: Sharing ideas of the impact of academic development


Following on from this, the second workshop focused on how to evidence impact effectively. EDIN invited ADs to work together to create a resource to guide them in the creation, demonstration and reporting of the impact of their work. By the end of this workshop, it was hoped that a resource to define the process and give credibility to approaches to evidencing impact would be mapped out and planned. The workshop, which took place in March 2019, was facilitated by Dr Marita Grimwood (SFHEA, FSEDA), Learning and Teaching Consultant.

The objectives of this workshop were: (1) to agree shared principles and processes for evidencing academic development, and (2) to identify and agree next steps and responsibilities for creating the resource to support evidencing academic development. To facilitate the discussion and progress development of the resource, the facilitator requested that participants read Bamber (2013), in particular, ‘Evidence, chimera and belief’, pp. 11-13 and ‘A desideratum of evidencing value’, pp. 39-46. Using these readings as a guide, participants agreed on the types and mix of evidence that would be appropriate to make their case in a structured way. It was agreed that Bamber’s triangulation process (research, evaluation and practice wisdom) was an excellent starting point and that, having tested the approach, Bamber’s evaluation grid would be used to capture this evidence.

Participants agreed that communicating this evidence of impact to the appropriate stakeholders was potentially the greatest challenge and that the EDIN resource should focus on the key challenges of communicating the evidence of value. Consequently, much of the workshop was dedicated to pinning down the objectives of the resource through a resource specification activity, adapted from the Reusable Learning Object (RLO) Specification (RLO CETL, 2009). A shared document was established to allow all participants to contribute to the specifications of the resource. On concluding the workshop, the main areas of work were identified, and a timeline was established for the production of the resource. Two groups were formed:

  • Group A reviewed Bamber’s evaluation grid, clarifying and redefining the categories to adapt to EDIN’s own context.
  • Group B designed the prototype resource based on the adapted evaluation grid.

Over the following months, Group A worked together to adapt Bamber’s evaluation grid to the Irish HE context, notably to align it with the National Professional Development Framework for All Staff Who Teach in Higher Education (National Forum, 2016). In addition, collections of resources were curated for integration into the tool, to inform users as they work through populating the evaluation grid. At this point, the redesigned evaluation grid and collection of resources were passed to Group B to design the prototype tool.
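As a purely illustrative sketch (the actual categories and wording of the adapted EDIN grid are not reproduced here, and the example entries below are hypothetical), the triangulated structure underpinning the grid, built on Bamber's three evidence sources of research, evaluation and practice wisdom, can be modelled as a simple data structure:

```python
# Illustrative sketch only: models Bamber's (2013) triangulation of evidence
# sources (research, evaluation, practice wisdom) as rows in an evaluation
# grid. The actual EDIN grid categories and wording are not reproduced here.

EVIDENCE_SOURCES = ("research", "evaluation", "practice wisdom")

def new_grid_row(activity: str) -> dict:
    """Create an empty grid row for one development activity."""
    return {"activity": activity, **{source: [] for source in EVIDENCE_SOURCES}}

def add_evidence(row: dict, source: str, note: str) -> None:
    """Record one piece of evidence under a triangulation source."""
    if source not in EVIDENCE_SOURCES:
        raise ValueError(f"Unknown evidence source: {source}")
    row[source].append(note)

# Hypothetical example entry
row = new_grid_row("Workshop series on assessment design")
add_evidence(row, "evaluation", "Post-workshop survey: participants report changed practice")
add_evidence(row, "practice wisdom", "Facilitator observations across three runs")
```

Triangulating in this way makes explicit which sources support a claim of impact, and where the evidence for an activity is thin.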

Design and development of EDIN Impact Analysis Tool

There were four key development phases in what evolved into the final EDIN Impact Analysis Tool (https://www.edin.ie/?page_id=384).

Phase 1

As a starting point, a review of workshops and member feedback was undertaken. It was agreed that a software tool that would enable users to plan and/or reflect on the impact of their work, allowing them the opportunity to develop and share that evidence with others, would be useful.

Phase 2

After exploring various software options, it became evident that the H5P content authoring software (https://h5p.org/) had the most potential to provide the required features for the final tool. H5P is an HTML5-based technology that would allow an interactive tool to be developed incorporating a number of different types of prompts for reflection. Developing a prototype would allow the software to be tested and provide proof of concept. It was considered critically important to design an accessible and intuitive tool that could be piloted with current colleagues before bringing it to the next phase.

The goal, therefore, was to design a prototype resource that would allow the user to consider background information about impact (based on Bamber’s grid) and to consider key questions in relation to how they might evidence that impact. This resource would allow the user to input their responses directly. Questions to prompt reflection were included, as was the ability to generate a Word document reflecting all potential impact factors in one place.
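To illustrate the underlying idea only (this is not the H5P implementation, which exports directly to Microsoft Word, and the prompts and function names below are hypothetical), gathering a user's responses to a set of reflection prompts into a single exportable document can be sketched as:

```python
# Minimal sketch of the export idea: gather a user's responses to the
# reflection prompts and assemble them into one document for future reference.
# NOT the H5P implementation (which exports to Word); prompts are hypothetical.

def export_responses(responses: dict) -> str:
    """Assemble prompt/response pairs into one plain-text report."""
    lines = ["EDIN Impact Analysis - Reflection Summary", ""]
    for prompt, answer in responses.items():
        lines.append(prompt)
        lines.append(f"  {answer or '(no response)'}")
        lines.append("")
    return "\n".join(lines)

report = export_responses({
    "What activity are you evaluating?": "Peer observation pilot",
    "What evidence of impact do you have?": "Participant reflections; uptake data",
})
```

The value of the real tool's 'one click' export is precisely this: all reflections on potential impact factors end up in one shareable document rather than scattered across sessions.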

This initial prototype was shared with the project team to pilot and, once it was established that the tool met the criteria of the overall objectives, the feedback was incorporated into the next phase of development – the storyboarding process.

Phase 3

To ensure that team members had visibility into the design of the tool (and could provide constructive feedback before substantial development was undertaken), a storyboard of the design was created. Google Slides was selected for this purpose as a widely used tool that everyone could potentially review and comment on, regardless of expertise with H5P. The resulting storyboard demonstrated the intended interface of the tool and included the proposed wording for the question prompts, example lists, and guidance text throughout.

The completed storyboard was presented to the EDIN committee for comment and feedback. It was very positively received, with some suggestions for wording changes with a view to ensuring widespread understanding of the language used. The inclusion of a visual mechanism for users to see the overall trajectory of the tool, and quickly grasp how many stages were involved from beginning to end, was also suggested. A set of curated articles/resources for users to explore further was also identified during this phase and integrated into the storyboard.

International colleagues were invited to provide feedback on the viability and value of such a tool. This was hugely valuable in highlighting the potential utility of the tool in an international context. The storyboard was signed off by the network and phase four could proceed.

Phase 4

Phase 4 focused on H5P development of the agreed storyboard. At this stage, an alternative H5P content type – the Documentation Tool – was identified as the ideal mechanism for creating an interactive and intuitive resource with a timeline feature readily built in. The full H5P version of the storyboard was subsequently developed using the Documentation Tool, which allowed for multiple user input fields, a clear and consistent layout, and ‘one click’ export to a Microsoft Word document. The final tool was shared and demonstrated at the EDIN AGM on 29 May 2020, initially on a private WordPress site. Feedback from members was very positive, with several attendees stating their plans to use the tool at the earliest opportunity. Following further testing, it was officially launched to the EDIN community via the main EDIN website, where it now resides at: https://www.edin.ie/?page_id=384.

Figure 2: Screenshot from EDIN Impact Analysis Tool



Evaluation

While the focus of this chapter is on the broad theme of how educational developers can be supported to evaluate impact, this section outlines EDIN’s evaluation of the Impact Analysis Tool itself and looks to opportunities to develop and evaluate it further in the future.

Evaluating the theoretical concepts

In keeping with the original intention of having a collective approach to the theme of impact that would reflect the ideas and input of the EDIN membership, it was important that evaluation of the tool would also be representative of this group. To that end, evaluation of the tool was planned at the early stages and was an important aspect of its development, as outlined in the four phases previously described. Group A’s review and adaptation of Bamber’s evaluation grid secured collective agreement from the project group and network committee on how the structure and process of the intended tool would work in theory, prior to its development into an online version.

Evaluating the online tool

The initial online prototype was shared with the smaller project team to evaluate whether the structure and approach of the tool met the key intended outcomes arising from the workshops at a broader level, and to ensure the collective ideas and agreement of the membership were carried through.

Presenting the storyboard to the wider EDIN Committee for feedback was an opportunity to evaluate the theoretical impact as well as the potential flow and process that a user would experience. It was also an opportunity to evaluate the cognitive process for the user and the potential evaluative or pedagogical impact the tool could have. Feedback from the Committee centred on these elements, as well as giving an indication of potential uses for the tool. Suggestions included using it for evaluating individual teaching and learning or professional development activities, evaluating more general approaches or practice, planning the best opportunity for an intended activity, reflecting on outcomes or intended outcomes of an activity, and for quality assurance and funding purposes. It was also suggested at this point that its uses may not be limited to the Irish context and that there could be potential to explore its use by international colleagues.

Prior to the launch, the tool was piloted by one of the authors, a key developer of the tool, as part of a ‘Show me the impact’ workshop with three learning technologists, one academic developer, and one head of unit within her own university. Feedback from this workshop highlighted strengths and weaknesses of the tool and pointed to areas for enhancement and inclusion in the final version.

A number of strengths were identified: the guided and scaffolded approach to reflection on impact; the simplicity and ease of use of the tool for the individual; the inclusion of excellent prompts; and the ability of the tool to support the user in considering a wider evaluation of particular points and alignment to relevant policy. Weaknesses related to: the difficulty of completing the tool’s stages within a confined workshop time frame; understanding when it is best to use the tool in relation to a project; and how individuals would obtain clarifications if completing it on their own beyond a workshop.

In respect of the first point, on timing, as this is an online, self-access tool, users can determine how much time they spend using it, relevant to their own context. In terms of when to use the tool in respect of a project, the tool was designed to be used both to plan a project and to reflect on it afterwards, and further feedback from users will identify how it best works for both of these scenarios. Finally, in respect of support for use of the tool outside of a workshop context, this feedback highlighted the importance of clear instructions for those using the tool independently.

Testing for an international context

Early indicators suggest the benefit of adapting the tool for international colleagues. The final version was shared with Professor Veronica Bamber, with the original facilitators of the two workshops (higher education consultants in Ireland and the UK), and with other UK higher education consultants. Initial feedback was very positive and indicated that it was a user-friendly tool that acknowledged the theory of impact and allowed it to be translated well into practice. Reviewers indicated that they intend to use it themselves and to recommend it to others. A further recommendation was to develop the tool beyond the Irish context, in line with feedback received from Irish colleagues when reviewing its potential value and uses. As a consequence, there are plans for a revised version that will widen the use of the tool beyond the Irish higher education context and could support practice in a wide range of roles and contexts. The use of H5P also enables colleagues to reuse the original and adapt it to their own contexts, in keeping with the ethos of open educational resources and practices.

Conclusion/Future Steps

This chapter has presented the work of EDIN and its members in addressing the need to demonstrate the impact of their wide-ranging work activities to a variety of potential audiences or evaluators of that work. The literature highlighted the many challenges faced by academic developers in demonstrating impact, such as the myriad definitions of the word, the pressure to demonstrate impact from many sources, and the vast range of methods and approaches used to do so. The research also highlights that examples of the impact of formal activities such as accredited programmes and funded projects are more readily available, and that the evidence is not as strong for measuring the impact of non-accredited or less formal activities. A key challenge for the academic developers involved in the initial workshops hosted by EDIN was having a sound, theoretically based structure that would allow them to gather evidence and outcomes of practice in order to reflect on and identify impact across a broad range of situations. Explorations of impact beyond the immediately quantifiable, with a stronger focus on wider sources of impact, provide a possible counter-argument to the kinds of criticisms raised by Roxå and Mårtensson (2016). Alignment to the National Professional Development Framework for All Staff Who Teach in Higher Education was also deemed important; this Framework emphasises acknowledging both the formal and informal work of those engaged in teaching and supporting teaching in Irish higher education (National Forum, 2016:2).

EDIN’s focus on examining and addressing such challenges with its members led to the creation of a theoretical structure and process linked to the already recognised work of Veronica Bamber and, more specifically, Bamber’s evaluation grid. Further development of this structure into an online tool, and its continuous evaluation and testing with various users throughout the process, led to the final offering of a user-friendly, open access, H5P-based online tool that can be used by educational developers, academics, learning technologists and others in teaching and learning support roles to describe the impact of a wide range of teaching, learning and professional development activities and practices. By supporting the process of reflection on a range of sources of impact, and by enabling further sharing of those perspectives, the challenge of communicating with stakeholders about where we have impact may be eased.

References

Bamber, V. (Ed.). (2013). Evidencing the value of educational development. SEDA Special No 34. London: SEDA. ISBN 978-1-902435-56-5.

Chalmers, D. & Gardiner, D. (2015). ‘An evaluation framework for identifying the effectiveness and impact of academic teacher development programmes’, Studies in Educational Evaluation, vol. 46, pp. 81-91.

Department of Education and Skills (2011) National Strategy for Higher Education to 2030 – Report of the Strategy Group (Hunt Report), Dublin: Department of Education and Skills. Available at http://hea.ie/assets/uploads/2017/06/National-Strategy-for-Higher-Education-2030.pdf.

Gray, K. & Radloff, A. (2008). ‘The idea of impact and its implications for academic development work’, International Journal for Academic Development, vol. 13, no. 2, pp. 97-106.

Hanratty, O. (2018) Being a Professional Lecturer: Framing Professional Learning Within and Beyond an Initial Accredited Programme in Teaching and Learning in an Irish Institute of Technology. Ed.D. Thesis, Maynooth University. Available at http://mural.maynoothuniversity.ie/view/ethesisauthor/Hanratty=3AOrla=3A=3A.html.

Higher Education Authority. (2018). Draft Mission-based Performance Compact. Dublin: HEA. Available at http://hea.ie/assets/uploads/2017/04/Mission-Based-Performance-Compact-Template-2020-2021.pdf.

Hoessler, C., Godden, L. & Hoessler, B. (2015). ‘Widening our evaluative lenses of formal, facilitated, and spontaneous academic development’, International Journal for Academic Development, vol. 20, no. 3, pp. 224-237.

Houston, D. & Hood, C. (2017). ‘University teacher preparation programmes as a quality enhancement mechanism: evaluating impact beyond individual teachers’ practice’, Quality in Higher Education, vol. 23, no. 1, pp. 65-78.

Jones, A., Lygo-Baker, S., Markless, S., Rienties, B. & Di Napoli, R. (2017). ‘Conceptualizing impact in academic development: finding a way through’, Higher Education Research and Development, vol. 36, no. 1, pp. 116-128.

McNaught, C. (2020). ‘A narrative across 28 years in academic development’, International Journal for Academic Development, vol. 25, no. 1, pp. 83-87.

National Forum for the Enhancement of Teaching and Learning in Higher Education (2016) National Professional Development Framework for All Staff Who Teach in Higher Education. Available at: https://www.teachingandlearning.ie/wp-content/uploads/NF-2016-National-Professional-Development-Framework-for-all-Staff-Who-Teach-in-Higher-Education.pdf.

National Forum for the Enhancement of Teaching and Learning in Higher Education (2019) Forum Insight: Evidence-based Insights About Impact in Teaching and Learning. Available at: https://www.teachingandlearning.ie/wp-content/uploads/NF-2019-Impact-Insight-web-ready-1.pdf.

Palmer, M. (2019). Academic Development – Creating and Demonstrating Impact: EDIN Workshop 18 January 2019. Unpublished report.

Popovic, C. and Baume, D. (2016) ‘Introduction – Some issues in academic development’ in Baume, D. & Popovic, C. (Eds.). Advancing practice in academic development. London: Routledge, pp. 1-16.

RLO-CETL (2009). Resources for Evaluation of Reusable Learning Objects. Previously retrieved from www.rlo-cetl.ac.uk/joomla, legacy site available at http://web.archive.org/web/20080215092518/http://www.rlo-cetl.ac.uk/joomla/index.php?option=com_content&task=view&id=21&Itemid=102.

Roxå, T. & Mårtensson, K. (2016). ‘Agency and structure in academic development practices: are we liberating academic teachers or are we part of a machinery suppressing them?’ International Journal for Academic Development, vol. 22, no. 2, pp. 95-105.

Spowart, L., Winter, J., Turner, R., Burden, P., Botham, K.A., Muneer, R., van der Sluis, H. & Huet, I. (2019). ‘“Left with a title but nothing else”: the challenges of embedding professional recognition schemes for teachers within higher education institutions’, Higher Education Research and Development, vol. 38, no. 6, pp. 1299-1312.

Sutherland, K.A. & Hall, M. (2018). ‘The ‘impact’ of academic development’, International Journal for Academic Development, vol. 23, no. 2, pp. 69-71.

Winter, J., Turner, R., Spowart, L., Muneer, R. & Kneale, P. (2017). ‘Evaluating academic development in the higher education sector: academic developers’ reflections on using a toolkit resource’, Higher Education Research and Development, vol. 36, no. 7, pp. 1503-1514.