
Practice-based papers

Designing a Reflective Assessment with Universal Design for Learning Principles: Insights from a Speech and Language Therapy Programme

Author:

Nicola Mimms

The University of Huddersfield, GB

Abstract

Reflection is a requirement of the regulatory bodies for trainee and practising Allied Health Professionals, making authentic reflection on practice a typical form of assessment in Higher Education professional programmes. The pessimism towards reflective assessments expressed by authors working in HE motivated the scoping and design of an evidence-based assessment for reflection on the BSc Speech and Language Therapy Programme at De Montfort University.

This paper presents three interrelated challenges relating to the assessment discourse for professional programmes within a neoliberal University. First, the tension between professional and academic education is highlighted, addressing key themes from the debate around whether reflection can be assessed and how. Second, this is contextualised by national and local Higher Education Institution measures and the implementation of ‘assessment of’ and ‘assessment for’ learning. Third, the challenges of social mobility are presented, given the high-stakes responsibility for equitable learning and assessment. The principles of Universal Design for Learning (UDL) are therefore explored in the context of widening participation.

In response to the literature, practitioner reflections have influenced the design of an assessment which is pedagogically sound yet inviting for students. This assessment is presented to demonstrate how reflective assignments may allow for creativity, individuality, and real-life value, whilst also providing a holistic judgement of students’ skills. UDL is integral to the project and rightly aims to reduce barriers in the curriculum and the learning environment to promote inclusivity and support all students to achieve.

How to Cite: Mimms, N., 2022. Designing a Reflective Assessment with Universal Design for Learning Principles: Insights from a Speech and Language Therapy Programme. Gateway Papers: A Journal of Education and Pedagogic Research, 3(1), p.2. DOI: http://doi.org/10.3943/gp.56
Submitted on 25 Mar 2022 · Accepted on 05 Sep 2022 · Published on 22 Sep 2022

Introduction

In the UK Higher Education (HE) system, the route into a health and social care professional career requires qualifications to develop student employability (Freidson, 2004). In professional education, professionalisation refers to demonstrating the acquisition and maintenance of knowledge and skills (ibid.). The guidance from professional regulatory bodies influences the curriculum, assessment, and work-based placements that evidence students’ professional competencies. Work-based learning is hence a requirement on programmes and is defined as learning and evidencing skills for employment through practical, simulated, or written experiences (Kandlbinder, 2007). The process of exploring the evaluation of one’s own development, termed reflection, is key in this process (Moir, 2009; Schön, 2016). It is assumed that reflection supports the construction of knowledge to shape practice and rationalise decisions (Rolfe, Freshwater and Jasper, 2011). However, at an epistemological level, research into reflection has heightened the tensions between clinician and scientific knowledge.

As Kemmis (1996) states, reflection is ‘meta-thinking’: it encourages the conscious analysis of thoughts and actions, which Moustakas (1994) extends to self-awareness of experience as subjective knowledge. Also termed tacit knowledge, clinician expertise refers to the knowledge a practitioner holds about a patient when decision-making becomes intuitive (Freidson, 2004). For professionals, the push for evidence-based medicine is ironic given the subjectivity of reflection. As Guyatt et al. (1992) state, this is treatment that is led by research, which Sicora (2017: 491) adds ‘diminishes’ tacit knowledge. It seems that reflection governs the ‘accountability of autonomous professionals’ (Kinsella et al., 2012: 211), since reflection emerged into professional practice amidst medical negligence (Guyatt et al., 1992). Hence, it is common for practitioners to be encouraged to use Schön’s methods to achieve meaningful reflection. The application of Schön’s (1983) reflection ‘in and on practice’ therefore illustrates the relevance of facilitating change and development for the benefit of society. Reflection ‘in practice’ describes the process of responding to a challenge which differs from that expected, whereas reflection ‘on practice’ involves ‘retrospectively looking back’ at the situation, exploring the actions taken and learning from the result (Kinsella et al., 2012: 213–214).

Reflective competence is an expectation at the point of graduation (Health and Care Professions Council (HCPC), 2021a). Consequently, reflection is an assessed component on professional undergraduate programmes. Barnett (2007: 34) broadly defines assessment in education as a ‘classification’ system to illustrate that students have achieved standards of competency. This is commonly termed summative assessment and links with the philosophy that knowledge is measured for ‘assessment of learning’ (Ashcroft and Palacio, 1996: 22). Historically, this type of assessment reflected the traditional methods of assessment (Falchikov, 2005), epistemologically focused on objectivist acquisition where students were passive learners. Serafini (2000) referred to this as quantitatively measuring students using exams, multiple choice questions and essays to assess students’ lower-order cognitive processes of recall and understanding of information. In the political context, the rationale for assessment of learning appears to link student grades with statistics for stakeholders (Race, Brown and Smith, 2004). Despite the successive government focus on this, the passivity of learners has attracted criticism (Falchikov, 2005). This is likely to contradict the aims of undergraduate training, which involve the demonstration of knowledge and critical understanding, application, evaluation, and analysis (QAA, 2014).

Thus, researchers and authors have attempted to shift the definitions over time and direct ‘assessment of learning’ towards Assessment ‘For Learning’ (AFL) (Barnett, 1992; Falchikov, 2005). The constructivist approach (Serafini, 2000) of AFL underpins the modern approaches to assessment that seek to promote active and deep learning (Pellegrino, Chudowsky and Glaser, 2001; Sambell, McDowell and Montgomery, 2013). Nushe (2015) refers to these methods as open-ended or performance-based assessments, for example case studies, portfolios, and projects (Kandlbinder, 2007). Here, assignments may encourage active learning when they are individualised, authentic and encourage self-evaluation (Serafini, 2000). These methods focus on constructing student-centred knowledge, emphasising the scaffolding of learning (Williams, 2011) as enhancement and motivation for learning (Boud and Falchikov, 2007). Like assessment of learning, AFL may have a summative component, but uniquely AFL permits ongoing lecturer judgement about students’ learning and how they respond to the teaching (Ashcroft and Palacio, 1996). As Ashcroft and Palacio (1996) describe, this is formative assessment.

Medland (2016) claims that performance-based assessment methods are supported by a breadth of research. Despite this claim, it is argued that AFL has not been researched in sufficient depth over the last 15 years. In defence, a change in the assessment strategy due to HEI reform has created an alternative culture of assessment. First, AFL in the current climate is highlighted by Kvale (2007: 67) as control of a ‘knowledge economy.’ This refers to the government push for summative assessment when grades are a measure of institutional success (ibid.). The modern university foregrounds the visibility of student attainment, graduate employment figures and league-table position. Second, AFL in HEIs has morphed into what Falchikov (2005: 62–63) believes to be ‘assessment as accountability’ and ‘assessment as quality control’.

Assessment as Quality Control

The introduction of the Teaching Excellence Framework (TEF) saw prescriptive measures of institution performance attached to financial reward (Department for Business Innovation and Skills (DBIS), 2016). The emphasis on profit as success has led to competition between institutions and issues with the quality of assessment within universities due to the pressures of accountability for student results (Bryan and Clegg, 2006). Arguably, this has created ideologies of performance (Ball, 2012). The emphasis on assessment is consequently a controversial topic in HE, attached to government incentives (DBIS, 2016) and advocating competition between HEIs for student outcomes. This again feeds into the earlier argument of the ‘knowledge economy’. As Lomer, Papatsiba and Nooda (2018) add, reverting to assessment of learning is a consequence of the context of marketised education. Assessment in HEIs is consequently multi-faceted, involving the government and stakeholders at the macro level, the institution at the meso level and, finally, lecturers at the micro level.

This can be described as a neoliberal assessment of learning, in which political movements have redefined assessment as a measure of teaching quality (Entwistle, Thompson and Tait, 2020). Assessment as measurement presents pedagogical conflict when there is an expectation for quality teaching and good grades (Nixon, Scullion and Hearn, 2016). As argued by Ball (2012: 215), this regulation creates ‘ontological dilemmas’ for lecturers. Educationalists, who share concern when grades are attached to institutional success (Hursh, 2007; Sambell, McDowell and Montgomery, 2013), instead perceive assessment as an opportunity to develop skills (Clegg and Bryan, 2006). This paper argues for a shift to assessment for life-long learning (Boud and Falchikov, 2007).

Institutional responses to changes in government policy have influenced local measures in HEIs and their implementation of ‘assessment of and for learning’ (Sambell, McDowell and Montgomery, 2013). The reality of the constraints of neoliberalism appears to increase the perceived risk of creating new assessments and steering away from traditional methods (Falchikov, 2005). Despite this, achieving authentic assessment underpinned by pedagogy remains a recommendation from several authors (Hahn, 2018; Kirkwood, 2007; Kvale, 2007; Sambell, McDowell and Montgomery, 2013).

The Impartiality of Assessment

Wider issues are also apparent which run parallel to marketised HE. Institutions have risen to the challenge of social mobility, ensuring that learning and assessment are equitable (Whitty and Anders, 2015). The core principles of Universal Design for Learning (UDL) (CAST, 2022) are relevant here, given the UK government’s widening participation agenda. This policy is described by Schuetze and Slowley (2002) as enabling students who were traditionally excluded from going to university for social, economic, or cultural reasons to access HE. In parallel, UDL aims to remove the assumption that HE is designed only for ‘academic’ students and instead promote an education system for all (ibid.).

Thus, the UDL approach encourages:

  1. Flexible ways to evidence learning;
  2. Engagement and preferential style at the centre of learning; and
  3. Multiple ‘representations’ of learning (CAST, 2022).

Ultimately, when varied means of assessment and learning are utilised, ‘UDL aims to reduce barriers in the curriculum and the learning environment to promote inclusivity and support all students to achieve’ (Rao and Meo, 2016: 11). Within the assessment parameters of UDL, summative and formative assessments have equal importance. The benefits of formative assessment are fourfold: it creates opportunities for students to monitor their own progress (NCUDL, 2010), utilise scaffolds to ‘acquire content’, practise for the assessment and, finally, evidence learning (Rao and Meo, 2016: 5). Recent authors have evidenced these formative advantages of UDL (Coffman and Draper, 2022; Tobin and Behling, 2018). Additionally, Snow (2018) details summative recommendations, including having a strong summative assessment design and clear learning outcomes, to emphasise the purpose of each assignment.

When these factors were considered, UDL improved the learning outcomes for students with and without a disability (Al-Azawei et al., 2016; Atkinson et al. 2000; Orsmond, Merry and Reiling 2002). However, key recent findings by Coffman and Draper (2022) are not without critique. Although the varied learning and assessments were positively reported by students, occasionally they were overwhelmed by choice. Despite this, and like Rose (2000), Coffman and Draper (2022) concluded that UDL presented learning environments well-suited to many people’s needs.

Sandars and Murray (2009) have rightly demonstrated a clear research gap regarding assessment of reflection that considers the needs of all students. This is of no surprise given Croy’s (2018) earlier argument that assessment of professional development is not prioritised in research. Additionally, it is recognised that there is limited guidance to support the design of authentic and sustainable assessments (Villarroel et al., 2017). That said, some recommendations are available for addressing the shorter-term goals of summative assessment, to measure the intended learning outcomes (Kirkwood, 2007).

The success of designing a good assessment, in order to ascertain whether learning outcomes have been met, rests on both content and construct validity (Harlen, 2015). Content validity applies to assessments which make a judgement about the extent to which required skills and knowledge have been evidenced, whereas construct validity concerns the learning and knowledge that the assessment constructs. Choice and differentiation in work present issues with judgements of quality, for example in terms of marking inequity. For instance, Nushe (2015) evidenced that students were dissatisfied with the equity of marking when projects allowed variations in choice of the assessment task. As Freeman and Lewis (1998) state, differentiation in work is felt to reduce the reliability of the assurance procedures in place to tackle issues of inequity, including moderation and marking criteria. Therefore, despite the importance of UDL, providing quality assessments that meet the learning needs of all students is challenging. It is worth noting that there is limited evidence into assessment quality and UDL principles (Nushe, 2015).

As argued so far, designing assessments which are equitable, professionally sound, and worthwhile to students is a high-stakes task. When the values of neoliberal education (Ball, 2012) lead to professional conflict (Bryan and Clegg, 2006), and the educator’s judgement is clouded by issues of attainment and passivity, the reasons for assessment discussed here clearly present challenges for pedagogic practice (Freeman and Lewis, 1998).

Background to the Project

Reflection is an academic and personal interest of the author and, as highlighted already, a requirement of the regulatory bodies for trainee and practising Allied Health Professionals. Few empirical studies show the effectiveness of reflection for students; the benefits of doing so are based on the personal reflections of authors (Aronson, 2011; Redmond, 2006), rather than peer-reviewed research. Authors working in HE share concerns with reflective learning and assessments in two key regards (Bryan and Clegg, 2006; McKinney and Sen, 2016; Nushe, 2015). First, reflective assignments may provide limited opportunities for skill and professional development if students merely list descriptions of their experience. According to Polanyi (1966), deeper and more meaningful reflection improves knowledge gain when personal knowledge of the situation substantiates theory. Second, there is the danger of students writing what they think the lecturer wants to read (Freeman and Lewis, 1998), thereby limiting the credibility of their accounts.

From the author’s own practice on one Bachelor of Science (BSc) degree programme, compulsory reflection appears to limit engagement with reflection. There is a strong sense of disinterest, and only some students complete optional and non-assessed reflections. According to Horton, Gibson and Curington (2021), the exploration of formative and summative approaches at a local, institutional level indicates the need to establish a greater degree of consistency in developing reflective thinking. Also, there is not yet a response to Ward and Gracey’s (2006) recommendations of developing lecturers’ skills in supporting students to access the curriculum for reflective practice. This is understandable given that the lack of studies investigating effective teaching approaches means that it has not been possible to publish guidance.

The complexities discussed thus far motivated the scoping and design of an evidence-based assessment for reflection and work-based learning, designed to improve student outcomes. First, it was important to discover whether employing choice and flexibility would allow students the freedom to reflect on their development rather than constraining them. Second, it was key to follow the research base around self-assessment and portfolios as assessment methods promoting reflection and allowing learning to be transferred between modules (Freeman and Lewis, 1998). Peeters and Vaidya (2016) and Croy (2018) evidenced that reflective assessments involving self-assessment, peer assessment or portfolios were effective methods of assessment.

The tension between political and pedagogical recommendations regarding assessment (Sambell, McDowell and Montgomery, 2013; Williams, 2011) provided justifications for an assessment design that considered both the institutional and pedagogical requirements. With these points in mind, the author had three aims for their work.

  1. To create a formative and summative assessment, which was peer reviewed by a colleague, to support reflection on practice while complying with the institution’s assessment policy.
  2. To critically reflect on the author’s own practice and the assessment designed using key literature.
  3. To provide recommendations for the author’s department and future practice.

The first aim of the project was achieved once the assessment was drafted and feedback was received; changes were then made to the assessment design. For example, following this reflection, there was greater transparency in the assessment brief for students around the longer-term goals of professional development. The purpose of this was to strengthen the authenticity of the assessment so that students understand its rationale.

Utilising evidence-based pedagogy, aims two and three are addressed in the discussion of the assessment designed. Also note that an evaluation of the assessment was not possible, due to the author’s honorary role at the institution.

The assessment design addressed the module learning outcomes for the BSc Speech and Language Therapy (SLT) professional practice module, completed by first-year students. This involved two 10-week placements, both of which are assessed. The placements allowed students to consolidate knowledge (Freeman and Lewis, 1998) from two other weekly taught modules: introduction to linguistics & language acquisition; and communication, disability & psychology. Both modules introduced typical speech and language development and either delayed, disordered, or acquired conditions. The professional practice module therefore allowed students to relate theory to paediatric and adult practice (Freeman and Lewis, 1998). The assignment encouraged students to make informed decisions, use theory and reflect on whether their decisions were appropriate (Kandlbinder, 2007).

The assessment design had two parts: (1) a portfolio collection of evidence to illustrate student practice and development from first-year placements; and (2) a written reflection on professional practice, allowing students to begin to reflect and develop professionally. The portfolio invited students to complete tasks on placement, following the identification of an area of development regarding the HCPC Standards of Proficiency (HCPC, 2021b). Students were encouraged to personalise their learning so that the experience facilitated critical reflection on their own practice and application of theory (Devick-Fry, Klages and Barnhill, 2010; Horton et al., 2004). The portfolio allowed students to integrate and consolidate their learning from the overlapping modules outlined (Peeters and Vaidya, 2016) and created choice to foster motivation and aid learning (Harlen, 2015). The weekly written reflections described above, and the additional peer and self-assessments, provided opportunities to practise for the reflective assignment and encouraged students to create a log of evidence for the requirements of the HCPC audit.

The reflection on practice invited students to reflect on their weekly experience and submit these reflections as an appendix. Students were encouraged to choose how they presented the assignment. They either submitted a report or a reflective article, such as the type found in the Royal College of Speech and Language Therapy (RCSLT) Bulletin magazine. The report provided an opportunity for students to develop their professional writing style, which is a common SLT professional task. Alternatively, writing for the magazine promoted shared practice and publishing skills.

Discussion

The challenges that formative assessment had created in the author’s own teaching encouraged the construction of an assessment that strongly linked formative and summative assessment (Crisp, 2012). The rationale was to invite students to monitor their own performance (Harlen, 2015) through self-assessment. This is authentic and true to the HCPC requirement that SLTs evidence mandatory Continuing Professional Development using reflection.

Sources agree that students need to understand personal relevance to engage in their learning (Sambell, McDowell and Montgomery, 2013; Stenberg, Rajala and Hilppo, 2016). Therefore, this is explained to the students in the assessment brief. The formative assessment is continuous, including self-evaluation of skills, as recommended by Falchikov (2005), so that students can act on feedback to further develop, rather than just completing the summative assessment at the end of the semester. Both the self-assessment and peer assessment permit students to practise verbally delivering their reflection in anticipation of assignment two, which they submit in writing. This was a consideration based on the recommendations of formative assessment by Williams (2011) and Andrews, Brown and Mesher (2018), to improve students’ learning and grades.

Traditional methods of reflection have involved keeping a written diary or writing a reflective essay (Bassot, 2016), and currently the programme encourages the use of reflective frameworks, for example Johns (2017) and Gibbs (1998). From the literature, it seems that these may not be the methods of reflection that are the most advantageous (Aronson, 2011; Boud and Walker, 1998). Instead, as Tobin and Behling (2018) propose, personal learning preferences should guide the reflective tools. Given the limitations of reflective assignments previously mentioned (Bryan and Clegg, 2006; McKinney and Sen, 2016), the rationale for using peer assessment in the portfolio was to encourage self-assessment and improvement. As Carr and Cormody (2006) argue, face-to-face discussion alongside reflection invites students to respond to constructive feedback, which is key to developing criticality. Peer assessment is promoted in the institution’s policy guidance and by several authors (Boud and Falchikov, 2007; Sambell, 2010; Williams, 2011). This could increase engagement with the assessment (The Higher Education Academy (HEA), 2014), as motivation may be influenced by interactions with others, enjoyment, and support (Ryan and Deci, 2000).

This links with Williams’s (2011) and Egan and Costello’s (2016) recommendations to facilitate development of skills in a safe environment. There are challenges with peer assessment, such as when some students choose not to work with their peers (Ingham and Boyle, 2006). However, interpersonal skills are key in the SLT role and, therefore, this justifies why peer assessment should be embedded into the assignment. Additionally, from a UDL perspective, peer assessment is an evidence-based strategy to improve student achievement and learning (CAST Inc, 2022). This is due to the formative component fostered by peer assessment, which offers a more objective measure than self-assessment (Orsmond, Merry and Reiling, 2002). As Zins et al. (2007) argued, there is scientific research to support the positive impact of social and emotional experience on learning.

The portfolio is judged as having high validity as students reflect on an experience and evidence this (Freeman and Lewis, 1998), and also because it combats the issues already discussed around students writing to pacify the marker. Submitting the discussed tasks in a portfolio should allow the opportunity to develop students’ own reflective thinking and style (Devick-Fry, Klages and Barnhill, 2010). This is part of the process of seeing and understanding the benefits of capturing thoughts, feelings, and evaluation, and was based on Valo (2000) and Sandars and Murray’s (2009) idea of choosing how to reflect, including answering questions, free writing or creating a podcast (Santos, Figuierado and Vieira, 2019). The flexibility allows ‘self-discovery’ and students may then be more likely to engage in the learning and assessment (Barnett 2007: 38), based on Ryan and Deci’s (2000) theory that choice and self-direction are motivating. The multi-modal reflection additionally follows the UDL principles and may address the needs of all students (Tobin and Behling, 2018).

The reflective assignment invites students to develop an action plan and submit their reflections as appendices to counteract the challenges with written reflections highlighted by Bryan and Clegg (2006) and McKinney and Sen (2016). Lecturers may then be able to validate the student discussions of their own learning and reflections. Students are required to refer to their reflections from placements in semesters 1 and 2 to evidence their development over time. This is designed to motivate students to affirm the benefits of maintaining a record and incentivise reflection (Sambell, McDowell and Montgomery, 2013; Stenberg, Rajala and Hilppo, 2016). Encouraging engagement with reflection is key due to concerns about student disengagement with reflective practice, both in the literature (Sandars and Homer, 2008; Sandars and Murray, 2009; Grant et al., 2006; Harris, 2008) and within the author’s own practice.

The justification for student choice in creating either a reflective report or an article for an SLT magazine was to give students a purpose for their assignment and to authenticate the work through writing for a real audience (Freeman and Lewis, 1998). Students can, therefore, see the real-life authenticity of developing report writing, which is important for the SLT role (Kandlbinder, 2007; Peeters and Vaidya, 2016) and beneficial for employment (Villarroel et al., 2017). The option of writing a contribution for an article was felt to give students the opportunity to add value to the profession (Sambell, McDowell and Montgomery, 2013) by writing about developing reflective practice and its benefits for their peers.

The use of report structures is felt to support marking reliability across students’ work (Freeman and Lewis, 1998), and examples of report templates were created. The use of assessment criteria may address issues with reliability and ensure consistency with the institution’s policy. Reliability is felt to be further strengthened by having one lecturer mark the student assignments anonymously, with a sample of marking viewed by another colleague, termed moderation. Considering Harlen’s (2015) criteria for sound assessments, this assessment is felt to illustrate construct and content validity, as it measures the intended learning outcomes and assesses the skills and knowledge required. In terms of reliability, the lack of a rubric specific to reflection has highlighted challenges with marking written reflections consistently (Smith, 2011). The author reflected on the local policy and proposed that criteria for reflection should be designed to explicate the depth of reflection appropriate to each level of undergraduate training. As Bowman (2021) evidenced, clarity was lacking as to the level of reflective ability expected at each undergraduate level.

Designing a new assessment is deemed challenging (Boud and Falchikov, 2007) given the accountability for students’ learning and the quality issues in HEIs highlighted by Barnett (2007) and Nushe (2015). Understandably, this is the reality of the neoliberal paradigmatic approach to ranking universities (Hahn, 2018). As discussed, the designed assessment proposed solutions to counteract some of these issues.

  1. Despite the challenges expressed and the criticism attached to authentic assessment, this assessment appears to be a reasonable solution to achieve competency-based development that is justified by the literature.
  2. Practitioner reflections have influenced the design of an assessment that is pedagogically sound but motivating for students to engage in reflection ‘for the longer-term’ (Boud and Falchikov, 2007).
  3. In principle, this assessment allows creativity, individuality, and real-life value to provide a holistic judgement of students’ skills.
  4. Finally, the assessment addresses the principles of UDL set out by Rose (2000).

Responding to UDL principles in this way is crucial: it follows a recommendation from Coffman and Draper’s (2022) research and, given the contemporary cultures of HE institutions, the need to respond to diverse cohorts.

Although it was not possible to trial the assessment and present the outcomes, the next steps for the implementation of the assessment should involve wider enhancement and judgement in the context of assurance processes. Future research should continue to explore the challenges and benefits of designing authentic assessments and, as mentioned, their validity and reliability (Uygur et al., 2019: 3). This may address the gaps in the literature around the impact of work-based learning. First-year students may be a suitable population for such studies, as learning is likely to be a result of teaching on the programme. Despite this recommendation, reflective ability at each undergraduate level continues to challenge HEIs and, as already mentioned by Bowman (2021: 47), clarity is required.

Acknowledgements

Thank you to the Master of Arts Education Practice team at De Montfort University for the opportunities and experience of authentic assessment. This motivated the topic and project for my negotiated study module as part of the MA. Thank you to Dr Gisela Oliveria and Dr Rosi Smith for guidance and reassurance. A final thank you to the Speech and Language Therapy lecturing team for your time. A special mention to Idalina Rodrigues, Senior Lecturer in Speech and Language Therapy, for the opportunity and encouragement to design this assessment.

Competing interests

The author has no competing interests to declare.

References

  1. Al-Azawei, A, Serenelli, F and Lundqvist, K. 2016. Universal design for learning (UDL): A content analysis of peer reviewed journals from 2012 to 2015. The Journal of Scholarship of Teaching and Learning, 16(3): 39–56. DOI: https://doi.org/10.14434/josotl.v16i3.19295 

  2. Andrews, M, Brown, R and Mesher, L. 2018. Engaging students with assessment and feedback: Improving assessment for learning with students as partners. Practitioner Research in Higher Education, 11(1): 32–46. https://pure.port.ac.uk/ws/portalfiles/portal/10445415/Engaging_students_with_assessment_and_feedback.pdf. 

  3. Aronson, L. 2011. Twelve tips for teaching reflection at all levels of medical education. Medical teacher, 33: 200–05. DOI: https://doi.org/10.3109/0142159X.2010.507714 

  4. Ashcroft, K and Palacio, D. 1996. Researching into assessment and evaluation: in colleges and universities. London: Kogan Page Limited. 

  5. Atkinson, RK, Derry, SJ, Renkl, A and Wortham, D. 2000. Learning from examples: Instructional principles from the worked examples research. Review of Educational Research, 70(2): 181–214. DOI: https://doi.org/10.3102/00346543070002181 

  6. Ball, S. 2012. Global Education Inc: New policy networks and the neo-liberal imaginary. Oxon: Routledge. 

  7. Barnett, R. 1992. Improving Higher Education: total quality care. Buckingham: SRHE and Open University Press. 

  8. Barnett, R. 2007. Assessment in higher education: an impossible mission? In Boud, D and Falchikov, N (eds.), Rethinking Assessment in Higher Education: learning for the longer term, 29–40. Oxon, Routledge. DOI: https://doi.org/10.4324/9780203964309 

  9. Bassot, B. 2016. The reflective journal. London: Palgrave. DOI: https://doi.org/10.1057/978-1-137-60349-4 

  10. Bowman, M. 2021. A framework for scaffolding academic reflective writing in dentistry. European Journal of Dental Education, 25: 35–49. DOI: https://doi.org/10.1111/eje.12575 

  11. Boud, D and Falchikov, N. 2007. Rethinking Assessment in Higher Education: learning for the longer term. Oxon: Routledge. DOI: https://doi.org/10.4324/9780203964309 

  12. Boud, D and Walker, D. 1998. Promoting reflection in professional courses: the challenge of context. Studies in Higher Education, 23(2): 191–206. DOI: https://doi.org/10.1080/03075079812331380384 

  13. Bryan, C and Clegg, K. 2006. Innovative assessment in higher education. Oxon: Routledge. DOI: https://doi.org/10.4324/9780203969670 

  14. Carr, S and Carmody, D. 2006. Experiential learning in women’s health: medical student reflections. Medical Education, 40: 768–774. DOI: https://doi.org/10.1111/j.1365-2929.2006.02536.x 

  15. CAST Inc. 2022. Universal Design for Learning Guidelines version 2.2. Available from: http://udlguidelines.cast.org [Last accessed 5th September 2022]. 

  16. Coffman, S and Draper, C. 2022. Universal design for learning in higher education: A concept analysis. Teaching and Learning in Nursing, 17(1): 36–41. DOI: https://doi.org/10.1016/j.teln.2021.07.009 

  17. Crisp, GT. 2012. Integrative assessment: reframing assessment practice for current and future learning. Assessment and Evaluation in Higher Education, 37(1): 33–43. DOI: https://doi.org/10.1080/02602938.2010.494234 

  18. Croy, SR. 2018. Development of a group work assessment pedagogy using constructive alignment theory. Nurse Education Today, 61: 49–53. DOI: https://doi.org/10.1016/j.nedt.2017.11.006 

  19. DBIS. 2016. Success as a knowledge economy: Teaching Excellence, social mobility and student choice. London: Crown. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/523546/bis-16-265-success-as-a-knowledge-economy-web.pdf [Last accessed 5th September 2022]. 

  20. Egan, A and Costello, L. 2016. Peer Assessment of, for and as Learning: A Core Component of an Accredited Professional Development Course for Higher Education Teachers. The All-Ireland Journal of Teaching and Learning in Higher Education (AISHE-J), 8(3): 2931–2931. https://ojs.aishe.org/index.php/aishe-j/article/view/293/471. 

  21. Entwistle, N, Thompson, S and Tait, H. 2020. Guidelines for promoting effective learning in higher education. Available from: https://www.researchgate.net/publication/339627823_Guidelines_for_Promoting_Effective_Learning_in_Higher_Education [Last accessed 5th September 2022]. 

  22. Falchikov, N. 2005. Improving assessment through student involvement: practical solutions for aiding learning in higher and further education. Oxon: Routledge. 

  23. Freeman, R and Lewis, R. 1998. Planning and Implementing Assessment. London: Kogan Page Limited. 

  24. Freidson, E. 2004. Professionalism: The Third Logic. Cambridge: Polity Press. 

  25. Devick-Fry, J, Klages, C and Barnhill, A. 2010. A systematic approach to measuring inquiry in teacher education. Studies in learning, evaluation, innovation and development, 7: 37–51. 

  26. Gibbs, G. 1998. Learning by doing: a guide to teaching and learning methods. Oxford: Further Education Unit. 

  27. Grant, A, Kinnersley, P, Metcalf, E, Pill, R and Houston, H. 2006. Students’ views of reflective learning techniques: an efficacy study at a UK medical school. Medical Education, 40(4): 379–388. DOI: https://doi.org/10.1111/j.1365-2929.2006.02415.x 

  28. Guyatt, G, Cairns, J, Churchill, D, Cook, D, Haynes, B, Hirsh, J, Irvine, J, Lavine, M, Levine, M, Nishikawa, J, Sackett, D, Brill-Edwards, P, Gerstein, H, Gibson, J, Jaeschke, R, Kerigan, A, Neville, A, Panju, A, Detsky, A, Ekin, M, Frid, P, Gerrity, M, Laupacis, A, Lawrence, V, Menard, J, Moyer, V, Mulrow, C, Links, P, Oxman, A, Sinclair, J and Tugwell, P. 1992. Evidence-Based Medicine: A New Approach to Teaching the Practice of Medicine. JAMA, 268(17): 2420–2425. DOI: https://doi.org/10.1001/jama.1992.03490170092032 

  29. Hahn, W. 2018. Assurance of learning: an evaluation of how grade inflation and course pedagogy impact student learning sustainability in business core courses. Journal of Higher Education Theory and Practice, 18(2): 128–137. DOI: https://doi.org/10.33423/jhetp.v18i2.552 

  30. Harlen, W. 2015. Assessment and the curriculum. In Wyse, D, Hayward, L and Pandya, J (eds.), The Sage handbook of curriculum, pedagogy and assessment, 693–709. London: Sage publications. DOI: https://doi.org/10.4135/9781473921405.n43 

  31. Harris, M. 2008. Scaffolding reflective journal writing negotiating power, play and position. Nurse Education Today, 28: 314–326. DOI: https://doi.org/10.1016/j.nedt.2007.06.006 

  32. Horton, AG, Gibson, KB and Currington, AM. 2021. Exploring reflective journaling as a learning tool: An interdisciplinary approach. Archives of Psychiatric Nursing, 35(2): 195–99. DOI: https://doi.org/10.1016/j.apnu.2020.09.009 

  33. Horton, S, Byng, S, Bunnings, K and Pring, T. 2004. Teaching and learning speech and language therapy skills: the effectiveness of classroom as clinic in speech and language therapy student education. International Journal of Language and Communication Disorders, 39(3): 365–90. DOI: https://doi.org/10.1080/13682820410001662019 

  34. HCPC. 2021a. Reflection and meeting your standards. Available from: https://www.hcpc-uk.org/standards/meeting-our-standards/reflective-practice/reflection-and-meeting-your-standards/ [Last accessed 5th September 2022]. 

  35. HCPC. 2021b. Speech and Language Therapists. Available from: https://www.hcpc-uk.org/standards/standards-of-proficiency/speech-and-language-therapists/ [Last accessed 5th September 2022]. 

  36. Hursh, DW. 2007. Marketing education: the rise of standardized testing, accountability, competition, and markets in public education. In Ross, EW and Gibson, R (eds.), Neoliberalism and education reform, 21–34. New Jersey: Hampton Press Inc. 

  37. Ingham, J and Boyle, RA. 2006. Generation X in Law School: How These Law Students Are Different from Those Who Teach Them. Journal of Legal Education, 56: 281–6. 

  38. Johns, C. 2017. Becoming a reflective practitioner. 5th edition. Hoboken, NJ: John Wiley and Sons Ltd. 

  39. Kandlbinder, P. 2007. Writing about practice for future learning. In: Boud, D and Falchikov, N (eds.), Rethinking Assessment in Higher Education: learning for the longer term, 159–66. Oxon: Routledge. 

  40. Kemmis, S. 1996. Action research and the politics of reflection. In Boud, D, Keogh, R and Walker, D (eds.), Reflection: turning experience into learning, 139–164. London: Nichols Publishing. 

  41. Kinsella, EA, Caty, M-E, Ng, S and Jenkins, K. 2012. Reflective practice for allied health: theory and application. In: English, LM (ed.), Adult Education and Health, 210–28. Toronto: University of Toronto Press. DOI: https://doi.org/10.3138/9781442685208-015 

  42. Kirkwood, M. 2007. The contribution of sustainable assessment to teachers’ continuing professional development. In: Boud, D and Falchikov, N (eds.), Rethinking Assessment in Higher Education: learning for the longer term, 167–80. Oxon: Routledge. 

  43. Kvale, S. 2007. Contradictions of assessment for learning in institutions of higher learning. In: Boud, D and Falchikov, N (eds.), Rethinking Assessment in Higher Education: learning for the longer term, 57–71. Oxon: Routledge. 

  44. Lomer, S, Papatsiba, V and Naidoo, R. 2018. Constructing a national higher education brand for the UK: positional competition and promised capitals. Studies in Higher Education, 43(1): 134–53. DOI: https://doi.org/10.1080/03075079.2016.1157859 

  45. McKinney, P and Sen, B. 2016. The use of technology in group-work: A situational analysis of students’ reflective writing. Education for information, 32: 375–96. DOI: https://doi.org/10.3233/EFI-160983 

  46. Medland, E. 2016. Assessment in higher education: drivers, barriers and directions for change in the UK. Assessment and evaluation in higher education, 41(1): 81–96. DOI: https://doi.org/10.1080/02602938.2014.982072 

  47. Moir, J. 2009. Bologna bytes: Higher Education and Personal Development Planning. International journal of learning, 16(9): 367–73. DOI: https://doi.org/10.18848/1447-9494/CGP/v16i09/46544 

  48. Moustakas, C. 1994. Phenomenological research methods. London: Sage Publications. DOI: https://doi.org/10.4135/9781412995658 

  49. NCUDL. 2010. Universal Design for Learning guidelines version 2.0. Available at: https://udlcenter.org/aboutudl/udlguidelines [Last accessed 5th September 2022]. 

  50. Nixon, E, Scullion, R and Hearn, R. 2016. Her majesty the student: marketised higher education and the narcissistic (dis)satisfactions of the student-consumer. Studies in Higher Education, 43(6): 927–43. DOI: https://doi.org/10.1080/03075079.2016.1196353 

  51. Nushe, D. 2015. Student assessment and its relationship with curriculum, teaching and learning in the twenty-first century. In: Wyse, D, Hayward, L, Pandya, J (eds.), The Sage handbook of curriculum, pedagogy and assessment, 838–52. Thousand Oaks, CA: Sage publications. DOI: https://doi.org/10.4135/9781473921405.n52 

  52. Orsmond, P, Merry, S and Reiling, K. 2002. The use of exemplars and formative feedback when using student derived marking criteria in peer and self-assessment. Assessment and Evaluation in Higher Education, 27(4): 309–23. DOI: https://doi.org/10.1080/0260293022000001337 

  53. Peeters, MJ and Vaidya, VA. 2016. A mixed-methods analysis in assessing students’ professional development by applying an assessment for learning approach. American journal of pharmaceutical education, 80(5): 1–10. DOI: https://doi.org/10.5688/ajpe80577 

  54. Pellegrino, J, Chudowsky, N and Glaser, R. 2001. Knowing what students know: the science and design of educational assessment. Washington, DC: National Academy Press. 

  55. Polanyi, M. 1966. The tacit dimension. Chicago: The University of Chicago Press. 

  56. Quality Assurance Agency. 2014. UK Quality Code for Higher Education: Part A: Setting and maintaining academic standards. Available from: https://www.qaa.ac.uk/docs/qaa/quality-code/qualifications-frameworks.pdf [Last accessed 5th September 2022]. 

  57. Race, P, Brown, S and Smith, B. 2004. 500 tips on assessment. 2nd edition. Oxon: Routledge. DOI: https://doi.org/10.4324/9780203307359 

  58. Rao, K and Meo, G. 2016. Using Universal Design for Learning to design Standards-Based Lessons. SAGE Open, 6(4): 1–12. DOI: https://doi.org/10.1177/2158244016680688 

  59. Redmond, B. 2006. Reflection in action: developing reflective practice in health and social services. London: Ashgate. 

  60. Rolfe, G, Jasper, M and Freshwater, D. 2011. Critical reflection in practice: generating knowledge for care. 2nd edition. Basingstoke: Palgrave Macmillan. 

  61. Rose, D. 2000. Universal Design for Learning. Journal of Special Education Technology, 15(1): 67–70. DOI: https://doi.org/10.1177/016264340001500108 

  62. Ryan, R and Deci, EL. 2000. Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being. American Psychologist, 55(1): 68–78. DOI: https://doi.org/10.1037/0003-066X.55.1.68 

  63. Sambell, K. 2010. Enquiry-based learning and formative assessment environments: student perspectives. Practitioner Research in Higher Education, 4(1): 52–61. 

  64. Sambell, K, McDowell, L and Montgomery, C. 2013. Assessment for learning in Higher Education. London and New York: Routledge. DOI: https://doi.org/10.4324/9780203818268 

  65. Sandars, J and Homer, M. 2008. Reflective learning and the net generation. Medical teacher, 30: 877–9. DOI: https://doi.org/10.1080/01421590802263490 

  66. Sandars, J and Murray, C. 2009. Digital storytelling for reflection in undergraduate medical education: a pilot study. Education for primary care, 20: 441–4. DOI: https://doi.org/10.1080/14739879.2009.11493832 

  67. Santos, J, Figueiredo, AS and Vieira, M. 2019. Innovative pedagogical practices in higher education: an integrative literature review. Nurse Education Today, 72: 12–17. DOI: https://doi.org/10.1016/j.nedt.2018.10.003 

  68. Schuetze, HG and Slowey, M. 2002. Participation and exclusion: A comparative analysis of non-traditional students and lifelong learners in higher education. Higher Education, 44(3/4): 309–27. DOI: https://doi.org/10.1023/A:1019898114335 

  69. Schön, D. 1983. The Reflective Practitioner: How Professionals Think in Action. New York: Basic Books. 

  70. Schön, D. 2016. The reflective practitioner: how professionals think in action. 2nd ed. Oxon: Routledge. 

  71. Serafini, F. 2000. Three Paradigms of Assessment: Measurement, Procedure, and Inquiry. The Reading Teacher, 54(4): 384–393. 

  72. Sicora, A. 2017. Reflective practice, risk and mistakes in social work. Journal of social work practice, 31(4): 491–502. DOI: https://doi.org/10.1093/bjsw/bcx152 

  73. Smith, E. 2011. Teaching critical reflection. Teaching in Higher Education, 16(2): 211–23. DOI: https://doi.org/10.1080/13562517.2010.515022 

  74. Snow, HK. 2018. High-impact practices, universal design and assessment opportunities in liberal arts seminars. ASIANetwork Exchange, 25(2): 117–35. DOI: https://doi.org/10.16995/ane.284 

  75. Stenberg, K, Rajala, A and Hilppo, J. 2016. Fostering theory-practice reflection in teaching practicums. Asia-pacific journal of teacher education, 44(5): 470–85. DOI: https://doi.org/10.1080/1359866X.2015.1136406 

  76. The Higher Education Academy (HEA). 2014. The HEPI–HEA Student Academic Experience Survey 2014. Available from: https://www.hepi.ac.uk/wp-content/uploads/2014/05/HEA_HEPI-Report_WEB_160514.pdf [Last accessed 5th September 2022]. 

  77. Tobin, TJ and Behling, KT. 2018. Reach Everyone, Teach Everyone: Universal Design for Learning in Higher Education. Morgantown, WV: West Virginia University Press. 

  78. Uygur, J, Stuart, E, De Paor, M, Wallace, E, Duffy, S, O’Shea, M, Smith, S and Pawlikowska, T. 2019. Best evidence in medical education systematic review to determine the most effective teaching methods that develop reflection in medical students: BEME Guide no 51. Medical teacher, 41(1): 3–16. DOI: https://doi.org/10.1080/0142159X.2018.1505037 

  79. Valo, M. 2000. Experiencing work as a communications professional: students’ reflections on their off-campus work practice. Higher Education, 39(2): 151–79. DOI: https://doi.org/10.1023/A:1003946617377 

  80. Villarroel, V, Bloxham, S, Bruna, D, Bruna, C and Herrera-Seda, C. 2017. Authentic assessment: creating a blueprint for course design. Assessment and Evaluation in Higher Education, 43(5): 840–54. DOI: https://doi.org/10.1080/02602938.2017.1412396 

  81. Ward, A and Gracey, J. 2006. Reflective practice in physiotherapy curricula: a survey of UK university based professional practice coordinators. Medical teacher, 29(1): 32–39. DOI: https://doi.org/10.1080/01421590600568512 

  82. Whitty, G and Anders, J. 2015. Closing the achievement gap: rhetoric or reality? In: Whitty, G, Anders, J, Hayton, A, Tang, S and Wisby, E (eds.), Research and Policy in Education, 74–88. London: University College London, Institute of Education Press. 

  83. Williams, D. 2011. What is assessment for learning? Studies in Educational Evaluation, 37: 3–14. DOI: https://doi.org/10.1016/j.stueduc.2011.03.001 

  84. Zins, JE, Bloodworth, MR, Weissberg, RP and Walberg, HJ. 2007. The scientific base linking social and emotional learning to school success. Journal of Educational and Psychological Consultation, 17(2–3): 191–210. DOI: https://doi.org/10.1080/10474410701413145 
