Following the recommendations of the GMC, a Joint Royal
Colleges of Physicians Training Board (JRCPTB) Working
Group proposed a revised WPBA system to be piloted in 2012
and 2013.4 Trainees are encouraged to identify learner-directed
learning goals with their trainers before any SLE. Both trainees
and trainers should subsequently identify opportunities that
would facilitate the acquisition of these learning goals and
are suitable for SLEs. SLEs provide opportunities for trainees
and trainers to interact. Furthermore, SLEs are intended to promote
deeper learning through effective feedback and self-reflection.
Trainees and trainers should formulate action plans with
further learning goals following SLEs.1
The pilot SLEs continued to use the assessment methods
of mini-clinical evaluation exercise (mini-CEX), case-based
discussion (CbD) and acute care assessment tool (ACAT).
These methods were retained because they had previously
been demonstrated to be feasible, reliable and valid.5–10 The
intention of SLEs is to enhance learning through self-refl ection
and effective feedback.11 The scoring system, which is part of
the current WPBAs, was removed from the SLEs to promote
self-reflection and feedback. The anchor statements on the SLEs
have been retained to provide trainees with a clear indication
of their level of development. SLEs will not contribute
directly towards the decision process of the Annual Review of
Competence Progression (ARCP).
Here, we focus on the evaluation of the use of SLEs in
postgraduate medical education and explore lessons learnt from
the pilot of these assessments for learning.