Meaningful in-training and end-of-training assessment: The need for implementing a continuous workplace-based formative assessment system in our training programs

Objectives: To analyze the systems and tools involved in the assessment of skill acquisition and the demonstration of workplace skills in postgraduate medical training. Methods: This cross-sectional survey was carried out by enrolling trainee doctors currently working in Medical, Surgical, Dental and Allied specialties of the country and sending them a validated and piloted questionnaire through email. Data collection was done from 20th April to 20th May 2021. Data were analysed using SPSS v.21.0. Results: A total of 351 completed responses were received from 10 major cities of the country. Multiple aspects of entry-into-training, in-training and end-of-training evaluation showed poor correlation with the required training goals. A comparison of assessment for entry into supervised training (FCPS-I) versus independent practice (FCPS-II) showed a dismal situation regarding the assessment of affective skills like leadership, teamwork, coping with pressure and self-awareness. The concept of maintaining portfolios was completely alien to the trainees, and the assessment tools used for demonstrating workplace skills were outdated. The lack of a continuous, periodic and balanced assessment (65%); detailed feedback (61.5%); fair exams (59%); variability in the scoring system (58%) and in the professionalism of the examiners (57.5%) were the flaws in the assessment system most frequently perceived by the participants. Conclusion: There are multiple lacunae in the competency-based assessment systems of our training programs and massive scope for improvement. Assessment systems should be implemented as a continuous process of learning, self-reflection, feedback and revalidation at regular and multiple points throughout the training tenure.


INTRODUCTION
Postgraduate assessment for trainee doctors has evolved from the mere objective of a binary outcome (pass or fail) to the much wider concept of competency and an insight into clinical acumen in real-life scenarios.1 The Western world is in a continuous process of harmonizing postgraduate training, making it effective, robust and trainee/specialty oriented to enhance learning in competency-based programs.2 Testing factual knowledge alone, through orthodox examination systems that mainly involve long essay questions, viva voce or performance in a controlled environment at the end of the training program, has been seen to fail abysmally in assessing skills and behaviour.3 Unfortunately, our training programs lack objective and continuous systematic assessments for many of the specialties. Even when trainees are provided with workplace-based assessments (WBA), these do not count towards the final postgraduate examinations.4,5 This study was designed to analyze the systems involved in the assessment of skill acquisition and growth, and the tools used for demonstrating workplace skills, during training programs provided by institutions in Pakistan. The perceived flaws in the assessment systems of the training programs were also explored in detail.

METHODS
This cross-sectional survey was carried out by enrolling trainee doctors working in different setups of Pakistan through consecutive sampling, after acquiring ethical approval from the concerned department (IRB 90/Trg-ABP1K2 dated 20.04.2021). The survey was completed in one month, from 20th April to 20th May 2021, by sending a validated and piloted questionnaire through email to trainee doctors currently working in Medical, Surgical, Dental and Allied specialties. Trainees from basic medical sciences, non-trainee doctors and incomplete surveys were all excluded.
The questionnaire was developed by LA and MA after a thorough literature review4-8 and was reviewed by two medical education experts for content validity. The survey was piloted among 10 postgraduate residents before full deployment. The questionnaire encompassed questions regarding the in-training and end-of-training evaluation systems offered throughout the tenure of our training programs, with assessment of skill acquisition and growth for entry into supervised training (FCPS-I) versus independent practice (FCPS-II). The participants were also surveyed about the various tools used for workplace-based assessments (WBA) provided by their training institutes. The perceived flaws in the assessment system of the training programs were also explored in detail.
The sample size was calculated with the margin of error set at 5.5%, the confidence level at 95% and an anticipated frequency (response distribution) of 50%, using the OpenEpi sample size calculator. The questionnaire was sent through email, a reminder was given to participants after one week of no response, and candidates who failed to respond after another seven days were dropped.6 Qualitative data were expressed as frequencies and percentages. A p-value of <0.05 was considered statistically significant. All analysis was done using SPSS v.21.0.
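The reported parameters correspond to the standard formula for estimating a single proportion (Cochran's formula), which OpenEpi implements for large populations. As a minimal sketch (assuming no finite-population correction, which the actual OpenEpi settings may have applied):

```python
import math

def sample_size(z: float = 1.96, p: float = 0.5, margin: float = 0.055) -> int:
    """Cochran's sample size for a proportion in a large population.

    z      -- z-score for the confidence level (1.96 for 95%)
    p      -- anticipated frequency / response distribution (0.5 is maximal)
    margin -- absolute margin of error (5.5% here)
    """
    return math.ceil(z**2 * p * (1 - p) / margin**2)

print(sample_size())  # -> 318
```

A required sample of roughly 318 is consistent with the 351 completed responses analysed in the study.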

RESULTS
A total of 351 completed responses were received out of 800 questionnaires sent, a response rate of 44%. Male participants (76.6%), those aged 25-30 years (68%), those working in a private setup (39.6%) and those currently in their first fellowship (70%) made up the majority of the sample (Table-I). The distribution of specialties and region of training are shown in Fig.1.
Only 28.5% of the trainees reported the presence of periodic assessments provided by the training institute, with only 23% of these assessments counting towards the end-of-training evaluation (FCPS-II) (Table-II). There was a serious dearth of evaluation after compulsory rotations (35%), individual learning plans (ILP) (15%), assessment systems related to training goals (22%), assessment systems that help to identify trainees in need of assistance (22%) and prompting in the form of letters of recommendation/appreciation (22%). A significant number of trainees reported variability in the scoring system and in the professionalism of the examiners, a lack of post-assessment feedback, and a lack of training and revalidation for the assessors during end-of-training assessment.
A comparison of assessment for entry into supervised training (FCPS-I) versus independent practice (FCPS-II) in Fig.2 showed a dismal situation regarding the assessment of affective skills like leadership, teamwork, coping with pressure and self-awareness. The concept of maintaining portfolios was completely alien to the trainees, and the assessment tools used for demonstrating workplace skills were, unfortunately, outdated (Fig.3).
The lack of a continuous, periodic and balanced assessment (65%); detailed feedback (61.5%); fair exams (59%); variability in the scoring system (58%) and in the professionalism of the examiners (57.5%) were the most frequently selected perceived flaws in the assessment system of training programs (Fig.3).

DISCUSSION
A continuous, periodic and balanced WBA has notoriously been neglected in the postgraduate training programs of many Asian countries, where the evaluation system relies heavily upon end-of-training summative assessment.5 Moreover, assessing attitude has always been considered of lesser importance than assessing knowledge, cognitive skills and psychomotor skills in our education system. Traditional assessment systems usually underestimate the practicality and profound role affective skills play in re-humanizing the patient-doctor relationship.7 This study found that 45% of the trainees reported the non-availability of clear guidelines/blueprints for their training programs, although the trainee portal specifically designed by the College of Physicians and Surgeons of Pakistan (CPSP) has clear guidelines regarding the competencies and syllabi for all specialties.8 This reflects the careless attitude of the majority of our postgraduate trainees, who need to inculcate a culture of active engagement in their evaluations and revalidations.
This survey showed that 41% of the trainees believed their clinical supervisors had a "laid back" attitude towards their training needs, which emphasizes the need for dedicated mentors who are interested in teaching and in identifying trainees that need assistance. One way to accomplish this is to support trainees in progressing at their own pace by measuring progress towards the competencies required for their chosen career path.9 Entry into training is vigorously controlled and assessed in the West, with weighting given to additional postgraduate and undergraduate degrees, awards, teaching experience, publications, presentations/posters/conferences, audits and quality improvement projects, and properly compiled portfolios.10 Progression to independent practice is likewise carefully scrutinized against a set of learning outcomes called "capabilities in practice".10 In comparison, our assessment systems lack the basic essence of competency building.
Only 28.5% of the trainees in our study conveyed that WBA are arranged for them periodically by their institutions, with assessment tools that were unfortunately outdated. Those trainees who did report periodic assessments revealed that none of these assessment records contributed towards their final exam (FCPS-II), which is a highly concerning reality. This dismal situation was echoed by a similar study from India that reported evaluation of the trainees only on the day of the final examination, with none of the workplace-based assessment records contributing towards the final assessment.11 Portfolios are one of the essential tools for evaluating the progress of a trainee; they also help trainees assess themselves through reflection and keep them focused on their learning goals.12 Our trainees were unfamiliar with the concept of portfolios, had no access to reflective learning, and have long been entangled in a culture of spoon-feeding and apprenticeship. This cycle of self-pity and holding others responsible for personal failure and lack of goals needs to be terminated through the objectivisation of assessment programs.13 An interesting study by Andreassen et al. addressed the challenges and resistance towards implementing a properly planned assessment program among both trainers and trainees. The authors reported subtle tactics to sabotage periodic workplace-based formative assessment, such as contesting, avoiding and deprioritizing, arising either from a lack of knowledge or from the display of hierarchical and authoritative roles by seniors.14 Although this phenomenon was not studied in detail here, a significant number of trainees did report a lack of interest on the part of their supervisors towards their career goals and requirements.
This highlights the role of Educational Supervisors, specifically trained and hired for the sole purpose of evaluating trainees, giving feedback and mitigating the stigma around regular assessments.
Supervisors are required by the CPSP to submit their trainees' three-monthly progress reports,8 but unfortunately only 23% of the trainees reported actual implementation of the supervisor's crucial role in updating the College regarding their progress. The failure rate of the FCPS is higher than that of any comparable examination in the West. Lack of objectivity in assessment systems, personal attributes of the examiners, lack of structured training and the absence of revalidation for the teaching faculty were some of the plausible causes thoroughly reviewed by Farooq S.15 Similarly, an elaborate report comparing the MRCP exam performance of UK-trained Internal Medicine trainees with those trained outside the UK showed a marked disparity in pass rates (59% versus 32.6%). This emphasizes the need for rigorous standardization of training and assessment programs for international medical graduates (IMGs).16 There were some concerning points raised by the trainees, such as external influence on results, variability in the professionalism of the examiners and in the scoring system, and the lack of fair exams. These are serious allegations and need to be investigated sensitively and meticulously.
Our concern is with the actual implementation and use of these assessments towards the final evaluation, to make them a meaningful drive. Without any incentive, neither the residents nor the mentors would exert any effort towards real change.17 Career progression should not be considered a burden like summative exams. A culture of self-reflection and self-progression needs to be inculcated so that trainees engage wholeheartedly in their careers.18 There is a need for a shift of focus from a hierarchical mentality and apprenticeship towards a teaching culture that promotes competence-based learning and improves clinical abilities.

Limitations:
The study has limitations, including the consecutive sampling technique and the fact that participants were asked to forward the survey through email to maximize participation. Although the trainees' grievances were extensively sought in this study, the financial, clerical, time and expertise constraints that would need to be endorsed by the higher authorities for running and maintaining such a system were not studied.

CONCLUSION
There are multiple lacunae in the competency-based assessment systems of our training programs and massive scope for improvement. Assessment systems should be implemented as a continuous process of learning, self-reflection, feedback and revalidation at regular and multiple points throughout the training tenure. Training the trainers as well as the trainees, creating a safe environment, and using assessment systems that are evenly rewarded and contribute towards the final assessment, while acknowledging the limitations of the system, the training facility and the students, are the critical steps towards development.