Indian Journal of Radiology



 

PARACLINICAL
Year : 2010  |  Volume : 20  |  Issue : 2  |  Page : 83-88
Objective structured clinical examination in radiology


1 National Board of Examinations, NAMS Building Ansari Nagar, New Delhi 110 029, India
2 Department of Radiology, KEM Hospital, Parel, Mumbai 400 012, India
3 Department of Radiology and Imaging, University College of Medical Sciences (Delhi University) and associated GTB hospital, Delhi 110 095, India
4 Department of Radiology and Imaging Sciences, Apollo Hospitals, Chennai 600 006, India
5 Department of Radiodiagnosis, Command Hospital (Air Force), Bangalore 560 007, India


Date of Web Publication: 6-May-2010
 

   Abstract 

There is a growing need for introducing objective structured clinical examination (OSCE) as a part of radiology practical examinations in India. OSCE is an established, reliable, and effective multistation test for the assessment of practical professional skills in an objective and a transparent manner. In India, it has been successfully initiated and implemented in specialties like pediatrics, ophthalmology, and otolaryngology. Each OSCE station needs to have a pre-agreed "key-list" that contains a list of objective steps prepared for uniformly assessing the tasks given to students. Broadly, OSCE stations are classified as "manned" or "unmanned" stations. These may include procedure, pictorial, or theory stations with clinically oriented content. This article is one of a series of measures to initiate OSCE in radiology; it analyzes the attributes of OSCE stations and outlines the steps for implementing OSCE. Furthermore, important issues like the advantages of OSCE, its limitations, a strengths, weaknesses, opportunities, and threats (SWOT) analysis, and the timing of introduction of OSCE in radiology are also covered. The OSCE format in radiology and its stations need to be validated, certified, and finalized before use in examinations. This will need active participation and contribution from the academic radiology fraternity and inputs from faculty members of leading teaching institutions. Many workshops/meetings need to be conducted. Indeed, these collaborative measures will effectively sensitize universities, examiners, organizers, faculty, and students across India to OSCE and help successfully usher in this new format in radiology practical examinations.

Keywords: Assessment; radiology; OSCE

How to cite this article:
Agarwal A, Batra B, Sood A K, Ramakantan R, Bhargava SK, Chidambaranathan N, Indrajit I K. Objective structured clinical examination in radiology. Indian J Radiol Imaging 2010;20:83-8


Introduction


There is a growing need for introducing objective structured clinical examination (OSCE) as a part of practical examinations for awarding professional degrees and diplomas in radiology in India. OSCE is a method of practical assessment. It is an established, reliable, and effective multistation test for the assessment of practical professional skills. Introduced by Harden et al. in 1975, [1] it has found increasing acceptance in various medical disciplines, chiefly due to its emphasis on objective assessment of students rather than subjective assessment. It has been successfully implemented by examination bodies in specialties like pediatrics, ophthalmology, and otolaryngology in India. OSCE is versatile in that "it can be and has been used for many levels of education, including undergraduate, postgraduate, continuing education, and licensure and certifying exams." [2]

This review article seeks to demystify OSCE. It analyzes the attributes that make OSCE stations effective and outlines the list of measures for successful implementation of OSCE. The advantages of OSCE, its limitations (with suggestions for addressing the problems), and the timing of introduction of OSCE in radiology are covered in this article; a SWOT (strengths, weaknesses, opportunities, and threats) analysis of OSCE is also presented.


Why is There a Need for OSCE?


The existing model of radiology practical examinations in India comprises spots, long cases, short cases, and table vivas. Although this "time-honoured" assessment technique has long been employed, a few problems are intrinsic to this method. These have been analyzed in detail in many fora as well as in this journal earlier. [3],[4],[5] To recapitulate, the list of deficiencies includes an overall lack of objectivity, wide variability in assessing the skills of different students, and the inability to test the communication skills of students. The traditional format offers little useful feedback, allows favoritism to creep in, and does not permit standardization of the expertise of the different examiners in their role as "evaluators." The traditional system also has limitations in predicting the future performance of radiology students. [6] Often, the ability to examine a patient, diagnose a disorder, and report it professionally is not analyzed objectively by examiners. To overcome these shortcomings, there is a need for introducing OSCE.


Core Features of OSCE


The OSCE, comprising 10-20 stations, is a method of practical assessment. Depending on the specialty, the number of manned stations may vary from 4 to 10. In general, "each station presents part of a case or problem using simulated/standardized patients, slides, audio tapes, photographs, or laboratory reports, and requires examinees to perform a specific procedure, solve a problem, or record requested findings." [2],[7] In radiology, each station would focus on specific topics sourced from the prescribed curriculum.

Completion of a task within a single station involves one of the following: demonstrating a task to an examiner, providing verbal answers, or writing specific objective answers on a response sheet. Typically, a student spends 5 min at a station. Students rotate through the stations concurrently in a fixed direction, which in practice means that approximately 10 students can be assessed in a period of 2 h.
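To make this arithmetic concrete, here is a minimal back-of-the-envelope sketch of the cohort-based rotation; the station count, minutes per station, and allowance for briefing and changeover below are illustrative assumptions, not prescribed values:

```python
# Back-of-the-envelope OSCE scheduling sketch (illustrative assumptions only):
# one cohort is one student per station, and all students rotate simultaneously
# in a fixed direction until each has visited every station once.

def cohort_circuit(n_stations: int, minutes_per_station: int) -> tuple[int, int]:
    """Return (students per cohort, minutes needed for one full circuit)."""
    students_per_cohort = n_stations                      # one student occupies each station
    circuit_minutes = n_stations * minutes_per_station    # time for everyone to finish
    return students_per_cohort, circuit_minutes

# Example: 10 stations at 5 min each -> a cohort of 10 students finishes in 50 min.
# With briefing, changeovers, and buffer time, this fits within a 2-hour session,
# consistent with the figure of roughly 10 students assessed in 2 h.
print(cohort_circuit(10, 5))  # -> (10, 50)
```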


Types of OSCE Stations


OSCE stations are classified as "manned" or "unmanned" stations depending on the presence or absence of an examiner at the station. Of the total stations, no more than four OSCE stations can be manned.

Manned stations are often procedure stations. In a procedure station, the student performs a basic radiographic task, such as loading a film (radiography OSCE station), or demonstrates an examination technique, such as carrying out a USG examination of the abdomen in a patient (USG OSCE station). The performance of the examinee at any given station is judged against a standardized and pre-agreed "key-list" of steps, which serves as a guide to the examiner at the time of student assessment. These written steps are prepared prior to the exams and commence with patient-care-related inquiries and ethical questions, e.g.: Does the student speak to the patient on entering the station? Does the student explain the USG procedure to the patient? Does the student seek the patient's permission before commencing the examination? Does the student provide a screen to maintain privacy during the examination? [5],[8]

In an unmanned station the student analyzes films and images, which indeed are the foundation of radiology as a specialty, or answers "context-specific" theory questions. Thus, in a pictorial station, the student analyzes a given film (radiographs, USG, CT scan, DSA, or MRI) and objectively answers specific questions. Likewise, in a theory station the student answers specific questions, with either paragraph answers or objective answers ("a," "b," "c," "d," "all of the above," or "none of the above").

Each station by itself is a "context-sensitive" objective assessment of an important topic from the radiology curriculum. Expectedly, this leads to the creation of different stations, with the overall emphasis being on conventional radiology, as is the case in exams conducted in the USA and the UK. Foremost among the OSCE stations in radiology are the basic OSCE stations, which would include conventional radiography and physics, CT scan, MRI, USG, and interventional radiology. Advanced OSCE stations may be subspecialty-based, e.g., chest radiology, musculoskeletal radiology, uroradiology, neuroradiology, etc. A station may also combine all the modalities relevant to a given case. OSCE stations can be further evolved to create specialized OSCE stations dealing with emergency radiology, [9] prenatal sex determination test (PNDT), communication skills, reporting skills, ethical issues, and writing skills for article/dissertation publishing, all of which need consensus, validation, certification, and finalization.


Measures Facilitating OSCE Implementation


OSCE stations should be designed to comprehensively test the professional skill of a student. [10] A brief outline of the parameters that support the successful implementation of OSCE is given in [Table 1].

Broadly, there are two issues that need to be addressed. First and foremost is the design of the professional content of radiology OSCE stations. [11] An important intention in designing OSCE stations for radiology is that the assessment should be clinically oriented. The student should identify the abnormality/abnormalities, methodically correlate the data, and extract the clinical content of a given radiological study to make a meaningful contribution toward the correct management of a given case. The selection of objective tasks at each OSCE station should assess a student's understanding of concepts, observation, interpretation, and reporting skills in radiology. The language used at the station should be simple, clear, and easy to understand. When designing a station, adequate time should be factored in for completing the task at that station.

Importantly, a pre-agreed "key-list" is created by examiners for each station. It consists of an outline or a list of objective steps, which ensures uniformity in the assessment of students. At each station, the marks awarded are objective and carry a relative weightage factor. How closely a student's response matches the examiner-prepared, pre-agreed "key-list" determines the performance at any given station.
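As a purely illustrative sketch of how such a weighted key-list could be scored (the checklist items and weights are assumptions loosely modeled on the USG example above, not an agreed marking scheme):

```python
# Hypothetical key-list for a manned USG station: each step carries a relative
# weightage; the examiner ticks the steps the candidate actually performs.

KEY_LIST = [
    # (step description, weight)
    ("Greets and speaks to the patient on entering the station", 1.0),
    ("Explains the USG procedure to the patient",                1.0),
    ("Seeks the patient's permission before starting",           1.5),
    ("Provides a screen to maintain privacy",                    1.0),
    ("Applies gel and positions the probe correctly",            2.0),
    ("Systematically surveys the abdominal organs",              2.5),
    ("Summarizes findings clearly at the end",                   1.0),
]

def score_station(observed_steps: set) -> float:
    """Return the percentage score: weighted fraction of key-list steps performed."""
    total = sum(weight for _, weight in KEY_LIST)
    earned = sum(weight for step, weight in KEY_LIST if step in observed_steps)
    return round(100.0 * earned / total, 1)

# Example: the examiner observed only three of the listed steps.
observed = {
    "Greets and speaks to the patient on entering the station",
    "Explains the USG procedure to the patient",
    "Applies gel and positions the probe correctly",
}
print(score_station(observed))  # -> 40.0
```

Because every examiner marks against the same weighted list, two examiners observing the same performance arrive at the same score, which is the objectivity the key-list is meant to guarantee.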

The other important issue is the physical design of OSCE stations. Here attention must be given to details such as the layout and signposting, lighting and air-conditioning of rooms, the number of OSCE stations, the status of the view-boxes, stationery, etc.


Advantages of the OSCE Format


The OSCE format has numerous advantages. "OSCE is applicable to any area of medical education." [12] The entire examination is objective and promotes transparency. [13] A large number of students can be evaluated within a short time. It encourages increased interaction between the examiner and students. [14] It facilitates a convenient integration of teaching and evaluation. The variability posed by assessment of students on dissimilar patients, different cases, and assorted topics is drastically trimmed down. Similarly, the variability that exists among different examiners is bypassed when many students are assessed using a standardized format, which advantageously leads to objectivity in assessment. [8] OSCE has been successful in removing biases associated with traditional examination systems.

Once OSCE is introduced into the syllabus, practice OSCEs will inevitably follow. Students can use these practice OSCEs to identify stations where their performance is suboptimal and attempt to improve their skills in those areas. OSCE thus "identifies areas of weakness in the curriculum and/or teaching methods" and thereby improves educational effectiveness. [6] Morag et al. have shown that "OSCE may be useful to uncover deficits in individuals and groups beyond the ones detected with traditional clerkship evaluations and provides guidance for remediation." [6] Likewise, a study by Hamann et al. showed that in practice OSCE sessions, the "station scores identified specific content that needs improved teaching." [15] The literature further adds that "OSCE examinations are more likely to measure other qualities such as problem-solving abilities, critical thinking, and communication skills." [16]

With increased acceptance in the academic fora in India and the availability of interesting radiology cases in teaching institutions, there is ample scope for creating OSCE question-and-answer banks featuring a large storehouse of educational material. OSCE is able to provide truthful, honest, and genuine feedback to the candidates, clearly identifying their strengths and weaknesses.


Difficulties with the OSCE Format and Overcoming Them


OSCE's strength lies in its unique format, which enhances objectivity and transparency. However, a few problems are inevitably present. To begin with, many first-timers find OSCE to be a "strong anxiety-producing" experience. [17] It must be kept in mind, though, that this is equally true of the other existing formats, whether theory or practical exams.

A reported difficulty experienced by students is the "inadequacy of time" for expected tasks at some stations. [17] Where this is a genuine problem, examiners should design stations such that there is adequate time per station. Additionally, at the time of setting up the OSCE, the group of examiners can decide to do away with "time-wasting" or subjective questions. Reusing OSCE stations has its share of implications but the problems can be eventually overcome. [18] A SWOT analysis of OSCE is given in [Table 2].

For the organizers at exam centers, the examination may be costly [19] and effort-consuming. This is also true when designing and preparing any format of practical exams (spotters, MCQs, etc.), and it is encouraging to find that other specialties in India have overcome these initial difficulties; literature reports that "the new OSCE format posed no organisational problems" [20] and that "subsequent exams will not require the same degree of administrative load." [21]

OSCE in radiology requires more inputs from the faculty members of leading teaching institutions. A series of workshops/meetings needs to be organized so that the OSCE can be validated, certified, and finalized before it is used for examination purposes. This entails multiple and repeated brainstorming sessions with the active participation and contribution of the faculty members. It may be time-consuming and, most importantly, will call for determination and zeal on the part of the faculty members to switch over from the traditional method of examinations to the more rational, objective, and methodical OSCE.

To overcome these difficulties, there is a compelling need for all educators in radiology to plan a series of inclusive, constructive steps. An expert body empanelled under the aegis of the Indian College of Radiology and Imaging (ICRI), the National Board of Examinations (NBE), leading universities, and reputed teaching institutions under the Medical Council of India (MCI) should commence designing OSCE tests in radiology. The question banks created for different OSCE stations should be implemented gradually over time. Universities and governing bodies should direct hospitals, teaching institutes, and centers across the country to compulsorily introduce and conduct OSCE over a period of a year. This will enable examiners, organizers, faculty, and students across India to become familiar with the OSCE format and will be the key to the initiation and successful implementation of OSCE in radiology.


A Few Special Issues in Designing OSCE Stations in Radiology


Communication and reporting skills are important aspects of radiology [22],[23] that are not assessed in traditional radiology practical examinations but are extremely important in clinical practice. It is well known that inadequate and faulty communication of findings may lead to preventable medical errors, with morbidity and mortality of up to 20%. [24],[25] OSCE provides an opportunity to assess these neglected skills by allowing the creation of stations specifically designed for this purpose. Similarly, OSCE "provides a means of assessing radiology resident reporting skills," yet another neglected area that is not comprehensively assessed in traditional exams. [24]

Radiology as a specialty deals with the acquisition and analysis of images. In recent times, digital images have become the mainstay of all modalities, including conventional radiography, leading to a new form of workflow. An "electronic OSCE" may create the opportunity to assess students using digital images with the help of computers and projectors [26] and possibly even the Web [27] or handheld devices. Although still in its infancy, an initial experience with "electronic OSCE" has demonstrated "user-friendliness, interactivity, and navigation facilities." [28] Further, "this form of assessment is considered to be cost-effective in terms of staff and equipment resources" and, besides, appears to be preferred by students. [28]
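As a purely illustrative sketch (the field names, file name, sample question, and scoring rule below are assumptions, not drawn from any published electronic OSCE system), one way an unmanned pictorial station could be represented for computer- or Web-based delivery and scored objectively:

```python
# Hypothetical data model for an "electronic OSCE" pictorial station:
# a digital image, one objective question, and automatic scoring of the response.

from dataclasses import dataclass, field

@dataclass
class ElectronicStation:
    station_id: str
    image_file: str                  # digital image displayed to the candidate
    question: str
    options: list                    # objective answer choices
    correct_index: int               # index of the keyed option
    time_limit_seconds: int = 300    # the typical 5 minutes per station

    def score(self, chosen_index: int) -> int:
        """Objective scoring: 1 mark for the keyed option, 0 otherwise."""
        return int(chosen_index == self.correct_index)

# Example usage with a hypothetical chest radiograph station.
station = ElectronicStation(
    station_id="CHEST-01",
    image_file="chest_pa_view.dcm",            # hypothetical file name
    question="What is the most likely diagnosis?",
    options=["Option a", "Option b", "Option c", "Option d"],
    correct_index=2,
)
print(station.score(2))  # -> 1
```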


Feasibility and Timing of Introduction of OSCE in Radiology


OSCE is now a part of practical examinations in many medical disciplines. In the prevailing radiology model, practical assessment in MD, DMRD, and DNB examinations variably comprises formats such as spots, long cases, short cases, table vivas, and MCQs. The introduction of OSCE as a new format will therefore either require doing away with one or more of the existing formats [29] or adding OSCE as a further format. It must be remembered that all exams face limitations of time and space. In view of this, replacing an existing format rather than adding OSCE may be more feasible. Given that spots and long cases are "time-tested" methodologies, it would perhaps be practical to replace one or both of the table vivas with OSCE.

The final model may be arrived at once different exam-conducting bodies gain experience and learn during the process of assessment evolution. Despite it being a tall order, OSCE needs to be implemented. Finally, it must be reiterated that the OSCE technique does not replace the existing model of practical examinations, but augments it effectively by being transparent and enhancing objectivity.


Summary and Conclusion


OSCE is an examination format that uses contextual tasks at multiple stations. It facilitates assessment of core competence and contemporary professional skills in several medical disciplines in an objective and a transparent manner. In India, it has been successfully initiated and implemented by examination bodies in specialties like pediatrics, ophthalmology, and otolaryngology.

As regards radiology, the time may have come for incorporating OSCE as a part of the practical examination. This will require several collaborative measures, such as (a) active participation and contribution from the academic radiology fraternity, (b) inputs from faculty members of leading teaching institutions, (c) organization of a series of workshops/meetings, and (d) sequencing the critical process of validation, certification, and finalization before OSCE is used in radiology examinations.

 
References

1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51.
2. The Objective Structured Clinical Examination (OSCE) Review Project. Available from: http://www.bemecollaboration.org/beme/files/OSCE%20REPORT3.pdf [last accessed on 2009 Nov 23].
3. Ramakantan R. Stop the spots. Indian J Radiol Imaging 1989;43:365-6.
4. Reddy KP. Presidential address. Indian J Radiol Imaging 2009;19:2-3.
5. Agarwal A, Batra B, Sood A. Evolutionary trends in radiology assessment: The importance of the learning cycle and its assessment in radiology. Indian J Radiol Imaging 2008;18:272-5.
6. Morag E, Lieberman G, Volkan K, Shaffer K, Novelline R, Lang EV. Clinical competence assessment in radiology: Introduction of an objective structured clinical examination in the medical school curriculum. Acad Radiol 2001;8:74-81.
7. Harden RM. What is an OSCE? Med Teach 1988;10:19-22.
8. OSCE notes in otolaryngology. Available from: http://www.oscenotesent.wikidot.com/ [last accessed on 2009 Nov 22].
9. Sisley AC, Johnson SB, Erickson W, Fortune JB. Use of an objective structured clinical examination (OSCE) for the assessment of physician performance in the ultrasound evaluation of trauma. J Trauma 1999;47:627-31.
10. Tombleson P, Fox RA, Dacre JA. Defining the content for the objective structured clinical examination component of the professional and linguistic assessments board examination: Development of a blueprint. Med Educ 2000;34:566-72.
11. Osceology. Available from: http://www.faculty.ksu.edu.sa/hamza/myfiles/OSCE.pdf [last accessed on 2009 Nov 23].
12. Altshuler L, Kachur E. A culture OSCE: Teaching residents to bridge different worlds. Acad Med 2001;76:514.
13. van den Berk IA, van de Ridder JM, van Schaik JP. Radiology as part of an objective structured clinical examination on clinical skills. Eur J Radiol 2008 Dec 15 (in press).
14. Amiel GE, Tann M, Krausz MM, Bitterman A, Cohen R. Increasing examiner involvement in an objective structured clinical examination by integrating a structured oral examination. Am J Surg 1997;173:546-9.
15. Hamann C, Volkan K, Fishman MB, Silvestri RC, Simon SR, Fletcher SW. How well do second-year students learn physical diagnosis? Observational study of an objective structured clinical examination (OSCE). BMC Med Educ 2002;2:1.
16. Dennehy PC, Susarla SM, Karimbux NY. Relationship between dental students' performance on standardized multiple-choice examinations and OSCEs. J Dent Educ 2008;72:585-92.
17. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ 2004;4:22.
18. Jolly B, Cohen R, Newble D, Rothman A. Possible effects of reusing OSCE stations. Acad Med 1996;71:1023-4.
19. Cusimano MD, Cohen R, Tucker W, Murnaghan J, Kodama R, Reznick R. A comparative analysis of the costs of administration of an OSCE (objective structured clinical examination). Acad Med 1994;69:571-6.
20. Taylor A, Rymer J. The new MRCOG objective structured clinical examination: The examiners' evaluation. J Obstet Gynaecol 2001;21:103-6.
21. Manogue M, Brown G. Developing and implementing an OSCE in dentistry. Eur J Dent Educ 1998;2:51-7.
22. Hall FM. Language of the radiology report: Primer for residents and wayward radiologists. AJR Am J Roentgenol 2000;175:1239-42.
23. Orrison WW, Nord TE, Kinard RE, Juhl JH. The language of certainty: Proper terminology for the ending of the radiologic report. AJR Am J Roentgenol 1985;145:1093-5.
24. Williamson KB, Steele JL, Gunderman RB, Wilkin TD, Tarver RD, Jackson VP, et al. Assessing radiology resident reporting skills. Radiology 2002;225:719-22.
25. Committee on Quality of Health Care in America, Institute of Medicine. To err is human: Building a safer health system. Kohn LT, Corrigan JM, Donaldson MS, editors. Washington, DC: National Academy Press; 1999.
26. Fliegel JE, Frohna JG, Mangrulkar RS. A computer-based OSCE station to measure competence in evidence-based medicine skills in medical students. Acad Med 2002;77:1157-8.
27. Finlay K, Norman GR, Keane DR, Stolberg H. A web-based test of residents' skills in diagnostic radiology. Can Assoc Radiol J 2006;57:106-16.
28. Palarm TW, Griffiths M, Phillips R. The design, implementation and evaluation of electronic objective structured clinical examinations in diagnostic imaging: An 'action research' strategy. J Diagn Radiography Imaging 2004;2:79-87.
29. Norcini JJ. The death of the long case? BMJ 2002;324:408-9.

Correspondence Address:
Anurag Agarwal
National Board of Examinations, NAMS Building Ansari Nagar, New Delhi 110 029
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/0971-3026.63040





 
 
    Tables

  [Table 1], [Table 2]
