THE COMPETENCE OF PLASTIC SURGEONS
Manturova NE1, Kochubey VV2, Kochubey AV3
1 Department of Plastic and Reconstructive Surgery, Cosmetology and Cell Technologies, Faculty of Continuing Professional Education, Pirogov Russian National Research Medical University, Moscow
2 Department of Faculty Surgery No. 1, Faculty of Medicine, A. I. Yevdokimov Moscow State University of Medicine and Dentistry, Moscow
3 Department of Public Health and Healthcare Management, Institute of Advanced Training of the Federal Medical-Biological Agency (FMBA), Moscow
The Russian system of continuing medical education does not guarantee the professional development of its participants: doctors do not report the number and range of operations they perform, they self-assess their competence and compile individual professional development plans on their own, and the professional community takes no part in these processes. There is therefore a need to accurately assess the competence of plastic surgeons and the objectivity of their self-assessment. We conducted a single-stage, in-person questionnaire survey of plastic surgeons. The questionnaire contained two sections. The first section offered a competence self-assessment table listing 9 plastic surgery modules; the participants rated their level on a 5-point scale, where 1 meant "no experience", 2 — "beginner", 3 — "knowledgeable", 4 — "specialist", 5 — "expert". The second section contained 9 closed, single-answer test tasks used to objectively assess the competence of the participants. Each correct answer added 1 point to the participant's score; wrong answers added nothing. 162 people took part in the survey. The average age of the participants was 31.5 ± 6.9 years; the average length of service was 4.0 ± 4.8 years. For the analysis we applied the Kolmogorov-Smirnov, Mann-Whitney and Kruskal-Wallis tests, Spearman's coefficient, one-way ANOVA, Levene's test and Duncan's test. Values were considered statistically significant at p < 0.05. The overall self-assessment score was 2.1 ± 0.92 points. We found a statistically significant (p < 0.001) correlation between length of service and self-assessment level (rs = 0.72). The average score for the second section, the tests, was 2.6 ± 1.76 points (out of 9). The correlation between the test score and length of service was not significant (rs = -0.08, p = 0.3), nor was its correlation with self-assessment (rs = -0.006, p = 0.9).
Keywords: competence of plastic surgeons, self-assessment of competence by plastic surgeons, objectivity of self-assessment of competence
Correspondence should be addressed: Valentin Vladimirovich Kochubey, Delegatskaya, 20/1, Moscow, 127473; [email protected]
Received: 24.04.2018 Accepted: 23.05.2018
DOI: 10.24075/brsmu.2018.023
Russian legislation obliges every medical doctor to continuously improve their professional expertise through participation in additional professional education programs [1]. One of the goals of the modernization of the national medical education system is the introduction of the continuing professional development principle [2]. For this purpose, a continuing medical and pharmaceutical education web portal was built; it is maintained by the Center for Scientific and Methodological Support of the Transition to the Continuous Medical and Pharmaceutical Education System at the Pirogov Russian National Research Medical University [3, 4]. The portal hosts continuing education programs and educational activities approved by the Commission for the Development of Continuing Medical and Pharmaceutical Education of the Ministry of Health of the Russian Federation [5], which guarantees the quality of the content provided within the CME (continuing medical education) system. The portal also allows practitioners to compile individual training plans [3], which means that a doctor's professional level depends on his or her own attitude, the personal desire to develop and master a wider range of professional skills and, most importantly, the objectivity of self-assessment.
Today, testing as part of periodic accreditation is the only tool for weeding out unqualified specialists; at the same time, it encourages professional growth [6]. But periodic accreditation takes place only once every 5 years; it is a checkpoint through which a doctor obtains permission to practice. During those 5 years the Russian continuing medical education system does not oblige practitioners to develop their professional skills, unlike similar systems abroad, which do [7-10]. The latter have proven the effectiveness of their mechanisms for annually assessing a doctor's professional progress, with such assessment comprising a portfolio of reports stating the number and range of operations performed, peer reviews and an individual training plan (developed jointly with the organization administering the system) listing mandatory subjects [11-14].
This study aimed to determine the level of competence of plastic surgeons in certain modules of plastic surgery, as well as the objectivity of their self-assessment.
MATERIALS AND METHODS
The study was based on a one-time survey of medical doctors holding plastic surgeon certificates (the inclusion criterion). The questionnaire contained two sections. The first section offered a competence self-assessment table listing 9 plastic surgery modules; the participants rated their level on a 5-point scale, where 1 point meant "no experience" and 5 points "expert". Each score tier for each module placed a doctor in one of five groups: 1 — no experience, 2 — beginner, 3 — knowledgeable, 4 — specialist, 5 — expert.
The mean of all points a participant assigned across the modules constituted his or her overall self-assessment level. This level placed the participant in one of three groups: mean from 0 to 2 — low self-assessment; mean from 2 to 4 — average self-assessment; mean from 4 to 5 — high self-assessment.
The second section contained 9 closed, single-answer test tasks used to objectively assess the competence of the participants. The tasks were taken from a collection of tests used at the final certification exam for plastic surgeons [15]. Each correct answer added 1 point to the participant's score, while wrong answers added nothing. The total number of points constituted the overall assessment level of each participant; the values ranged from 0 to 9.
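For clarity, the scoring described above can be expressed as a short sketch (illustrative only, not the authors' code; the module keys, function names and the treatment of boundary values at exactly 2 and 4 points are our assumptions):

```python
from statistics import mean

MODULES = ["otoplasty", "rhinoplasty", "blepharoplasty", "cheiloplasty",
           "mammaplasty", "urogenital", "cutaneous", "craniofacial",
           "arm_and_hand"]

def self_assessment(scores):
    """Mean of the nine 1-5 self-ratings, mapped to a low/average/high group."""
    overall = mean(scores[m] for m in MODULES)
    if overall < 2:            # assumed: a boundary value falls into the higher group
        group = "low"
    elif overall < 4:
        group = "average"
    else:
        group = "high"
    return overall, group

def test_score(correct):
    """One point per correct answer, nothing for wrong ones (range 0-9)."""
    return sum(1 for m in MODULES if correct.get(m, False))

# Example respondent: rated 2 ("beginner") everywhere, answered 3 tasks correctly
ratings = {m: 2 for m in MODULES}
answers = {"otoplasty": True, "rhinoplasty": True, "cutaneous": True}
print(self_assessment(ratings), test_score(answers))  # (2.0, 'average') 3
```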
162 people took part in the survey. The survey was anonymous. Respondents indicated their age, sex and length of service in the field of plastic surgery.
The average age of the participants was 31.5 ± 6.9 years; the average length of service was 4.0 ± 4.8 years. There were 63 women (mean age 32.1 ± 8.6 years, average length of service 4.5 ± 0.8 years) and 99 men (mean age 31.2 ± 5.5 years, average length of service 3.7 ± 0.4 years). Length of service defined 4 groups of participants: 0 years of experience — 14.2% of respondents, 1-5 years — 58.0%, 6-10 years — 20.4%, over 10 years — 7.4%. The distributions of age, length of service and self-assessment answers were non-uniform (p < 0.001).
The Kolmogorov-Smirnov test was applied to assess the uniformity of distributions; deviations at p < 0.05 were considered statistically significant. Differences between two groups were assessed with the Mann-Whitney test (U); differences among multiple groups with one-way ANOVA, with simultaneous assessment of the equality of variances (Levene's test), and with the Kruskal-Wallis test (chi-square). Duncan's test was used to isolate homogeneous groups. Spearman's coefficient (rs) was used to measure correlations between attributes. Criteria and coefficients were considered statistically significant at p < 0.05. Means, standard deviations and percentages were calculated.
IBM SPSS Statistics software (version 23) was used for statistical processing of the data.
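As an illustration, the same battery of tests is available in open-source tools; the sketch below shows SciPy equivalents of the SPSS procedures named above (the file name, the column names and the DataFrame layout are assumptions; Duncan's multiple range test is omitted because SciPy has no direct counterpart):

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("survey.csv")  # hypothetical layout: one row per respondent

# Kolmogorov-Smirnov: compare test scores with a fitted normal distribution
ks = stats.kstest(df["test_score"], "norm",
                  args=(df["test_score"].mean(), df["test_score"].std()))

# Mann-Whitney U for two independent groups (here: men vs women)
men = df.loc[df["sex"] == "m", "test_score"]
women = df.loc[df["sex"] == "f", "test_score"]
mw = stats.mannwhitneyu(men, women, alternative="two-sided")

# Multiple groups: one-way ANOVA with Levene's check of equal variances,
# plus the rank-based Kruskal-Wallis test (four length-of-service groups)
groups = [g["test_score"].values for _, g in df.groupby("service_group")]
anova = stats.f_oneway(*groups)
lev = stats.levene(*groups)
kw = stats.kruskal(*groups)

# Spearman's rank correlation, e.g. self-assessment vs length of service
rs, p = stats.spearmanr(df["self_assessment"], df["service_years"])

print(ks, mw, anova, lev, kw, (rs, p), sep="\n")
```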
RESULTS
The most popular competence level for all plastic surgery modules was "no experience" (Table 1).
Table 1. Competence level of plastic surgeons: self-assessment (share of respondents per level)

Module | No experience | Beginner | Knowledgeable | Specialist | Expert
Otoplasty | 37.0% | 27.8% | 16.7% | 18.5% | –
Rhinoplasty | 38.9% | 27.8% | 18.5% | 14.8% | –
Blepharoplasty | 40.7% | 22.2% | 14.8% | 13.0% | 9.3%
Cheiloplasty | 40.7% | 33.3% | 13.0% | 7.4% | 5.6%
Mammaplasty | 33.3% | 29.6% | 13.0% | 14.8% | 9.3%
Urogenital plastic surgery | 68.5% | 20.4% | 5.6% | 1.9% | 3.7%
Cutaneous plastic surgery | 31.5% | 25.9% | 20.4% | 16.7% | 5.6%
Craniofacial plastic surgery | 50.0% | 29.6% | 9.3% | 5.6% | 5.6%
Arm and hand plastic surgery | 57.4% | 24.1% | 5.6% | 11.1% | 1.9%
The higher the competence level, the fewer respondents picked it, across all modules, the only exception being arm and hand surgery, where the share of self-assessed "specialists" was greater than that of the "knowledgeable". The lowest competence levels were reported for arm and hand surgery, urogenital plastic surgery and craniofacial plastic surgery.
The overall self-assessment score was 2.1 ± 0.92 points. For the "otoplasty" module the figure was 2.2 ± 1.12 points; for "rhinoplasty" — 2.1 ± 1.08; "blepharoplasty" — 2.3 ± 1.36; "cheiloplasty" — 2.0 ± 1.16; "mammaplasty" — 2.4 ± 1.32; "urogenital plastic surgery" — 1.5 ± 0.96; "cutaneous plastic surgery" — 2.4 ± 1.24; "craniofacial plastic surgery" — 1.9 ± 1.14; "arm and hand surgery" — 1.8 ± 1.09.
Analysis of the self-assessment scores against length of service (0 years, 1-5 years, 6-10 years, over 10 years) showed that self-assessed competence differs across the service groups in all modules (chi-square 66.9; p < 0.001). The average self-assessment score grows with the length of service (Table 2).
These results, and the fact that the longer the service the more participants consider themselves "experts" and "specialists", led us to look into the relationship between experience and self-assessment. We found a statistically significant (p < 0.001) correlation between length of service and both the overall (rs = 0.72) and module-specific self-assessment levels: "otoplasty" rs = 0.64, "rhinoplasty" rs = 0.52, "blepharoplasty" rs = 0.57, "cheiloplasty" rs = 0.66, "mammaplasty" rs = 0.72, "urogenital plastic surgery" rs = 0.35, "cutaneous plastic surgery" rs = 0.62, "craniofacial plastic surgery" rs = 0.46, "arm and hand surgery" rs = 0.42.
As for the second part of the study, the tests: 7.4% of participants gave no correct answers, 22.8% gave one correct answer, 20.4% — two correct answers, 17.9% — three correct answers, 20.4% — five correct answers, 3.7% — six correct answers, 1.9% — nine correct answers. By module, the share of respondents answering correctly was as follows: "otoplasty" — 53.1%; "rhinoplasty" — 46.9%; "blepharoplasty" — 21%; "cheiloplasty" — 20.4%; "mammaplasty" — 20.4%; "urogenital plastic surgery" — 24.1%; "cutaneous plastic surgery" — 34%; "craniofacial plastic surgery" — 32.1%; "arm and hand surgery" — 13%.
The overall average score was 2.6 ± 1.76 points. No correlation was found between the overall test score and length of service (rs = -0.08, p = 0.3). At the module level, no correlation between length of service and the number of correct answers was established for "otoplasty" (rs = -0.03, p = 0.7), "blepharoplasty" (rs = -0.05, p = 0.5), "urogenital plastic surgery" (rs = -0.09, p = 0.2) and "cutaneous plastic surgery" (rs = -0.05, p = 0.5). A statistically significant but inverse correlation was found between length of service and the number of correct answers for "mammaplasty" (rs = -0.2, p = 0.01) and "craniofacial plastic surgery" (rs = -0.05, p = 0.01). "Rhinoplasty" and "cheiloplasty" were the only modules where experience was directly related to the number of correct answers: rs = 0.27, p = 0.001 and rs = 0.19, p = 0.02, respectively; this correlation, although statistically significant, was quite weak.
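The module-level correlations reported above can be reproduced along the same lines (a sketch under assumed column names, where correct_<module> holds a 0/1 flag for each respondent's answer):

```python
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("survey.csv")  # hypothetical file, as in the methods sketch
modules = ["otoplasty", "rhinoplasty", "blepharoplasty", "cheiloplasty",
           "mammaplasty", "urogenital", "cutaneous", "craniofacial",
           "arm_and_hand"]

for m in modules:
    # Spearman correlation between answering correctly and years of service
    rs, p = spearmanr(df[f"correct_{m}"], df["service_years"])
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{m}: rs = {rs:.2f}, p = {p:.3f} ({verdict})")
```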
The distribution of correct answers among the 4 length of service groups was uneven (chi-square 12.1, p = 0.007).
The "overall assessment level" in the three groups that were defined during the study (low, medium and high self-assessment levels) showed no statistically significant differences: 2.6 ± 1.63, 2.8 ± 2.0, 1.5 ± 0.55, respectively, chi-square 3.3, p = 0.20. Table 2 shows collates the overall average assessment score by length of service and self-assessment level.
The correlation between the overall self-assessment level and the overall assessment level was not statistically significant (rs = -0.006, p = 0.9). In some modules, the distribution of correct answers across the 5 proficiency groups (no experience, beginner, knowledgeable, specialist, expert) was quite even: "otoplasty" (p = 0.36), "blepharoplasty" (p = 0.31), "mammaplasty" (p = 0.11), "urogenital plastic surgery" (p = 0.45). In the other modules, on the contrary, it was uneven: "rhinoplasty" (p = 0.0001), "cheiloplasty" (p = 0.015), "cutaneous plastic surgery" (p = 0.018), "craniofacial plastic surgery" (p = 0.002), "arm and hand plastic surgery" (p = 0.005).
Table 2. Overall assessment score by length of service and self-assessment level

Length of service | Self-assessment level | Overall average assessment score
0 years | low | 2.6 ± 1.64
0 years | total | 2.6 ± 1.64
1-5 years | low | 2.8 ± 1.61
1-5 years | medium | 2.1 ± 1.48
1-5 years | high | 1.0 ± 0.00
1-5 years | total | 2.6 ± 1.59
6-10 years | low | 2.0 ± 1.09
6-10 years | medium | 3.8 ± 2.24
6-10 years | total | 3.5 ± 2.18
Over 10 years | low | 0.0 ± 0.00
Over 10 years | medium | 1.7 ± 0.51
Over 10 years | high | 2.0 ± 0.00
Over 10 years | total | 1.3 ± 0.88
Total | low | 2.6 ± 1.63
Total | medium | 2.9 ± 2.01
Total | high | 1.5 ± 0.55
Total | total | 2.7 ± 1.76

Duncan's test allowed us to single out groups with a homogeneous distribution of correct answers. In "rhinoplasty", the correct answers were distributed similarly in the "no experience" (M = 0.25) and "specialist" (M = 0.29) proficiency groups (p = 0.74), as well as in the "knowledgeable" (M = 0.60) and "beginner" (M = 0.76) groups (p = 0.15). In "cheiloplasty" there were no statistically significant differences in the distribution of correct answers between "no experience" (M = 0.09) and "knowledgeable" (M = 0.14) (p = 0.1), or among "specialist" (M = 0.25), "beginner" (M = 0.33) and "expert" (M = 0.33) (p = 0.26). In cutaneous plastic surgery, homogeneity in the distribution of correct answers was registered in the "expert" (M = 0.00) and "beginner" (M = 0.21) groups (p = 0.12), as well as in the "no experience" (M = 0.35), "specialist" (M = 0.44) and "knowledgeable" (M = 0.48) groups (p = 0.07). In craniofacial plastic surgery we also detected two groups with pronounced homogeneity of correct answers, one comprising "knowledgeable" (M = 0.00), "expert" (M = 0.00) and "no experience" (M = 0.33) (p = 0.05), the other "beginner" (M = 0.40) and "specialist" (M = 0.67) (p = 0.05). In arm and hand surgery, the homogeneous groups were "beginner", "knowledgeable" and "expert" (M = 0.00 each) (p = 0.34), and "no experience" (M = 0.20) with "specialist" (M = 0.33) (p = 0.26).

Table 3. Competence level: results of the testing (average number of correct answers, M ± SD)

Module | "No experience" group | "Expert" group
Otoplasty | 0.47 ± 0.50 | –
Rhinoplasty | 0.29 ± 0.46 | –
Blepharoplasty | 0.18 ± 0.39 | 0.20 ± 0.41
Cheiloplasty | 0.09 ± 0.29 | 0.33 ± 0.50
Mammaplasty | 0.28 ± 0.45 | 0.00 ± 0.00
Urogenital plastic surgery | 0.26 ± 0.44 | 0.00 ± 0.00
Cutaneous plastic surgery | 0.35 ± 0.48 | 0.00 ± 0.00
Craniofacial plastic surgery | 0.33 ± 0.47 | 0.00 ± 0.00
Arm and hand plastic surgery | 0.16 ± 0.37 | 0.00 ± 0.00
As these data on the homogeneity of the distribution of correct answers show, self-assessment does not match the scores received through testing. Moreover, in most modules the respondents who claimed to have "no experience" gave more correct answers than those who placed themselves on the "expert" tier (Table 3).
In blepharoplasty the average number of correct answers was higher in the "expert" group, but the difference was not statistically significant (U = 486, p = 0.88). Only in cheiloplasty was the average number of correct answers given by "experts" significantly higher than in the "no experience" group (U = 225, p = 0.04).
DISCUSSION
Our survey has shown that plastic surgeons generally believe their level of competence to be quite low. This is especially true of such modules as urogenital plastic surgery (1.5 ± 0.96), arm and hand surgery (1.8 ± 1.09) and craniofacial plastic surgery (1.9 ± 1.14). Notably, according to a study we conducted earlier [16], these are the modules in which operations are performed least often.
However, in most cases plastic surgeons assess their own proficiency objectively: the objective assessment level is also low, with an average of 2.6 ± 1.76 points out of a possible 9.
We have established that practitioners tend to rate their proficiency higher as their length of service increases. Unfortunately, testing, the objective assessment of competence, does not confirm this relationship: the test score correlates neither with self-assessment (rs = -0.006, p = 0.9) nor with length of service (rs = -0.08, p = 0.3). Although a weak correlation between length of service and the number of correct answers was registered in rhinoplasty and cheiloplasty, the comparison of the "no experience" and "expert" groups and the calculation of the homogeneity of test results across the proficiency groups offer no proof that length of service directly and positively affects the level of competence. Even in cheiloplasty, where "experts" gave more correct answers than respondents in the "no experience" group (U = 225, p = 0.04), the homogeneity calculations grouped "beginners" together with "experts", and "no experience" with "knowledgeable". At the same time, overall and module-specific self-assessment show a statistically significant correlation with length of service (p < 0.001).
The fact that the participants with no or minimal practical experience did better in testing than surgeons who have been practicing for over 10 years may be explained by the former being fresh out of medical school. Practitioners whose length of service ranges between 6 and 10 years showed the highest average score, 3.8 ± 2.24 points, which likely reflects how active their practice is.
CONCLUSIONS
The present study has shown that the level of competence of plastic surgeons is low, that self-assessment does not match the results of objective testing, and that after 10 years of practice surgeons tend to regress. We therefore believe that our findings support the proposals to limit access to reconstructive plastic surgery practice by introducing dedicated training, and corroborate the view that a continuing medical education system that leaves doctors to draw up their training plans themselves is ineffective.
References
1. Federal'nyy zakon ot 21.11.2011 N 323-FZ «Ob osnovakh okhrany zdorov'ya grazhdan v Rossiyskoy Federatsii», stat'ya 73. Russian.
2. Prikaz Minzdrava Rossii ot 02.06.2016 N 334n «Ob utverzhdenii Polozheniya ob akkreditatsii spetsialistov». Russian.
3. Portal nepreryvnogo medicinskogo i farmacevticheskogo obrazovanija. Available at: https://edu.rosminzdrav.ru/o-portale/ (accessed 25 December 2017). Russian.
4. Portal nepreryvnogo medicinskogo obrazovaniya. Zamestitel' glavnogo vracha. 2016; 1: 81. Russian.
5. Prikaz Minzdrava Rossii ot 18.02.2013 N 82 «O Koordinacionnom sovete po razvitiyu nepreryvnogo medicinskogo i farmacevticheskogo obrazovaniya Ministerstva zdravoohraneniya Rossijskoj Federacii». Russian.
6. Petrova I. A. Akkreditatsiya meditsinskikh rabotnikov: pol'za i riski. Byulleten' Natsional'nogo nauchno-issledovatel'skogo instituta obshchestvennogo zdorov'ya imeni N.A. Semashko. 2015; (4-5): 180-6. Russian.
7. Kochubey VV. Nepreryvnoe medicinskoe obrazovanie plasticheskogo hirurga v Velikobritanii. Moskovskiy khirurgicheskiy zhurnal. 2017; 1: 56-9. Russian.
8. Continuing Professional Development (CPD). Available at: https://www.rcseng.ac.uk/careers-in-surgery/surgeons/practicing-as-a-surgeon/continuing-professional-development (accessed 25 December 2017).
9. Shojania KG, Silver I, Levinson W. Continuing medical education and quality improvement: a match made in heaven? Ann Intern Med. 2012 Feb 21; 156 (4): 305-8. DOI: 10.7326/0003-4819-156-4-201202210-00008.
10. Zolotor AJ, Randolph GD, Johnson JK, Wegner S, Edwards L, Powell C, et al. Effectiveness of a practice-based, multimodal quality improvement intervention for gastroenteritis within a Medicaid managed care network. Pediatrics. 2007; 120: e644-50. [PMID: 17766504].
11. Esposito P, Dal Canton A. Clinical audit, a valuable tool to improve quality of care. General methodology and applications in nephrology. World J Nephrol. 2014 Nov 6; 3 (4): 249-55. DOI: 10.5527/wjn.v3.i4.249.
12. GMC. Licence to practise withdrawal appeals. Available at: http://www.gmc-uk.org/doctors/14008.asp (accessed 25 December 2017).
13. Ericsson KA. Acquisition and Maintenance of Medical Expertise: A Perspective From the Expert-Performance Approach With Deliberate Practice. Acad Med. 2015; 90 (11): 1471-86. [PMID: 26375267].
14. Van de Wiel MW, Van den Bossche P. Deliberate practice in medicine: The motivation to engage in work-related learning and its contribution to expertise. Vocat Learn. 2013; 6 (1): 135-58.
15. Enoch S, Jagadeesan J, Jose RM, Chan WY, McGrouth G. Plastic Surgery Exam Questions and Answers: A Guide to the Plastic Surgery exit exam. 2nd ed. UK: Doctors Academy Publications, 2012. 498 p.
16. Manturova NE, Kochubey VV, Kochubei AV. Harakteristiki deyatel'nosti plasticheskih hirurgov. Vestnik RGMU. 2017; 6: 47-51. Russian.