Abstract

Objective(s). The aim of this study was to evaluate the status of electrocardiography (ECG) training in emergency medicine residency programs in Turkey, and the attitude of the program representatives towards standardization of such training.

Methods. This investigation was planned as a cross-sectional study. An 18-item questionnaire was distributed to directors of residency programs. Responses were analyzed using SPSS (v.16.0); comparisons were made with Fisher’s exact and Yates’ corrected chi-square tests.

Results. Thirty-nine program directors (out of 42) responded to the questionnaire. Twenty-eight of them stated they did not have a formal ECG training curriculum. The most preferred ECG education method was clinical education in the Emergency Department; the most common education resource was ECG textbooks; and the most common evaluation method was case scenarios. Only thirteen of the programs had an obligation to prove competency. The most common competency-assessment method was obtaining a passing grade based on an instructor’s observation. The majority of program directors were of the opinion that there should be a formal ECG teaching curriculum, and that a national ECG training program and national ECG database should be formed.

Conclusions. The majority of programs had neither a formal ECG interpretation curriculum nor an obligation to prove competency. As a result, their training methods, resources, and assessment tools were found to be subjective.

Key words: emergency medicine, electrocardiography, education

Introduction

Electrocardiography (ECG) is a commonly used diagnostic test in Emergency Departments (EDs), and has proven to be an important adjunct for emergency physicians (EPs) and emergency medicine residents (EMRs) who need to diagnose a condition and provide rapid treatment.

Despite its importance in diagnosis, directing treatment, and follow-up procedures, interpreting ECGs has limitations. Diagnoses based on structural and pathophysiological changes are made through inferences, and therefore there is potential for error. (1-6) Since ECGs provide snapshots of cardiac rhythms, intermittent rhythm anomalies can be missed.

Rates of clinically important or life-threatening errors in ECG interpretation by EMRs and EPs appear to be lower than overall rates of discrepancy in ECG interpretation. (7-9) Although cardiac markers with high sensitivity and specificity are now used in combination with ECGs for ED evaluation of chest-pain patients, 2% to 8% of patients with acute myocardial infarction (AMI) are still discharged inappropriately from EDs. (10-12) A proportional relationship has been found between ECG interpretation errors and mortality rates among patients with AMI. (13) AMI cases missed as a result of ECG misinterpretation account for 10% of all malpractice claims originating in the ED, and 20%-45% of the total value of monetary compensation awards. (14,15) Current guidelines place specific emphasis on obtaining and interpreting ECGs in all of their algorithms. (16,17)

There are no agreed-upon measures for the learning, evaluation, and retention of ECG interpretation skills, and those that do exist are not evidence-based. (3,18)

In this study, we aimed to:

a) Determine teaching methods, training resources and the perceived value of these methods among EMRs;

b) Determine the methods used to assess the competency of EMRs, and the perceived value of these methods among EMRs;

c) Determine the methods used for mandatory competency assessment of ECG interpretation, if any, and the sufficiency of such assessment methods;

d) Determine current attitudes of department chairs/chiefs on improving and standardizing ECG interpretation training across all active emergency medicine residency programs.

Participants and Methods

Design and Population

This investigation was planned as a cross-sectional study consisting of an 18-question survey. Forty-two emergency medicine residency programs were included in the study; eligible programs were those determined to be actively training residents, based on data obtained from the Ministry of Health and professional organizations involved in emergency medicine. A questionnaire was developed using a commercial web-based provider (www.surveymonkey.com). This web-based tool was used both to send out questionnaires and to collect responses via participants’ personal email addresses. Participants comprised department chairs (in university-based residency programs), program chiefs (in residency programs delivered at training and research hospitals operating under the Ministry of Health), or the supervisors of residency-training programs.

Approval from the Gazi University Ethics Committee was obtained prior to the study. Participants also signed informed consent forms.

Survey Content

The survey consisted of 18 questions addressing the following themes: the emergency medicine training program (3 questions); ECG interpretation training (3); ECG teaching methods and sources (4); methods for evaluating ECG interpretation competency (2); competency-assessment requirements and methods (3); EMRs’ interpretation skills and the length of ECG training (2); and one question about participants’ attitudes towards the standardization of ECG training.

Both open-ended and closed-ended questions were used in the survey. Closed-ended questions used dichotomous, scaled, ranked, grouped, and multiple-selection answer formats. A box titled “other” was included at the end of each question for comments, suggestions, and additional input related to the methods and sources. Questions that involved a subjective response were presented using a 5-point Likert scale.

Data Collection

A total of four initial and reminder messages were sent to the participants via email, and participants were also contacted by telephone between 1 June 2008 and 31 July 2008. The required institutional ethical-approval forms were collected from the participating sites using dedicated sections of the web-based survey form. All data collected through the SurveyMonkey website by the submission deadline were analyzed using commercial software.

Data Analysis

All data were analyzed using SPSS (version 16.0 for Windows). Descriptive statistics are presented as frequencies and percentages. Fisher’s exact test was used for 2 by 2 tables when at least one expected cell count was less than 5; Yates’ corrected chi-square test was applied for all other 2 by 2 tables because of the small sample size (N<40). A value of p<0.05 was considered statistically significant throughout the study.
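For readers who wish to reproduce this test-selection rule outside SPSS, the following minimal Python sketch (our illustration, not part of the original analysis; it assumes the scipy library is available, and the helper name analyze_2x2 is ours) applies Fisher’s exact test when any expected cell count is below 5 and Yates’ corrected chi-square test otherwise, using the counts from table 5 as a worked example.

# Minimal sketch of the 2 by 2 test-selection rule described in Data Analysis.
# Illustrative only; the original analysis was performed in SPSS v.16.0.
from scipy.stats import chi2_contingency, fisher_exact

def analyze_2x2(table):
    # Expected cell counts under independence (from the uncorrected chi-square output).
    _, _, _, expected = chi2_contingency(table, correction=False)
    if (expected < 5).any():
        # At least one expected count below 5: use Fisher's exact test.
        _, p = fisher_exact(table, alternative="two-sided")
        return "Fisher's exact test", p
    # Otherwise: chi-square with Yates' continuity correction (small sample, N < 40).
    _, p, _, _ = chi2_contingency(table, correction=True)
    return "Yates' corrected chi-square test", p

# Worked example with the table 5 counts (curriculum vs. use of formal examinations):
# [[10, 1], [11, 17]] selects the Yates-corrected test, with p close to the reported 0.011.
print(analyze_2x2([[10, 1], [11, 17]]))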

Results

Descriptive Statistical Findings

This research achieved 92.9% (39/42) participation among the eligible EDs in Turkey. The majority of participating EDs (69.2%; 27/39) were in state university hospitals, 25.6% (10/39) were in state teaching and research hospitals, and 5.1% (2/39) were in private university hospitals.

The majority of emergency medicine programs (71.8%; 28/39) did not have formal ECG interpretation training based on a curriculum planned and phased according to training year and subject. The remaining 11 programs (28.2%) had a formal ECG interpretation curriculum in place.

In departments that provided training (formal or spontaneous) in theoretical and/or practical ECG interpretation, the most commonly reported amount of training was 7-12 hours during the first year (33.3%; 13/39) and the second year (34.2%; 13/38). The most commonly reported amount fell to 1-6 hours per year during the third (40.7%; 11/27), fourth (47.8%; 11/23), and fifth (47.8%; 11/23) years of residency training.

We found that ECG interpretation training was offered only by EPs in 61.5% (24/39) of programs, only by cardiologists in 5.1% (2/39), and in a multidisciplinary manner by EPs and cardiologists and/or internal medicine specialists in 33.3% (13/39) of programs (tables 1, 2, 3).

Two-thirds of programs (66.7%; 26/39) required no proof of capability in ECG interpretation, and one-third (33.3%; 13/39) had an obligation to prove capability (table 4). The majority of programs (69.2%; 27/39) expressed the opinion that the residency training offered was adequate to achieve the required competency in ECG interpretation. Nevertheless, 89.7% (35/39) stated that every emergency medicine program should have a formal ECG training curriculum, 71.8% (28/39) believed that a national ECG training program should be established, and 79.5% (31/39) thought that a national ECG database should be developed. The idea of a national ECG competency examination was supported by 38.5% (15/39) of participants, and a national ECG competency stipulation by 41.0% (16/39); those opposed represented 28.2% (11/39) and 25.6% (10/39), respectively, while 33.3% (13/39) were undecided on each.

Statistical Findings

No statistically significant relationship was found between the presence or absence of a curriculum and the variables of hospital type, variety of specialists offering training, and program age (Fisher’s exact test, p>0.05).

A significant relationship was found between hospital type and the variety of specialists offering training (Fisher’s exact test, p=0.023): ECG training in university hospitals was mostly offered by a single specialty, whereas in training and research hospitals it was offered in a multidisciplinary manner.

Separate analyses showed no statistically significant relationship between the preferred ECG training methods and the variables of hospital type, variety of specialists offering training, program age, and the presence or absence of a curriculum (Fisher’s exact test, p>0.05).

Similarly, no statistically significant relationship was found between the preferred ECG training resources and the variables of hospital type, variety of specialists offering training, program age, and the presence or absence of a curriculum (Fisher’s exact test, p>0.05).

With the exception of two findings, separate analyses showed no statistically significant relationship between the preferred methods for evaluating ECG interpretation skills and the variables of hospital type, variety of specialists offering training, program age, and the presence or absence of a curriculum (Fisher’s exact test, p>0.05). Firstly, a significant relationship was found between the presence or absence of a curriculum and the use of formal examinations as one of the methods for evaluating ECG interpretation skills (Yates’ continuity correction test, p=0.011) (table 5).

Secondly, a significant relationship was found between the presence or absence of a curriculum and preference for the peer-evaluation method (Fisher’s exact test, p=0.033). Departments with a curriculum favored formal examinations and peer evaluation as evaluation methods.

No significant relationship was found between the variables of hospital type, variety of specialists offering training, or program age and the presence or absence of a requirement to demonstrate capability in interpreting ECGs (Fisher’s exact test, p>0.05).

A significant relationship was observed between the presence or absence of a curriculum and the presence or absence of a stipulation to demonstrate the ability to interpret ECGs (Fisher’s exact test, p<0.001). Departments without a curriculum generally did not have a stipulation to demonstrate capability in ECG interpretation (table 6).

Similarly, a statistically significant relationship was identified between the presence or absence of a curriculum and the departments’ own assessment of whether their ECG training was sufficient (Fisher’s exact test, p=0.009). According to this finding, all of the departments that had a curriculum regarded their own ECG interpretation training as adequate.

Discussion

We found discrepancies in ECG interpretation training among the residency programs. In a similar study, (19) the majority of the programs (64.4%; 56/87) had an ECG training curriculum that was formally planned and phased according to years and subjects. This contrasts with our study, which established that 71.8% of programs did not provide a planned curriculum for ECG interpretation training. In our study, 72.4% of university hospitals and 66.7% of departments with a program age of 10-14 years lacked a curriculum. This finding contradicts the general assumption that university-based and more experienced departments have planned ECG training curricula. However, it was noted during our study that a spontaneous ECG interpretation training process was also operational in programs that did not have a formal curriculum.

A previous study (19) reported similar conclusions, namely that the three most widely used ECG training methods in emergency services were bedside training, didactic lessons, and case-based lessons. In another study, (20) the two most widely used methods were lessons and small-group discussions. In that study, 95% of participants evaluated didactic lessons as being “valuable” or “very valuable”, and these were ranked among the three most widely used methods. In contrast, in the present study, only 56.4% of programs considered didactic lessons “valuable” or “very valuable” in terms of their contribution to resident training. This contrast, which suggests visible reservations about the most established classic training method, may reflect continued use of the same method despite those reservations, a lack of recognition of alternative methods, or an inability to apply them.

According to our study, the most widely used ECG training materials were leading ECG textbooks, academic members’ personal training dossiers, and internet documents. A similar study (20) reported differing results, in that the ECG training dossiers of academic members and emergency services were the most commonly used resources (91%) and were regarded as “very valuable” in terms of their contribution to resident training. Another study (19) indicated that the national ECG database was the most often-used resource.

According to our study, the most widely used and most valued method for evaluating ECG interpretation competency, in terms of its contribution to resident training, was the use of case scenarios. While a similar study (19) reported case scenarios to be the most commonly used method for evaluating ECG interpretation competency, another study (20) indicated that performance observation in the emergency service setting was the most commonly used and most valued method.

Departments with an established curriculum mostly preferred formal examinations for evaluating the ECG interpretation competency of residents. This preference may be due to the objectivity of formal examinations, which systematically test ECG knowledge and interpretation skills and convert the evaluation of capability into objective data.

Our study identified that the majority of departments (66.7%) had no stipulation for EMRs to demonstrate adequate ECG interpretation skills. In a similar study, (19) only 11.5% of programs had a stipulation to prove competency. Departments that require proof of competency most commonly accept a passing grade based on an instructor’s observation as proof of competency. This method is subjective and passive compared to the alternatives. Even so, it was the most preferred method, ranked first on the preference list, and was perceived as highly adequate for resident training. From the point of view of EPs, this method may be seen as the ‘fastest’ and the ‘least troublesome’. However, it would be meaningful to combine this method with others so that proof of competency can be expressed as concrete, scientific data understandable by everyone. Although more concrete and objective, obtaining a passing grade from formal examinations was the least preferred method, ranked last on the preference list, and its perceived adequacy for resident training was also ranked lowest.

The majority of program directors were of the opinion that each emergency medicine program should have a formal ECG teaching curriculum, and that a national ECG training program and a national ECG database should be formed. The idea of a national ECG competency examination with a national ECG competency stipulation was opposed by only a minority of programs. In a similar study, (19) the majority of emergency medicine programs opposed a national ECG competency examination and a national ECG competency stipulation.

There are several reasons for the difficulties encountered in planning ECG interpretation training and its application phases. Firstly, the absence of an ECG training curriculum in the majority of programs, the variable structure of informal ECG training processes, and the wide variations in methods and resources make it difficult to determine how, and according to which curriculum, ECG training should be delivered. The same factors make it harder to decide which measurements should be used for testing ECG interpretation competency.

Secondly, for interventional medical skills it is easier to demonstrate a relationship between an increasing number of procedures and favorable patient outcomes, whereas it appears more difficult to establish a potential cause-and-effect relationship for knowledge-based medical skills such as ECG interpretation.

Thirdly, while a large number of national and international guidelines exist for interventional medical skills, there are no clear, nationally or internationally agreed guidelines for knowledge-based skills such as ECG interpretation. These difficulties impede the standardization of ECG interpretation training within emergency medicine programs.

The reasonable expectation of ECG interpretation training during emergency medicine residency is adequacy in bedside ECG interpretation. The specialty of emergency medicine can be defined as the ability to demonstrate all required knowledge and skills at a level sufficient to provide all aspects of acute care without needing consultation. EMRs should therefore be trained in such a way that they are able to interpret an ECG accurately and rapidly without consultation. ECG interpretation skills that are developed independently of real working conditions, such as crowding, limited time windows, and urgent settings, are ineffective. Therefore, ECG interpretation training should provide, evaluate, and sustain the ability to interpret and manage ECGs in a real emergency-service setting.

The following are, in our view, meaningful for the science of emergency medicine: defining the subjective and objective impediments that obstruct the planning of ideal ECG interpretation training; identifying the most effective training models; forming an appropriate and sufficient curriculum; determining measures of adequacy; accelerating standardization efforts; and, if considered necessary, contributing to the development of national and international guidelines.

Limitations

Survey-based studies are prone to difficulties such as misunderstanding of the questions, biased answers, unintentionally leading questions, low response rates, and weak sampling quality (problems of representativeness). This study may also carry the risks inherent to survey-based research.

This survey was sent to emergency medicine program authorities primarily responsible for resident training. For this reason, it may not reflect potentially differing opinions of other EPs working in Emergency Departments.

Conclusion

We found only a limited number of studies evaluating ECG interpretation training within emergency medicine residency programs, both nationally and internationally. Our study evaluated the status of ECG interpretation training, including the delivery methods, resources, and assessment and evaluation instruments used by emergency medicine residency programs, as well as their attitudes towards standardization of ECG interpretation training. We found that a formal training curriculum and competency criteria were rare among emergency medicine residency programs and that, as a consequence, training methods, resources, and assessment tools were chosen rather subjectively. This study also showed that a spontaneous ECG interpretation training process operates in programs that do not have a formal curriculum or competency criteria. We believe that future studies should focus on objective and measurable indicators of competency in ECG interpretation skills, which are vital for emergency practitioners.

Table 1. Preferred electrocardiogram (ECG) training methods.

ECG Training Methods Percentage preferred% (n/T) Percentage perceived as “valuable” or “very valuable” (%)
Clinical training at the ED 100 (39/39) 97.4
Case-based lessons 97.4 (38/39) 94.8
Didactic lessons 97.4 (38/39) 56.4
Personal study 97.4 (38/39) 69.2
Cardiology rotation 92.3 (36/39) 63.9
Computer-based training 69.2 (27/39) 68.6
Other* 10.3 (4/39)

ED, emergency department.

*Other methods: ECG handouts prepared for residency training, attending intern classes, attending courses by professional organizations, attending symposia.

Table 2. Preferred electrocardiogram (ECG) training resources.

ECG Training Resources Percentage preferred% (n/T) Percentage perceived as “valuable” or “very valuable” (%)
Leading ECG textbooks 97.4 (38/39) 89.5
Training documents from faculty members 92.3 (36/39) 78.9
Internet resources 92.3 (36/39) 75.7
Training documents of the emergency medicine programs 66.7 (26/39) 81.3
Training programs offered by professional organizations 66.7 (26/39) 65.6

Table 3. Preferred methods for evaluating electrocardiogram (ECG) interpretation competency.

Preferred Evaluation Methods Percentage preferred% (n/T) Percentage perceived as “valuable” or “very valuable” (%)
Performance observation in clinical environment 94.9 (37/39) 92.3
Case scenarios 97.4 (38/39) 92.3
Observation during lectures 94.9 (37/39) 70.3
Informal tests* 74.4 (29/39) 80.0
Formal tests† 53.8 (21/39) 62.0
Residents’ self-evaluation 53.8 (21/39) 50.0
Interpretation of ECG with an instructor‡ 35.9 (14/39) 69.2
Residents’ evaluation of each other 35.9 (14/39) 41.3

*Informal tests: Informal bedside ECG interpretation tests

†Formal tests: Formal, planned, classical or multiple choice tests

‡Interpretation of a sufficient number of ECGs in the presence of an instructor

Table 4. Preferred methods for proving electrocardiogram (ECG) interpretation competency.

Preferred Proving Methods Percentage preferred% (n/T) Percentage perceived as “valuable” or “very valuable” (%)
Obtaining a passing grade based on an instructor’s observation 92.3 (12/13) 91.6
Passing grade based on informal tests 84.6 (11/13) 100.0
Interpretation of a sufficient number of ECGs in the presence of an instructor 76.9 (10/13) 90.9
Passing grade based on formal tests 61.5 (8/13) 37.5
Other: Follow-up on residents’ report cards 7.7 (1/13)

 

Table 5. Relationship between preference for formal examination method and the presence or absence of a curriculum [% (n)].

Formal Examination Method
Curriculum Used Not Used Total p value*
Curriculum Exists 90.9 (10) 9.1 (1) 100.0 (11) p=0.011
Curriculum Does Not Exist 39.3 (11) 60.7 (17) 100.0 (28)

*Yates’ continuity correction test, p=0.011

Table 6. Relationship between presence or absence of curriculum and stipulation to demonstrate competency in electrocardiogram (ECG) interpretation skills [% (n)].

Competency Stipulation
Curriculum Exists Does Not Exist Total p value*
Curriculum Exists 81.8 (9) 18.2 (2) 100.0 (11) p<0.001
Curriculum Does Not Exist 14.3 (4) 85.7 (24) 100.0 (28)

*Fisher’s exact test, p<0.001
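As a quick cross-check of the table 6 result, the published counts can be fed directly to Fisher’s exact test. The short Python sketch below (our illustration, assuming the scipy library and a two-sided test; it was not part of the original SPSS analysis) returns a p-value well below 0.001, consistent with the significant association reported above.

# Illustrative re-check of table 6 using the published counts [[9, 2], [4, 24]]
# (curriculum exists/does not exist vs. competency stipulation exists/does not exist).
from scipy.stats import fisher_exact

odds_ratio, p = fisher_exact([[9, 2], [4, 24]], alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.4f}")  # two-sided p is well below 0.001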

References

  1. Brady WJ, Whetstone D, Ghaemmaghami CA. The ECG and clinical decision making in the emergency department. In: Mattu A, Barish RA, Tabas JA, editors. Electrocardiography in emergency medicine. 1st ed. Dallas: ACEP Bookstore; 2007. p. 1-12.
  2. Hollander JE. Acute coronary syndromes: Acute myocardial infarction and unstable angina. In: Tintinalli JE, Stapczynski JS, Cline DM, editors. Emergency medicine: A comprehensive study guide. 8th ed. New York: McGraw-Hill; 2011. p. 370-4.
  3. Fisch C, Ryan TJ, Williams SV, Achord JL, Akhtar M, Crawford MH, et al. Clinical competence in electrocardiography. A statement for physicians from the ACP/ACC/AHA task force on clinical privileges in cardiology. J Am Coll Cardiol 1995;25:1465-9.
  4. Speake D. The first ECG has a low sensitivity for myocardial infarction in patients with chest pain. BestBets. Accessed 2010 Sep 15;
    Available at: http://www.bestbets.org/bets/bet.php?id=75 .
  5. Brady WJ, Perron AD, Martin ML, Beagle C, Aufderheide TP. Cause of ST segment abnormality in ED chest pain patients. Am J Emerg Med 2001;19:25-8.
  6. Benner JP, Borloz MP, Adams M, Brady WJ. Impact of the 12-lead electrocardiogram on ED evaluation and management. Am J Emerg Med 2007;25:942-8.
  7. Snoey ER, Housset B, Guyon P, Elhaddad S, Valty J, Hericord P. Analysis of emergency department interpretation of electrocardiograms. J Accid Emerg Med 1994;11:149-53.
  8. Rusnak RA, Stair TO, Hansen K, Fastow JS. Litigation against emergency physicians: common features in cases of missed myocardial infarction. Ann Emerg Med 1989;18:1029-34.
  9. Sever M, Karcıoğlu Ö, Aslan Ö, Sever F, Parlak İ, Ersel M. An analysis of accuracy and reliability of emergency department ECG interpretations. Turk J Emerg Med 2007;7:56-63.
  10. Eken C, Başarıcı İ, Eray O, Belgi A, Hakbilir O. Likelihood classification of patients presented with chest pain to the emergency department. Turk J Emerg Med 2006;6:41-8.
  11. Pope JH, Aufderheide TP, Ruthazer R, Woolard RH, Feldman JA, Beshansky JR, et al. Missed diagnoses of acute cardiac ischemia in the emergency department. N Engl J Med 2000;342:1163-70.
  12. Sharkey SW, Berger CR, Brunette DD, Henry TD. Impact of the electrocardiogram on the delivery of thrombolytic therapy for acute myocardial infarction. Am J Cardiol 1994;73:550-3.
  13. Masoudi FA, Magid DJ, Vinson DR, Tricomi AJ, Lyons EE, Crounse L, et al. Implications of the failure to identify high-risk electrocardiogram findings for the quality of care of patients with acute myocardial infarction. Results of the emergency department quality in myocardial infarction study. Circulation 2006;114:1565-71.
  14. Karcz A, Korn R, Burke MC, Caggiano R, Doyle MJ, Erdos MJ, et al. Malpractice claims against emergency physicians in Massachusetts: 1975-1993. Am J Emerg Med 1996;14:341-5.
  15. Karcz A, Holbrook J, Auerbach BS, Blau ML, Bulat PI, Davidson A, et al. Preventability of malpractice claims in emergency medicine: a closed claims study. Ann Emerg Med 1990;19:865-73.
  16. Hazinski MF, Samson R, Schexnayder S. Handbook of Emergency Cardiovascular Care for Healthcare Providers 2010. Dallas: American Heart Association; 2010.
  17. Initial management of acute coronary syndromes. In: European Resuscitation Council Guidelines for Resuscitation 2005. European Resuscitation Council Web site. Accessed 2010 Sep 20; Available at: https://www.erc.edu/index.php/docLibrary/en/viewDoc/down%3D6/ .
  18. Kadish AH, Buxton AE, Kennedy HL, Knight BP, Mason JW, Schuger CD, et al. ACC/AHA clinical competence statement on electrocardiography and ambulatory electrocardiography: A report of the ACC/AHA/ACP-ASIM task force on clinical competence (ACC/AHA committee to develop a clinical competence statement on electrocardiography and ambulatory electrocardiography) endorsed by the International Society for Holter and Noninvasive Electrocardiography. J Am Coll Cardiol 2001;38:2091-100.
  19. Ginde AA, Char DM. Emergency medicine residency training in electrocardiogram interpretation. Acad Emerg Med 2003;10:738-42.
  20. Pines JM, Perina DG, Brady WJ. Electrocardiogram interpretation training and competency assessment in emergency medicine residency programs. Acad Emerg Med 2004;11:982-4.

Betül Akbuğa Özel
Department of Emergency Medicine
Başkent University Faculty of Medicine, Ankara Hospital
Fevzi Çakmak Cad. 10. Sok. No:45
06490 Bahçelievler, Ankara, Turkey
Ahmet Demircan, Ayfer Keleş, Fikret Bildik
Department of Emergency Medicine
Gazi University Faculty of Medicine, Ankara Hospital
Beşevler, Ankara, Turkey
Deniz Özel
Department of Biostatistics and Medical Informatics
Akdeniz University Faculty of Medicine
Dumlupınar Boulevard 07058 Antalya, Turkey
Mehmet Ergin
Department of Emergency Medicine
Necmettin Erbakan University, Meram Faculty of Medicine
Akyokuş, Meram 42080 Konya, Turkey
Gül Pamukçu Günaydin
Department of Emergency Medicine
Atatürk Training and Research Hospital
Bilkent Road 3.km Çankaya, Ankara, Turkey
Corresponding author:
Betül Akbuğa Özel
Başkent University Faculty of Medicine, Ankara Hospital
Fevzi Çakmak Cad. 10. Sok. No:45
06490 Bahçelievler, Ankara, Turkey
Phone: +90 312 212 68 68
Fax: +90 312 223 73 33
GSM: +90 530 490 08 88
E-mail: bakbuga2000@yahoo.com

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License