
Language Assessment Quarterly, 8: 386–403, 2011. Copyright © Taylor & Francis Group, LLC. ISSN: 1543-4303 print / 1543-4311 online. DOI: 10.1080/15434303.2011.622017

COMMENTARY

The Politics of Aviation English Testing


J. Charles Alderson
Lancaster University

The International Civil Aviation Organization has developed a set of Language Proficiency Requirements (LPRs) and a Language Proficiency Rating Scale, which seeks to define proficiency in the language needed for aviation purposes at six different levels. Pilots, air traffic controllers and aeronautical station operators are required to achieve at least Level 4 on this scale (usually in English, the de facto language of international aviation) in order to be licensed to fly aircraft or control air traffic on international (cross-border) flights or to work in international operations. This article summarises a series of research studies into the implementation of the LPRs and speculates on the reasons for the current state of affairs, with particular emphasis on the macro- and micropolitics of individuals and organisations.

INTRODUCTION

The International Civil Aviation Organization (ICAO) is a specialised agency of the United Nations, which sets standards and regulations necessary for aviation safety, security, efficiency and regularity, as well as for aviation environmental protection (ICAO, n.d.). As part of its role in promoting the safe and orderly development of international civil aviation throughout the world, ICAO fostered the development of standard radiotelephony phraseology (known as RTF) to be used in communications between pilots and air traffic controllers in order to reduce the risk that a message will be misunderstood and to aid the communication process so that any error is quickly detected. International standards of phraseology are laid down in ICAO Annex 10 Volume II Chapter 5 and in ICAO Doc 9432, Manual of Radiotelephony. However, ambiguous or nonstandard phraseology is a frequent causal or contributory factor in aircraft accidents and incidents. In reaction to a series of fatal aircraft accidents in which language
Correspondence should be sent to J. Charles Alderson, Department of Linguistics and English Language, County College South, Lancaster University, Lancaster, LA1 4YL, United Kingdom. E-mail: c.alderson@lancaster.ac.uk


appeared to play a part, ICAO set up a working group, known as the PRICESG, which developed a set of Language Proficiency Requirements (LPRs) and a Language Proficiency Rating Scale, which seeks to define proficiency in the language needed for aviation purposes at six different levels. Pilots, air traffic controllers, and aeronautical station operators are required to achieve at least Level 4 on this scale (usually in English, the de facto language of international aviation) in order to be licensed to fly aircraft or control air traffic on international (cross-border) flights or to work in international operations. The LPRs were published in 2003, and the aviation industry was given 5 years to implement these standards. National civil aviation authorities were required by ICAO to post their implementation plans on the ICAO FSIX website. The initial deadline was March 5, 2008, but it quickly became clear that the majority of ICAO Member States were not compliant by that deadline, and so it was then extended to March 5, 2011 to allow operators more time to implement the LPRs. This article summarises a series of research studies into the implementation of the LPRs and speculates on the reasons for the current state of affairs, with particular emphasis on the macro- and micropolitics of individuals and organisations.

AVIATION COMMUNICATION

As Emery (2011, p. 8) pointed out,


Commercial aviation is a much-studied industry, and there is a volume of literature on the subject of air-ground communications produced in a variety of fields and disciplines, from aviation psychology to applied linguistics. Training materials for RT communications are numerous and readily accessible.

Indeed, there is a plethora of literature looking at a wide range of factors involved in aviation communication. But Emery went on to argue that "a better understanding of the language itself is urgently required" (p. 13). Nevertheless, there are a few such studies. Cookson (2009), Cushing (1994), Mell (1992), and Prinzo (1996) all studied pilot/controller communications and miscommunications with serious consequences. There is even a growing number of studies looking at English second language users' communication in aviation environments. Farris, Trofimovich, Segalowitz, and Gatbonton (2008) examined the implications for training and assessment of cognitive factors involved in air traffic communication in a second language. Kim and Elder (2009) reported on radiotelephony nonroutine discourse between native and nonnative speakers of English and the perceptions of Korean aviation personnel of the shared responsibilities of participants for miscommunications. Estival and Molesworth (2009) studied English second language pilots' radio communication in the general aviation environment and found that both they and English native speakers rank understanding other pilots as the most challenging task. However, to date there are even fewer studies of aviation English testing, and given the importance of the LPRs and their implementation in tests of aviation English, I turn shortly to an overview of two such studies. But first, a word on our methodology.


METHODOLOGY

Commenting on an earlier version of this article, one reviewer considered that it was "not academic", in that it consisted of anonymous quotations. Not identifying sources was considered to be unacceptable practice, and the individuals cited should at least have been identified as representative of particular groups of stakeholders. Information about methodology, instruments, subjects, hypotheses, and data analysis should have been included, and that lack alone was a reason for rejection. However, the traditional organisation of research reports that the (ironically anonymous!) reviewer demands does not easily apply to an article like this, which is reviewing a whole range of issues in the politics of aviation English testing. In addition, the issue of anonymity is not as easy to address as the reviewer assumes. First, it is normal practice in research to anonymise respondents. Second, in the case of research quoting sources on discussion lists, it is often impossible to know exactly what groups of stakeholders a respondent represents, other than the fact that they are probably, in this case, interested in aviation English teaching or testing (otherwise why would they sign up to such discussion lists?). Indeed, I addressed the issue of reporting such research in the final chapter of a volume I recently edited on The Politics of Language Education (Alderson, 2009b), where I discuss the difficulties that authors face in describing, discussing, understanding, and publishing accounts of the role of individuals and their agendas within organisations:
The problem is how to gather valid information not only because the method of data collection is likely to influence the data but also because the behaviours and attitudes being investigated may be subversive, controversial, never openly acknowledged, especially to a researcher. . . . The field of micropolitics of language education has barely developed a research methodology, and has certainly not yet developed an accepted way of reporting its findings. Therefore, there is much reliance on anecdote, personal observation and partial accounts. Once these are published, they can, of course, be challenged. But if they are never published, then not only do the accounts go unchallenged, so does the behaviour that it is argued underlies the accounts. (p. 235)

The methodology I report and use in this article includes questionnaires, individual letters, and e-mails to contacts in the aviation industry; quotations from unsolicited e-mails to me; postings on discussion lists; document and website information analysis; and reports of my own personal experience. Clearly some of the data reported are selective and possibly biased, and they can and should be challenged, as in any academic debate. However, I cannot and will not report what positions or institutions informants represent, because then they could easily be identified, and that would not conform to ethical guidelines.

STUDY 1 (2008)

The intention of the LPRs is to ensure that the language proficiency of pilots and air traffic controllers is sufficient to reduce miscommunication as much as possible and to allow pilots and controllers to recognize and solve potential miscommunication when it does occur (ICAO, 2010, Section 4.2.1) and that all speakers have sufficient language proficiency to handle non-routine


situations (ICAO, 2010, Section 4.2.2). Clearly the consequences of inadequate language proficiency on the part of pilots and air traffic controllers are potentially very serious. Equally, if the language proficiency tests used to license aviation personnel are unreliable or lacking in validity, there are potentially dangerous consequences. These are, after all, high-stakes tests. Therefore, the Lancaster Language Testing Research Group (LTRG) decided to search for evidence of the quality of aviation English tests. Alderson (2010) reported in detail on that investigation. After an initial search on the Internet in 2007 had revealed very little information on the validity and reliability of the few tests that were identified, the LTRG designed a questionnaire survey to elicit detailed information relevant to professional concerns about standards of good practice in language testing. Based upon the guidelines for good practice of the European Association for Language Testing and Assessment, the questionnaire went through a number of iterations before it was judged to be clear and comprehensive, while making minimal demands on respondents by containing largely closed response questions. The questionnaire was made available on the Internet through SurveyMonkey, and potential informants were contacted by e-mail or through discussion lists and invited to contribute to our survey; full details, including a copy of the questionnaire, are given in Alderson (2010). We contacted 74 individuals or institutions and received 22 relevant responses. From these responses, we identified 20 tests that seemed to be specifically oriented to aviation English; we were somewhat surprised to find that the International English Language Testing System was claimed to be in use for aviation purposes.
The answers to questions about test specifications, test design, and item writing were on the whole adequate, but some test developers did not seem to understand our use of terminology, although it was far from technical. There were reasons to suspect that rater reliability was problematic in several cases, and indeed several respondents did not reply to questions about reliability. Most claimed to trial their tests and to analyse the results, but only three made the results of the analyses available to the public or interested parties. Only five respondents provided details of validation studies or any further documentation to back up their responses and the claims made on websites. We concluded that "little or no confidence can be had in the meaningfulness, reliability and validity of several of the aviation language tests currently available" (Alderson, 2010, p. 63). Clearly this is an unsatisfactory situation, and more research is needed. Fortunately there are examples of good practice in aviation English test development, as attested by the few validation studies we received, and as was also revealed in a special issue of the Australian Review of Applied Linguistics, which appeared after our survey was complete. That special issue contains articles by Huhta (2009) on aviation testing in Finland; van Moere, Suzuki, Downey, and Cheng (2009) on the development of the Versant Aviation English test; and Read and Knoch (2009), who provide an overview of the state of aviation English testing in Australasia. All three articles make clear that good quality aviation English tests can be and have been developed and that professional standards can be and have been followed. It is, therefore, all the more to be regretted that many of the test developers from whom we received replies do not seem to be aware of, or to meet, such standards.
Alderson (2008) concluded that "we can have little confidence in the meaningfulness, reliability, and validity of several of the aviation language tests currently available for licensure" (p. 1). It was therefore recommended that the quality of language tests used in aviation be monitored to ensure they follow accepted professional standards for language tests and assessment procedures.


When the results of this research were circulated among interested parties, we received numerous comments from correspondents, of which the following are examples:
1. How relieved I was to hear about your research into aviation English testing. The ICAO option of leaving such an important issue as this to market forces leaves the clients for these tests open to any kind of commercial skullduggery.
2. In this highly regulated industry, I have been aware from the outset of the ICAO initiative of a continuing stream of unsupported claims being made in regard to various forms of statute, standards or accreditation for testing.
3. I am extremely concerned about safety and am worried about the way tests have been used as a means to avoid meaningful change and training. I also feel that some of the tests I have seen have not been constructed by persons with a deep enough understanding of the issues pilots face.
4. The point is that the LPRs haven't been given "teeth" and may never have any. . . . Everyone realizes that currently working pilots' and ATCs' jobs have to be protected. Therefore, the new LPRs really have to do only with personnel who will be licensed from now on. This is why I expect the "teeth" question to surface in the next few years (possibly sooner).
5. Some countries/organizations will knowingly choose crap tests. Seems to me that for your research to be complete, you should not be putting all the onus on the test providers, but shining a light on the people that select the test for their country/company, and ascertaining their motives. They, as well as the test providers, bear some responsibility. When you choose a gym to become a member of, you don't just look at the glossy marketing pamphlet, but you also do your own appraisal of whether the machines are safe before jumping on the running machine or lying under a bench press.
6. I can share with you my opinion that many pilots who have been certified at ICAO 4 by the X test are in my judgement closer to ICAO 2. Others here are of the same opinion, and an effort is being made to meet a higher standard, one which meets the real objective of improving safety.
7. The prevailing attitude (pressure from airlines, inter alia) is to meet minimum standards just barely: "if a grade of C is passing, then I passed, and that's that." So, when officialdom and the airlines etc. look at the LPR scales, what they see is "All I have to do is to figure out what the trick is to assure myself a C."

In short, it would appear that there are political and commercial issues at stake in aviation English testing, and that impression is confirmed in the sections that follow.

STUDY 2

A second set of studies (Alderson & Horák, 2008, 2010) investigated the plans for implementation of the LPRs requested of national civil aviation authorities (NCAAs) by ICAO. Because many NCAAs had failed to produce or accredit language tests by the March 2008 deadline, it was agreed that the 2008 deadline could be extended to March 2011. Contracting States that were not able to comply with the language proficiency requirement were urged
to post their language proficiency implementation plans including their interim measures to mitigate risk, as required, for pilots, air traffic controllers and aeronautical station operators involved in international operations on the ICAO website (http://www.icao.int/anb/s/lp/lpcompliance1.cfm) as outlined in accordance with the Associated Practices below and ICAO guidance material (http://www.icao.int/fsix/lp/docs/Guidelines.pdf). (Circular A36-11 in Appendix 7 of Alderson, 2008, p. 47)


The research was carried out in four rounds. Round 1 addressed an e-mail letter to the person named on the ICAO FSIX website as responsible for implementation of the language proficiency requirement in each national civil aviation authority. The letter requested the names of the tests that the authority had recognised or approved. The second round involved going back to the ICAO FSIX website just mentioned and conducting a detailed analysis of all LPR implementation plans posted on that website as of October 2008. The third round was a replication of the second round, completed in spring 2010. It should be noted that the third round indicated the state of readiness for the implementation of the ICAO requirements only 1 year before the deadline of March 2011. The fourth round monitored the FSIX website 3 months after the 37th General Assembly of Member States of ICAO (see the ICAO and Monitoring/Accreditation section) in September 2010, and again 3 months after the supposed March deadline had passed (i.e., in June 2011).

In Round 1 a mere 17 replies from NCAAs were received to the e-mail letter, which rather confirmed our suspicion that such national authorities do not feel compelled to respond to enquiries from the general public and appear not to be answerable to ICAO either for their actions. The tests that respondents claimed they were using are listed in Table 1. A total of only 10 different tests or assessment procedures was reported, seven of which were reported by a single respondent. The Mayflower College procedure (for pilots and controllers) was reportedly in use by seven NCAAs, and the ELPAC test for air traffic controllers was being used by six NCAAs. Of these 10 tests, only one has produced a formal publicly available validation report. Postings from NCAAs on the ICAO FSIX website in Round 2 (late 2008) indicated that appropriate aviation English assessment procedures were not being widely used.
Of the mere 53 out of 190 NCAAs stating compliance with the LPRs, different countries replied with varying degrees of information, which rarely constituted evidence of compliance. For those 14 states that provided estimates of the language proficiency levels of pilots and ATCs, it is far from clear how accurate these estimates are, as details of the assessments are scant and not always obviously relevant. The actual tests or procedures mentioned are typically vague and unhelpful, with terms like "verbal testing", "Written", "Listening", "Interview", "direct testing procedures", and "linguistic history". For details, see Alderson and Horák (2008, Table 4). Alderson and Horák (2008) commented,
TABLE 1
Tests and Assessment Procedures Reported in Round 1

Aerosolutions (1)
ALITE (1)
Assessment Services (FLE of New Zealand/Aviation Services) (1)
Aviation Australia (AEPT) (1)
Aviation Services Ltd (1)
ELPAC (6)
FAA test (to be customised) (1)
Mayflower College (7)
RELTA (2)
TELLCAP (1)

Note. Numbers in brackets denote the number of States that claimed to use this form of test or assessment.


Clearly some of these methods of assessment are, at best, suspect. What do "direct testing", "formal evaluation", "conversation" or "interview" actually mean? Why does Ethiopia use a written test and what does this consist of? Why are there so many gaps even in this information? (p. 9)

Summarising the results of Rounds 1 and 2, Alderson and Horák (2008, p. 1) concluded that the lack of (evidence for) compliance gives cause for concern, that ongoing detailed monitoring of implementation plans and compliance is essential, that there is reason to suspect that ICAO's recommendations with respect to evidence for test quality are not being taken seriously, and that the conclusions of Alderson (2008) are confirmed: we can have little confidence in the quality of several of the aviation language tests and assessments currently available for flight crew and air traffic controller licensure. The results of Round 3, completed in March 2010, show some changes from 2008, but with only 25 new postings, mainly from small states, and with no details of which tests had been accredited, if any. To achieve a more complete picture of which tests NCAAs had accepted, we proposed to send a simple questionnaire to ICAO Member States requesting details. We considered that response rates would be much better than they had been in Round 1 if ICAO allowed us to send out the questionnaire under their auspices. The reply we received from ICAO was discouraging:
While I can understand how a questionnaire with ICAO blessing would get a better response, I am not comfortable with the idea. We have sent a State Letter AN 12/44.6-09/53 in July this year and we are still collecting responses for it. We actually made a plea with our regional offices to canvass States in their region to respond to it asap. The letter asks a very simple question: will you be compliant by March 2011 or not? We've received a little over 70 replies so far. All positive, of course. So, sending another letter, with another questionnaire, would probably be considered too much by management and we would potentially get pushback from States.

We received a similarly unhelpful response from the International Civil Aviation English Association (ICAEA). It is hardly surprising that Member States claim they have implemented the ICAO requirement, but can their claims be believed? Correspondence on the Flight English Forum (a discussion list for teachers of aviation English) recently confirmed rather incredible claims at a March conference:
I attended the ICAO workshop on the LPRs yesterday and today. Funnily, every member state has declared themselves compliant. The Italian Service Provider claims to have no ATCOs under level 4. They also went on to say that there has never been an incident or accident in Italy due to communications. I guess he meant after October 8, 2001. Air France has granted some 70 something percent of its pilots level 5! Romania is and has been compliant with all level 4s for years now.

Finally, in January 2011, 137 Member States were still not compliant with the LPRs. By June 2011, 43 more Member States had declared that they were compliant, but most gave virtually no information, and it is therefore legitimate to question mere claims of compliance without supporting evidence. ICAO has not maintained any credible oversight or any proper monitoring of progress toward the March 2011 deadline. It seems highly likely that claims of compliance will not be challenged


and there will be no ICAO investigation of what tests or assessment procedures are being used, let alone whether these are valid and reliable. ICAO claims that it is the NCAAs' responsibility to do such monitoring, but that is not what happens with ICAO Safety Audits, so why are deficient or nonexistent assessment procedures not considered a threat to safety?

NCAAS: THE CASE OF THE UKCAA

To investigate an NCAA in somewhat more detail, questions were asked on the two main aviation English discussion lists, namely the Flight English Forum and the ICAEA discussion list, about expertise in language testing and assessment within the UK's Civil Aviation Authority (UKCAA). The general tenor of responses was that the authority itself acknowledged that it did not have such expertise:
1. In the documentation the CAA sent to [Flight Training Organisations] they mentioned that they did not have expertise in the area of assessment and would have to bring in outsiders. I have no idea who the organisation is that has produced/is producing a test.
2. March 2010: I was just sitting next to the English CAA rep and he said they are still working on how to approve tests.

Further investigations revealed the existence of the UKCAA Institute, a wholly owned subsidiary of the UK Civil Aviation Authority (http://www.caainternational.com/). Despite the apparent lack of expertise in language testing within the UKCAA, this website advertises the UKCAA Institute's English for Aviation Language Testing System (CAA International Ltd., n.d.) and makes the following statements about this test:
Fully compliant with all published ICAO standards and recommended practices (SARPs) and associated documents, the CAA International EALTS has been designed and developed by operational and linguistic experts specifically in support of the ICAO March 2008 language proficiency requirements. (para. 5)

The CAA International EALTS is an effective means for obtaining from aviation personnel the required gradable language sample from which accurate and reliable assessments of language proficiency in the context of aviation can be made for professional licensing purposes. (para. 7)

The following is an extract from a document on the website (CAA International Ltd.):
The EALTS is being used or has been used to assess the English for Aviation language proficiency of pilots and air traffic controllers of the following ICAO Member States for benchmarking and/or license endorsement purposes: Armenia, Belarus, Kazakhstan, Libya, Nigeria, Poland, Romania, Russian Federation, Qatar, Saudi Arabia, Spain, Sweden, Turkey, Ukraine, United Kingdom. (para. 21)

It should be noted that none of the entries of these countries on the ICAO FSIX website investigated in Study 2, reported earlier, makes any mention of this test or of its recognition for the purposes of the implementation of ICAO's LPRs. Given the lack of hard evidence on the CAA website about the test's quality, it was decided to address the two aviation English discussion lists in order to gain answers to the following questions:

1. Does anybody out there know about this test?
2. Is anybody out there involved in helping to develop this test?
3. Does anybody out there know anything about the evidence for the quality of this test?
4. Does anybody think that there might be a potential conflict of interest between the wholly owned CAAI developing this test and the UKCAA recognising language tests for aviation purposes?

Among the responses received the following is typical:


This is totally new and somewhat alarming, not least for those who are investing in time and expertise to develop a credible test. The answer to your questions 1, 2, and 3 is No, No and No again. The answer to 4 is that there is a definite conflict of interests.

One respondent provided the following information:


This is a reply which we received today when we asked about the status of the relationship between CAA and CAAi: Dear Mr xxx, CAAi is a separate company which provides advice and services to overseas states in the field of aviation, mainly in the Far East. They have developed their language testing system in co-operation with a UK-based language school and now offer it to those National Aviation Authorities that would not wish to develop their own system for testing English. It is unlikely that it will be used by individuals as will the system used by the CAA which accepts aviation english tests conducted by a school accredited to the British Council or other appropriate organisations such as EnglishUK. We list on our website information on those schools that have advised us they have a test that meets the ICAO criteria.

It should be pointed out that the British Council has no expertise in aviation English testing, or indeed in the development of any serious professional language test. However, the British Council has consistently failed to respond to enquiries as to its role in aviation English testing. It is also important to note from the correspondence just presented that listing claims of language schools about their tests does not amount to professional scrutiny and is, in our view, irresponsible. What appears to be clear is that the UKCAA and its wholly owned subsidiary are involved in commercial operations using tests for which there is no evidence of validity and reliability and are using the name of the UKCAA to commercial advantage with a wide range of other civil aviation authorities. They have also clearly abdicated responsibility for any form of professional oversight or monitoring of quality, presumably in the interests of commercial gain. However, this case is simply one local example of a wide range of vigorous marketing and sales campaigns internationally, devoid of any evidence for the quality of the products and services being touted, and without any degree of regulation nationally or internationally.

TESTS BEING DEVELOPED AND SOLD

In an effort to identify which tests of aviation English are currently being developed and sold internationally, in addition to periodically searching the Internet, discussions on the two main


discussion lists for aviation English were monitored, namely, the Flight English Forum and the ICAEA discussion list. Interestingly, very few names of tests have emerged, but what has emerged very strongly is the sense of those posting that most tests are being developed and sold without any attempt to provide evidence for the validity, reliability, or quality of these tests. Strong feelings are expressed about the existence of "cowboy operators" and "smooth sales-talk", and the following quotations are typical of the discussions:
Person 1
I'm sure there are solid reasons why companies and organizations do not comply (lack of awareness, of knowledge, of time, of money, and lack of scruples, etc.), but regrettably none is acceptable in law. Or in practice. One notable exception is a well-established provider which has so far declined to publish anything in support of their test product. People from this group have been early contributors to the field, and have been highly influential in solutions being conducted in a major country. But there doesn't appear to be any published support data for what they do. In personal communications these folks have politely shown regret for this continuing lapse. However, I suppose they know this means ALL the test data they are providing for that country is, frankly, highly suspect.

Person 2
It's necessary to measure what a test really does; otherwise you have no evidence to say that it's scoring what you want it to score. Amazingly, many amateur test producers are getting around that little issue by just ignoring it, by collecting no data and doing no analysis at all. Such test results are pretty much worthless.

It may be that what matters more to airlines and to NCAAs is not so much safety but that pilots and ATCs are found to be at Level 4 at least, whatever the nature of the test. It is clear from these comments, and many more, that many organisations are claiming validity without any evidence and likely without any attempt to find evidence. Moreover, not a few organisations make the wholly misleading claim that their test has been approved by a supranational body like the Joint Aviation Authority (JAA) or ICAO. Some further extracts from the discussion lists challenge such claims.
Person 3
It would seem to be the case that claiming JAA accreditation on the back of local accreditation is dubious. This first appeared with a UK school in Bournemouth (BBSI) who claimed that because they fitted the extremely flimsy UK CAA recognition criteria at the time (any language school with British Council recognition) they were free to operate such accreditation in the entire JAA area. Obvious nonsense!

Person 4
The fact that the CAA has (cynically) abrogated its duty in this regard to a list of schools recognised by the British Council as offering a service in general English language teaching does not mean that instructors recruited by those schools to teach aviation English are JAA-accredited. In no way whatsoever. . . . The next letter should be to the British Council standards office. . . . They will not be pleased to find themselves embroiled in the world of aviation accreditation.

CERTIFICATION OF AVIATION TRAINEES

The prior discussion of the role of NCAAs and the JAA in test accreditation and monitoring of false claims raises a related question: What are the NCAAs doing with respect to the certification of the English of those who are trained in aviation under their jurisdiction, be they native speakers of English or not? It would appear that if pilots have received aviation training in the United States, at least, then all that such applicants for certification of their English language proficiency need to do is to pay $2 to have their license stamped "English proficient" (EP), without having to take a test of aviation English. The argument appears to be that if they have been trained through the medium of English, they must, ipso facto, be proficient in English. Again, I have recourse to the not inconsiderable evidence of anecdote:
Person 5
The $2 pays for the plastic to issue or renew a license with EP on the top left hand corner. Inquiries and concerns might be referred to: <Theresa. J. White@faa.gov>

As soon as this process started, reports started to come in of foreign (non-U.S.) pilots who had gained the new EP pilot license from the FAA, but had failed the test ruling in their own country.

THE CERTIFICATION OF LEVEL 6 AND NATIVE ENGLISH SPEAKERS

One issue that has raised some concern in aviation English circles is whether native speakers of English should also undergo the same certification process as nonnative speakers. In the United States in particular, it would appear that not only are those who apply for an English certificate somehow automatically granted one on payment of a trivial fee, but also all those who were already operating as pilots or air traffic controllers under previous legislation are entitled to retain their license at Level 6 under what are known as "grandfather rights" ("grandfather rights" refers to the rights conferred on certain extant operators to operate under the previous regulations when new regulations/legislation come(s) in; Alderson & Horák, 2008, p. 9). Moreover, in ICAO (2009) Circular 318, entitled Language Testing Criteria for Global Harmonization, Section 4.2, on the assessment of language proficiency at Expert Level 6, it is claimed that no professional expertise in language assessment is needed to identify native speakers of English:
Since language proficiency at both ends of a proficiency scale is relatively easy to evaluate, it is not difficult to recognize Expert (including native or native-like) proficiency. For these reasons, the assessment at Level 6 should be carried out by a trained and qualified rater but not necessarily by a language testing specialist or require the use of a fully developed specialized language test. (p. 11)

In fact, the regulations allow for a simple conversation to be held with putative native speakers, by staff who are not qualified in language assessment, in order to certify native speakers as being at Level 6 (which affords lifetime certification) without formal testing. This in effect means that native speakers of English are automatically regarded as English proficient for the purposes of aviation. It is not, however, self-evident that this is a good idea, as the following contributions to the discussion lists make clear:

Person 6
For too long English mother tongue speakers (of whom I am one) have failed to address the problems of being able to communicate in their own language. Recently I visited AMS and on the outbound flight from LGW the safety announcement was given by someone whom I would judge from their English dialect comes from the Gatwick area and who was almost unintelligible even to me. Yet on the inbound flight the announcement was made by someone who was Dutch but who was able to communicate far more clearly than the English speaker. This is symptomatic of an important issue in that mother tongue English speakers within the aviation industry need to have training in order to be able to communicate effectively.

Person 7
. . . a common complaint from European pilots that US RT (radiotelephony) language is too fast, often non-standard AND plain English, and with an accent that is often hard to understand. Listen to any RT transcript from the US and you will find the above. It's not just US AT: a friend of mine working at CDG/LFPG has often said that US pilots are the hardest to understand for the very same reasons quoted above.

THE ROLE OF ICAO

An interesting question, in light of the apparent lack of any oversight of the quality of aviation English tests, is: What is the role of ICAO, which, after all, issued the LPRs, and whose PRICESG developed the Language Proficiency Scales?
There is a widespread misunderstanding of ICAO's functioning, mandate and budgetary constraints. ICAO is an international legislative body, but not its police force. Both its decisions and their implementation are entirely dependent on its sovereign Member States. This very sovereignty may explain in part some reluctance to take part in such a survey conducted by an English-speaking country. (Alderson, 2008, p. 7)

One person on the discussion lists asked,


Can someone confirm that there are no APPROVED ICAO tests, and that the only requirement is that the test results be expressed according to the ICAO rating scales?

Person 8 provided the simple answer, and added a damning comment:


There are no ICAO-approved tests to assess the ICAO LPRs (which themselves were prepared with no empirical evidence to support their publication).

Person 9 identified the source of the problem as being lack of oversight or test accreditation:
The problem is that there is no accrediting body saying that these people, that we people, are what we claim. . . . Eventually there will be a body with the ability and know-how to accredit the AE world.

I return to this issue shortly, but first we must consider the nature of ICAO. In fact, already in Study 1, reported earlier, its nature became somewhat clearer:
We were somewhat surprised to discover that ICAO itself has not chosen to approve or disapprove of any testing procedure. However, as an organization which is part of the United Nations, ICAO can set out the requirements of standards, and indeed has provided guidelines on implementation of language proficiency requirements for Member States (ICAO, 2004) but apparently cannot enforce these. It is the responsibility of the national civil aviation authorities to decide which tests or assessment procedures they will accept. Whether such national aviation authorities have the competence to judge the quality of the tests available is unclear to us at present. (Alderson, 2008, p. 4)

The research presented in Study 2 and in the "NCAAs: The Case of the UK CAA" section above suggests that, in general, NCAAs vary considerably in their competence to judge test quality. Nevertheless, in defence of ICAO, it is worth quoting Person 10:
It should be remembered that ICAO is not infallible and as an agency of the United Nations it is subject to the same political and other constraints as any other participant of that organisation. ICAO is not a police force. In 1944 many well-intentioned people with wonderful foresight met in Chicago to consider the future of civil aviation after the end of the world's greatest military conflict. As a consequence ICAO was subsequently established as an agency within the UN and has achieved enormous success for the aviation industry but nothing is perfect. But on a productive note, ICAO has delivered a lot of solid ideas on how to establish a testing process to produce respectable scores. The recent harmonization document from ICAO contains some of those many elements and ends with a checklist that should be helpful in choosing a respectable test provider.

Of interest, Person 11 replied,


I concur with XXX regarding the role of ICAO and the power it has and has not. It is a rule-making body providing advice and assistance to Contracting States. It is, however, up to each Contracting State to pledge their compliance (or not) to the thousands of standards and regulations set out by ICAO. What continues to baffle me is the ability of those holding international responsibility in ICAO . . . to continue to maintain what appears to be an ineffectual silence, possibly in the belief they are not in a position to lead positively.

It should be noted that ICAO (2010) recently produced a second edition of the Manual on the Implementation of ICAO Language Proficiency Requirements. However, as far as we can ascertain, there is no ICAO monitoring of the FSIX website and no (direct or indirect) assistance to those who have not posted compliance, or who are evidently having difficulty in complying with ICAO directives.

ICAEA VERSUS FLIGHT ENGLISH FORUM

The ICAEA is the official representative organisation for aviation English teachers, has frequent dealings with ICAO, and has recently been recognised as an institutional member by the International Language Testing Association (ILTA). However, its discussion list is much less active than the Flight English discussion list already referred to. The description of the Flight English Forum reads as follows:
This group brings together trainers and practitioners from the world of professional aviation with the common goal of sharing ideas and experiences about Aviation English training. The group's attention will focus on the role of English in the operation of international aviation and, as more demanding tests of language proficiency are introduced, the importance of a high degree of excellence in the fields of training and testing.

Flight English is very critical of both ICAO and ICAEA, and certain individuals on Flight English are particularly vociferous in expressing their opinions: a good example of not-so-hidden agendas and micropolitics. It is, nonetheless, worth examining selected quotations to get a better feel for the politics of aviation English testing.
1. There used to be just one forum of this type. It was set up by ICAEA following requests for an internet forum after the ICAEA Warsaw symposium in 2002. It quickly became obvious that the leadership of ICAEA did not like any form of overt questioning or criticism of the ICAO LPR proposals even though many others had serious misgivings about the form the initiative had taken. It was basically impossible to put forward or discuss contrary points of view on that forum. As a result, I set up this forum to allow open, unfettered discussion of the issues to take place.

2. That ICAEA has become no more than a front group for mainly commercial interests is suspected. That ICAEA seeks close co-operation with ICAO is well known. That there is plenty of opportunity for hanky panky is a reasonable conclusion. Please assure us that this is a wild conspiracy theory and nothing more.

That request received no reply. The Flight English Forum is particularly critical of the LPRs developed by the PRICESG group on behalf of ICAO, and the following post is typical:
This is a massively flawed system and it has to be withdrawn from service, redesigned from the ground up and not relaunched until it is fit for purpose. Legislation which is barely workable and subject to constant questioning and being pushed and pulled to try to make it fit a mould is not legislation, it is a joke.

Such forthright opinions seem to be fairly comprehensively ignored by both ICAO and ICAEA, but they certainly find support on the (very active) Flight English Forum and attest to a growing dissatisfaction with ICAO, its perceived desire to leave test recognition up to market forces that are no substitute for professional quality control and oversight, and its remarkable inactivity and reluctance to play a more proactive role in promoting quality among NCAAs. This then raises the question of the responsibilities of ICAO with regard to test accreditation.

ICAO AND MONITORING/ACCREDITATION

As we have already seen, ICAO has no authority to accredit aviation language tests, and it appears to have neither the resources nor the desire to do so. But it is not the case that ICAO has done nothing to encourage implementation of the LPRs, which it produced in 2003 and which had been developed by the PRICESG group it sponsored. The Manual on the Implementation of ICAO Language Proficiency Requirements (also known as Document 9835) is available on its website (for a price), it has produced the document Language Testing Criteria for Global Harmonization referred to earlier (ICAO, 2009; otherwise known as Circular 318, also available for a price on the ICAO website), and it has run numerous international workshops on the LPRs. In addition, ICAO sponsored the production of a CD containing a few speech samples from various assessment procedures. Unfortunately that CD is not easily obtained, several of the recordings are of poor quality, and there are very few samples. As a result of dissatisfaction with this CD, ICAEA is currently coordinating a much more carefully thought-through rating of speech samples (known as the Rated Speech Sample project), which will shortly be available free of charge on the ICAO website. Nevertheless, criticisms of ICAO abound.
1. The question about a universal oversight body. This was raised within five minutes of ICAO announcing the LPRs and the scales to the world in Warsaw in 2003. They scoffed at the proposal then and they continue to scoff at it. Their solution, you will remember: the market will provide. Well, the market has provided, with the results that we all know about.

2. ICAO is a very guarded organisation which will not admit to errors of judgement even though they are well aware of them. Everybody makes mistakes and receives poor advice. ICAO was poorly advised in this case. They should be honest enough to admit it; then we can get on with a proper revision of the case.

3. The part of ICAO responsible for this debacle is a small part of the overall organisation and is not necessarily representative of it in its level of incompetence, but those who are running the language proficiency requirements should have the good grace to admit their failings and do the honourable thing by resigning. Those who have supported them and encouraged them in their error for the most self-serving of reasons (financial gain) should be shunned.

Strong words, strong opinions. The question is, What can be done about this apparently unsatisfactory situation? Research has revealed the lack of evidence for the quality of many aviation English tests. It has also revealed the lack of international oversight of the ICAO regulations, despite the high-stakes nature of aviation English testing. It is worth reporting the conclusion of Alderson (2009a): "However, even more important, it is argued, is the urgent need for some means of monitoring the quality of the aviation tests and assessment procedures that are, or will be, available before March 2011" (p. 181). In fact, an attempt is now under way to bring some order into what seems to be a chaotic situation. The author made a proposal to ILTA that it might be possible to set up some sort of accreditation scheme that would be voluntary, not compulsory, and for which test developers would pay a fee to cover the costs of the accreditation process. Such a fee would cover running expenses and the fees for the aviation and testing experts who would be needed to carry out the accreditation itself. ILTA, through its then-president Dr. Carolyn Turner, agreed to set up a task force to examine the matter in some detail. Members of the task force (Ute Knoch, Candace Farris, Rob Schoonen, Dan Douglas, and the author) held a number of virtual meetings during the summer of 2009 and reported the results of our discussions to ILTA. ILTA agreed that a Joint ILTA-ICAO Task Force should explore the possibility of creating some sort of test accreditation process, and a meeting was held in Cambridge, United Kingdom, at the end of October 2009 between the president of ICAEA, the ICAO person responsible for the LPRs, and myself. ICAO subsequently officially welcomed the initiative in late November 2009, and the ILTA Task Force developed a set of guiding principles for such a joint task force.
The initial advice of ICAO was that an agreed accreditation scheme should be set up in the spring of 2010, in time to be announced at the March 2010 meeting of the ICAO Council. Furthermore, at least one voluntary accreditation process should have been completed in time for the scheme and its first results to be announced during the General Assembly of ICAO, which was due to meet in September 2010. That would leave only 6 months before the March 2011 deadline, but we were told that if an accreditation system was seen to be in place, then the perceived danger of the LPRs being indefinitely postponed, or cancelled, by the General Assembly would have been averted. The first meeting of the Joint Task Force was held at ICAO headquarters in January 2010. Participants included representatives of ICAO, the ILTA and ICAEA presidents, and a representative of the International Federation of Air Line Pilots' Associations. That meeting confirmed what had already been informally established: namely, that despite the recognised weaknesses of the LPRs, these cannot be changed in the medium term; that the question of legal liability must be addressed, if not resolved, before any scheme can be approved; and that a business plan and a budget need to be established before work can begin in earnest. Unfortunately, none of these matters was actioned by ICAO, and a project manager was appointed only in the late summer of 2010. Meanwhile, the ICAO Council received no notification of progress, ICAO's own deadlines were not met, and the General Assembly of Member States of ICAO in September 2010 agreed a proposal from China and Nepal that Member States be allowed a flexible approach to compliance with the LPRs. The already-postponed deadline (from March 2008 to March 2011) appears to have been effectively abolished, and noncompliant Member States are merely urged to post their implementation plans on the FSIX website. We have already seen (see the Study 2 section) what an ineffective tool that website is, and that ICAO has neither the powers nor the desire to enforce implementation of its own LPRs.
The observation made by one person in reaction to the 2008 survey appears to be true:
The ICAO has teeth only when (the most powerful of) its member states let it open the box and take them out. All states are sovereign, but some are more sovereign than others.

It appeared that the following prediction on the Flight English Forum had indeed come to pass:
My personal prediction is that, having reviewed the facts and the state of things, ICAO will quietly announce that the LPR initiative has been postponed sine die or perhaps until some time in the future (say 2020) when all the present suspects are safely retired on generous Montreal pensions. The get-out clause will contain phrases such as ". . . technology not yet ready", ". . . adequate oversight not yet possible", ". . . test accreditation not yet feasible" . . . and the abandonment will be mollified by references to the fact that it has not been a total failure as awareness has been raised and attention focussed on the problem etc., etc. . . . Expect this announcement sometime in mid 2010.

However, more than one year behind schedule, and very late in the day, ICAO has just, at the time of writing, mounted a website (http://www.icao-aelte.org), which is intended to be the web portal for the Aviation English Language Test Endorsement service. It is as yet unclear who the endorsers will be, what evaluation criteria they will use, and what kind of reports will be written, although the fee for this service (US$5,000, plus $3 for every endorsed test administered) has already been established. This is certainly to be welcomed, but what uptake it will receive is unknown (the service is voluntary), and it remains to be seen whether it will have any impact on the quality of language tests currently in use.

LET US NOT FORGET THE CONSEQUENCES OF TESTING

Meanwhile, testing continues, no doubt frequently using unreliable or at best unproven instruments. Examiners may or may not be adequately trained and monitored, the validity of the test outcomes is in question, and the comparability of levels across different tests remains to be investigated, let alone established. Thus we can have little confidence that when a pilot or air traffic controller is judged to be English proficient at Level 4 (the minimum level acceptable for licensure), that person has indeed attained the necessary standard. The implications of this for aviation safety are obvious.

REFERENCES
Alderson, J. C. (2008). Final report on a survey of aviation English tests. Unpublished manuscript, Lancaster University, Lancaster, United Kingdom. Retrieved from http://www.ealta.eu.org/documents/archive/Alderson_2008.pdf
Alderson, J. C. (2009a). Air safety, language assessment policy, and policy implementation: The case of aviation English. Annual Review of Applied Linguistics, 29, 168–187.
Alderson, J. C. (2009b). The micropolitics of research and publication. In J. C. Alderson (Ed.), The politics of language education: Individuals and institutions (pp. 222–236). Bristol, UK: Multilingual Matters.
Alderson, J. C. (2010). A survey of aviation English tests. Language Testing, 27, 51–72.
Alderson, J. C., & Horák, T. (2008). Report on a survey of national civil aviation authorities' plans for implementation of ICAO language proficiency requirements. Unpublished manuscript, Lancaster University, Lancaster, United Kingdom.
Alderson, J. C., & Horák, T. (2010). A second report on National Civil Aviation Authorities' implementation of ICAO LPRs. Unpublished manuscript, Lancaster University, Lancaster, United Kingdom.
CAA International Ltd. (n.d.). Examination services: English for Aviation Language Testing System [Web page]. Retrieved from http://www.caainternational.com/site/cms/contentCategoryView.asp?category=291
Cookson, S. (2009). Zagreb and Tenerife: Airline accidents involving linguistic factors. Australian Review of Applied Linguistics, 32, 22.1–22.14. doi:10.2104/aral0922
Cushing, S. (1994). Fatal words: Communication clashes and aircraft crashes. Chicago, IL: University of Chicago Press.
Emery, H. (2011). Testing Language for Specific Purposes (LSP): Reflections on the issues revisited from the perspective of a test developer. The case of aviation English. Manuscript under review.
Estival, D., & Molesworth, B. R. C. (2009). A study of EL2 pilots' radio communication in the general aviation environment. Australian Review of Applied Linguistics, 32, 24.1–24.16. doi:10.2104/aral0924
European Association for Language Testing and Assessment. (2006). Guidelines for good practice in language testing and assessment. Available from http://www.ealta.eu.org/guidelines.htm
Farris, C., Trofimovich, P., Segalowitz, N., & Gatbonton, E. (2008). Air traffic communication in a second language: Implications of cognitive factors for training and assessment. TESOL Quarterly, 42, 397–410.
Huhta, A. (2009). An analysis of the quality of English testing for aviation purposes in Finland. Australian Review of Applied Linguistics, 32, 26.1–26.14. doi:10.2104/aral0926
International Civil Aviation Association. (2004). Manual on the Implementation of ICAO Language Proficiency Requirements (Doc. 9835). Montreal, Canada: Author.
International Civil Aviation Association. (2009). Language Testing Criteria for Global Harmonization (Circular 318). Montreal, Canada: Author.
International Civil Aviation Association. (2010). Manual on the Implementation of ICAO Language Proficiency Requirements (2nd ed., Doc. 9835). Montreal, Canada: Author.
International Civil Aviation Association. (n.d.). ICAO in brief. Retrieved from http://www2.icao.int/en/Home/Pages/ICAOinBrief.aspx
Kim, H., & Elder, C. (2009). Understanding aviation English as a lingua franca: Perceptions of Korean aviation personnel. Australian Review of Applied Linguistics, 32, 23.1–23.17.
Mell, J. (1992). Étude des communications verbales entre pilote et contrôleur en situation standard et non-standard [A study of verbal communication between pilot and air-traffic controller in standard and non-standard situations] (Unpublished doctoral dissertation). University of Toulouse, Toulouse, France.
Prinzo, O. V. (1996). An analysis of approach control/pilot voice communications (DOT/FAA/AM-96/26). Washington, DC: Federal Aviation Administration.
Read, J., & Knoch, U. (2009). Clearing the air: Applied linguistic perspectives on aviation communication. Australian Review of Applied Linguistics, 32, 21.1–21.11.
van Moere, A., Suzuki, M., Downey, R., & Cheng, J. (2009). Implementing ICAO language proficiency requirements in the Versant Aviation English Test. Australian Review of Applied Linguistics, 32, 27.1–27.17.
