Volume 4 ~ May 2012 | ISSN # 2150-5772 – This article is the intellectual property of the authors and CIT. If you wish to use this article in your teaching or in another format, please credit the authors and the CIT International Journal of Interpreter Education.
Intake Tests for a Short Interpreter-Training Course:
Design, Implementation, Feedback
Jim Hlavac, Marc Orlando, & Shani Tobias
Monash University
Correspondence to: jim.hlavac@monash.edu
1. Introduction
This article discusses the contents of an entrance test designed for potential interpreter trainees and assesses those contents on the basis of responses from test-takers and trainers. Construct validity in interpreting testing has been the focus of a number of studies (e.g., Clifford, 2005; Lee, 2008; Eyckmans et al., 2009; Turner et al., 2010), but few studies examine the contents of entrance tests that determine trainee selection. Among the tools available for test evaluation, this article focuses on the psychometric category of authenticity, that is, the degree to which the skills elicited during the test are those that were the focus of post-test training. Responses are gathered not only from test-takers but also from trainers who were not involved in the test design itself. This article is a contribution to the small but growing body of research on entrance test design (cf. Moser-Mercer, 1985; Bernstein & Barbier, 2000) and in particular to test design for vocationally focused initiatives that target potential candidates from less-represented language communities.
The entrance test was designed for the selection of candidates for an interpreter skills training course. The course was an initiative funded by an Australian state government department, the Victorian Multicultural Commission, which is responsible for multicultural policy and programs that support linguistic diversity. The initiative sought not only to build basic interpreting skills among speakers of new and emerging languages but also to promote interpreting as a career pathway.
The initiative targeted speakers of new and emerging languages (see the Appendix for a list of the targeted languages), although speakers of other languages could take the entrance test and be considered for inclusion in the training. In most cases, individuals eligible to apply for the training were proficient in at least two spoken languages, but they were not expected to have experience in interpreting or translation, nor were they required to have specific formal training or a minimum level of education. With no specific prerequisites, the program therefore began with an intake test to assess potential trainees. The test was designed to elicit specific information about testees’ educational and occupational profiles and included exercises and questions to diagnose English language level and general aptitude for and interest in interpreting. (Language skills could be elicited in English only—an assessment of testees’ proficiency in languages other than English [LOTEs] was not possible.)
Rather than simply document acquisition of linguistic forms or grammatical structures in a manner typical of traditional language tests, this intake test assessed abilities using competency- and function-based approaches (cf. Quinn, 1993). Such functionally focused testing, that is, testing whether a testee can perform a particular task using any linguistic form appropriate to the task regardless of complexity, is an approach now commonly used in the assessment of language-focused courses in adult vocational education in Australia. Both second-language-acquisition assessment (e.g., the Australian Adult Migrant English Program) and language and literacy teaching for first-language speakers (e.g., adult literacy and basic education courses) have adopted such situation- and context-based teaching and testing for their curricula and assessment tasks. In the context of training, the value of the intake test lies in how well it elicits individuals’ functional abilities. Its value as an instrument subjected to analysis lies in how easily it can be implemented by testers and used by testees, and whether it elicits skills that are relevant to subsequent training.
To explore the value of the intake test, we first outline components of existing tests and test materials developed for potential and targeted community interpreting trainees. We then discuss the design of the test used and describe how each subsection was administered to and completed by testees. We examined testees’ performance on the test through a needs analysis, and this determined the content and pedagogical approach for the training. Finally, we measured this intake test’s “authenticity” or “validity”—its ability to elicit responses relevant to the training of interpreters and to allow diagnosis of capabilities and proficiencies—through evaluative feedback from both trainers and trainees. In the conclusion, we provide a comparison and summary of findings.
2. Entrance Tests for Community Interpreter Training
The testing of potential applicants for community interpreter training has received little attention in the relatively modest body of literature on the subject. Most studies on interpreting training focus on the testing requirements for specialized, high-level courses, many of them 2 years in length and postgraduate, that typically train students for simultaneous or conference interpreting (e.g., Bowen & Bowen, 1989; Clifford, 2005; Gerver, Longley, Long, & Lambert, 1989; Lambert, 1991; Niska, 2005; Pippa & Russo, 2002; Timarová & Ungoed-Thomas, 2008). Lotriet (2002) discussed the selection of elements of an intake test for a group of potential trainees for a 1-month crash course in simultaneous interpreting. This test included diagnostic exercises for both languages but also other features, such as the individual’s reading activities, personal interests, handling of controversial questions, and self-concept, that were thought to be important for the nature and content of future assignments.
The International Council for the Development of Community Interpreting (Critical Link) has recently promoted research into training and entrance testing for community interpreting. Papers from Critical Link conferences have touched on issues relevant to the testing needs we explore in this article. For example, Straker and Watts (2003) discussed training students from refugee backgrounds, many of whom speak languages that are new to their new place of residence; the authors also pointed to the “activist” nature of such training for disadvantaged groups. Michael and Cocchini (1997) also focused on the emancipatory and empowering effects of training young adult bilinguals and placing them as employed interpreters in their local neighborhoods, within language communities that are familiar to them. Penney and Sammons (1997) discussed in detail the experience of training community interpreters in a remote area (of Canada), whereas Valero Garcés (2003) bemoaned the haphazard training of large numbers of community interpreters in a country (Spain) that has only recently experienced large-scale immigration. In Australia, the notion of formal testing for community interpreters is usually discussed in the context of testing competency level for accreditation or formal recognition (e.g., Bell, 1997; Roberts, 2000), rather than as a means of selection for future training.
Few studies have focused on the characteristics of intake testing for community interpreter training. Mikkelson and Mintz (1997) suggested that asking ethics questions is as important a starting point as testing English-language level. Corsellis (2008) argued that test design for low-level training should elicit macro-skills in both languages as well as prior education and employment and should include short role plays, sight translations, brief written translations, and free written compositions in both languages exploring the applicant’s motivations. Gentile, Ozolins, and Vasilakakos (1996) also suggested a comprehensive list of features to test: general language skills with a focus on the macro-skills of listening and speaking, knowledge of cultural mores within each language community, basic note-taking techniques, memory retention exercises, and professional ethics.
Of course, an entrance test may include other components that a training course may further develop, such as questions and exercises to assess the applicant’s knowledge of (cultural- or linguistic-specific) discourse-pragmatic norms, topic- and domain-specific terminology; skill level in voice modulation (i.e., enunciation in the L1, pronunciation in the L2), handling or establishing turn-taking conventions, and whispered simultaneous interpreting; and other elements such as stress-management training, dual-tasking exercises, freelance business and self-management procedures, and use of audio-recording and playback technology (Hale, 2007). The test design used here seeks to address skills and competencies that are process related, heeding Hatim and Mason’s (1997) cautions that interpreting and translating testing run the risk of providing only “once-off” demonstrations of skills that are otherwise process rather than product related.
The small body of research on community interpreting training reflects the unfortunate scarcity of training that is either available for or required of interpreters. Pöchhacker (2004) lamented the low remuneration and frequent absence of training for community interpreters, which have commensurate effects on the attention and resources afforded to community interpreting by governmental and educational institutions. Hale (2007) identified the lack of recognition and pay for interpreters as a cause of the lack of interest or finances available for training, leading to low demand for training courses and therefore a general paucity of available courses. Government-funded focused training programs are infrequent but welcome, and this study examines pretraining screening for one such program as a contribution to an emerging area of interpreting studies research.
Australia has particular challenges in developing appropriate practice models for interpreting. In Australia, provision of interpreting for new and emerging languages (see the Appendix for a list of these languages) includes some languages that have only recently completed a process of codification (i.e., the process of choosing which lexical items and forms, syntactic rules, and orthographical conventions are to be accepted in a language’s standard) and standardization (the systematic ordering of rules, conventions, and norms together with the distinction between those forms that the standard includes and those forms that it does not, i.e., “dialecticisms”). For some interpreting-training participants, there may be no established standard in their language/s, which makes the development of appropriate practice models for interpreting extremely difficult (cf. Penney & Sammons’s [1997] suggestions for trainees about noncodified terminology for Inuktitut). To be sure, this is not to suggest that community interpreting in any way disregards the needs of speakers of nonstandard varieties or regional dialects. As many community interpreters in Australia can attest, (at least passive) proficiency in a variety of dialects or nonstandard varieties is highly desirable, if not essential, for many language groups.
3. Method
To encourage a large number of bilingual speakers of such languages to develop interpreting skills, the Victorian Multicultural Commission’s initiative set very few restrictive criteria for its applicants, making the intake test a crucial screening tool. The test design included questions and exercises to assess a number of the elements mentioned above: (English) language proficiency; educational and occupational profile; aptitude/motivation; interpreter-specific skills; and writing, reading, and listening (Corsellis, 2008; Gentile et al., 1996; Timarová & Ungoed-Thomas, 2008). The test also asked for a brief written translation (Corsellis, 2008) based on a source text typical of sight translation tasks often performed by interpreters. Questions on ethics (Gentile et al., 1996) and role-relationship—which, in Chesher, Slatyer, Doubine, Jaric, and Lazzari’s (2003) study, was rated as a highly important personal quality among a sample of 92 community-based interpreters—were also included. The mix of tasks was congruent with those included in the various test designs developed according to the International Second Language Proficiency Rating (cf. Wylie & Ingram, 1999) that elicit general and specific functional abilities. Other test design models based on conversational or pragmatic performance (e.g., Walters, 2007) or on control of categories of linguistic forms (cf. Carr, 2006) were not considered. The test comprised 10 areas of focus:
- Educational level and employment experience
- English language level
- Language level of LOTE(s)
- Knowledge of specialist language and terminology
- Knowledge of interpreting skills, ethics
- General motivation
- Reading and writing
- Listening, note-taking, and memorization skills
- Speaking and communicative pragmatics
- Translation exercises
The test was administered to 32 potential trainees in Victoria, Australia—21 in Geelong, 11 in Morwell—in late August and early September 2010. Potential trainees were informed that the macro-skills of listening and speaking would be tested specifically and that the entire test would last 2.5 hr. Testees were required to attempt all sections of the test.
None of the program’s five trainers was involved in the test design or delivery or in the needs analysis of testees’ performance (nor were any of the authors of this article trainers in the program). The first author invited trainers to evaluate the intake test and provide feedback via an anonymous online SurveyMonkey questionnaire. The questionnaire addressed the 10 components of the test, with responses graded along a 5-point Likert scale ranging from very important to not at all important. Three of the five trainers completed the questionnaire. Responses were averaged to the closest whole number and are discussed in Section 6 below.
We also asked accepted applicants to assess the test. Upon completion of the training, all trainees were given a blank copy of the intake test and a letter inviting them to provide feedback, via either a one-page questionnaire addressing the 10 components of the test or an anonymous online survey. Trainees were asked to rate the importance of each of the test’s 10 components for admission to training, with responses graded along a 5-point Likert scale ranging from very important to not at all important. Sixteen of the 25 trainees completed the questionnaire; all opted for the paper version. Responses were averaged to the closest whole number and are discussed in Section 7 below.
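For illustration, the aggregation used for both surveys can be sketched as follows. This is a minimal sketch assuming a plain arithmetic mean rounded half-up to the nearest whole rating; the function name and the sample figures are ours for illustration only and are not drawn from the study data.

```python
# Minimal sketch of the aggregation described above: average a set of
# 5-point Likert responses (1 = very important ... 5 = not at all important)
# and report the closest whole rating.
def closest_whole_rating(responses):
    """Arithmetic mean of Likert responses, rounded half-up to 1..5."""
    assert responses and all(1 <= r <= 5 for r in responses)
    mean = sum(responses) / len(responses)
    return int(mean + 0.5)  # round half-up; avoids Python's round-half-to-even

# Hypothetical example: three trainers rating one test component
print(closest_whole_rating([1, 2, 2]))  # mean 1.67 -> reported as 2 ("important")
```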
4. Test Content and Delivery
The test was administered by two testers. The first tester, the first author of this article, is a National Accreditation Authority for Translators and Interpreters (NAATI)–accredited professional interpreter with experience as a community interpreter in Australia and Europe. Both the first and second testers are trained and experienced testers of the International Second Language Proficiency Rating (ISLPR, formerly ASLPR) and the National Reporting System for Adult Literacy and Numeracy (NRSALN), both standard tests used in adult and postsecondary educational settings for ESL and EFL students in Australia. The second tester is also an accredited International English Language Teaching System (IELTS) tester. The tester orally asked the first three questions of the test and wrote down testees’ responses verbatim.
1. Personal details
Name Age Citizenship Date of arrival in Australia Ethnicity/Nationality
2. Languages
What is (or are) your first language(s)? Other languages learnt in childhood?
Foreign Languages learnt later at school/ in adult life?
3. Family
Which language/s did you speak with your parents and other members of family?
Testers also orally asked Questions 14, 15, 17, 20, 25, and 28 (see Sections 4.1 to 4.10). Testees completed all other questions independently, which also assessed their functional literacy skills in reading instructions and writing answers—skills required of interpreters when completing sight translations, taking notes, and so forth.
The test included exercises that systematically elicited performance in the four macro-skills and note-taking. For the reading exercises, testees had to read two texts and answer reading comprehension questions; a separate narrative writing exercise was also set (see Section 4.6). The testers administered two listening tests to evaluate comprehension and note-taking skills. Both listening tasks were played only once, and testees could take notes in the test papers. After listening, testees were invited to use their notes (and memory) to complete listening comprehension exercises: factual questions and responses (listening test 1) and information-gap cloze questions (listening test 2).
4.1 Education Level and Employment Experience
Information was elicited about primary and secondary education, technical or occupational training, university study, and any other courses completed, along with details about the location, period, duration, language of instruction, and content of coursework (where relevant). Testees provided information about their current and previous employment, including formal job title, place of employment, duration, and duties performed. In addition, questions were asked relating to testees’ voluntary or unpaid work and their knowledge of, and any engagement with, those language communities in Victoria for whom they could be potential interpreters. Information for this section was gathered from responses to Questions 5–9, 16, and 17, and from a curriculum vitae that testees were requested to provide.
5. Primary Education
Where? When? How many years? Language/s of instruction?
6. Secondary Education
Where? When? How many years? Language/s of instruction?
7. Further Education / University
Where? When? What studied? How many years? Language/s of instruction?
8. Technical / Occupational Training
Where? When? What learnt? How many years? Language/s of instruction?
9. Other courses
Name of course/s? Where? When? How many years? Language/s of instruction?
16. Employment
Please list previous employment, paid or voluntary. (Last job first)
4.2 English Language Level
Information from Questions 5–9 provided a guide to testees’ length and intensity of contact with English as a first, second, or subsequent language. Where childhood and adult education occurred outside Australia or another Anglophone country, questions were asked about formal instruction in and/or informal contact with the English language (place, time period, duration, level completed or formal qualifications gained) and subsequent contact with English as a formal subject and/or language of instruction in an Australian educational institution. This information was elicited in Questions 10–12 and 14–15 and partly from Questions 1–4 (see Section 4.1 above).
10. English Language Learning Overseas
Where? When? How many hours per week? How many months/years?
Level completed / Qualification gained:
11. English Courses in Australia
Where? When? How many hours per week? How many months/years?
Level completed / Qualification gained:
12. Certificates/Qualifications of English level
Name of certificate/Qualification: When awarded?
14. What do you find easy to do in English?
15. What do you need to work on most in English?
4.3 LOTE Language Level
For all testees, acquisition of English followed contact with and acquisition of one or more LOTEs. For most testees, proficiency levels in LOTEs could not be diagnostically tested or verified; therefore, testers made inferences about testees’ formal and informal acquisition of language/s and functional use thereof from information about educational and occupational experience. Further information to more closely ascertain proficiency levels in LOTEs was gained through a variety of other questions. These included questions about informants’ first language, the language/s of which they consider themselves to be native speakers, any other languages learned in childhood or adulthood, language choice with parents, and language choice with other family members. Questions 4 and 13 below, in addition to Questions 2–3 and 5–9 (see Section 4.1 above), focus on self-declared functional proficiency.
4. Personal use
Which language/s do you think in?
Which language/s do you count quickly in?
Which language/s do you speak to yourself in?
When you are angry, which language/s do you speak in?
13. Other language courses:
Where? When? How many hours per week? How many months/years?
Level completed / Qualification gained?
Self-estimations of LOTE proficiency were also elicited through tables in which testees listed functional abilities for each macro-skill, from simple to complex. For example, for the macro-skill “speaking,” functional abilities commenced with “use simple greetings,” “small talk,” “talk about your life,” and “talk to strangers for 5 minutes” and progressed to more complicated oral abilities such as “tell a joke,” “take part in a job interview,” “debate an issue,” and “talk to a group of people for 15 minutes about health/education/law.” Testees were invited to indicate which capabilities they had in each of their languages. Self-declared proficiency for the macro-skills of speaking and listening was elicited in Questions 20 and 21, presented below. Analogous questions were asked for the remaining two macro-skills, reading (Question 22) and writing (Question 23), which are not shown below.
20. Speaking
| | English | Language 1 | Language 2 | Language 3 |
|---|---|---|---|---|
| Use simple greetings | | | | |
| Small talk, e.g., talk about the weather | | | | |
| Talk about your life | | | | |
| Talk to strangers for 5 minutes about your first country | | | | |
| Make an enquiry about a job | | | | |
| Tell a joke | | | | |
| Take part in a job interview | | | | |
| Debate an issue (e.g., Should smoking be banned in public?) | | | | |
| Talk to a group of people for 15 minutes about health/education/law | | | | |
21. Listening
| | English | Language 1 | Language 2 | Language 3 |
|---|---|---|---|---|
| Can’t understand anything | | | | |
| Follow someone giving directions | | | | |
| Understand weather forecast | | | | |
| Understand a television drama | | | | |
| Listen to jokes | | | | |
| Understand radio interview with a famous person | | | | |
| Listen to a university lecture | | | | |
| Listen to a doctor/lawyer/university professor talking about their subject area | | | | |
4.4. Knowledge of Specialist Language and Terminology
Testees were invited to rate their ability, from not good and fair to good and excellent, in all their languages in relation to specialist areas. Testers generally did not administer this question directly to testees but explained its intention: to gauge proficiency in the special terms commonly used in these areas and the ability to understand speech, and to speak fluently, in these subject areas.
19. How good is your knowledge of specialist language in English and in your other language/s?
Please rate your knowledge as: not good / fair / good / excellent

| | English | Language 1 | Language 2 | Language 3 |
|---|---|---|---|---|
| Medical terms | | | | |
| Legal terms | | | | |
| Economics terms | | | | |
| Political jargon | | | | |
| Consumer affairs / advertising / marketing | | | | |
| Literature | | | | |
4.5. Knowledge of Interpreting Skills, Ethics Questions, General Motivation
The information flyers for the training did not state that previous interpreting experience was necessary or even advantageous. We anticipated, however, that many testees would have experience as formal or informal interpreters. We gauged levels and circumstances of previous interpreting experience through invitations to provide details (Question 18), knowledge of assumed attributes of an interpreter (Question 26), and anticipated areas of difficulty (Question 27).
18. Have you ever worked as an interpreter before?
26. What do you need to do to become an interpreter or bilingual language worker?
27. What do you see as the hard things about being an interpreter or bilingual language worker?
Two ethics questions were asked: the first was a “faithfulness to dutifully interpret the source text” versus “faithfulness to the truth” test (Question 28); the second dealt with confidentiality (Question 30). A question relating to personal management of stress in a difficult situation was also asked (Question 29).
28. You are interpreting for a client in a court. The client says something that you know is untrue. What do you do?
29. You are employed by a government agency to interpret for a person who has personal problems. The person becomes abusive to both you and the government agency for whom you are interpreting. What would you do?
30. A husband and wife are divorced and the wife has custody of the children. She has moved to a different city with the children. You have interpreted for the wife and you know where she now lives. By chance you meet the husband and he asks you to tell him the new address of his former wife and children. What would you do?
Motivation is often measured through initial, apparent displays of behavior such as keenness, the degree of seriousness shown toward the testing situation, attentiveness, and evident or assumed diligence in attempting tasks. But these need not be reliable indicators of a person’s aspirations. Questions about future plans (Question 24) and reasons why a testee wants to work in a chosen field (Question 25) are standard questions in job interviews. Written responses may corroborate demonstrated behavior.
24. What do you plan to do in the next two years?
25. Why do you want to work as an interpreter or bilingual language worker?
4.6. Reading and Writing
Two texts were presented for testees to read. The first text, “How to Become an Interpreter,” was 350 words long and was an abridged and adapted version of a text from the Web site Spanish-translation-help.com (n.d.). The text was modelled to contain a moderate level of difficulty; for example, some sentences contained multiple clauses. The content of the text was specific to matters concerning trainee interpreters, and its register is didactic and advisory, typical of that found in the opening pages of training manuals. Five questions checked testees’ comprehension of information contained in the text; the questions did not require testees to make extra-textual inferences.
The second reading text was a set of directions for the use of medication. The text contained 150 words and was a typical example of a sight translation text. Questions related to information contained in the text, and answers were judged for their content accuracy; grammatically incomplete sentences, spelling mistakes, and poor handwriting were not taken into account. The text was comparable, in content and linguistic complexity, to texts encountered by community interpreters and to those used at Certificate II–III level in postsecondary ESL courses.
Reading test 2: Medication—Directions of use
Directions of use
This medication is pleasant to take and starts to work quickly because it forms a clear solution and is ready to be absorbed as soon as you drink it.
It provides fast and effective relief from: toothache, migraine, cold & flu symptoms, sore throat, muscular pain.
Dosage: Adults take 4 tablets dissolved in half a glass of warm water every six hours.
Children over 15 years: Take 3 tablets dissolved in half a glass of cold water every twelve hours.
Children under 15 years: Do NOT give to children under 15 years of age.
Do not take this medicine if you suffer from stomach ulcers or asthma, if you are pregnant, or if you have an allergy to non-steroid anti-inflammatory drugs (NSAIDs).
Seek the advice of your doctor if you are over 65 years of age and if you take medication regularly.
Questions:
- Does this medication start to work as soon as you drink it?
- Name two things that it provides fast relief from:
- How often should adults take the medication?
- How many tablets should children over 15 take in one dosage?
- Is this medication recommended for children under 15 years of age?
- Name two conditions or symptoms that prevent you from taking this medication
Literacy skills are not a prerequisite for interpreting, although in the context of interpreter training, literacy skills that enable trainees not only to read but also to note and record information are still important. Therefore, although the test focused on oral/aural skills, it included a small written component designed to elicit testees’ ability to narrate, order, and describe visual stimuli. The writing test was based on the narration of a series of events. Visual stimuli for the events were provided in the form of 10 photographs that sequentially showed a cyclist riding on a road, being knocked over by a car, and being attended to by bystanders. The instructions for the writing test read: “Write a story about what happened in the pictures.” Testees were requested to write approximately 80 words over 15 lines, and performance was judged on the accurate replication and correct sequencing of the visual stimuli in writing. We looked for appropriate past tense forms—past simple, past continuous, present perfect—as well as linking words of sequence. Spelling and handwriting were not taken into account, and the written section was weighted less than the speaking and listening sections of the test. Testees’ performance in the reading and writing sections contributed to the information presented about testees’ acquisition of English (cf. Questions 10–13).
4.7. Listening, Note-Taking, and Memorization Skills
There were two listening tests. The first was a conversation between two people, unknown to each other, who engage in casual conversation on the street. Topics of conversation include the weather, daily activities, and a description of the duties of a travel agent. The dialogue was 350 words long and lasted approximately 3 min. We told testees that the listening text would last a few minutes and that they were required to take notes, which would enable them to answer questions about the text’s content. Testees were able to see in advance the questions that they would be asked in relation to the two listening exercises; however, they were actively discouraged from answering the questions while listening.
Comprehension of the first listening text was assessed through eight questions that required short responses of five words or fewer. Responses were judged according to their accuracy. The second listening exercise was a news report of a car accident; the text was a monologue of 310 words and lasted just over 2 min. Both exercises were designed at a level of complexity comparable to that of the tasks contained in the training itself.
No guide or instructions were given to testees as to how they were to take notes. Note-taking was primarily assessed by the number of correct responses that testees gave to the questions; that is, the form and content of the notes were not assessed as such, only their ability to assist testees in retrieving the information required for question-answering. Testees were not expected to produce notes showing the use of symbols, acronyms, abbreviations, and/or contractions for particular content-bearing items, which are skills taught and acquired in formal training. Where notes demonstrated knowledge of symbols, contractions, and so forth, this was weighted positively. Attempts to capture every word (which were inevitably unsuccessful) were judged negatively. Lastly, Question 17 asked if testees had had a job that required note-taking or memorization skills.
The script and questions for Listening Test 1 follow:
Listening Test 1: Conversation in the street—Transcript
A. It looks like it’s going to rain soon.
B. Yes, it does.
A. That’s good. We need the rain. The ground is so dry at the moment. It must be very hard for the farmers.
B. Yes. My brother is a farmer and he said that if it doesn’t rain soon, the price of fruit and vegetables will go up.
A. Oh. Look, it’s raining already.
B. Yes, do you have an umbrella? I don’t want to get wet. I’m wearing my best clothes and I have a job interview in about 30 minutes.
A. No, I’m sorry, I don’t have an umbrella. Maybe we should go to the bus stop over there. At least there we won’t get wet.
B. That’s a good idea.
A. So you have a job interview in just half an hour. What job are you applying for?
B. I’m applying to work as a travel agent.
A. Gee, that sounds like a good job. What sort of things do you have to do as a travel agent?
B. You have to help people with enquiries about their travel plans both on the phone and face to face. This means that you have to look up on the computer the possible dates of travel and various ways that a person could travel to a particular destination. You also have to handle questions about accommodation and booking hotels. Another thing is organising insurance for travellers. And, of course, you have to find the best possible travel route at the cheapest possible price for your customer because there is a lot of competition amongst all the travel agencies. People often ring up four or five times to check different prices and routes. So you have to be able to deal with people well, both on the telephone and in person, so you need to have good communication skills.
A. Well, I hope you get the job. It sounds interesting and I wish you the best of luck!
B. Thanks a lot. Oh, look, the rain has stopped. Now I must go to be on time for the interview.
Listening Test 1: Conversation in the street—Notes
Please make notes here as you listen.
Read through your notes and answer the following questions with short answers:
i. Is the ground dry?
ii. What job does his brother have?
iii. What is he going to do in half an hour?
iv. Does he have an umbrella?
v. Where do they go when it starts raining?
vi. What job is he going to apply for?
vii. Name two things that a travel agent does.
viii. At the end of the conversation, is it still raining?
4.8. Speaking and Communicative Pragmatics
The entire test lasted 2 to 2.5 hr. During this time, testees were usually reading, writing, or listening independently, without any interaction with others. There was a brief opportunity for testers to engage with testees before the testing started, and the tester orally asked the first three questions of the test and Questions 14, 15, 17, 20, 25, and 28. There were also numerous other opportunities to test the aural/oral skills of testees in other areas of the test questionnaire. Assessment of speaking skills covered clarity and ease of expression, fluency, grammatical accuracy, pronunciation, volume, word-attack skills, and prosody. Pragmatic features such as turn-taking, comprehension of indirect imperatives (e.g., “It would be good to include as many details as possible”), salutations, and leave-taking were also assessed. Features such as eye contact, body language, conventions of personal space, or emotional disposition were not judged.
After the initial salutation, welcome, identity verification, explanation of the test format, and other ambient small talk, testers filled in numeric responses for each of the nine questions (1, 2, 3, 14, 15, 17, 20, 25, and 28) that they orally posed to the testees. Testers recorded numeric responses on the ISLPR scale (Wylie & Ingram, 1999) for these questions and scored other spoken (solicited or unsolicited) responses. Both sets of scores were collated into a total score at the end of the test. A score of 2 on the ISLPR scale for English speaking and listening skills was envisaged as a threshold for admission to training. A score of 2 was applied as the minimum entry level for the nine trainees accepted at Morwell, whereas a higher entry level of 2+ was retrospectively applied for applicants at Geelong, due to the overall higher level of oral/aural skills displayed by applicants at this center.
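A minimal sketch of this threshold logic follows. The mapping of ISLPR ratings such as 2 and 2+ onto comparable numbers is our own illustrative device, not an ISLPR convention, and the function and variable names are hypothetical.

```python
# Illustrative sketch of the admission threshold described above. ISLPR
# ratings are ordinal (0, 0+, 1, 1+, 2, 2+, ...); we map them to numbers
# only so that they can be compared against a location's cut-off.
ISLPR_ORDER = {"0": 0.0, "0+": 0.5, "1": 1.0, "1+": 1.5,
               "2": 2.0, "2+": 2.5, "3": 3.0}

def meets_threshold(speaking, listening, cutoff="2"):
    """True if both oral/aural ratings reach the cut-off ("2" at Morwell;
    "2+" retrospectively applied at Geelong)."""
    return (ISLPR_ORDER[speaking] >= ISLPR_ORDER[cutoff]
            and ISLPR_ORDER[listening] >= ISLPR_ORDER[cutoff])

print(meets_threshold("2", "2+", cutoff="2"))   # True: admissible at Morwell
print(meets_threshold("2", "2+", cutoff="2+"))  # False: below the Geelong cut-off
```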
4.9. Translation Exercises
The test included translation exercises into English. Translation is not a prominent part of interpreting training, and examples of transfer from written sources are usually modelled as sight translation exercises (cf. Corsellis, 2008). However, translation reading and writing exercises have been used in testing for certification in some community interpreting situations (e.g., Beltran Avery, 2003), and because we could not directly test testees’ LOTEs, we included a translation exercise to give us some idea of testees’ transfer abilities from the LOTE to English. We chose materials in LOTEs that had been translated from the same source-language (SL) English texts and that were topical at the time of testing, in August and September 2010, during the Australian federal elections. Materials for Arabic, Croatian, Greek, Italian, Macedonian, Persian, Serbian, Spanish, and Turkish were taken from an information page of the Australian Electoral Commission (Australian Electoral Commission [AEC], 2010). We present an example of one of the English SL texts below. Speakers of languages not covered by the AEC translations were provided with short translations from single English SL texts from the Department of Human Services of the Victorian Government, which has a Health Translations Directory database (Department of Human Services, 2009). Translations for the following languages were taken from this Web site: Amharic, Burmese, Dari, Dinka, French, Pashtu, Sinhalese, Tamil, and Urdu. We could not test translations from some languages spoken by multilingual testees, such as Ewe, Goun, Kiswahili, Liberian Creole, Mina, Nuer, Shilluk, Twi, and Watchi. We tested these multilingual testees on translation from another of their languages for which translation materials were available.
We checked the translations from the LOTE texts (as back-translations) against the English source texts for content, expression, and grammatical accuracy. The content of the texts (registering to vote, how to vote, and community health information) covered topics common in community interpreting. Below is an example of an English source language text from the AEC.
Figure 1: Example of English source text. LOTE translations of this text were compared against this ST.
4.10. Assessment of Test Performances
In a prototypical sense, intake tests seek to ascertain that applicants have minimum levels of ability that conform to a preconceived standard required for subsequent training. In relation to this test, “minimum levels of ability” relate to language level (we elicited this for English, but for the LOTE this was in most cases deduced through biographical and self-reported information) and motivation level.
We weighted other skills and abilities in our assessment, but these were not essential for selection for training. Occupational experience in any country or context demonstrates knowledge of “the world of work” and is a desirable asset that a candidate brings to “the world of interpreting work.” Educational level provides a guide to a testee’s length of contact with formal and focused training, which can influence his or her readiness to undertake further training. Familiarity with the relevant LOTE community or communities is important so that trainees know how to interact with them in a culturally suitable manner. Knowledge of interpreting skills, whether personal or anecdotal, shows that testees view interpreting as an ability beyond that of simply knowing two languages. Answers to ethics questions indicate whether testees can conceptualize the interpreter’s role as one in which professional as well as moral standards apply.
Additional skills were of less importance, as the training presumed no previous knowledge of note-taking or specialist terminology. Although interpreters employ oral/aural skills far more than the macro-skills of reading and writing, community interpreters in Australia are required to acquaint themselves with a great deal of written information in a variety of fields. Even where a testee’s LOTE has no formal orthography or writing system, Australia requires interpreters to have basic writing skills in English for training and employment. To allow exercises to be modelled in both languages, we attempted, where possible, to include at least two speakers of each LOTE, so that trainees could use both languages with another trainee in role-play activities.
In Geelong, 16 of the 21 testees were accepted to the training. As stated, selection was based on demonstrated language level in English and demonstrated as well as inferred level of interest and motivation toward potential training. Nine of the 16 were speakers of new and emerging languages (see Appendix). The Geelong trainees were between 32 and 57 years old (average age = 41), and length of residence in Australia varied from 1 year to an entire lifetime (average length of residence = 18 years). Almost all trainees had extensive employment histories; many had been or were currently employed in geriatric care, education, social work, and business and retail. Table 1 below presents the key characteristics of those accepted for training in Geelong.
Table 1. Testees accepted for training—Geelong.
| Languages | Age | Length of residence (years) | Education level | Occupation | Previous experience | Note-taking skills | Prof. with terminologies |
|---|---|---|---|---|---|---|---|
| Turkish | 49 | 34 | Dip. Arts, Dip. Ed., Assoc. Dip. – IT – Aust. | Casual relief teacher | Informal – family | Yes. Good | Very high |
| Macedonian. Also Serbian, Croatian | 43 | 23 | Yr 12 – Macedonia | Aged care worker | Informal – family | Fair, good | Good |
| Croatian | 52 | 41 | Yr 11 – Aust. Cert. III – Asset management | Accounts clerk | Informal – family | Did well in consec. exercise | Good |
| Croatian | 54 | 51 | Yr 11 – Aust. | Unemp. aged care worker | Informal – family | Good | Fair |
| Croatian. Also Serbian | 52 | 50 | Yr 11 – Aust. Cert I, II in Pharmacy | Pharmacy assistant | Informal – customers, family | Good | Fair |
| Serbian. Also Croatian | 34 | Aust. born. Returned to Serbia aged 1½. Back to Aust. ’95 | Yr 12 – Serbia. Cert. IV in Nursing | Aged care worker | Informal – family. Residents at aged care | Very good | Very good |
| Italian | 54 | 40 | Yr 10 – Aust. | Housekeeper | Informal – family | Good | Fair |
| Pushto. Also Dari, Hazargi, Urdu | 33 | 9 | Sec. school – Pakistan, Cert II – Transport & Logistics | Transport, logistics | Informal – family | Very good | Good |
| Sudanese Arabic, Shilluk | 49 | 6 | Limited primary | Aged care worker | Informal – Sudanese youth | Poor | Fair |
| Persian, Dari | 43 | 4 | Yr 12 – Iran. Nursing degree – Iran | Teacher at Aust. school | Informal – family. Family – teacher | Good | Good |
| Persian, Dari | 46 | 2 | BSc – Iran | PhD student | Informal | Good | Good |
| Persian, Arabic | 38 | 8 | Yr 12 – Iran, B Acc – Iran | Accounting assistant | Informal | Not tested | Good |
| Spanish | 32 | 1 | B. Comm. – Spain | Volunteer – Heart Foundation | Informal | Good | Good |
| Burmese (Karen) | 47 | 1 | BSc – Burma | Interpreter – welfare | Formal – Centrelink | Poor | Fair |
| Ewe. Also French, Mina, Goun | 44 | 2 | Yr 10 – Togo | Unemployed. Plumber | Informal – community | Poor | Unclear. Poor |
| Ewe, Mina, Watchi, Twi, French | 55 | 2 | Yr 11 – Togo. Cert II, Community services | Student | Formal, informal | Not tested | Fair |
In Morwell, nine of the 12 testees were accepted. All but one spoke at least one of the prioritized new and emerging languages. Trainees were between 18 and 42 years old (average age = 31). Length of residency in Australia ranged from 2 to 6 years (average length = 4 years). Many were tertiary (i.e., university) students, meat packers, or laborers.
Table 2. Testees accepted for training—Morwell.
| Language/s | Age | Length of residence in Aust. (years) | Educational level | Occupation | Previous experience | Note-taking skills | Prof. with terminologies |
|---|---|---|---|---|---|---|---|
| Arabic | 21 | 5 | Yr 12 – Aust., Cert IV – Bus. Admin | HR officer | Informal | Good | Good |
| Dinka, Sudanese Arabic | 37 | 6 | Completing: B Soc. and Comm. Welfare | Multilingual teacher aide | Formal. Schools | Fair | Good. Knowledge of limits |
| Nuer, Sudanese Arabic, Amharic | 36 | 2 | Yr 10, Cert III Aged care | Student | None | Fair | Fair |
| Dinka, Sudanese Arabic, Kiswahili | 35 | 4 | Yr 11. Cert II Disability, Comm. works | Multicultural education aide | Formal. Schools | Good | Good |
| Nuer | 35 | 5 | Yr 11. Cert II, IV | Student | Informal | Not good | Fair |
| Nuer | 18 | 4 | Currently Yr 11 | Student | Informal | Fair | Good |
| Dinka, Sudanese Arabic | 37 | 4 | Yr 9, Cert IV. Disability Services | Meat packer | Informal, formal. Kindergarten | Fair | Fair |
| Nuer, Sudanese Arabic, Amharic | 41 | 6 | Yr 10, Cert IV. Mental Health, Dip. Youth Work | Student support worker | Formal. NGOs, refugee camps | Fair | Fair |
| Nuer, Amharic, Sudanese Arabic | 42 | 3 | Yr 12 – Ethiopia, Hospitality course | Cleaner | Informal | Poor | Good |
5. Relation of Test to Training
Section 4 above described how the intake test sought to ascertain minimum capabilities. In regard to training, the test also sought to do the following: first, to gauge personal profiles and ability levels to see what applicants could do, allowing for cross-comparison of abilities and an indication of how homogeneous a group the selected trainees would be; second, to ascertain strengths and weaknesses, with the intention that the training would particularly address the latter; and third, and least important, to solicit evidence of performance against which exit test performance could be compared.
To structure the training, we used as a basis a course outline (conforming to the overall guidelines for the training set by the Victorian Multicultural Commission) that contained an exhaustive and maximalist list of features. Specifically, training developed skills in the following areas:
- Basic skills in dialogue interpreting, including competence in active listening, memory retention, paraphrasing, summarizing, note-taking, and accurate transfer between English and LOTE (test elements: listening and note-taking, speaking, translation exercises).
- An understanding of the role and responsibilities of an interpreter working in community domains (test elements: knowledge of interpreting skills).
- Skills in researching terminology and preparing for assignments (test elements: knowledge of specialist language and terminology).
- An understanding of the Australian Institute of Interpreters and Translators (AUSIT) Code of Ethics and of how interpreters should handle ethical dilemmas and intercultural challenges (test elements: ethics questions, knowledge of cultural practices of speakers of LOTE and Australian English, communicative pragmatics).
- Knowledge of pathways for further training and qualifications (test elements: educational level and occupational experiences).
5.1. Delivery of Training
We offered the training course in basic interpreting skills in two regional locations concurrently. Training consisted of 30 hr delivered over five 6-hr blocks on alternating Saturdays from late September to early December 2010. Five different instructors led the training. The curriculum was designed to implement the objectives outlined above and was based on the needs analysis undertaken in light of intake test results. For example, we found that overall English proficiency, particularly literacy, was significantly higher among the cohort at one location than at the other. On average, the length of time living in Australia was also greater for the former cohort, and educational levels were higher. This information was useful for the curriculum designers, who were able to tailor the training package accordingly: For the group of relatively new arrivals, written materials provided in the course workbook were simplified, as were the content of linguistic exercises and the final assessment, and more introductory information on Australian social systems was incorporated. In addition to its importance for curriculum design, the intake test assisted instructors by providing a profile of the trainees, which enabled them to pitch the content, discussions, and activities at an appropriate level and to judge a suitable pace for the training delivery. The information was also used to choose topics and culturally appropriate examples that trainees could relate to, thereby facilitating greater engagement with the training.
The course content delivered in the 30 hr of class time included:
- Introduction to what interpreters do, the different modes of interpreting, and the situation of community interpreting in Australia/Victoria, including training pathways and accreditation.
- The role of the interpreter, ethical requirements including the AUSIT Code of Ethics, and ways to deal with ethical challenges.
- Linguistic exercises to build interpreting skills such as memory training, accurate repetition, listening comprehension, paraphrasing, summarizing, note-taking, and shadowing. These were introduced as monolingual (English) activities and then progressed to activities that involved linguistic transfer (LOTE<>English).
- Preparing for interpreting assignments by building research skills and creating glossaries.
- Dialogue-interpreting techniques related to seating, turn-taking, using the first person, eye contact, controlling the pace and flow of the conversation, asking for clarification, the attitude of the interpreter, and cross-cultural communication issues.
- Role-play activities of dialogues in community settings involving both monolingual memory tasks and bilingual interpreting practice.
- Sociocultural contexts and challenges of interpreting in community settings in the health care, legal, social security, and other domains. This information was then linked to the role-play activities.
To supplement the face-to-face sessions, trainees were also required to undertake self-study tasks from a course workbook, which were discussed in class the following week. These included building glossaries, researching case studies on community interpreting settings and presenting them in class, answering questions posing ethical dilemmas, practicing interpreting from CDs with bilingual dialogues in (most of) their languages, preparing for role-plays, and reading articles on community interpreting. A variety of written information and links were provided to assist with self-study.
6. Feedback From Trainers About Intake Test Content and Allocation
The intake test was designed by staff members who are practicing interpreters and who have extensive experience teaching both interpreting courses at the postgraduate university level and language courses in adult, postsecondary vocational educational institutions. Materials from existing intake tests, however, could be used only to a limited extent, as we could not use exercises such as paraphrasing or written or spoken consecutive interpreting dialogues eliciting responses in both languages. Conversely, the inability to test LOTE skills meant that a significant part of the test needed to elicit detailed information about self-reported functional abilities in the LOTE. Further questions in the intake test also functioned to check self-reported abilities that had been elicited elsewhere. Our intake test also emphasized questions about future plans and motivations; such questions are redundant in intake tests for fee-paying, university-level courses.
Logistic and staffing constraints meant that the intake test and course design were completed by staff members other than those who delivered the training. Although such an arrangement may be disadvantageous for the cohesion of test and training, it has the advantage of allowing trainers to examine the test free of the need to defend “their test.” Trainers were presented with a table containing a list of the sections of the test, divided in a way similar to the ordering of Sections 4.1 to 4.10 above, that is, “educational level and occupational experience,” “English language level,” “LOTE language level,” and so on. Trainers were asked, “How important were these things for admission to training?” and were invited to provide responses on a 5-point Likert scale with the following degrees of quantification: very important, important, neither/nor, not so important, and not at all important. Table 3 presents the three trainers’ combined and averaged responses. For readability, total averaged responses are allocated to the closest whole response.
Table 3: Responses from trainers about intake test content and training content
Profile / Ability / Skill | Section of entrance test and question nos. | Importance of these elements for admission to training? | ||||
1 | 2 | 3 | 4 | 5 | ||
Education, Employment | General education: primary, secondary, vocational, tertiary (Q. 5–9) | X | ||||
Employment experience (Q. 16, 17, 24) | X | |||||
English language | Level of English, including evidence of acquisition in formal settings (Q. 10-13) | X | ||||
Subjective assessment of strengths / weaknesses in English (Q. 14, 15) | X | |||||
LOTE | Information about LOTE, circumstances of its acquisition and use (Q. 2-4, 20-23) | X | ||||
Terminology | Knowledge of specialist language and terminology (Q. 19, 20-23) | X | ||||
Knowledge of interpreting skills, ethics | Knowledge of interpreting skills (Q. 18, 26, 27) | X | ||||
Ethics questions (Q. 28-30) | X | |||||
Motivation | Enthusiasm, motivation to become a well-skilled interpreter (Q. 24, 25) | X | ||||
Reading and writing | Reading for specific information (Reading test 1, 2) | X | ||||
Writing a narrative (Writing test 1) | X | |||||
Listening, note-taking and memorization | Listening for gist and specific information (Listening test 1, 2) | X | ||||
Note-taking and memorization (Q. 17, Listening tests 1, 2) | X | |||||
Speaking and communicative pragmatics | Clarity and ease of expression, fluency, pronunciation, volume, etc. (cumulatively assessed throughout spoken interactions) | X | ||||
Translation exercises | LOTE into English translation (select tests from AEC or DHS Web sites) | X |
Note. 1 = very important; 2 = important; 3 = neither/nor; 4 = not so important; 5 = not at all important.
The feedback from the trainers included both expected and unexpected responses. Trainers rated level of English, in particular clear and fluent speaking skills, as well as listening skills and the ability to listen for particular information, as very important. These features are generally minimum or “threshold” capabilities that determine admission to the course as well as active and successful participation in training. Trainers also rated knowledge of interpreting skills as a very important feature. This response was unexpected, and it reflects trainers’ assessment that skills specific to interpreting (for example, physical arrangements in a triangle, direct speech versus oblique oration, and speech flow and length) are important parts of training. Trainers also listed educational level as important but previous occupational experience as unimportant. It appears that trainees’ aptitude in learning new skills through a pedagogic approach (one that may have been unfamiliar to some) was determined more by a higher educational level than by any particular previous work experience.
Although trainers were unable to ascertain trainees’ knowledge and use of the LOTE, LOTE proficiency was rated as an important factor in trainees’ ability to attempt interpreting role-play activities or to interpret into English texts played to them aurally. Prior subjective assessments of trainees’ strengths and weaknesses in English were of importance only where weaknesses impinged on trainees’ ability to engage successfully in certain activities; as trainers explained at the first session, the training was not intended to be a means for trainees to improve their English language skills. Building on listening skills, note-taking and the capacity to develop memorization skills were also rated as important; note-taking was one of the most widely practiced activities in the training. Unsurprisingly, reading and writing as skills elicited in isolation were not rated as important for admission. Reasonable literacy skills are presumed, however, for a number of activities, for example, reading about ethics, reading role-play dialogues, compiling glossaries, and reading texts on interpreting in the health, legal, education, immigration, and social security domains, among others.
7. Feedback From Trainees
The 16 trainees at Geelong and 9 at Morwell completed a variety of final assessment tasks and course feedback forms in the last session of the last day of training. Eleven Geelong trainees and five Morwell trainees completed the optional survey about the content of the intake test and its relationship to the training. As outlined above in Section 3, trainees were given a blank paper copy of the intake test to reacquaint themselves with its content. Trainees were asked to think about the content of the training and then to consider whether each of the 10 components of the test was important for them in commencing and undertaking the training. The question posed to trainees was "How important were these for admission to the training?" Trainees recorded responses on a 5-point Likert scale from 1 (very important) through 3 (neither important nor unimportant) to 5 (not at all important). Trainees were not required to give their names, and trainers were not present when trainees completed the survey. Responses, combined and averaged to the closest whole number, are set out below in Table 4.
Table 4: Responses from trainees about intake test content and training content
Profile / Ability / Skill | Section of entrance test and question nos. | Importance of these elements for admission to training
 | | 1 | 2 | 3 | 4 | 5
Education, employment | General education: primary, secondary, vocational, tertiary (Q. 5-9) | X | | | |
 | Current and previous employment (Q. 16, 17, 24) | X | | | |
English language | My level of English (Q. 10-13, 14, 15) | X | | | |
LOTE | My level of LOTE (Q. 2-4, 20-23) | X | | | |
Terminology | Knowledge of specialist language and terminology (Q. 19, 20-23) | X | | | |
Knowledge of interpreting skills, ethics | Knowledge of interpreting skills (Q. 18, 26, 27) | X | | | |
 | Ethics questions (Q. 28-30) | X | | | |
Motivation | My attitude and level of motivation (Q. 24, 25) | X | | | |
Reading and writing | Reading for specific information (Reading tests 1, 2) | X | | | |
 | Writing a narrative (Writing test 1) | X | | | |
Listening, note-taking and memorization | Listening for gist and specific information (Listening tests 1, 2) | X | | | |
 | Note-taking skills and memorization (Q. 17, Listening tests 1, 2) | X | | | |
Speaking and communicative pragmatics | Clarity and ease of expression, fluency, pronunciation, knowing how to communicate clearly with English and LOTE speakers | X | | | |
Translation exercises | LOTE into English translation (select tests from AEC [2010] or DHS [2009] websites) | | | | | X
Note. 1 = very important; 2 = important; 3 = neither/nor; 4 = not so important; 5 = not at all important.
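A brief illustration of the aggregation described above may be useful: the individual Likert ratings for each test component are pooled across respondents, and the mean is rounded to the closest whole number, yielding the single column in which X is placed. The following Python sketch illustrates this procedure only; the component labels and rating values in it are hypothetical, not the actual survey data, which were collected anonymously on paper.

```python
from statistics import mean

# Hypothetical Likert responses (1 = very important ... 5 = not at all
# important), one entry per respondent (11 Geelong + 5 Morwell = 16).
responses = {
    "Listening for gist and specific information":
        [1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 2, 1, 1, 1],
    "Writing a narrative":
        [4, 3, 4, 5, 4, 3, 4, 4, 5, 3, 4, 4, 3, 4, 4, 5],
}

for component, ratings in responses.items():
    # Combine across both cohorts and average to the closest whole number,
    # giving the single rating column marked with X in the table.
    # (Python's round() uses round-half-to-even for exact .5 ties.)
    averaged = round(mean(ratings))
    print(f"{component}: averaged rating = {averaged}")
```

The same arithmetic applies to the trainer responses reported in Table 3.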
There is great similarity between trainers' and trainees' responses about the relative importance of the various sections of the intake test. Unsurprisingly, trainees rated their LOTE proficiency as being just as important as their English proficiency for acceptance into the training; trainers, of course, were unable to ascertain trainees' LOTE levels. Trainees also rated their current or previous employment as more important for their admission than trainers did, and they rated terminology as more important than trainers did. This, too, is unsurprising, as trainees may have had little prior need to consider the particular features of language use in specialist contexts. Listening skills were predictably rated as important, whereas reading and writing skills were considered less so. Written translation exercises did not figure in the training, so their importance as a test feature was also not rated highly.
8. Conclusions and Findings
This article examined the design and implementation of an intake test that specifically sought to elicit demonstrations of skill levels, as well as information indicative of skill levels that could not be directly tested. This led to the design of a sizeable test, containing over 30 questions, tables, and exercises, that required between 2 and 2.5 hr to complete. In community interpreter training, often only one of a trainee's two languages can be systematically tested, and trainers usually can monitor performance in simulated or role-play activities in only one language. Detailed questioning, often with multiple questions targeting the same skill, is therefore not only justifiable but essential.
The test also contained many questions that sought to elicit testees' understanding of the interpreting profession, including questions built around hypothetical scenarios problematizing the interpreter's role. These questions sought to discover testees' prior knowledge about interpreting and to check, indirectly, whether they had attempted to find out about the profession in the absence of any formal or informal interpreting experience. Trainers reported that, at intake level, these questions had little bearing on training content and trainee involvement, because the training itself included explanation and situational modelling of basic interpreting techniques and concepts, so that prior knowledge was neither essential nor expected.
Both trainers and trainees identified education level as an important indicator of trainees' suitability for training and of their capacity to engage successfully with it; both groups reported education level to be more important than employment history.
Overall, in determining the suitability and likely success of applicants to community interpreter training, skills activities such as reading and reading comprehension, writing, sight translation, and written translation exercises are of limited value. Intake tests for community interpreter training programs should therefore instead take care to elicit applicants' education level and to test performance in the key areas of listening, speaking, and communicative pragmatics.
9. References
Australian Electoral Commission (AEC). (2010). Translated information and telephone interpreter service. Retrieved from http://www.aec.gov.au/About_AEC/Translated_information/
Bell, S. (1997). The challenges of setting and monitoring the standards of community interpreting. In S. Carr, R. Roberts, A. Dufour, & D. Steyn (Eds.), The Critical Link: Interpreters in the community (pp. 93–108). Amsterdam, the Netherlands: John Benjamins.
Beltran Avery, M. P. (2003). Creating a high-standard, inclusive and authentic certification process. In L. Brunette, G. Bastin, I. Hemlin, & H. Clarke (Eds.), The Critical Link 3: Interpreters in the community (pp. 99–112). Amsterdam, the Netherlands: John Benjamins.
Bernstein, J., & Barbier, I. (2000). Design and development parameters for a rapid automatic screening test for prospective simultaneous interpreters. Interpreting, 5, 221–238.
Bowen, D., & Bowen, M. (1989). Aptitude for interpreting. In L. Gran & J. Dodds (Eds.), The theoretical and practical aspects of teaching conference interpretation (pp. 109–125). Udine, Italy: Campanotto.
Carr, N. (2006). The factor structure of test task characteristics and examinee performance. Language Testing, 23, 269–289.
Chesher, T., Slatyer, H., Doubine, V., Jaric, L., & Lazzari, R. (2003). Community-based interpreting: The interpreters’ perspective. In L. Brunette, G. Bastin, I. Hemlin, & H. Clarke (Eds.), The Critical Link 3: Interpreters in the community (pp. 273–291). Amsterdam, the Netherlands: John Benjamins.
Clifford, A. (2005). Putting the exam to the test: Psychometric validation and interpreter certification. Interpreting, 7(1), 97–131.
Corsellis, A. (2008). Public service interpreting: The first steps. Houndmills, UK: Palgrave Macmillan.
Department of Human Services (DHS), State Government of Victoria. (2009). Health translations directory. Retrieved from http://www.healthtranslations.vic.gov.au/
Eyckmans, J., Anckaert, P., & Segers, W. (2009). The perks of norm-referenced translation evaluation. In C. Angelelli & H. Jacobson (Eds.), Testing and assessment in translation and interpreting studies (pp. 73–94). Amsterdam, the Netherlands: John Benjamins.
Gentile, A., Ozolins, U., & Vasilakakos, M. (1996). Liaison interpreting. Melbourne, Australia: Melbourne University Press.
Gerver, D., Longley, P., Long, J., & Lambert, S. (1989). Selection tests for trainee conference interpreters. Meta, 34, 724–735.
Hale, S. (2007). Community interpreting. Basingstoke, UK: Palgrave Macmillan.
Hatim, B., & Mason, I. (1997). The translator as communicator. London, UK: Routledge.
Lambert, S. (1991). Aptitude testing for simultaneous interpretation at the University of Ottawa. Meta, 36, 586–594.
Lascar, E. (1997). Accreditation in Australia: An alternative means. In S. Carr, R. Roberts, A. Dufour, & D. Steyn (Eds.), The Critical Link: Interpreters in the community (pp. 119–130). Amsterdam, the Netherlands: John Benjamins.
Lee, J. (2008). Rating scales for interpreter performance assessment. The Interpreter and Translator Trainer, 2(2), 165–184.
Lotriet, A. (2002). Can short interpreter training be effective? The South African Truth and Reconciliation Commission experience. In E. Hung (Ed.), Teaching translation and interpreting 4 (pp. 83–98). Amsterdam, the Netherlands: John Benjamins.
Michael, S., & Cocchini, M. (1997). Training college students as community interpreters. In S. Carr, R. Roberts, A. Dufour, & D. Steyn (Eds.), The Critical Link: Interpreters in the community (pp. 237–248). Amsterdam, the Netherlands: John Benjamins.
Mikkelson, H., & Mintz, H. (1997). Orientation workshops for interpreters of all languages: How to strike a balance between the ideal world and reality. In S. Carr, R. Roberts, A. Dufour, & D. Steyn (Eds.), The Critical Link: Interpreters in the community (pp. 55–64). Amsterdam, the Netherlands: John Benjamins.
Moser-Mercer, B. (1985). Screening potential interpreters. Meta, 30, 97–100.
Niska, H. (2005). Training interpreters: Programmes, curricula, practices. In M. Tennent (Ed.), Training for the new millennium: Pedagogies for translation and interpreting (pp. 35–64). Amsterdam, the Netherlands: John Benjamins.
Penney, C., & Sammons, S. (1997). Training the community interpreter: The Nunavut Arctic College experience. In S. Carr, R. Roberts, A. Dufour, & D. Steyn (Eds.), The Critical Link: Interpreters in the community (pp. 65–76). Amsterdam, the Netherlands: John Benjamins.
Pippa, S., & Russo, M. (2002). Aptitude for conference interpreting: A proposal for a testing methodology based on paraphrase. In G. Garzone & M. Viezzi (Eds.), Interpreting in the 21st century: Challenges and opportunities (pp. 245–256). Amsterdam, the Netherlands: John Benjamins.
Pöchhacker, F. (2004). Introducing interpreting studies. London, UK: Routledge.
Quinn, T. (1993). The competency movement, applied linguistics and language teaching: Some reflections and suggestions for a possible research agenda. Melbourne Papers in Language Testing, 2(2), 55–87.
Roberts, R. (2000). Interpreter assessment tools for different settings. In R. Roberts, S. Carr, D. Abraham, & A. Dufour (Eds.), The Critical Link 2: Interpreters in the community (pp. 103–120). Amsterdam, the Netherlands: John Benjamins.
Spanish-translation-help.com. (n.d.). How to become an interpreter. Retrieved from http://www.spanish-translation-help.com/how-to-become-an-interpreter.html
Straker, J., & Watts, H. (2003). Interpreter training for students from refugee backgrounds. In L. Brunette, G. Bastin, I. Hemlin, & H. Clarke (Eds.), The Critical Link 3: Interpreters in the community (pp. 163–176). Amsterdam, the Netherlands: John Benjamins.
Timarová, Š., & Ungoed-Thomas, H. (2008). Admission testing for interpreting courses. The Interpreter and Translator Trainer, 2, 29–46.
Turner, B., Lai, M., & Huang, N. (2010). Error deduction and descriptors: A comparison of two methods of translation test assessment. Translation and Interpreting, 2(1), 11–23.
Valero Garcés, C. (2003). Responding to communication needs: Current issues and challenges in community interpreting and translating in Spain. In L. Brunette, G. Bastin, I. Hemlin, & H. Clarke (Eds.), The Critical Link 3: Interpreters in the community (pp. 177–192). Amsterdam, the Netherlands: John Benjamins.
Walters, F. S. (2007). A conversation-analytic hermeneutic rating protocol to assess L2 oral pragmatic competence. Language Testing, 24, 155–183.
Wylie, E., & Ingram, D. (1999). International second language proficiency ratings. Brisbane, Australia: Centre for Applied Linguistics and Languages, Griffith University.
Appendix
Information flyer from the Victorian Multicultural Commission for regional interpreter training