
Training Interpreters in the Public Schools:

The TIPS Process

Doug Bowen-Bailey
Digiterp Communications
Patty Gordon
St. Catherine University
Dr. Bernhardt Jones
TASK12 Project at Utah State University
Laurie Shaffer
University of Virginia
A lack of qualified interpreters in mainstream classrooms is a challenge facing many school districts. To address this, the Training Interpreters in Public Schools (TIPS) program supports interpreters working to meet state-determined Educational Interpreter Performance Assessment (EIPA) certification requirements. The program is an outgrowth of Training and Assessment Systems for K-12 Educational Interpreters (TASK12). TIPS has used a data-driven approach to developing online modules that increase interpreters’ knowledge and skills, leading to more effective educational access for students in mainstream classrooms. This paper explains how the TIPS program was developed and how it is delivered through both a face-to-face orientation session and online modules. Additionally, it discusses results for participants and the potential implications for interpreter education.
The practice of including deaf and hard of hearing students in a wide variety of educational environments requiring the presence of interpreters presents a significant challenge for the field of interpreter education. As standards are established regarding minimum competencies for interpreters working in these settings, the demand for qualified interpreters far surpasses the supply. In the face of this challenge, the Training and Assessment Systems for K-12 Educational Interpreters (TASK12) Project works both to evaluate interpreters in K-12 settings and to provide research-based training that supports them in meeting standards and delivering a higher level of service to students. The Training Interpreters in Public Schools (TIPS) program is one effort to accomplish this goal. This paper focuses on the development of the TIPS program, its methodology and approaches, an evaluation of its success to date, and a discussion of the implications for both interpreter and deaf education.
Background to TIPS program
The current mission of the TASK 12 project is:
to provide valid and reliable evaluation of K-12 educational interpreters who serve deaf and hard of hearing students in educational environments throughout the member states and to design from interpreter evaluation results appropriate training to improve qualifications of K-12 interpreters throughout TASK12 states.
Initially, the project was the Assessment Services K-12 (ASK12) project and its focus was solely on assessment.  Dr. Bernhardt Jones, the director of the project, focused on proctoring the Educational Interpreter Performance Assessment (EIPA) for interpreters in the 14 states served by the project.  Using the EIPA, developed by Dr. Brenda Schick and Kevin Williams, and administered by the Boys Town National Research Hospital, the project initially documented the level of competence of interpreters working in K-12 settings.
However, the ASK12 Project determined that simply assessing interpreters was not a large enough mission. Unfortunately, the evaluation results often demonstrated that interpreters were inadequately prepared to work in classrooms. So, the project added training to both its mission and its name, using the data collected as the basis for an educational component designed to meet the specific needs shown in the EIPA results.
In March 2009, Dr. Jones contacted Patty Gordon and Doug Bowen-Bailey to create a targeted training based on the results of the EIPA evaluations the TASK12 project had administered.  Because the majority of the states served by the project use 3.5 as the standard, the decision was made to design the program to focus on educational interpreters who scored between a 3.0 and 3.4 on the EIPA. Additionally, it was clear that the training needed to be primarily delivered in an online format since the TASK12 project serves such a large geographical area – and that many of the interpreters who would be potential candidates for the program might live in more rural settings.
As stated in the TASK12 mission statement, the training needed to be designed based on “interpreter evaluation results,” so Gordon and Bowen-Bailey first examined trends in the data collected from EIPA tests administered by the project for interpreters who scored in the 3.0-3.4 range.
To understand what served as the basis for TIPS, it is important to give a brief description of the EIPA itself. The EIPA offers different test versions based on the deaf student’s language use and educational level: an interpreter selects stimulus material using ASL/PSE, PSE/ASL, or MCE/PSE, at either the elementary or secondary level.
The interpreter’s abilities are rated in the following four domains:

  1. Grammatical skills: Use of prosody (or intonation), grammar, and space.
  2. Sign-to-voice interpreting skills: Ability to understand and convey child/teen sign language.
  3. Vocabulary: Ability to use a wide range of vocabulary, with accurate use of fingerspelling and numbers.
  4. Overall abilities: Ability to represent a sense of the entire message, use appropriate discourse structures, and represent who is speaking.

Interpreters then receive scores between 0 and 5 in 38 different skill areas distributed across the four domains.
In looking at the results collected by the TASK12 project of 226 candidates who scored in the 3.0-3.4 range, Gordon and Bowen-Bailey were able to identify the specific skills that had the lowest average scores.  The findings are shown in Table 1.
Table 1
Lowest average/mean scores for interpreters with a rating of 3.0-3.4

Skill area Domain Score
Follows Principles of Discourse Mapping IV:  Overall abilities 1.9
Production and use of non-manual adv/adj markers I:  Grammatical skills 2.3
Appropriate use of Fingerspelling III:  Vocabulary 2.4

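The analysis behind Table 1 can be sketched in a few lines of code. The following is an illustrative sketch only, not the project’s actual analysis: the candidate scores below are hypothetical stand-ins chosen so that the per-skill averages happen to match the table, and the helper name `lowest_average_skills` is our own invention.

```python
# Given per-candidate EIPA skill scores (0-5 scale), compute the mean
# for each skill area and rank skills from lowest to highest average.
from statistics import mean

# Hypothetical data: each candidate maps skill area -> score.
candidates = [
    {"Discourse Mapping": 2.0, "Non-manual adv/adj markers": 2.5, "Fingerspelling": 2.0},
    {"Discourse Mapping": 1.8, "Non-manual adv/adj markers": 2.1, "Fingerspelling": 2.8},
]

def lowest_average_skills(candidates):
    """Return (skill, average) pairs sorted from lowest to highest mean score."""
    skills = candidates[0].keys()
    averages = {s: mean(c[s] for c in candidates) for s in skills}
    return sorted(averages.items(), key=lambda kv: kv[1])

for skill, avg in lowest_average_skills(candidates):
    print(f"{skill}: {avg:.1f}")
```

With real test data, the same ranking directly identifies the weakest skill areas, which is how the project selected its module topics.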
Since the plan was to offer three online modules to prepare for an exit exam using the EIPA, we initially considered focusing one module on each of these skill areas.  However, in discussion with Kevin Williams, who was one of the creators of the EIPA, we soon realized that it was important to have an introductory module that focused on the unique factors for interpreters working in educational settings.
Based on Williams’ advice, Gordon and Bowen-Bailey selected Fingerspelling and Discourse Mapping as the two skill areas for the subsequent modules. Fingerspelling was selected because, while it is a more discrete skill, it is significant in supporting students’ development of literacy. According to Williams, many interpreters who take the EIPA simply do not fingerspell or do not demonstrate the ability to identify key vocabulary that needs to be represented in multiple ways, including fingerspelling, to reinforce students’ language acquisition.
Of the remaining two indicators, Gordon and Bowen-Bailey chose Discourse Mapping because of its significance to the overall clarity of an interpreted message. The skill of representing ideas appropriately in space, with strong prosodic features, is what makes an interpreted message make sense to a student. This skill develops late in the second-language learning process, and many interpreters do not have long enough exposure to this feature of ASL to internalize it fully before beginning their careers. Because this was the skill area with the lowest average score, our hypothesis was that improvement here would make a significant impact both on interpreters’ EIPA scores and on the ability of the students working with those interpreters to understand content within the classroom.
In developing the modules, Bowen-Bailey and Gordon invited experts from our field to share insights and perspectives for inclusion in the modules.  The modules drew on the expertise of the following:

The TASK12 project remains extremely grateful to these individuals for being willing to share their insights and perspectives as part of this program.
Program Delivery
The TIPS program uses the delivery of the EIPA as a screening tool for identifying potential participants. Individuals taking the EIPA sign a release sending their diagnostic workup to TASK12 so that a report with recommendations for improving the interpreters working in its public schools can be generated for each TASK12 state. From these results, individuals scoring within the range of 3.0-3.4 are individually invited to participate in a TIPS cohort. Once invited interpreters decide to be a part of the program, they go through five components.
Face-to-Face Orientation: The program begins with the group coming together in one location.  This session focuses on developing a sense of community that will carry over into the online experience as well as beginning to support people with being comfortable with the technology used in delivering the TIPS program.
Module 1: Interpreting Educational Discourse:  This module provides an overview of the unique nature of language use in classroom settings.
Module 2: Fingerspelling in Education:  This module focuses on the important role that fingerspelling plays in developing literacy skills for Deaf and hard of hearing students.  For interpreters, the focus is more on why and when to fingerspell than on how to fingerspell.
Module 3: Discourse Mapping for Education:  This module focuses on the ways that ideas are connected together within discourse to guide the audience through its understanding.
Exit Exam:  Participants retake the EIPA after completing the modules.
The modules are delivered through Moodle, open-source course-management software that allows for online readings, lessons, discussion forums, and text chats. In addition, Adobe Acrobat Connect Pro is used to provide two real-time video chats during each module. All modules are approximately five weeks in length and consist of background readings and videos, followed by activities to put this new learning into practice.
These activities build on each other using authentic classroom stimulus materials.  For the Interpreting Educational Discourse module, we use a segment from a 4th grade classroom with a classroom teacher introducing different geometric shapes.  For the Fingerspelling module, we use a portion of a lecture from a high school anatomy course focused on neurons.  The discourse mapping module uses a longer segment of this anatomy lesson.
It is critical to use such authentic source material because it provides the opportunity to examine genuine classroom discourse. Additionally, these materials were selected to allow participants to build the specific skills needed to raise scores in both fingerspelling and discourse mapping on the EIPA.
As part of the activities, participants create a sample interpretation, and record it to video that is then uploaded as an unlisted YouTube video so others in the course can view it and take part in dialogue about the work.
Creating a Dynamic Learning Environment Online
A challenge for any online educational experience is how to create a collaborative environment in which learners can work together to construct meaning (Hiltz, 1998). Designers and facilitators of a distance learning platform may have superb technology and cutting-edge instructional design, but it will not be a successful experience if the interest, commitment, and engagement of the participants wane. The TIPS staff gives serious consideration to this challenge and, as a result, has seen a very low attrition rate.
TIPS has two facilitators for approximately 20-25 participants. Though this ratio increases the cost of the program, it allows facilitators and participants to develop a relationship that fosters individual learning and maximum participation. As mentioned previously, each cohort begins with an all-day, on-site face-to-face session designed to give everyone an opportunity to review all the program components and expectations as well as to test out all the associated technologies. In addition, the face-to-face session also begins to develop a sense of community that can be maintained and deepened online. In our experience, having this time together at the beginning positively affects participants’ interactions with each other in the online environment.
Prior to the on-site orientation with a given cohort, participants are asked to post a video and a photo of themselves on the course site. The facilitators do the same. The video serves two purposes: to allow the cohort to begin to get to know each other and the program staff, and to give the participants an opportunity to engage with the technology associated with the modules. Facilitators and/or Doug Bowen-Bailey, as the technology specialist, respond to members of the cohort to troubleshoot any issues that arise. Initially these videos were made during the on-site workday. However, in the interest of time and cost, the team opted to move this component to the pre-program time frame and reduce what had been two workdays to one. Any remaining technological questions are addressed on-site.
Once the face-to-face orientation is complete, the process moves to a primarily asynchronous environment. As participants work through the modules in Moodle, distance and technology can either enhance or degrade the human connectivity of the cohort. One way to encourage consistent participation is for facilitators to “mold, model, and encourage desired behavior” (Hiltz, 1998, p. 7). Expectations for participants, such as the number of comments posted weekly on the course site, the completion of readings and activities, and the number of hours logged into the site, allow facilitators to monitor both the quantity and quality of participation. In turn, facilitators commit to consistent responsiveness to all efforts on the part of participants. The staff goes to great lengths to strike a balance between acknowledging and responding to the work of participants and leaving room for participants to interact with each other. TIPS project staff work as an extended team, offering personal and technical support throughout a given week within the course site and, when needed, via email and phone. We aim to create a sense of cyber-community that supports group and individual learning.
During each module, the program facilitators schedule two live chats. These are optional and are offered to allow for open-ended discussion of material and activities from the modules, of the EIPA, or of the challenges present in the participants’ immediate work environment. Chats are offered via Adobe Connect Pro. The participants are able to both see and hear the facilitators. Again, this supports the human aspect of the program, allowing the cohort to re-connect as people.
Results of the Program
One of the unique aspects of the TIPS program is the degree of evaluation built into it. The program collects more subjective measures of participant satisfaction and perception through pre- and post-assessments of each module. We are also able to use a more objective measurement: results on the EIPA both before and after participation in TIPS. This section describes both types of assessment.
Qualitative feedback from participants indicates satisfaction with the content and facilitation of the modules. Participants regularly report on how they are integrating their learning into their daily work. Often they find that the students they interpret for are more attuned to the interpretation, gain interest in partnering with the interpreter for problem-solving, and show immediate progress in the classroom.
Negative feedback is often related to the delivery system of the course. Participants find the time commitment a challenge, particularly over the span of all three modules. Life events interfere with their ability to stay on track. Some find the motivation to work with a computer difficult to maintain. Some have technical problems or rely on others’ technical knowledge to participate in the course, record work samples, and post videos online. Negative comments are sometimes related to the timing of the course. Participants want to apply their knowledge immediately and, on occasion, have ended the module sequence during a time when they are not actually working. They find it challenging to incorporate new information into their work before re-testing and prefer to complete the modules during the school year.
While participants report the content of the course to be challenging, no one has commented that the content is invalid or inapplicable either to classroom work or to the EIPA. Overall satisfaction reports for all cohorts have been extremely high. The following comments from the post-program assessments are characteristic of participant perspectives on the strengths of the program:

Quantitative data are culled from the pre- and post-surveys as well as from participants’ testing results. The survey data show that participants perceive themselves to have grown. Figure 1 is an example of the summary responses from one cohort’s pre- and post-survey for the second module, “Fingerspelling in Education.” The questions on the left and right are identical, so the positive change in responses can be seen from before (left) to after (right). Answers are rated on a scale of 1-5, with 1 representing “strongly disagree” and 5 representing “strongly agree.” In the chart, the lower part of each bar represents 5.
These are the statements participants were asked to agree or disagree with:

Figure 1:  Summary responses from a pre and post survey on the fingerspelling module
A more objective measure comes from comparing the EIPA results of participants before and after the TIPS process. While more analysis needs to be done on specific indicators of the EIPA, overall, the majority of participants (81%) improve their EIPA score after completing the modules. The average improvement is .25. For a participant holding a 3.3, a .25 improvement is enough to reach the 3.5 level required for employment in many of the states served. For a participant with an initial 3.1, however, the average improvement is not likely to carry them far enough to meet the required standard. Figure 2 represents the data from 6 cohorts containing 83 participants overall.
Figure 2:  Average EIPA improvement of TIPS participants
Within the group who successfully completed the TIPS program, 81% increased their score, 10% had the same score at a subsequent testing, and 10% had a lower score at a subsequent testing.  This is shown in Figure 3.
Figure 3:  Participant Results on Subsequent EIPA
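The summary figures above follow directly from paired pre- and post-program scores. The sketch below shows the arithmetic with made-up score pairs, not the actual TIPS dataset; the function name `summarize` and the sample values are ours.

```python
# Hypothetical (pre, post) EIPA score pairs for five participants.
pairs = [(3.2, 3.5), (3.1, 3.4), (3.3, 3.3), (3.0, 2.9), (3.2, 3.6)]

def summarize(pairs):
    """Percent who improved, stayed the same, or declined, plus mean change."""
    n = len(pairs)
    diffs = [post - pre for pre, post in pairs]
    return {
        "improved_pct": 100 * sum(d > 0 for d in diffs) / n,
        "same_pct": 100 * sum(d == 0 for d in diffs) / n,
        "declined_pct": 100 * sum(d < 0 for d in diffs) / n,
        "avg_change": sum(diffs) / n,
    }

print(summarize(pairs))
```

Applied to real cohort data, this is the kind of computation that yields the 81% improvement rate and the average gain of .25 reported above.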
Implications for the Field
The TIPS program utilizes innovative pedagogical approaches to deliver material that is grounded in research and data from our field. It has seen some successes. Certainly, participants have reported that it has been a significant contribution to their professional development. Yet in assessing how TIPS has improved participants’ scores on the EIPA, the efforts have resulted in marginal gains, averaging .25.
In some ways, this can be a cautionary tale for our field. We can design what we see as really impressive programs, gather the expertise of our profession, and still end up with small gains in actual results. Of particular importance, we should be wary of assuming that because participants enjoyed a program and report learning a great deal, that learning necessarily translates into a significant change in practice.
This is not to discount the efforts of the TIPS program or others like it to improve the skill levels of interpreters. We state this so that as a profession we can see more clearly the challenges of producing enough qualified interpreters to meet the needs of an educational system that is so often choosing the option of placing deaf, deaf-blind or hard of hearing students in classrooms with hearing peers.
About the Authors
Doug Bowen-Bailey is an interpreter, mentor, and resource developer who lives in Duluth, MN.  His passion is using technology to support interpreters in reaching their professional development goals.  Patty Gordon (MLS, RID CI/CT, NIC-A, EIPA 4.7) is an interpreter, interpreter educator, consultant, translator, and mentor in private practice since 1988.  Her current interest is in blended instruction and translation of ASL media.  Dr. Bernhardt Jones has been Director of Training and Assessment Systems for K-12 Educational Interpreters (TASK12) since 2002.  In 2009, he created the Training of Interpreters in Public Schools Project (TIPS) and continues to direct these two projects within the Technical Assistance for Excellence in Special Education Center at Utah State University.  Laurie Shaffer (MS, CI & CT, NIC-A, EIPA 4.5) has held national certification for 22 years and has taught about various aspects of the profession in four-year degree programs, continuing education seminars, and professional workshops.  Currently she is a staff interpreter at the University of Virginia and Director of the “From a Distance…” tutoring and mentoring program for educational interpreters.
Special thanks to Cheryl Sheffield and LeeAnn Lundgreen for their support in administering the TIPS program, and to Amy Williamson for her part as one of the facilitators.
Hiltz, S. R. (1998). Collaborative learning in asynchronous learning networks: Building learning communities. WebNet 98 World Conference of the WWW, Internet and Intranet Proceedings.