Past Clients

GOVERNMENT

Defense Language Institute Foreign Language Center

Development of Very Low Range Items for the Defense Language Proficiency Test 5

In September 2010, SLTI was awarded a contract by the Defense Language Institute Foreign Language Center (DLIFLC) to develop Very Low Range (VLR) items for the Defense Language Proficiency Test 5 (DLPT5) in three less commonly taught languages: Baluchi, Cebuano, and Chavacano. Over 1,200 items and associated reading and listening passages at ILR levels 0+, 1, and 1+ were ultimately delivered to the client. The project concluded in the summer of 2012.

Development of Constructed-Response Items for the DLPT5

In September 2010, SLTI was awarded a contract by the DLIFLC to develop constructed-response test items for the DLPT5 in eleven languages: Albanian, Amharic, Armenian, Greek, Haitian Creole, Hindi, Kurmanji, Norwegian, Sorani, Urdu, and Yemeni Arabic. Over 1,000 items and associated reading and listening passages at ILR levels 1 through 3 (inclusive of the plus-levels) will ultimately be delivered to the client. The project will conclude in the fall of 2012.

Development of Multiple-Choice Items for the DLPT5

In 2010, SLTI was contracted by Avant Assessment to serve as a subcontractor in the development of multiple-choice reading and listening test items for the DLPT5 in Spanish, Brazilian Portuguese, and European Portuguese. Over 1,600 items and associated reading and listening passages at ILR levels 1 through 3 (inclusive of the plus-levels) will ultimately be delivered to the client. The project will conclude in the fall of 2012.

Review of DLPT5 Test Items

In 2007, SLTI was contracted by the DLIFLC to carry out an evaluation of the DLPT5 in Arabic. The primary goals of the item review were to ensure that the items were appropriate for texts at the intended ILR skill levels, to assess whether the English test items accurately reflected the texts, and to judge the quality and appropriateness of the key and the distractors. Following our evaluation, the DLIFLC made immediate changes to the test and then contracted SLTI to evaluate tests in a number of different languages, including Egyptian Arabic, Modern Standard Arabic, Azeri, Cebuano, Chavacano, French, Hebrew, Japanese, Khmer, Korean, Kurdish-Kurmanji, Kurdish-Sorani, Pashto, Persian, Russian, Spanish, Tausug, Turkish, and Mandarin Chinese. The project continued through 2009.

Development and Modification of Translation and Interpretation Tests

In September 2009, SLTI was awarded a subcontract by CyraCom to develop and modify Translation Performance Tests and Interpretation Performance Tests for the DLIFLC. Tests were developed in three language combinations: Dari-English, Pashto-English, and Persian-English. Based on the ILR Skill Level Descriptors for Translation and Interpretation Performance and an extensive needs analysis, the tests were designed to identify the performance level of U.S. military field linguists. A total of 12 tests were developed and delivered in the summer of 2010.

In September 2010, SLTI was awarded a contract by the DLIFLC to develop and modify Translation Performance Tests and Interpretation Performance Tests in eight languages: Dari, Pashto, Farsi, Arabic, Spanish, Chinese, Korean, and French. The tests, based on the ILR Skill Level Descriptors for Translation and Interpretation Performance and on the tests developed by SLTI in 2009-2010, were designed to identify the performance level of U.S. military field linguists. A total of 32 tests were delivered in the fall of 2011.

Development of the Defense Language Aptitude Battery (DLAB) and Pre-DLAB

Under contract to the Center for Advanced Study of Language (CASL), SLTI developed four new parallel forms of the DLAB. Following the successful completion of this project, SLTI was contracted to develop a screening test for language aptitude, called the Pre-DLAB. After the successful development and validation of the first form, three additional forms were developed, and all four forms of the Pre-DLAB were equated.

Federal Bureau of Investigation

FBI Translation Skills Assessments

From 1994 to 2009, SLTI developed tests of translation ability for the FBI. Listening summary translation exams were developed in Spanish, Minnan, Persian, and five dialects of Arabic; SLTI also created document translation exams in Spanish, Persian, and Arabic. The tests are used to assess the competency of individuals applying for full-time positions as monitors, translators, language specialists, and contract linguists within the agency. As part of this project, SLTI also created a self-instructional rater training kit that allows non-specialist FBI personnel to train themselves to score the tests at any time and in any location. Eliminating the need for an expert trainer not only saves the FBI time and money; it also makes rater training far more feasible by instituting a practical training system and allowing periodic refresher training.

Administrative Office of the US Courts

Federal Court Interpreter Certification Exam

From 1995 to 1996, SLTI was contracted by the Administrative Office of the US Courts to evaluate the Federal Court Interpreter Certification Exam (FCICE) for Spanish. After completing its evaluation, SLTI was contracted to develop an eight-year plan to improve the test program. In 2001, under contract to the National Center for State Courts, SLTI assumed responsibility for developing the written exam, which tests translation and interpretation skills in English and Spanish. SLTI developed new test specifications to make the exam more job-relevant by simulating the types of tasks court interpreters actually perform in the courtroom. SLTI then developed both the Spanish and English sections of the written exam, using federally certified court interpreters to write the items. Since the initial round of item development, SLTI has continued to provide currency reviews for existing items and to develop and field-test additional parallel forms of the test, with item writing occurring most recently in 2012.

National Language Service Corps

Marshallese Reading Proficiency Interview

Between December 2008 and March 2009, SLTI developed the Reading Proficiency Interview (RPI) for the National Language Service Corps. The RPI is a new testing method intended to address the need for rapid test development in small-population languages. It measures how well a candidate reads in a language by asking a series of reading comprehension questions about two reading passages selected from a passage bank, and it reports the candidate's reading proficiency on the Interagency Language Roundtable (ILR) scale. The key benefits of the RPI are that test development and tester training can be completed in approximately six weeks and that the test can be administered over the telephone or face-to-face. As a proof of concept, SLTI developed an RPI in Marshallese.

Foreign Service Institute

Evaluation of FSI Language Proficiency Test in Reading

From 2008 to 2009, SLTI was contracted to review materials and procedures relating to the Foreign Service Institute (FSI) Reading Test Program, including the Testing Manual and Reading Kits in French, Hindi, Nepalese, and Turkish. SLTI staff observed a number of Reading Tests, evaluated the instructions and materials given to examinees, and evaluated the Reading Tests themselves, particularly with regard to their validity and reliability. In its final written report, SLTI discussed the relevance of the reading texts, some of them quite dated, to 2008 audiences; the quality of Examiner/Tester training; the efficiency of test administration; and the overall validity of the Reading Test as a measure of the ILR skill level description for Reading.

Development of a Passage Rating Methodology

In the fall of 2010, SLTI was contracted by the FSI to develop a methodology that FSI employees could use to select passages for their Reading Proficiency Tests and rate them on the ILR scale. SLTI incorporated traditional notions of passage rating, such as text mode and cognitive level, into the methodology and expanded on the role of language features in analyzing the level of texts, focusing on linguistic complexity and linguistic markedness as key contributors to passage level.

SCHOOLS

National Assessment of Educational Progress (NAEP)

Since the fall of 2006, SLTI has provided item review services for NAEP assessments. SLTI's principal role is to review items in core subjects (math, science, social studies, English language arts) for linguistic accessibility and to ensure that the tests are equally accessible to non-native English speakers. SLTI also conducts translation verification studies for the NAEP assessments that are translated into Spanish and has translated numerous NAEP documents, including three reports on the performance of students in Puerto Rico on the NAEP mathematics assessments. From 2006 to 2009, SLTI performed NAEP work under contract to the NAEP Education Statistics Service Institute (NAEP ESSI); since 2009, SLTI has performed item review and translation services for NAEP under contract to Educational Testing Service (ETS).

New Mexico Standards Based Assessments

Since 2009, SLTI has developed versions of the Spanish Language Arts Reading and Writing assessments for the New Mexico Standards Based Assessments (NMSBA). These tests are developed separately but are parallel to the English versions of the Language Arts and Writing assessments. SLTI performs this work under contract to Measured Progress, which holds the prime contract for the entire NMSBA system. Working in conjunction with Measured Progress, SLTI meets with New Mexico teachers during item, bias, and data review meetings and responds to requests from the New Mexico Public Education Department.

Massachusetts English Proficiency Assessment

In 2003-2004, SLTI worked with Measured Progress to develop the Massachusetts English Proficiency Assessment (MEPA), a test based on the Massachusetts English Language Proficiency Standards. In addition to writing items, we reviewed items written by Measured Progress staff and attended item review meetings with Massachusetts ESL teachers. We also developed score reports and parent guides in 10 languages for this test.

From 2009 to 2011, SLTI participated in the expert review of the MEPA test items. The focus of the review was to ensure that the items were accessible to and appropriate for the target ELL population and that the items were sound with respect to standard test development practices. From 2009 to 2010, the project was funded through a subcontract from WestEd; in 2011, it was funded through a subcontract with Measured Progress.

CORPORATIONS

The California Endowment

Medical Interpreter Tests

Beginning in 2006, SLTI was contracted to design, implement, and coordinate the pilot testing of a series of Language Proficiency Tests and Interpreter Readiness Tests in Spanish, Cantonese, and Hmong for medical interpreters in California. The Spanish tests were originally developed for a paper-and-pencil format. SLTI was further contracted to evaluate these tests and make recommendations regarding their operationalization. SLTI has since developed additional forms of the Spanish tests for a web-based delivery platform, including all audio and video recordings. The web-based versions are currently being beta tested.

Pearson Language Tests

Pearson Test of English – Academic

In 2007, Pearson Language Tests (PLT) contracted SLTI to develop over 900 original ESL test items for the incipient Pearson Test of English – Academic (PTE Academic). Listening comprehension, reading comprehension, speaking, and writing items were developed based on the Common European Framework of Reference (CEFR). In 2008 and 2009, SLTI item writers developed more than 500 additional items each year, and a further 500 items were commissioned in early 2010. SLTI continues to write items for the PTE Academic, as well as to review items written in the UK and Australia.

Based upon our extensive experience with the PTE Academic, in 2008 SLTI was contracted to write a major portion of the Official Guide to the Pearson Test of English Academic. The guide was published in 2009 by Pearson Longman with attribution given to SLTI.

Pearson Test of English – General (formerly London Test of English)

In 2008, SLTI was contracted by PLT to write items for a new version of the Pearson Test of English – General (formerly the London Test of English). Listening comprehension, reading comprehension, speaking, and writing items were developed based on the Common European Framework of Reference (CEFR). Since the initial contract, SLTI has written over 800 items for the PTE – General. In addition to creating items, in 2010 PLT contracted SLTI to provide substantive revisions and editing to the PTE – General item specifications at all six proficiency levels. SLTI continues to write items for the test under contract with Pearson. We also review items written by Pearson staff, giving them feedback that they then use for item revision.

CTB/McGraw-Hill

LAS Links Español

Under contract to CTB/McGraw-Hill in 2005, SLTI adapted the English versions of the LAS Links into Spanish (subsequently named the LAS Links Español). LAS Links is the revised version of the Language Assessment Scales, a major English language proficiency test that has been used in schools since 1975. SLTI also assisted CTB in its successful application to the Texas Education Agency for approval to use the LAS Links Español in Texas. In 2010-2011, SLTI was contracted to develop additional items for the LAS Links Español, as well as to perform expert review of items for both the LAS Links and the LAS Links Español.