USCIS (U.S. Citizenship and Immigration Services) is now embarking on a nationwide trial of planned changes to the naturalization test that are causing great concern among educators and organizations that serve and advocate for citizenship applicants. The current oral civics questions would be replaced by a multiple-choice exam that requires a significantly higher level of reading skill and test-taking ability. The current English-speaking evaluation, which consists of personal information and application-related questions, would be replaced by a formal test that uses photograph prompts and a scoring rubric for evaluating language proficiency.
In a previous article, I described problems with the formats of the planned civics and English test designs. Now that trial testing is moving forward, this is an appropriate time to raise concerns about the process, issues of transparency, and whether USCIS is repeating test redesign errors from two decades ago.
Learning from Past Mistakes
Back in 2001, the Legacy INS (Immigration and Naturalization Service) contracted with an assessment company to revamp the naturalization test format and content. This resulted in a contemplated civics test of 20 multiple-choice questions and an English test in which applicants would describe photographs of everyday life. Although some citizenship service organizations were invited to provide input to the contractor, the prevailing critique was that the opportunity for stakeholder input was inadequate, the resulting test design was faulty in content and format, and it would impose new barriers to naturalization.
As a result of considerable pushback, the newly formed USCIS contracted with the National Academies Board on Testing and Assessment in 2004 to evaluate the redesign process. The Board formed a committee, which was sufficiently concerned that it issued an unplanned interim report in December 2004 calling for a redesign process and advisory structure that was more “open, transparent, and accountable” and recommending suspension of further test development until a better process was in place. A June 2005 letter report from the Department of Homeland Security Office of Inspector General concurred on the main points but disagreed with the size of the advisory structure the Board recommended, citing the time and cost of complying with standards of the Federal Advisory Committee Act (FACA).
The outcome of that three-plus-year false start was a much-improved redesign roadmap starting in mid-2005 under the Office of Citizenship, which included a significant number of stakeholder meetings and discussion group sessions, an impact analysis by the American Institutes for Research, substantial input by subject matter specialists into the design of test formats and items prior to pilot testing, and ongoing internal USCIS working group meetings including representatives from Field Operations, Policy and Strategy, and General Counsel.
The participation of Field Operations staff in the stakeholder sessions was especially valuable and included officials from Chicago, Dallas, Detroit, Los Angeles, New York, Newark, Philadelphia, Sacramento, San Diego, and Washington, DC. Having frontline field personnel at the table with service providers during these meetings resulted in important reality-checks about the logistical implications of test recommendations for interview length, officer training requirements, and impact on overall application processing. The current test, which was first implemented in 2008, is the result of this very careful development process.
The test redesign currently underway arguably repeats mistakes of the current test’s first three years of development from 2001 to 2004. I believe that the redesign would not pass muster in an independent evaluation, on process grounds alone. Although USCIS has conducted public engagement sessions on the test redesign, the format of these sessions has been limiting, and there has been little transparency in the process. The Federal Register notice provided no opportunity to post formal comments in the usual manner that allows for public access and review. As a result, there is currently no window into stakeholders’ responses to this initiative other than listening to participants’ questions and comments during engagements.
In addition, the entire process of external participation that preceded the 2008 test implementation has been eliminated, including stakeholder meetings, examination of data on current testing, and interaction with Field Operations. Finally, test formats for piloting were predetermined and test items were prepared internally at USCIS rather than being designed or vetted by specialists. A Technical Advisory Group of subject matter experts has just recently begun work — late in the process, since the TAG’s task is to review trial test data for formats and items that are already in place. In this respect, the current design process is more troublesome than the failed process two decades earlier.
USCIS:
- For transparency, USCIS should make publicly available all past and future comments, views, written data, and arguments on all aspects of the trial testing submitted by mail or by email to the “natzredesign22” address designated for such submissions.
- The agency should consider the current pilot testing a formative research phase that compares both the proposed and existing civics and English test formats, considers alternatives, and analyzes the impact of a reading-based multiple-choice civics test and a picture-based speaking test on applicants at lower levels of language proficiency.
- The English speaking test should contain questions about basic personal information, as in the current evaluation, rather than questions about photographs. Applicants can produce significantly more language talking about themselves than about a photo. (USCIS offers the example “mother, daughter, cook” as an acceptable description of a kitchen scene, leaving the test format open to criticism about its content and scoring guidelines.) A test that focuses on basic personal information will prompt a much greater and more meaningful amount of English.
- USCIS should convey to the Technical Advisory Group all comments from institutions participating in the trial testing so that the subject matter experts can consider the impact of the test redesign on examinees and test administrators, and the impact of new educational resources and test-preparation materials on students and instructors.
- USCIS should be prepared to forgo implementing a reading-based multiple-choice civics test if the format has a negative impact on applicants’ pass rates, or if stakeholders report that potential applicants will refrain from applying due to the higher level of reading skill required.
- If a reading-based multiple-choice civics test is implemented, the existing oral test should be retained as an optional accommodation for any applicant who requests it.
Technical Advisory Group (TAG):
- Subject matter experts serving on the TAG should fulfill the group’s complete scope of work — to review the content, structure, and design methodology for the redesigned speaking test and civics test. This should not be interpreted narrowly as simply providing technical assistance on the content of test items in formats that have been preordained by USCIS. The formats themselves need to be evaluated in terms of their design frameworks, appropriateness for the applicant population, and logistical requirements for test administration during the interview.
- The TAG should consider the test redesign an iterative process guided by formative research principles. In its initial phase of work, the TAG should advise whether the test formats determined internally by USCIS need to be revised or replaced prior to the commencement of pilot testing and a specific focus on test item content. It should follow the example cited above of the National Academies Board on Testing and Assessment, which in 2004 expressed concern about the direction of a previous USCIS test redesign process by issuing an unscheduled report soon after its committee commenced work. If the TAG indicates similar concerns about the test design frameworks or other issues, the contractor should provide an interim report to convey this feedback and indicate the need for changes or improvements before pilot testing proceeds.
- The TAG should require that pilot testing include a comparison of administration of the existing oral civics exam and the proposed reading-required multiple-choice exam with attention to passing rates of examinees at different levels of English proficiency, especially applicants at National Reporting System (NRS) Levels 2 and 3. Many applicants at these levels are able to pass an oral exam but do not have sufficient reading skill or previous test-taking experience to pass a multiple-choice exam.
- The comparison of oral and multiple-choice civics test formats should also include the amount of time required for test administration, as any increase in the time requirement will impact interview length.
- The TAG should consider the initial personal information locator questions of the BEST Plus Test as a prototype for an alternative to a photograph-based English speaking exam. Such questions could standardize key interactions during the interview as the officer asks about and verifies the applicant’s personal information relating to naturalization. Officers can be provided with a list of questions and a simple scoring rubric for asking about or verifying the applicant’s name, address, date of birth, country of birth, country of citizenship or nationality, contact information, family members (parents, children, current spouse), marital status, current employment, and current residence. (This aligns with recommendations to USCIS by the Citizenship Test Working Group on September 17, 2021.)
- The TAG should review feedback of trial testing participants as it considers and reports on the impact of the test redesign and the effectiveness of new educational resources and materials.
Pilot testing institutional participants:
- Community-based organizations and other institutions that participate in the trial testing should monitor the impact of the new test formats on their students and programs.
- Students from a wide range of proficiency levels need to participate, especially students at NRS Levels 2 and 3 who will face the greatest challenges with the new tests.
- Participating institutions should keep a journal and other documentation of their experience during trial testing. If the participation agreement allows, they should communicate with each other and share information — in addition to whatever communication among institutions, if any, is facilitated by USCIS.
- The role of these institutions is critical in assuring that participation in the trial is not skewed toward students at higher-than-average levels of proficiency, previous education, family resources, or other factors.
Citizenship and immigration advocacy organizations:
- National and local organizations should keep informed about developments in the trial testing and especially maintain ongoing communication with any of their constituent program sites that are participating.
- In the likely event that USCIS is unwilling to publicly disseminate the comments, views, written data, and arguments on all aspects of the trial testing that have been submitted by mail or by email, one or more advocacy organizations should consider filing a records request for this information and other departmental communication related to the trial testing under the terms of the Freedom of Information Act (FOIA).
A Long-Term View
Both contemplated test innovations, for civics and for English, would be more appropriate if these assessments were separate from the interview and optional. Their greatest value would lie in reducing officer tasks and interview time, and the most cost-effective way to accomplish this would be via technology. The current trial period could be an excellent Phase 1 discovery process that leads to eventual implementation of a Phase 2 technology solution, rather than burdening the interview with two new test formats that will be time- and task-intensive. This trial period would not be wasted effort if it doesn’t result in immediate implementation but instead serves as the starting point for longer-term development of a testing solution that genuinely improves this part of the naturalization process.
Bill Bliss has worked in English language and civics education since 1974 as a teacher, trainer, curriculum developer, and consultant, and has provided technical assistance on citizenship training and testing since the 1980s legalization program. He is the author of two civics courses: Voices of Freedom, for learners preparing for the naturalization process; and Basic Civics for U.S. Citizenship, a review text for secondary school students in states that require a civics exam for high school graduation. The views expressed in this article are solely those of the author.