Completed Projects
Consortium on Useful Assessment in Languages and Humanities Education (CUALHE)
Investigator: AELRC
This consortium served as a vehicle to bring foreign language education stakeholders together and create opportunities for disciplinary dialogue on student learning outcomes assessment. Project work included an annual summit on outcomes assessment in college foreign language programs, featuring round tables on best practices, paper/poster sessions, and colloquia on assessment in college foreign language education.
Workshop on Language Assessment and Understanding: Application and Use
Investigators: Meg Montee, Meg Malone
This hands-on assessment workshop covered key concepts in formative, classroom-based language assessment, with a focus on how educators can integrate assessment into daily instruction. The workshop provided educators with a foundational understanding of language assessment as well as principles and strategies for implementing it. Participants left with practical ways to use formative assessment to gauge students’ language proficiency and monitor their language development.
Short-cut estimates of foreign language proficiency for research purposes
Investigator: John M. Norris
The project culminated in the publication of a book on C-test development for eight foreign languages in US university settings, stemming from research conducted through the AELRC. Each chapter describes in detail the development process – including text identification and selection, deletion strategies, piloting with native speakers and college-level foreign language learners, and data analysis – as tailored to each language. The languages included are Arabic, Bangla, French, Japanese, Korean, Portuguese, Spanish, and Turkish.
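For readers unfamiliar with the C-test format, the sketch below illustrates the deletion principle in Python. It is a minimal, simplified illustration that assumes the commonly cited rule of deleting the second half of every second word; it does not reproduce the language-specific deletion strategies documented in the book.

```python
import re

def make_ctest_gaps(text: str, max_gaps: int = 25) -> str:
    """Apply a simplified version of the classic C-test deletion rule:
    starting with the second word, blank out the second half of every
    second word (one-letter words are skipped), up to max_gaps gaps."""
    words = text.split()
    gapped = []
    gaps = 0
    for i, word in enumerate(words):
        if i % 2 == 1 and gaps < max_gaps:
            # Separate trailing punctuation so it is preserved in the output.
            match = re.match(r"^(\w+)(\W*)$", word)
            if match and len(match.group(1)) > 1:
                stem, punct = match.groups()
                keep = len(stem) // 2  # keep the first half, blank the rest
                gapped.append(stem[:keep] + "_" * (len(stem) - keep) + punct)
                gaps += 1
                continue
        gapped.append(word)
    return " ".join(gapped)

print(make_ctest_gaps("Researchers often need quick and reliable estimates of overall proficiency."))
# Researchers of___ need qu___ and reli____ estimates o_ overall profi______.
```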
Developing a Chinese C-test for Research Purposes
Investigators: Yiran Xu, Todd McKay, Meg Malone
This project developed and validated a short-cut proficiency measure of Mandarin. Domain experts, native speakers of Mandarin, and learners of Mandarin contributed to the development stages for feedback and quality control. Think-aloud data were collected to examine how native speakers and learners processed the test items. C-test scores were then correlated with learners’ performance on the OPIc and the ACTFL reading test. Five optimal texts with high reliability, good separation indices, and acceptable fit statistics were selected for use in future L2 research.
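As a rough illustration of the kind of concurrent-validity check mentioned above, the sketch below correlates hypothetical C-test totals with external proficiency scores. The data, score scales, and variable names are invented for demonstration only and do not reproduce the project’s actual dataset or analyses.

```python
import numpy as np
from scipy import stats

# Hypothetical scores for ten learners (not the project's data).
ctest_total = np.array([12, 18, 25, 31, 40, 44, 52, 57, 63, 70])    # C-test total score
opic_level = np.array([2, 2, 3, 4, 4, 5, 6, 6, 7, 8])               # OPIc rating, coded numerically
reading_score = np.array([310, 325, 360, 390, 420, 445, 470, 480, 510, 530])

# Spearman correlation for the ordinal OPIc ratings,
# Pearson correlation for the continuous reading scores.
rho_opic, p_opic = stats.spearmanr(ctest_total, opic_level)
r_read, p_read = stats.pearsonr(ctest_total, reading_score)

print(f"C-test vs. OPIc:    rho = {rho_opic:.2f} (p = {p_opic:.3f})")
print(f"C-test vs. reading: r   = {r_read:.2f} (p = {p_read:.3f})")
```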
Student learning outcomes assessment in community/junior college language education: Trends in capacity and use
Investigators: John McE. Davis, Young A Son
The purpose of the project was to learn more about the patterns of program assessment/evaluation activity and levels of support in US community college foreign language programs. Data were collected via a national questionnaire in order to better understand how program assessment can be a useful activity that impacts teaching and learning in productive ways. Results were disseminated at conferences of relevant educators, such as CUALHE and ACTFL.
Review of evaluation and assessment in heritage language learning
Investigators: Young A Son, Amy Kim
The purpose of this project was to offer a general overview of research and examples of practice on implementing assessment and evaluation of heritage language learners and heritage language education programs. It also aimed to highlight persistent issues in the heritage language learning literature.
Evaluation planning and capacity building in self-access language learning labs/centers
Investigators: John McE. Davis, Todd McKay
The project sought to understand and develop approaches to evaluation capacity building, logic model development, and situational analysis for college language laboratory/self-access centers. A key purpose of the study was to identify best educational practices for university and college language labs.
Examining the Differential Difficulty of Languages
Investigators: Meg Malone, Yiran Xu, Vashti Lee, Charlene Polio (MSU)
To date, claims of differential difficulty have neither been substantiated through empirical research nor fully supported in the SLA literature. This pilot study represents an initial attempt to investigate these claims via both quantitative and qualitative methods, examining the progress of complete beginners in a summer university-level immersion program that included a pledge to use the target language (Chinese, Russian, or Spanish) exclusively. Participants took standardized, large-scale listening, reading, and oral proficiency interview (OPI) tests after six to seven weeks of study. Interviews with instructors and directors, as well as classroom observations, were conducted to determine the comparability of instruction across languages and how perceptions of language difficulty influenced instruction. Results show that the principal differences emerge in reading scores across languages, while speaking results are slightly higher in Spanish than in the other languages. Dr. Charlene Polio, a colleague of the AELRC, gave an invited talk on this project at Cornell University in April 2019.