About this SRCD poster session
| Panel information |
|---|
| Panel 18. School Readiness/Childcare |
Abstract
Early mathematics skills are a key predictor of later achievement and therefore an important component of school readiness (Duncan et al., 2007). In this study, we describe the expansion and validation of the Early Mathematics Assessment System (EMAS; Ginsburg & Pappas, 2016) as a tool for understanding kindergarten students’ mathematics readiness as part of the [blinded] statewide comprehensive school readiness assessment system.
The EMAS was selected for expansion and adaptation as part of [blinded] because it: a) demonstrated prior evidence of validity; b) aligned with the [blinded] Standards of Learning; c) was feasible for teachers to administer accurately using a flip book and manipulatives within a reasonable time frame; and d) offered data to teachers to guide instruction. Additionally, Dr. Ginsburg was willing to allow [blinded] to use, adapt, and expand the assessment.
To create a vertically aligned assessment that measured growth from fall to spring, we developed approximately 200 additional items capturing the sub-domains of numeracy, computation, geometry, and patterning. We consulted with early childhood mathematics experts and cross-walked each item with Clements and Sarama’s (2009/2021) learning trajectories and the [blinded] Kindergarten Mathematics Standards of Learning. We then pilot-tested each new item with approximately 300 children, ranging in age from 4 to 7 years old.
Based on an analysis of each item, including Differential Item Functioning (DIF) analyses to ensure measurement invariance across groups, we constructed the fall (35 items; see Table 1 for item descriptions, skills, and SOL/trajectory alignment) and spring (34 items) kindergarten EMAS forms. We selected items to represent a range of skills across the four mathematics sub-domains and to target an appropriate average level of difficulty. The EMAS scores were converted into scaled scores so that teachers and schools can track students’ mathematics growth over time.
The 2021–2022 school year was the first in which the EMAS was administered statewide in both fall and spring (see Table 2 for descriptive statistics). Scores were normally distributed in the fall, as was fall-to-spring growth, with strong internal consistency (α = .92 fall, α = .91 spring). Although third-grade standardized tests are not yet available for this cohort, we conducted predictive validity analyses with a smaller cohort who took the EMAS in 2019 and had third-grade [blinded] Standards of Learning (SOL) assessment scores available (n = 68,239). Using regression analyses with cluster-robust standard errors, and controlling for children’s demographic characteristics (see Table 2), we found that EMAS scores significantly predicted both mathematics (β = .319, p < .001) and reading (β = .241, p < .001) SOL scores. In sum, results suggest the revised and expanded EMAS can be administered at scale, shows strong reliability and promising predictive validity, and measures growth over time. Given the importance of early mathematics skills, it is critical that psychometrically sound and feasible measures of mathematics skills be used at kindergarten entry to help teachers support children’s mathematics growth trajectories.
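The predictive-validity analysis described above (OLS regression with standard errors clustered, presumably at the school or classroom level) can be sketched as follows. All data here are simulated, and the variable names (`emas`, `sol_math`, `school`) are illustrative assumptions, not the study's actual dataset or covariate set.

```python
# Hedged sketch: regression with cluster-robust standard errors, the
# general technique named in the abstract. Simulated data; the clustering
# unit (school) and coefficient are assumptions for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, per_school = 50, 40
school = np.repeat(np.arange(n_schools), per_school)
school_effect = rng.normal(0, 0.5, n_schools)[school]  # shared within-school noise
emas = rng.normal(size=n_schools * per_school)
sol_math = 0.32 * emas + school_effect + rng.normal(size=n_schools * per_school)
df = pd.DataFrame({"sol_math": sol_math, "emas": emas, "school": school})

# Cluster-robust (CR) standard errors account for the non-independence of
# students within the same school.
fit = smf.ols("sol_math ~ emas", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]}
)
print(round(fit.params["emas"], 3), round(fit.bse["emas"], 4))
```

The full analysis would additionally include the demographic covariates mentioned in the text and report standardized coefficients (β).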
Author information
| Author | Role |
|---|---|
| Jessica E. Whittaker, Ph.D., University of Virginia | Presenting author |
| Virginia Vitiello, University of Virginia | Non-presenting author |
| Traci Kutaka, University of Virginia | Non-presenting author |
| Jamie DeCoster, University of Virginia | Non-presenting author |
| Amanda Williford, University of Virginia | Non-presenting author |
Understanding Children’s Mathematics Readiness at School Entry: Developing and Validating a Statewide Kindergarten Mathematics Assessment
Submission Type
Individual Poster Presentation
Description
| Session Title | Poster Session 10 |
|---|---|
| Poster # | 70 |