Student achievement is significantly influenced by non-school factors. While demography is not entirely destiny, it is well known that academic achievement is correlated with factors like parental income, family structure, and the level of education parents have attained. The 1966 Coleman Report is credited with this discovery.
So let’s take this insight and see how it can help us understand student achievement and school performance.
Let’s compare two area schools, Pocopson Elementary and Greenwood Elementary. Pocopson is the largest elementary school in UCFSD; Greenwood is just across Route 1 from Longwood Gardens, and is in the Kennett Consolidated School District. On a map, the catchment area for Greenwood runs up against UCFSD boundaries.
Take a look below at the academic achievement stats for both schools in 2013-14. Question: which school is doing a better job of educating its students?
Math – % proficient: PES 96%, GES 80%
Reading – % proficient: PES 94%, GES 77%
Science – % proficient: PES 98%, GES 84%
Writing – % proficient: PES 90%, GES 83%
Is your answer PES? At first glance, that seems like the obvious answer, as more students at PES are reaching higher levels of academic achievement. But it may be a mistake to attribute that success to the school. As I wrote in my previous post, non-school factors like parental education and income play a significant role in student outcomes.
So how do these two schools compare on those non-school factors? Unfortunately we don't have data on family income, parental background, or other well-studied factors. But we do have profile information on both schools that can stand in for these metrics. From each school's School Performance Profile, we find the following:
% Economically Disadvantaged – PES: 3%, GES: 40%
% Gifted – PES: 7%, GES: 1%
% English Language Learner – PES: 1%, GES: 14%
Although imperfect as proxies, these metrics track the factors we care about: % economically disadvantaged is an (inverse) indicator of community affluence; % gifted is a rough indicator of high-IQ students; and % English Language Learner (a category once called bilingual students) is another characteristic that tends to correlate with lower performance on achievement tests, especially in the elementary grades. Some of that correlation reflects not a true difference in academic ability, but the disadvantage of taking standardized tests in a second language.
Let’s zero in on economic disadvantage. PES has 3% economically disadvantaged students, while GES has 40% … a 37 percentage point gap. How might this difference impact PSSA results?
A recently published study gives us an indication. That research shows a strong relationship (r = 0.83) between the aggregate math proficiency of students and family income … note the downward-sloping cluster of data points in the chart below. The more economically disadvantaged students there are in a district, the lower the PSSA results.
Similar effects (r=0.87) can be seen on PSSA reading proficiency:
So based on the characteristics of the student body, we would expect PES to have better PSSA results than GES, right? Looking at the scatter plots and the best-fit lines, we can estimate that for math, a roughly 40-percentage-point difference in economic disadvantage equates to about 20 percentage points lower PSSA math proficiency. Reading scores show a similar relationship. So one demographic factor by itself (economic disadvantage) might explain almost all of the achievement gap between Greenwood and Pocopson. And there are several other non-school factors we have not even explored (student IQ, level of parent education, etc.).
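To make the back-of-the-envelope estimate above concrete, here is a minimal sketch. The slope is an assumption read off the best-fit line in the chart (about half a proficiency point lost per percentage point of economic disadvantage), not a published coefficient:

```python
# Hypothetical sketch: predicting the proficiency gap between two schools
# from a single demographic factor, using an assumed best-fit-line slope.

def expected_gap(slope, pct_disadvantaged_a, pct_disadvantaged_b):
    """Predicted proficiency difference (school A minus school B)."""
    return slope * (pct_disadvantaged_a - pct_disadvantaged_b)

# Assumed slope from the scatter plot: each added percentage point of
# economic disadvantage costs roughly 0.5 points of proficiency.
SLOPE = -0.5

# PES: 3% economically disadvantaged; GES: 40%
gap = expected_gap(SLOPE, 3, 40)
print(f"Predicted PES advantage in math proficiency: {gap:+.1f} points")
# -0.5 * (3 - 40) = +18.5 points, close to the observed 16-point
# math gap (96% vs. 80%)
```

That is the whole point of the exercise: demographics alone predict most of the observed gap before we credit either school for anything.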
Other than looking at more scatter plots, how can we isolate the true performance of the school and disentangle non-school effects like family income and parental education?
Statisticians and researchers have developed an answer. Their technique was first conceived for research purposes in the early 1970s, applied to school systems in the 1990s, and expanded into a mainstream evaluation tool in the 2000s. The research is now advanced, and most methodological issues have been resolved. The technique is called Value-Added Modeling (VAM). And it does a pretty good job of isolating school-specific effects on student achievement.
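The core idea behind VAM can be sketched in a few lines. This is an illustration of the concept, not the actual statistical model states use: predict each school's expected score from non-school factors, then treat the residual as the school's "value added." The coefficients and schools below are hypothetical:

```python
# Illustrative sketch of the value-added idea (not a real state VAM model):
# value added = actual score minus the score predicted from demographics.

def predicted_score(intercept, slope, pct_disadvantaged):
    """Expected proficiency given one demographic predictor."""
    return intercept + slope * pct_disadvantaged

def value_added(actual, intercept, slope, pct_disadvantaged):
    """Residual: how far a school lands above or below its expectation."""
    return actual - predicted_score(intercept, slope, pct_disadvantaged)

# Assumed fit from district-level data: score = 95 - 0.5 * disadvantage
INTERCEPT, SLOPE = 95.0, -0.5

# Two hypothetical schools with identical raw scores but very different
# student bodies get very different value-added ratings:
print(value_added(80.0, INTERCEPT, SLOPE, 40))  # expected 75.0 -> +5.0
print(value_added(80.0, INTERCEPT, SLOPE, 5))   # expected 92.5 -> -12.5
```

The same 80% proficiency rate counts as over-performance at a high-poverty school and under-performance at an affluent one, which is exactly the adjustment raw proficiency tables can't make.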
In my next post, we will take a closer look at VAM and what it can tell us about the performance of our schools.