Are for-profit companies incompatible with public education? Some opt-out advocates think so. Critics of testing argue that we are sending millions of dollars into the hands of 'Big Testing' companies, and that those companies, motivated by profit, shouldn't be trusted with our students' futures. (Examples of this thinking here, here, and here.)
Instead of sending taxpayer money to Pearson, CTB McGraw-Hill, and Houghton Mifflin Harcourt, we should send those funds to our local schools … to reduce class sizes or raise teacher salaries.
Moreover (the critics say), the large testing companies use their profits to lobby state and federal government officials, which only leads to more and more emphasis on testing from our Department of Education.
In my last post I argued that only the PSSAs provide benchmarked, validated, and reliable results about school performance. Perhaps I have convinced you that PSSA data is uniquely useful to schools, especially to those who run them.
But even if tests are useful, is testing worth the additional stress it places on our already stressed-out students?
In my last post, I made this argument:
if we want to accurately assess student growth and achievement, we must use a criterion-referenced and valid instrument, administered to all students in Pennsylvania, under controlled conditions. And the assessment must be linked to what we’re teaching … the PA standards.
Only the PSSA hits all of these criteria. In the attached chart, I show how I arrived at that conclusion. How do the tests that are widely used in our classrooms stack up? For those who like charts … enjoy! (Corrections welcome — leave a comment)
In my recent “Opt In” posts, I argued that PSSA data is powerful and valuable. And if we “Opt Out” of testing, then we undermine the ability of our schools to see their true level of performance, be accountable, take corrective action, and deliver an even better education to our students.
“Maybe the data is useful,” the critics may counter, “but we have too many tests in schools. And the PSSAs don’t even provide any diagnostic value back to the teacher. Test results aren’t available for 6 months, and by then it is too late to help individual students.”
Let’s unpack this critique and address it in three parts:
Last night the school board voted 8-1 to approve the Fact Finder’s report. The UCFEA approved the same report last week. As a result, the two sides have effectively agreed to the terms of a new four-year contract, covering the period July 1, 2015 to June 30, 2019. (More info on the Fact Finding process, under the PA Labor Relations Board, is here.)
(I expect more information will be released on the District’s website soon, including the full fact finder report, which, under the Byzantine laws and regulations of Pennsylvania, could not be released to the public prior to the board vote.)
I have reproduced below my full remarks delivered at the board meeting. The short version:
- The compensation provided to our teachers is affordable within the taxation limits of Act 1
- By approving this agreement, we can turn our energy toward initiatives to further improve our schools
- Although the settlement is more generous than might be necessary, it is a fair compromise
One of the problems with the ‘Opt Out’ movement is its disdain for standardized testing. Without testing, we would not have access to rich data sets that tell us how our schools are truly performing.
In previous posts we have looked at UCF test results for PSSA Math and PSSA Reading. Today we look at Keystone results for Algebra and Biology.
Although “Opt Out” has a few things right, the movement’s desire to do away with standardized testing is, in my opinion, a major mistake.
Without testing, we would not have access to rich data sets that tell us how our schools are truly performing.
With testing, we have an honest assessment of the academic strengths and weaknesses of our program. With testing, we can see achievement results (performance against a standard) and growth results (progress between two points in time). With testing, school boards have objective and benchmarked information on school performance.
In yesterday’s post we looked at PSSA Math results for UCF. Today, it is PSSA Reading.
In my most recent Opt In post, I argued that PSSA data provides powerful insight into student achievement and school performance.
Today I take a break from the theoretical case against “Opt Out” and get very practical. What data is available from standardized tests? How is that information used? What insight does that information provide to parents, administrators, teachers, and school directors?
In my next three posts, I will publish data taken from PDE’s public test score database, which can be found here. First up — how are our students and schools performing on PSSA Math?