An Open Letter from Superintendent Jon Sills: State Assessments and PARCC Exams

November 12, 2015

Dear Faculty and Families,

I am writing to provide you with important information regarding the latest round of state assessments, the PARCC exams.  Specifically, I want to:

  • share my perspective on standardized assessments
  • explain the district’s decision to choose the PARCC assessments last year
  • share the district’s latest results and relevant testing conditions information

Standardized Assessments

To address the need for a new generation of assessments that will measure students’ learning of the more advanced comprehension, problem solving and analytical skills that are embodied in the Common Core curriculum (which Massachusetts has integrated into its curriculum frameworks), most states have joined one of two assessment consortia (Smarter Balanced or PARCC).

The Massachusetts Board of Education voted to pilot the PARCC assessments and gave districts a choice last year to give either MCAS or PARCC, and for the latter, either the paper-and-pencil or the computer version.  All districts had to give 10th graders the high-stakes MCAS test.

If one believes, as I do, that the primary purposes of standardized assessments should be, first, to help ensure equity of educational opportunity and, second, to provide teachers with useful, timely information about student achievement that can lead to improved instruction, then it follows that annual testing in grades three through eleven is excessive, and that students’ performance data should be available long before the six-month wait that presently prevails.  It follows as well that during a pilot period for a new computer-based test like PARCC, which was riddled with administration issues and technological glitches, publishing the results can only lead to confusion and disillusionment.

Unfortunately, many advocates of standardized tests believe that their primary purpose should instead be teacher accountability.  They are therefore trying to tie teacher evaluation to students’ test scores, are less concerned about over-testing, and are determined to publish results even when the results serve no useful purpose.  In our (Massachusetts’) case, the Department of Elementary and Secondary Education (DESE) has chosen neither to disaggregate the results of last spring’s PARCC Performance Based Assessment and PARCC Summative Assessment nor to provide districts with item analyses, so these results have no appreciable value for districts, teachers, parents or students.

I worked with other superintendents last spring to draft the attached position paper on standardized assessment, which we have shared with the Massachusetts Board of Education in anticipation of its November 17th decision on PARCC vs. MCAS 2.0, and which I will speak to at the Board’s public hearing on November 16.  The position paper, which embodies some compromises in order to garner statewide superintendent support, nevertheless is unequivocal about the purpose of testing and the conditions under which new assessments should be rolled out.  Personally, I would go further to advocate for significantly less frequent and time-consuming testing and for choosing to develop a new generation of MCAS so that we can have more control over the test than committing to a multi-state consortium would allow.

So Then Why Go With PARCC Last Spring?

While not convinced that PARCC is the best choice for a next generation assessment, I strongly support the creation of assessments that will measure students’ higher order thinking skills and their ability to apply what they have learned.  A year ago, it appeared that Massachusetts would end up choosing PARCC (Commissioner Chester is the president of the PARCC Board), and so after considerable debate we decided to go with PARCC for several reasons:

  • Our students would have an opportunity to become familiar with the new test before it became high stakes
  • The district would become more familiar with its administration, particularly the computer-based component, before it became high stakes
  • We should “put our money where our mouth is” by choosing a test that assesses the deeper level of comprehension, problem solving and analysis that we believe in

While we agreed that sticking with MCAS was also a credible path, for the reasons listed above we chose to go with PARCC: paper and pencil at Lane and computer-based at JGMS.

A year later, the political landscape has changed dramatically: Secretary of Education Peyser is pushing for MCAS instead of PARCC and many states have pulled away; even Commissioner Chester has begun to talk publicly about a third alternative (MCAS 2.0).  My guess is that the Massachusetts Board of Education will choose to develop a new generation MCAS test instead of choosing PARCC but that it will remain in the PARCC consortium and “borrow” from the PARCC assessment.

Bedford’s Performance

As previously mentioned, the DESE is releasing very little useful information about the spring tests: certainly not enough for us to make use of relative to underperforming student populations, individual students’ strengths and weaknesses, or adjustments to curriculum and instruction.  We certainly did not anticipate that the DESE would provide such scant information.  We had hoped to receive the kind of information that has previously followed MCAS tests, so that we could learn how well our students were mastering the higher order thinking skills that the PARCC test purported to measure.

According to the DESE press release, Commonwealth-wide, fewer students taking the PARCC exams scored in the Meeting Expectations category than their MCAS-taking counterparts (except in fourth grade).  The Job Lane School was one of only 76 schools out of 777 statewide whose students scored 80% or above in the Meeting Expectations category.  Unfortunately, because of the limited information the state has provided, all that we can tell from the data is how our various grades and subgroups did relative to the state averages.  We can see that grades 3-5 performed better than the state average on the ELA (English Language Arts) and math tests, and that grades 6-8 performed better than the state average on the math tests as well.  On ELA, our 6-8 grade students did not fare as well, despite performing consistently high on ELA MCAS tests over the past ten years.  Test results are attached.

As it turns out, however, at JGMS, where the students took the computer-based tests, the ELA PARCC tests were given first, and all of the major computer problems occurred during their administration.  Pearson, Inc. (the PARCC test creator) failed to incorporate a “save” feature in its long composition questions, so when students were kicked off the internet, which happened all too frequently, they lost all of their work.  I am sorry to say that we actually had students in tears over this.  Pearson had also failed to inform districts that multi-channel wireless relays had to be reconfigured in order to work well with its servers.

Hats off to our technology department for working through all of the problems despite their difficulty getting timely customer service responses from Pearson, and to our JGMS administrators, teachers and students for their persistence and patience.

A snapshot of Bedford’s performance follows.  The full reports may be accessed on our webpage tomorrow.  Please review the M.A.S.S. superintendents’ association position paper.  Feel free to contact your building principals or my office with any questions.

Sincerely,

Jon

PARCC-Table-2015

On a scale of 1-5, only students who received a 4 or 5 are considered to be meeting the grade-level expectations set by the consortium of PARCC states.

Jon Sills
Superintendent, Bedford Public Schools
781-275-7588
[email protected]
