
Friday, January 23, 2015

The One Thing I Regret
Amy Wolpin, Ed.D.

I retired three years ago after 36 years of elementary public school teaching in Massachusetts, with an Ed.D. in mathematics and science education. I am proud of my many accomplishments and feel fortunate to have touched the lives of so many students and their families, as well as to have worked in collaboration with dedicated staff. But there is one thing I regret: I was led astray by high-stakes standardized testing.
After 11 years as an elementary special education teacher, I chose to transfer to a fourth grade classroom teaching position. Massachusetts began its state standards-based assessment initiative (MCAS) in 1998, and my fourth graders in that first year were tested in English/Language Arts, Mathematics, and Science/Technology, plus two writing tests. In total, my students took at least 18 hours of testing that first year - all in one month. The following year a History/Social Science test was added. Right from the beginning, I saw the problems with the tests and how they were affecting my students. Special education students were frustrated and defeated. Other students exhibited stress through physical ailments like tension headaches, nausea, and even vomiting. No one was learning during or after the test period.
Our state tests are untimed, meaning a student can be given up to an entire school day to complete a test session. Most students needed twice the time suggested in the test administration manuals, but some students needed all day. Unless other staff were available to proctor, the students who had finished had to sit silently at their seats and read so the other students could complete their tests. Little or no instruction could therefore take place on the afternoon of each test session.
The Massachusetts Department of Elementary and Secondary Education (DESE) realized that all these subject tests were too much for one grade, and in subsequent years fourth grade had just English/Language Arts, Mathematics, and one writing test; Science/Technology and History/Social Studies were moved to fifth grade. Eventually, Massachusetts gave yearly standards-based tests from grade 3 through grade 10. English/Language Arts and writing tests are given in March, and other subjects are given in May. More tests mean less teaching time.
I decided to become as informed as I could about the assessments so I could help my students do their best. First I studied our state curriculum frameworks for each subject. I worked on the state MCAS assessment development committee for mathematics, and I was appointed to the blue ribbon panel that selected the cut scores for the fourth grade mathematics MCAS. At the district level, I helped develop learning standards aligned with the state curriculum standards, and I provided professional development for teachers about the MCAS. I also studied specific test-taking strategies for each type of test item; the MCAS for mathematics, for example, has multiple choice, short answer, and open response questions.
Until 2008, the Massachusetts DESE released all the test items used to score the student tests. Teachers and parents could see the actual test items as well as their students' results. At the district and building levels, teachers were able to examine how their students performed by test item, content area, and item type. When I became a district elementary mathematics coordinator, I took the analysis and dissemination of the yearly MCAS scores seriously. I trained teachers in test preparation strategies. I modeled lessons for teachers with their students. I kept refining the test-taking techniques that I thought would help students "beat" the test. I planned my lessons and unit tests so they would mimic test sessions. I was sold. I had, regrettably, become a high-stakes testing teacher-leader. I had lost my way.
A number of factors eventually changed my mind about high-stakes testing.
  1. School scores are publicly posted in the newspapers and online; one newspaper even ranks schools based on the scores. Schools are rated on their students' test scores, yet many schools' scores remain flat over time and schools have increasing difficulty meeting Adequate Yearly Progress (AYP) targets.
  2. In 2009 the Massachusetts DESE started releasing only about half of the test items, leaving district and school score analyses half in the dark. We could no longer see all the problems that challenged our students, so the data no longer fully informed our instruction.
  3. Our state rates individual student scores at four levels (Advanced, Proficient, Needs Improvement, and Warning). To justify the Advanced rating, a certain number of items on each test are rated higher in difficulty (i.e., above grade level), but the difficulty ratings per item are not provided.
  4. In 2011 Massachusetts released new curriculum standards aligned with the Common Core State Standards, along with a four-year plan to gradually align the MCAS with the Common Core. The new standards were not as comprehensive as the 2000 Massachusetts standards, nor as developmentally appropriate, especially in grades K-3.
  5. I started to see almost all of the publishers of curriculum materials being bought up by Pearson Education, Inc., the same company that is producing the PARCC test.
I realized I had been bought out myself by the entire idea of high-stakes testing. The unfair testing of students was then used to unfairly evaluate teachers. By unfair I mean testing a full year of curriculum standards that teachers in reality have only 6-8 months to teach. The tests are especially unfair to special education students and English language learners, who may not be ready to be tested or may need special accommodations. For example, the fourth grade math test in Massachusetts is given at the beginning of May, which means a teacher has only 7 months to teach a 10-month curriculum before the test, minus the time lost to earlier subject tests and to test preparation practice.
Fourteen years ago research showed that the MCAS test scores really just describe the demographics of our students. In 2001 the Donahue Institute analyzed the MCAS results and reported, "One of the consistent findings of this research is that demography explains most of the variation in test scores from district to district. Results from this year's research are similar to results from last year's work: about 84% of the variation in test results (scores for all of the test-taking students for the nine MCAS tests combined) is explained by demography. That is why Weston and Wayland have high MCAS scores and why Holyoke and Brockton have low MCAS scores. Thus, though demography is not destiny, it sets a strong tendency… In the end, the MCAS scores tell more about a district's real estate values than the quality of its schools."1
Now we are facing the prospect of mandated PARCC testing in Massachusetts to replace the MCAS tests. We need to take a step back and reflect on what we are doing to our students, and to our teachers, with this overemphasis on high-stakes testing. What about love of learning, creativity, and the development of individual expression and interests? School cannot be just about test scores, and neither can teaching.
1. Gaudet, Robert D. Effective School Districts in Massachusetts: A Study of Student Performance on the 1999 MCAS Assessments. The Second Annual Report. University of Massachusetts Donahue Institute, March 2000. http://www.donahue.umassp.edu/docs/effective-districts-mcas-pdf
