Starting next week, NYC teachers will begin giving our MOSL (Measure of Student Learning) assessments. Thanks to Governor Cuomo, the NYS Legislature, and the Board of Regents, these now make up 40% of our total evaluations under New York State's APPR teacher evaluation system.
Since 2013, NYC teachers have been evaluated under criteria that change every year. This year, the evaluation was changed yet again in January 2017 and went into effect March 7th. Each school's MOSL committee had to finalize its selected methods and goals by April 7th, 2017, the same day NYC schools went on spring break. MOSL tests begin as early as April 28th in some schools, yet no one really seems to know what form they will take, what skills they will test, or what content knowledge will be included. At this point, with two months left in the school year, most teachers have no idea about, and no control over, 40% of their annual evaluation.
Under the Danielson-based Advance system, a teacher observed using such a shoddily planned assessment with their students would rightly be given an Ineffective rating.
Now, obviously, what I have just described has no legal or scientific legitimacy and would be laughed out of any peer-reviewed journal. These VAM-based MOSLs have as much credibility as alchemy or phrenology. So of course, this is the method that will be used to determine teacher proficiency and student achievement.
Besides the pressure these tests put on students, teachers' careers are now being evaluated on short-notice assessments that no teacher has even seen, much less been able to prepare their students for. Teachers are educated professionals; I myself have a doctorate in education. We can all clearly see that this method of evaluation is a farce. The problem is that school superintendents, administrators, and the UFT leadership go along with whatever nonsense the DOE emits. They force-feed it to teachers and students and expect them to swallow it without complaint.
Perhaps the most egregious flaw in this year's MOSLs is that, since they are based on student growth and the assessments were only agreed on two weeks ago, there was no time to administer baselines to students. Baselines are usually given at the beginning of the year to ascertain the level at which a student is performing, and the baseline assessment should be similar to the endline assessment. How do you measure growth if you have no idea where you started from?
Since the NYC DOE did not have the MOSL assessments ready in September, the city, in its infinite wisdom, will be assigning students what amount to composite "proxy" scores as their baselines. In other words, they are making up where each student should be, then giving a haphazardly compiled assessment and comparing the student's actual score to whatever algorithm they have conjured out of thin air. As one fellow teacher asked: "If they're making up a proxy baseline score, why not just make up a proxy endline score?"
The main reason for these last-minute exams is that, over the last three years, almost a quarter of a million New York students have opted out of the Common Core ELA and math assessments for grades three through eight. With so many opt-outs, Governor Cuomo determined that this year the tests could not be used for evaluative purposes for students or their teachers. The funny thing is, students can opt out of the MOSLs as well.
Common-branch teachers of subjects such as art, music, and physical education are linked to the performance of students in classes they do not even teach. Teachers whose classes end in a NYS Regents exam, the NYSESLAT, or the NYSAA will not have to give their students MOSL exams, as long as those classes make up 50% or more of their programs.
This Value-Added Model of teacher effectiveness has repeatedly been shown to be invalid, most recently in the case of teacher Sheri Lederman. VAM is junk science, and the MOSLs that the NYC DOE has come up with are exactly that: junk.
The Department of Education may throw these incompetent assessments at us, and it is indeed insubordination not to comply with the directives, but we do not have to pretend there is any merit to them. And as soon as our final ratings are compiled, we all need to flood the legal system with lawsuits against this clearly erroneous method of evaluation. Our students, and our teachers, deserve better than to be VAM-boozled like this.