
Saturday, April 23, 2016

Teachers are More Than a Score: BATs Report VAM Systems from Around the Nation
Marla Kilfoyle, Executive Director of BATs, and Melissa Tomlinson, Asst. Executive
Director of BATs

Across the country, teachers are being evaluated by systems known as Value-Added Measures (VAM). According to the American Statistical Association, “A VAM score may provide teachers and administrators with information on their students’ performance and identify areas where improvement is needed, but it does not provide information on how to improve teaching.” http://www.amstat.org/policy/pdfs/asa_vam_statement.pdf


BATs asked teachers from across the nation to report on the various evaluation systems used in their states. Many reported the demoralization of teachers, a sense of unfairness, and utter disdain for the lack of focus on supporting the profession rather than demeaning it. We were also particularly interested in the idea that teachers can attach letters to their end-of-year evaluations. In many districts around the nation, local unions can negotiate attaching a “letter of protest” to the end-of-year evaluation (even if your evaluation is PERFECT, you should still write a letter of protest). We will invite all BATs to find out if this is the case in their districts and to use it as a way to officially document our protest against using junk science to evaluate our profession. Here is what some of our BATs reported from around the nation.



Florida


Teachers are evaluated via the Marzano model and VAM based on student growth on the state assessment (FSA). The state allows districts to decide their own percentages for each category, as long as student growth is not weighted at less than 30%. The evaluation is returned in May for teaching practice, then in October as a full evaluation including student growth. We are allowed to write statements and put them in our files if we disagree with the score, or if we have circumstances such as a classroom of children who are more disadvantaged than others.


Teacher evaluation has three metrics: admin observation (40%), professional development plan (10%), and student data (50%). Because we wait for the data number, the evaluation is not finalized until Nov. 1 of the next school year at the earliest. Teachers are allowed to sign that they do not accept it and to write a response that will be attached.

Ohio


  • We are evaluated 50% on principal observation through TDES (5 "touches" based on Charlotte Danielson) and 50% on test scores, part of the Cleveland Plan law and now state law. Our pay has been tied to our rating for 3 years. Composites are due April 29th. Teachers cannot say anything about their evaluations to the state. Districts send threatening emails if teachers don't go into Battelle (the online reporting system) and confirm their rating.


  • More from Ohio - We are rated 50% on observation and 50% on Student Growth Measures. Our contract spells out a range of time between April 1 and May 10. Teachers can and do attach letters to their final evaluation.


  • Ohio, OTES & SLOs (utter stupidity from our inept state department): within the time frame, my principal encourages all of us to respond with anything she may have missed.


  • From Ohio BAT - OTES only if tenured: once every three years if accomplished, once every other year if skilled, and every year if developing, until the rating is increased [not sure how long they give, but this is the third year and I know of one teacher still trying to improve, but our contract protected us]; if ineffective, you are evaluated during the year to make sure you are meeting the goals of your improvement plan [you only have that year to improve]. The OTES eval is approved by admin with a post-conference, with a chance to defend and a possible change with further evidence. Teachers who are not tenured are on OTES if they are not in the Resident Educator program. Our district review plan went out the window with OTES. We do have the ability to add a written response to anything added to our file.

  • From Ohio BAT - I teach elementary art, and although the SLO process was supposedly discarded and my district opted for "shared attribution," I must "show student growth" with some kind of before-instruction and after-instruction numbers, which means I must still give tests to two of my 28 classes of fourth graders, 4 times a year. Last year we had to enter those test scores into an Ohio Dept. of Education portal. Even so, some percentage of my evaluation is based on math and LA scores, subjects I do not teach.

  • Ohio: 50% observation, 50% shared attribution (because Special Ed), returned in May. Yes, you can respond to all parts and attach proof or documentation.

New Mexico


  • Teachers are evaluated 50% on test scores, and teachers get docked in their evaluations for taking sick days and for poor reviews from students and parents. Teachers receive their evaluations at the end of the year.

New Jersey


  • Most teachers are observed 3 times a year (tenured) or 4 times a year (non-tenured), with some districts having a waiver for only 2 observations a year for tenured teachers. The most common model in use in the state seems to be the Danielson model, but other districts are using Stronge or Marzano. There are two basic types of observations - long (around 45 minutes) and short (about 20 minutes). The people doing the observations can vary around the state. In some districts it is done by supervisors and building administrators. In some districts, any supervisor, even one outside of your content area or specialty, can do the observation. Some districts even bring in an outside evaluation team. The final evaluation score is calculated through the state AchieveNJ portal. The evaluation score is a formula of the teacher observation ratings, a Student Growth Objective (SGO) score, and, if the teacher is in a tested area with above a specific number of students, a Student Growth Percentile (SGP) score that is based upon the standardized test scores (now PARCC). The combination of these scores gives the teacher their summative rating for the year. In tested grades and subjects, the breakdown is 70% observation score, 20% SGO, and 10% SGP (a sketch of this calculation appears after the New Jersey items below). The SGP was supposed to count toward more of the evaluation this year, but the union fought against an increase due to the issues with PARCC. The breakdown for a non-tested subject is 80% observation score and 20% SGO. Individuals may write a response to the final summative evaluation, or to any observation throughout the year. But the union has not pushed out an initiative for all teachers to attach a statement to the summative evaluation score. New Jersey BATs has been pushing for action on this.

  • Danielson model. Usually completed for me to see (online) within a week or two. I can write a response if I want to. We are also rated on SGOs. I teach K, 1, and 2 students with multiple disabilities. Danielson and SGOs are ludicrous.


  • Atlantic City, NJ - LOTI model, which I don't really care for. Eval returned in a timely manner, and a written response is permitted, although I have not yet needed to write anything.
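
To make the New Jersey arithmetic above concrete, here is a minimal sketch of the weighted combination described in that report. The 70/20/10 and 80/20 weights come from the report itself; the 1-4 score scale, the function name, and the example numbers are illustrative assumptions, not the actual AchieveNJ portal formulas or data format.

# Minimal sketch of the summative-rating arithmetic reported for AchieveNJ.
# Assumptions: each component is already expressed on a 1-4 scale; the
# function/variable names and example values are ours, not the state's.

def summative_rating(observation, sgo, sgp=None):
    """Weighted combination of component scores into a summative rating."""
    if sgp is None:
        # Non-tested grade/subject: 80% observation, 20% SGO
        return 0.80 * observation + 0.20 * sgo
    # Tested grade/subject: 70% observation, 20% SGO, 10% SGP
    return 0.70 * observation + 0.20 * sgo + 0.10 * sgp

# Example: a teacher in a tested grade and subject
print(summative_rating(observation=3.4, sgo=3.0, sgp=2.5))  # about 3.23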

Louisiana


  • 2x/yr (1 formal, 1 informal) using the Danielson rubric; we get our observation report 1-2 weeks later at a post-conference with that administrator; we may attach a written rebuttal to the official report if desired.

California


  • I am evaluated by the principal or VP every other year, on an online form based on the six California Teaching Standards. Each teacher gets a brief workshop on the process, then a conference with admin to confirm understanding.


Connecticut


  • Tenured teachers get an "official" eval every three years. This means a pre-conference with admin, observation of a lesson, scored by a Danielson checklist, all entered into a Pearson platform, of course. Complete paperwork reflecting on the lesson. Observation is worth 40%; 45% is test scores, 10% a parent survey, and 5% the whole-school rating on SBAC. When not on an official year, you have to prove your worth and upload evidence of indicators met via Danielson. Non-tenured teachers are on official evals for 3 years. You can rebut, and have union presence at the meeting. No set turnaround time for the final score. Yeah, a score....


New York


  • We are evaluated 40% on standardized tests and 60% on the Danielson rubric, through supervisor observations with announced and unannounced visits. Next year this moves to one standardized test worth 50%, with an outside evaluator at 5% and the direct supervisor at 45%, both announced and unannounced, based on the Danielson rubric.

If you would like to send us your evaluation model, please email it to our Executive Directors, Marla Kilfoyle or Melissa Tomlinson, at Contact.BATmanager@gmail.com. We will continue to add to this post to keep everyone updated on evaluation models, based on junk science, throughout the country.
