Wrong Drivers

Some of you are aware that I am a huge fan of Michael Fullan’s work.  His belief is that if we build up the teaching core and provide professional capital for our teachers, we will have better teaching and in turn better “results”.  He also believes that we have taken a turn and are utilizing more wrong drivers than right drivers, especially in the US and UK.  I wrote about this in my blog here.

Jeff Craig oversees the Instructional Support Department at OCM BOCES.  In light of Governor Cuomo's State of the State address this past week, Jeff wrote a great blog post about APPR that I would like to share with you for your perusal.  Enjoy!

Vic

APPR Insanity

The definition of insanity is doing something over and over again and expecting a different result, according to Einstein. Or Franklin. Or Twain. All three of these noteworthy thinkers have been reported to say this – and most often this axiom has been attributed to Einstein. He was, after all, both smart and witty. As it turns out, however, no one has been able to find this in his writing. Nonetheless, this oft-used truism applies to the system of Annual Professional Performance Review (APPR) in New York State. In fact, we are trying to do the same thing, over and over, and expecting a different result.

The Widget Effect, published in 2009, described the state of teacher evaluation. It found that, under the evaluation systems then in use in the United States, virtually all teachers were rated satisfactory (less than 1% were rated unsatisfactory). The report also concluded that truly excellent teaching went unrecognized, that professional development was not connected to evaluations, and that poor performance was not addressed. Despite the overwhelmingly positive ratings teachers were receiving, 57% of teachers and 81% of administrators reported that there were poor teachers in their school. These findings pointed, we were told, toward the need for a new system of teacher evaluation.

The conclusions of the Widget Effect report were used to argue for a new APPR system in New York State. Add the incentive of Race to the Top money for the state during a period of fiscal contraction, and you get the political context for a deal. The framework of the deal is the 20% + 20% + 60% APPR calculus. The first 20% was supposed to come from the state and be based on some measure of student achievement. Whether or not you agree that basing 20% on state assessments is a good idea, the state, as it turned out, could only determine that 20% for a small fraction of teachers. For the rest of the teachers, school districts were delegated the authority to use a mechanism, adapted from a few larger jurisdictions in the country, known as "Student Learning Objectives" (SLOs). Besides the fact that SLOs were poorly named (educators assumed an SLO was a learning objective for students), the regulations governing them were written in such a way as to allow for many different interpretations.

The second 20%, according to the NY APPR plan, was to be a locally agreed-upon measure of student achievement that had to be different from the first 20%. There were even fewer regulations governing this part of the evaluation, so the variation between districts was considerable.

The variation in local interpretation and implementation of the locally-determined 20% turned out to be nothing compared to the variation in the final 60%. The 60% portion was supposed to come from multiple measures, which included evidence collected from a minimum number of classroom observations. Like the second 20%, this had to be negotiated with the local professional association. NYSUT, New York's teachers' union, introduced an extremely generous conversion scale that many districts adopted. Other districts literally gave teachers a significant portion of the 60 points just for submitting artifacts, with no assessment whatsoever of their quality.

What was the result of this APPR cacophony? The result was that systems were locally constructed to be very generous to teachers. Yes, there was a great deal of drama in the teacher ranks about the widespread and unjust dismissals that would supposedly follow from implementation of the new APPR system. The drama was unnecessary, as it turns out, because in most districts the deck was stacked in favor of high evaluation scores for teachers. How high? Well, the most recent information from the State Education Department indicates that just 1% of teachers were rated ineffective. Swap the label "unsatisfactory" for "ineffective" and you end up with precisely the same number that the Widget Effect cited as a rationale for a different system of teacher evaluation.

Now, due in part to a feud between NY Governor Andrew Cuomo and the teachers' association, another change to the APPR system seems possible. The Governor has seized on the inflated teacher evaluation results as an opportunity to force changes to the system through the budget process. While the new (and old, for that matter) APPR system doesn't work, there is no indication that the kinds of changes the Governor desires will improve it. Based on an exchange of letters between the Governor's leadership and SED, it sure looks like the present version of the APPR system is in the crosshairs. We're hearing about new math, such as 40% + 60% or 25% + 75% or 50% + 50%… but this, too, is just more of the same, and that is exactly the insanity described at the start of this post.

Systems like the APPR system in NY mistakenly place an emphasis on human capital rather than social capital and use accountability data in a punitive way. Rooted in what Michael Fullan categorized as the "wrong drivers of change," such systems are simply doomed to failure. Replacing old systems with similar systems, over and over, gets us to the insanity that someone other than Einstein, Franklin, or Twain described. So far, our leaders haven't learned from the past and haven't read much Michael Fullan. To our north lies a large system of education that is making progress based on an application of the "right drivers of change." Ontario, which happens to contain one very large city, a few other good-sized cities, and many geographically diverse communities, is making the kinds of educational improvements that we can't. Perhaps we should stop the insanity and apply a little common-sense, research-based thinking in place of political vitriol. If we don't, we'll continue to get what we've always gotten.

Jeff Craig
JCraig@ocmboces.org

Other News

  • Bus Duty for Jan 20-Jan 30, Team 9: Jessica Serviss, Brandi LaRue, Teresa Kiechle, Denise Croasdaile.  Upcoming Bus Duty, Feb 2-Feb 13, Team 1: Kathy Palmer, Brooke Santamont, Kyle Baker, Jenny Nachamkin
  • What are the five things you are grateful for?  Make a list daily.  It does wonders!
  • Be the change agent for kids!  Be a champion for kids.  Every kid deserves a champion!   Just do right!

Other Items of Interest (Read at your leisure!)

(Provided by the OCM BOCES IS Weekly Dispatch)

  • In his speech and in his budget proposal, the Governor offered no increases in aid to schools unless he achieves agreements on a more rigorous teacher evaluation system, tenure changes, and changes to the limit on the number of charter schools.
  • There’s been a great deal of attention paid to student engagement – this article reports on teacher engagement and its impact on learning. New York has the third highest percentage of actively disengaged teachers.
  • Perhaps some of the most effective ways to support students in poverty are the little things we can do, according to this New York Times article.
  • Sometimes we need to be able to laugh at ourselves. Try this education jargon generator – you’ll impress your colleagues (and get a good laugh out of it).
  • This post describes the process of developing guiding questions in a step-by-step manner. It includes an important step to ensure that your question is standards-based.
  • The questions can be more important than the answers – and it's better if the questions come from the students than from the teacher. Explore this post to get yourself thinking more about questions than answers.
  • You can improve the PowerPoint slides you use in the classroom (and improve the retention) by following this advice.
  • Small schools have additional challenges when transforming to a Professional Learning Community due to some of the scheduling and teaming issues. This column includes some suggestions.
  • This article describes the stages that teams in a Professional Learning Community might encounter during the transition toward becoming a truly effective team.
  • You and your students can compare all sorts of data about different aspects of our country at the Measures of America site. The site can provide facts to support and explain social issues.

A Touch of Humor
