Massachusetts adopted new regulations in 2010 requiring districts to use “district-determined measures” and state test score data to create Student Impact Ratings for all licensed educators as part of the educator evaluation system. These state regulations stemmed from federal mandates that evidence of student growth be a “significant” factor in educator evaluations. After strong opposition to this federal mandate across the country, Congress passed a new law in 2015 — the Every Student Succeeds Act — repealing the requirement. However, the new state regulations linking test scores to evaluations are still on the books.
The MTA has joined other educators in working to repeal the Student Impact Rating and DDM requirements. Among other concerns, opponents believe that:
- Student test scores are an invalid and unreliable measure of educator quality.
- Developing and administering local tests solely or primarily for the purpose of evaluating educators is a waste of valuable teaching time.
- The mandate exacerbates the ongoing problem of an excessive focus on standardized tests in public education.
- Judging educators based on student test scores will make it more difficult to fill high-need teaching assignments and discourage collaboration.
Background on DDMs and the Student Impact Rating
The requirement for using District-Determined Measures and MCAS growth scores in the educator evaluation system was initiated in 2010 in the Department of Elementary and Secondary Education’s application for federal Race to the Top grant funds. The DESE’s commitment to this strategy was reinforced in the state’s successful 2012 application for a waiver from the federal No Child Left Behind Act.
The educator evaluation regulations containing the DDM requirement were approved in 2013. Under these regulations, at least two measures of educator effectiveness based on student outcomes must be developed for every licensed educator.
For those who teach English language arts and/or math in MCAS-tested grades, one of these DDMs must measure trends and patterns in their students’ Student Growth Percentiles. A trend must consist of at least two years of data.
Student Impact Rating
DDMs are to be used to determine an educator’s Student Impact Rating of Low, Moderate or High. That rating in turn informs the length and content of an employee’s Educator Plan, most notably whether the employee is on a one- or two-year self-directed growth plan. The current implementation timeline calls for some districts to provide Student Impact Ratings for some educators this fall.
Many districts opted for an alternative pathway last spring that allowed an additional year before these ratings would be reported to the DESE. The department needs to clarify the impact of hold-harmless provisions due to the PARCC assessment and a new MCAS 2.0 in 2017.
The MTA maintains that the selection or creation of DDMs is a mandatory subject of bargaining. Negotiating over DDMs has been completed in some districts and is ongoing in others. Some districts and locals are in agreement that this effort is time-consuming and unproductive, so they have sought to minimize the requirement as much as possible.
Some educators have said they are not opposed to using common educator-developed assessments for informing instruction – for example, to determine how much students know about a particular subject before finalizing lesson plans. However, polls show that few educators support using scores on these assessments in the evaluation process given the multitude of factors that contribute to how well a student performs on a test. These factors include the student’s own effort, ability and emotional state and the quality of instruction the student received in prior years or other classes.
Statistical methods that use student assessment results to estimate educator effectiveness are called value-added models, or VAMs. In November 2015, the American Educational Research Association issued a statement that included the following:
“There are potentially serious negative consequences in the context of evaluation that can result from the use of VAM based on incomplete or flawed data, as well as from the misinterpretation or misuse of the VAM results.” The statement includes eight criteria that must be met before a measure of student growth can validly be used in educator evaluation. No DDMs developed in Massachusetts meet the first of those criteria:
“VAM scores must only be derived from student scores on assessments that meet professional standards of reliability and validity for the purpose to be served.”
While the DDM mandate is considered onerous by many in Massachusetts, in some states it is even worse, with test scores counting for as much as 50 percent of an educator’s evaluation. Some states, on the other hand, declined to participate in Race to the Top or the NCLB waiver process because they opposed the mandate. For example, the state of Washington lost its NCLB waiver after refusing to implement a test-score-based evaluation system.
This issue was hotly debated across the country in 2015. It became an important factor in the lopsided vote in Congress in favor of replacing NCLB with the Every Student Succeeds Act. Under that act, the federal government is explicitly forbidden from requiring states to use test scores in their educator evaluation systems.
DDM Mandate Repeal Effort
Until and unless the DDM and student impact rating mandates are repealed, local associations may continue to rely on the implementation guidance provided in this toolkit.
MTA Opposes New District-Determined Measures/Impact Rating Regulations
Why the DDM Mandate Should Go
White paper authored by the MTA and AFT Massachusetts explaining why judging educators based on student test results is invalid and unproductive.
District-Determined Measures Overview
According to the regulations (603 CMR 35.02), District-Determined Measures shall mean measures of student learning, growth and achievement related to the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks or other relevant frameworks that are comparable across grade or subject level districtwide. These measures may include, but shall not be limited to, portfolios, approved commercial assessments, district-developed pre- and post-unit and course assessments, and capstone projects. This page will be updated as new information from USED, DESE, school districts and local associations becomes available.
MTA District-Determined Measures Guidance
MTA’s guidance on the development of District-Determined Measures (DDMs) provides a step-by-step approach that local associations and districts may use in identifying assessments currently in use that could be adapted or adopted as DDMs. This guidance also recommends a process for developing new DDMs. For each educator, there must be at least two DDMs, and the educator’s impact on student learning rating must be based on at least two years of outcomes, as mandated by state regulations.
Educator Evaluation Framework Ratings
As part of the Educator Evaluation Framework, all licensed educators will receive a summative evaluation rating and an impact on student learning rating.
Student Learning Goal vs. Impact on Student Learning
As part of the five-step evaluation cycle, educators complete a self-assessment and set a student learning goal in the educator plan.
Relevant State Statutes, Regulations and DESE documents
State Statutes and Regulations
Chapter 150E is the state statute governing public-sector collective bargaining, requiring negotiation with respect to wages, hours, standards of productivity and performance, and any other terms and conditions of employment.
Chapter 71, Section 38 governs school committees’ bargaining performance standards and establishes binding interest arbitration to resolve issues when parties are unable to reach agreement.
Final Educator Evaluation Regulations adopted by the Board of Elementary and Secondary Education.
Memos Related to District-Determined Measures from the State Education Commissioner
The regulations are clear that District-Determined Measures must be grounded in the Massachusetts Curriculum Frameworks, Massachusetts Vocational Technical Education Frameworks or other relevant frameworks. The Massachusetts 2011 English Language Arts and Mathematics Curriculum Frameworks include all of the Common Core State Standards. District and Association leaders may find that the overarching shifts of the Common Core State Standards provide significant guidance in mapping out how to do this work. Some helpful links include:
MTA Analysis and Recommendations on DESE’s DDM Model Contract (February 7, 2014)
MTA username and password required to view this document in the MTA Leaders area.
MTA's Center for Education Policy and Practice worked with Holyoke public schools from January 2012 through June 2013 on full district implementation of the Educator Evaluation Frameworks, including the identification of district-determined measures. MTA and the Holyoke Public Schools presented this work at the Department of Elementary and Secondary Education's Spring Convening on Evaluation. Click here for a four-page review of this work.
An MTA username and password are required to view the following documents.
NEA’s Teacher Evaluation: A Resource Guide for National Education Association leaders and staff.
QUESTIONS? MTA members with questions about the Common Core, PARCC or ACCESS may send an e-mail to DDMs@massteacher.org.