Numerous letters
and articles have been released by school superintendents statewide disputing
the results of the recently released State Report Cards. We would like to take this opportunity to
inform our community about the way these scores are generated and
reported.
One of the most
challenging mandates handed to us in recent years by State and Federal
authorities is the increased focus on testing.
Our Board of Education, administrative team and staff believe that we
must be held accountable for the success of our students, but we feel strongly
that the tests mandated by the State of Ohio do not paint an accurate picture
of student achievement in Vermilion.
In August we
received our first report of the AIR test scores from tests that were
administered in February and March 2016 to students in grades 3 through 11. The
AIR test replaced the troubled PARCC test that was required in 2015. As we evaluated the AIR scores, many appeared
glaringly inconsistent. For
example, the English II scores for high school students were reported at a 60%
passage rate on the AIR test. The PARCC
scores from the previous year showed a 75.3% passage rate for the same group of
students. How are we, as a community, to
believe that scores could vary by such a large margin, especially when we were
assured the AIR test was a much less rigorous test? The same question came up as we looked at
scores for social studies and math at all grade levels.
Other examples of
results that raised questions and concerns about the scoring methods are:
- The majority of our special needs students (those with an IEP) did not pass the math exam, yet social studies scores for the same students were much higher on the AIR test than on the previous year’s PARCC tests.
- The Report Card score includes a penalty for any student who did not take the test because the parent chose to “opt out”. In other words, the State permits parents to opt out, yet the district’s Report Card grade is lowered for every student who did not test.
- The new “Prepared for Success” measure looks at students over a two-year period. In late June, the State changed how the second-year data were to be reported, but districts were not permitted to update data derived from the first year. As a result, improvements made by districts, such as adding college-level courses, are not considered or included in this year’s score.
- The Achievement metric measures how well students performed on state tests. Federal guidelines have expanded testing, adding nine tests across all content areas, and Ohio has changed test types three times in as many years; the required passage rate will change for the fourth time in the spring of 2017. This is analogous to an architect changing the building plans for a high-rise every time the builders start a new floor; imagine how confusing and difficult that would be for the building inspectors. There is no consistent measure of excellence. In other words, while teaching and learning standards have remained constant in local districts, assessment requirements have changed repeatedly, making it very difficult, if not impossible, to evaluate and improve instructional methods.
- The K-3 Literacy Rate compares the results of a student’s preliminary reading assessment to their proficiency on the Grade 3 test. However, the new AIR test incorporates reading and writing. This Report Card measure is flawed in that it compares a rate based on a reading score to one based on a reading AND writing score. As a result, the calculated score does not reflect actual literacy attainment. The target for achievement will change again next year to an even higher passage rate.
Tests are given on
only one day of the year and are not a true measure of a student’s
ability. However, they are used as the
basis of the State Report Card grade for districts statewide. Since all public school districts are
provided with only final test scores, which are returned to us months after
testing, our teachers and administrators cannot use the results to accurately
evaluate student performance or to improve instruction. The present method required by the State is a
violation of everything we know about effective teaching and evaluation
techniques.
In Vermilion we
have used the MAP test in kindergarten through grade 8 to inform instruction. The MAP test measures the academic progress of
each student. The results are returned
to us very quickly, allowing our team of teachers and administrators to
identify areas of strength and weakness and make instructional adjustments at the
grade level, in a specific subject area, or for individual students. The MAP results give us a true picture of
whether or not students have achieved one year of growth for one year of
instruction.
State-required AIR
and PARCC tests are given only at the end of the school year. The results are not returned to us until
months later and therefore are not useful for informing instruction at the subject
level or for individual students. We are
given no information about the way the tests are scored, or how each student
performed. In order to evaluate and
learn from testing, it is important to see every student’s test and to identify
his/her strengths and weaknesses. For
example, we find it difficult to improve third-grade writing scores when we
have no information about how the state test was evaluated or scored. The data used to create the State Report Card
is therefore flawed and unreliable. The
State of Ohio’s approach to assessing student achievement is in direct conflict
with all of the educational research on evaluating and improving
instruction. This method of testing one
subject, on a single day, defies all best-practice teaching methods.
The Plain Dealer
recently used the statewide Report Card scores to rank schools on a grade-point-average scale from 4.0 to
0.0. Vermilion ranked
176th out of the 608 schools in this study.
We are not citing these statistics to say that we are better than other
districts, but merely to demonstrate that there is something wrong with a
reporting system that grades only 205 of those schools at a 2.0 GPA or better (the
equivalent of a “C” average). When this
many schools score at or below a “C” average, educators know that they have to evaluate the
test itself, not the test takers.
This is the first
time Vermilion educators have spoken out about this issue. We do not wish to appear to be making excuses
by sharing this information with the community.
We know very well that we must be accountable to our parents and
taxpayers. We continue to work hard, on
a daily basis, to provide the very best education we can to all of our
students. However, we also believe that
we have reached a tipping point with the unreasonable demands of State and
Federal authorities. At this time
we will be joining forces with public school educators statewide to make our
thoughts known to the Ohio Department of Education and the Governor’s
office. We will keep you informed about
this important discussion.