
Ohio didn’t like its students’ Common Core test scores — so it changed the passing grade



rainymike

The Philippines seems to have adopted the Common Core in designing its national curriculum. Similar problems are likely to emerge here down the road.

 

 

If you want to understand just how political the process of determining how well students do on tests can be, consider this: Ohio didn’t like its students’ Common Core test scores, so it changed the way they are interpreted.

 

Ohio just released an initial wave of results from the Common Core test known as PARCC, which stands for the Partnership for Assessment of Readiness for College and Careers, the multi-state consortium that designed the exam with federal funds. (The test is aligned to what are called the Ohio Learning Standards, which are basically the Common Core State Standards under a different name.) The scores were for online PARCC tests, which were taken by about 75 percent of students who took the exam in Ohio.

 

Under the scoring rules for the test set by the PARCC consortium, a little more than a third of students who took the test online are considered “proficient” in math and English language arts. The state decided to use a lower benchmark for “proficiency,” which puts the share of students deemed proficient near 65 percent.

 

In a memo (see text below) about the Ohio proficiency controversy, Karen Nussle, executive director of the nonprofit Collaborative for Student Success, shows how test results can be viewed in different lights depending on what a passing score is determined to be.

 

“Proficiency as defined by the Ohio State Board of Education is inconsistent with how proficiency is defined by both the Partnership for Assessment of Readiness for College and Careers (PARCC) and the National Assessment of Educational Progress (NAEP), the nation’s report card. This discrepancy should give pause to parents, community leaders and policy makers who expect transparency in Ohio’s transition to higher standards and new tests. … [it] suggests that Ohio has set the proficiency bar too low and undermines the promise of ensuring kids are on track for college and career.”

 

This past summer, Ohio decided to stop using PARCC and contracted with a testing company for a new exam to give to students.

 

Changing the “cut scores” of exams is not a new game among education officials in various states who don’t like the optics of very low test scores. As explained in this Answer Sheet post by educator Carol Burris, cut scores are selected points on the score scale of a test, and the points are used to determine whether a particular test score is sufficient for some purpose. Notice the word “selected.” Cut scores are selected based on criteria that the selectors decide have some meaning. It is often the case that the criteria have no real validity in revealing student achievement, which is the supposed mission of the test — and that means the scores have no meaning either.
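To make concrete how the choice of cut score drives the reported proficiency rate, here is a minimal sketch. The scaled scores and cut points below are made up purely for illustration; they are not Ohio's or PARCC's actual scale. The same set of results yields very different "proficiency" rates depending solely on where the bar is set:

```python
def proficiency_rate(scores, cut_score):
    """Percent of scores at or above the cut score."""
    passing = sum(1 for s in scores if s >= cut_score)
    return 100 * passing / len(scores)

# Twenty hypothetical scaled scores -- identical in both runs below.
scores = [650, 680, 700, 710, 715, 720, 725, 730, 735, 740,
          745, 750, 755, 760, 765, 770, 780, 790, 800, 810]

# Stricter cut score: about a third of students are "proficient".
print(proficiency_rate(scores, 760))  # 35.0

# Lowered cut score, same results: nearly two-thirds are "proficient".
print(proficiency_rate(scores, 730))  # 65.0
```

Nothing about the students' performance changes between the two runs; only the selected point on the score scale does.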

 

In 2012, Florida gave a new standardized writing test to students in various grades, and only 27 percent of fourth-graders had proficient scores on the Florida Comprehensive Assessment Test, down from 81 percent the year before. Eighth-graders and 10th-graders also scored dramatically lower than the year before. State education officials panicked, and the Florida Board of Education decided to lower the passing score on the exam.

Another way to game the testing system is to make the tests easier so the scores will be better, which is what happened in New York City public schools during the eight-year tenure of Joel Klein as chancellor. He had to resign in 2010 after it was revealed that the test scores he pointed to as proof of the success of his business-based reforms were based on increasingly easy standardized exams.

 

Here’s the full memo by Karen Nussle:

 

Ohio this week became the first PARCC (Partnership for Assessment of Readiness for College and Careers) state in the country to release preliminary student test results measuring Ohio’s Learning Standards, which are based on the Common Core State Standards.

That the number of students deemed proficient on the tests was lower than in previous years is no surprise. The Common Core State Standards are, after all, considerably more rigorous than previous K-12 standards. But what parents should pay attention to is the percentage of students determined by the state to be proficient under the new assessments.

 

Proficiency as defined by the Ohio State Board of Education is inconsistent with how proficiency is defined by both the Partnership for Assessment of Readiness for College and Careers (PARCC) and the National Assessment of Educational Progress (NAEP), the nation’s report card.

This discrepancy should give pause to parents, community leaders and policy makers who expect transparency in Ohio’s transition to higher standards and new tests.

 

Who’s Looking Out for Students?

 

According to the state, more than half of Ohio students who took the PARCC exam are now officially proficient in math and English. Both PARCC and NAEP, however, would consider that percentage to be significantly lower. The discrepancy suggests that Ohio has set the proficiency bar too low and undermines the promise of ensuring kids are on track for college and career.

 

For the past five years, since Ohio first adopted the Common Core State Standards, the state has been engaged in a very difficult transition, not only to bring transparency and honesty to the process of reporting student achievement but also to raise the bar in order to ensure kids are prepared for the next step after high school.

 

This herculean effort was needed because in past years, Ohio’s Honesty Gap – that is, the difference between proficiency rates reported by state tests and those on the NAEP – was among the most pronounced in the country.

 

By expanding the definition of proficiency to include students who are less than proficient, it appears the state is regressing. “I’m trying to understand how these [proficiency rates] are raising expectations,” said Sarah Fowler, a member of the state board of education. There has been no explanation of why this decision was made, but we can speculate that it was so more students would score “proficient” on paper, not because they truly earned that designation. We encourage Ohioans to ask these questions.

 

For further insight, read this post from the Fordham Institute’s Ohio Gadfly.

 

Local Control Still Needs to be Honest and Transparent

 

We very much support Ohio’s ability to make these decisions for itself, without intrusion by the federal government or other national entities. These decisions need to be made in Ohio by Ohioans.

 

But at the same time, Ohio parents deserve an honest assessment of student proficiency. ‘Local control’ cannot become a fig leaf that covers up a dumbing down of the system in order to make policy makers look good at the expense of kids.

 

At this critical time of transition, parents deserve transparency. And they deserve the truth. They should demand it in Ohio.

 
