February 23, 2018

Maine didn’t do so well on the Smarter Balanced test. The hard part is figuring out why.

George Danby | BDN
By Matthew Stone, BDN Staff

Maine students typically fare well on the one national exam that’s used to compare academic performance state by state, the NAEP. But the Smarter Balanced assessment, the online standardized test that Maine students took for the first and only time this past spring, paints a more sobering picture of how Maine students stack up with their peers across the country.

Maine was one of 18 states where students took the Smarter Balanced exam this year, and the exam was designed so one could compare results state by state.

— Among the 12 states that have publicly released their scores so far, Maine’s students rise to the top half — and just barely — in only one subject area: third-grade English, in which Maine ranks sixth based on the percentage of students who scored proficient or better.

— Maine ranks 10th among the 12 states for math in third, fourth and fifth grades.

— At the high school level, Maine 11th graders ranked eighth out of 11 in math and ninth out of 11 in English, according to a BDN analysis of test results. (One of the 12 states, Missouri, didn’t administer the Smarter Balanced test for high schoolers.)

— Maine students placed either eighth or ninth among the 12 states in all other testing categories. (Smarter Balanced tested students’ math and English abilities in grades 3-8 and 11.)

The difficult part is figuring out why, especially given that Maine typically outscores or performs on par with most of the 11 comparison states on the NAEP, or National Assessment of Educational Progress, which tests sample groups of fourth- and eighth-grade students every other year in math and reading.

Because Maine is dropping Smarter Balanced after one year, students won’t have another chance to take the online test. That means no one will know whether this year’s results were a fluke or a more meaningful indicator of student performance.

Comparison cautions

There has long been an expectation that states would see student performance drop when they switched to a new test based on the more rigorous Common Core standards for math and English.

It happened a few years ago in Kentucky and New York, the first states to test their students’ Common Core mastery. And it happened this year in Maine.

In 2013, the last year of Maine’s former testing regime, the New England Common Assessment Program, or NECAP, nearly 70 percent of students were proficient in English; on the Smarter Balanced English test this year, the figure was 48 percent. In math, about 60 percent of Maine students scored proficient or better on the NECAP. With Smarter Balanced, the figure was 36 percent.

“There’s no doubt that these are assessments that are more rigorous than each of the Smarter Balanced states had under their old test,” said Deb Sigman, a former deputy superintendent of the California Department of Education who served as chair of the Smarter Balanced Assessment Consortium executive committee. “Quite frankly, it takes a lot of time to transition to a new system. You have new instructional materials, new instructional strategies that teachers need to become skilled in.”

Those are important factors to consider in comparing performance state by state, Sigman said. “If you’re going to do state comparisons, so much is dependent on context in each state.”

It’s also helpful, she said, to compare like groups of students in each state — such as economically disadvantaged students, English language learners and minority groups. But Maine and a number of other states haven’t released such breakdowns of their scores.

Maine education officials aren’t investing much effort into comparing the state’s Smarter Balanced scores with others’.

“We chose not to go that route, but rather put our energy into going forward, finding a new [test] vendor and, hopefully, a long-term partner,” said Department of Education spokeswoman Anne Gabbianelli. “You’d have to ask yourself, how accurate would any of it be, anyway?”

Transition trouble

A handful of states this year encountered technology troubles that largely paralyzed Smarter Balanced testing efforts. (Those states, Montana, Nevada and North Dakota, have not released test scores.)

Maine wasn’t among them, but the spring testing period had its share of hiccups: reports of students taking the wrong test or being unable to save their work, technology glitches, wordy and typo-ridden questions, and general growing pains associated with a new assessment.

“You’d have to compare testing environments across those different states to us,” said Bangor schools Superintendent Betsy Webb. “What does their testing technology look like? How did they test? How did their departments of education roll out supports to teachers?”

In Bangor schools, for example, testing capacity in the lower grades was limited by technology, Webb said. Younger students used older laptops that previously belonged to middle-school students through Maine’s laptop program, and there weren’t enough to allow all students to take the test at the same time.

And the state Department of Education was ironing out testing details essentially up to the moment when schools started to administer the test, Webb said. “We were in a learning phase, and everybody knew that.”

Then, in Augusta, lawmakers unanimously approved a bill to cancel the Smarter Balanced test after year one just as students across the state were taking the exam.

“When the kids hear there’s a debate among adults about the worthiness of the test, you have to wonder, to what extent do kids bring their A game to this test?” said Duke Albanese, who served as Maine’s education commissioner from 1996 to 2003 and is now senior policy adviser at Great Schools Partnership, which works with schools across Maine on reform strategies.

Many Maine students and parents apparently agreed that testing wouldn’t be worthwhile. Substantial numbers of high school students opted out of the test altogether and focused instead on Advanced Placement, SAT and ACT exams given in the same busy season.

At six high schools, fewer than 20 percent of students participated in testing. At Yarmouth High School, for example, just 9 percent of students took the English test.

“It almost seems like you should be able to compare this state to state, but you don’t know to what extent the context is a contributing factor,” said Albanese. “You might have some high-achieving schools, which you do, with high opt-out rates. We’re a small enough state that that could affect the state profile.”

Opting out of validity?

But the impact of low participation would most likely be confined to Maine’s high school results. Of the 69 schools statewide with participation rates lower than 80 percent, 49 were high schools. And high schools with low participation generally belonged to districts with substantially higher participation at the elementary and middle school levels.

In Bangor, about two-thirds of high school students participated; participation didn’t drop below 96 percent at any other city school.

Statewide, Maine’s overall test participation rate was about 89 percent — lower than the 95 percent required by federal law, but not necessarily low enough to invalidate scores.

What next?

For states that will continue giving the Smarter Balanced test, this year’s scores serve as a baseline. In Maine — one of three states dropping the test altogether — the value from one year of test results isn’t as straightforward.

“Although it was only a one-year data point, we welcome any data point that can assist us in helping to prepare students,” Webb said. “We are analyzing that data.”

Unlike past standardized exams, the Smarter Balanced test emphasized critical thinking and problem-solving rather than recall and rote memorization. So the 2015 results could inform educators’ efforts to help students develop those in-demand skills and their efforts to acquaint students with the Common Core standards.

“The data point you in directions where you should be posing questions about what this means and how we can do better,” said Sigman. “Regardless of where you’re moving, I think there’s still value to be gained out of the information from these exams.”

