The Florida Department of Education is working overtime to “set the record straight” about recent results of the validity study of the Florida Standards Assessment, saying some people have misinterpreted the study’s results.
Earlier this month, Utah-based Alpine Testing Solutions and Washington, D.C.-based edCount concluded the FSA was a valid tool for measuring student achievement in Florida; the results have since become the center of controversy.
In response, the department sent out an email Thursday attempting to dispel misinterpretations about the study’s results. The email follows several phone calls from the department to Sunshine State News, in which staff members attempted to explain specific portions of the study results.
Critics of the results homed in on several parts of the report that said the department did not fully review all items of the FSA. Alpine Testing recommended phasing out some of the test's content because it had been written to align with Utah standards and was left in the FSA.
“While alignment to Florida standards was confirmed for the majority of items reviewed via the item review study, many were not confirmed, usually because these items focused on slightly different content within the same anchor standards,” they wrote. “It would be more appropriate to phase out the items originally developed for use in Utah and replace them with items written to specifically target the Florida standards.”
FDOE said for all but three out of the 386 total items, external reviewers identified connection to a standard that appeared on the Florida test blueprints, which define test content.
“This affirms that the FSA accurately measures students’ knowledge of Florida's content standards,” read the department’s email.
FDOE also disputed claims that the FSA wasn’t suitable for Florida students because it was originally written with Utah-based education standards in mind, saying all items were reviewed for appropriateness to Florida standards.
The study was the focus of the Senate Education Pre-K-12 Committee, which met Thursday afternoon.
Representatives from both Alpine and edCount came to explain the results and answer questions from state lawmakers on exactly how they reached their conclusions.
Sen. Bill Montford, D-Tallahassee, called the department’s email saying people had misinterpreted the results “offensive.”
“At the risk of one more time saying that the superintendents ... or staff ... have misrepresented the fact, it seems like you may even have some questions too about the report itself," he said.
Senators had many questions for Alpine’s Andrew Wylie, who said the companies were put on a strict timetable, with only three months to complete the study.
“There was a very aggressive timeline set at the beginning of the process,” Wylie said.
Wylie admitted that although Alpine had done validity testing before, the testing company had never reviewed a statewide assessment.
Ultimately, Wylie said the results seemed to indicate the test was still on par with Florida standards.
“While the review process was not ideal ... it still was consistent with the test standards,” he said. “Content-wise, I think the content matches the standards at an appropriate level.”
Wylie admitted, however, there was “rigorous debate” on the study’s results.
“This was not an easy decision,” he said.
Commissioner Pam Stewart said the department looked over two drafts of Alpine's study and briefly reviewed the final results around 5 p.m. Aug. 31, the day before the study results were released. While department officials didn't have any specific influence over the study's conclusions, the department did request improvements to some aspects of the document, which were conveyed via telephone.
See the DOE’s “claim check” email here.
Reach Allison Nielsen by email at firstname.lastname@example.org