OPINION: The invalidity of the Florida Standards Assessment


On March 5, students across Florida walked into school to take the new Florida Standards Assessment (FSA), a replacement for the old FCAT. Instead, those students found blank login screens and technical errors. State education officials attributed the issues to a cyber-attack, and testing had to be rescheduled for another day. The April 13 testing went off mostly problem-free, but the technical issues returned for Round 2 on April 20 due to last-minute changes made by FSA vendor American Institutes for Research (AIR).

Amid it all, the FSA has been blasted by both educators and politicians, and there have been calls for the FSA to be ended or not counted against students. One of the most interesting arguments from opponents of the FSA is that the irregularities and rescheduling have made the FSA inconsistent, and thus not an accurate measure of student performance.
This argument certainly makes sense. For example, due to the aforementioned issues, the second day of FSA Reading makeups was moved to a week later. Under normal circumstances, the two parts of an FSA test would be at most a few days apart, and you would expect student performance to be roughly similar on both days. The entire purpose of scheduling the test days close together is to ensure little variation in performance between them while minimizing the inconvenience to teachers. The rescheduling fails on both counts. With the two testing days now weeks apart, a marked difference in performance between them would not be surprising; the extra time gives students a chance either to forget what they knew on Day 1 or to study further and be even more prepared than they were the first time. The change in testing days also burdens teachers, who now must either plan around yet another testing day or expect a sizable chunk of their students to need make-up work.
With all of these errors and inconsistencies, there has been considerable criticism that the FSA is not a valid test. Interestingly, this pattern is not new; Kansas also experienced cyber-attacks and technical problems when it rolled out new tests, and according to the Miami Herald, "students who suffered technical difficulties skipped more questions than students who didn't: 15 percent compared with 1 percent." If that isn't convincing evidence, it should also be noted, according to the Herald, that the FSA was never fully tested, and the field testing that was performed was done in Utah, not Florida.
In the end, with the myriad problems facing the FSA, the test is of questionable worth. Both the Florida Department of Education and FSA vendor AIR have mishandled the test. For the sake of fairness, FSA results should be discarded, and the test must not count against schools or students. Maybe then lawmakers will realize the complexities of standardized testing.