The forthcoming Common Core (CC) Assessments are the next generation of standardized tests in the US. They will meet the testing-frequency requirements of the most recent version of the Elementary and Secondary Education Act, also known as No Child Left Behind, unless Congress acts to change them, which is most unlikely. Forty-six of the fifty states have signed on to voluntarily administer exams written to the standards of the Common Core. The Smarter Balanced Assessment Consortium (SBAC) is one of two consortia that organize the architecting and contracting for the Common Core assessments; SBAC is responsible for about half of the member states, including California.
I have examined the SBAC's RFPs for the design and delivery of the CC assessments, and the consortium has managed to construct a guide for contractors that even Finnish educators would admire. It is difficult to tell from the website, but it appears that the SBAC employed work groups that engaged school practitioners, or at least retired practitioners, to shape the tasks.
The winning bids for exam design, delivery, and reporting for the SBAC have all gone to Wireless Generation, a company turned down by the New York Department of Education at least in part because of its parent company's (News Corp's) role in mishandling personal data. This actually concerns me less (for now) than the challenge that the private, for-profit Wireless Generation (WG) must meet to deliver on the promise of the Common Core.
I am hopeful that WG can construct a multiple-choice administration tool that is adaptive and requires less of students' time to assess what multiple-choice tests can assess; namely, what a student does not know. Call me cynical, but less time spent taking multiple-choice tests is a win at this point.
I am also confident that SBAC oversight of the process is likely to inspire WG to construct free-response questions for both language arts and mathematics that look much like the sample assessments already available on the SBAC website. In language arts, these sample items require students to read critically, write with insight, and analyze age-appropriate text with significant depth. In mathematics, the problem solving requires significant critical thinking in addition to strong traditional skill at mathematical manipulation. I am impressed.
The College Board and the International Baccalaureate are two examples of standardized-testing administration bodies that have done some things right. Both organizations employ a combination of multiple-choice items and free response in their standardized assessments. The IB goes one step further and collects sample work from the school year (essays, laboratory reports, etc.) as well. Both organizations then employ armies of trained educators to score the assessments in a controlled fashion.
The logistics of this task are incredibly challenging. For the scoring of the AP exams, tens of thousands of educators gather in gymnasiums and cafeterias around the country to score the nearly 3.7 million AP exams taken by students in a given year.
The IB handles this international challenge a bit differently. In addition to end-of-year exam-scoring events, trained IB scorers (all of whom are also IB-trained teachers) are mailed work samples completed during the school year by students at another school (likely in another country) to assess at their leisure – sort of – there is a deadline. In both cases, however, the exams are scored by trained human eyes; in particular, the eyes of accomplished, highly trained, professional teachers.
The Common Core assessment is likely to look much more like IB assessment than AP. The SBAC has outlined a formative assessment schedule that will look much like the IB's internal assessment of student work during the school year. And the extensive amount of writing in the SBAC-outlined assessments will more closely mirror the many papers associated with the IB's end-of-year exams. To meet the IB testing requirements in my 2,000-student high school (which serves only about 500 students who take IB exams), we employ a full-time IB coordinator and a part-time administrative assistant. The total local personnel cost for this administration (including benefits, insurance, etc.) is probably about $150k. The exams themselves cost the students $75 each. And the annual fee to be an IB school (which mostly goes to test creation, assessment, and reporting) is $10,400. To review… administering exams that closely match those outlined by the SBAC for the Common Core costs, for 500 students, about:
$150k + ($75/exam * 2 exams/student * 500 students) + $10.4k = $150k + $75k + $10.4k = $235,400
…or approximately $471 per student
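The arithmetic above can be sketched in a few lines of Python. All the dollar figures are the rough estimates quoted in the text, not audited numbers:

```python
# Back-of-envelope IB administration cost, using the figures quoted above.
personnel = 150_000         # full-time coordinator + part-time assistant (loaded cost)
exam_fee = 75               # per exam, paid by students
exams_per_student = 2       # language arts and mathematics analogue
students = 500              # IB exam takers at the 2,000-student school
annual_school_fee = 10_400  # IB membership fee (test creation, assessment, reporting)

total = personnel + exam_fee * exams_per_student * students + annual_school_fee
per_student = total / students

print(f"total: ${total:,}")                  # total: $235,400
print(f"per student: ${per_student:,.0f}")   # per student: $471
```

The per-student figure of $470.80 rounds to the $471 quoted above.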
To put this in perspective, the per-pupil cost to administer the NCLB-inspired tests in 2004 was approximately $20 per student. That is more than a 20-fold difference.
Here is a well-thought out examination of the cost of standardized testing…
WG will clearly face a significantly larger assessment challenge, and undoubtedly a smaller per-student fee to work with. The CC assessments must comply with the law; for language arts and mathematics, this will mean testing each child at least eighteen times during their public school experience (grades 3–11). I love back-of-the-envelope calculations, so hang with me on this one. The 13 million (of 19 million) test-eligible public school students who fall within CC-adopting states served by the SBAC will take approximately 26 million end-of-year tests per year (in language arts and mathematics alone), plus even more mid-year formative assessments. This is a challenge an order of magnitude greater than anything the most experienced test administration organization in the country has ever faced.
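The back-of-the-envelope test count works out as follows, using only the figures given in the text:

```python
# Back-of-envelope count of SBAC end-of-year Common Core tests.
students_sbac = 13_000_000   # test-eligible students in SBAC-served CC states
subjects = 2                 # language arts and mathematics
grades_tested = 9            # grades 3 through 11, tested every year

tests_per_year = students_sbac * subjects             # end-of-year tests each year
tests_per_student = subjects * grades_tested          # over a student's grades 3-11

print(f"{tests_per_year:,} end-of-year tests per year")   # 26,000,000
print(f"{tests_per_student} tests per student, grades 3-11")  # 18
```

Two subjects over nine tested grades yields the eighteen tests per student, and two subjects across 13 million students yields the 26 million annual end-of-year tests.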
How will WG accomplish this task? Both the AP and the IB are run by nonprofit organizations that rely upon armies of educators who travel, for little pay, to gain the experience of scoring student exams as professional development. The SBAC offers this answer to a related FAQ on its website: "Finally, teachers will score parts of the assessments, including extended response and performance tasks." If WG can build trust with America's teachers, and every school were involved in the process, this could be a significant growth experience for the country.
Keep in mind that computer-based assessment of student written argument is still impossible. Nobody has cracked this nut, and nobody in edtech is likely to do so for at least another ten years. However, there are computer-based productivity tools that could make this process faster and easier for our nation of educators.
There is a significant opportunity for WG to fulfill the SBAC's goals and improve the quality of assessment currently happening in the United States. If WG engages educators as allies and partners in constructing and assessing quality exams, we might just make a smarter generation. However, if, in the balancing act that weighs quality against profit, WG elects to cut costs with unproven computer-based assessment techniques, or worse, by outsourcing exam scoring to non-educators, I fear that the focus on student argument will be lost.
I am hopeful that what we will experience is a net improvement over what has existed for the last ten years. Assessment is an important part of learning. We administer far more summative assessment than any other industrialized nation right now, and it has not proven successful in increasing students' intelligence or critical thinking ability. Few would dispute that one significant reason for this failure has been low-quality tests coupled with a process that leaves educators out. With deeper assessments that focus on argument, written and assessed in partnership with America's educators and aided by advances in technology, we have the opportunity for a renaissance in public education.