What does Virginia’s assessment program say about what we value?

Henry County Superintendent Jared Cotton pinpoints four needed changes in Virginia's assessment system, and suggests how we might add rigor and relevance. 

For the last several years, I have had many opportunities to work in the assessment field. Prior to serving as Superintendent for Henry County Public Schools, I served as the Director for Assessment and Accountability for Chesapeake Public Schools and later as the Associate Superintendent for Educational Leadership and Assessment in Virginia Beach. I can honestly say that I have seen the good, the bad, and the “ugly” when it comes to student assessment. Through all of my experiences with assessment, I always come back to an anonymous quote that I feel plagues many school divisions across the nation: “We value what we measure, rather than measure what we value.” I interpret this to mean that we often place value on the assessments that are available to us rather than developing an assessment system that truly measures the outcomes we value for our students. In my opinion, this is certainly the case with the 34 criterion-referenced tests to which we attach value in Virginia.

This is why I feel it is important to ask the question, “What does the current assessment program say about what is valued in Virginia?” Does it say that we are focused on preparing our students for college, career, and citizenship by assessing the skills required for each? Does it reinforce the importance of relevant, rigorous instruction and assessment practices in classrooms throughout the Commonwealth? Does it provide educators and students with data to accurately measure student growth? Based on my observations, it is time for a critical look at Virginia’s assessment program and accountability model. To that end, I offer the following in support of the VASS Blueprint 2011, which calls for much needed change in Virginia.

Utilizing one assessment measure is not sufficient—Under the current assessment system in Virginia, one test on a given day is the only indicator of a year’s worth of learning at several grade levels and across all high school level courses. As Jay McTighe asserts, student assessment should be like a “photo album” rather than a “snapshot”: multiple assessments are necessary to show evidence of student understanding. In addition, researchers know the importance of “triangulation” when analyzing student data; a single measure cannot be used to assess student mastery. This became very clear to me in my current role as Superintendent for Henry County Public Schools. When I started in this position, I began to articulate the importance of literacy and the expectation that all students read on grade level. Based on the Virginia SOL English: Reading assessment, it appeared that close to 90% of all students had met this expectation. However, I decided to use the Scholastic Reading Inventory (SRI) as an additional measure of reading ability. As predicted, we discovered that several students who passed the SOL Reading Test were not reading on grade level according to the SRI. Had we relied on a single assessment, those students would have missed out on the reading interventions that were put in place at all elementary schools this past year. As a result, I no longer allow principals to use SOL data alone to measure student achievement. Should we have the same expectation for Virginia?

Multiple-choice assessments are not the most effective way to assess student mastery—I have said this before, and I will repeat it here. I am convinced that multiple-choice assessments remain the prevalent format in state assessment programs because of two things: time and money. We all know that multiple-choice assessments are easy to score and allow test developers to “sample” from a wide array of content. However, I frequently ask teachers an important question to ponder after they have administered a multiple-choice assessment: Can you say with 100% confidence that a student who passes a multiple-choice assessment has mastered the content? In every case, I get a resounding “No!” The truth is, students can use test-taking strategies or sheer luck to select the correct response, and teachers are well aware of this phenomenon. In addition, you cannot assess higher-level thinking skills with multiple-choice items alone. This point was driven home when I met with a college mathematics professor recently. He lamented that standardized tests are “killing his mathematics program.” My curiosity piqued, I had to learn more. He went on to say that over the past several years, his students constantly ask for multiple-choice tests so that they have “options” or choices on the test. He wants to ensure that students have a deep understanding of mathematical concepts and can apply what they have learned in new situations, and he cannot assess at this level using multiple-choice items. I don’t mean to start a campaign against multiple-choice assessments; clearly, they are effective if used appropriately. I get concerned when multiple choice is the primary or only format used to assess student learning. I am encouraged that Virginia is now including “technology enhanced items” in newly developed assessments; however, more needs to be done.

In addition, most educators would agree that important college and career readiness skills such as critical thinking and creative thinking cannot be accurately measured through multiple-choice or other limited-response assessments. To put this in perspective, I often ask educators to consider the following questions: “Would you feel safe boarding a flight to California if you had just learned that your pilot received his license by passing a multiple-choice or paper-and-pencil assessment, without demonstrating mastery through hours of supervised flight time?” Or, “Would you be comfortable if your surgeon had only demonstrated mastery through multiple-choice assessments, without hands-on learning experiences under the direction of a skilled surgeon?” Everyone has the same response to both questions: no sane person would be comfortable with either scenario. I’m not saying that testing is a life or death proposition. However, it is clear that when it really matters, multiple-choice assessments are not an acceptable demonstration of mastery. So why do we continue to use this as the primary assessment format in statewide assessment programs?

Measuring student growth is essential to a comprehensive assessment program—In Virginia, we continue to focus on whether or not students met an established scaled score on a criterion-referenced test rather than on student growth over time. When I talk to educators and principals throughout the Commonwealth, I find them continually attempting to examine student growth by comparing results from last year’s cohort to this year’s, or by looking for increased scaled scores for individual students from one year to the next. The truth is, you can’t do this in Virginia, because the Virginia assessments are not vertically aligned. You cannot take a child’s score on the 3rd grade SOL Reading assessment and compare it to the same child’s score on the 4th grade SOL Reading assessment to determine whether that student demonstrated growth. This dilemma has forced school divisions to look for other resources to assess student progress and growth over time. This is a mindset that has to change in Virginia. We should expect all students to grow academically, and we need measures to help us make this determination. In Henry County, we had to find an adaptive assessment that accurately measures student growth in relation to state standards. Wouldn’t it make sense to make this a part of the assessment program in Virginia so that all students have access to the same resource?

Teaching to the test is not preparing our students for the future—I am often reminded of a quote that Tony Wagner shared during a community meeting in Virginia Beach several years ago. When asked about the importance of assessments, he noted that “teaching to the test is fine if you develop a test worth teaching to.” Every day in classrooms across Virginia we see a heavy focus on test prep and test-taking strategies. We also see students being asked to memorize a great deal of content. A student told me recently that his teachers teach him to pass the SOL test, but he doesn’t feel like he is learning anything. I found this a profound statement from a high school student: he did not see the relationship between passing a test and learning! I don’t say this to be negative toward teachers in any way. Teachers are working hard every day in Virginia classrooms to “cover” the content and prepare students for the 34 criterion-referenced assessments administered at the end of each year. In Henry County, I have been working with teachers to increase the rigor of both instruction and assessment practices. As you might imagine, teachers and administrators are concerned that decreasing test-taking strategies or test prep in the classroom could have a negative impact on SOL test results. Of course, I continue to make the argument that teachers need to keep focusing on the Standards of Learning, but I believe that teaching and assessing at a higher level will increase student understanding of the content. Not everyone believes this, however. As a result, I have sought out teachers in our school division who teach this way on a regular basis and have high passing rates on state assessments. As I shared with a group of teachers last week, this is not an “urban legend.” You can teach at high levels and students will be successful on the state tests.
You can prepare students for success on state tests without continually “practicing” for the test. As many know, the United States is constantly being compared to other nations in areas such as mathematics and science, and in many of these comparisons the Programme for International Student Assessment (PISA) is used. This international assessment focuses on the application of knowledge, not the regurgitation of facts. Several countries treat PISA as their “test worth teaching to.” I believe that if we create assessments like these in Virginia, we will see increased rigor in classrooms.