Thoughts on Assessment 3: Writing the obit on summative assessment

This is the third in a series of posts on assessment. I imagine that it won't be the last, but I think this is the most important of the three thus far. The first post was inspired by a post by Henrick Oprea (Blog, Twitter) and developed as I read posts by Steven Anderson (Blog, Twitter) and Jan Webb (Blog, Twitter). If you are just checking in at this point, here are the links to the first two posts (Note: I had started this post prior to attending Educon 2.2 at the Science Leadership Academy in Philadelphia, January 29 – 31, and worked on it while I was there, thus there is an Educon influence):

Thoughts on Assessment 1: A response

Thoughts on Assessment 2: A conversation

I have been researching and toying with this post for a few weeks and recently saw a link, in my Twitter stream, to a post by Jim Blecher (Blog, Twitter) entitled "SocialMedia CreativityCommunity." It really clicked for me and brought the pieces of this post together. I am a firm believer that even the most jaded students enjoy learning; what they hate is school. There can be any number of reasons this is true, but one that is arguably universal across the student spectrum is the fact that far too much of school is organized around testing and not around learning. School is focused on the process of knowledge inoculation as opposed to knowledge appropriation. Teachers, needing to vaccinate themselves against the evils of high-stakes testing, design learning that focuses on the fractured knowledge required to produce successfully on those tests. Students, on the other hand, want to know "stuff" and want to see what can be created out of it; they want to appropriate the knowledge to themselves, shall we say, to Construct Meaning. The missing element in the equation is relevance, and it is missing because it has been tested out of the school process. I am not referencing just the high-stakes standardized tests; those are merely the most egregious example. What I want to do in this post is write an obituary for the idea of summative testing, or at least declare it to be in critical condition.

In Rethinking Education in the Age of Technology, Allan Collins (Profile, Wikipedia) and Richard Halverson (Profile) talk about the way in which we determined the learning progress of a student (apprentice in those days) years ago:

In the apprenticeship era, the adult carefully observed learners and corrected them as they went along, giving them tasks they were ready for, and seeing whether they completed them successfully. Observation during the course of task completion combined the functions of formative and summative assessment. Ongoing, formative encouragement or critique provided feedback to guide the learner through tasks, and the final, summative judgment gave learners feedback on whether the task was successfully completed.

As education began to drift, leading to its current condition of having no apparent purpose, testing became the tool of choice to determine whether students had "learned" the appropriate information and skills deemed "necessary" to indicate that learning had occurred. This model also fit nicely into the industrial-age economy. Students went to school, learned the prescribed material, proved they had learned that material via a test that required no creative or original thought, and then went on to jobs that required them to perform assembly-line tasks or develop assembly-line thinking and procedures, what Tom Peters (Blog, Twitter) refers to as the Ford Motor Company model of education (see his four-minute presentation in this post). More and more, the idea of assessment moved from helping a student see their growth and development to a judgment of their "status of learning," which was then used to classify and categorize them – not to assist in the learning. R. J. Stiggins, writing in Phi Delta Kappan, explains the difference between "assessment for" and "assessment of" learning this way:

Assessment for learning can contribute to the development of effective schools. If assessments of learning provide evidence of achievement for public reporting, then assessments for learning serve to help students learn more. The crucial distinction is between assessment to determine the status of learning and assessment to promote greater learning.

More recently, assessment has moved from a tool with the potential to help students develop mastery to one that is used to secure funding. It no longer aids students in learning, but performs a punitive function to the detriment of learning. Formative assessment, far too often, is given only cursory acknowledgment, and all the eggs are put in the summative assessment basket. There is something fundamentally wrong with this current situation. Formative assessment is the most powerful of the assessments in helping students learn and move toward mastery. In fact, formative assessment enables a teacher to determine incremental amounts of mastery as the student moves through the learning process. Done effectively, it renders summative assessment redundant and unnecessary. So, here is where I hope to put the last nail in the coffin of summative assessment (my apologies for the rather dark metaphor – but summative assessment is a dark cloud hanging over school).

The post by Jim Blecher (Blog, Twitter) entitled "SocialMedia CreativityCommunity" highlighted the process of "perception and processing," something students do every day in school (or we hope they do). Jim begins by talking about a song by Jonathan Coulton (embedded below):

I hope the metaphor I see here works for you as well. Let's say that this video represents the learning that each teacher carefully crafts in their classroom. This includes a variety of pieces that students play with in the process of internalizing the information, practicing skills, and making them part of their knowing. At various points in this process, the teacher stops to check and determine whether the internalization process is happening and/or to what degree (formative assessment). When gaps appear, she/he works with the student(s) to ensure that mastery is being developed. Better yet, let's assume that this is a true project-based learning environment. Project-based learning is the process of small bits of convergent learning designed to lead to divergent assessment (that's my theory on it, anyway). The small modules are designed to converge into formative checks and then open up directly into the next piece of connected learning (all within the context of understanding the big picture, which is always where learning should begin). By the time the student reaches the culminating point of the project, they have achieved mastery. If the teacher were to use a summative assessment tool (test) predicated on all the formative checks (as is usually the case), what would it tell the teacher? Nothing new. Let's put summative assessment, as a tool to gauge learning, out of our students' misery. Back to Jim Blecher's post.

Jim addresses the process of creativity and play within the context of a media-enhanced social community – that's not a bad description of what classrooms should be, is it? He highlights a couple of YouTube videos that grew out of Coulton's song (I did a search and there are currently 1,990 hits when searching for "code monkey"). I liked this one the best:

Continuing the metaphor from above: after working through the process of constructing meaning (internalizing learning) and mastering skills, students should be turned loose to generate something new. In other words, we need divergent assessment where we now have summative assessment. Students need to add themselves to their new knowledge. In his book Change by Design, Tim Brown (Profile, Blog) describes the idea this way:

Convergent thinking is a practical way of deciding among existing alternatives. What convergent thinking is not so good at, however, is probing the future and creating new possibilities. Think of a funnel, where the flared opening represents a broad set of initial possibilities and the small spout represents the narrowly convergent solution. This is clearly the most efficient way to fill up a test tube or drive toward a set of fine-grained solutions.

If the convergent phase of problem solving is what drives us toward solutions, the objective of divergent thinking is to multiply the options to create choices . . . By testing competing ideas against one another, there is an increased likelihood that the outcome will be bolder, more creatively disruptive, and more compelling. Linus Pauling said it best: "To have a good idea, you must first have lots of ideas." – and he won two Nobel Prizes.

Summative assessment is convergent: all learning leads to one final set of questions, and every student must answer them in exactly the same way. Formative assessment can be convergent and maintain integrity; using convergent assessment at the completion of learning lacks integrity. I am not suggesting we re-couch summative assessment; I am advocating for its extinction. Divergent assessment is not summative. Upon completion of the mastery phase of learning, students need to be set free to use their new wisdom, and the requirement should be, "Show us something new, amaze us!" That is opposed to the current reality in schools. Let me paraphrase Brown from the same chapter:

The natural tendency of [education] is to constrain problem-solving and restrict [student] choices in favor of the obvious and the incremental. Though this tendency may be more efficient in the short run, in the long run it tends to make a [student] conservative, inflexible, and vulnerable to game-changing ideas from outside. Divergent thinking is the route, not the obstacle, to innovation.

Divergent assessment, by design, requires the use of the top layers of Bloom's ideas about learning. Because students have achieved mastery of information and skills along the way, they have a new embedded knowing that allows them to take their learning and apply it in ways that intrigue them and allow them to find purposeful meaning in their process. At this point, the restrictions are minimal – very minimal – and students are allowed to evidence their learning in powerful and meaningful ways. An example of this type of expression of learning can be found in Chris Lehmann's (Blog, Twitter) NYSCATE presentation (the segment I am referencing begins at 42:27, and a shorter version can be found here beginning at 2:40):

I don't know if the development of the flow-process bio-diesel generator was a form of assessment; I use the example as a way for you to envision divergent assessment. The result of student learning in this situation was powerful – purposeful. It is an example of what the results of divergent assessment can look like. Students set free to play with their learning will generate this type of evidence. And this type of evidence has the integrity that is lacking in current forms of summative assessment. This approach to assessment will, as Howard Gardner (Profile) puts it, reveal not just what students know and understand, but "also capture how those new understandings metamorphose." Students at Science Leadership Academy complete what I feel is the closest thing to real divergent assessment in their Capstone:

The Capstone Project at Science Leadership Academy is an opportunity for students to show the scholars they have become. It represents the culmination of four years of intellectual growth toward an independent and self-directed learner who can contribute meaningfully to his or her community, whatever that means to the individual. It will enable the student to focus his interests and curiosity into a coherent representation of how he thinks and what he believes as he leaves high school. The Capstone represents a synthesis of the SLA mission and vision as students attempt to answer the questions "How do we learn?" "What can we create?" and "What does it mean to lead?" through a self-selected and designed independent project. As with everything we do, it should embody the core values of inquiry, research, collaboration, presentation, and reflection. The final product will look different for each student, just as each student has a unique perspective and approach to learning.

Yes, I hope that soon we will see the extinction of what we currently call summative assessment. That extinction is necessary in a reality that, as Tom Peters points out, is "[an] age of creation intensification." This shift must be part of the process of rethinking school and will begin to move us toward fulfilling the lip service we pay to the statement that, "We are educating students for a future we don't even know and can't imagine." Let's write the obituary on summative assessment before it's too late.

Convergence and Divergence graphics are from Dr. Scott McLeod and can be found on his blog, Dangerously Irrelevant, in the post "Convergent v. divergent thinking in K – 12 schools."

Stiggins, R. J. (2002). Assessment Crisis: The Absence of Assessment FOR Learning. Phi Delta Kappan, 83(10), 758-765.

___________

Others' thoughts and ideas on assessment:

"Guiding Principles for Assessment" by Daniel Meyer at dy/dan

"I Test, Therefore I Am" at DragonPhysics Blog

"Assessment, Giving Students A Choice" by Maggie Hos-McGrane at Tech Transformation

"It Is The Test! Or is it . . ." at Constructing Meaning

"It's not 'the tests.' It's us." by Scott McLeod (Twitter) at Dangerously Irrelevant

"Assess the Assessment" by Tom Whitby at My Island View

"Assessment as (re)search" at University Writing Assessment

"Trivial Pursuit and Assessment" by Aaron Eyler (Twitter) at Synthesizing Education

"A Vision of Standardized Assessment" by Jason T Bedell (Twitter) at Jason T Bedell: Reflections on Teaching and Learning

"Assessment Matters" by John Wilkie at Learning & Teaching

"TEDxBANFF is Now a Memory & I was Wrong about Creativity" by David Warlick (Twitter) at Two Cents Worth

"Transparent Algebra: Assessment" by Karl Fisch (Twitter) at The Fischbowl

"Where are we?" by Joe Bower (Twitter) at For the Love of Learning

"Assessment is a Bad Word?" by Ben Grey (Twitter) at The Edge of Tomorrow

"Can Standardized Test Data be Formative?" by Ben Grey (Twitter) at The Edge of Tomorrow

"Testing, testing, 1, 2, 3 . . ." by Heather Mason (Twitter) at Teacher In Transition

About Greg Thompson

I'm a dad, Educational Freelancer, Edupunk thinker, EdTech Blogger, Paralegal, Foodie, and I love inspiring and facilitating change in education.

4 Responses to Thoughts on Assessment 3: Writing the obit on summative assessment

  1. Joe Bower says:

    Wow. Just Wow. You have no idea how much you and I are on the same page when it comes to assessment. Thank you so much for this post.

    Check out my blog at http://www.joe-bower.blogspot.com where I blog about education.

    I have subscribed to your blog via RSS in my Google Reader and I am now following you on Twitter. (I can be found at http://www.twitter.com/joe_bower)

    • Tina says:

      Thanks for the link you posted on Dangerously Irrelevant. This was worth reading, and now I have more reading to do. I think the issue with education is that it is not evaluating the evaluation system. As an art teacher, my lessons are product based. I see students succeed in my class who never succeed in the core areas. They not only succeed but show levels of creativity that would amaze you. If this could be used as a model for them in the core areas, I think we would see more successes.

  2. Pingback: I Test, Therefore You Are « DragonPhysics Blog

  3. Pingback: Teaching for the test – based on a true story « Doing some thinking
