Monday, November 25, 2013

What is a summative assessment?

Ok, I ranted in this post.  I didn't plan on it, but it turns out I am pretty darn bothered by the state of what we call assessments.  The Big Idea in this post is "Let's fix our assessments or start being honest about what they actually are."

By definition a summative assessment is an assessment that comes at the end of the learning.  I challenge ANYONE to find a summative assessment in education that is not the ACT, state testing, or a final exam.  Isn't it true that a good teacher takes information from a topic/chapter assessment, which is typically considered summative, and discusses the issues the students had so they no longer have those issues...thus making it formative?

Now that we know that almost everything in education is formative, let's move on.  The idea behind a formative assessment is to get information that will directly drive instruction.  Therefore they should be quick, to the point, focused on one topic, and emphasize learning at a DOK 1 or DOK 2 level.  The more of those basic, low-rigor questions we get rid of on the formative assessments, the more depth we can access on our summative assessments.  That means our summative assessments can ask quality questions that force students to think, to analyze, and to combine thoughts so we can actually see how deeply they know the material.  What does this mean for all those who create "summative" assessments?  Let's make some rules to clarify further what a summative assessment is and is not.

1.  A summative assessment NEEDS to be more than a multiple choice assessment.  
2.  A summative assessment MUST ask students to explain, to clarify, to compare, to analyze different situations.  If we don't ask, we won't know.  
3.  A summative assessment needs to have a rubric to determine a student's level of proficiency.  
...time to get my frustration out...
Why does an 11/11 correlate to a 4, a 10/11 go to a 3, and an 8/11 go to a 2?  Because we insist on making everything a percent!  If we want a 4-point scale, then let's make assessments that assess students properly and force students to demonstrate the depth of knowledge that would confirm a 4.  If students cannot demonstrate a 4, it is up to us to create the assessments that allow them to do just that.  If we can't, grading better start at a 3.  

For those that are curious...this isn't even about math anymore.  We have problems with our assessments, but we also have a fix in process.  It will take time, but we at least have the train on the tracks.  It is entirely too easy for teachers to say "I grade on a 4-level scale" when in reality, they don't.  They just grade on a percentage scale and then fit it into a 4-3-2-1 according to 90-80-70-60.  
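To make the complaint concrete, here is a minimal sketch (in hypothetical Python; the function name and cutoffs are my own illustration) of the disguised percentage grading described above: a "4-point scale" that is really just percent buckets at 90-80-70-60.  Notice that under these very cutoffs, 10/11 lands at a 4, not the 3 some conversion charts assign, which is exactly the kind of arbitrariness being ranted about.

```python
# Hypothetical illustration of a "4-point" rubric that is really just a
# percentage scale with 90-80-70-60 cutoffs, as described above.

def disguised_four_point(correct: int, total: int) -> int:
    """Map a raw score to a 4-3-2-1 grade via percent cutoffs."""
    percent = 100 * correct / total
    if percent >= 90:
        return 4
    if percent >= 80:
        return 3
    if percent >= 70:
        return 2
    return 1

# On an 11-question assessment:
print(disguised_four_point(11, 11))  # 11/11 = 100%   -> 4
print(disguised_four_point(10, 11))  # 10/11 ~ 90.9%  -> 4 under these cutoffs
print(disguised_four_point(8, 11))   # 8/11  ~ 72.7%  -> 2
```

Nothing in this mapping asks whether the student demonstrated depth of knowledge; it only counts right answers, which is the whole problem.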

For more information check out some of my previous posts that all involve assessments and grading.
     Testing and Scoring Math
     Why do we Review EVERY chapter in Math???


Monday, November 18, 2013

An unfortunate epiphany

For the past several years I have read the CCSS, created tasks, wrapped PD around the mathematical practices and worked with teachers to understand Webb's DOK.  I have focused on getting resources in the hands of teachers to implement the CCSS in a manner that leads our students to the best chance of success.  

All that work puts our district in a position for success.  However, without changing instruction...not the style of instruction (guided math), group work, or project-oriented instruction, but deep down how we talk about math instruction...we are stuck.  We teach skills that are aligned to the CCSS, but we teach them the same way we have for years...without a deep understanding.  We as teachers have a tendency to focus on "tricks" that make things seem easier and show students immediate success, when in the end the easy path is never the right path to choose.

We as a district need to focus on what it means to instruct math in terms of depth and engaging students in the meaning of math.  This doesn't need to be boring or teacher-led.  It just means we need to make the small connections necessary for students not just to find success but to connect the dots from one topic to another...to truly understand the meaning of the different topics we are trying to convey.

This is going to be a slow and probably painful path.  Although, as discussed, those are typically the paths that lead to success.

Thursday, November 14, 2013

At the MPES (Math Proficiency for Every Student) Conference

Attending the MPES conference in Oconomowoc.  I have to present most of the day, but we start off with an update from the Director of Math for the SBAC.  Let's see what she says.

Major Point #1:  (from a colleague who is sitting next to me on the ACT redesign team)

  • ACT is remaking its assessment to align with the questioning levels (DOK) of the SMARTER Balanced Assessment.  Time constraints/lengths of assessments have not been determined, but assessing Claim 4 will involve some form of a prompt given weeks ahead of time.
Shelby Cole - Director of Math for the SBAC - 
  • Going over the stuff we know from the SBAC.  The focus is tying the assessment to the things they want to see as far as instruction in the classroom.  
  • Summative Assessment is good for a snapshot of growth from one year to the next.  It could be used as a screener, but it is only given one time per year.
  • Interim Assessments are used to give information at a different level.  They will allow teachers to build assessments on specific topics using questions that are in the same style as the Summative Assessment.  
  • The Formative Assessment section is not more tests but great resources for teachers to help define a student's level of performance.  It goes way beyond just testing...into good practice.
  • The test will be adaptive.  All students will see items that are appropriate in level and difficulty for that child.  The score is based on the kinds of questions the student answers correctly.  
  • The test is shorter than a fixed form test to get the same results.  
  • Performance Tasks "reflect a real-world task and/or scenario-based problem."
    • DCE is 100% accurate in our perception of tasks - we must make them happen regularly.
    • We might be a bit more aggressive in elementary but students are handling it.  Why would we back off?
    • Need to be feasible for the school/classroom environment.  They have also taken into account the CAREER PATHWAYS.  They want the tasks to span several pathways.
    • Two types of Tasks
      • Evaluate and Recommend
        • Write a letter to your school principal in regards to...
      • Plan and Design:
        • Plan a garden that....
  • The emphasis is on the claims.  Although we haven't been specific in our terminology of "claims," our work aligns with where they are right now.  
    • Claim 1:  concepts and procedures - we are good at this one!
    • Claim 2:  Problem solving - doing well - for the most part
    • Claim 3:  Communicating Reasoning
      • reasoning on a number line
      • reasoning of fractions
      • Does not need to be embedded into context
      • This is NUMBER SENSE!
    • Claim 4:  Modeling and Data Analysis
      • These are our situations (weak tasks).  
      • We need to do more with "modeling" scenarios but our analysis is strong.
    • Performance Tasks:
      • These are our really good tasks.  Very open ended with NO scaffolding.  
  • Plus Standards:
    • In 2015 we will not have items to assess those standards
    • 2016 and beyond the plan is to have items that assess those standards.  These would not be reported in the "normal" reporting method but in a different reporting feature.  
  • How long will the test take?
    • Just math time for the total test is estimated at 1.5 hours for the performance task and about the same (could be 2 hours) for the adaptive test.  The pilot versions took about 3.5 hours total time.
  • There are NO calculators available prior to GRADE 6.
  • Accommodations
    • Universal tools are access features of the assessment that are available to all students.
      • The major one for DCE is handing out scratch paper and pencils for all students.  If we don't give it to them they won't use it.
    • Designated Supports and Accommodations are classified as either embedded or non-embedded.
      • Embedded - in the assessment
        • color, masking, text to speech, language, sign-language, braille, closed captioning,
      • Non-embedded - district provided
          • bilingual dictionary, color contrast, color overlay, seating, calculator, read aloud, ...
    • Translations:  some words that have a strong relationship to the math students need to know will not have a translation available.  Other words will have a translation piece available (in a light box) using a "thesaurus."
    • Students CAN go back into their test and review old questions.  It will choose the next item based on current performance but students can change answers (highlight/flag questions they are uncomfortable with). 
    • Can strikethrough their answers to help narrow down/eliminate choices.
  • Comment from a student.
    • "On the CMP (Conneticut Mastery Test) I work and work and work and then my answer is not there.  This test let me explain how I did it."
    • This supports our district's focus on literacy in having students explain their answers and reasoning.
  • In summary:  we are doing really well.  This made me feel very comfortable with our direction.  We need more implementation of what teachers know we should be doing.  That means, unfortunately, more time for teachers to develop and more PD on what scaffolding actually does to hinder student development.  For me, that means more concrete examples to give to teachers so they can see the clear distinction between a quality question and a good question that we made procedural.

Sunday, November 3, 2013

The State's decision on the CCSS

A Democratic Senator has resigned from the educational committee overseeing the decision on the CCSS in Wisconsin, stating that he would not waste his time on a committee that clearly has an alternative agenda and is not listening to the overwhelming majority of positive responses to the CCSS.  It appears the state is headed for a new process.  While at the hearing, there was a lot of mention of the Massachusetts State Standards and how much better they are than the CCSS.

I am not sure about you but when I looked at the standards from Massachusetts, they looked a whole lot like the CCSS.  So I made a little checklist.  Here is what is in the Massachusetts math standards.
     - 8 Mathematical Practices - check
     - K-8 standards by grade level - check
     - 9-12 standards not listed by grade level - check
     - Plus standards and modeling standards - check
     - An emphasis on application and depth of instruction - check

So what is different?
     - almost nothing - no standards have been removed
     - In all of K-12, only 16 standards have been added.  Of those, 9 of them are "clarification" standards.
     - That means that in all of K-12 there are only 7 added standards.  

What a great idea.  Let's get them.  If that is the difference that Wisconsin wants, I am all for it.  I will just keep doing what I am doing and be just fine.  8-)

What a tremendous waste of time this is!