College Policy Debate Forums
Author Topic: Program Assessment  (Read 2129 times)
andreareed
Full Member
***
Posts: 101


« on: October 30, 2010, 02:20:55 PM »

Have any of you directors out there had to create a program assessment plan for your squad?  Specifically, has anyone investigated or developed quantitative metrics for measuring debate-related student learning outcomes?  I am looking for ideas for how to measure and show our administration that we are in fact teaching students advanced research, critical thinking, and public speaking skills.  We are being asked to do this because Kentucky is up for reaccreditation soon, though this type of research might be helpful for many schools as they are increasingly having to defend the size of their budgets and (sadly) the existence of their programs.

Our assessment coordinator is obsessed with rubrics.  She wants me to create a rubric that we would fill out after every practice debate or rebuttal redo to track students' progress.  I'm not so hot on this idea mostly because I don't want to invest my time collecting data that is not that useful to us…  as coaches, when we all watch our students at home and at tournaments, we rely mostly on qualitative assessments of their performance.  So if I must spend the time collecting data for the higher-ups, I want to try to make it as meaningful for us as possible.

Even if you don't have fully fleshed-out metrics, I would appreciate hearing any ideas people might have, especially how to use the data we already have (win-loss records, speaker points, Bruschke rankings, NDT/CEDA points, etc.) in a way that connects those numbers to specific student learning outcomes.
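
To make this concrete, here is a minimal sketch (Python, with made-up column names for a CSV export of tournament results -- adjust to however your tab data is actually stored) of the kind of aggregation I have in mind:

[code]
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical CSV export of tournament results, one row per debate:
# student,semester,win,speaker_points
# (column names are made up -- adjust to your tab software's export)

def outcome_trends(path):
    """Win rate and average speaker points per student per semester."""
    by_key = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            by_key[(row["student"], row["semester"])].append(row)

    trends = {}
    for (student, semester), debates in by_key.items():
        trends[(student, semester)] = {
            "debates": len(debates),
            # rough proxy for research/critical thinking outcomes
            "win_rate": mean(r["win"] == "1" for r in debates),
            # rough proxy for the public speaking outcome
            "avg_speaks": mean(float(r["speaker_points"]) for r in debates),
        }
    return trends
[/code]

A semester-over-semester rise in a student's numbers is the kind of thing an assessment office can put next to a named learning outcome.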
« Last Edit: October 30, 2010, 02:22:29 PM by andreareed »
hansonjb
Full Member
***
Posts: 223



« Reply #1 on: October 30, 2010, 02:33:01 PM »

three things i've done:

1. encouraged the use of qualitative assessment.

2. asked students to rate the program's intellectual rigor, and whether it meets our school's academic mission, on a 1-5 scale. similar questions could be asked.

3. i have used win-loss improvements for students as a metric (rough sketch of #2 and #3 below).
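
here's roughly what #2 and #3 look like as numbers (python, all data made up -- just to show the shape of it):

[code]
from statistics import mean

# made-up 1-5 survey responses per year: "does the program meet
# our school's academic mission?"
survey = {2009: [4, 5, 3, 4, 4], 2010: [5, 4, 4, 5, 4]}

# made-up season records for returning students: year -> (wins, losses)
records = {
    "student_a": {2009: (6, 10), 2010: (9, 7)},
    "student_b": {2009: (8, 8), 2010: (11, 5)},
}

for year, scores in sorted(survey.items()):
    print(f"{year} mission-fit mean: {mean(scores):.2f} (n={len(scores)})")

for student, by_year in records.items():
    rates = {y: w / (w + l) for y, (w, l) in by_year.items()}
    first, last = min(rates), max(rates)
    # year-over-year win-rate change as a crude improvement metric
    print(f"{student}: {rates[first]:.0%} -> {rates[last]:.0%}")
[/code]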

and finally, i will say, formal assessment is redundant for debate. every practice round, every tournament debate, every reviewed research assignment is already being assessed. writing that down is dumb, but here we are in the 21st century, ready to be evaluated by a fresh crop of social scientists on overdrive.

jim hanson :)
seattle u debate forensics speech rhetoric
Jessica Kurr
Jr. Member
**
Posts: 89


« Reply #2 on: October 30, 2010, 11:24:59 PM »

Although not specific to individual debates, Dr. Jack Rogers and a colleague (whose name I forget) at Central Missouri just completed a decade-long longitudinal study on five different categories, including job performance, critical thinking, "psychological multipliers," and two others. The study followed two groups, debaters and non-debaters, from the start of college to the end of college, then again after a couple of years on the job and a few more years after that. I don't have an electronic copy of the study, but I'm sure they would e-mail you one.