
4/18/2013 1

Assessment and Grading

Emad Mansour

eam0028@auburn.edu

26 Feb., 2013

In 2-3 minutes, list all the words you know that are related to assessment (whether you know well or not)

Goals for Today

  • Differentiate between Assessment & Evaluation
  • Differentiate between formative and summative assessment of students' learning
  • Identify and apply some classroom assessment techniques (CATs)
  • Identify different types of test items
  • Differentiate between validity and reliability
  • Contrast grading policies
  • Create and use rubrics for grading

Participants will be expected to answer: What should students learn/take away from this lecture? How will I know if students are learning what they need to know? What content and teaching strategies will I use?

Design → Deliver → Assess

Learning Outcomes, Teaching and learning activities, Assessment

What is assessment?

http://astronomy101.jpl.nasa.gov/teachingstrategies/teachingdetails/?StrategyID=5

What is evaluation?

“A process of collecting information for the purpose of making decisions”.

Assessment

“The process of defining, selecting, designing, collecting, analyzing, interpreting, and using information to increase students’ learning and development.”

Erwin, 1991


Assessment, from the Latin assidere, "to sit beside"

"To sit beside" brings to mind such verbs as to engage, to involve, to interact, to share, to trust. It implies team learning, working together, discussing, reflecting, helping, building, collaborating. "Sitting beside" implies dialogue and discourse, with one person trying to understand the other's perspective before giving value judgments.

(Braskamp and Ory, 1994)

Assessment

"Assessment defines what students regard as important, how they spend their time … so if you want to change student learning, then change the assessment method."

Brown, 2001

http://www.pcrest.com/LO/AE/4.htm

Assessment: provides feedback in order to strengthen performance.

Evaluation: a process of making judgments about the value, importance, or quality of the characteristics that we observe in people. Determines if a standard was met: success or failure.


Fill in the missing parts in the handout

Assessment vs. Evaluation

What is the purpose? Assessment: to improve the quality of future performances; Evaluation: to determine the quality of the present performance
Who requests it? Assessment: assessee; Evaluation: client
Who observes the performance? Assessment: assessor; Evaluation: evaluator
Who sets the criteria? Assessment: assessee and assessor; Evaluation: client (with possible consultation with the evaluator)
Who uses the information? Assessment: assessee (in future performances); Evaluation: client (to make decisions)
When can feedback occur? Assessment: during or after a performance; Evaluation: during or after a performance
On what is feedback based? Assessment: observations, and strongest and weakest points; Evaluation: level of quality based on a set standard
What is included in the report? Assessment: what made the quality of the performance strong, and how one might improve future performances; Evaluation: the quality of the performance, often compared to set standards
Who receives the report? Assessment: assessee; Evaluation: client
How is the report used? Assessment: to improve performance; Evaluation: to make judgments

The most important aspect of implementing high-quality ASSESSMENT is making the shift from an EVALUATION mindset to an ASSESSMENT mindset.

Assessment Categories

Formative: assessment FOR learning
Summative: assessment OF learning

Formative Assessments
  • Occur during the process of instruction
  • Serve as practice
  • Check for understanding and provide feedback
  • Guide teacher decision making about future instruction

Summative Assessments
  • Occur at the end of instruction
  • Provide a summary of accomplishments
  • End of chapter, midterms, final exam
  • Purpose is to determine final achievement

Allyn and Bacon, 2001


Formative vs. Summative Assessment

Type: Formative = informal; Summative = formal
Timing: Formative = during learning; Summative = at end of unit/course
Source: Formative = CATs, self, peer; Summative = test/exam at end
Role of assessor: Formative = consultant, coach; Summative = judge
Goals: Formative = feedback for improvement; Summative = pass/fail grade
Process: Formative = change over time; Summative = snapshot in time

http://www.ginacarson.com/gc/wp-content/uploads/2010/02/UD-Testing-Cartoon.jpg

Use a variety of assessment tools.

Varieties of Assessment Tools

  • Objective Tests/Quizzes
    – Closed Book/Open Book
    – Take Home
    – (Pop Quizzes)
  • Essay Tests/Quizzes
    – Short/Long
  • CATs
    – 1-Minute Paper
    – Muddiest Point
  • Papers
    – Term papers, Book Reports, and Reaction Papers
  • Course Portfolios
  • Homework/Projects
  • Problems/Exercises
  • Case Studies
  • Simulations/Role Playing
  • Group Assignments
  • Service Learning
  • Discussion
    – In class or electronic

Constructive alignment

Learning Outcomes ↔ Teaching and learning activities ↔ Assessment

  • Balance between instruction & assessment
  • Reduces tendency to test memorization
  • Ensures covering content

The Table of Specifications: A Test Blueprint

See handout

Table of Specifications (modified)

[Table in the handout: rows list the topics and their weights; columns list the cognitive levels (Remember, Understand, Apply, Analyze, Evaluate, Create) and their weights. The number inside each cell denotes the number of test items for that topic at that level; row and column totals sum to 100%.]
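The blueprint arithmetic behind a table of specifications can be sketched in a few lines. All weights below are illustrative placeholders, not the handout's actual values.

```python
# Sketch of a table-of-specifications calculation: allocate test items
# across topics and cognitive levels in proportion to their weights.
# All weights (in percent) are hypothetical, not the handout's numbers.

topic_weights = {"Topic 1": 20, "Topic 2": 30, "Topic 3": 50}
level_weights = {"Remember": 10, "Understand": 20, "Apply": 30,
                 "Analyze": 20, "Evaluate": 10, "Create": 10}
total_items = 100  # total number of test items planned

blueprint = {
    topic: {level: total_items * tw * lw // 10000
            for level, lw in level_weights.items()}
    for topic, tw in topic_weights.items()
}

for topic, row in blueprint.items():
    print(topic, row, "row total:", sum(row.values()))
```

With weights that divide evenly, each row total reproduces its topic weight and the grand total equals the planned item count, which is exactly the consistency check the blueprint's margins provide.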


Formative Assessment

Classroom Assessment Techniques (CATs)

Why CATs?
  • Brief, non-credit, effective, direct, formative classroom activities
  • To assess and improve student learning
  • To clarify and improve your teaching
  • Less emphasis on grades and more on learning

Characteristics of Classroom Assessment

  • 1. Learner-centered: helps focus students and teachers on what students are learning
  • 2. Teacher-directed: the teacher is in complete control of what is being assessed, when the assessment occurs, and how that information is used
  • 3. Formative: improving vs. evaluating student learning
  • 4. Mutually beneficial: students and teachers both understand what students are learning and where there is confusion


  • 5. Ongoing: continuous feedback loop between student and teacher

http://concord.org/sites/default/files/images/loops.gif

  • 6. Rooted in good teaching practice: effective teachers are continuously monitoring what students are learning and making adjustments accordingly

Examples of Classroom Assessment Techniques (CATs)

Guided Paraphrasing
  • In your own words, put to paper and then explain to a person from a different discipline (or a 10-year-old kid) what CATs are.

One-Sentence Summary
  • Students are given a topic and asked to write a single informative, grammatical, and long sentence summarizing that topic.
  • The goal is to say who does what to whom, when, where, how, and why; the instructor provides the prompt of the "who," and students complete the rest.

One-sentence summary video

Write a one-sentence summary about assessment:

  • What is assessment?
  • Who does the assessment?
  • To whom?
  • Why?
  • When?
  • And how?

Think-Pair-Share

  • The teacher asks a question or presents a problem.
  • Each student thinks individually for 30-45 seconds.
  • Students exchange ideas in pairs.
  • Students share their ideas with another pair of students or with the whole class.
  • Can be applied in any class size.

One-Minute Paper

  • A few minutes before the end of class, the professor asks students to take a clean sheet of paper (no name) and answer two questions:
    1. What was the most important thing you learned during this class?
    2. What important question remains unanswered?
  • In the next class, discuss students' answers and review areas they found confusing.

The Muddiest Point

  • Near the end of the lecture, ask students to write what is least clear (muddiest) after today's lecture/class.
  • Students hand in sheets without names (similar to the One-Minute Paper) or use a collection box.
  • The teacher identifies the most difficult aspects and elaborates on these points at the beginning of the next class.

Concept Map

Summary


Summative Assessment: Tests

Test types (objective, subjective); reliability and validity

Groccia’s “Ideal” Testing Plan

  • 1. Quiz
  • 2. Quiz
  • 3. Quiz
  • 4. Quiz
  • 5. Comprehensive Exam
  • 6. Quiz
  • 7. Quiz
  • 8. Quiz
  • 9. Quiz
  • 10. Comprehensive Exam
  • 11. Quiz
  • 12. Quiz
  • 13. Quiz
  • 14. Quiz
  • 15. Comprehensive Exam

Types of data collected in assessment:
  • Subjective data: observations (information) that two or more people are likely to interpret differently.
  • Objective data: observations (information) that two or more people are likely to interpret the same way.

Objective Assessment Options

  • Multiple-Choice Questions
  • True-False Questions
  • Matching Exercises

Objective Assessment Examples

True or False?
  1) Student learning is the acquisition of new information, attitudes, and values, as well as the acquisition of new skills, habits, and related changes in thinking and behavior.  A) True  B) False
  2) The highest level of thinking as noted in Bloom's Taxonomy is synthesis.  A) True  B) False


Multiple-Choice Questions: Key Terms

  • Test Item
  • Stem
  • Alternatives
  • Distractors

Objective Assessment Examples

  • Multiple-Choice Questions
  • At the heart of effective course design is:
    – A. frequent assessment of student learning.
    – B. developing rapport with your students.
    – C. being proactive in developing learning objectives.
    – D. engaging students in meaningful learning.

  • What is the most complex level in the taxonomy of the cognitive domain?
    A) Knowledge  B) Synthesis  C) Evaluation  D) Analysis  E) Comprehension

Objective Assessment Examples

  • Multiple-Choice Questions (example of a poor question)
  • Which of the following is a category in the taxonomy of the cognitive domain?
    A) Reasoning ability  B) Critical thinking  C) Rote learning  D) All of the above  E) None of the above

Key Principles for Multiple-Choice Questions

  • The stem (question) should be stated clearly
  • Use only 3-4 alternatives, not more
  • Only one alternative should qualify as the answer
  • Avoid "None of the above" or "All of the above" questions
  • Make sure the correct answer is NOT the longest alternative
  • Evenly distribute the correct answer across the 3 or 4 alternatives (avoid too many "C"s or "D"s)

Tips for Writing Multiple-Choice Questions (cont.)

  • State the problem in positive terms
  • Use 'not', 'no', or 'except' sparingly, or mark them: NOT, no, except
  • Avoid exclusive & inclusive words: all, every, only, never, none
  • Avoid exact textbook language
  • Avoid overuse of all-of-the-above or none-of-the-above
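One of these tips, evenly distributing the correct answer across positions, is easy to audit mechanically. The 20-item answer key below is a made-up example, not from any real test:

```python
# Sketch: audit an MCQ answer key for positional bias in the correct
# answers. The key is hypothetical.
from collections import Counter

answer_key = list("CADBCCADBCDACBADCCBA")
counts = Counter(answer_key)
print(dict(sorted(counts.items())))  # → {'A': 5, 'B': 4, 'C': 7, 'D': 4}
```

Here 'C' is over-represented (7 of 20 items), so this key should be reshuffled before the test is administered.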


Key Principles for Writing True/False Questions

  • Construct statements that are definitely true or definitely false, without additional qualifications.
  • Keep true and false statements approximately the same length.
  • Avoid absolute terms such as never or always.
  • Do not arrange answers in a pattern (e.g., TTFFTTFF, TFTFTF).
  • Always state the question positively.

http://www.utexas.edu/academic/ctl/assessment/iar/students/plan/method/exams-truefalse.php

Practice . . .
  • Choose one class you teach.
  • For that class, write one true-false question and one multiple-choice question.
  • Share with your partner.
  • Group discussion.

Strengths of Essay Questions

  • Allow students to interpret and integrate their knowledge of course content.
  • Easier and less time-consuming to create than other question types.
  • Provide a more realistic task for the student.
  • Allow students to express individuality and creativity in their answers.
  • Reduce guessing.
  • Require students to organize their own answers and to express them in their own words.
  • Can efficiently measure higher-order cognitive objectives.

http://www.utexas.edu/academic/ctl/assessment/iar/students/plan/method/exams-essay.php

Example question for each cognitive learning level:

  • Knowledge: What are the five sections of a research report?
  • Comprehension: In one sentence, give the point of a written passage.
  • Application: Write a short poem in iambic pentameter.
  • Analysis: Given an argument for the abolition of guns, enumerate the positive and negative points presented.
  • Synthesis: Construct an original work which incorporates five common materials in sculpture.
  • Evaluation: Given the data available on an issue, take a position and defend it.
  • Evaluation: Given two opposing theories, design an experiment to compare them.

Tips for Writing Essay Tests/Questions

  • Use novel questions that require application, analysis, evaluation, or creativity (e.g., problem-based questions).
  • Give preference to focused questions that can be answered briefly.
  • Use each question to assess one or a few specific objectives.
  • Indicate the relative importance of each question (e.g., time to be spent or points assigned).
  • Be sure to specify precisely what you want students to do in their answer.


Tips for Grading Essay Tests

  • Create a rubric for grading essays.
  • Read 4-5 essays BEFORE grading any of them; this allows you to "calibrate" student answers against your rubric.
  • Do not try to read all the essays at one time; this reduces mental fatigue and boredom.

Validity and Reliability

Reliability:
  • The measure of consistency for an assessment instrument.
  • The instrument should yield similar results over time with similar populations in similar circumstances.

Reliable = consistent measurement

                  Measure #2: YES   Measure #2: NO
Measure #1: YES   Agree             Disagree
Measure #1: NO    Disagree          Agree

How is reliability estimated?
  1. Test-retest: stability over time
  2. Alternate forms
  3. Split-half (internal consistency); the problem with halves is solved by using Cronbach's Alpha
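Cronbach's alpha can be computed directly from an item-score matrix; a minimal sketch, with fabricated student scores for illustration:

```python
# Sketch: estimating internal consistency with Cronbach's alpha.
# scores[s][i] = score of student s on item i (fabricated data).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

def cronbach_alpha(scores):
    k = len(scores[0])                 # number of items
    items = list(zip(*scores))         # item-wise columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical test: 4 students x 3 items, each scored 0-5
scores = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 5],
    [1, 2, 1],
]
print(round(cronbach_alpha(scores), 2))  # → 0.99
```

The items here rank students the same way, so alpha is near 1; items that disagree with one another would push it down.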

Means to reduce error (increase reliability):

  • Objectively scored questions (e.g., MCQ)
  • Questions that don't suggest the answer
  • No tricky questions
  • Clearly worded questions
  • Clear directions
  • …
  • …
Validity and Reliability

Validity: the extent to which a test measures and accurately reflects what it claims to measure.

                 Reality: YES      Reality: NO
Test says YES    RIGHT             FALSE POSITIVE
Test says NO     FALSE NEGATIVE    RIGHT

How do you determine if a test is valid?

Content validity: Does the test include the content it is supposed to cover? (e.g., a math achievement test covering algebra, geometry, and calculus)

Criterion validity: Does the test make the distinction it is supposed to make? Does it predict performance on some criterion measure (predictive validity)? (e.g., SAT scores and college grades)

Construct validity: Does the test measure the construct it is intended to measure? (e.g., a math test not confounded by English reading ability)

Grading

How to Make ALL Summative Assessments Better

  • Make them valid (use tables of specifications)
  • Make them reliable: assess often, with clear criteria (a rubric)
  • Plan assessments BEFORE the course begins
  • Give students details of marking criteria
  • Give students access to relevant examples of assessment tasks

Grading and Rubrics

http://iteachicoachiblog.blogspot.com/2011/01/five-reasons-why-you-should-stop.html

Types of Grading Policies

  • Criterion-referenced
    – Provides absolute comparisons
    – Describes student performance in relation to a pre-set standard
  • Norm-referenced (on the curve)
    – Provides relative comparisons
    – Describes individual performance in comparison to others


Allyn and Bacon, 2001

Criterion-Referenced
  • Mastery of objectives
  • Criteria for grades set in advance
  • Students determine what grade they want to receive
  • All students could receive an 'A'

Norm-Referenced
  • Grading on the curve
  • Students compared to other students
  • 'Average' becomes the anchor for other grades
  • Fairness issue
  • 'Adjusting' the curve
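The contrast between the two policies can be sketched on the same set of scores. The cutoffs and z-score bands below are hypothetical choices for illustration, not a recommended scheme:

```python
# Sketch contrasting the two grading policies on the same scores.
# Cutoffs, bands, and data are hypothetical.
from statistics import mean, stdev

scores = [55, 62, 70, 74, 78, 81, 85, 90, 93, 97]

def criterion_grade(score):
    # Criterion-referenced: a pre-set standard; everyone above a
    # cutoff earns that grade, so all students could earn an 'A'.
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return letter
    return "F"

def norm_grade(score, all_scores):
    # Norm-referenced ("on the curve"): the grade depends on distance
    # from the class average, measured in standard deviations.
    z = (score - mean(all_scores)) / stdev(all_scores)
    for cutoff, letter in [(1.0, "A"), (0.3, "B"), (-0.3, "C"), (-1.0, "D")]:
        if z >= cutoff:
            return letter
    return "F"

for s in scores:
    print(s, criterion_grade(s), norm_grade(s, scores))
```

Note that a score of 90 earns an A against the fixed standard but only a B on this curve, which is exactly the fairness issue the slides raise.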

Rubrics

Have you used rubrics in your teaching?

A) Yes, frequently B) Yes, occasionally C) Yes, but only once or twice D) No, never E) What’s a rubric?

Rubrics

  • "A rubric is an explicit summary of the criteria for assessing a particular piece of student work, plus levels of potential achievement for each criterion."
  • Shared with students before they begin the work that will be assessed.

Types of Rubrics

  • Holistic: describes a student's work as a single score. Best for tasks that can be performed or evaluated as a whole, and/or those that may not require extensive feedback.
  • Analytic: provides a separate score for each criterion.

Using a Rubric

  • For Evaluation, Communication, Motivation, and Organization
  • Rubrics can be used to:
    – Give feedback to students.
    – Directly or indirectly determine a grade or portion of a grade.
    – Teach students to self-assess.
    – Encourage a minimum level of quality for their work.

Activity


Impact: Evaluation

  • Evaluate: valid, fair, and clear judgment of quality
    – Consistency
    – Objectivity
    – Clear justification for grade
    – Feedback on effectiveness of pedagogical choices or change(s)

Impact: Communication

  • Communicate: with students and others
    – Instructors can better communicate with students, TAs and/or graders, other instructors (sections or next/past course), and outside sources (e.g., accrediting bodies).

Impact: Motivation

  • Motivate: how students study, work, and learn
    – Helps students engage with the course material/work.

Impact: Organization

  • Organize: transitions, closure, effort, and expectations
    – Makes expectations clear before each work is started.
    – Identifies what to focus on in instruction!
    – Organizes efforts of multiple graders.
    – Saves grading time.

Creating a Rubric

  • Identify the criteria/traits that will be assessed.
  • For all criteria/traits, decide on the number of levels (2 to 5).
  • Write a descriptive statement for each level, including differences in degree, frequency, and/or effectiveness; use stronger and weaker descriptors accordingly.
Activity: Creating a Rubric

  • Create a rubric for evaluating students' online discussion for THIS class (HIED8510)
  • First decide on the criteria
  • Then each group will work on one criterion over all levels
Online Discussion Rubric

Levels: Unsatisfactory, Limited = 1, Proficient = 2, Exemplary = 3

Criteria:
  • Critical analysis (relating to readings)
  • Contribution and enrichment of discussion; timing and number of postings
  • Professional communication and etiquette
  • Writing quality
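An analytic rubric like the online-discussion one above can be represented as a data structure and scored per criterion. The point value of 0 for "Unsatisfactory" is an assumption (the slide assigns no number), and the worked example ratings are hypothetical:

```python
# Sketch: an analytic rubric scored one criterion at a time, then
# summed. Unsatisfactory = 0 is an assumption; the slide gives no
# number for that level.

LEVELS = {"Unsatisfactory": 0, "Limited": 1, "Proficient": 2, "Exemplary": 3}

CRITERIA = [
    "Critical analysis (relating to readings)",
    "Contribution and enrichment of discussion",
    "Professional communication and etiquette",
    "Writing quality",
]

def score(ratings):
    """ratings: criterion -> level name; returns (total, max possible)."""
    total = sum(LEVELS[ratings[c]] for c in CRITERIA)
    return total, max(LEVELS.values()) * len(CRITERIA)

print(score({
    "Critical analysis (relating to readings)": "Exemplary",
    "Contribution and enrichment of discussion": "Proficient",
    "Professional communication and etiquette": "Exemplary",
    "Writing quality": "Limited",
}))  # → (9, 12)
```

Because each criterion is rated separately, the per-criterion levels double as targeted feedback, which is the advantage of an analytic rubric over a holistic single score.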

Summary

Jeopardy Game

http://fwallpapers.com/view/3d-thank-you-note