Queen's Learning Outcomes Assessment

Learning Outcomes Project

Authentic Assessment of Cognitive Skills

Developing critical thinking, creative thinking and problem solving in higher education

BASICS Rubric-builder

1. Creating the Assessment
In guiding the development of authentic, rubric-based assessment to support student learning within a course, all choices made should consider the context of, and rationale for, assessment at the task, course, and program levels, as well as the context of the institution.

Student learning relates to: a) Knowledge and understanding; b) Proficiencies and practices; and c) Attitudes and dispositions. Authentic assessments can include each of these, while combining different types of knowledge:

  • Declarative (knowing “what”); 
  • Procedural (knowing “how”); 
  • Conditional (knowing “why”); and 
  • Metacognitive (knowing about one’s own thinking)

 

Considerations

What Will the Product Look Like?

  • What is the size of the class?
  • What prior knowledge do the learners have?
  • Are there any special needs?
  • What is the learning context?
    • Face-to-face
    • Blended
    • Online
  • How will learners be working?
    • Individually
    • In pairs or small groups
  • Written, e.g.
    • Advice letter/blog
    • Analysis (of research/product)
    • Book review
    • Design/research proposal
    • Feature article/editorial
    • Research essay
  • Presented
    • Campaign proposal
    • Documentary/video
    • Oral product demonstration
  • Performed
    • Debate
    • Play
    • Dance/music recital

2. Real World Contexts

The Right Question

Instructional Scaffolding

Learning activities for an authentic task revolve around a question that has relevance in the real world. Learners have the option of developing their own question, but the more open-ended the task, the more complex the assessment will become.

 

Motivation and meaningful learning:

  • Is the question worth answering?
  • Is there a real-world application?

Complexity of the question:

Does the question begin with what, where, or when, or with how or why? The how and why questions often lead to integration and generalization. Once you have decided on the right question for your purpose, you need to determine what material and how much support will be provided.

The targeted learning behavior can be supported by providing learners with a specific set of materials and instruction on how to engage with the material.

The Task Library

Different types of information should be provided:

  • Such as reports, articles, policy documents, newspaper reports, and anecdotal information
  • Including variety in data display and type
  • As well as "distractors" to prompt analytical development, such as source material with incongruity or bias (blogs, propaganda, etc.)

Learners' Role

Meaning for learners can be enhanced by using a scenario that places them in a real-world role, e.g. analyst, consultant, journalist, or product developer. The stakes can be increased by providing a rationale (within the scenario) for why an inaccurate response may have consequences, e.g. the health of a friend depends on the recommendation.


3. Creating the Rubric
START: Identify the year group and department

Step 1.

Select the Assignment Type

Consider the cognitive skill set that aligns most closely with what the task is intended to elicit.

Step 2.

Define the assignment topic

 

Describe the content and context that the learners will be engaging with.

Note: The description provided here will be incorporated into the rubric.

Step 3.

Decide on the assessment dimensions

Dimensions are the component elements of the cognitive skill. For skill development, coverage of all dimensions is suggested.

Step 4.

Edit the rubric scaffold to semantic preferences

 

The rubric app auto-fills from the choices selected. The edit function allows for fine-tuning of the language. The levels displayed (developing, accomplished, or advanced) depend on the year group identified.

 

4. Interrelated Skills

Definitions for some common terms

Assessment: The process or means of evaluating academic work.

Assignment (task): Something assigned, such as a particular task or activity.

Context: The set of circumstances or facts that surround a particular event, situation, etc. (may refer to historical, social, cultural, political, or other contexts).

Issue: A point, matter, or dispute, the decision of which is of special or public importance.

Problem: Any question or matter involving doubt, uncertainty, or difficulty; a question proposed for solution or discussion.

 
5. Level of Achievement
 

Assessment Levels 

Mapping of Rubric Level to Expected Achievement

Levels for the rubric are developing, accomplished, or advanced; these labels can be replaced to suit departmental or institutional needs.

The criteria appearing in the rubric depend on the year group selected in the first step of the application. For example, when "first year" is selected, the rubric app displays criteria at levels 0, 1, and 2.

For analytical marking, a number could be attached to each level to derive a score for the assessment. For example, developing = 1, accomplished = 2, and advanced = 3.
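The analytical marking approach above amounts to summing the numeric value of the level awarded on each rubric dimension. A minimal sketch of that calculation follows, using the example level-to-number mapping; the dimension names and awarded levels are hypothetical, chosen only to illustrate the arithmetic.

```python
# Analytical (points-based) rubric scoring sketch.
# Level-to-number mapping follows the example above.
LEVEL_POINTS = {"developing": 1, "accomplished": 2, "advanced": 3}

def score_assessment(awarded_levels):
    """Sum the numeric value of the level awarded on each rubric dimension."""
    return sum(LEVEL_POINTS[level] for level in awarded_levels.values())

# Hypothetical dimensions and awarded levels for one piece of student work.
marks = {
    "interpretation": "accomplished",
    "analysis": "advanced",
    "evaluation": "developing",
}
total = score_assessment(marks)  # 2 + 3 + 1 = 6
```

A department could substitute its own labels and point values in the mapping without changing the calculation.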

6. Implementation

Using Rubrics to Improve Learning

These rubrics are intended both as assessment tools and as teaching tools that support student learning and the development of higher-order thinking skills. To be used in this way, learners need to be aware of and engage in the assessment process.

Key Questions to Consider

  • Will learners be active in the rubric development process?
  • Will the assessment rubric be shared with the learners prior to the learning activity?
  • Are the learners contributing to the assessment process (e.g. peer-evaluation/self-evaluation)?
  • Will there be an opportunity for learners to get feedback prior to the final submission date?

Example Rubric

 
7. Evaluating the Assessment

Backward design is an educational strategy that begins with goal setting, then defines what constitutes learning, followed by the selection of instructional methods for meaningful learning (Wiggins & McTighe, 2005, Understanding by Design).

  1. Review the rubric to determine if it describes your intended outcomes.
  2. Review the assigned task brief to evaluate the likelihood of it eliciting demonstration of the desired outcomes.
  3. Design specific learning experiences to support learners' achievement of the outcomes.
  4. Reflect on the learning achievement and refine the assessment to cater to learners' needs.

 
8. Assessment Protocol

Calibration

Rater training should be undertaken by all markers. It is led by a facilitator and includes:

  1. Close reading of the rubric
  2. Discussion of the terms
  3. Practice scoring a work sample one row at a time
  4. Opportunity for participants to explain their reasoning and offer evidence to support their scores.
  5. Discussion of the level awarded, working toward consensus on the decision
  6. Repetition of practice marking until a common understanding of the standard is determined

Moderation

Assessment moderation processes involve review of work samples and marking judgments, and are designed to support fair and equitable grading. Moderation groups are collaborative in nature, can build peer relationships, and may be undertaken:

  • Within departments
  • Within faculties
  • Across institutions
  • Between institutions


Support Materials

A quick guide is available (Download PDF, 68 KB).

These eight support documents are available in a printable booklet format in the Guide for Authentic Assessment of Cognitive Skills (Download full PDF, 1.8 MB). Note: the print layout for the guide is designed for the "booklet" print setting on US legal-sized paper.

Authored by Natalie Simper, last updated: July, 2016