CIRCA:Assessment Tools


This page gathers together a variety of techniques for evaluating and assessing video games.

List of Methods/Techniques

Note: Below is a list of methods that may be used to assess aspects of game design and development. Each method includes a brief explanation of why it may be relevant to design, as well as a few associated sources from our literature review. Please feel free to add to or revise the list.

Guidelines for Methods:

  • Add only relevant methods.
  • Use a brief description to define the method.
  • Link it to relevant literature in the Literature Review & Precises section when possible.
  • Use proper citations.
  • Alphabetical order is preferred.
  • Link each method to the sections of the GRAND Assessment Framework for which it may be useful. For example, Interviewing can be linked to Stakeholders and Expectations as '1.0' under its recommended use with the framework.

AB Testing

Testing your current design against one or more alternatives to determine which produces better results.

Recommended Use with the Framework: 5.0-Design; 7.0-Feedback

Associated Literature:

  • Wikipedia contributors. "A/B testing". Wikipedia.
  • "What is A/B testing?" Optimizely. https://www.optimizely.com/ab-testing. Accessed December 13, 2013.
  • Martin, Bella and Bruce Hanington. "Universal Methods of Design". Rockport Publishers, 2012.
  • Deswal, Siddharth. "Seven A/B testing mistakes you need to stop making in 2013". Visual Website Optimizer. January 4, 2013. Accessed December 13, 2013.
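
The comparison at the heart of an A/B test can be sketched as a two-proportion z-test, which checks whether the difference in conversion rates between two variants is larger than chance alone would explain. Below is a minimal illustrative sketch in Python; the function name and all figures are hypothetical, not drawn from the sources above:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z-statistic comparing the conversion rates of variants A and B."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical data: variant B converts 120/1000 players vs. A's 90/1000.
z = two_proportion_z(90, 1000, 120, 1000)
significant = abs(z) > 1.96  # roughly the 95% confidence level, two-tailed
```

If |z| exceeds 1.96, the observed difference is significant at approximately the 95% confidence level, suggesting the winning variant's advantage is not due to chance.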

Charters

Recommended Use with the Framework: 1.0-Stakeholders and Expectations; 2.0-Requirements; 4.0-Planning; 8.0-Closing

Associated Literature:

  • Ruecker, Stan and Milena Radzikowska. "The Iterative Design of a Project Charter for Interdisciplinary Research". In DIS '08: Proceedings of the 7th ACM Conference on Designing Interactive Systems, 288-294. New York, NY: ACM, 2008.

Developmental Evaluation

Developmental Evaluation employs a developmental evaluator who inquires into developments, tracks them, and facilitates interpretation of their significance. The evaluator engages the multiple parties involved in and affected by the product in making judgements about what is being developed, how it is being developed, the consequences and impacts of what has been developed, and the next stages of development.

Recommended Use with Framework: 4.0-Planning; 5.0-Design

Associated Literature:

  • Patton, Michael. "Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use". New York: Guilford, 2011.

Digitization

The conversion of information into a digital format. Digitization makes information easier to preserve, access, and share.

Recommended Use with the Framework:

Associated Literature:

Feature Testing

Controls users' exposure to specific features or concepts of a product, which are then tested against one another. It aims to understand how users will react to the product, with all its features, by exposing them to one hypothetical design at a time.

Recommended Use with the Framework: 7.0-Feedback

Associated Literature:

  • Laurel, Brenda. "Overview of Quantitative Methods in Design Research". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.

Focus Groups

Asking a group of people about their perceptions, opinions, beliefs, and attitudes regarding a product, issue, service, or idea.

Recommended Use with the Framework: 6.0-Delivery; 7.0-Feedback

Associated Literature:

Historical Methods

Recommended Use with the Framework:

Associated Literature:

Iterative Design

Based on a cyclical process of prototyping, testing, analysis, and refining a work in progress. It can be boiled down to test, analyze, refine, repeat.

Recommended Use with the Framework: 5.0-Design; 7.0 Feedback

Associated Literature:

  • Laurel, Brenda. "Play as Research". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.

Interviewing

Asking questions or talking to interviewees to gain insight into 'subjective understanding'.

Recommended Use with the Framework: 7.0-Feedback; 8.0-Closing

Associated Literature:

Journaling

Keeping reflective reports over the course of a project.

Recommended Use with the Framework: 4.0-Planning

Associated Literature:

  • Ortlipp, Michelle. "Keeping and Using Reflective Journals in the Qualitative Research Process". In The Qualitative Report. 14.4 (2008):695-705.

Persona Scenarios

The use of archetypal users with specific goals and needs, based on who would be using your product. The ultimate goal of personas is to identify specific user goals and needs so they can be aligned with business goals and needs, producing an agreed-upon list of features and functions.

Recommended Use with the Framework: 5.0-Design; 6.0-Delivery

Associated Literature:

Play-Testing

Play-testing focuses on measuring the product’s quality and functionality.

Recommended Use with Framework: 5.0-Design; 6.0-Delivery

Associated Literature:

  • Warren, Scott, Greg Jones, and Line Lin. "Usability and Play Testing: The Often Missed Assessment." In Serious Educational Game Assessment: Practical Methods and Models for Educational Games, Simulations and Virtual Worlds. Rotterdam: Sense Publishers, 2011. 131-146.

Rational Planning Model

The process of identifying a problem, establishing and evaluating planning criteria, creating alternatives, implementing them, and monitoring their progress. It is a multi-step model for making logically sound decisions that aims to follow an orderly path from problem identification through solution.

Recommended Use with the Framework: 3.0-Resources; 4.0-Planning; 5.0-Design; 6.0-Delivery

Associated Literature:

  • Simon, Herbert A. "Administrative Behavior: A Study of Decision-Making Processes in Administrative Organization". New York, NY: Free Press, 1947.
  • Wikipedia contributors. "Rational planning model". Wikipedia. http://en.wikipedia.org/wiki/Rational_planning_model.

Statistical Analysis

The collection and interpretation of statistical data. Statistics are frequently used to find relationships between data sets, extract important properties from them, and produce visualizations.

Recommended Use with the Framework: 7.0-Feedback
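
As one illustration of finding relationships between data sets, a Pearson correlation coefficient can be computed with the Python standard library alone. This is a minimal sketch; the playtest figures below are invented for illustration, not drawn from any study:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    # Population covariance divided by the product of population std devs.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical playtest data: session length (min) vs. reported enjoyment (1-10).
minutes   = [5, 12, 18, 25, 33]
enjoyment = [3, 4, 6, 7, 9]
r = pearson_r(minutes, enjoyment)  # close to 1.0: strong positive relationship
```

A coefficient near +1 or -1 indicates a strong linear relationship; values near 0 suggest none, though correlation alone does not establish causation.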

Associated Literature:

Standardized Criteria

Standardized criteria, including checklists and rubrics, set expectations and focus measurement on specific objectives.

Recommended Use with Framework: 2.0-Requirements; 3.0-Resources; 4.0-Planning; 5.0-Design; 7.0-Feedback; 8.0-Closing

Associated Literature:

Success/Failure Counters

The use of counters to determine the success or failure of measurable factors.

Recommended Use with the Framework: 8.0-Closing
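
In practice, a success/failure counter can be as simple as a small class that tallies outcomes per measurable factor. The sketch below is a hypothetical illustration; the class name and the "tutorial completion" factor are invented:

```python
class OutcomeCounter:
    """Tallies successes and failures for one measurable factor."""

    def __init__(self, name):
        self.name = name
        self.successes = 0
        self.failures = 0

    def record(self, succeeded):
        """Record one observed outcome as a success or a failure."""
        if succeeded:
            self.successes += 1
        else:
            self.failures += 1

    @property
    def success_rate(self):
        total = self.successes + self.failures
        return self.successes / total if total else 0.0

# Hypothetical factor: players completing the tutorial.
tutorial = OutcomeCounter("tutorial completion")
for outcome in [True, True, False, True]:
    tutorial.record(outcome)
# tutorial.success_rate == 0.75
```

Comparing such rates against targets set earlier in the project gives a simple, auditable basis for closing judgements.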

Associated Literature:

Talk Aloud

Gathering data by having users talk aloud as they perform a set of tasks. Users are asked to say whatever they are looking at, thinking, doing, and feeling as they go about their task, allowing observers to know the process of task completion from the user point of view.

Recommended Use with the Framework: 6.0-Delivery; 7.0-Feedback

Associated Literature:

  • Jorgensen, Anker Helms. "Thinking-aloud in user interface design: a method promoting cognitive ergonomics". Ergonomics 33.4 (1990): 501-507.
  • van Someren, Maarten, Yvonne Barnard, and Jacobijn Sandberg. "The Think Aloud Method: A Practical Guide to Modelling Cognitive Processes (Knowledge-Based Systems)". San Diego: Academic Press, 1994.

Text Methods

Analyzing a text to describe and interpret its content.

Recommended Use with the Framework:

Associated Literature:

Usability-Testing

Usability-testing focuses on measuring the product's capacity to meet its intended purposes through observation of user-centered interaction.

Recommended Use with Framework: 5.0-Design; 6.0-Delivery; 7.0-Feedback

Associated Literature:

GRAND Assessment Framework

The GRAND Assessment Framework is an iterative assessment framework designed by the University of Alberta GRAND group. The framework starts with the assumption that the group research project involves practical game design, and continues by attempting to ask the most significant over-arching questions. It follows eight categories (see GRAND Assessment Page for questions and more information):

  • 1.0 Stakeholders and Expectations
  • 2.0 Requirements
  • 3.0 Resources
  • 4.0 Planning
  • 5.0 Design
  • 6.0 Delivery
  • 7.0 Feedback
  • 8.0 Closing

Specific approaches to projects
