CIRCA:Assessment Tools

From CIRCA


Current revision as of 17:26, 8 January 2014

This page gathers together a variety of techniques for evaluating and assessing video games.


List of Methods/Techniques

Note: Below is a list of some methods that may be used to assess parts of game design and assessment. Each method includes a brief explanation of why it may be relevant to design, along with a few associated sources from our literature review. Please feel free to add to or revise the list.

Guidelines for Methods:

  • Add only relevant methods.
  • Use a brief description to define the method.
  • Link it to relevant literature in the Literature Review & Precises section when possible.
  • Use proper citations.
  • Alphabetical order is preferred.
  • Link each method to the Grand Assessment Framework sections it may be useful for. For example, Interviewing can be linked to Stakeholders and Expectations as '1.0' under 'Recommended Use with Framework'.

A/B Testing

Testing your current design against alternative designs to determine which produces better results.

Recommended Use with the Framework: 5.0-Design; 7.0-Feedback

Associated Literature:

  • Wikipedia contributors. Wikipedia: A/B Testing
  • "What is A/B testing?" Optimizely. https://www.optimizely.com/ab-testing. Accessed December 13, 2013.
  • Martin, Bella and Hanington, Bruce. "Universal Methods of Design". Rockport Publishers, 2012.
  • Siddharth Deswal. "Seven A/B testing mistakes you need to stop making in 2013". Visual website optimizer. January 4th, 2013. Accessed December 13, 2013.
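The comparison of variants can also be sketched numerically. Below is a minimal, illustrative example (the function name and the conversion figures are hypothetical, not drawn from the sources above) of judging an A/B result with a pooled two-proportion z-test:

```python
import math

def two_proportion_z(success_a, total_a, success_b, total_b):
    """Pooled two-proportion z statistic for comparing two conversion rates."""
    p_a = success_a / total_a
    p_b = success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    return (p_b - p_a) / se

# Hypothetical counts: design A converted 120 of 1000 players, design B 150 of 1000.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # |z| > 1.96 would suggest a real difference at the 5% level
```

In practice a statistics library would be used, but the arithmetic above shows what "which design produces better results" means in testable terms.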

Charters

A project charter records a project's purpose, scope, stakeholders, and deliverables at the outset, giving the team a shared reference point.

Recommended Use with the Framework: 1.0-Stakeholders and Expectations; 2.0-Requirements; 4.0-Planning; 8.0-Closing

Associated Literature:

  • Ruecker, Stan and Radzikowska, Milena. "The Iterative Design of a Project Charter for Interdisciplinary Research". In DIS '08: Proceedings of the 7th ACM Conference on Designing Interactive Systems. 288-294. New York, NY: ACM, 2008.

Developmental Evaluation

Developmental Evaluation employs a developmental evaluator who inquires into and tracks developments, facilitates interpretation of those developments and their significance, and engages the parties involved in and affected by the product in making judgements about what is being developed, how long it will take, what its consequences and impacts are, and what the next stages of development should be.

Recommended Use with Framework: 4.0-Planning; 5.0-Design

Associated Literature:

  • Patton, Michael. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford, 2011.

Digitization

The conversion of information into a digital format. Digitization makes information easier to preserve, access, and share.

Recommended Use with the Framework:

Associated Literature:

  • Bailey, Charles W. "Digital Resource Curation Guide". Digital-Scholarship. August 12, 2012. http://digital-scholarship.org/dcrg/dcrg.htm
  • "Digitization Guidelines". University of Virginia Library. http://guides.lib.virginia.edu/content.php?pid=40437&sid=297543
  • "Resources from Digitisation Doctor workshop now available". Wellcome Library. March 5, 2013. http://blog.wellcomelibrary.org/2013/05/resources-from-digitisation-doctor-workshop-now-available/

Feature Testing

Controls users' exposure to specific features or concepts of a product, which are then tested against one another. It aims to understand how users will react to the project, with all its features, by exposing them to one hypothetical design at a time.

Recommended Use with the Framework: 7.0-Feedback

Associated Literature:

  • Laurel, Brenda. "Overview of Quantitative Methods in Design Research". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.
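One common way to expose each user to a single design is deterministic bucketing. The sketch below is illustrative (the variant names and hashing scheme are assumptions, not from the sources above): hashing a stable player id guarantees the same player always sees the same variant.

```python
import hashlib

# Hypothetical feature variants for a game's inventory screen.
VARIANTS = ["control", "new_inventory_ui"]

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a player into one variant by hashing their id."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same player is always shown the same design, so reactions can be
# compared across groups without players switching variants mid-test.
print(assign_variant("player-42"))
```

Because assignment depends only on the id, no per-player state needs to be stored to keep the test consistent across sessions.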

Focus Groups

Asking a group of people about their perceptions, opinions, beliefs, and attitudes regarding a product, issue, service, or idea.

Recommended Use with the Framework: 6.0-Delivery; 7.0-Feedback

Associated Literature:

  • JISC Usability Foundation Study. "3. Usability Methodologies and Techniques". http://www.jisc.ac.uk/media/documents/programmes/presentation/usabilitymethods.pdf
  • "Focus Groups". Usability.gov. http://www.usability.gov/how-to-and-tools/methods/focus-groups.html

Historical Methods

Recommended Use with the Framework:

Associated Literature:

Iterative Design

Based on a cyclical process of prototyping, testing, analyzing, and refining a work in progress. It can be boiled down to: test, analyze, refine, repeat.

Recommended Use with the Framework: 5.0-Design; 7.0-Feedback

Associated Literature:

  • Laurel, Brenda. "Play as Research". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.

Interviewing

Asking questions or talking to interviewees to gain insight into 'subjective understanding'.

Recommended Use with the Framework: 7.0-Feedback; 8.0-Closing

Associated Literature:

  • Laurel, Brenda. "Qualitative Methods: From Boring to Brilliant". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.
  • JISC Usability Foundation Study. "3. Usability Methodologies and Techniques". http://www.jisc.ac.uk/media/documents/programmes/presentation/usabilitymethods.pdf

Journaling

Keeping reflective reports during the progress of the project.

Recommended Use with the Framework: 4.0-Planning

Associated Literature:

  • Ortlipp, Michelle. "Keeping and Using Reflective Journals in the Qualitative Research Process". In The Qualitative Report. 14.4 (2008):695-705.

Persona Scenarios

The use of archetypal users with specific goals and needs, based on who would be using your project. The ultimate goal of personas is to identify specific user goals and needs so they can be aligned with business needs and goals to create an agreed-upon list of features and functions.

Recommended Use with the Framework: 5.0-Design; 6.0-Delivery

Associated Literature:

  • Laurel, Brenda. "User Requirements". In Design Research: Methods and Perspectives. Cambridge, MA: MIT Press, 2003.
  • JISC Usability Foundation Study. "3. Usability Methodologies and Techniques". http://www.jisc.ac.uk/media/documents/programmes/presentation/usabilitymethods.pdf
  • "Personas". Usability.gov. http://www.usability.gov/how-to-and-tools/methods/personas.html

Play-Testing

Play-testing focuses on measuring the product’s quality and functionality.

Recommended Use with Framework: 5.0-Design; 6.0-Delivery

Associated Literature:

  • Warren, Scott, Greg Jones, and Line Lin. "Usability and Play Testing: The Often Missed Assessment." In Serious Educational Game Assessment: Practical Methods and Models for Educational Games, Simulations and Virtual Worlds. Rotterdam: Sense Publishers, 2011. 131-146.

Rational planning model

The process of realizing a problem, establishing and evaluating planning criteria, creating alternatives, implementing alternatives, and monitoring progress of the alternatives. It is a multi-step model for making logically sound decisions that aims to follow the orderly path from problem identification through solution.

Recommended Use with Framework: 3.0-Resources; 4.0-Planning; 5.0-Design; 6.0-Delivery

Associated Literature:

  • Simon, Herbert A. "Administrative Behavior: A Study of Decision-Making Processes in Administrative Organization". New York, NY: Free Press, 1947.
  • Wikipedia contributors. "Rational planning model". Wikipedia. http://en.wikipedia.org/wiki/Rational_planning_model.

Statistical Analysis

The collection and interpretation of statistical data. Statistics are frequently used to find relationships between data sets, extract important properties from them, and produce visualizations.

Recommended Use with the Framework: 7.0-Feedback

Associated Literature:

  • Wikipedia contributors. "Statistics". Wikipedia. http://en.wikipedia.org/wiki/Statistics

Standardized Criteria

Including checklists and rubrics, standardized criteria set expectations and focus measurement on specific objectives.

Recommended Use with Framework: 2.0-Requirements; 3.0-Resources; 4.0-Planning; 5.0-Design; 7.0-Feedback; 8.0-Closing

Associated Literature:

  • Hjartarson, Paul. Note. Project Design Checklist.
  • Annetta, Leonard A., Richard Lamb, and Marcus Stone. "Assessing Serious Educational Games: The Development of a Scoring Rubric". In Serious Educational Game Assessment: Practical Methods and Models for Educational Games, Simulations and Virtual Worlds. Rotterdam: Sense Publishers, 2011. 75-94.
  • Javier, Ricardo and Mena, Rademacher. "Game Assessment Using the E/E Grid". In Serious Educational Game Assessment: Practical Methods and Models for Educational Games, Simulations and Virtual Worlds. Rotterdam: Sense Publishers, 2011. 95-118.

Success/Failure Counters

The use of counters to determine the success or failure of measurable factors.

Recommended Use with the Framework: 8.0-Closing

Associated Literature:
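A success/failure counter can be as simple as tallying logged events and computing a rate. The sketch below is illustrative (the event labels and log are hypothetical):

```python
from collections import Counter

def success_rate(events):
    """Tally logged success/failure events and return the success rate."""
    counts = Counter(events)
    total = counts["success"] + counts["failure"]
    return counts["success"] / total if total else 0.0

# Hypothetical log of puzzle attempts from one play session.
log = ["success", "failure", "failure", "success", "success"]
print(success_rate(log))  # 0.6
```

Aggregated over many sessions, rates like this give the measurable factor the method calls for when judging a project at closing.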

Talk Aloud

Gathering data by having users talk aloud as they perform a set of tasks. Users are asked to say whatever they are looking at, thinking, doing, and feeling as they go about their task, allowing observers to know the process of task completion from the user point of view.

Recommended Use with the Framework: 6.0-Delivery; 7.0-Feedback

Associated Literature:

  • Jorgensen, Anker Helms. "Thinking-aloud in user interface design: a method promoting cognitive ergonomics". Ergonomics, 33 (4). 501-507. March 27, 2007.
  • van Someren, Maarten, Barnard, Yvonne, and Sandberg, Jacobijn. "The Think Aloud Method: A Practical Guide to Modelling Cognitive Processes (Knowledge-Based Systems)". San Diego: Academic Press, 1994.

Text Methods

Analyzing a text to describe and interpret its content.

Recommended Use with the Framework:

Associated Literature:

Usability-Testing

Usability-testing focuses on measuring the product's capacity to meet its intended purposes through user-centered interaction.

Recommended Use with Framework: 5.0-Design; 6.0-Delivery; 7.0-Feedback

Associated Literature:

  • Benson, L., D. Elliott, M. Grant, D. Holschuh, B. Kim, H. Kim, E. Lauber, S. Loh, and T.C. Reeves. "Usability and Instructional Design Heuristics for E-Learning Evaluation". In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002. Chesapeake: AACE, 2002. 1615-1621.
  • Perry, David, Rusel DeMaria, and Safari Books Online (Firm). David Perry on Game Design: A Brainstorming Toolbox. Boston: Charles River Media, 2009.
  • Warren, Scott, Greg Jones, and Line Lin. "Usability and Play Testing: The Often Missed Assessment." In Serious Educational Game Assessment: Practical Methods and Models for Educational Games, Simulations and Virtual Worlds. Rotterdam: Sense Publishers, 2011. 131-146.

GRAND Assessment Framework

The GRAND Assessment Framework is an iterative assessment framework designed by the University of Alberta GRAND group. It starts from the assumption that the group research project involves practical game design, and then asks the most significant overarching questions. It follows eight categories (see the GRAND Assessment page for questions and more information):

  • 1.0 Stakeholders and Expectations
  • 2.0 Requirements
  • 3.0 Resources
  • 4.0 Planning
  • 5.0 Design
  • 6.0 Delivery
  • 7.0 Feedback
  • 8.0 Closure

Specific approaches to projects

  • Fort Edmonton Park Assessment
  • Old Strathcona Business Association Assessment