CIRCA:Fort Edmonton Park Assessment


Revision as of 16:35, 11 June 2011 by VickyVarga (Talk | contribs)

Broadly speaking there are three methods of assessment to be used:

  • Evaluation by participants
  • Evaluation by game producers
  • Quantitative analysis of game metrics


Interaction with participants

In order to determine the success of the game, we will need information from participants about their experiences. To this end, we will recruit two groups of volunteers while the game is running and give each a post-experience questionnaire: a group that plays the game and a control group that does not. Both groups are required to determine whether playing the game enhanced the overall park experience.

The questionnaire will be used to gauge how much fun participants in the game had versus those who simply visited the park without playing. It will also be used to gauge whether players retained more knowledge than those who simply visited the park and learned from signs and interpreters. Finally, the questionnaire will help us ascertain which parts of the game interface worked and which need to be improved.

The first two sections of the questionnaire will be identical for both the playing and control groups. The third (interface) section will only be on the questionnaire given to the playing group.

Learning

The learning portion will likely have to take the form of a quiz to test participants' knowledge. It may work better if participants are not told in advance that a quiz is coming, so they do not study their way through the park instead of playing the game for fun.

Did participants learn?

fAR-Play has been developed as an academic project with an interest in serious games and games for learning; thus, it is important for us to gauge whether participants actually learned through the game.

Did they learn more, differently, or better than they would have without the game?

In order for the game to be successful, participants must learn in a way that differs from the normal park experience. This could mean learning a different historical narrative, learning that supplements what normally takes place in the park, or simply increased engagement with the material. We want to determine in which of these ways, if any, participants learned.

Did participants retain and/or use knowledge after the visit?

This is to assess long-term retention: whether anything learned at the park found its way into longer-term knowledge. Both University of Alberta researchers and Fort Edmonton Park staff are interested in this result. Thus, the questionnaire ought to include a question asking whether we may mail the participant a follow-up questionnaire in a month's time.

Demographic Information

General demographic information (age, gender, experience with technology) will be collected to identify who is playing the game. This will help determine whether the intended target demographic (men aged 18-35) is indeed the group that plays the game.

Fun

The fun portion calls for opinion-based responses, so it will use Likert-scale questions (1-5, where 1 is strongly disagree and 5 is strongly agree).
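As a rough sketch, the Likert responses from the two groups could be compared question by question; the question labels and response data below are hypothetical examples, not actual survey results:

```python
# Sketch: compare mean Likert scores (1-5) between the playing group and
# the control group for each questionnaire item shared by both groups.
# Question labels and responses are hypothetical placeholders.
from statistics import mean

playing = {
    "The visit was fun": [5, 4, 4, 5],
    "I learned something new": [4, 4, 3, 5],
}
control = {
    "The visit was fun": [4, 3, 4, 3],
    "I learned something new": [3, 4, 3, 3],
}

for question in playing:
    p, c = mean(playing[question]), mean(control[question])
    # A positive difference suggests the game group rated the item higher.
    print(f"{question}: playing={p:.2f} control={c:.2f} diff={p - c:+.2f}")
```

A real analysis would of course need enough respondents in each group for the difference to be meaningful, but the per-question comparison structure would be the same.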

The interface portion will likely be a combination of Likert questions and requests for comments from participants.

The staff at Fort Edmonton Park are also interested in the degree of long-term knowledge gained from the visit, so we would also get permission to have participants in both groups fill out a second questionnaire one month after their visit. It will be important to track which participants filled out the short-term questionnaire in addition to the long-term one, as those who filled out the short-term survey may be more likely to retain long-term knowledge of the visit.

Was the experience fun?

Part of a game (generally speaking) is to have fun, so we want to assess whether the experience was fun, as well as conducive to learning.

Was it more fun than just a visit to the park?

Did the game make visiting the park more fun than it would have been otherwise? The hope is to bring in people who might not otherwise visit the park, so an experience on top of the normal visit is important to that end.

Interface

Did the game interface work?

Since this is the first full-scale test of a fAR-Play game, it will be important to gather audience feedback regarding the interface and gameplay flow. It may be useful to compare perceptions of gameplay ease/difficulty with the quantitative metrics collected from the game.

Were there problems with connectivity?

One concern with this game is the spotty connectivity within Fort Edmonton Park. Testing indicates that some phones/providers have difficulty maintaining a connection within the park, so the questionnaire should ask whether connectivity problems affected the experience.

Interaction with designers and content creators

fAR-Play has been developed with the intention of using it as a platform for game development; therefore, it is important to assess the success of the relationship with the client. Additionally, the experiences of the students involved in designing and developing the game may provide insight into the overall process of game development.

Fort Edmonton Park Staff

We want to interview Fort Edmonton Park staff involved to find out how the experience went for them and to see what could be improved and what aspects they liked.

How did this process work for the Fort Edmonton Park staff?

Given that fAR-Play is intended for client use, it is important to ascertain whether the designers and content creators on the client side had a good experience with the platform and with developing the game.

HuCo Students/Faculty

  • Journaling
  • Feedback to CS
  • Reflections - ask Calen for a reflection?

Comp Sci Students/Faculty

Metrics/data analysis

Participant feedback can be partly corroborated by simple metrics from the game. By recording which questions users answered and how long they stayed on the game's various web pages, it will be possible to see to what degree users engaged with the content, and whether users started the game and then gave up or lost interest.

Success rates

  • Number of tries to get the question correct
  • Most/least attempted question
  • Most/least popular adventure

Entry points

  • Which adventure do participants begin on? Which question?
  • Which page is visited most often?
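The success-rate and entry-point metrics above could be tallied from a simple answer log. The log format and field names below (user, adventure, question, correct) are illustrative assumptions, not fAR-Play's actual schema:

```python
# Sketch: tally success rates and entry points from a hypothetical game
# event log. Each event records one answer attempt:
# (user, adventure, question, answered_correctly). Data is made up.
from collections import Counter, defaultdict

events = [
    ("u1", "1885 Street", "Q1", False),
    ("u1", "1885 Street", "Q1", True),
    ("u2", "1885 Street", "Q2", True),
    ("u2", "Fort", "Q3", False),
    ("u3", "Fort", "Q3", True),
    ("u3", "1885 Street", "Q1", True),
]

# Number of tries each user took on each question
tries = defaultdict(int)
for user, adventure, question, correct in events:
    tries[(user, question)] += 1

# Most/least attempted questions
attempts = Counter(q for _, _, q, _ in events)
most_attempted, _ = attempts.most_common(1)[0]

# Entry points: the first adventure each user touched
first_adventure = {}
for user, adventure, _, _ in events:
    first_adventure.setdefault(user, adventure)
entry_counts = Counter(first_adventure.values())

print("attempts per question:", dict(attempts))
print("most attempted question:", most_attempted)
print("entry adventures:", dict(entry_counts))
```

The same pass over the log could be extended with page-view timestamps to estimate how long users stayed on each page, which would support the engagement and drop-off questions raised above.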

Credits

  • Initial document - Calen Henry – April 2011
  • Assistance with assessment methods - Erik DeJong - April 2011
  • Revision/expansion - Vicky Varga - June 2011