
% LaTeX code for Poster Proposal of GWrit

\documentclass[a4paper,11pt]{article}
\usepackage{ulem}
\usepackage{a4wide}
\usepackage[dvipsnames,svgnames]{xcolor}
\usepackage{graphicx}

\usepackage{hyperref}

\begin{document}\hypertarget{The_Game_of_Writing:_Gamification_and_Social_Commenting_in_Writing_Instruction}{}

\subsection{The Game of Writing: Gamification and Social Commenting in Writing Instruction}

How can we test gamification and social learning in online writing environments? This poster will demonstrate an online writing environment, GWrit (Game of Writing), where students can comment on each other's writing and where they get rewards for task activity in a cooperative environment (gamification in this instance is not competitive). The application of game-based learning strategies to teaching writing shows promise, with one study reporting that their role-playing game improved the quality of student writing (Wang, Chen, Chang \& Chan, 2016). GWrit has been developed by a cross-disciplinary team of academics and programmers at the University of Alberta over the past three years and has been used with over 1000 students.

This poster/demonstration brings together research on GWrit from the following perspectives: gamification, social-network-influenced peer review, and task completion structures. Our research will be summarized on the poster, and we will demonstrate the environment during the poster session.

\textbf{Gamification.} Deterding et al. (2011) defined gamification as the use of game elements and game design techniques in non-game contexts to engage people in solving problems. Gamified environments employ a number of mechanisms to encourage people to engage with them (Dicheva et al., 2015), and existing studies indicate that gamified learning environments foster deeper student engagement (Barata, Gama, Jorge \& Gon\c{c}alves, 2013; Fitz-Walter, Tjondronegoro \& Wyeth, 2012). Traditional schooling is part of the problem gamification addresses: many students perceive it as ineffective and boring (Dicheva, Dichev, Agre \& Angelova, 2015), and many students need flexible learning platforms to enable them to balance work and life responsibilities. Following Deterding's definition, we incorporate gamification analytics, feedback, and social-media-inspired commenting in a cooperative environment as the key pedagogical innovations behind GWrit. Our gamified environment has many ``surface'' gamification components (award trigger systems, competitive environments, badges, and ranks) as well as ``deep'' ones (task completion structures, social commenting support, and public posting of draft documents).

When demonstrating GWrit we will first introduce how the system was designed to support experimenting with commenting and gamification, and then show the gamification rule-editing environment we developed. Our working hypothesis was that users want information about what they are doing, and that gamification can be a playful way of representing that information back to them so that they can make decisions and possibly be motivated differently. This proved to be true for the commenting components of the course, but the task completion components need further refinement. The poster will summarize usability research we have done on the system.

\textbf{Social-network influenced peer review.} Social networking has also been shown to have a positive influence on students' academic learning (Tian, Yu, Vogel \& Kwok, 2011; Tsuiping, 2016). Within GWrit we provided an environment where students have the opportunity to both read and then comment on each other's drafts, a technique that has been shown to improve writing (Schunn, Godley \& DeMartino, 2016; Ion, Barrera-Corominas \& Tom\`as-Folch, 2016). Reading skill is directly linked to writing improvement, particularly if students are reading texts similar to the texts they are trying to produce (Source). GWrit allows students to post drafts of their documents for review and comment by other students, peer tutors, and graders. The writer of the draft can respond to each comment, and is also likely to reciprocate by reading and commenting on the drafts of the students who commented. GWrit provides flexibility for students by requiring less face-to-face time through an asynchronous environment, an environment that leads to deeper and more serious comments (Tsuiping, 2016). Micro-networks of comments sprout up within the comments on these texts. Because students in the writing course version of GWrit have the option of working on one of three different assignments, larger, informal networks of students who are working on the same assignment also coalesce. The writing course version of GWrit has four main three-week-long modules with a choice of three assignments in each module; the social networks re-form at the end of each module. Our early assessments of students and commenting in the writing course confirm what others have reported: that peer feedback is as valuable as instructor feedback (Guasch, Espasa, Alvarez \& Kirschner, 2013). We found that students who earned an ``A''-level grade wrote, on average, 58 comments over the term; students who earned a ``C'' average wrote an average of 28 comments; ``F'' students wrote an average of 4 comments. It seems clear that engagement with the course materials through commenting is related to overall success in the course. In a post-course student survey, 68\% of respondents said that the comments helped them improve their documents while 25\% disagreed with that statement. The poster will summarize this research on commenting.
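For quick reference, the comment averages reported above can be set out as a small table:

\begin{center}
\begin{tabular}{lc}
\hline
Final grade & Average comments per student \\
\hline
A & 58 \\
C & 28 \\
F & 4 \\
\hline
\end{tabular}
\end{center}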

\textbf{Task completion structures.} Third, the poster will deal with the role task completion structures play in motivating learning. GWrit incorporates three task completion structures to help students: the course completion fuel gauge, a task list, and assignment deadlines. All three structures were used in WRS 102 in the Winter 2016 term; the first and the last were used in the Fall 2015 term. Early findings show that students were divided on the task structure: only 22\% felt it helped them stay on track and complete assignments, while 46\% disagreed with that statement and 32\% were neutral. When asked if the points and badges system motivated them to engage with course material, 54\% of students disagreed and only 25\% agreed; 20\% were neutral. When asked if ``the fuel gauge, leaderboard, and badge page showed me my progress in the course,'' 45\% disagreed with the statement while 34\% agreed; 20\% were neutral.
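The survey results above can be summarized in tabular form (percentages as reported; rows may not sum to 100\% because of rounding):

\begin{center}
\begin{tabular}{p{7cm}ccc}
\hline
Survey statement & Agree & Neutral & Disagree \\
\hline
The task structure helped me stay on track and complete assignments & 22\% & 32\% & 46\% \\
The points and badges system motivated me to engage with course material & 25\% & 20\% & 54\% \\
The fuel gauge, leaderboard, and badge page showed me my progress in the course & 34\% & 20\% & 45\% \\
\hline
\end{tabular}
\end{center}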

In our discussion of the research on the poster we will summarize interview data on why we think the various aspects of the system work, which areas we think need improvement, and how we intend to transform a curriculum-based tool (the course-based version of GWrit) into a free-standing, web-based site.

\textbf{References:}

Barata, G., Gama, S., Jorge, J., \& Gon\c{c}alves, D. (2013). Improving Participation and Learning with Gamification. In Proceedings of the First International Conference on Gameful Design, Research, and Applications (pp. 10--17). New York, NY, USA: ACM. \href{http://doi.org/10.1145/2583008.2583010}{http://doi.org/10.1145/2583008.2583010}

Deterding, S., Dixon, D., Khaled, R., \& Nacke, L. (2011). From Game Design Elements to Gamefulness: Defining ``Gamification.'' In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments (pp. 9--15). New York, NY, USA: ACM. \href{http://doi.org/10.1145/2181037.2181040}{http://doi.org/10.1145/2181037.2181040}

Dicheva, D., Dichev, C., Agre, G., \& Angelova, G. (2015). Gamification in Education: A Systematic Mapping Study. Journal of Educational Technology \& Society, 18(3), 75--88.

Fitz-Walter, Z., Tjondronegoro, D., \& Wyeth, P. (2012). A Gamified Mobile Application for Engaging New Students at University Orientation. In Proceedings of the 24th Australian Computer-Human Interaction Conference (pp. 138--141). New York, NY, USA: ACM. \href{http://doi.org/10.1145/2414536.2414560}{http://doi.org/10.1145/2414536.2414560}

Guasch, T., Espasa, A., Alvarez, I. M., \& Kirschner, P. A. (2013). Effects of feedback on collaborative writing in an online learning environment. Distance Education, 34(3), 324--338. DOI: 10.1080/01587919.2013.835772

Hasan, L., Morris, A., \& Probets, S. (2009). Using Google Analytics to Evaluate the Usability of E-Commerce Sites. In M. Kurosu (Ed.), Human Centered Design (pp. 697--706). Springer Berlin Heidelberg. Retrieved from \href{http://link.springer.com/chapter/10.1007/978-3-642-02806-9_81}{http://link.springer.com/chapter/10.1007/978-3-642-02806-9\_81}

Ion, G., Barrera-Corominas, A., \& Tom\`as-Folch, M. (2016). Written peer-feedback to enhance students' current and future learning. International Journal of Educational Technology in Higher Education, 13(15). DOI: 10.1186/s41239-016-0017-y

Schunn, C., Godley, A., \& DeMartino, S. (2016). The Reliability and Validity of Peer Review of Writing in High School AP English Classes. Journal of Adolescent \& Adult Literacy, 60(1), 13--23.

Tian, S. W., Yu, A. Y., Vogel, D., \& Kwok, R. C. W. (2011). The impact of online social networking on learning: a social integration perspective. International Journal of Networking and Virtual Organisations, 8(3/4), 264. \href{http://doi.org/10.1504/IJNVO.2011.039999}{http://doi.org/10.1504/IJNVO.2011.039999}

Tsuiping, C. (2016). Technology-supported peer feedback in ESL/EFL writing classes: a research synthesis. Computer Assisted Language Learning, 29(2), 365--397. DOI: 10.1080/09588221.2014.960942

Wang, J. H., Chen, S. Y., Chang, B., \& Chan, T. W. (2016). From integrative to game-based integrative peer response: high ability versus low ability. Journal of Computer Assisted Learning, 32, 170--185. DOI: 10.1111/jcal.12125

\end{document}