Almost a year ago, GVSU Libraries launched LibraryQuest, our mobile quest-based game. It was designed to teach users about library spaces and services in a way that (we hoped) would be fun and engaging. The game was released “into the wild” in the last week of August 2013, at the beginning of our fall semester, and ran continuously until late November, shortly after midterms (we wanted to end early enough in the semester that we still had students on campus for post-game assessment efforts). For details on the early development of the game, take a look at my earlier ACRL TechConnect post. This article will focus on what happened after launch.

Running the Game
Once the app was released, we settled on a schedule of three to five new quests each month the game ran. Designing quests is very time intensive, and that was all we could manage with the staff time we had available. We also ran short-duration quests at random intervals to encourage students to keep checking the app. Over the course of the game, we created about 30 quests in total. Almost all quests were designed with a specific educational objective in mind, such as showing students how a particular library system worked or where a service, person, or collection was located in the physical building. Quests were chiefly designed by our Digital Initiatives Librarian (me) with help and support from our implementation team and other library staff as needed.
For most of the quests, we developed quest write-up sheets like this one: Raiders of the Lost…Bin. Each sheet detailed the name of the quest, its point value, educational objective, steps, completion codes, and any other information that defined the quest. These sheets proved invaluable whenever a staff member needed to know something about a quest, which was often. Even simple quests like the one above required a fair amount of cooperation and coordination: for the Raiders quest, we needed a special cataloging record created, and we had to tag several plastic crowns and get them into our automated storage and retrieval system.
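To give a sense of how structured those write-ups were, here is a minimal sketch of a quest record as a Python data class. The field names mirror what our sheets captured, but both the names and the example values are my own illustration, not the actual schema the LibraryQuest app used.

```python
from dataclasses import dataclass

# A minimal sketch of a quest record (Python 3.9+). The fields mirror
# our write-up sheets; they are illustrative, not the app's real schema.
@dataclass
class Quest:
    name: str
    points: int
    objective: str        # the educational objective behind the quest
    steps: list[str]      # instructions shown to the player
    completion_code: str  # code the player submits to finish

    def is_complete(self, submitted: str) -> bool:
        """Treat the quest as complete if the submitted code matches."""
        return submitted.strip().lower() == self.completion_code.lower()

# Example values are made up for illustration.
raiders = Quest(
    name="Raiders of the Lost…Bin",
    points=30,
    objective="Show players how to request items from automated storage",
    steps=[
        "Find the special catalog record for the hidden crowns",
        "Request a crown from the automated storage and retrieval system",
        "Enter the completion code tagged on the crown",
    ],
    completion_code="CROWNED",
)
```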
For every quest players completed, they earned points. For every thirty points they earned, they were entered once in a drawing to win an iPad. This was a major component of the game’s advertising, since we imagined it would be the biggest draw to play (and we may not have been right, as you’ll see). Once the game closed in November, we held the drawing, publicized the winner, and then commenced a round of post-game assessment.
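The drawing mechanic itself was simple arithmetic. As a quick sketch, assuming one entry per full thirty points (the rules for partial blocks of points are my assumption, not documented here):

```python
# Sketch of the drawing logic: one entry per full thirty points earned.
# Whether partial blocks counted is an assumption on my part.
def drawing_entries(points: int) -> int:
    return points // 30

print(drawing_entries(625))  # the top scorer would have had 20 entries
```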
Thank You for Playing: Post-Game Assessment
When the game wrapped in mid-November, we took some time to examine the statistics the game had collected. One of our very talented design students created a game dashboard that showed all the metrics collected by the game database in graphic form. The final tally of registered players came in at 397, meaning 397 people downloaded the app and logged in at least once (in case you’re curious, the total enrollment of GVSU is about 25,000 students). This number probably includes a few non-students, since anyone could download the app, but we made passes throughout the life of the game to remove non-student players from the tally, so we feel confident that the vast majority of registered players were students. During development, we set a goal of at least 300 registered players, based mostly on the cost of the game and how much money we had spent on other outreach efforts. So we did, technically, meet that goal, but a closer examination of the numbers paints a more nuanced picture of student participation.

Of the registered players, 173 earned points, meaning they completed at least one quest. The other 224 players downloaded the app and logged in at least once, but never completed any quest content. Clearly, getting players to take the first step and get involved in the game was a stumbling block. There are any number of possible explanations, including technical problems that may have turned players off (the embedded QR code scanner was a problem throughout the life of the game), an unwillingness to travel to physical locations to do quests, or something else entirely. The maximum number of points a player could earn was 625, which one person attained, although a few others came close. Players tended to cluster at the lower and middle of the point spectrum, which was entirely expected: getting the maximum number of points required a high degree of dedication, since it meant paying very close attention to the app for all the temporary, randomly appearing quests.

In general, online-only quests were more popular than quests involving physical space, and were taken and completed more often. Of the top five most-completed quests, four were online-only. There are a number of possible explanations for this, including the observation offered by one of our survey respondents that many players may have been based at our downtown campus and didn’t want to travel to our Allendale campus, where most of the physical quests were located.
Finally, of our 397 registered users, only 60 registered in the second half of the game’s run. The vast majority signed up soon after launch, and registrations tapered off over time. This reinforced data from other sources suggesting that the game ran too long and that the pacing of new content needed to be faster.
In addition to data collected from the game itself, we also put out two surveys over the course of the game. The first was a mid-game survey that asked questions about quest design (what quests students liked or didn’t like, and why). Responses to this survey were bewilderingly contradictory: some students would cite a quest as their favorite, while others would cite the exact same quest as their least favorite (and often for the same reasons). The qualitative post-game evaluation we did provides some possible explanation for this (see below). The second survey was a simple post-game questionnaire that asked whether students had enjoyed the game, whether it was something we should continue doing, and whether they had learned anything (and if so, what). 90% of the respondents indicated that they had learned something about the library, that they thought this was a good idea, and that it was something we should do again.
Finally, we offered players points and free coffee to come into the library and spend 15-20 minutes talking to us about their experience playing the game. We kept questions short and simple to stay within the time window: we asked about overall impressions of the game, whether the students would change anything, whether they learned anything (and if so, what), and which quests they liked or didn’t like and why. The general tone of the feedback was very positive. Students seemed intrigued by the idea and appreciated that the library was trying to teach in nontraditional, self-directed ways. When asked to sum up their overall impressions of the game, students said things like “Very well done, but could be improved upon” or “good but needs polish,” or my personal favorite: “an effective use of bribery to learn about the library.”
One of the things we asked people was whether the game had changed how they thought about the library. They typically answered that it wasn’t so much that the game had changed how they thought about the library as that it had changed the way they thought about themselves in relation to it. They used words like “aware,” “confident,” and “knowledgeable.” They felt like they knew more about what they could do here and what we could do for them. Their retention of some of the quest content was remarkable, including library-specific lingo and knowledge of specific procedures (like how to use the retrieval system and how document delivery worked).
Players noted a variety of problems with the game. Some were technical in nature: the game app takes a long time to load, likely because of the way the back-end is designed; some players didn’t like the Facebook login; and stability on Android devices was problematic (no surprise, as the Android version was by far the more troublesome part of developing the app). Other problems were nontechnical, including quest content that didn’t work or took too long (my own lack of experience designing quests is to blame), communication issues (there was no way for players to let us know when quest content didn’t work), the flow and pacing of new quests (players wanted more content, faster), and marketing issues. These problems may in part account for the low onboarding numbers in terms of players who actually completed content.
Players also had a variety of reasons for playing. While most cited the iPad grand prize as the major motivator, several said they wanted to learn about the library, or were curious about the game and thought it might be fun. This may explain the differing reactions to the quest content survey that so confused me: people who just wanted to have fun were irked by quests with an overt educational goal; students who just wanted the iPad didn’t want to do lengthy or complex quests; and students who loved games for their own sake wanted very hard quests that challenged them. This diversity of motivations is something all game developers struggle with, and it makes designing a single game that appeals to a wide variety of people a real challenge.
Where to Go from Here
Deciding whether or not LibraryQuest has been successful depends greatly on the angle from which you look at the results. On one hand, the game absolutely taught people things. Students in the survey and interviews were able to list concrete things they knew how to do, often in detail and using terminology directly from the game. One student proudly showed us a book she had gotten through interlibrary loan, a service she hadn’t known how to use before she played. On the other hand, overall participation was low, especially when contrasted against the expense and staff time of creating and running the game. Looking only at the money spent on development, prizes, and advertising (approximately $14,700), that works out to about $85 per student reached (the 173 who earned points). The challenge is creating engaging games that appeal to a large number of students in a way that’s economical in terms of staff time and resources.
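For anyone who wants to check or reuse the math, the cost-per-player figure is just the total spend divided by whichever participation number you consider meaningful:

```python
# Back-of-the-envelope cost figures from the numbers above.
total_cost = 14_700        # development, prizes, and advertising (USD)
engaged_players = 173      # completed at least one quest
registered_players = 397   # logged in at least once

print(round(total_cost / engaged_players))     # ~85 USD per engaged player
print(round(total_cost / registered_players))  # ~37 USD per registered player
```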
After looking at all of this data and talking to Yeti CGI, our development partners, we feel the results are promising enough that we should continue to experiment. Both organizations believe there is still a great deal to learn about making games in physical space, and that we’ve just scratched the surface of what we might be able to do. With the lessons we have learned from this round of the game, we are looking to completely redesign the way the game app works, as well as revise the game into a shorter, leaner experience that does not require as much content or run as long. In addition, we are seeking campus partners who would be interested in using the app in classes, as part of student life events, or in campus orientation. Even if these events don’t directly involve the library, we can learn from them how to design better quest content for the library’s own use. Embedding the app in smaller, more fixed events should also help with marketing and cost issues.
Because app development is so expensive, we are looking into the possibility of a research partnership with Yeti CGI. Both of us would benefit from learning more about how mobile gaming works in physical space, and sharing those lessons would get us Yeti’s help rebuilding the app and figuring out content creation and pacing, without another huge outlay of development capital. We are also looking at ways to turn the game development itself into an educational opportunity. By working with our campus mobile app development lab, we can provide opportunities for GVSU students to learn app design. Yeti is looking at making more of the game’s technical architecture open (for example, we are thinking about having all quest content marked up in XML) so that students can build custom interfaces and tools for the game; a rough sketch of what that might look like follows.
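To make the open-markup idea concrete, here is a hypothetical sketch of quest content as XML, plus a few lines of Python that consume it. The element and attribute names are invented for illustration; no actual schema has been designed yet.

```python
import xml.etree.ElementTree as ET

# Hypothetical quest markup -- the element and attribute names are
# invented for illustration, not a real LibraryQuest schema.
QUEST_XML = """
<quest id="raiders-of-the-lost-bin" points="30">
  <name>Raiders of the Lost...Bin</name>
  <objective>Learn to request items from automated storage</objective>
  <step>Find the special catalog record for the hidden crowns</step>
  <step>Request a crown from the automated storage and retrieval system</step>
  <step>Enter the completion code tagged on the crown</step>
</quest>
"""

# A student-built tool could parse this and render it however it liked:
# a web page, a print handout, an alternate mobile client.
quest = ET.fromstring(QUEST_XML)
print(f"{quest.find('name').text} ({quest.get('points')} points)")
for number, step in enumerate(quest.findall("step"), start=1):
    print(f"  {number}. {step.text}")
```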
Finally, we are looking at grants to support running and revising the game. Our initial advertising and incentives budgets were very low, and we are curious to see what would happen if we put significant resources into those areas. Would we see bigger numbers? Would other kinds of rewards in addition to the iPad (something students asked for) entice players into completing more quest content? Understanding exactly how much money needs to go into incentives and advertising would help quantify the total cost of running a large, open game, which would be valuable information for other libraries contemplating similar projects.
Get the LibraryQuest App: (iPhone, Android)
About our Guest Author:
Kyle Felker is the Digital Initiatives Librarian at Grand Valley State University Libraries, where he has worked since February of 2012. He is also a longtime gamer. He can be reached at felkerk@gvsu.edu, or on twitter @gwydion9.