Taking a trek with SCVNGR: Developing asynchronous, mobile orientations and instruction for campus

Embedding the library in campus-wide orientations, as well as developing standalone library orientations, is often part of outreach and first-year experience work. Reaching all students can be a challenge, so finding better opportunities to engage campus helps promote the library and increase student awareness. Using a mobile app for orientations offers many benefits, such as increased interactivity and an asynchronous option for students to learn about the library on their own time. We have been trying out SCVNGR at the University of Arizona (UA) Libraries and are finding it a more fun and engaging way to deliver orientations and instruction to students.

Why use game design for library orientations and instruction?

Game-based learning can be a good match for orientations, just as it can be for instruction (I have explored this on ACRL TechConnect previously, looking at badges). Rather than just presenting a large amount of information to students or having them fill out a paper-based scavenger hunt, using something like SCVNGR gets students interacting with the library in a way that offers more engagement and real-time feedback. However, simply adding a layer of points, badges, or other game mechanics to a non-game situation doesn’t automatically make it fun and engaging for students. In fact, doing this ineffectively can cause more harm than good (Nicholson, 2012). The common goal in using game design for orientations and instruction is to motivate participants beyond simply acquiring points. Thinking of the WIIFM (What’s In It For Me) principle from a student’s perspective can help; in the game design we used at the University of Arizona with SCVNGR for a class orientation, we created activities based on common questions and concerns of students.

Why SCVNGR?
scvngr home screen

SCVNGR is a mobile app game for iPhone and Android in which players complete challenges in specific locations. Rather than following clues and hints as in a traditional scavenger hunt, the game focuses on activities within a location rather than on finding the location itself. Although this takes some of the mystery away, it works very well for introducing people to locations that are new to them and having them interact with the space.

Students need to be physically present in the location for the app to work. From there, they use the location to search for “challenges” (single activities to complete) or “treks” (a series of challenges that make up the full experience for a location), and complete them to earn points, badges, and recognition.

Some libraries have built their own mobile scavenger hunt activities without the aid of a paid app. For example, North Carolina State University uses the NCSU Libraries’ Mobile Scavenger Hunt, a combination of students recording responses in Evernote, real-time interaction, and tracking by librarians. One of the reasons we went with SCVNGR, however, is that this sort of mobile orientation requires a good amount of librarian time and is synchronous, whereas SCVNGR requires less face-to-face librarian time and allows for asynchronous student participation. Although we do use synchronous instruction for some of our classes, we also wanted the option for asynchronous activities, particularly for large-scale orientations where many different groups come in at many different times. Although SCVNGR is not free for us, the app is free to students. The company offers 24/7 support, and other academic institutions share insight and ideas in a community for universities.

Other academic libraries have used SCVNGR for orientations and even library instruction. A few examples are:

 

How did the UA Libraries use SCVNGR?

Because a lot of instruction has moved online and there are so many students to reach, we are working on SCVNGR treks for both instruction and basic orientations at the University of Arizona (UA). We are in the process of setting up treks for large-scale campus orientations (New Student Orientation, UA Up Close for both parents and students, etc.) that take place during the summer, and we have tested SCVNGR out on a smaller scale as a pilot for individual classes. There tends to be greater success and engagement if the trek is tied to something, such as a class assignment or a required portion of an orientation session. One concern with an app-based activity is that not all students will have smartphones. We alleviated this by putting students into groups ahead of time, ensuring that at least one person in each group had a device compatible with SCVNGR. We also lend technology at the UA Libraries, so if a group was without a smartphone or tablet, they could check one out from the library.

trek page for ais197b at ua libraries

We first piloted a trek with an American Indian Studies student success course (AIS197b). This course for freshmen introduces students to services on campus that will be useful to them while they are at the UA. Last year, we presented a quick information session on library services and then had students complete a pencil-and-paper scavenger hunt throughout the library for a class grade (participation points). Although they seemed glad to get out and move around, it didn’t seem particularly fun or engaging. On top of that, every time students got stuck or had a question, they had to come back to the main floor to find librarians and get help. In contrast, when students get an answer wrong in SCVNGR, programmed feedback guides them to the correct information. And because they don’t need clues to make it to the next step (they just go back and select the next challenge in the trek), one mistake doesn’t prevent them from moving on to the next activity. This semester, we first presented a brief instruction session (approximately 15-20 minutes) and then let students get started on SCVNGR.

You can see in the screenshot below how question design works: you select the location, how many points the activity is worth, and the type of activity (taking a photo, answering with text, or scanning a QR code), and then provide feedback. If a student answers a question incorrectly, as mentioned above, they receive feedback to help them figure out the correct answer. I really like that when students get answers right, they know instantly; this positive reinforcement encourages them to continue.

scvngr answer feedback
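To make the pieces of a single challenge concrete (a location, a point value, an activity type, an expected answer, and feedback for wrong attempts), here is a rough sketch of how one might be modeled. This is purely a hypothetical data model, not SCVNGR’s actual format or API, and the example answer and point value are made up for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Challenge:
    """Hypothetical model of a single SCVNGR-style challenge (illustration only)."""
    name: str
    location: str                          # where students must physically be
    points: int                            # points awarded on completion
    activity_type: str                     # "text", "photo", or "qr"
    correct_answer: Optional[str] = None   # used only for text challenges
    feedback_if_wrong: str = ""            # hint shown after an incorrect attempt

def check_text_answer(challenge: Challenge, submitted: str) -> Tuple[bool, str]:
    """Return (correct?, feedback) for a text-based challenge."""
    if challenge.activity_type != "text":
        return True, "Photo and QR challenges are completed by submitting the image or scan."
    if submitted.strip().lower() == (challenge.correct_answer or "").lower():
        return True, f"Correct! You earned {challenge.points} points."
    return False, challenge.feedback_if_wrong

# Example loosely modeled on the "Prints for a day" challenge described below;
# the answer text is a placeholder, not the real answer.
printing = Challenge(
    name="Prints for a day",
    location="UA Main Library, Express Documents Center",
    points=10,
    activity_type="text",
    correct_answer="with your CatCard",
    feedback_if_wrong="Ask the Express Documents Center staff how printing is paid for.",
)
print(check_text_answer(printing, "With your CatCard"))  # (True, 'Correct! You earned 10 points.')
```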

The activities designed for students in this class focused on photo and text-based challenges. We stayed away from QR codes because they can be finicky with some phones, and because simply taking a picture of the QR code is all that is needed to complete that type of challenge. Our challenges included:

  • Meet the reference desk (above): Students meet desk staff and ask how they can get in touch for reference assistance; answers are by text and students type in which method they think they would use the most: email, chat, phone, or in person.
  • Prints for a day: Students find out about printing (a frequent question of new students), and text in how to pay for printing after finding the information at the Express Documents Center.
  • Playing favorites: Students wander around the library and find their favorite study spot. Taking a picture completes the challenge, and all images are collected in the Trek’s statistics.
  • Found in the stacks: After learning how to use the catalog (we provided a brief instruction session to this class before setting them loose), students search the catalog for books on a topic they are interested in, then locate the book on the shelf and take a picture. One student used this time to find books for another class and was really glad he got some practice.
  • A room of one’s own: The UA Libraries implemented online study room reservations a year ago. To introduce this new option to students, this challenge had them use their smartphones to go to the mobile reservation page, find the maximum number of hours a study room can be reserved for, and text that in.

SCVNGR worked great with this class for simple tasks, such as meeting people at the reference desk, finding a book, or taking a picture of a favorite study spot, but for tasks requiring more critical thinking or more intricate work, it would not be the best platform. SCVNGR’s options for having students respond to questions or complete an activity are limited: texting in detailed answers or engaging in tasks like searching a database would be much harder to record. Likewise, instruction tied to critical thinking is not particularly location-based (evaluating a source or exploring copyright issues, for example), so it would be hard to tie those tasks and skills to a location-based activity that can be tracked. One instance of this was the Found in the Stacks challenge: students were supposed to search for a book in the catalog and then locate it on the shelf, but nothing stopped them from just finding a random book on the shelf and taking a picture of it to complete the challenge. SCVNGR provides a style guide to help with game design, and the overall message of that document is that simplicity is most effective on this platform.

Another feature that works well is being able to choose whether the trek is competitive, and to use “SmartRoute,” which orders challenges for participants based on distance and least-crowded areas. This is wonderful, because students tend to get congested at certain points in a scavenger hunt: they all crowd around the same materials or locations at the same time because they’re making the same progress through the activity. We chose to use SmartRoute for this class so students would be spread out during the game.

scvngr trek settings
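SCVNGR doesn’t publish how SmartRoute works, but the underlying idea, sending each group to a nearby challenge that few other groups are currently working on, can be sketched as a simple scoring heuristic. This is purely a conceptual illustration, not SCVNGR’s actual algorithm, and the locations and weights are made up.

```python
import math

def next_challenge(group_pos, remaining, crowd_counts, crowd_weight=50.0):
    """Pick the next challenge for a group: nearest location, penalized by crowding.

    group_pos    -- (x, y) position of the group
    remaining    -- dict of challenge name -> (x, y) location
    crowd_counts -- dict of challenge name -> number of groups currently there
    crowd_weight -- how strongly crowding is penalized relative to distance
    """
    def score(name):
        x, y = remaining[name]
        distance = math.hypot(x - group_pos[0], y - group_pos[1])
        return distance + crowd_weight * crowd_counts.get(name, 0)
    return min(remaining, key=score)

# Two groups starting from the same entrance get sent to different challenges
locations = {"Reference desk": (0, 10), "Express Documents Center": (5, 12), "Stacks": (40, 3)}
print(next_challenge((0, 0), locations, {"Reference desk": 0}))  # Reference desk
print(next_challenge((0, 0), locations, {"Reference desk": 1}))  # Express Documents Center
```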

To assess student effort and the impact of the trek, you can look at stats and rankings: it’s possible to view a specific student’s progress, all activity by all participants, and rankings organized by points.

scvngr statistics

Another feature is the ability to collect items submitted for challenges (particularly pictures). One of our challenges asks students to find their favorite study spot in the library and take a picture of it. This should be fun for them to think about, is fairly easy, and helps us do some space assessment. It’s then possible to collect pictures like the following (the student’s privacy is protected via purple blob).

student images of ua main library via scvngr

On the topic of privacy, students enter their name to set up an account, but only their first name and the first initial of their last name appear as their username. Although last names are hidden, SCVNGR data is viewable by anyone within geographical range of the challenge: it is not closed to an institution. If students choose to take pictures of themselves, their identity may be revealed, but they can maintain some privacy by not sharing images of specific individuals or personal information in text responses. On the other hand, while we may not want to associate individual students with their specific activities, things get trickier when an instructor plans to award points for participation. In that case, it’s possible to request reports from SCVNGR so instructors can see how much, and which, students participated. In a large class of over 100 students, the data can get messy, particularly if students share the same first name and last initial. Because of this issue, SCVNGR might be better suited to large-scale orientations where participation does not need to be tracked, and to small classes where instructors can easily tell who is who in the activity data.

Lessons learned

Both student and instructor feedback was very positive. Students seemed to be having fun, were laughing, and did not get stuck nearly as much as with the previous year’s pencil-and-paper hunt. The instructor noted it seemed a lot more streamlined and engaging for the class. When students checked in with us at the end before heading out, they said they enjoyed the activity, and although there were a couple of hiccups with the software and/or how we designed the trek, they felt it was a good experience and said they were more comfortable using the library.

Next time, I would be more careful about using text responses. I had gone down to our printing center to tell the student worker on duty what answers the class would be looking for so she could help them, but students wound up speaking with someone else and getting different answers. Otherwise, the level of questions seemed appropriate for this class, and it was a good way to pilot how SCVNGR works, whether students might like it, and how long different types of questions take before bringing this to campus on a larger scale. I would also be cautious about leaning on SCVNGR too heavily for instruction, since it doesn’t have the capabilities for more complex tasks or a great deal of critical thinking. It is better suited to basic instruction and getting students more comfortable using the library.

Pros

  • Ability to reach many students, and to reach them asynchronously
  • Anyone can complete challenges and treks; this is great for prospective students and families, community groups, and any programs doing outreach or partnerships outside of campus since a university login is not required.
  • Can be coordinated with campus-wide treks if other units have accounts or a university-wide license is purchased.
  • WYSIWYG interface, no programming skills necessary
  • The order of challenges in a trek can be staggered so not everyone is competing for the same resources at the same time.
  • Can collect useful data through users submitting photos and comments (for example, we can examine library space and student use by seeing where students’ favorite spots to study are).

Cons

  • SCVNGR is not free to use; an annual fee applies (in the $900 range for a library-only license, which is not institution-wide).
  • Privacy is a concern since anyone can see activity in a location; it’s not possible to close this to campus.
  • When completing a trek, users do not get automatic prompts to proceed to the next challenge; instead, they must go back to the home location screen and choose the next challenge (this can get a little confusing for students).
  • SCVNGR is more difficult to use for instruction, especially when looking to incorporate critical thinking and more complex activities.
  • Instructors might have a harder time figuring out how to grade participation because treks are open to anyone; only students’ first names and last initials appear, so if a large class completes a trek for an assignment, or if an orientation trek open to the public is used, a special report must be requested from SCVNGR, which the library can then send to the instructor for grading purposes.

 

Conclusion

SCVNGR is a good way to increase awareness and get students and other groups comfortable using the library. One of the main benefits is that it’s asynchronous, so a great deal of library staff time is not required to get people interacting with services, collections, and space. Although this platform is not suited to more in-depth instruction, it works well at the basic orientation level, and the students and instructor in the course we piloted it with had a good experience.

 

References

Nicholson, S. (2012). A user-centered theoretical framework for meaningful gamification. Paper presented at Games+Learning+Society 8.0, Madison, WI. Retrieved from http://scottnicholson.com/pubs/meaningfulframework.pdf

—-

About Our Guest Author: Nicole Pagowsky is an Instructional Services Librarian at the University of Arizona where she explores game-based learning, student retention, and UX. You can find her on Twitter, @pumpedlibrarian.

Test-driving Purdue’s Passport gamification platform for library instruction

Gamification in libraries has become a topic of interest in the professional discourse, and one that ACRL TechConnect has covered in Applying Game Dynamics to Library Services and Why Gamify and What to Avoid in Gamification. Much of what has been written about badging systems in libraries pertains to gamifying library services. However, being an Instructional Services Librarian, I have been interested in tying gamification to library instruction.

Library skills are not always part of required learning outcomes or directly associated with particular classes, so thinking more creatively about promoting and embedding library tutorials prompted my interest in tying a badging system to the University of Arizona Libraries’ online learning objects. As a brief review, badges are visual representations of skills and achievements. They can be used with or instead of grades, depending on the scenario, and include details to support their credibility (criteria, issuer, evidence, currency).

From classhack.com
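To picture the credibility details mentioned above (criteria, issuer, evidence, currency), a badge can be thought of as a small record that carries those fields alongside the image. The sketch below is a generic illustration with placeholder URLs; the field names are not the Open Badges specification itself.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Badge:
    """Generic badge record; field names are illustrative, not the Open Badges spec."""
    name: str
    issuer: str                                              # who vouches for the achievement
    criteria_url: str                                        # what the earner had to do
    evidence_urls: List[str] = field(default_factory=list)   # links to the earner's actual work
    issued_on: date = field(default_factory=date.today)      # currency: when it was earned

# Placeholder URLs for illustration only
research_initiator = Badge(
    name="Research Initiator",
    issuer="University of Arizona Libraries",
    criteria_url="https://example.edu/badges/research-initiator/criteria",
    evidence_urls=["https://example.edu/badges/research-initiator/quiz-results"],
)
print(research_initiator.name, research_initiator.issued_on)
```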

Becoming a beta tester for Purdue’s Passport platform has given me the opportunity to better sketch out our plans and to test how gamification could work in this context. Passport, according to Purdue, is “A learning system that demonstrates academic achievement through customizable badges.” Through this platform, instructors can design instruction so that badges are associated with learning outcomes. Currently, Passport can only be used by applying to be a beta tester. As the software improves, it should become available to more people and have greater integration (it currently connects with the Mozilla Open Badges Backpack and within the Purdue system). We are still comparing platforms and possibilities for the University of Arizona Libraries, and testing Passport has been the first step in figuring out what we want, what is available, and how we would like to design this form of instruction. I will share my impressions of Passport and of using badging technology for these purposes from my experience with the software.

Refresher on motivation

It’s important to understand how motivation works in relation to a points-and-badges system, while also having a clear goal in mind. I recently wrote a literature review on motivation in gamified learning scenarios as part of my work toward a second master’s degree in Educational Technology. The general takeaways are the importance of incorporating game mechanics thoughtfully into your framework so users don’t rely solely on the scoring system, and of focusing on the engagement aspects of gamification rather than using badges and points just for manipulation. Points should serve as a feedback mechanism rather than simply being promoted as items to harvest.

Structure and scalability

Putting this into perspective for gamifying library instruction at the University of Arizona, we want to be sure student motivation is directed at developing research skills that can be visually demonstrated to instructors and future employers through badges, with points serving as feedback and further motivation. We are using the ACRL Information Literacy Standards as an outline for the badges we create; the Standards are not perfect, but they serve well as a map for conceptualizing research skills and are a way we can organize the content. Within each skill set or badge, activities for completion are multidimensional: students must engage in a variety of tasks, such as doing a tutorial, reading a related article or news story, and completing a quiz. We plan to allow for risk taking and failure — important aspects of game design — so students can re-try the material until they understand it (Gee, 2007).

As you can see in this screen capture, the badges corresponding to the ACRL Standards include Research Initiator (Standard 1), Research Assailant (Standard 2), Research Investigator (Standard 3), and Research Warrior (Standard 4). As a note, I have not yet created a badge for Standard 5 or one to correspond with our orientations (also, all names visible in any image I include are of my colleagues trying out the badges, not of students). A great aspect of this platform is the ability to design your own badges with its WYSIWYG editor.

Main challenge screen

Because a major issue for us is scalability with limited FTE, we have to be careful which assessment methods we choose for approving badges. Since we would have a hard time offering meaningful, individualized feedback to every student who completes these tasks, something automatic is more practical. Passport gives students several options for demonstrating their skills: multiple-choice quizzes, uploading a document, and entering text. For our purposes, multiple-choice quizzes with predetermined responses are currently the best method. If we develop specific badges for smaller courses on a case-by-case basis, it might be possible to accept written responses and more detailed work, but to roll this out to campus at large, automated scoring is necessary.
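Since automated scoring is what makes this scalable, the core of it is nothing more than comparing submitted choices against a predetermined answer key and applying a pass threshold. Here is a minimal sketch of that idea (not Passport’s actual internals; the questions, answers, and threshold are invented for illustration):

```python
def grade_quiz(answer_key, submitted, passing=0.8):
    """Grade a multiple-choice quiz against a predetermined key.

    answer_key -- dict of question id -> correct choice, e.g. {"q1": "b"}
    submitted  -- dict of question id -> the student's choice
    passing    -- fraction of questions that must be correct to pass
    Returns (passed, score) so the task can be marked complete automatically.
    """
    correct = sum(1 for q, answer in answer_key.items() if submitted.get(q) == answer)
    score = correct / len(answer_key)
    return score >= passing, score

# A student can re-try until they pass (risk taking and failure are allowed)
key = {"q1": "b", "q2": "d", "q3": "a"}
print(grade_quiz(key, {"q1": "b", "q2": "c", "q3": "a"}))  # (False, 0.666...)
print(grade_quiz(key, {"q1": "b", "q2": "d", "q3": "a"}))  # (True, 1.0)
```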

Leveling up

Within each badge, also referred to as a challenge, there are tasks to complete, and finishing these tasks adds up to earning the badge. It’s essentially leveling up (progressing to the next level based on achievement), although the way Passport is designed, students can complete the tasks in any order. Within the suite of badges, I have reinforced information and skills throughout, so students must use previously learned skills for future success. In this screen capture, you can see the overall layout by task title.

Task progress by users
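Because tasks within a challenge can be completed in any order, badge progress amounts to set membership: once the set of finished tasks covers every required task, the badge is earned. A minimal sketch of that bookkeeping follows; again, this is an illustration, not Passport’s implementation, and the task names are placeholders.

```python
class BadgeProgress:
    """Track which tasks a student has finished for one badge, in any order."""

    def __init__(self, badge_name, required_tasks):
        self.badge_name = badge_name
        self.required = set(required_tasks)
        self.completed = set()

    def complete(self, task):
        """Record a finished task; order doesn't matter and repeats are harmless."""
        if task in self.required:
            self.completed.add(task)

    @property
    def earned(self):
        return self.required <= self.completed

# Example: tasks for a badge can be done in any order
progress = BadgeProgress("Research Initiator", ["tutorial", "reading", "quiz"])
progress.complete("quiz")
progress.complete("tutorial")
print(progress.earned)   # False -- the reading is still outstanding
progress.complete("reading")
print(progress.earned)   # True -- badge earned
```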

When including tasks that require instructor approval (if students were to submit documents or write text), an instructor would click on each yellow box stating that approval is needed, determine whether the student successfully completed the task, and supply personalized feedback (image above). You can also see the breakdown of tasks under each challenge to review what was learned; this can serve as confirmation for outside parties of what kind of work each badge entailed (image below).

Badge work details

Showing off

Once badges are earned, they can be displayed in a user’s Passport profile and Mozilla Open Badges. Here is an example of what a badge portfolio looks like:

User badge portfolio

Passport “classrooms” are closed and require a login to earn badges (for FERPA compliance), but if students agree to connect with Mozilla’s Open Badges Backpack, achievements can then be shared on Twitter, Facebook, LinkedIn, and other networks. Badges can also connect with e-portfolios and resumes (since Passport is in beta, this functionality works best with Purdue platforms). This could be a great additional motivator for students in helping them get jobs. From Project Information Literacy, we know employers find new graduates lacking in research skills, so being able to present these skills as fulfilled to future employers can be useful for soon-to-be and recent graduates. As mentioned, the badges link back to more information so employers can get more detail, and students can even make their submitted work publicly available so employers, instructors, and peers can see their efforts.

Wrapping up

Whether or not it is possible to integrate Passport fully into our library website for students to access, using this tool has at least given me a way to sketch out how our badging system will work. We can also do some user testing with students on these tasks to gauge motivation and instructional effectiveness. Having this system become campus-wide in collaboration with other units and departments would also create more meaning behind the badges; in the meantime, tying this smaller-scale layout to specific class instruction or non-disciplinary collaborations will be very useful.

Although some sources say gamification will take a huge nosedive by 2014 due to poor design and over-saturation, keeping tabs on other platforms and on how best to incorporate this technology into library instruction is where I will be looking this semester and beyond as we work on plans for rolling out a full badging system within the next couple of years. Making learning more experiential and creating choose-your-own-adventure scenarios are effective ways to give students ownership over their education. Using points and badges to manipulate users is certainly detrimental and should fall out of use in the near future, but using this framework in a positive manner to motivate and support student learning can have beneficial effects for students, campus, and the library.

Additional Resources

Books:

Dignan, A. (2012). Game Frame. New York: The Free Press.

Gee, J. P. (2007). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.

Kapp, K. M. (2012). The gamification of learning and instruction: Game-based methods and strategies for training and education. San Francisco, CA: Pfeiffer.

Koster, R. (2005). A theory of fun for game design. Scottsdale, AZ: Paraglyph Press.

Online:

Because Play Matters: A game lab dedicated to transformative games and play for informal learning environments in the iSchool at Syracuse: http://becauseplaymatters.com/

Digital badges show students’ skills along with degree (Purdue News): http://www.purdue.edu/newsroom/releases/2012/Q3/digital-badges-show-students-skills-along-with-degree.html

Gamification Research Network: http://gamification-research.org/

TL-DR: Where gamers and information collide: http://tl-dr.ca/

—-

About Our Guest Author: Nicole Pagowsky is an Instructional Services Librarian at the University of Arizona where she explores game-based learning, student retention, and UX. You can find her on Twitter, @pumpedlibrarian.