Is this Article Scholarly?
Authors: Emily Hamstra, Diana Perpich, Melissa Gomis, Jamie Vander Broek, Anthony Whyte
Institution: University of Michigan
Interviewees: Emily Hamstra & Diana Perpich
Interviewer: Lindy Scripps-Hoekstra
Tutorial Description: This straightforward tool was created for undergraduates who need a basic understanding of the concept of scholarly literature. By asking three short questions, this tool introduces students to the complexity of scholarly literature while guiding them toward determining whether an article is scholarly or not.
Q: What was the motivation behind creating this tool? What were your goals?
A: The impetus for this project came from the University of Michigan Library’s Associate University Librarian for Learning & Teaching, Laurie Alexander. In the fall of 2013, she created three groups to explore library instruction outside of the traditional one-shot. One group explored course-integrated library instruction, the second explored for-credit instruction, and the third explored learning-on-demand.
As the group exploring learning-on-demand, we wanted to create a tool or module that would allow students to learn a key information literacy topic at their point of need. Our goal was to create a simple tool or module for undergraduate students who have 5-10 minutes to learn the key elements of a scholarly article, with links out to additional resources if a student has more time or questions.
Q: What led you to decide upon the format of a short survey?
A: We created the tool as a survey because it is interactive for the learner, and easy to maintain for us. Qualtrics is accessible, and we wanted to make sure that the tool we created could be used by all our users.
Q: Can you explain the roles of the different authors in the creation of this tool?
A: This tool was created by a group of instruction librarians and technologists. Librarians with design skills mocked up the design, and technologists built the tool. Instruction librarians created content for additional learning. We all collaborated on the wording of the tool.
Q: Did you involve any faculty, students, or other campus partners in the development process?
A: We did user testing with undergraduate students. The user testing was designed by a graduate student who was hired on an hourly basis to assist with this project. During the testing, we gave students copies of articles, both scholarly and non-scholarly, and asked them to use the tool to discern whether or not each article was scholarly. Based on this testing, we refined the wording of the questions, particularly the question that asks about the headings of the article.
Q: Approximately how long did it take to create this tool?
A: We met 2 hours per month for 9 months to create, tweak, test, and roll out the tool.
Q: What were some of the positives and negatives of working with a survey tool like Qualtrics for instructional purposes?
A: There are a lot of positives to creating a tool like this in Qualtrics, mainly that it is easy to build. Qualtrics provides lots of survey flow options without requiring us to know how to code or go hunting for third-party plugins. Qualtrics maintains the survey, builds in accessibility testing, and provides tips for creating an accessible survey. Qualtrics also allows us to share the survey tool outside of our organization.
There are some negatives, mainly limits on branding and styling without HTML expertise or special licensing.
Q: How is this tool being used at your institution? How has it been received?
A: We have made the tool available on the Undergraduate Library Homepage and on LibGuides aimed at lower-level undergraduates. Colleagues on campus have responded positively to the template and concept. We’re still ramping up use among librarians and student users.
Q: Have you done any assessment of the effectiveness of this tool? Since it was built on a survey platform, are you able to monitor usage or response selections?
A: Outside of the user testing, we have not done any formal assessment of the tool. We can monitor how often it’s accessed, when it’s accessed, and from where it’s accessed (via IP address), but we are not yet collecting that data, or data on how useful users find the tool. We hide the submit button because we can give users adaptive feedback without their officially submitting responses. We did that to keep the tool flexible, but it meant giving up the chance to ask “was this useful?” feedback. Perhaps in a later version.
Q: Do you have any advice for those considering the development of similar projects?
A: Use rapid development: work through iterative cycles of needs assessment, design, implementation, and testing, then reassessment, redesign, and so on. Also, you can’t just put something up online and assume people will find and use it. We didn’t put as many resources toward promoting and sharing the tool as we might have, which left a bit of a time gap between the tool’s availability and its adoption.
April 2015 PRIMO Site of the Month