Author: Erica DeFrain
Institution: University of Vermont
Interviewee: Erica DeFrain
Interviewer: Maribeth Slebodnik
Using the University of Arizona’s Guide on the Side software, this tutorial guides learners through the process of evaluating sources, using real examples of sites that have purposefully and successfully misled the public.
Q: What led you to develop Evaluating Information?
A: Evaluating Information was part of a series of tutorials I helped develop specifically for the University of Vermont’s freshman English composition courses. The library is very involved with general education at UVM and works closely with these classes every semester. I was asked to create online learning objects to help teach some of the skills that had traditionally been covered in face-to-face sessions, primarily as a way to free up class time for more targeted instruction. My colleagues and I identified six key information literacy themes we wanted to introduce in the objects, and this was one of them (see more at http://tinyurl.com/uvmComp).
Q: How did you develop the content/structure/organization of the tutorial?
A: Before I started, I talked to librarians in my department and mined our existing documents to identify the strategies we generally recommend when working with students on this topic. Because many different instructors use this, I wanted to ensure that it was appropriate for general use. For this and most of my instructional design projects, I tend to follow Gagne’s 9 Events of Instruction. (Editor’s note: See here and here for background on Gagne’s 9 Events.) I think it’s very helpful for developing a good pace for standalone online materials, and it’s also very learner-centered. Once I figured out my opening “hook” for gaining attention (using the Tree Octopus), I storyboarded everything in outline form with Google Docs so I could easily move things around before transitioning it into the Guide on the Side.
Q: What led you to choose the Guide on the Side technology?
A: I love the Guide on the Side (GotS)! I was working at the University of Arizona Libraries when GotS was released as open source last summer, and I feel so lucky to know the incredible team who developed the tool. I love it for its simplicity and versatility, and it is really important to me that the tutorials I develop can be easily updated and don’t require a lot of technical know-how to maintain. Of all the tools I’ve used, this is by far my favorite. (Editor’s note: Go here to download the code and find additional examples and more information about using Guide on the Side for tutorials.)
Q: What specific best practices did you follow to develop Evaluating Information?
A: Jakob Nielsen’s recommendations on writing for the web have been very influential. I try hard to limit text as much as possible, and make it easy to scan by using bullets and bolding important concepts. The GotS interface is great in that it really forces you to think critically about how you structure the content. I especially love the definition box function, which enables you to provide more information for learners who might need it while keeping it out of the way of those who don’t.
Q: From initial plan to final release, how long did it take to complete this tutorial?
A: I think 20 hours spread over the course of a few weeks would be an honest estimate, but about half that was just brainstorming different ideas and ways to approach the topic. I tend to have three or four different ideas that I kick around until one floats to the top. Again, GotS is a perfect tool for experimenting with different instructional ideas because it’s so simple to use. Some programs are complex enough that once you’ve committed to a path it’s difficult to make significant changes without having to spend a good deal of time reprogramming. With GotS, I can experiment, test, and tweak without too much trouble.
Q: What challenges did you encounter or lessons did you learn?
A: My biggest challenge continues to be figuring out how to represent higher order concepts related to evaluating information. Ultimately I ended up presenting three hoax sites, which covers credibility and is certainly the most entertaining aspect, but I think students struggle more with the issue of relevance, especially when tasked with working with scholarly materials. If someone has any ideas for how to introduce this in a way that fits with the tone of this guide, please contact me!
Q: Were faculty members or students consulted as part of the process of designing and implementing Evaluating Information?
A: I received feedback from my department colleagues first, and they were really great at catching typos and giving constructive feedback. Once I had incorporated their recommendations, we sent it to faculty and graduate assistants in the English department who would be teaching the classes. A few instructors tested it out in their classes before it was assigned to the ENGS 001 classes, and that was really helpful in ensuring that the technology was working and there weren’t any major complaints or criticisms.
Q: What has the response to Evaluating Information been? For example, what kind of response have you gotten to the “hoax” aspect of the examples you chose?
A: I’m relieved to say the response has been really positive! It’s always hard to release materials into the “wild” of the Internet, especially given that GotS natively invites learners to submit anonymous feedback at the end. I think this feature is incredibly important in terms of quality control and assessing the instruction, and learners generally aren’t shy. Of all the guides I’ve created, this one garnered the most responses, and nearly all were positive and enthusiastic about its humor and tone. Ultimately, I think this guide accomplished exactly what I had set out to do: I wanted a conversation piece that was memorable – something that was rooted in real life but also gave a nod to scholarly research. I wanted it to leave an impression, so that after students completed it, you could simply say the words “Tree Octopus” in class and they would immediately know what you were talking about. In practice this makes for a great icebreaker, which then allows you to discuss the more serious aspects of evaluating information that might have been overlooked.
Q: How is Evaluating Information being promoted and used at your institution? Is it integrated into classes, workshops, assignments, etc.?
A: It is used every semester in conjunction with the ENGS 001 freshman composition courses, where students are required to complete it as part of a series of tutorials. We also provide access through our LibGuides and promote it directly to faculty.
Q: Do you have any recommendations or advice for someone beginning to contemplate or plan a similar project?
A: Have fun with it, and anticipate many iterations! GotS makes it easy to edit, so keep an eye out for ways you can update your content with current examples. I don’t intend to use the Tree Octopus or Abe Lincoln and Facebook every year. There’s always something new and interesting happening in the world that could be included, and there’s no reason to let it grow stale.
Q: Is there anything else you’d like to tell us about Evaluating Information?
A: Please feel free to use, repurpose, or find inspiration in this guide or any others from UVM (see http://tinyurl.com/uvmGots). I would love to see how others approach this topic!
April 2014 PRIMO Site of the Month