Critical Information Literacy Laboratory for Faculty
Authors: Gina Schlesselman-Tarango, Barbara Quarton, Claudia Tristan
Institution: California State University, San Bernardino
Interviewee: Gina Schlesselman-Tarango
Interviewer: Marcia Rapchak
Tutorial Description (provided by the author): Critical Information Literacy (CIL) Laboratory for Faculty is a teaching and learning toolkit for faculty and instructors, with instructional resources, tutorials, assessments, and more. The online resource balances CIL concepts and skills and provides tools that can be tailored to a variety of class assignments and student needs. The mix of basic and higher-level prompts and activities allows information literacy instruction to be effectively differentiated within the classroom.
Q: How were the Critical Information Literacy SLOs (student learning outcomes) developed, and how did they inform the development of the CIL Laboratory for Faculty?
A: Our outcomes have been in place since 2012. They are a product of our reading of the literature that addresses critical information literacy (CIL), and they were developed to remediate the ACRL Information Literacy Competency Standards for Higher Education (which didn’t suit us because they were too skills-based). Initially, we used our student learning outcomes to develop a workshop series, but we realized that we weren’t able to reach enough students this way. We eventually surveyed the faculty and found out that while they liked the workshops, what they really wanted was an online toolkit that they could use in their classes. It made sense to develop such a resource around our learning outcomes: this meant arranging activities, discussions, and resources by learning outcome as well as being clear about which outcome(s) each video hits. We didn’t want to leave faculty guessing, and we hope that this will also help them assess student learning. Organizing the lab largely around student learning outcomes also helps us determine what sorts of resources to include. In short, it defines the scope of the toolkit.
Q: What made you decide that you needed three levels (beginning, intermediate, and advanced) of tutorials in the Laboratory? What was the process for determining what was a beginner skill vs. an intermediate skill vs. an advanced skill?
A: Previously, we offered one tutorial that covered very basic skills, like how to locate a book by call number. We had long heard from faculty that they wanted online learning objects that were suitable for students of all levels and that could be used in a variety of courses. Not only do we have a good number of distance learners at California State University, San Bernardino, but we have a sizable graduate student population as well. We approached the tutorial revision process not only as an opportunity to meet the needs of more students on- and off-campus, but also as a chance to address concepts reflective of our student learning outcomes, like the deep web and “the literature.” Our process for determining which skills and concepts belonged at which level consisted of gathering input from our colleagues. We also took common courses into consideration, as they guide us in understanding what is important for students and at which level.
Q: Who was involved in the creation of the Laboratory? Did you collaborate with any stakeholders outside the library as you developed the Laboratory?
A: Gina, the Instructional Services and Initiatives Librarian, took the lead in creating the tutorials and quizzes and gathering and organizing content. Claudia, our web developer, was extremely helpful in ensuring that the lab and its objects were accessible to the visually impaired. Barbara, Coordinator of Library Instruction, oversaw the entire project and also ensured that our instruction program and student learning outcomes were appropriately represented. Before we went live, we conducted usability testing with faculty and instructors across campus and sought feedback from our colleagues in the library. We continue to seek feedback in our work with faculty and are regularly tweaking, updating, and adding content.
Q: How long did it take for you to develop the Laboratory?
A: It took approximately a year to create the tutorials and quizzes, gather materials, and decide on a platform (we started with LibGuides but ultimately decided to go with WordPress because it allowed us to really control the interface, and we prefer it aesthetically). All this work was occurring in between regular duties, like reference work, instruction, collection development, service, and research. We had a bit more time in the summer, which is when we put everything online, made sure it met campus accessibility standards, and conducted usability testing. We then went live a few weeks before classes started up again in the fall.
Q: What tools did you use to develop the tutorials? Did you discover any surprising advantages or disadvantages as you used this software?
A: We created the tutorials using VideoScribe for the most part. It’s a fairly simple tool to learn, as it doesn’t require any experience with animation. However, it doesn’t do screen-capturing, so we ended up using Camtasia and iMovie for that and to piece everything together. Luckily, Gina had editing experience with iMovie and was also able to learn Camtasia pretty quickly. We don’t have a streaming server, so our videos live on YouTube, and we used their technology to add our video scripts as captions. Creating the tutorials was the most time-consuming component of the project, by far.
Q: Do you plan to add any other resources to the Laboratory? Or do you plan to develop it in another way?
A: In addition to making small changes and additions when needed throughout the year, our plan is to meet annually in the summer to revisit the state of the lab. We’ll need to take new courses or majors into consideration, and we’d like to develop a way to seek feedback on a large scale so we can learn more about how faculty are using it. We also stay abreast of developments in critical information literacy (CIL) and plan to add any pertinent resources so that we can best support faculty as they integrate CIL into their courses.
Q: How have you promoted the Laboratory to faculty?
A: First and foremost, we were intentional about including the term “faculty” in the title of the toolkit. We wanted faculty, instructors, and lecturers to know it was for them. It seems that so many great resources get ignored because faculty think they are for students only, so we wanted to turn that idea on its head. In the fall, we sent out a number of e-mail notifications about our new and improved toolkit, and we’ve also strategically linked to it in a number of places, from Blackboard to our campus Teaching Resource Center. We also placed it front and center on our library’s homepage, where it’s hard to miss. Our colleagues are great about promoting it in their meetings with faculty, and we created a pamphlet for this very purpose. Any time we lead professional development with faculty, instructors, or teaching assistants, we plug the lab.
Q: What is the relationship between the Laboratory and the CIL grant? Are faculty who receive the CIL grant required to use the Laboratory?
A: We are using the CIL grant supported by the Pfau Library as an opportunity to promote the Laboratory. The grant is meant to encourage faculty to integrate CIL into their courses. Since we have this great tool to point them to, we do ask that they use instructional and assessment materials from the lab. They then provide us with a report of what they did and the results of their work, which we intend to use as examples for other faculty. We’ve been strategic about selecting faculty who teach high-impact courses.
Q: What feedback have you received on the Laboratory? How have faculty used the various tutorials, discussions, and class activities? How have students reacted?
A: We have received very positive feedback so far. We’ve heard that the “About CIL” section has been helpful for those who are new to CIL or those who have a very skills-based understanding of information literacy. Faculty have used the individual videos, tutorials, and quizzes in a variety of classes, and some programs have made them a part of their curriculum. Even some of our colleagues are using the videos in their one-shot instruction sessions. Since the toolkit is open access, we also know that folks not affiliated with CSUSB are using the materials in their teaching — this is great! Gauging student reaction is a little trickier, but we are hearing language from the toolkit (e.g., “deep web”) more frequently at the reference desk, so we know they’re getting the material somehow. One first-year student told us that she has used what she learned in the beginner tutorial in multiple classes this year.
Q: How have the tutorial quiz assessment results been used?
A: The quiz data allow us to estimate our impact beyond our one-shot instruction sessions and workshops. The data also show us which questions have the lowest number of correct responses, which gives us some idea of which concepts and skills are trickiest for students. We hope to eventually compare the work (papers or projects with a CIL outcome) of students who have taken a quiz versus those who haven’t.
Q: What advice would you give to those who want to create similar resources?
A: Give yourself plenty of time — projects like these take hours and hours of planning, creation, and implementation. I’d also recommend creating a plan for marketing or promotion before you go live — there’s nothing worse than creating an awesome resource that doesn’t get used.
June 2016 Site of the Month