PRIMO Site of the Month: April 2018
Title: Research Essentials Online
Institution: University of Missouri-Kansas City
Interviewees: Dani Wellemeyer and Jess Williams
Interviewer: Liz King
Description (provided by the author): This site showcases the online learning materials developed in-house at UMKC Libraries for the Research Essentials information literacy instruction program. Research Essentials is taught to three levels of writing and speaking courses in online, hybrid, and face-to-face configurations. Each level (100, 200, 300) features a lesson and quiz; each lesson contains a learning path of information literacy topics; and each quiz covers all the material from the corresponding lesson. The 100 lesson is introductory and focuses on the information cycle as a way of beginning to cover many concepts, including authors, audiences, and source types. The 200 lesson focuses on specifics of source types, fundamentals of searching, and source evaluation. The 300 lesson goes further in depth with its treatment of information creators and the research process. The learning objects on this site are assigned to online courses and used as flipped classroom material with hybrid and face-to-face courses, in which students spend classroom time on practical application of concepts learned beforehand through the online material. Each module was created using eCoach, a cloud-based e-learning authoring tool with rapid release, which allows for updates to the deployed learning objects at any time. Each module features video, images, infographics, and interactive elements, combining OERs with original content in modules designed specifically to support the university’s general education learning outcomes. In credit-bearing courses students complete Research Essentials lessons and quizzes as SCORM items in their instructor’s course site in the university’s LMS (currently Blackboard) so that scoring data is visible to each instructor for use as a grade.
This interview has been edited and condensed.
Q: What led to the creation of the Research Essentials Learning Modules? What needs and goals were identified?
A: This suite of learning modules has been in development for several years, since our university implemented a new general education curriculum in 2013. Librarians who worked here before us ensured that the library had a seat at the table in those gen ed conversations, which created an opening for us to develop an information literacy program to be integrated into the three-level writing and speaking course progression. Meeting with a class once just isn’t enough to give students the skills they really need to be successful in their research—no matter how introductory-level the assignment might be. The flipped classroom concept presented a possible solution, so we decided to use online learning modules to present students with the building blocks for developing and deploying a research strategy before they came to their library session, so that our class time could be spent on concept application. This allows students to make progress on their actual research assignment during class time in the presence of a librarian.
As the university was establishing a new gen ed program with new Core student learning outcomes (SLOs), the libraries also developed a set of information literacy learning outcomes, based on the ACRL Information Literacy Competency Standards for Higher Education. Our SLOs have been significantly refined over the past few years in response to changes in the gen ed program, the introduction of the ACRL Framework for Information Literacy for Higher Education, and lessons we’ve learned as we’ve been administering Research Essentials over the past 14 semesters. The establishment of our information literacy SLOs guided the content and development of our online modules. This careful planning and alignment also helped us to create a set of online learning materials that can stand alone as an information literacy curriculum for online-only classes.
Q: How did you decide what content to include in the lessons? How frequently do you expect to need to update the research topics (e.g., The Arab Spring)?
A: We work closely with our university’s team of general education instructors. Part of our instruction workflow requires that they send us copies of their syllabi and assignments. Because of this, we are very much in tune with the types of topics that students will be exploring, and we let that guide some of the decision-making. However, we also made a decision to use examples and content that would be relatable to students in many areas of their lives. So, our goal was to split the examples we included in tutorials based on application: one-third apply to writing, one-third to speaking, and one-third to real life (non-academic).
We use assessment data to decide when to update topics. For example, we kept the Arab Spring topic for the Information Cycle tutorial for so long because we heard from students and instructors that they’ve enjoyed learning about an historical event that so fundamentally shaped how we use social media as a society. (We’ve just updated that tutorial to feature Ferguson, MO.)
Q: Your online modules skillfully use friendly language and humor that is clearly aimed at student users. For instance, the 100-level introduction module uses phrases such as, “Got Questions?”, “sheer awesomeness,” and, “because, duh, that’s why we are here!” What factors determined the language used in the modules? Did the level of the module affect the type of language used?
A: Early in the development of Research Essentials we decided that we wanted the program to have a voice. (This is something Dani cares about and put a lot of work into.) Our target audience is undergraduate students, so we wanted to use language they can understand. We also believe connecting with your audience makes for a better learning experience, and language is one way to do that in online learning materials that contain a lot of text.
We don’t believe that informal language or strategic inclusion of humor negatively affects the credibility of the material. In fact, we would argue that the tone of the modules demonstrates that we have carefully considered our audience, which is not—surprise!—academics or librarians. Gen ed courses participating in Research Essentials consist mainly of freshmen and sophomores. One source of inspiration for our modules is the science community on YouTube. A lot of amazing video content is being created by scientists, researchers, and experts specifically for a general audience. The creators are translating concepts and ideas into explanations that anyone can understand, and they’re doing it in the form of beautiful videos with high production value. The #scicomm community showed us that we can take expert-level concepts and make them accessible, and—dare we say it?—fun!
We (Dani and Jess) reviewed everything for accuracy, consistency, and voice. The final edit was actually more likely to add jokes than to remove them. We also did some usability testing with student workers at our library. UMKC is an urban campus with a significant international student population, which is represented in our library student workers. Testing with this group revealed some important issues, like slang or jargon that international students didn’t understand. This didn’t mean that we had to take out every humorous use of slang, but we did make sure that communicating a concept students needed to learn never relied on understanding slang. Not every joke has to land with every audience, but every joke does have to be subservient to the learning material. We also believe that humor is important in the classroom, and we train our graduate students and fellows to integrate it into their pedagogy.
Q: Community and local resources are themes found in your modules. Why was this important to you? Have you observed that students are more motivated to learn about your library’s learning tools when they are framed in this way?
A: The focus on locating local information and utilizing community resources was motivated by two factors. First, the general education curriculum at our university has a community and civic engagement component. Students are required to find and work with community groups, or learn about and engage with local issues affecting Kansas City. Second, our experience working with students—and faculty—showed us that local information sources are something many students don’t know how to find, if they even know they exist. There are rich datasets, troves of interesting local history, and lots of community organizations willing to share their expertise and resources if people simply know how to go about finding them. That’s why the process of locating a local information source is a major focus of our modules.
Q: What types of technologies or software were used to create these modules? What factors did you consider when making your selection? Did you run into any issues, or was it an easy “out-of-the-box” experience?
A: This answer comes in a few parts! First, these modules were created using eCoach Authoring (they also have an LMS product). With eCoach, we are able to store the SCORM packages (“SCORM” is a content standard in eLearning) in Blackboard Content Collection, our current campus LMS. This allows our instructors to embed the modules directly into their individual course sites, automatically creating grade columns and collecting assessment data for each student. We guide instructors through this process on our program FAQ page. Additionally, eCoach’s URL sharing feature gives us the opportunity to embed the modules on our website, which is something we had not been able to do before.
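To make that grade-column integration concrete, here is a minimal sketch of how a SCORM 1.2 package reports a quiz score back to the LMS at runtime. The API function names and `cmi` data-model elements come from the SCORM 1.2 specification; the wrapper function, the 70% passing threshold, and the mock LMS are hypothetical illustrations, not eCoach’s or Blackboard’s actual code.

```typescript
// The LMS exposes a global SCORM 1.2 API object with these functions.
interface Scorm12Api {
  LMSInitialize(arg: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
}

// Hypothetical wrapper: report a quiz score so it lands in the
// instructor's grade column. The 70% pass mark is an assumption.
function reportQuizScore(api: Scorm12Api, rawScore: number): void {
  api.LMSInitialize("");
  api.LMSSetValue("cmi.core.score.raw", String(rawScore));
  api.LMSSetValue("cmi.core.lesson_status", rawScore >= 70 ? "passed" : "failed");
  api.LMSCommit("");
  api.LMSFinish("");
}

// Mock LMS for local testing: records every data-model write.
const writes: Record<string, string> = {};
const mockApi: Scorm12Api = {
  LMSInitialize: () => "true",
  LMSSetValue: (element, value) => { writes[element] = value; return "true"; },
  LMSCommit: () => "true",
  LMSFinish: () => "true",
};

reportQuizScore(mockApi, 85);
console.log(writes["cmi.core.score.raw"]);      // "85"
console.log(writes["cmi.core.lesson_status"]);  // "passed"
```

In a real course site, Blackboard supplies the API object and surfaces `cmi.core.score.raw` as the score instructors see.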
When we began the Research Essentials program back in 2013, web-based eLearning authoring software was very new, and there were few options. Our design process also took longer back then (this was before we were using two of our favorite design tools, Canva & Piktochart). It took us almost an entire summer to create our first tutorial, The Information Cycle.
When it came time to choose a new authoring tool, we took it very seriously because we had learned the hard way which features were important and necessary. We combed instructional design and ed tech articles, made spreadsheets, asked vendors dozens of questions, and tested our top choices. Our criteria included SCORM compliance, Blackboard integration (whether it was web- or software-based), skill level needed for editors, level of support provided, overall visual design, pricing (of course!), and whether or not there were any “fancy” features like branching or gamification.
We’ve loved our experience with eCoach. We were able to onboard our team and convert all of our Captivate and Blackboard-native tutorials to eCoach lessons and modules in 4-5 months. This process also included updating and creating a large amount of new content. We used Trello to create our instructional design workflow, and this made it possible to be highly collaborative and quick! During this time we also gained an Instructional Design Librarian (what a dream!) and were able to work with our campus IT department to make improvements on the LMS delivery side of things.
Q: How much time did it take to create individual lesson modules from start to finish? What steps were involved? Did you run into any unexpected challenges or roadblocks?
A: The creation process for these modules was a bit atypical, because although most of the actual content was freshly created, we started with modules in an existing format (Adobe Captivate and Blackboard-native tutorials) and then converted them to eCoach. In this way, we weren’t starting from scratch. Another caveat is that we wanted to be able to reuse the new content in as many ways as possible, so we created the topical micro-lessons (e.g., “Credibility and Relevance”) and then compiled them into the scaffolded, course level-specific modules.
The steps involved were…numerous. We managed the entire process in Trello. Broadly speaking, each lesson moved through content creation, review and testing, and module compilation before a final review. We created template workflows for each stage of development. For example, we had two workflows for review and testing: one for content and one for quizzes. The total number of steps for each lesson was 102! That may sound extreme, but the structure allowed us to create a standardized, cohesive product with many team members and a very tight deadline (3 months!).
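The staged workflow above can be sketched as follows. The stage names come from the interview, while the data structures are hypothetical (this is not Trello’s API): a lesson only advances when its current stage’s checklist is complete.

```typescript
// Stage names as described in the interview.
type Stage = "content creation" | "review and testing" | "module compilation" | "final review";
const stages: Stage[] = ["content creation", "review and testing", "module compilation", "final review"];

interface Lesson {
  title: string;
  stageIndex: number;       // position in the stages array
  checklistDone: boolean;   // all template checklist items complete for this stage
}

// A lesson advances only when its current stage's checklist is finished;
// the checklist resets for the next stage.
function advance(lesson: Lesson): Lesson {
  if (!lesson.checklistDone || lesson.stageIndex >= stages.length - 1) return lesson;
  return { ...lesson, stageIndex: lesson.stageIndex + 1, checklistDone: false };
}

let lesson: Lesson = { title: "Credibility and Relevance", stageIndex: 0, checklistDone: true };
lesson = advance(lesson);
console.log(stages[lesson.stageIndex]); // "review and testing"
```

The template checklists per stage (102 steps in total per lesson) would populate `checklistDone` in a fuller version of this sketch.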
As far as challenges and roadblocks go, yes, of course we ran into them! We’ve blocked most of them from our memories, but the majority arose on the LMS side of things. Though SCORM technology has been around for some time, we wrestled with the technicalities of implementation and deployment nonetheless. Some of this was a result of our campus’s implementation of Blackboard, combined with the fact that we were the only department using the LMS to deliver tutorials and capture assessment data. We figured it out as we went, and gaining our Instructional Design Librarian in the process was a huge help!
Q: Regarding the use of multimedia, did you find it was better to create your own content or use content that existed already? When using outside content, how did you handle obtaining permission and ensure perpetual access to these materials?
A: We are proponents of not re-inventing the wheel, and our first effort at content creation usually began by exploring content that already existed. We were championing the use of Open Educational Resources before we even knew what OERs were. That being said, we also have high standards for externally-created content: Does it include the information needed to teach the SLO? Does it do so concisely and effectively? Is it boring or fun? Ugly or well-designed? Overall, does the content fit with our goals, our aesthetic, and our voice?
As far as permissions and access go, we check Creative Commons licenses, default to searching sources like Unsplash for stock images, occasionally purchase graphics from VectorStock or Canva, and ask permission when needed. Dani and Jess also created image guidelines and resources documentation for the team to utilize when creating lessons. This included notes on selecting images (e.g., Think: Instagram post that would get lots of likes. Steer away from “stock photos that are obviously stock photos” and “clipart”), formatting settings for eCoach, hex codes and icon style specifications, links to preferred tools and resources, and examples. All videos are embedded directly from their publishing platform—YouTube or Vimeo—using widgets in our eLearning authoring software. As with any embedded or borrowed media, we simply have to check on the videos occasionally, just like checking links on a library website. We like to assume that if a video becomes unavailable that we probably need to look for something new and fun, anyway!
And because we use a flipped classroom, our teaching team is responsible for reviewing all tutorials at the beginning of each semester. This practice ensures that we are not re-teaching tutorial content in the classroom and that we are equipped to review concepts and answer questions. As a bonus, it also means that we are checking links and making sure that videos are loading!
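The periodic media check described above can be sketched as a small script. The function name and regular expression here are hypothetical illustrations (not part of eCoach): the idea is simply to pull every YouTube or Vimeo embed URL out of a lesson’s HTML so each one can be spot-checked before classes begin.

```typescript
// Extract unique YouTube/Vimeo URLs from a lesson's HTML so each can be
// verified at the start of the semester. Regex is a simple illustration.
function extractVideoUrls(html: string): string[] {
  const pattern = /https?:\/\/(?:www\.)?(?:youtube\.com|youtu\.be|player\.vimeo\.com|vimeo\.com)\/[^\s"'<>]+/g;
  return Array.from(new Set(html.match(pattern) ?? []));
}

// Sample lesson markup with two embedded videos.
const lessonHtml = `
  <iframe src="https://www.youtube.com/embed/abc123"></iframe>
  <iframe src="https://player.vimeo.com/video/456789"></iframe>
`;

console.log(extractVideoUrls(lessonHtml));
// → ["https://www.youtube.com/embed/abc123", "https://player.vimeo.com/video/456789"]
```

A fuller version would request each URL and flag any that no longer resolve, much like a link checker for a library website.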
Q: In your experience, is completion of these modules required and/or encouraged by faculty? Are they integrated into any assignments or courses?
A: The modules are integrated into Discourse 100, 200, and 300 courses to teach students skills that will help them satisfy the Technology and Information Literacy learning outcome of the Core curriculum. A full description of the program, with more details, is available. Though instructors are not technically required to participate in the Research Essentials program nor to assign the modules, we worked intentionally and diligently to gain their buy-in from the very beginning. As a result, almost every Discourse instructor since 2014 has opted in to assigning the tutorials and bringing their classes to the library for instruction. We’ve been able to gather a great amount and variety of assessment data and to expand and adapt the modules to fit the needs of students in online classes. Fostering these relationships and providing a great product has paid off. In 2015, 90% of the freshmen and sophomore population of the university participated in Research Essentials.
Instructors have the freedom and discretion to determine how the modules are evaluated in their individual courses. We provide guidelines and suggestions, but ultimately instructors decide how the modules best fit into their course grading structure. Sometimes this means that students are given completion points, but most often it means that the tutorials are worth a certain percentage of their overall grade. From our end, we set the modules so that students may take multiple attempts, though instructors may also choose to modify this setting in the LMS.
Q: Your site introduction states that students in “face-to-face” sections will “complete the module prior to the library instruction sessions.” Have you observed any differences during in-person instruction sessions where the students have previously taken the online module?
A: The short answer is, “Yes, absolutely!” On the occasion that an instructor has failed to assign the tutorials or when a large number of students in a class have not completed them yet, it’s apparent within the first few minutes of a class. Class participation is much higher with students who have completed the online work: group discussions are smoother, individual concept application is more successful, and overall the class is able to progress further. When we encounter classes where the modules haven’t been assigned or completed by the students, we use formative assessment techniques to gauge where the class is in their understanding of information literacy concepts or searching skills, and then adjust the lesson plan on the fly to accommodate them.
But again, for us, the online modules are one component of an entire program that we’ve been running for over 4 years. Our relationship with instructors and the expectations that they set for their students ensure that the majority of students show up to the face-to-face session having completed the corresponding online module.
Q: Your site introduction also states that these modules were designed using “best practices for online learning.” Which best practices would you suggest other instruction librarians pay close attention to when designing their own online modules?
A: For the overall process, we used the ADDIE model and for design of individual lessons and modules we ended up following Gagne’s Nine Events of Instruction. Our rubrics and SLOs reflect application of Bloom’s Taxonomy. We also made sure to employ Universal Design standards to ensure that the modules would be usable by all students. These best practices were integrated directly into our workflows, so every team member working on the project was guided through the practices. In short, if you need to teach yourself DIY instructional design for librarians, here’s a guide to our favorite resources.
Q: Do you assess the impact of these modules on student success? If so, which methods do you recommend?
A: Yes, we do! We recommend using multiple assessment methods rather than relying on one or two. As an example, in 2014–2015 our assessment methods included:

- Scored concept-application assignments
- Reading and reflection assignments
- Student self-evaluation assignments
- In-class activities administered by a teaching librarian
- Post-instruction student surveys
- Post-instruction instructor surveys
- Instructor-performed assessment of final projects
Q: Have you received any feedback from faculty or students?
A: Our team has been collecting feedback since the beginning of the Research Essentials program both formally, through surveys and usability testing, and informally, through conversations with students.
Faculty interact with the online modules less than students, so for them, the biggest concern with the new and improved online tutorials was that it meant less physical time spent in the library with a librarian. While many instructors still wish they could spend more class time in the library space, we have found through the post-class survey that most accept the tutorials as a high-quality substitute. In fact, 84% of respondents picked 4 or 5 when asked if they believed that the online materials contributed to a successful library session (on a scale from 1-5, 1 = not helpful and 5 = very helpful). One comment we received in the survey really sums up the response from faculty as a whole: “I would love to devote a whole week to Research Essentials—as in the past—the experience was incomparable. And while the new tech component this year rocks, I haven’t gotten the “degree of appreciation” as in prior semesters. The f2f sessions really seemed to resonate! I would love to see the new online components coupled with the multiple class sessions in the library, with the f2f sessions, as before.”
While the instructors we work with would like to spend more time with us, our students have voiced an alternate opinion. Through surveys at the end of our tutorials, as well as in informal discussions, some students have shared that they consider the tutorials too long or repetitive. (We disagree, but…we’re not the customer!) Despite their feelings toward the time commitment, the students consistently identify the tutorials as being helpful and informative. One student said, “I personally felt the search strategies were quite helpful, although some of the information seemed to be a bit repetitive. Thank you for the effort that went in to adding a little bit of zing to what can be a fairly bland topic.”
The final question students are asked in the survey is “What is one thing that you learned today that will help you in the future?” Their responses to this question show us that students are synthesizing the information in the tutorials.