Exploring Academic Integrity Tutorial
http://libraries.claremont.edu/achontutorial/pages/
Authors: Char Booth, Dani Brecher Cook, Sara Lowe, Sean Stone, Natalie Tagge
Institution: Claremont Colleges
Interviewees: Dani Brecher Cook and Char Booth
Interviewer: Megan Hodge
Tutorial Description (provided by the authors):
The Exploring Academic Integrity Tutorial (EAIT) introduces students to the idea of being part of the scholarly conversation, and by extension the rights and responsibilities that come with being part of a scholarly community. This focus – one of inclusion and participation – was intended by its Claremont Colleges Library creators to present a contrast to many other academic honesty tools, which in our observation tended to focus more on the punitive aspects of academic dishonesty (e.g., the negative ramifications of plagiarism). The interactive online EAIT consists of four core sections, each with a thematic tie to concepts from information literacy and information ethics. By the end of the tutorial, the goal is that students feel a more personal connection to the production and use of scholarly material, and as a result have a more nuanced understanding of academic honesty. The tutorial also features historical images from the Claremont Colleges Digital Library, to personalize it to our campuses.
Q: What led to the creation of the tutorial?
A: The Claremont Colleges Library serves five undergraduate liberal arts colleges (Pomona, Harvey Mudd, Scripps, Pitzer, and Claremont McKenna) and two graduate schools (Claremont Graduate University and Keck Graduate Institute). Several years before the tutorial came to fruition, Dianna Graves (Director of Academic Planning at CMC) asked the then Head of Instruction Services whether the library had considered creating any resources related to academic honesty. The campus was seeing a variety of cases of academic honesty violations, many of which she believed stemmed from an incomplete understanding of academic honesty – for example, students knew not to plagiarize directly, but didn’t realize that working on a problem set with a classmate might also constitute academic dishonesty. At that time the Library did not have sufficient capacity to support large-scale in-house tutorial development, but this request (and others like it) provided useful justification for the eventual creation of a new Instructional Design Librarian position. Once this individual was on the Instruction Services team, it became possible to respond to campus requests to create an interactive tutorial that addressed this need. It also seemed like a natural extension of our information literacy work, and something that would be broadly useful to our campuses; after all, the Library provided instruction on citation management, copyright, fair use, and the idea that academic output is a “conversation” between scholars.
Q: How did you decide which learning objectives to focus on?
A: Whereas we leveraged open source code and external design elements to assist with the tech build of the tool, the conceptual content of the EAIT was “scratch built” (meaning that we started with a blank canvas and allowed ourselves to build something new and independent). This approach was labor-intensive, but undoubtedly resulted in a product that was better customized to local academic needs and culture, and moreover didn’t “sound like” other learning tools.
To begin our development of theme and learning objectives, the tutorial development team (Char Booth, Dani Brecher Cook, Sara Lowe, Sean Stone, and Natalie Tagge) engaged in a series of scoping conversations, considering our local campus definition of information literacy and where academic honesty “fit in.” We also discussed the importance of respecting our students’ intelligence, and avoiding the trap of treating them as though they were errant, academically dishonest children. Through these conversations, we elected to focus the tutorial on a message of “scholarship as conversation” as opposed to one of “don’t plagiarize.” We did this for two main reasons: 1) we thought students would respond better to a positively framed approach to academic honesty, and 2) students have been hearing “don’t plagiarize” since high school, and if the message hasn’t sunk in by now, it probably never will. From there, we broke down the idea of academic honesty into four components:
- establishing the idea of the scholarly conversation
- exploring the nitty-gritty of attribution and citation
- providing examples of a variety of academic honesty scenarios, and
- introducing the concepts of copyright and fair use.
We felt like these were areas where the library could provide insight and expertise, and that would get at the original request to provide a nuanced understanding of academic honesty.
Q: Who developed the tutorial? Did you involve any faculty, students, or other campus partners?
A: The team of five librarians named above–Char Booth (then Director of Research, Teaching, & Learning Services), Dani Brecher Cook (Information Literacy and Research Services Coordinator), Sara Lowe (then Assessment Librarian), Sean Stone (then STEM Team Leader), and Natalie Tagge (then Social Sciences Team Leader)–developed the tutorial. The group worked together to brainstorm an initial outline of the tutorial and learning objectives for each section, then delegated one section apiece to Booth, Lowe, Stone, and Tagge to individually flesh out the content. Drafts of each content section were then presented to the full group, who critiqued, revised, and refined the content. Booth, Lowe, Tagge, and Stone also identified respected faculty from diverse academic disciplines at each campus to interview for their input on the meaning of the scholarly conversation – we hoped that invoking faculty voices would further connect students with the tutorial’s themes of individual responsibility and academic contributions, and these interviews are featured prominently throughout the tutorial.
The developed content was then translated into a PowerPoint wireframe by Cook. At that point, Booth distributed the wireframe to over 25 administrative, faculty, student, and student services stakeholders across all seven campuses for comment, including writing center directors, faculty members with writing-intensive courses, students involved in campus academic integrity efforts, and Deans of Students. We received an excellent response to this request for feedback, which had an essential (and calculated!) secondary effect of engaging the community who would eventually help us achieve campus adoption of the EAIT. The comments we received – many complimentary, some more challenging and difficult to reconcile – were carefully considered and largely incorporated into the final tutorial.
Once we had a polished near-final product, we assembled a group of students to test the tutorial and provide further feedback — they gave their input during a series of in-person focus group sessions, and their comments were incorporated into the tutorial before we launched it in fall 2015. This student feedback was taken as seriously as the first round of largely faculty, administrative, and staff feedback – in fact, one student even asked to write a nuanced scenario to include in the assessment, which we incorporated – we are proud to feature student-created content in the tutorial.
Q: How did you decide upon the format of a multi-module tutorial?
A: Several years ago, we adapted UCI’s Begin Your Research tutorial at Claremont – this tutorial also provided the content design backdrop for the EAIT. We found that this format was quite successful, as faculty can assign students to take the whole tutorial or specific modules based on need. It also allows for self-guided assessments for understanding at the end of each discrete unit. We liked the flexibility of the multi-module format, which made it seem much less overwhelming than a tutorial with 100+ contiguous pages.
Q: Did any other plagiarism/academic integrity resources (print or online) inspire you?
A: The ethical scenarios in the quiz were inspired by an article by Connie Strittmatter and Virginia Bratton in College & Research Libraries. In terms of the structure, again, we are indebted to the University of California, Irvine, for developing and sharing their Science Information Tutorial. We also liked the interactivity of the University of Texas at Austin’s “All about Plagiarism” tutorial.
Q: How long did the tutorial take to create?
A: It took several years to go from the initial suggestion to a pilot version we could test; again, it was essential for the then-Instruction Services department to first advocate for a position with the technical skill and dedicated focus to code a project of this magnitude. After that position was filled and work began in earnest, the tutorial took approximately a year from start to finish. Creating the storyboard and script took about 3-4 months, the videos took about a month, and the coding took about 6 months. While the content development and interviews were conducted as a team, we only had one person engaged with the technical work of tutorial creation, so that piece took the longest. On the positive side, the protracted development process did allow us to gather and incorporate feedback from many campus stakeholders.
Q: Which technologies do you use, and why? Would you recommend that others use those technologies as well for similar purposes?
A: The tutorial is coded in HTML and JavaScript. You could use pretty much any web editor to adapt our code. It’s a steeper learning curve than using tutorial creation software, but allows for much more customization. It’s also something that can then be adapted by any library without needing specific, expensive tools.
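To give a sense of what a plain HTML/JavaScript build like this involves, here is a minimal sketch of multi-module navigation logic – flattening modules into an ordered page list so "next"/"previous" arrows and a side menu can all address the same state. The module names and page counts are hypothetical placeholders, not the EAIT's actual code.

```javascript
// Hypothetical module list; the real tutorial's four sections are
// described elsewhere in this interview, but these page counts are invented.
const modules = [
  { id: "conversation", title: "The Scholarly Conversation", pages: 5 },
  { id: "citation", title: "Attribution and Citation", pages: 7 },
  { id: "scenarios", title: "Academic Honesty Scenarios", pages: 6 },
  { id: "copyright", title: "Copyright and Fair Use", pages: 4 },
];

// Flatten modules into one ordered list of pages so "next"/"previous"
// arrows can move across module boundaries.
const pages = modules.flatMap((m) =>
  Array.from({ length: m.pages }, (_, i) => ({ module: m.id, page: i + 1 }))
);

// Arrow navigation: clamp at the first and last page.
function next(index) {
  return index < pages.length - 1 ? index + 1 : index;
}

function prev(index) {
  return index > 0 ? index - 1 : index;
}

// Side-menu navigation: jump straight to the first page of a module.
function startOfModule(id) {
  return pages.findIndex((p) => p.module === id);
}
```

Because the state is just an index into a flat array, the same handful of functions can back the side menu, top menu, and per-page arrows without any framework or tutorial-creation software.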
Our videos were captured with a Sony Flip camera on a tripod. When we update the videos in a few years with different faculty, we are definitely planning on using more professional video equipment. The Flip did a reasonable job capturing the sound, but we found that it auto-focused in a very obnoxious way during filming. For a quick video to post on social media, the Flip is great (though not necessarily better than a smartphone camera), but we wouldn’t use it for tutorial video creation again. The videos are arguably both the tutorial’s most important element and its most problematic one: high impact, but low production quality.
We edited and captioned the videos in Camtasia, and cleaned the audio in Audacity. We were happy with both of these tools; they are relatively user-friendly, and do the job without a huge amount of extraneous bells and whistles. We did learn that Camtasia for Mac and Camtasia for Windows are not cross-compatible, so when one person edited a video on the Mac, we couldn’t then re-edit it on a PC. That dynamic is definitely not ideal for a team that’s working on both platforms, but it has been fixed in the newest version of Camtasia. If you are using version 8.3 for PC / 2.7.2 for Mac or older, look out!
Q: Were there any best practices and/or accessibility guidelines you followed while creating the tutorial?
A: We made sure that the tutorial is accessible for those who use screen readers, so all of the images have descriptive “alt” tags and the videos are fully captioned. There are multiple ways of navigating through the tutorial–through the side menu, top menu, and with the arrows on each page. We’d like to make the tutorial keyboard navigable, so that’s something we’ll incorporate in upcoming edits. In general, we follow WCAG 2.0 guidelines.
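The practices described above – descriptive alt text, fully captioned video, and navigation that works for keyboard users – can be illustrated with a small markup fragment. This is not the EAIT's actual markup; the file names and text are placeholders, shown only to make the WCAG 2.0-oriented practices concrete.

```html
<!-- Illustrative fragment only (hypothetical file names, not the EAIT's code). -->

<!-- Descriptive alt text so screen readers can convey the image -->
<img src="historical-photo.jpg"
     alt="Students studying in a campus library reading room, circa 1950">

<!-- Captions delivered as a text track alongside the video -->
<video controls>
  <source src="faculty-interview.mp4" type="video/mp4">
  <track kind="captions" src="faculty-interview.vtt" srclang="en" label="English">
</video>

<!-- Navigation as real links, so keyboard users can reach them with Tab -->
<nav aria-label="Tutorial pages">
  <a href="previous-page.html" rel="prev">Previous</a>
  <a href="next-page.html" rel="next">Next</a>
</nav>
```

Using native links and controls like these, rather than click handlers on generic elements, is what makes the later goal of full keyboard navigability achievable without retrofitting.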
In terms of instructional design best practices, we limited the learning outcomes for each module, included interactive components to allow students to practice what they learn in the tutorial, and repeated the core ideas (such as “scholarship as conversation”) several times throughout the tutorial.
Q: Who is responsible for grading the end-of-tutorial quiz? How are the qualitative questions scored?
A: The assessment was developed by the full team through a rigorous design process, then Cook created embeddable versions of the quiz for the two learning management systems that are used on campus: Sakai and Canvas. The multiple choice questions are automatically scored through the quiz functions in the LMS, and also provide automated feedback. The qualitative questions are meant to be the starting point for class conversations, so they aren’t automatically scored. When librarians meet with faculty at the beginning of the semester, we suggest ways to use the tutorial in a classroom context. The scenarios can form the core of a nuanced class discussion of academic honesty, and that’s how we encourage people to use them. Of course, if faculty members prefer to score the questions through the LMS, they can do that as well.
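In practice the LMS quiz tools handle the scoring and feedback described above, but the underlying mechanics are simple. As a sketch only – with an invented question, not one from the EAIT – automated multiple-choice scoring with per-question feedback looks roughly like this:

```javascript
// Hypothetical question data; in a real deployment the LMS stores this.
const questions = [
  {
    prompt: "Working on a problem set with a classmate without permission is:",
    choices: ["Always fine", "Potentially a violation", "Only an issue if copied verbatim"],
    answer: 1, // index of the correct choice
    feedback: "Collaboration policies vary by course; check with your instructor.",
  },
];

// Score a set of responses and collect automated feedback for misses.
function score(questions, responses) {
  let correct = 0;
  const feedback = [];
  questions.forEach((q, i) => {
    if (responses[i] === q.answer) {
      correct += 1;
    } else {
      feedback.push(q.feedback); // automated feedback only where needed
    }
  });
  return { correct, total: questions.length, feedback };
}
```

The qualitative scenario questions deliberately fall outside this kind of automation – they are collected but left unscored so they can seed class discussion instead.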
Q: Have you done any assessment of the tutorial’s effectiveness in meeting each of the module’s stated objectives? Do you plan on making any updates to the tutorial?
A: We have the quiz data in aggregate, and each question in the quiz is keyed to a specific objective in the tutorial. This summer, we’ll be mining that data to look for patterns showing where students may still be confused. As we did in the development phase, we’ll also do a round of focus groups with students if we replace any of the questions or change tutorial content. We absolutely do plan on making updates to the tutorial, beginning this summer. As more people take the tutorial, we get a better sense of the needs and concerns of our community, and we want to be responsive to those comments.
Q: How has the tutorial been promoted and used inside and outside the library? Have you gotten any feedback?
A: Even before the tutorial was finalized, the initial group of administrators and faculty who provided feedback began requesting to use the tutorial in select student cases where academic dishonesty was involved – this was a totally unexpected and rather fascinating result of our creating this tool (in a number of cases, students have been “assigned” to complete the tutorial and quiz as part of a larger disciplinary action). We launched the tutorial formally on the campuses in fall 2015. Claremont McKenna College adopted it as a requirement for all incoming freshmen to complete before they can register for courses, and it’s been adopted in a number of first-year writing classes, science courses, and upper-division and/or graduate courses at all seven institutions.
Our intention was that the tutorial could be used without having a librarian necessarily facilitate it, and that’s how it’s been used so far–so although it has been used in some courses that also have library instruction, we offer it as a discrete tool. We have continued to gather feedback from people who have taken the tutorial, and it’s been remarkably positive. The students who have taken it report that it maintains their interest and they especially are engaged with the ethical scenarios in the quiz. Faculty members have asked if we can add additional disciplinary nuance (e.g. citations in STEM fields versus social sciences and humanities), so we are going to explore doing that this summer. We’ve also had people ask if we could make the tutorial responsive for multiple device types, which was something we already had on the to-do list.
Q: The tutorial is posted with a Creative Commons license. Have other libraries adapted the tutorial so far?
A: Acceptance into PRIMO is the first time we’ve really broadcast that we have this tutorial, in all honesty! We’ve viewed this first year as a pilot, but we feel like it’s in a good place to share with other libraries now. Cook and Tagge will also be presenting on the tutorial at Library Instruction West in June, so we’re hoping that this will get people aware and interested in using the tutorial. So: no adaptations yet that we know about, but we hope it will be a useful tool for academic libraries.
Q: What lessons did you learn? Do you have any recommendations or advice for librarians interested in developing something similar?
A: One major lesson that we learned in developing this tutorial was about timeline: however long you think something will take, double it. Between gathering feedback, learning how to code various components of the tutorial, and creating the videos, it took us longer than expected to put the EAIT together. And because our community was eager to use the tutorial when they learned of its development, it would have been good to have a generous go-live date from the beginning.
Getting feedback from students and faculty and incorporating it into iterative versions of the tutorial has been incredibly informative. Some things we thought were great ended up lacking a positive response from students and/or faculty; in turn, the community brought many valuable ideas and considerable nuance to the tutorial that (if we had had a more closed development process) would have been entirely missing. We believe that the EAIT has added significantly to the resources our campuses have at their disposal to intelligently approach the question of scholarly integrity, and moreover that the feedback and buy-in we sought before the tutorial launched has contributed to the project’s quality and wide adoption. So, the major lesson we can pass along is to go out of your way to seek feedback during the development process.
April 2016 Site of the Month