https://researchtracker.fairfield.edu (Username: primo, Password: primo)
Authors: Jacalyn Kremer (Project Leader), Wit Messangnil (Developer), Philip Bahr, Joan Clark, Curtis Ferree, Ramona Islam, Jessica McCullough, Christina McGowan, Laura Weber
Institution: Fairfield University
Interviewer: Bill Marino
Fairfield University DiMenna-Nyselius Library’s Research Tracker gives students a framework by which to navigate the complex stages of research and writing and avoid last-minute research panic. This flexible tool can be used in conjunction with face-to-face instruction, in the online learning environment, or independently.
Key features include:
- Automatically calculated research steps with assigned due dates that can be added to a student’s calendar.
- The ability to store documents and research in one place.
- Customized content suggestions based on assignments.
- Creative Commons licensing.
Further information on the tool, as well as on integrating it into an institution’s information literacy program, can be found at http://librarybestbets.fairfield.edu/researchtracker.
Q: What inspired you to create the Research Tracker?
A: In short, our students. Through our information literacy program we saw a real need to have a framework by which our students could engage with the research process. Research Tracker provides that framework. It is a tool that can be used both in and outside of a classroom setting, independent of the library.
Q: You mention in the description that Research Tracker is the culmination of seven years of research and application. Can you take us briefly through that process, touching on key lessons learned and how they informed your design decisions?
A: Two key characteristics stand out when speaking of this project: collaboration and “stick-to-it-ness,” or the feeling that you’ve got something that is good, but it’s not perfect and you have to continue working on it. The Research Tracker originally began as a print document used in beginning (i.e., first year) information literacy classes to help teach the research process. It provided a tangible framework for interaction. The institution’s instruction librarians, who meet both before each semester to plan and after each semester to assess the library’s plan for information literacy instruction, collaborated to assess and revise this initial product over a period of around seven years.
This paper version was converted to an Excel document platform in January of 2012 and put online. The Excel version introduced the due date algorithm.
In January of 2014, the Excel product was converted to a web-based PHP version. This process took around six months and included the following enhancements, which further integrated the research and writing processes:
- Single sign-on authentication. This allowed due dates to be saved to students’ calendars, as well as for documents to be saved and stored for each user.
- Customized content suggestions based on the assignment (i.e., databases are recommended based on the subject being researched).
Q: Who is the intended audience for this tool?
A: The primary audiences are first-year students and students in capstone courses. Every first-year student uses the tool, with the goal of introducing the iterative process of research and writing. Capstone students employ the tool to its full capacity.
Q: How did you determine the due date algorithm for each of the steps?
A: A lot of research and reading about the undergraduate research process provided the foundation for the due date algorithm. The University of Minnesota’s Assignment Calculator was also an influence. The algorithm, however, is not static; it changes based on feedback and ongoing research. For instance, we initially found the algorithm was too heavily weighted toward research, so it was modified to balance research and writing.
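The exact algorithm is not published, but the approach described above — dividing the time between a start date and a due date across weighted research and writing steps — can be sketched as follows. This is a minimal illustration, not the tool's actual code (which is PHP); the step names and weights here are hypothetical, chosen only to show the research/writing balance the authors mention.

```python
from datetime import date, timedelta

# Hypothetical step weights (illustrative only; the real steps and
# proportions are not published). Weights sum to 1.0 and are split
# roughly evenly between research and writing activities.
STEP_WEIGHTS = [
    ("Understand the assignment",      0.05),
    ("Choose and refine a topic",      0.10),
    ("Find background information",    0.10),
    ("Search databases for sources",   0.15),
    ("Evaluate and read sources",      0.10),
    ("Draft a thesis and outline",     0.10),
    ("Write the first draft",          0.20),
    ("Revise and get feedback",        0.15),
    ("Final edit and citations",       0.05),
]

def schedule_steps(start: date, due: date):
    """Assign each step a due date proportional to its cumulative weight."""
    total_days = (due - start).days
    if total_days <= 0:
        raise ValueError("Due date must be after the start date.")
    schedule, elapsed = [], 0.0
    for name, weight in STEP_WEIGHTS:
        elapsed += weight  # cumulative fraction of the project completed
        step_due = start + timedelta(days=round(total_days * elapsed))
        schedule.append((name, step_due))
    return schedule

for name, step_due in schedule_steps(date(2025, 9, 1), date(2025, 10, 13)):
    print(f"{step_due}  {name}")
```

Rebalancing the algorithm, as the authors describe, amounts to adjusting these weights — e.g., shifting weight from the search steps to the drafting and revision steps — without changing the scheduling logic itself.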
Q: Did you collaborate with any campus groups (e.g., the Writing Center), students, or faculty in the development of this tool? Why was their contribution important to the tutorial?
A: Our biggest collaborators were our students. Student input during the testing phase, both during the Excel phase and the move to the cloud, influenced both the overall process and the algorithm. In some instances, the steps were modified because they did not originally match the way that students went about their research and writing. Our English faculty have been part of the evolution of the tool since its inception, giving feedback through their participation in our information literacy classes. In addition, our IT department set up the server for hosting the finished product and assisted with the single sign-on authentication. Each contribution was important to the final product.
Q: Did you use any special tools to code the tutorial? How do you host the finished product?
A: The web-based application is written in PHP and data is stored in a MySQL database. Wireframe tools were used to mock up the cloud version (based on the Excel version). Coding was done in a text editor/IDE (integrated development environment). The project is hosted on the university’s virtualized Linux server.
Q: Did you encounter any unexpected problems as you developed the tool? How did you overcome these issues?
A: We had wanted to create a web-version for a while, but lacked the technology skills on staff. When Wit Messangnil was hired as the Head of Digital Services and Technology Planning, it became possible.
Single sign-on authentication was an unexpected problem. Initially, no one had expertise in developing for LDAP (Lightweight Directory Access Protocol). With help from the IT department, the issue was resolved.
Q: Did you perform any usability testing on the tool? If so, what did you learn?
A: Yes. Some pretty extensive testing was performed on both the Excel document and the web-based version. It was tested on a variety of students (undergraduates and graduate students from multiple disciplines). A formalized process was developed: students used the Research Tracker without any assistance, answered a series of questions, and finally met with a librarian one-on-one to flesh out their answers.
User testing showed that students wanted university branding on the tutorial. Because of this feedback, we added our mascot branding.
This user testing also informed us that this tool is not for everyone. Some of the most capable students can already articulate a personal research process that works for them; in some instances it does not correspond to that of the tool.
Q: How have you been promoting the Research Tracker to faculty and students?
A: We have a LibGuide box that is shared and can be added to the library’s other guides. We have presented at faculty workshops to encourage use of the tool. Research Tracker is highlighted on the library homepage with its own big button and the mascot branding.
Q: How is the tool being used on your campus? What has student reaction been to Research Tracker? Faculty reaction?
A: The web version has been used for one semester. In a student survey (n ≈ 200), 98% agreed or strongly agreed that the Research Tracker would help them stay organized throughout the research process, 93% thought it was simple and easy to use, and 90% said that they would use the Research Tracker in other courses.
It has received positive feedback from faculty.
Q: There are a number of places where learner input is gathered and stored. Is the library collecting any of the input for assessment? Are you assessing the tool via other means?
A: While usage data (the number of papers in the system, student use/reuse of the material) is collected, the library is not currently mining any data for assessment purposes.
The tool is being assessed via user surveys. It is also part of the information literacy toolkit used in the library’s overall information literacy program, which is assessed by pre- and post-tests.
Q: The tool is licensed under a Creative Commons license. Do you know of any other institutions that are using the Research Tracker?
A: Not at the moment – we are finishing cleanup on the source code before we release it. Anyone who wishes, however, is free to use the tool.
There are some universities as well as a high school that are using the Excel version.
Q: Anything else that you’d like included?
A: Thanks to PRIMO. We’d also like to reiterate that you don’t have to be a big institution to create quality learning objects – if you can dream something and you stick with it, you can complete it.