For years, reference stats meant keeping a tally of hash marks on a paper form. Over time, though, simply measuring how busy we were at a physical desk no longer sufficed. The value of the community college library and its services must now be demonstrated year in and year out, and identifying trends can help us adjust our services to meet our students’ needs. What tools can help us with a more sophisticated analysis?
This subject came up in the CJCLS Community on ALA Connect when a librarian asked colleagues to respond with how their libraries capture reference data. “We currently use the READ scale (on paper),” she wrote, but “our current data collection doesn’t tell us if it is email, chat, etc. And we aren’t currently capturing the subject of the question.”
Answers came in from librarians around the country. Several referred to the Springshare line of products, including LibInsight and Reference Analytics, a component of LibAnswers. Sandra McCarthy from Washtenaw Community College in Ann Arbor, Michigan, uses yet another Springshare product to track reference desk questions. “We created a LibWizard form,” she wrote. The form captures date, time, day of week, and level of reference following the Warner Model.
A Florida librarian says that her library uses DeskTracker from Compendium Library Services. “The program is very flexible,” she writes. It “allows us to add questions or comment boxes for each of our forms.”
Lisa Eichholtz, from Jefferson Community & Technical College in Louisville, Kentucky, says because her Learning Commons incorporates both library and tutoring services, some of their data for extended reference services comes to them through Tutortrac.
The free Google Forms site works for one librarian in Texas, who writes that she customized the form to collect much the same information found in Reference Analytics; Google Forms then sorts the answers into a spreadsheet.
I also responded to this query. I used to work at a library in Los Angeles that used Gimlet to track reference stats. Gimlet’s tagging feature was especially nice for analytics, as we could design our own custom tags to make reporting easy.
If you’re considering more sophisticated reference tracking tools, these are all worth looking into!
Having maintained reference stats forms for the past fifteen years, I have learned that there’s a balance to strike between collecting richly detailed data and keeping the form simple for librarians to fill out on the fly. “It takes a few seconds to document each patron interaction, but I’ve found I can usually do it immediately after the patron walks away,” writes a librarian in Houston, Texas. That’s the sweet spot to aim for. It helps to separate the nice-to-knows from the need-to-knows: What can you use to help improve services from the ground up? What does library management want to find out? What does your larger campus administration need to know in terms of the value of your library’s services?
The biggest question for me, right now, is whether there is a way to gather data that will show us whether students who use our reference services achieve greater academic success than students who don’t get our help. Causality is hard to prove, but even a strong correlation would be encouraging to see! Tracking students while hiding their identities behind random ID numbers is certainly possible, but will asking students to “swipe in” be a deterrent? Perhaps this can be our next discussion in the CJCLS Community on ALA Connect!
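For libraries with staff comfortable scripting their own data pipeline, one way to hide identities behind stable random-looking IDs is a keyed one-way hash: the same student always gets the same pseudonym, so visits can later be matched against anonymized outcome data, but the stored value can’t be reversed into a student ID. This is only a minimal sketch, not a feature of any product named above; the secret key and the ID formats are assumptions for illustration.

```python
import hmac
import hashlib

# Assumption for illustration: in practice this key would be stored securely
# and rotated each academic year so pseudonyms can't be linked across years.
SECRET_KEY = b"rotate-me-each-academic-year"

def pseudonymize(student_id: str) -> str:
    """Return a stable, one-way pseudonym for a student ID.

    Uses an HMAC (keyed hash) so that someone without the key cannot
    recover or guess-and-check student IDs from the logged pseudonyms.
    """
    digest = hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]  # 12 hex chars is plenty for a campus-sized population

# The same student always maps to the same pseudonym...
assert pseudonymize("A0012345") == pseudonymize("A0012345")
# ...while different students map to different pseudonyms.
assert pseudonymize("A0012345") != pseudonymize("A0067890")
```

A plain unkeyed hash would not be enough here, since student ID numbers come from a small, guessable space; the secret key is what prevents re-identification by brute force.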
Want to join the conversation? See How to Use ALA Connect Like a Listserv.
Still need to join our community in ALA Connect? If you’re not an ALA member, create a free login. With your ALA login, you can go directly to our community page and click the big blue JOIN COMMUNITY button up top.