Vance, J. M., Kirk, R., & Gardner, J. G. (2012). Measuring the impact of library instruction on freshman success and persistence. Communications in Information Literacy, 6(1), 49-58.

This study explored the relationship between one-shot library instruction classes and student performance and persistence from the first to the second year at Middle Tennessee State University. The data came from the library, the Student Information Unit, and the Finance Unit. For 3,330 students, the data included whether the student had taken a library instruction class in the Fall 2008 or Spring 2009 semester, demographic variables, academic preparedness variables (e.g., high school GPA, major, standardized entrance exam scores), college courses taken, and grades received. The data were then analyzed using various regression models.

Both an ordinary least squares (OLS) model and a Tobit model found library instruction to have a significant positive correlation with student grades. The OLS model found that library instruction could account for 35% of the variation in GPA, and that a student enrolled in a class that received library instruction would have a GPA an average of 0.09 points higher than that of a student not enrolled in such a class.
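As a rough illustration of the kind of model described above (not the authors’ actual code or variables), an OLS regression with a binary library-instruction indicator and academic-preparedness controls could be sketched with statsmodels; the data file and column names below are hypothetical.

```python
# Illustrative sketch only, not the study's code. Assumes a combined data set
# with one row per student and hypothetical columns: gpa, lib_instruction
# (1 if the student took a class that received library instruction, else 0),
# hs_gpa, and act_score.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("students.csv")  # hypothetical merged library/registrar extract

model = smf.ols("gpa ~ lib_instruction + hs_gpa + act_score", data=students).fit()

print(model.summary())                  # R-squared and coefficient table
print(model.params["lib_instruction"])  # estimated GPA difference associated with instruction
```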

 

Tenopir, C. (2013). Building evidence of the value and impact of library and information services: Methods, metrics and ROI. Evidence Based Library and Information Practice, 8(2), 270-274.

This paper outlines several factors to consider when determining academic library value. Tenopir explains that determining total library value is difficult because every collection or service may be viewed from multiple perspectives over different periods of time. However, most conceptualizations of value are ultimately concerned with user-centered outcomes, namely what library users are able to do because of the academic library and with what level of success. Return on investment (ROI) calculations and open-ended feedback are two of the methods mentioned for building evidence of library value. The LibValue project, which is sponsored by the U.S. Institute of Museum and Library Services, is also mentioned.
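As a minimal sketch of the generic ROI ratio Tenopir refers to, the calculation below uses invented figures and the standard (return minus investment) / investment formula; it is not drawn from the paper.

```python
# Generic return-on-investment ratio: (value returned - investment) / investment.
# The dollar figures below are invented for illustration only.
def roi(value_returned: float, investment: float) -> float:
    """ROI as a ratio; multiply by 100 for a percentage."""
    return (value_returned - investment) / investment

# e.g., $2.5M in outcomes attributed to the library against a $1.0M budget (hypothetical):
print(roi(2_500_000, 1_000_000))  # 1.5, i.e., a 150% return
```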

 

Stone, G., & Ramsden, B. (2012). Library Impact Data Project: Looking for the link between library usage and student attainment. College & Research Libraries, crl12-406.

This study was part of the Library Impact Data Project, a six-month project funded by Jisc. It investigated the relationship between student library activity and attainment at eight universities in the UK. Library activities included checkouts, access to e-resources, and physical access to the library. Attainment referred to the level of degree earned. The data contained information on 33,074 undergraduate students and were assessed using Kruskal-Wallis tests between groups of students. Focus groups with students were also conducted at the participating institutions.
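The Kruskal-Wallis comparison described above could be sketched as follows; this is an illustration with a hypothetical data file and column names (checkouts, degree_result), not the project’s code.

```python
# Illustrative sketch: compare checkout counts across degree-result groups
# with a Kruskal-Wallis H test. Data file and columns are hypothetical.
import pandas as pd
from scipy.stats import kruskal

usage = pd.read_csv("library_usage.csv")  # one row per student

groups = [g["checkouts"].values for _, g in usage.groupby("degree_result")]
statistic, p_value = kruskal(*groups)
print(f"H = {statistic:.2f}, p = {p_value:.4f}")  # a small p suggests usage differs by degree result
```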

The study found a positive relationship between checkouts and degree result, and between electronic resource access and degree result. These results held for the combined data from all eight institutions and at the individual institutions that had the data needed to run the analyses. The results of the focus groups were quite varied, but overall the participants felt that library resources were important, although not necessarily linked to their degree results.

 

Haddow, G. (2013). Academic library use and student retention: A quantitative analysis. Library & Information Science Research, 35(2), 127-136.

This study examined the relationship between academic library use and student retention at Curtin University (Australia). It combined two sets of data over a three-semester period, with collection points in June 2010, April 2011, and June 2011. The first data set came from students and included their ID, gender, age, level, socioeconomic status (SES) as determined by zip code, and retention, which was defined as ongoing enrollment at a single institution and completing studies within a certain time frame. The second data set came from the library and included logins to e-resources and borrowing. The library data were split into use levels, with zero logins/checkouts indicating the lowest level of use, 1-28 indicating a medium level of use, and 29+ indicating a high level of use.
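The zero / 1-28 / 29+ banding could be reproduced with a simple binning step like the sketch below, assuming a hypothetical per-student export; it is not the study’s code.

```python
# Illustrative sketch: band per-student e-resource logins into the study's
# low / medium / high levels. File and column names are hypothetical.
import pandas as pd

library = pd.read_csv("library_use.csv")  # one row per student, with a "logins" count

# 0 logins = low, 1-28 = medium, 29+ = high (the bands described in the study)
library["login_level"] = pd.cut(
    library["logins"],
    bins=[-1, 0, 28, float("inf")],
    labels=["low", "medium", "high"],
)
print(library["login_level"].value_counts())
```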

Over the three semesters the student population decreased from 6,330 to 4,883 to 4,684. The study found that retained students logged in to e-resources and borrowed physical resources more than students who were not retained; in fact, logins tended to increase over time. While older students withdrew at higher rates, no significant associations were found for SES, which could be due to the problematic classification of this variable based on zip code.

 

Matthews, J. R. (2012). Assessing library contributions to university outcomes: The need for individual student level data. Library Management, 33(6/7), 389-402.

This conceptual paper considered ways to examine academic library impact from student, instructor, and researcher perspectives. Matthews notes that most LIS studies focus on library instruction and/or information literacy programs. Unfortunately, the small sample sizes and stand-alone nature of these studies make it impossible to clearly correlate library resources and services with user and institutional outcomes such as student success, retention, and graduation rates.

Matthews’ suggested approach is to combine library and university data at the individual student level. Once a user’s library data have been combined with institutional and demographic data, their identity can be erased from the combined data set.
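A minimal sketch of this link-then-de-identify step, assuming hypothetical CSV extracts keyed on a shared student_id, might look like the following; Matthews does not prescribe a particular implementation.

```python
# Illustrative sketch: join library and institutional records on a student ID,
# then drop the identifier so the analysis set is de-identified.
# File and column names are hypothetical.
import pandas as pd

library = pd.read_csv("library_use.csv")   # e.g., checkouts and logins per student
registrar = pd.read_csv("registrar.csv")   # e.g., GPA, retention, demographics

combined = library.merge(registrar, on="student_id", how="inner")
combined = combined.drop(columns=["student_id"])  # identity removed after linking
combined.to_csv("deidentified_analysis_set.csv", index=False)
```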

Additionally, the paper covers student learning frameworks; the literature on the library’s contribution to university outcomes; broad-based data analysis; library data collection that supports assessment; the library as collections and services space, virtual space, and community space; ways to combine data; and challenges to the process.

 

Research data services bibliography by Minglu Wang of the ACRL Research Planning and Review Committee (RPRC).

(Part of RPRC’s 2015 Environmental Scan.)

Akers, Katherine G., and Jennifer A. Green. 2014. “Towards a Symbiotic Relationship Between Academic Libraries and Disciplinary Data Repositories: A Dryad and University of Michigan Case Study.” International Journal of Digital Curation 9 (1): 119–31. doi:10.2218/ijdc.v9i1.306.

Budin-Ljøsne, Isabelle, Julia Isaeva, Bartha Maria Knoppers, Anne Marie Tassé, Huei-yi Shen, Mark I McCarthy, and Jennifer R Harris. 2014. “Data Sharing in Large Research Consortia: Experiences and Recommendations from ENGAGE.” European Journal of Human Genetics 22 (3): 317–21. doi:10.1038/ejhg.2013.131.

Carlson, Jake, Lisa Johnston, Brian Westra, and Mason Nichols. 2013. “Developing an Approach for Data Management Education: A Report from the Data Information Literacy Project.” International Journal of Digital Curation 8 (1): 204–17. doi:10.2218/ijdc.v8i1.254.

“CHORUS: Advancing Public Access to Research.” 2014. http://www.chorusaccess.org/.

Creamer, Andrew T, Elaine R Martin, and Donna Kafel. 2014. “Research Data Management and the Health Sciences Librarian.” Library Publications and Presentations, Paper 147.

Diekema, Anne R., Andrew Wesolek, and Cheryl D. Walters. 2014. “The NSF/NIH Effect: Surveying the Effect of Data Management Requirements on Faculty, Sponsored Programs, and Institutional Repositories.” The Journal of Academic Librarianship 40 (3-4): 322–31. doi:10.1016/j.acalib.2014.04.010.

“DIL: Data Information Literacy.” 2013. DIL: Data Information Literacy. http://wiki.lib.purdue.edu/display/ste/home.

“DIL Symposium.” 2013. DIL: Data Information Literacy. http://wiki.lib.purdue.edu/display/ste/Symposium.

Elmore, Justina M., and Charissa O. Jefferson. 2014. “Business Librarians Donning the Data Hat: Perspectives on This New Challenge.” Public Services Quarterly 10 (3): 252–62. doi:10.1080/15228959.2014.931206.

Erway, Ricky. 2013. Starting the Conversation: University-Wide Research Data Management Policy. Dublin, Ohio: OCLC Research. http://www.conference-center.oclc.org/content/dam/research/publications/library/2013/2013-08.pdf.

Ferguson, Liz. 2014. “How and Why Researchers Share Data (and Why They Don’t).” Wiley Exchange. December 3. http://exchanges.wiley.com/blog/2014/11/03/how-and-why-researchers-share-data-and-why-they-dont/.

Giarlo, Michael J. 2013. “Academic Libraries as Data Quality Hubs.” Journal of Librarianship and Scholarly Communication 1 (3): eP1059. doi:10.7710/2162-3309.1059.

Hense, Andreas, and Florian Quadt. 2011. “Acquiring High Quality Research Data.” D-Lib Magazine 17 (1/2). doi:10.1045/january2011-hense.

Holdren, John P. 2014. “Memorandum for the Heads of Executive Departments and Agencies: Improving the Management of and Access to Scientific Collections.” Office of Science and Technology Policy. http://www.whitehouse.gov/sites/default/files/microsites/ostp/ostp_public_access_memo_2013.pdf.

Johnston, Lisa. 2014. A Workflow Model for Curating Research Data in the University of Minnesota Libraries: Report from the 2013 Data Curation Pilot. University of Minnesota – Twin Cities. http://hdl.handle.net/11299/162338.

Johnston, Lisa, and Jon Jeffreys. 2014. “Data Management Skills Needed by Structural Engineering Students: Case Study at the University of Minnesota.” Journal of Professional Issues in Engineering Education and Practice 140 (2): 05013002. doi:10.1061/(ASCE)EI.1943-5541.0000154.

Kim, Youngseek, and Jeffrey M. Stanton. 2012. “Institutional and Individual Influences on Scientists’ Data Sharing Practices.” Journal of Computational Science Education 3 (1): 47–56.

Martin, Elaine, Tracey Leger-Hornby, and Donna Kafel. 2012. Frameworks for a Data Management Curriculum: Course Plans for Data Management Instruction to Undergraduate and Graduate Students in Science, Health Sciences, and Engineering Programs. http://library.umassmed.edu/data_management_frameworks.pdf.

Marx, Vivien. 2012. “My Data Are Your Data.” Nature Biotechnology 30 (6): 509–12.

Mayernik, Matthew S, Lynne Davis, Karon Kelly, Bob Dattore, Gary Strand, Steven J Worley, and Mary Marlino. 2014. “Research Center Insights into Data Curation Education and Curriculum.” In Theory and Practice of Digital Libraries – TPDL 2013 Selected Workshops, edited by Łukasz Bolikowski, Vittore Casarosa, Paula Goodale, Nikos Houssos, Paolo Manghi, and Jochen Schirrwagen, 416:239–48. Communications in Computer and Information Science. Cham: Springer International Publishing. doi:10.1007/978-3-319-08425-1.

Miller, Laniece E., James E. Powell, Joyce A. Guzik, Paul A. Bradley, and Lillian F. Miles. 2014. “A Pilot Project to Manage Kepler-Derived Data in a Digital Object Repository.” Science & Technology Libraries 33 (3): 280–88. doi:10.1080/0194262X.2014.927339.

“New England Collaborative Data Management Curriculum.” 2015. Lamar Soutter Library. Accessed January 10. http://library.umassmed.edu/necdmc/index.

Nilsen, Karl, Robin Dasler, Trevor Muñoz, and Sarah Hovde. 2013. “The Position of Library-Based Research Data Services: What Funding Data Can Tell Us.” University of Maryland, College Park, April 5. http://drum.lib.umd.edu/handle/1903/14742.

Olendorf, Robert, and Steve Koch. 2012. “Beyond the Low Hanging Fruit: Archiving Complex Data and Data Services at University of New Mexico.” Journal of Digital Information 13 (1). https://journals.tdl.org/jodi/index.php/jodi/article/view/5878/5882.

Palmer, Carole L, Cheryl A Thompson, Karen S Baker, and Megan Senseney. 2014. “Meeting Data Workforce Needs: Indicators Based on Recent Data Curation Placements.” In iConference 2014 Proceedings. iSchools. doi:10.9776/14133.

Peer, Limor, Ann Green, and Elizabeth Stephenson. 2014. “Committing to Data Quality Review.” International Journal of Digital Curation 9 (1): 263–91. doi:10.2218/ijdc.v9i1.317.

Prado, Javier Calzada, and Miguel Ángel Marzal. 2013. “Incorporating Data Literacy into Information Literacy Programs: Core Competencies and Contents.” Libri 63 (2): 123–34.

Shorish, Yasmeen. 2012. “Data Curation Is for Everyone! The Case for Master’s and Baccalaureate Institutional Engagement with Data Curation.” Journal of Web Librarianship 6 (4): 263–73. doi:10.1080/19322909.2012.729394.

Tenopir, Carol, Suzie Allard, Kimberly Douglass, Arsev Umur Aydinoglu, Lei Wu, Eleanor Read, Maribeth Manoff, and Mike Frame. 2011. “Data Sharing by Scientists: Practices and Perceptions.” PLoS ONE 6 (6): e21101. doi:10.1371/journal.pone.0021101.

Tenopir, Carol, Robert J. Sandusky, Suzie Allard, and Ben Birch. 2013. “Academic Librarians and Research Data Services: Preparation and Attitudes.” IFLA Journal 39 (1): 70–78.

Williams, Sarah C. 2013a. “Data Sharing Interviews with Crop Sciences Faculty: Why They Share Data and How the Library Can Help.” Issues in Science and Technology Librarianship 72. doi:10.5062/F4T151M8.

Williams, Sarah C. 2013b. “Using a Bibliographic Study to Identify Faculty Candidates for Data Services.” Science & Technology Libraries 32 (2): 202–9. doi:10.1080/0194262X.2013.774622.

 

Jantti, M., & Cox, B. (2013). Measuring the value of library resources and student academic performance through relational datasets. Evidence Based Library and Information Practice, 8(2), 163-171.

This conference paper describes the University of Wollongong Library’s Library Cube. The project uses datasets from the library, university administration, information technology services, and the Performance Indicator Project (PIP) Team, a university-level team that creates and maintains performance data. The Library Cube, which was built by PIP, contains datasets on student demographics, resource usage, and academic performance, and is updated weekly. Its purpose is to improve accountability and to support process improvement and marketing.
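A greatly simplified, hypothetical sketch of the join-and-summarize idea behind such a cube appears below; the real Library Cube is a data warehouse refreshed weekly, and the table and column names here are invented.

```python
# Illustrative sketch only: join weekly usage to a student dimension and
# summarize an academic-performance measure by faculty and borrowing band.
# Table and column names are invented; this is not the Library Cube itself.
import pandas as pd

usage = pd.read_csv("weekly_usage.csv")          # student_id, week, loans
students = pd.read_csv("student_dimension.csv")  # student_id, faculty, average_mark

cube = usage.merge(students, on="student_id")
cube["loan_band"] = pd.cut(cube["loans"], bins=[-1, 0, 5, 20, float("inf")],
                           labels=["none", "low", "medium", "high"])

summary = cube.pivot_table(values="average_mark", index="faculty",
                           columns="loan_band", aggfunc="mean", observed=True)
print(summary)
```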

Preliminary results from using the Library Cube data indicate that there is a positive correlation between borrowing and academic performance, although e-resource usage had not been added to the Cube at the time the paper was written. It is hoped that the Library Cube will facilitate improved collection development, academic relationships, and marketing. With regard to the latter, the researchers hope to identify what types of users require more outreach and the effect of marketing activities on different groups of students.

 

Gann, L. B., & Pratt, G. F. (2013). Using library search service metrics to demonstrate library value and manage workload. Journal of the Medical Library Association: JMLA, 101(3), 227.

This study explored how the searches performed at a medical research library at the University of Texas contributed to requester outcomes, such as publication and education. The data consisted of requester information, search topic, number of hours spent searching, and the outcomes the searches contributed to over a two-year period. These outcomes included publication of scholarly research, grant proposals, and education.

Overall, there was information on 989 searches that took 4,400 hours. The three outcomes most supported by the searches were metrics (e.g., for publications), publication of articles and books, and non-patient education (e.g., classes and presentations). However, more time was spent on searches that supported systematic reviews and clinical effectiveness/guidelines. The study authors noted that framing their impact in terms of user-centered outcomes was effective in demonstrating the library’s value to various university stakeholders.
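A hypothetical sketch of this kind of workload reporting, tallying searches and hours by the outcome they supported, might look like the following; the column names are invented and this is not the authors’ code.

```python
# Illustrative sketch: summarize a search log by supported outcome.
# Assumes a hypothetical log with one row per search and columns
# "outcome" and "hours".
import pandas as pd

searches = pd.read_csv("search_log.csv")

report = searches.groupby("outcome").agg(
    searches=("outcome", "size"),
    total_hours=("hours", "sum"),
    mean_hours=("hours", "mean"),
).sort_values("total_hours", ascending=False)
print(report)
```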

 

Gerke, J., & Maness, J. M. (2010). The physical and the virtual: the relationship between library as place and electronic collections. College & Research Libraries, 71(1), 20-31.

This study investigated patron satisfaction with electronic collections. The data came from 520 LibQUAL+ surveys returned at the University of Colorado – Boulder in 2006. The variables considered in the study included the respondents’ responses to all questions related to the electronic collections, their library usage levels, age, discipline, and the physical library most often used. Correlations were run between the electronic collection satisfaction scores, usage levels, and age. Independent t-tests compared electronic collection satisfaction scores across disciplines and across the physical library most often used.
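The two kinds of tests described above could be sketched as follows; Pearson’s r is used purely for illustration, and the survey data frame and its columns are hypothetical rather than the actual LibQUAL+ export.

```python
# Illustrative sketch: a correlation and an independent t-test on hypothetical
# survey data. Assumes columns "website_use", "ecollection_satisfaction", and
# a boolean "main_library_user".
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

survey = pd.read_csv("libqual_responses.csv")

# Correlation between website-use frequency and satisfaction with e-collections.
r, p = pearsonr(survey["website_use"], survey["ecollection_satisfaction"])
print(f"r = {r:.2f}, p = {p:.4f}")

# Independent t-test: main-library users vs. users of other branch libraries.
main = survey.loc[survey["main_library_user"], "ecollection_satisfaction"]
other = survey.loc[~survey["main_library_user"], "ecollection_satisfaction"]
t, p = ttest_ind(main, other, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```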

The only significant correlation found in the first set of tests indicated that the more a respondent used the library’s website, the higher their satisfaction with the electronic collections. The only significant t-test result indicated that respondents who used the main library, which was also the oldest, were less satisfied with the electronic collections. This result prompted the study authors to suggest that users’ perceptions of physical resources may influence their perceptions of electronic resources. Such a relationship could be one of many that exist between library factors and user satisfaction or valuation of the library, but more research is needed into why this relationship exists and how it can be modified.

 

Gatten, J. N. (2004). Measuring consortium impact on user perceptions: OhioLINK and LibQUAL+™. The Journal of Academic Librarianship, 30(3), 222-228.

This study used LibQUAL+™ scores to investigate the appropriateness of comparing libraries in the same consortium. The author speculated that comparing libraries within a consortium would be more effective than comparing them with non-members because consortium libraries share more similar characteristics. The libraries in the study belonged to the OhioLINK consortium in 2002 and 2003 and consisted of 84 academic libraries and the State Library of Ohio. The adequacy gap scores of the public universities, community colleges, private colleges, and branch campuses were compared.
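For context, a LibQUAL+™ adequacy gap score is the perceived service level minus the minimum acceptable level; the sketch below shows that calculation on a hypothetical item-level data set and is not the study’s code.

```python
# Illustrative sketch: compute adequacy gap scores (perceived minus minimum)
# from hypothetical item-level LibQUAL+ responses, then compare institution types.
import pandas as pd

responses = pd.read_csv("libqual_items.csv")  # minimum, desired, perceived, institution_type

responses["adequacy_gap"] = responses["perceived"] - responses["minimum"]
print(responses.groupby("institution_type")["adequacy_gap"].mean())
```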

The author concluded that comparing libraries in the same consortium was more useful because the shared policies within the consortium helped contextualize changes in the LibQUAL+™ scores. Some of these explanations included changes in access policies that facilitated borrowing between institutions, increases in the availability of online help, and the dissemination of information about useful services. The findings also indicated which institutions showed the most improvement after consortium-wide changes.
