2020 Sessions & Presentations

Sessions & Presentation Abstracts

Proposals have been reviewed and presenters have been notified of decisions. Session abstracts will be added below as they come in with revisions from presenters. Please check back often!

Concurrent Sessions

Best Practices for Whom? Building Equitable and Accessible Research Data Management Education and Support in Academic Libraries

Dr. Hannah Gunderman, Carnegie Mellon University Libraries
Leah Cannon, MSIS Candidate, University of Tennessee School of Information Sciences
Kevin Mallary, Doctoral Student, University of Tennessee School of Information Sciences

Librarians providing research data management (RDM) services support patrons’ original research through education on how to create organized, documented, accessible, and reusable research data. RDM education, outreach, and services provided to researchers in academic libraries generally follow a cookbook of best practices for managing these research data. Best practices, however, often assume the researcher is operating at a baseline socioeconomic status, with access to certain institutional services and certain cognitive and physical abilities. Librarians can provide researchers with techniques for overcoming barriers to RDM while also reframing best practices around accessibility and equity.

This concurrent session pairs LIS graduate students interested in RDM and accessibility with a faculty librarian in RDM to approach equitable RDM support from multiple topic areas. Using a “calling in” approach, we aim to identify opportunities within RDM best practices to use compassionate language, provide inclusive technical support that factors in socioeconomic status, and build in considerations for accessibility. This session also lays the groundwork for the Toolkit for Equitable RDM Support in Libraries, a community-created toolkit of resources for students and librarians working in RDM, slated to debut in summer 2020. The toolkit will incorporate the findings from the papers in this session, combined with audience feedback, to provide a community-curated resource for new and experienced RDM librarians supporting patrons’ original research.

But, we’ve always done it this way: a mixed-methods approach to reconsidering library workshops

Jesse Klein, Mina Akbari, and Dan (Brew) Schoonover
Florida State University

Academic libraries have established trends for developing and implementing library programming that the profession tends to apply across the board, with little needs assessment or evaluation. We are interested in connecting our current and future instructional programming with the research and teaching needs of undergraduate students, graduate students, and faculty at FSU. This project is exploratory and intended to help improve instructional programming at FSU Libraries so that we may better serve our campus community.

The Florida State University Libraries provides instruction workshops each semester for both FSU students and faculty members on a variety of research-related topics. Workshop topics are typically chosen based on personal interactions with students, librarian intuition, informal observation of researcher behavior, or local, small-scale assessment. There is a gap in actual data informing the Libraries of the perceived needs of FSU researchers and how the Libraries can best provide instruction services. The goal of this study is to strategically develop library instruction programming based on the collection and assessment of data rather than on historical precedent or our own assumptions.

We are contributing to existing knowledge in several ways. Most importantly, we will introduce a comprehensive mixed-methods research design not currently demonstrated in the literature. By introducing a more in-depth triangulation approach, this study will build upon existing studies that have informed library instruction programs by surveying or interviewing their local research communities. In this presentation, we will share the research planning and design process we used to assess the existing data and information upon which workshop planning has been based. We will also examine the decisions that went into developing a mixed-methods approach triangulating a broad survey, in-depth interviews, and subsequent focus groups, as well as the benefits and challenges of this type of research design.

For example, after identifying the lack of mixed-methods research designs in the literature pertaining to library workshops, we discussed how no single method is complete or comprehensive. To align with extant research, we agreed that a survey would be the first logical step, while also acknowledging that surveys provide only one layer of quantitative information. Since we are also exploring users’ preferences and desires, interviews and focus groups are a better method for capturing those qualitative responses. Another benefit of focus groups is that they facilitate group learning about how others use library services and about the experiences that inform their preferences and decisions. Taken together with the survey, we will be better positioned to answer both the “what” and the “why” of users’ preferences and needs related to the content and delivery of library workshops. While there are challenges to any approach, we attempted to balance issues related to time, capacity, recruitment strategies, and whether to apply for funding for incentives. One of our strategies for addressing these challenges is a more thorough communication plan for the multiple stages of recruitment, which we anticipate will counteract the lack of incentives. By the time of the conference, we anticipate being able to share preliminary findings from the research.

Creating a Research Support Framework at a New High Research Activity University

Stephanie Crowe, University of North Carolina Wilmington, Library

This past year, our university was reclassified to R2 status (Doctoral Universities – High Research Activity) in the Carnegie Classification system. While our library is highly regarded on campus, faculty have typically perceived it as mainly focused on supporting students. As our new Coordinator of Liaison Librarian Services, I have been charged in part with collaborating with units across the library and across campus to reconceptualize the library’s role as a research partner for graduate students and faculty. In this presentation, I will discuss our goals, processes, growing pains, and outcomes thus far. Attendees will leave with information about the methods by which one new R2 library went about identifying funding, service, and staffing needs, details about the specific gaps that we identified, and the initial steps that we are taking to meet those needs.

Data sharing policies in the social sciences: How business journals compare, and implications for subject librarians

Brianne Dosch, University of Tennessee Libraries
Tyler Martindale, Auburn University Libraries

Within the social sciences, data management and data sharing have become standard practices, often required by publishers and academic institutions. This “open data” movement benefits researchers committed to data sharing and reuse, and it is in line with the overarching trends of Open Access and Open Science within the social sciences research community. However, for business researchers within the social sciences, data needs tend to stress discovery, visualization, and analysis rather than data management, making the trend of “open data” less critical. Put simply, even though research data management support is a major trend in academic libraries, data services often look different for different disciplines. For business researchers specifically, these differences need to be explored to define which data services business librarians should curate and focus on. To identify these data culture differences, two social sciences subject librarians reviewed and documented the data policies of ~140 business journals. These policies were then analyzed for trends specific to business research and compared to the general data culture discussion taking place within the rest of the social sciences (i.e., Open Data and Open Science). This session will focus on (a) how and why we decided that analyzing business journals’ data policies would help us answer our research questions; (b) our process and methodology for reviewing and analyzing the business journals; and (c) strategies for applying a similar methodology in different subject areas. It will also include a discussion of how we connected our findings to the larger conversation of open data within the social sciences and why it is important to discuss the unique data needs of business researchers. Librarians supporting specific schools or departments may benefit from a concentrated understanding of their constituents’ data needs, and this presentation will share how we sought to begin uncovering the specialized data needs of business researchers.
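
As a hedged illustration of the tallying step in such a policy review (not the presenters’ actual code), coded results could be summarized with pandas; the file name, column names, and policy categories below are hypothetical.

    import pandas as pd

    # Each row is one coded journal; "policy_type" holds categories such
    # as "required", "encouraged", or "none" (categories illustrative).
    policies = pd.read_csv("business_journal_policies.csv")

    # Share of journals in each data-policy category.
    print(policies["policy_type"].value_counts(normalize=True).round(2))

    # Cross-tabulate policy strength against publisher to surface trends.
    print(pd.crosstab(policies["publisher"], policies["policy_type"]))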

Email Interviews in Practice: A Multi-Site, Qualitative Case Study from the ACRL Visual Literacy Task Force

Dana Statton Thompson, Murray State University

The new ACRL Visual Literacy Standards Task Force was formed in 2018 to update the 2011 Association of College and Research Libraries (ACRL) Visual Literacy Competency Standards for Higher Education and align them with the 2016 ACRL Framework for Information Literacy for Higher Education. As part of this effort, we are conducting a qualitative study using email interviews to understand outside stakeholders’ professional views on definitions of visual literacy, visual literacy pedagogy, and crucial 21st-century visual literacy skills and competencies. In this presentation, as a representative and co-principal investigator of the group, which is composed of eight women at eight different organizations, I will share lessons learned and practical advice for librarians interested in multi-site projects, especially in regard to choosing principal investigators and navigating IRB processes and procedures; choosing qualitative methods over quantitative and using a grounded theory approach to coding; and the pitfalls and successes of using email interviews rather than face-to-face interviews. Attendees will leave the session with suggested resources on these topics and a better understanding of conducting multi-site research.

Faculty Mentor & Librarian: Leading Original Research toward Student Presentation

Derek Malone, University of North Alabama Libraries

Our campus has an undergraduate research initiative. The culmination of the original research for each student is a presentation at the end of the year during “Scholars Week,” the annual showcase of research endeavors.

Each student requires a faculty mentor in the undergraduate research process. This presentation will cover what began as a first-time initiative and is now a sustained practice for our library: a librarian serving as the faculty research mentor. Now four years into serving as a research faculty mentor, I’ll share what I’ve learned about recruiting students (disciplines, background, academic standing, etc.), choosing research projects with students (e.g., matching their research interests with mine), conducting the research, analyzing the research, and motivating students toward presentation with extensive evaluation.

Future initiatives and opportunities concerning outreach, the research process, and librarian-led research will also be discussed.

Get a (Study) Room: Assessing Room Use Patterns from Pre-Existing Data Sources

Norah Mazel, University of Colorado, Colorado Springs

In various surveys conducted by the library at the University of Colorado Colorado Springs, students often request more quiet space, more collaborative space, and more group study rooms – with a recurring complaint being that the study rooms are always full and that these rooms intended for groups are often occupied by one person. However, qualitative survey responses, particularly feedback in open text comments, raise a larger question: Does this anecdotal data reflect a few bad student experiences or a consistent problem? The library identified two options for investigating these concerns. We could design a new study to examine study room use patterns or look for evidence that either supported or refuted students’ perceptions inside secondary data the library already had access to. This presentation describes the latter process of examining pre-existing data to answer the questions of whether the study rooms are consistently full and whether they are occupied by single users more frequently than groups. It will outline how we initially identified data sets that could provide insights about these questions, assessed the gaps left by that data, and determined whether combining data from those sources addressed these concerns. By merging library headcounts collected with SUMA and room reservations made through LibCal, we confirmed that we had a clear enough quantitative picture of study room use to make an original study unnecessary. Attendees will learn about the benefits of working with secondary data to determine whether they can answer a research question with existing information before committing time and resources to an original study. The presentation will also highlight aspects to consider when approaching secondary data in order to determine whether meaningful information can be extracted from it, whether it is feasible to merge data from multiple sources, and what questions the data leaves unanswered.
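
A minimal sketch of how such a merge might look, assuming hourly CSV exports from SUMA and LibCal; all file and column names are hypothetical, not the library’s actual schema.

    import pandas as pd

    headcounts = pd.read_csv("suma_headcounts.csv", parse_dates=["timestamp"])
    reservations = pd.read_csv("libcal_reservations.csv", parse_dates=["start"])

    # Round both sources to the hour so they share a join key.
    headcounts["hour"] = headcounts["timestamp"].dt.floor("h")
    reservations["hour"] = reservations["start"].dt.floor("h")

    # Join on room and hour; a group study room observed with a single
    # occupant suggests one person using a space intended for groups.
    merged = headcounts.merge(reservations, on=["room", "hour"], how="left")
    share_single = (merged["occupants"] == 1).mean()
    print(f"{share_single:.0%} of observed room-hours had a single occupant")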

Qualifying Virtual Reference: An Examination of Chat Data in an Academic Library Setting

Kat Brooks, Michael Deike, Sarah Johnson, and Niki Kirkpatrick
University of Tennessee Libraries

The University of Tennessee, Knoxville’s John C. Hodges Library provides research assistance via chat any time the library is open, including overnight on weeknights and until midnight on weekends. Because of this wide timespan, chat services are provided by a large team of Commons Librarians, Public Services staff, Subject Librarians, and Graduate Student Library Assistants working various shifts. Curious about the quality of chat provided by a team this size, as well as the consistency of services provided by professionals at various levels, the Commons Librarians sought to develop a method of assessing chat interaction quality. To this end, we sought to determine a metric to quantify what makes a good chat interaction.

In this presentation we will discuss the creation of a rubric designed to quantify what makes a good chat interaction. To create the rubric, we began with the sections of the RUSA Guidelines for Behavioral Performance of Reference and Information Service Providers that focus on remote reference. From there we narrowed the rubric to areas that directly applied to our context as a high-traffic academic library. With this in mind, we sought to consistently measure the speed of transactions, the pleasantness of transactions, and the accuracy of the information provided. To ensure standard application of the rubric, we tested it on 10% of our sample of chat data and convened our team to iron out inconsistencies and reach agreement. During this process we identified exemplar chats representing the best and worst transactions, creating a scale between the two to establish a quality standard.
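
The abstract does not name an agreement statistic, but one common way to verify that raters apply a rubric consistently on a calibration sample is Cohen’s kappa; the scores below are invented for illustration.

    from sklearn.metrics import cohen_kappa_score

    # Rubric scores (e.g., 1-4) assigned by two raters to the same chats.
    rater_a = [4, 3, 3, 2, 4, 1, 3, 4, 2, 3]
    rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 2, 3]

    kappa = cohen_kappa_score(rater_a, rater_b)
    print(f"Cohen's kappa: {kappa:.2f}")  # values near 1 = strong agreement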

While the project was overall successful in providing a measure of our chat service’s success, designing the rubric highlighted limitations to the use of rubrics in the evaluation of chat. Attempting to standardize the application of the rubric revealed the bias intrinsic to measuring the correctness or incorrectness of reference transactions. Basing the rubric on the RUSA Guidelines also highlighted the limits of premade standards when applied across all contexts.

This process of assessment development extends beyond chat: other types of library customer service assessment, including library instruction, can benefit from this presentation, as the rubric development process is broadly applicable. Because rubric development is critical to quality assessment, this insight into the process may clarify or suggest alternative approaches to rubric and quality assessment development.

Using the Research Consultation to Empower Researchers Struggling with Impostor Syndrome

Vanjury “V” Dozier, University of San Diego

Impostor phenomenon (also known as IP, impostor syndrome, or impostor feelings) generally refers to a person’s feelings of inadequacy in situations despite evidence to the contrary. Scholars also recognize IP’s likelihood to negatively affect those with historically marginalized identities, particularly when they are in predominantly white, male, and/or heterosexual environments or institutions. How do we as librarians learn to recognize impostor syndrome? How can we empower our researchers to overcome their feelings of inadequacy during research consultations? How can we infuse critical pedagogy and librarianship into our research consultations? This interactive workshop will use a scenario-recommendation approach to help participants learn to recognize IP, explore best practices for research consultations, and empower their researchers.

Using Websites to Study Library Resources, Services, and Organizations

Ashley Sergiadis, East Tennessee State University Libraries

Libraries rely on websites to inform patrons about their resources, services, and organizations. Consequently, these websites are a rich source of research data for discovering library trends, whether that means determining which databases are most commonly offered or which departments have the most employees. Content from library websites not only allows researchers to explore these types of questions but also indicates how libraries communicate information to patrons.

Come to this session to learn when and how to conduct a content analysis of library websites. It will begin with an overview of the types of research questions that can be explored using this method, including examples from research projects in the library and information science literature. The presenter will then delve into the process of conducting a content analysis of library websites, breaking down the general steps and providing helpful tips along the way. These steps will be demonstrated with an example of a study conducted by the presenter that used information from library websites to determine which academic library departments (collections/technology, research/instruction, etc.) manage institutional repositories. Lastly, the session will conclude with an activity that walks attendees through formulating a research question and designing a content analysis to answer it.
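
As a taste of the mechanics, one step of a website content analysis might look like the following sketch: fetch a page and record which coded terms appear. The URL and coding categories are placeholders, not the presenter’s study design.

    import requests
    from bs4 import BeautifulSoup

    url = "https://library.example.edu/departments"  # placeholder URL
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()

    # Record which coded terms appear on the page.
    coding_scheme = ["institutional repository", "digital collections",
                     "research and instruction", "scholarly communication"]
    print({term: term in text for term in coding_scheme})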

Lightning Talk Sessions

Analyzing and Visualizing a Hospital Library’s Patrons: Utilizing Data to Shape Library Services

David Petersen, Kelsey Grabeel, and Cameron Watson
University of Tennessee Graduate School of Medicine Library

Objectives:
Researchers analyzed and visualized a hospital library’s consumer health request database to better understand who was using the library and how the library was reaching the hospital’s 21-county service area. Data was assessed using defined methods to examine users by county, usage from high-poverty zip codes, and areas where future outreach is needed.
Methods:
Researchers downloaded data from the Consumer and Patient Health Information System (CAPHIS) database, located on a hospital server; data has been collected in this database since 1997. Every request form asks for the patron’s name, date, county, zip code, and topic requested. Requests are labeled with a form identification number, which allows personal names to be deleted, helping to preserve anonymity. Researchers manually eliminated requests that did not include demographic information or that used the hospital’s zip code as a placeholder, and out-of-area requests were removed. Researchers separated the data into pre-move and post-move periods, then sorted the request forms by county and zip code. The data was imported into Tableau to create several maps visually showing where the largest concentrations of patrons are located, as well as usage by zip code based on poverty levels (a sketch of this cleaning workflow follows the abstract).
Results:
There were 3,141 health information requests from September 21, 2014 to May 31, 2019, representing a 208% average monthly increase from the old location. The majority of requests came from Knox County and adjacent counties. Requests were also received from counties not previously reached in a 2012 study and from counties with elevated poverty levels, including some of the highest-poverty zip codes in the hospital’s service region.
Conclusion:
Collecting data on patron interactions is not only critical for institutional reporting, but also for community outreach. Understanding that data requires taking additional steps to ask precise questions, filter the information, assess local demographics, and provide quality data in a visual format for institutional representatives. Researchers anticipate being able to better tailor services to the community based on the results.
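
A hedged pandas sketch of the cleaning and aggregation the Methods describe; the file name, column names, placeholder zip code, and county list are illustrative, not values from the study.

    import pandas as pd

    requests_df = pd.read_csv("caphis_requests.csv",
                              parse_dates=["date"], dtype={"zip_code": str})

    # Drop requests lacking demographics or using the hospital's own zip
    # code as a placeholder, then keep only service-area counties.
    HOSPITAL_ZIP = "00000"                           # placeholder value
    service_counties = ["Knox", "Blount", "Sevier"]  # truncated example

    cleaned = (requests_df
               .dropna(subset=["county", "zip_code"])
               .query("zip_code != @HOSPITAL_ZIP")
               .query("county in @service_counties"))

    # Split at the library's move date for pre-/post-move comparison.
    pre_move = cleaned[cleaned["date"] < "2014-09-21"]
    post_move = cleaned[cleaned["date"] >= "2014-09-21"]
    print(post_move.groupby(["county", "zip_code"]).size())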

Jumpstarting a Culture of Research in an Academic Library, Step 1: Journal Club

Megan Sheffield, Jenessa McElfresh & Maggie Albro
Clemson University Libraries

Clemson University recently attained Carnegie R1 classification and simultaneously revised and updated its library faculty guidelines to put more emphasis on research output. As a result, faculty librarians (especially the pre-tenure crowd) face unprecedented pressure to perform research to achieve tenure. Because this shift is so recent, there is not yet a culture of talking about research methods or fitting projects into schedules, let alone envisioning librarians as research faculty on par with institutional peers. One initiative we have started to jumpstart our culture of research is a bimonthly Journal Club. We select research articles from library science journals and meet to discuss their merits, methodologies, and weaknesses. While the concept of a Journal Club is not revolutionary, its usage in this context might be useful to other research librarians who seek to jumpstart a culture of research at their institutions. We will discuss several tips and best practices we’ve implemented, as well as our next steps toward growing our culture of research.

Lightning in a bottle: Channeling undergraduate energy into a polished research proposal

Karen N. Reed, Ph.D., Middle Tennessee State University Library

Motivated undergraduates may bring great passion to their subject matter yet lack the training to conduct original research or even understand how to get started. One librarian will present her experience mentoring a highly motivated undergraduate who faced obstacles in securing a faculty research partner in his discipline. Based on her own experience creating a research proposal during her dissertation, the librarian helped the student craft a strong proposal; this document ultimately secured the commitment of a faculty member in the student’s major course of study. The presentation will begin by discussing strategies for helping students select a topic realistic for their interests and resources. Practical steps for creating a quality undergraduate research proposal will be presented, including a proposal template and other planning materials developed during the course of the project.

Measuring library support for institutional research endeavors using a return-on-investment model

Douglas L. Varner, Jett McCann, and Jennifer Kluge
Dahlgren Memorial Library, Georgetown University Medical Center

Nancy Woelfl, University of Nebraska Medical Center

Dahlgren Memorial Library at Georgetown University Medical Center implemented and refined an established return-on-investment model, adapted for use in health sciences libraries, that calculates a quantitative measure of the impact of institutional investment in library resources and services on extramural grant income.

The model used in this analysis generates a dollar figure based on data points derived from NIH grant data and analysis of the cited-literature sections of funded NIH grant proposals. Results of the study demonstrated that for every dollar the institution invests in library resources, $1.89 in grant income is realized. The model will be described, followed by a discussion of strategies for more widespread implementation of the model for benchmarking and assessment purposes.
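
The final arithmetic of such a figure is simple to sketch; the inputs below are invented placeholders, since the published model derives them from NIH grant data and cited-literature analysis not reproduced here.

    # Simplified illustration of the ROI calculation's last step only.
    library_investment = 1_000_000         # institutional spend (placeholder)
    attributable_grant_income = 1_890_000  # grant income tied to library use

    roi = attributable_grant_income / library_investment
    print(f"${roi:.2f} in grant income per $1 invested")  # -> $1.89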

Not Just a Literature Review: The Librarian’s Role in Research Review and Synthesis

Ann Hallyburton, Western Carolina University

This presentation will focus on how librarians can make researchers aware of the numerous avenues available to them for synthesizing research. When contemplating research, new and not-so-new researchers may focus solely on the familiar “primary” research methods traditionally used within their disciplines. Case and cohort studies, trials randomized or not, ethnographies, and various quantitative analyses represent just some of the mainstay research types to which would-be investigators limit themselves. The proliferation and variable quality of these studies have spurred the formalization of a whole range of review methodologies (more than forty!) to aid in evaluating and synthesizing them. As experts in information finding and organization, librarians must play a pivotal role in carrying out reviews and even lead the way in writing, publishing, and otherwise disseminating this growing type of research. The presentation will briefly familiarize participants with popular, widely applicable review types such as narrative, quantitative systematic, meta-analytic, integrative, and scoping reviews, and introduce some lesser-known review types that are newer or have recently gained notice outside a single field (e.g., qualitative systematic review, meta-synthesis, umbrella review, review of reviews). Participants will learn how to apply traditional search skills to locating formal guidelines for different review types and how to determine which guideline might work best when multiple applicable guidelines exist (as with concept analyses, etc.). The presenter will also share her experience forming interprofessional partnerships to conduct these reviews.

Quantitative Research Design as an Inexperienced Librarian Researcher

Isabella Baxter, University of Tennessee Libraries

I will present a case in which I, a librarian inexperienced with quantitative research, designed methods to assess the readability levels of veterinary client handouts from three veterinary information services and then wrote up the results of the analysis. I will describe the quantitative design choices I made in consultation with a statistician to answer the research questions: What are the readability levels of these handouts, and do they meet the sixth-grade reading level set for human medical information? Methods included collecting a total sample of 150 handouts, selecting the Simple Measure of Gobbledygook (SMOG) Index and the Flesch-Kincaid Grade Level Formula, and using a two-way analysis of variance (ANOVA) test to analyze and compare the readability levels. I will also discuss the process of writing up and displaying the results of this analysis in a manuscript. This lightning talk will be of interest to librarians who wish to know more about using quantitative methods in their research.
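
A hedged sketch of the pipeline the talk describes, scoring text with the textstat package and fitting a two-way ANOVA with statsmodels; the scores below are randomly generated for illustration, not the 150-handout sample.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import textstat
    from statsmodels.formula.api import ols

    # Score one handout's text with both readability formulas.
    text = ("Give one tablet by mouth twice daily. Offer food with each "
            "dose. Call the clinic if vomiting continues for two days.")
    print(textstat.smog_index(text), textstat.flesch_kincaid_grade(text))

    # Toy data: five handouts per service per formula.
    rng = np.random.default_rng(0)
    rows = [(s, f, rng.normal(9, 1)) for s in ["A", "B", "C"]
            for f in ["smog", "fk"] for _ in range(5)]
    df = pd.DataFrame(rows, columns=["service", "formula", "grade"])

    # Two-way ANOVA: does grade level vary by service, formula, or both?
    model = ols("grade ~ C(service) * C(formula)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))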

Reviewing published article supplements for insight into the documentation practices of systematic review authors

Mark MacEachern, Whitney Townsend, and Marisa Conte
University of Michigan

Health sciences librarians produce data as part of their systematic review work. As funders and journals enhance their data sharing policies, librarians will, as part of this work, need to share the data they create in an accessible and shareable format. At present, there are no standards about what data should be captured and how that data should be made available. In an effort to understand current documentation practices, we identified a sample set of recently published systematic reviews in PubMed and coded the associated data files. We developed a standardized data extraction form and calibrated coding across three reviewers. Through this method we hope to better understand current data documentation and curation practices within systematic reviews, and highlight potential areas of growth within the standards that govern best practice systematic review methodology.
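
As an illustration only (the abstract does not say how the sample was drawn), a recent set of systematic reviews could be pulled from PubMed with Biopython’s Entrez utilities; the query and email are placeholders.

    from Bio import Entrez

    Entrez.email = "researcher@example.edu"  # NCBI asks for a contact email

    handle = Entrez.esearch(
        db="pubmed",
        term='"systematic review"[Publication Type] AND 2019[PDAT]',
        retmax=200,
    )
    pmids = Entrez.read(handle)["IdList"]
    print(f"Sampled {len(pmids)} reviews; first PMIDs: {pmids[:5]}")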

Using REDCap to Reach out to Researchers: Happenstance and Seizing Opportunities for Outreach and Supporting Researchers

Will Dean, Temple University

Outreach is a priority for most libraries and librarians, especially those in new positions or with new liaison areas. Our library was looking to expand our services to clinical health science researchers, and we were considering different outreach options. At the same time, I (the new librarian) was given the chance to help support our hospital’s iteration of REDCap, a secure research data collection platform already used by most of our researchers. After discussion with the library and hospital administration, and after weighing outreach needs against the risks of overpromising (and burnout), we began to offer support for REDCap. Following the idea of “go where people already are,” we chose to focus my time on supporting a tool our researchers already needed to use, instead of devising new methods of outreach. Through hands-on workshops, on-site trainings, and numerous in-person and digital consultations, we have created new connections with researchers and, hopefully, increased the standing of our library in the minds of researchers at our institution.

Using Web of Science to Assess OA Publishing Trends

Jess Newman, Randall Watts, and Hilary Jasmin
University of Tennessee Health Science Center Library

This research examines scholarly communication trends among disciplinary faculty authors at UTHSC to determine the prevalence of Open Access (OA) publishing, with the goal of formulating a strategy for the library to reallocate services and funds to meet the changing needs of its researchers. Using filters built into Clarivate Analytics’ Web of Science database, the authors isolated OA articles written by UTHSC-affiliated faculty. OA and non-OA publications were compared to determine what impact, if any, OA had on citations. Further, results were filtered to identify the journals in which faculty authors were publishing.