E-Resources Usage

In talking with our newly hired E-Resources Librarian, we discovered we were both excited about finding out about how e-resources were being used in the library.

The Problem

The only usage statistics we had came from our vendors. We had no other way of knowing whether the e-resources we subscribed to were being used, or how they were being used. Were we putting our budget, outreach, and instruction to good use? We chose the Music Library as our initial case study, treating it as a pilot project we could later deploy in other areas of the library.

The Goal

Find out how music faculty and students (both undergraduate and graduate) were using library e-resources and, where possible, identify user behaviors.

The Team

The E-Resources Librarian and I collaborated on researching, completing the IRB training and forms, and writing the survey. She did most of the data crunching when the results came in, and we both identified trends.

The Process

We followed the general Design Thinking process of Define, Ideate, Prototype, Test, and Share.


We started by talking to colleagues who had recently conducted surveys and by reading up on best practices and example surveys in librarianship journals. This allowed us to create a clear plan.

We came up with four categories of information we wanted:

Four Categories of Information

This included finding out if students were able to access the physical Music Library during the course of their days on campus. We wanted to know how they were spending their time in the Music Building. The rest of the questions centered on the library and digital resources and services.

Then we created personas. These centered on faculty, graduate students, and undergraduate students, though we also talked about the variety of students we serve and the different ways we thought they used the library (e.g., non-traditional students, academic music majors, performance majors, education majors, etc.).


Over four weeks, we wrote and revised our survey. Music Library student workers as well as fellow librarians volunteered to user test it multiple times, which allowed us to refine our questions. This resulted in a well-crafted and focused survey.

Because we had done our research and interviewed others who had done similar projects, the survey writing went smoothly and our revisions were relatively minor. While we initially intended to do a single survey for all School of Music students and faculty, we ended up with a mixture of shared questions and some separate questions for faculty and students.


The survey was distributed twice: once in the fall 2014 semester and again in the spring 2015 semester. To encourage participation, users could enter a raffle for one of two $25 iTunes gift cards.

The Data

We combined the qualitative and quantitative survey responses with the usage statistics the E-Resources Librarian compiled from our vendors. This let us identify trends in both user behavior and usage data, and see where the two did not agree.
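As a rough illustration of this kind of comparison, the check below flags resources where self-reported survey use and vendor-reported activity disagree. All resource names, shares, and session counts here are made up for the sketch; they are not the project's actual data, and the thresholds are arbitrary placeholders.

```python
# Hypothetical sketch: compare the share of survey respondents who said
# they used each e-resource with vendor-reported session counts, and
# flag resources where the two sources disagree. Illustrative data only.

# Fraction of respondents (0-1) who reported using each resource.
survey_share = {
    "Oxford Music Online": 0.72,
    "Music Index": 0.55,
    "Resource C": 0.40,
}

# Vendor-reported session counts for the same period.
vendor_sessions = {
    "Oxford Music Online": 1850,
    "Music Index": 1210,
    "Resource C": 95,
}

def flag_disagreements(survey, vendor, min_share=0.3, max_sessions=200):
    """Return resources that many respondents claim to use but whose
    vendor statistics show little recorded activity."""
    flags = []
    for name, share in survey.items():
        if share >= min_share and vendor.get(name, 0) <= max_sessions:
            flags.append(name)
    return flags

print(flag_disagreements(survey_share, vendor_sessions))
```

A mismatch like the flagged "Resource C" could mean users access a resource through a path the vendor does not count, or that respondents over-report use; either way it marks a spot worth a closer look.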

What We Learned

We found that certain music library e-resources (particularly Oxford Music Online and Music Index) accounted for a significant portion of use from those vendors and that both students and faculty were more tech savvy than we had realized. This information was integral in maintaining funding for music e-resources (and knowing that we are subscribing to resources that were being used). I also changed my library instruction sessions and reference interactions based on what we learned about student library usage and their technology skills.

Graph of selected e-resources highlighting the use of Oxford Music Online. Graph created by Kelly Blanchat.

The collaborative experience was very rewarding. While we had hoped to repeat this for other library departments, and possibly repeat it for music, the E-Resources Librarian was offered a job she couldn't refuse and moved on to greener pastures.
