User Interfaces is continually working on improving resource discovery interfaces, and of course an integral part of improvement is evaluation. One major element we want to evaluate is our changes’ impact on patrons’ library use. Near-term plans for discovery hinge on Summon, so we’re especially interested in how Summon and our Find Articles search have affected e-resource use. But this is really tough to characterize and quantify. For one thing, the difficulty of getting meaningful usage stats for library electronic resources is notorious. Our e-resources are spread out among many different databases from many different vendors, and each vendor manages its own interfaces and its own content and collects its own stats in its own way. We have meta-interfaces that are supposed to help funnel our users to the correct resources in the correct databases, and these systems collect stats as well. Although we can compile stats from each of these sources, the result is far from unified: stats from different sources measure different things in different ways.

Summon landed right smack in the middle of an already complex situation when we launched the beta in 2012. According to Karen Harker, the library’s Collection Assessment Librarian (and e-resource stats guru), Summon has played havoc with the stats we get from database vendors. Some vendors have shown a steep increase in usage while others have shown a steep decrease, and it’s impossible to tell from the data what traffic comes from Summon and what doesn’t.

However, one set of statistics that has remained pretty consistent over the years is what we get from Serials Solutions for the “360” services we have through them: 360 Core (our e-resource knowledgebase), 360 Link (our link resolver), and our e-journal portal. These include “click-through” stats, which measure users’ requests for resources’ full text.

What’s convenient for our purposes is that:

  1. Summon uses the link resolver to link out to many (but not all) full-text resources, so the 360 stats include some Summon usage.
  2. Our 360 services predate Summon.
  3. Implementing Summon did not change anything about our 360 services that would affect click-through stats, apart from whatever effect Summon itself has on full-text downloads.

So, if nothing else, the 360 click-through stats seem to provide a good way to compare pre-Summon e-resource usage to post-Summon e-resource usage. Although they can’t give us the whole picture, they can help us determine whether or not Summon is having an impact and maybe partially characterize that impact.

Click-through Stats, 2006-2013

In truth, I’ve been keeping an eye on these stats for a while, as they are quick and easy to obtain. Last spring I put together a visualization comparing the years 2006 through the present, and I’ve been updating the graph with new data on a regular basis. Last week, in preparation for a Liaisons’ meeting on the topic, I updated the graph with complete data for Fall 2013. Now that our Find Articles service has been live for over a year, this seems like a good time to share what we’re seeing.

The visualization I created is located here. Note that I built this using the D3.js JavaScript library, so it works best in Chrome and Firefox.
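For those curious about the mechanics, the chart is essentially one line per year drawn over twelve monthly click-through totals. The snippet below is not the actual code behind the visualization, just a minimal sketch of how that kind of chart can be drawn with D3 (it uses the current D3 API; the data shape, function name, and dimensions are illustrative assumptions).

    // Minimal sketch of a year-over-year comparison chart in D3.
    // series: an array like [{ year: 2011, counts: [...] }, ...], where counts
    // holds the twelve monthly click-through totals for that year, January first.
    function drawComparison(series, width = 800, height = 400) {
      const months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                      "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];

      // One x position per month, shared by every year's line.
      const x = d3.scalePoint().domain(months).range([50, width - 10]);

      // y runs from zero up to the largest monthly total in any year.
      const y = d3.scaleLinear()
        .domain([0, d3.max(series, s => d3.max(s.counts))]).nice()
        .range([height - 30, 10]);

      const line = d3.line()
        .x((d, i) => x(months[i]))
        .y(d => y(d));

      const color = d3.scaleOrdinal(d3.schemeCategory10)
        .domain(series.map(s => s.year));

      const svg = d3.select("body").append("svg")
        .attr("width", width)
        .attr("height", height);

      // One path per year, colored by year.
      svg.selectAll("path.year")
        .data(series)
        .join("path")
        .attr("class", "year")
        .attr("fill", "none")
        .attr("stroke", s => color(s.year))
        .attr("stroke-width", 2)
        .attr("d", s => line(s.counts));

      svg.append("g")
        .attr("transform", `translate(0,${height - 30})`)
        .call(d3.axisBottom(x));
      svg.append("g")
        .attr("transform", "translate(50,0)")
        .call(d3.axisLeft(y));
    }

Because each year is just one more entry in the data array, adding a new semester’s numbers to the graph is mostly a matter of updating that entry and redrawing.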

Here’s a screenshot showing the relevant data.

We implemented Summon as a “beta” at the very end of January 2012, so the post-Summon lines are the red and green ones, and the pre-Summon ones are the brown, blue, and light grey ones. Here are the features I want to point out.

  • Pre-Summon lines are grouped together at the bottom and are startlingly consistent in terms of data values and line shape.
  • In the first month of our Summon beta, click-throughs were almost double the previous February peak from 2011. The months that follow consistently show much higher usage than the same months in previous years (except July/August 2013, which I’ll discuss in a minute).
  • September 2012 is when we launched Summon as “live” in conjunction with the launch of our new website, and Spring 2013 (live) shows a nice increase compared to Spring 2012 (beta).
  • Fall 2012 and Fall 2013 both show “live” usage and are pretty consistent.
  • Summers are still low, which is to be expected. But most striking is Summer 2013. In 2013, the gap between summer and spring and between summer and fall looks proportionally much greater than the summer vs. spring/fall gap during pre-Summon years.

So based on this data, what conclusions might we draw about Summon and the impact that Summon has had on e-resource use?

First, I think we can safely say that the leap in click-throughs from 2011 to 2012 and 2013 was caused directly by Summon. The data corresponds perfectly with our Summon implementation timeline, and nothing else happened in 2012 and 2013 that would explain the change. We always have fluctuations in enrollment and database subscriptions, and, despite these, the 2006-2011 numbers are relatively consistent.

Second, the magnitude of the leap in click-throughs after Summon implementation and the levels being sustained since then suggest that Summon is well-used and that people continue to find it at least somewhat useful. If people were not finding what they needed through Summon, I’d expect the click-through rates to drop off. (Of course, the question is: well-used and useful compared to what?)

Third, there’s the matter of Summer 2013. I checked the figures and found that the average difference between summer and spring/fall click-throughs in 2013 is -61.3%. By comparison, the average summer vs. spring/fall differences for 2010 and 2011 are only -39.1% and -35.1%, respectively. I also checked enrollment figures and found a comparatively larger dip in enrollment between summer and spring/fall in 2013 than in previous years, but it is still not large enough to explain the large difference in click-throughs by itself. My guess? I think this data may show what we have heard anecdotally about Summon: that students use it, while faculty (overall) tend to stick with their tried-and-true methods of research. If Summon’s impact on click-throughs is disproportionately weighted toward times of the year when students are around, then it seems reasonable to assume students are using it more than faculty.
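As an aside, here is roughly how that kind of summer vs. spring/fall comparison can be calculated from a year’s monthly click-through totals. The month groupings (January–May, June–August, September–December) and the simple averaging below are just one reasonable choice, not necessarily the exact calculation behind the percentages above.

    // Rough sketch: average percent difference between summer click-throughs and
    // spring/fall click-throughs for one year. The month groupings here are
    // assumptions; different groupings would shift the percentages somewhat.
    function summerDip(monthlyCounts) {
      // monthlyCounts: twelve totals, January (index 0) through December (index 11).
      const mean = arr => arr.reduce((sum, n) => sum + n, 0) / arr.length;

      const spring = mean(monthlyCounts.slice(0, 5));   // Jan-May
      const summer = mean(monthlyCounts.slice(5, 8));   // Jun-Aug
      const fall   = mean(monthlyCounts.slice(8, 12));  // Sep-Dec

      // Percent difference of summer relative to spring and to fall, averaged.
      const vsSpring = (summer - spring) / spring * 100;
      const vsFall   = (summer - fall) / fall * 100;
      return (vsSpring + vsFall) / 2;
    }

Running that over each year’s twelve monthly totals gives one “summer dip” figure per year, which makes the 2013 dip easy to compare against the pre-Summon years.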

What about Summer 2012? Click-throughs were a bit higher than in 2013. Unfortunately, 2012 is a little difficult to generalize about, since we were in beta in spring and summer and then went live in the fall. It’s possible that a lot of faculty members gave Summon a test run during the summer and decided it was not too useful for them, boosting its stats in 2012 and leading to a drop-off in 2013. It’s tempting to think that enrollment numbers play a role (15743 in 2012 and 13866 in 2013), but if we look at Summer 2010 and Summer 2011, we had a big drop in enrollment (17259 in 2010 and 15909 in 2011) while the overall number of click-throughs was higher in 2011. The forces at work here may simply be more complex than we can determine from the numbers we have available. It will be interesting to see what happens this summer.
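For what it’s worth, the relative size of those enrollment drops is easy to put in percentage terms; this little snippet just does the arithmetic on the figures quoted above.

    // Year-over-year change in the enrollment figures quoted above.
    const pctChange = (from, to) => ((to - from) / from * 100).toFixed(1) + "%";

    console.log(pctChange(17259, 15909)); // 2010 to 2011: "-7.8%"
    console.log(pctChange(15743, 13866)); // 2012 to 2013: "-11.9%"

So the enrollment drop into 2013 was proportionally larger than the drop into 2011, but as the 2010/2011 comparison shows, click-throughs don’t simply track enrollment.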

To summarize, here’s what I think we can say based on these statistics.

  • People are using Summon to get access to the library’s full-text e-resources.
  • With Summon, we’ve seen a big increase in requests being passed through the link resolver.
  • People continue to find Summon useful and continue to use Summon to access full-text e-resources.
  • We think that, overall, students are using Summon more than faculty members and that most faculty members are sticking with their tried-and-true methods of research.

And here are a few things that we most assuredly can’t say anything about based solely on these statistics.

  • Whether, or to what extent, Summon use is cannibalizing direct database usage.
  • Whether Summon is just shifting usage of e-resources or actually increasing use.
  • Summon’s effect on use of A&I (abstracting and indexing) resources.
  • Whether Summon is causing a decline in use of resources not indexed in Summon.
  • Whether people are finding better, more relevant resources through Summon than through other routes (such as direct database searches).
  • What impact our Summon setup and the defaults we use for our Find Online Articles search have on e-resource use.
