Zombse

The Zombie Stack Exchanges That Just Won't Die

Metrics for value of online resources

Our library (science-based government department, but we write policy too) is increasingly focused on providing online access to resources. This is great because it facilitates access to information for the department's employees, but it invalidates many traditional metrics for library use (circulation, visits, etc). We're finding that the number of searches isn't the right number for conveying ROI to senior management. What metrics are you using? Is there a simple answer here? I can use Scopus to measure citations - have you used bibliometrics successfully in this domain? That would show the impact of our org's published research, but not necessarily which library resources contributed to that research. I'm trying to get our "story" straight in an organizational context of budget cuts.

Emily Gusba

Comments

Answer by David Rothman

It really depends what you're attempting to measure.

Your subject line indicates interest in metrics for online resources, then you mention Scopus, which provides some metrics for journal articles... which aren't 'online resources'. They are items which can be accessed online, but that's not necessarily the same thing.

If you're trying to measure the impact/importance/significance of a particular article, we're talking about Bibliometrics. http://en.wikipedia.org/wiki/Bibliometrics

See also:

Hope that helps!

Comments

Answer by Ed Summers

Web analytics software like Google Analytics offers some really exciting possibilities similar to what circulation statistics have traditionally offered libraries. For example, you can see which Web resources are viewed more than others, how long people spend on various pages, when they leave your website, etc., which can help guide digitization efforts. If you have an institutional repository, spending a bit of time making your resources crawlable and tracking visits can also pay dividends in user-centered design. For an example of this I recommend Chris Prom's excellent Using Web Analytics to Improve Online Access to Archival Resources, which appeared in American Archivist last year.

Comments

Answer by phette23

I interpreted this question very differently from David so I thought I would share a couple more resources that might help. While David's resources are all related to journal prominence, one can also evaluate usage statistics at a particular library.

The most pertinent project here is COUNTER, an attempt at getting standardized & thus comparable statistics from database vendors. One can retrieve the number of searches, sessions, downloads, & other metrics at various levels of granularity: by database, by journal, by ebook. I tend to rely on Database Report 1 which gives "Total Searches, Result Clicks and Record Views by Month and Database" but other libraries might be better suited to Journal Report 5 ("Number of Successful Full-Text Article Requests by Year-of-Publication (YOP) and Journal") or Book Report 1 ("Number of Successful Title Requests by Month and Title") depending on whether they focus on e-journals or e-books. See the COUNTER Code of Practice [.pdf] for details & sample reports.
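To make the COUNTER numbers comparable across months and databases, you generally have to aggregate the vendor exports yourself. Here is a minimal sketch of that step; the CSV layout below is a simplified, hypothetical stand-in for a Database Report 1 (real COUNTER files have additional header rows and fixed column names defined by the Code of Practice, so adapt the parsing to your vendor's export):

```python
import csv
import io

# Simplified, hypothetical CSV resembling a COUNTER Database Report 1.
# Real reports follow the layout in the COUNTER Code of Practice.
sample = """Database,Metric,Jan,Feb,Mar
LexisNexis,Searches,700,650,800
America's News,Searches,2100,1900,2300
"""

def monthly_totals(csv_text, months=("Jan", "Feb", "Mar")):
    """Sum each database's monthly counts into a year-to-date total."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["Database"]] = sum(int(row[m]) for m in months)
    return totals

print(monthly_totals(sample))
# {'LexisNexis': 2150, "America's News": 6300}
```

Once the totals are in one structure, the database-vs-database and year-over-year comparisons discussed elsewhere in this thread become trivial to produce.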

There are also ICOLC statistics, a similar project, but I've only seen a few resources report those.

I'm not sure if that's what you were asking. If you want to know how the library enables faculty research output in general and not by resource, then the best analysis would probably be to review the citation lists of faculty publications to see how many come from library resources. You could then make powerful statements such as "these would cost 15 trillion dollars total at $15/article if the library did not provide access."
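The cost-avoidance arithmetic behind that kind of statement is simple. The sketch below uses the $15/article pay-per-view rate from the example above; the citation count is a made-up illustration, not real data:

```python
# Hypothetical cost-avoidance estimate: price the citations that came
# from library-licensed resources at a per-article pay-per-view rate.
# $15/article is the figure used in the answer; the 40,000 citation
# count is an invented example.
PER_ARTICLE_COST = 15.00

def cost_avoided(citations_from_library, per_article=PER_ARTICLE_COST):
    """Dollars the institution would have paid without library access."""
    return citations_from_library * per_article

print(f"${cost_avoided(40_000):,.2f}")  # $600,000.00
```

The per-article rate is the sensitive assumption here, so it's worth citing the actual pay-per-view prices of your major vendors when presenting this to management.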

Comments

Answer by trevormunoz

I think it's important to take a broader view of this question than just measuring journal impact, but there is ambiguity in the OP given the examples cited.

Another relevant resource is a recent study that Simon Tanner of King's Digital Consultancy Service completed for JISC, a major funder in the UK, on the "values, benefits and impacts of digitised resources." The study is available here: http://www.kdcs.kcl.ac.uk/innovation/inspiring.html This report is explicitly aimed at policy-makers.

A few findings from the report, to add to the list above:

Comments

Answer by Stephanie Willen Brown

I look at usage data for various databases, and I find that the raw numbers (e.g., 7,000 searches in LexisNexis) are not particularly helpful. However, comparing the number of searches in one database vs. another is helpful (e.g., 7,000 searches in LexisNexis vs. 25,000 searches in America's News tells me something about which database is used more by my patrons). Also helpful is comparing the number of searches over time (e.g., 7,000 searches in LexisNexis in 2010 vs. 5,000 searches in LexisNexis in 2011 means that something happened between 2010 & 2011).

Another useful comparison, if you can get it from COUNTER or the vendor, is the number of sessions vs searches vs downloads. If there are 2,000 sessions and 500 searches, it could mean that users aren't able to navigate the database interface (if there are way fewer searches than sessions); my sense is that there should be at least as many searches as sessions on average. If there are 500 searches and only 10 downloads, that means users aren't finding what they want in the search results -- or that they aren't able to figure out how to download the articles they want to read.
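The ratio checks described above can be sketched as a small diagnostic function. The thresholds here are illustrative assumptions drawn from the reasoning in this answer (searches should at least match sessions; a very low download-per-search rate suggests a discovery problem), not values defined by COUNTER or any vendor:

```python
# Sketch of the session/search/download heuristics described above.
# Thresholds are illustrative assumptions, not standardized values.
def usage_flags(sessions, searches, downloads):
    """Return human-readable warnings about possible usability problems."""
    flags = []
    if searches < sessions:
        flags.append("fewer searches than sessions: possible interface problem")
    if searches and downloads / searches < 0.1:
        flags.append("low download rate: results may not match user needs")
    return flags

# The scenario from the answer: 2,000 sessions, 500 searches, 10 downloads.
print(usage_flags(sessions=2000, searches=500, downloads=10))
```

Run against the figures in the answer, this flags both problems; a healthy database (say, 100 sessions, 200 searches, 50 downloads) returns no flags.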

Hope that helps.

Comments