Evidence, Impact, Metrics

Gathering evidence, understanding impact and using metrics

  • Status of this blog

    This blog was used to support the Evidence, Impact, Metrics work which took place in 2010-2011. After the completion of this work, the blog was closed and no further posts will be made.

A Framework For Metrics

Summary of UKOLN’s Evidence, Impact, Metrics Work

UKOLN’s Evidence, Impact, Metrics activity developed a methodology for gathering quantitative evidence on the use of online services, which can help in understanding the impact of those services and inform their development.

Initially there was some scepticism about the relevance of quantitative evidence gathering work. There were legitimate concerns that metrics can provide only a partial understanding of services, and that metrics can be ‘gamed’ if undue emphasis is placed on their importance. However, as awareness grew of the need to gather evidence in order to justify funding, the value of such work became better appreciated.

Framework For Metrics for JISC Programmes

The following framework for metrics is proposed.

Context:
A ‘one-size-fits-all’ approach to metrics is unlikely to be of value, so projects should provide a summary of the context of their work.
Purpose of the metrics:
The purposes of gathering metrics should be documented. Note that gathering metrics in order to gain an understanding of how they might be used can be a legitimate purpose, but this needs to be documented.
Tools:
Where known, the tools to be used in gathering, analysing, visualising and interpreting the metrics should be documented.
Interpretation:
A summary of how the metrics may be interpreted.
Limitations:
General comments, including a summary of known limitations.
Risk Assessment:
Risks associated with use of metrics, together with risks of not gathering metrics.
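As an illustration, the framework’s elements could be captured in a lightweight machine-readable template with a simple completeness check. The field names below are our own illustrative labels, not part of the framework as published:

```python
# Hypothetical sketch: a metrics plan template based on the framework's
# six elements, with a check for undocumented elements. Field names are
# illustrative labels, not defined by the framework itself.

FRAMEWORK_FIELDS = [
    "context",          # summary of the project's context
    "purpose",          # why the metrics are being gathered
    "tools",            # tools for gathering/analysing/visualising, where known
    "interpretation",   # how the metrics may be interpreted
    "limitations",      # known limitations of the metrics
    "risk_assessment",  # risks of using, and of not gathering, metrics
]

def missing_fields(plan: dict) -> list:
    """Return the framework elements a metrics plan has not yet documented."""
    return [f for f in FRAMEWORK_FIELDS if not plan.get(f)]

example_plan = {
    "context": "Project blog disseminating outputs to stakeholders",
    "purpose": "Understand usage patterns and identify good practice",
    "tools": "Technorati, EBuzzing and Google Analytics",
    "interpretation": "Compare engagement across the programme, not rankings",
}

print(missing_fields(example_plan))  # elements still to be documented
```

A check like this could be run when reviewing project plans, flagging which framework elements a project has yet to address.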

Using the Framework

How can this framework be used and what benefits can it provide to projects?

Using the Framework:
Projects should embed the framework in their initial planning. As well as providing detailed documentation of project work plans, the framework can be used to identify success criteria for various aspects of the work.
Projects may be able to use the resulting objective evidence in subsequent proposals. Evidence of a lack of success may be useful in modifying work plans.

Two examples of the use of the framework in providing evidence of the effectiveness of communication channels are given below.

Case Study 1: Project Blog

A project blog has a role to play in encouraging discussion and collaboration across its key stakeholders and disseminating its outputs to a wider audience.
Purpose of blog metrics:
Metrics for project blogs are intended to provide an understanding of usage patterns, and especially to identify examples of good practice which could be adopted more widely.
Tools:
The blog is registered with the Technorati and EBuzzing services using a programme tag. These services give rankings based on the number of links to the blog, and use of the tag will enable good practices across the programme to be easily identified. In addition to these services, Google Analytics will provide usage statistics.
Regular summaries of the numbers of posts and comments will be provided to programme managers, who will be able to maintain an oversight of how blogs are being used across the programme.
Interpretation and limitations:
Anecdotal evidence suggests that project blogs may find it difficult to gain a significant audience. Metrics can be useful in helping to identify examples which are successful in reaching out to and engaging with their audiences. However, since the benefits of project blogging are likely to lie in implementing open practices and allowing key stakeholders, such as project managers, to more easily read reports from across all projects, it would be inappropriate to use metrics in league tables.
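The regular summaries of post numbers mentioned above could be generated automatically. As a minimal sketch, assuming the blog publishes a standard RSS 2.0 feed, the posts in a feed document can be counted with the Python standard library (the function and sample feed here are illustrative, not from the original post):

```python
# Illustrative sketch: counting the posts in an RSS 2.0 feed document,
# as one lightweight way to summarise blog activity for programme managers.
import xml.etree.ElementTree as ET

def count_posts(rss_xml: str) -> int:
    """Count <item> elements (i.e. posts) in an RSS 2.0 feed document."""
    root = ET.fromstring(rss_xml)
    return len(root.findall("./channel/item"))

# Small sample feed for demonstration; in practice the XML would be
# fetched from the blog's feed URL.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Project Blog</title>
  <item><title>Post one</title></item>
  <item><title>Post two</title></item>
</channel></rss>"""

print(count_posts(SAMPLE_FEED))  # 2
```

Counting comments would work the same way against the blog’s comments feed, where one is available.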

Case Study 2: Project Slides

A project hosts its slides on the Slideshare repository in order to allow the slides to be embedded within Web sites and viewed on mobile devices.
Purpose of Slideshare metrics:
As described in [1], Slideshare metrics can help to identify successful outreach strategies, including reuse of slides on other blogs. By using programme tags to aggregate slides, use across a programme can be identified [2].
Tools:
Slideshare is the most popular slide-sharing service. Note that richer statistics require a subscription to the service.
Limitations:
The usage metrics do not indicate whether a complete slide set was viewed.
Risk Assessment:
The risks of hosting slides locally include difficulties in gathering metrics, potentially limited access to the resources and the additional effort of developing alternative approaches to identifying the value of the resources.


  1. What’s the Value of Using Slideshare?, UK Web Focus blog, 23 December 2010, <http://ukwebfocus.wordpress.com/2010/12/23/whats-the-value-of-using-slideshare/>
  2. Evidence of Slideshare’s Impact, UK Web Focus blog, 31 May 2011, <http://ukwebfocus.wordpress.com/2011/05/31/evidence-of-slideshares-impact/>
