Evidence, Impact, Metrics

Gathering evidence, understanding impact and using metrics

  • Status of this blog

    This blog was used to support the Evidence, Impact, Metrics work which took place in 2010-2011. After the completion of this work, the blog was closed and no further posts will be made.

Running Your Own Surveys

Background

UKOLN’s Evidence, Impact, Metrics activity [1] ran from August 2010 to July 2011. The aim of this work was to explore ways in which systematic quantitative evidence-gathering approaches could be used to help identify the impact of online services.

Three one-day workshops were held as part of this work, together with a number of additional presentations at a variety of events [2]. A series of evidence-based surveys was published on the UK Web Focus blog [3]. The surveys were accompanied by commentary on the tools and methodologies used to gather the numerical evidence. In addition, suggested interpretations of the findings, and their implications, were published in the posts. Feedback was invited, including critiques of the survey methodologies and of the interpretations of the findings.

Reflections

In some quarters there were suspicions about the value of quantitative surveys, and concerns that such approaches could be counter-productive, leading to the development of inappropriate league tables and to people ‘gaming the system’ by exploiting limitations in the tools and techniques used to produce metrics.

Whilst such concerns have some validity, there is also an awareness of the need to gather quantitative data related to the provision or use of services. Such data can be used to inform the development of services, and development plans can usefully be informed by comparisons with one’s peers. There is also a growing awareness of the risk that, in an open networked environment, third parties could make use of data about the use of services and interpret the findings in ways which fail to appreciate the underlying complexities. An example can be seen in a blog post entitled “University Web Sites Cost Money!” [4], which described how the Daily Telegraph newspaper interpreted such data in ways which reflected its political agenda.

The significant interest in the evidence-based blog posts suggests that the need to engage in such activities is now more widely appreciated. However, rather than reducing the findings to simplistic league tables, the posts demonstrated a number of ways in which such surveys can benefit service providers:

  • Trends over time: Organisations, services or individuals can find it useful to monitor trends in the use of services over time. Examples of such surveys were described in the post on “Evidence of Personal Usage Of Social Web Services” [5] (which suggested that the take-up of a new service relies on reaching a critical mass of users, on identifying personal use cases which demonstrate the value of the service, and on the deployment of more effective tools for using the service) and in two posts, “DCMI and JISCMail: Profiling Trends of Use of Mailing Lists” [6] and “The Decline in JISCMail Use Across the Web Management Community” [7], which provided evidence of the decline in use of well-established communication tools in certain sectors.
  • Comparisons with one’s peers: A series of blog posts provided evidence of the use of services such as Facebook, Twitter, YouTube and iTunes across the twenty Russell Group Universities.
  • Identification of differing patterns: The ‘sense-making’ of the data which has been collected can help in understanding differing usage patterns which may be emerging.
  • Providing benchmarks: Snapshots of usage may be useful in identifying future trends. For example, the post on the “HTML and RDFa Analysis of Welsh University Home Pages” [8] showed that RDFa was not, at the time of the survey, being deployed on institutional home pages within this community (a minimal sketch of the kind of check involved is given after this list).
  • Confirming expectations or challenging orthodoxies? WH Smith at one stage made a policy decision to stop selling LPs in its stores, based on analysis of purchasing patterns and predictions of future trends; at the time this was a noteworthy decision which was featured in the national press. Surveys may be useful in confirming expectations (such as the surveys which confirmed the decline in use of mailing lists) or in challenging conventional beliefs. A survey on “How People Find This Blog, Five Years On” [9] provided evidence which challenged orthodox thinking on the primacy of Google for finding content on Web sites, and questioned the importance of RSS for providing access to blog posts, by highlighting the importance of Twitter.
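
As an illustration of the kind of check the RDFa survey involves, the following Python sketch tests whether a home page contains RDFa attributes. It is not the tooling actually used in the survey (which relied on freely available web-based services), and the URL is a placeholder:

    import re
    import requests

    # RDFa is signalled by attributes such as vocab, typeof, property and
    # prefix on HTML elements. Note that 'property' is also used by Open
    # Graph <meta> tags, so a match is a prompt for inspection, not proof.
    RDFA_ATTRIBUTES = re.compile(r'\b(vocab|typeof|property|prefix)\s*=', re.IGNORECASE)

    def uses_rdfa(url):
        """Fetch a page and report whether any RDFa-style attribute appears."""
        html = requests.get(url, timeout=30).text
        return bool(RDFA_ATTRIBUTES.search(html))

    if __name__ == "__main__":
        # Placeholder address; the survey covered Welsh university home pages.
        print(uses_rdfa("http://www.example.ac.uk/"))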

Implementing Your Own Metrics-Based Surveys

The experiences gained in conducting a range of surveys and interpreting the findings may be useful for others who wish to carry out their own surveys, perhaps to benchmark organisational, project-based or individual developments, or for funders and other third parties who wish to monitor developments, identify best practices which can inform the sector, or spot services which may be in decline.

  • Identify purposes: When planning a survey you should identify its purpose. Possible purposes include understanding how one is doing in comparison with one’s peers, helping to identify return on investment, or gaining an understanding of a new area.
  • Identification of a community to compare: If you wish to benchmark your service against others you will need to identify the other services to compare with. UKOLN’s surveys have included comparisons across Russell Group and 1994 Group Universities, regional groups (Scotland and Wales) and participants at particular events.
  • Identification of tools and methodologies: You will need to identify the tools and methodologies used to gather the evidence. The UKOLN surveys have typically made use of freely available tools, often web-based, which can analyse open data (a sketch of a simple evidence-gathering script is given after this list).
  • Understanding limitations of tools and methodologies: You will need to understand the limitations of the tools and methodologies used. It should be noted that in many cases quantitative data provides only proxy indicators of value. In order to avoid accusations of publishing flawed summaries, you should be willing to document the limitations of the approaches used.
  • Documentation of survey processes (paradata): There is also a need to document information about how the data was gathered [10]. This might include, for example, the dates of data collection. If you are comparing Twitter usage across events you should ensure that you use equivalent date ranges (see the paradata sketch after this list).
  • Support openness: Commercial organisations may seek to provide surveys as an income-generating activity. Within the higher education sector, however, there may be expectations regarding the openness of data, with the sharing of data helping to improve cost-effectiveness across the sector. Unless your organisation has chosen to profit from its data collection and analysis services, it would be beneficial to the sector if data were made freely available for reuse by others. An example can be seen in Katrina James’s post on “Evaluating networks: Twitter activity of 1994 Group universities” [11].
  • Openness of additional data generated: You should try to ensure that new data you create is made available for others to reuse (which might include validating or challenging your methodology). As an example, the post on JISCMail usage statistics for DCMI lists [12], which required manual collection of data, stored the data in a publicly readable Google Spreadsheet [13] (see the data publication sketch after this list).
  • Interpretation: Once you have gathered data and published the analysis you may also wish to provide an interpretation of the findings and discuss their implications. When interpreting findings based on data associated with the use of innovative services you should be careful to avoid misinterpretation; the temptation may be particularly strong when the evidence suggests that one’s own services are highly rated.
  • Encourage feedback: In order to minimise the risks of misinterpretation of the findings you should encourage feedback and discussion.
  • Publish corrections: If your findings or interpretations are shown to be incorrect you should publish a correction.
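
The following sketches illustrate three of the points above. They are illustrative Python fragments using placeholder names, not the tools actually used in the UKOLN surveys. First, a minimal evidence-gathering script in the spirit of the social web surveys, which fetches each institutional home page and records whether it links to Facebook or Twitter; a real survey would cover the full peer group and handle errors, redirects and rate limits:

    import requests

    # Placeholder peer group; a real survey would list, say, all twenty
    # Russell Group institutional home pages.
    HOME_PAGES = {
        "Example University": "http://www.example.ac.uk/",
    }
    SERVICES = ("facebook.com", "twitter.com")

    def survey(pages):
        """Return, per institution, whether the home page links to each service."""
        results = {}
        for name, url in pages.items():
            html = requests.get(url, timeout=30).text.lower()
            # A link is only a proxy indicator of institutional use.
            results[name] = {service: (service in html) for service in SERVICES}
        return results

    if __name__ == "__main__":
        for name, flags in survey(HOME_PAGES).items():
            print(name, flags)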
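
Second, a sketch of recording paradata alongside the results, so that others can judge, and reproduce, the data collection. The field names are illustrative rather than any formal standard:

    import json
    from datetime import date

    paradata = {
        "survey": "Social web links on institutional home pages",
        "collected_on": date.today().isoformat(),  # when the data was gathered
        "date_range": "2011-01-01 to 2011-06-30",  # use equivalent ranges when comparing
        "tool": "survey script (see previous sketch)",
        "known_limitations": "Link presence is only a proxy indicator of use.",
    }

    # Publish this file alongside the survey results.
    with open("survey-paradata.json", "w") as f:
        json.dump(paradata, f, indent=2)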
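
Finally, a sketch of publishing the collected data in a reusable form. A plain CSV file published alongside the post, or uploaded to a publicly readable spreadsheet as in the DCMI example, lets others validate or challenge the methodology. The column names and row are illustrative:

    import csv

    rows = [
        # institution, links_to_facebook, links_to_twitter
        ("Example University", True, False),
    ]

    with open("survey-results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["institution", "links_to_facebook", "links_to_twitter"])
        writer.writerows(rows)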

References

  1. Evidence, Impact, Metrics: About, UKOLN, <http://blogs.ukoln.ac.uk/evidence-impact-metrics/about/>
  2. Evidence, Impact, Metrics: Events, UKOLN, <http://blogs.ukoln.ac.uk/evidence-impact-metrics/events/>
  3. Evidence category, UK Web Focus blog, <http://ukwebfocus.wordpress.com/category/evidence/>
  4. University Web Sites Cost Money!, UK Web Focus blog, 16 November 2010, <http://ukwebfocus.wordpress.com/2010/11/16/university-web-sites-cost-money/>
  5. Evidence of Personal Usage Of Social Web Services, UK Web Focus blog, 12 January 2011, <http://ukwebfocus.wordpress.com/2011/01/12/evidence-of-personal-usage-of-social-web-services/>
  6. DCMI and JISCMail: Profiling Trends of Use of Mailing Lists, UK Web Focus blog, 14 December 2010, <http://ukwebfocus.wordpress.com/2010/12/14/profiling-trends-of-use-of-mailing-lists/>
  7. The Decline in JISCMail Use Across the Web Management Community, UK Web Focus blog, 4 June 2010, <http://ukwebfocus.wordpress.com/2010/06/04/the-decline-in-jiscmail-use-across-the-web-management-community/>
  8. HTML and RDFa Analysis of Welsh University Home Pages, UK Web Focus blog, 17 November 2010, <http://ukwebfocus.wordpress.com/2010/11/17/html-and-rdfa-analysis-of-welsh-university-home-pages/>
  9. How People Find This Blog, Five Years On, UK Web Focus blog, 1 November 2011, <http://ukwebfocus.wordpress.com/2011/11/01/how-people-find-this-blog-five-years-on/>
  10. Paradata for Online Surveys, UK Web Focus blog, 29 November 2011, <http://ukwebfocus.wordpress.com/2011/11/29/paradata-for-online-surveys/>
  11. Evaluating networks: Twitter activity of 1994 Group universities, Katrina James, blog, 28 September 2011, <http://katrinajames.co.uk/press-pr/1994-group-twitter/>
  12. DCMI and JISCMail: Profiling Trends of Use of Mailing Lists, UK Web Focus blog, 14 December 2010, <http://ukwebfocus.wordpress.com/2010/12/14/profiling-trends-of-use-of-mailing-lists/>
  13. DCMI Usage of JISCMail Mailing Lists, Google Spreadsheet, <https://docs.google.com/spreadsheet/ccc?key=0AqyjJ9Eviy8idHMzR3dYNFR6VnJOZXNHNXJLT3hWa3c&hl=en#gid=0>