Bibliometrics service
The bibliometrics team provides publication analysis services at Institutional and School level. We help explore and understand the University's and Schools' citation, collaboration and topic profiles using tools such as Scopus/SciVal, Web of Science and Altmetric.
-
Bibliometrics is the application of statistical methods to publications in order to gain relational and evaluative insights into the scholarly ecosystem. The nature of these insights varies considerably, from discovering new emerging topics in a field, to identifying potential research partners, or measuring the impact of publications.
There has been growing interest in recent years in bibliometrics for measuring impact, and a wide range of free and subscription tools is available, providing different metrics and types of insight. It is important to remember that no single metric provides a rounded overview of research performance, and that citation analysis measures the impact of research, not its quality. Metrics should always be considered alongside appropriate qualitative measures.
-
It is important that the Institution and Schools understand how they are viewed through bibliometrics, as bibliometric indicators increasingly form an important part of institutional assessment, both internally (KPIs) and externally alongside other measures of research impact. Bibliometric indicators are used in the following league tables:
- Times Higher Education World University Ranking – Field-Weighted Citation Impact accounts for 30% of the overall ranking, and ranges from 15% to 35% in the subject rankings
- QS World University Ranking – Citations per Faculty accounts for 20% of the overall ranking
- Leiden Ranking – citations are used to measure scientific impact via several metrics
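As an illustration of the kind of indicator these rankings rely on, Field-Weighted Citation Impact compares a publication's citation count with the average citations received by similar publications (same field, publication year and document type). The sketch below is a simplified illustration of that idea, not Elsevier's exact calculation:

```python
# Simplified sketch of Field-Weighted Citation Impact (FWCI).
# Not Elsevier's exact method: SciVal derives the expected rate from
# all publications of the same field, year and type in Scopus.
def fwci(citations: int, expected_citations: float) -> float:
    """Ratio of actual citations to the expected (world average) rate."""
    if expected_citations <= 0:
        raise ValueError("expected citation rate must be positive")
    return citations / expected_citations

# A paper cited 12 times where similar papers average 8 citations
# scores 1.5, i.e. 50% above the world average; 1.0 is exactly average.
print(round(fwci(12, 8.0), 2))
```

A value above 1.0 therefore indicates citation performance above the world average for comparable publications, which is why FWCI is preferred over raw citation counts when comparing across disciplines.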
-
The bibliometrics team provides publication analysis services at the Institutional and School levels. We help explore and understand the University's and Schools' citation, collaboration and topic profiles. We use the most appropriate metrics to understand the visibility and impact of our research outputs and to reveal new insights.
Support to the Institution:
- Key performance indicators (KPIs)
- National and global citation performance and trends
- Institutional bibliometric profile analysis and benchmarking
Support to Schools:
- Schools’ key performance indicators (KPIs)
- Analysis of citations, collaborations and research areas
- Training for Heads of School, Directors of Research and Directors of Impact and Innovation on how to use the available tools to assess their School’s bibliometric profile
Consultations
Analyses and reports are not standardised, so an initial consultation is required to agree parameters, deliverables and timescales.
-
We recognise that the responsible use of metrics can play a valuable role in complementing and supporting the management and assessment of research. Therefore, the University’s Open Research Working Group, directed by the Research Committee and in consultation with the Academic Schools, developed a set of principles for the use of indicators in research assessment and management. As part of the implementation of these principles, the University has signed the San Francisco Declaration on Research Assessment (DORA).
The purpose of these principles is to guide the use of indicators for any form of research assessment or evaluation by central committees and services in support of strategic goals as well as by Schools/Departments and individual academics and managers.
The principles are:
- Expertise - Any use of indicators can only inform and not override expert judgement.
- Diversity - Any use of indicators must take differences between disciplines and career stages, or related to equality and diversity, into account.
- Data - Any use of indicators should be underpinned by data that is reliable, statistically valid and multi-faceted, and whose limitations are understood.
- Integrity - Any use of indicators must abide by research integrity standards and follow the University's Principles of Good Research Conduct.
- Transparency - Any use of indicators must be clearly understood by those being assessed, with the methodologies and data available where possible.
More information on the DORA declaration and the University’s principles can be found on the University’s responsible metrics webpage.
-
The University provides access to a wide range of tools to help you understand your scholarly impact.
- Scopus – The subscription abstract and citation database that provides data for the THE World University Rankings and the QS World University Rankings.
- SciVal – A benchmarking and analysis tool that is built on top of the Scopus database to provide bibliometric insights, particularly at the institution and country level.
- Web of Science – The subscription citation database that underpins the Leiden ranking.
- Altmetric – A provider of alternative metrics, showing the impact of scholarly publications online.
Other bibliometric tools, to which the University does not subscribe, are available to use and include Google Scholar, Dimensions, Publish or Perish, OpenAlex, and many others.
OpenAlex is particularly noteworthy for its contribution to open research infrastructure: all of its bibliometric data is available under an open CC0 licence. This makes it suitable for bibliometric investigations both large and small, and it is now being incorporated into the Leiden Ranking Open Edition. It is still a fairly new bibliometric source, however, and its author profiles still need work; OpenAlex welcomes feedback on data accuracy from authors.
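Because OpenAlex exposes its open data through a free public API, small-scale investigations need nothing more than an HTTP request. The sketch below builds a query URL for an institution's recent works; the institution ID shown is a hypothetical placeholder, and the filter names assume OpenAlex's current works filters:

```python
from urllib.parse import urlencode

# Sketch only: builds an OpenAlex works-endpoint URL filtered by
# institution and publication date. "I123456789" is a placeholder,
# not a real OpenAlex institution ID.
def openalex_works_url(institution_id: str, from_year: int) -> str:
    base = "https://api.openalex.org/works"
    params = {
        "filter": (
            f"institutions.id:{institution_id},"
            f"from_publication_date:{from_year}-01-01"
        ),
        "per-page": 25,
    }
    return f"{base}?{urlencode(params)}"

print(openalex_works_url("I123456789", 2020))
```

Fetching this URL (for example with `urllib.request`) returns JSON containing the matching works and their citation counts, which can then be aggregated for a simple institutional profile.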
Before using any of the available bibliometric tools, it is important to understand their strengths and limitations.