San Francisco Declaration on Research Assessment
School of Technology
Guidance on the Implementation of DORA
December 2025
Background
In July 2019 the University of Cambridge signed the San Francisco Declaration on Research Assessment. As a signatory, the University is committed to transparency, integrity, fairness, and consideration of a diversity of outputs in evaluating an individual’s research for such purposes as recruitment, probation, and promotion.
As a signatory of DORA, the University has committed itself to the following:
- To eliminate journal-level metrics (JLMs) in the research assessment of individuals.
- To be transparent about how research quality will (and won’t) be evaluated.
- To encompass a wide range of potential research impacts, reaching beyond a narrow focus on peer-reviewed publications and including contributions to open science, the translation and application of research, public engagement and policy impacts.
The University developed high-level guidance and asked Schools to further develop this into bespoke DORA implementation policies appropriate for their disciplinary needs.
This guidance, approved by the Council of the School, outlines recommendations for implementing each of the three key DORA actions across the School of Technology. In addition to the information on this page, Specific Considerations for Key Research Exercises are also available.
Elimination of journal-level metrics
Action 1: To eliminate journal-level metrics (JLMs) in research assessment of individuals.
Recommendations for implementation:
The Departments of the School of Technology will not use JLMs as a proxy for research quality in the formal evaluation of candidates’ research, for example in recruitment, or in promotion or probation assessments.
Steps that could be taken include:
- Ensuring that individuals being evaluated, and external reviewers, are aware that we do not accept JLMs as a proxy for research quality. JLMs should only be one part of a holistic assessment of an individual’s research quality.
- The use of JLMs in relation to candidates’ publications will be discouraged in documents prepared by applicants or assessors.
- Chairs of School committees for Academic Career Pathways (ACP) and other career development programmes will be asked to regulate informal use of journal titles as proxies for research quality in discussions about candidates.
- Terms such as “target journal” and “high impact journal” will not be used in recruitment, probation, and promotion processes, guidance documents, or advertisements; terms such as “high quality research”, “influential research” and “impactful research” are preferred.
The School recognises that journal metrics and titles have become embedded in perceptions of research quality and it will take some time to lessen the unconscious bias that equates journal-level metrics with the quality of an individual’s research output. Mitigation measures such as redacted publication lists (providing only the DOI for each paper) are not favoured by the School because of the extra workload associated with retrieving publications when many candidates are being assessed, and because a redacted form of citation is contrary to the general DORA recommendation that research evaluation should be informed by complete information on all relevant outputs. It is therefore imperative that assessment committees are resolute in ensuring that holistic evaluations of research quality are used.
Transparency around the evaluation of research quality
Action 2: To be transparent about how research quality will (and won’t) be evaluated.
Recommendations for implementation:
Research excellence or quality should ultimately be decided by competent and impartial assessors who have the necessary subject knowledge and expertise to make qualitative judgements about the content of research outputs.
Any evidence to be used in the assessment process will permit fair and equal comparison between researchers and will be appropriate for the purposes of the evaluation. The limitations of any evidence or methods will be considered.
The School will ensure that the objectives, criteria, range of admissible evidence, methods, and interpretation of results in any assessment process are set out in guidance for assessors and those submitting materials for assessment, including referees. All those subject to evaluation will be treated equally and impartially. Evaluation will consider only such material and information as is submitted for assessment in accordance with the guidance produced for the process in question.
The use of metrics may be considered as part of a complete assessment of publications submitted for the evaluation of individuals. However, such metrics should only inform, and must not supplant, expert evaluation, and any quantitative bibliometrics must be used carefully, recognising the biases associated with them. Use of a single metric alone to rank or evaluate individuals is not acceptable.
General guidance about the appropriate use of metrics, and about the strengths and weaknesses of various metrics, is available from:
- The University Library, https://libguides.cam.ac.uk/research-skills/metrics.
- The Metrics Toolkit website, https://www.metrics-toolkit.org.
Consideration of a diversity of research outputs and impacts
Action 3: To encompass a wide range of potential research impacts, reaching beyond a narrow focus on peer-reviewed publications and including contributions to open science, the translation and application of research, public engagement and policy impacts.
Recommendations for implementation:
While peer-reviewed publications will retain a central place in research quality assessments, the scope of research assessments will be widened so that peer-reviewed publications or related metrics are not the only focus of evaluation. Individuals should have an opportunity to report their research impact in terms of other outputs, which might include open science, public engagement, translation and application, economic activity, and policy-making.
In accordance with the University’s commitment to the Open Research agenda (https://www.openresearch.cam.ac.uk/cambridge/policies-frameworks/open-research-position-statement), evaluation processes should recognise contributions to open research, e.g. making datasets and/or software freely available, in Cambridge and more widely.
Evaluation will appropriately take account of the diversity of a researcher’s outputs. Guidance for each assessment process should encourage the submission of materials across the relevant range of formats appropriate for a given discipline.
Qualitative measures of research excellence will be used, as appropriate: for example, narratives and/or summaries of key papers and other outputs may be included in application materials to explain the quality of selected outputs. Referees for promotion panels are expected to explain why particular research outputs are significant.
Narratives are often useful for documenting research outputs in relation to the impact criteria of open science, public engagement, translation and application, economic activity, and policy-making, and their use should not be discouraged.
Narratives are already widely used in research assessment across the University, for example, in evaluating research summaries or plans, and in probation and promotion processes. It is recognised that narrative elements come with their own biases (for example, they could privilege candidates with enhanced literary skills). Within the School, an exclusively narrative CV is not deemed appropriate. The following is recommended:
- Narratives should be requested under suitable headings as part of assessment processes.
- Applicants are invited to include, in narrative form, an account of what they see as the importance of an appropriate subset of their outputs and to justify the use of any citation-based author-level or paper-level metrics that they wish to include in their supporting documentation.
Examples of information which may be captured via a narrative include research culture, researcher development or open science. Specific information could include:
- Contributions to research teams and the development of others, e.g. project management, supervision, mentoring, involvement in collaborations/networks within and outside the University, strategic leadership, etc. Narrative documentation of contributions of this nature could appropriately include statements from mentees, or members of research teams, who have been mentored or managed by the applicant.
- Contributions to the wider research and innovation community: e.g., reviewing, refereeing, and editorial board or funding panel membership (crucial to the success of the ecosystem); committee membership in the Department, University, or nationally/internationally; organisation of workshops, conferences or other events that benefit the research community; the development of research facilities or service platforms; contributions to improving research culture; contributions to open research.
- Exploiting and communicating research impact: e.g., knowledge exchange, generation of IP or new commercial activity; engagement with industry, private/public sector partnerships, or policy makers; communication with researchers in different fields/disciplines or general public engagement through books, broadcasts, talks or other general media.
Implementation
The School of Technology is committed to the principles of DORA, working in accordance with the University’s commitment to transparency, fairness, integrity and diversity in all matters of staff recruitment and evaluation.
Academic promotion processes need to compare individuals fairly across the diverse disciplines of the School, and so these principles have been embedded in its operations for many years. Sources of information used in evaluations for promotion are made explicit in the School’s Academic Career Pathways (ACP) documentation. The above recommendations provide overarching guidance for implementing DORA while both recognising current good practice and acknowledging that there is room for improvement.
The recommendations will ideally be implemented in all matters of staff evaluation, with Departmental and staff feedback taken into account. Any local guidance for recruitment, probation and promotion procedures will need to be updated in light of these recommendations.
Staff Guidance and Support
The School will need to take steps to ensure that appropriate guidance and information is made available to all staff and, as a mandatory requirement, provided to assessors formally responsible for evaluating research quality.
It is anticipated that the University will develop a short online guide to DORA as part of its unconscious bias training, and it is recommended that all those who are involved in evaluating research quality, for example in the ACP scheme, complete this training when it becomes available.
Assessors and members of appointments or promotion committees will be expected to understand and adhere to the principles of DORA when carrying out any research quality assessments.
Learning and improvement
The School will, from time to time, review this policy. Reviews may include, for example, reflexive evaluation of an individual assessment exercise (e.g. an annual promotion round), an annual or bi-annual appraisal of several assessment exercises, or applicant and evaluator feedback. Policies and guidance will be modified in light of such evaluation, if needed, in the spirit of continuously seeking to improve how we recognise and assess research quality.