
Open [Research Training] for [Open Research] Training

Publication type: Research Problem
CC BY 4.0
Peer Reviews (this version): 0



Can a participatory approach to the evaluation of open research training help to bootstrap the uptake of open research methods in the university sector?


The UK Reproducibility Network is running a five-year Open Research Programme across 20+ collaborating member institutions. A key deliverable in this project is a series of training events for academics and research support staff. The project employs a train-the-trainer (T3) model, in which circa 180 attendees from across the network will receive training from project partners. This intervention is designed to help these "T3" trainers deliver training in open research methods to a further 2700 attendees across their home institutions.

Another major project deliverable seeks to evaluate this training programme. Here, we outline an approach to training evaluation which augments the standard instruments with techniques arising in open research. We expect that pairing the dissemination of open research methods (through training) with an opportunity to practice these methods (in training evaluation) will help to bootstrap the uptake of open research in the sector.


As we detail below, we will invite participants in the Open Research Programme to use the eight publication types that are available on Octopus.ac to log and evaluate the open research training events that they attend and deliver. This proposal embeds a research problem, insofar as it provides a context for meta-evaluation. We are curious to learn whether and how participation in open evaluation of open research training influences the uptake of open research within individual and institutional practice.

The first indicators of uptake will be use of the eight Octopus publication types, as outlined below. To assess the distinct roles that training attendance, training delivery, and participatory training evaluation each play in the uptake of open research methods, we will complement participatory evaluation with a more conventional centralised evaluation strategy, drawing on focus groups and interviews with participants.

1. Research Problem:

In short, the question that UKRN's Open Research Programme addresses is:

How can we grow and embed open research in institutional practice and culture?

Here, we explore how we can embed open research in practice by inviting participation in the open evaluation of an open research training programme.

2. Rationale/Hypothesis:

As will be elaborated in a separate document, alongside a train-the-trainer (T3) model for disseminating open research methods, UKRN's Open Research Programme (ORP) will also bootstrap open research practices by inviting T3 attendees to contribute to open, participatory evaluation of the training that they deliver. This process of evaluation and refinement is expected to result in a distilled set of best practices. Complementary evaluation through focus groups and interviews will explore the roles that the different activities performed by T3 attendees play in encouraging behaviour change.

3. Method:

Octopus.ac will be used to gather evidence of which kinds of open research training work well and which do not, treating each training event as a small experiment.

As such, trainers will be asked to pre-register at least a Rationale and a Method for their training before it is delivered, linking it to this Research Problem or to a subsequently identified sub-problem, and to share measurable Results at a suitable follow-up date. This applies both to train-the-trainer (T3) training, delivered by project partners, and to standard training ("T1") delivered by T3 attendees at their institutions.

4. Results:

Each delivered training should be linked, in its Method, to measurable open research indicators. UKRN have identified several such indicators, a description of which will be shared with T3 participants. The documented Results of the training will then assess the intervention against these indicators and provide any further contextualising information about implementation and immediate outcomes. (For example, in the case of T3 training, links to any associated T1 training materials produced by attendees could be provided.)

5. Analysis:

A straightforward analysis of an individual training activity will be invited in the form of a “lessons learned” narrative, and a description of how the training might be done differently in the future.

6. Interpretation:

A subsequent narrative synthesis might inherit from several such Analyses and other research objects to distil best-practice guidelines.

7. Real World Application:

Trainers may develop case studies in advance to share with attendees as examples of good practice. Further (new) evidence of how the open research practices taught in the training are actually being used "in the wild" could also be gathered and shared under this heading.

8. Peer Review:

Feedback on any of the other objects could be provided, and may be requested from T3 and T1 attendees. For example, peer feedback on training design could be attached to a Method, while trainee feedback on whether the training was actually useful might instead attach to the Results.


This Research Problem has the following sources of funding:

Conflict of interest

This Research Problem does not have any specified conflicts of interest.