Updating systematic reviews

Updating systematic reviews to incorporate new evidence has traditionally involved a manual search for new relevant studies and a re-evaluation of the results.

Systematic reviews of clinical trials are a critical resource supporting policy and clinical decision-making, but they are challenging to keep current. Some investigators have used clinical trial registries to identify publication bias; registries can also be a timelier source of evidence than bibliographic databases, and may be especially useful for signaling when new trial evidence is, or is soon to be, available for incorporation into a systematic review update. To reduce the effort needed to undertake systematic reviews, researchers have automated some systematic review processes, with much of that effort directed at the most burdensome steps: searching and screening of bibliographic databases. Our aim was to create a new method for systematic reviewers to monitor when relevant trial evidence becomes available and to assess the need for a systematic review update. We created a shared space for humans and software agents to work together to proactively monitor the status of registered trials that are likely to be relevant to a systematic review update.

Producing and updating systematic reviews is resource-intensive, and the rate at which new evidence is produced can outpace our ability to keep up.

Users interact with the system by browsing, searching, or adding systematic reviews, verifying links to trials included in the review, and adding or voting on trials that they would expect to include in an update of the systematic review.

The system can trigger software agents that add or vote on included and relevant trials, either in response to user interactions or on a schedule that pulls updates from external resources.
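The human-and-agent voting model described above can be illustrated with a minimal sketch. All of the names here (Vote, Review, tally) are hypothetical and are not the system's actual API; the point is only that votes from users and software agents accumulate against a trial and produce a net signal for inclusion in an update.

```python
from dataclasses import dataclass, field

@dataclass
class Vote:
    trial_id: str   # e.g. a trial registry identifier such as an NCT number
    voter: str      # a human user or a software agent identifier
    include: bool   # whether the voter expects the trial in the review update

@dataclass
class Review:
    review_id: str
    votes: list = field(default_factory=list)

    def add_vote(self, vote: Vote) -> None:
        self.votes.append(vote)

    def tally(self, trial_id: str) -> int:
        """Net votes for including a trial in an update of this review."""
        return sum(1 if v.include else -1
                   for v in self.votes if v.trial_id == trial_id)

# Votes may come from humans or from agents monitoring external resources.
review = Review("REVIEW-001")
review.add_vote(Vote("NCT00000001", "user:alice", True))
review.add_vote(Vote("NCT00000001", "agent:registry-monitor", True))
review.add_vote(Vote("NCT00000001", "user:bob", False))
print(review.tally("NCT00000001"))  # → 1
```

A positive tally could prompt reviewers to verify the link; a scheduled agent would simply call `add_vote` when it detects new registry activity.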

The central component of the system is a database of structured records linking registered trials to the published systematic reviews that include them.

The database is populated by extracting information from a range of data sources and from crowd-sourcing.
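A structured link record of the kind described might look like the following sketch. The field names and values are illustrative assumptions, not the system's actual schema; they capture only the elements named above: a trial registration, the review it is linked to, and whether the link came from automated extraction or crowd-sourcing and has been verified.

```python
# Hypothetical record linking a registered trial to a published systematic
# review. Every field name below is an assumption for illustration.
link_record = {
    "review_id": "REVIEW-001",            # identifier of the systematic review
    "trial_registration": "NCT00000001",  # trial registry identifier
    "link_status": "included",            # e.g. included vs. relevant-to-update
    "source": "crowd",                    # "crowd" or "extracted"
    "verified": False,                    # awaiting human verification
}

print(sorted(link_record))
```

Keeping the provenance (`source`) and verification state on each record lets human verification and agent-driven extraction coexist in the same database.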

Systematic reviews, just like other research articles, can be of varying quality.