Open and Sustainable Innovation Systems (OASIS) Lab working notes


systematic review

This is an "Orphan" page. Its core content has not been shared: what you see below is a loose collection of pages and page snippets that mention this page, as well as snippets of this page that were quoted elsewhere.

Referenced in

@ervinMotivatingAuthorsUpdate2008

Title: Motivating authors to update systematic reviews: practical strategies from a behavioural science perspective

@oconnorFocusCrosspurposeTools2020

The fourth meeting of the International Collaboration for the Automation of Systematic Reviews (ICASR) was held 5–6 November 2019 in The Hague, the Netherlands. ICASR is an interdisciplinary group whose goal is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. The group seeks to facilitate the development and acceptance of automated techniques for systematic reviews. In 2018, the major themes discussed were the transferability of automation tools (i.e., tools developed for other purposes that might be used by systematic reviewers), the automated recognition of study design in multiple disciplines and applications, and approaches for the evaluation of automation tools.

@hsiaoVisualizingEvidencebasedDisagreement2020

systematic reviews answer specific questions based on primary literature. However, systematic reviews on the same topic frequently disagree, yet there are no approaches for understanding why at a glance. Our goal is to provide a visual summary that could be useful to researchers, policy makers, and health care professionals in understanding why health controversies persist in the expert literature over time. We present a case study of a single controversy in public health, around the question: “Is reducing dietary salt beneficial at a population level?” We define and visualize three new constructs: the overall evidence base, which consists of the evidence summarized by systematic reviews (the inclusion network) and the unused evidence (isolated nodes). Our network visualization shows at a glance what evidence has been synthesized by each systematic review. Visualizing the temporal evolution of the network captures two key moments when new scientific opinions emerged, both associated with a turn to new sets of evidence that had little to no overlap with previously reviewed evidence. Limited overlap between the evidence reviewed was also found for systematic reviews published in the same year. Future work will focus on understanding the reasons for limited overlap and automating this methodology for medical literature databases.
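The three constructs above (overall evidence base, inclusion network, isolated nodes) can be sketched in a few lines of plain Python. This is only an illustration of the idea, not the authors' actual pipeline; all study and review identifiers below are invented:

```python
# Sketch of the "overall evidence base" constructs: each review includes a
# subset of primary studies (the inclusion network); studies included by no
# review are the "unused evidence" (isolated nodes). IDs are invented.

all_studies = {f"study_{i}" for i in range(1, 8)}

# review -> set of primary studies it synthesized
inclusions = {
    "review_2002": {"study_1", "study_2"},
    "review_2009": {"study_3", "study_4"},
    "review_2014": {"study_4", "study_5"},
}

synthesized = set().union(*inclusions.values())
unused_evidence = all_studies - synthesized  # isolated nodes

# Little overlap between reviews signals a "turn" to a new set of
# evidence, as the case study observed for the dietary-salt controversy.
overlap = inclusions["review_2009"] & inclusions["review_2014"]

print(sorted(unused_evidence))  # ['study_6', 'study_7']
print(overlap)                  # {'study_4'}
```

Representing reviews and studies as a bipartite inclusion graph makes both "what each review synthesized" and "what nobody synthesized" readable at a glance, which is essentially what the paper's visualization does over time.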

@ervinMotivatingAuthorsUpdate2008

Notes on time-cost of systematic reviews (highlights: as of the mid-2000s, each review costs about $20-30k and takes anywhere from 1.5 to 4 years!) (p.3)

Z: Ideas develop simultaneously at multiple timescales, levels of granularity, and completeness

This is another hidden reason why incremental formalization is powerful for creative synthesis in particular, but may not be as important for, say, a systematic review or meta-analysis: creative synthesis needs to bridge and connect multiple overlapping layers of ideas.

@hoangOpportunitiesComputerSupport2018

systematic review is a type of literature review designed to synthesize all available evidence on a given question. systematic reviews require significant time and effort, which has led to the continuing development of computer support. This paper seeks to identify the gaps and opportunities for computer support. By interviewing experienced systematic reviewers from diverse fields, we identify the technical problems and challenges reviewers face in conducting a systematic review and their current uses of computer support. We propose potential research directions for how computer support could help to speed the systematic review process while retaining or improving review quality.

@hoangOpportunitiesComputerSupport2018

researchers basically never used specialized software for their systematic reviews: instead, the most common tools were sys/Microsoft Excel and sys/EndNote

@flemingCochraneNonCochraneSystematic2013

Claim Only about 20% of published systematic reviews in orthodontics are "good" by AMSTAR standards of review quality; another 20% are considered "poor"! (Table 2, p.246) synthesis

@reyndersContactingAuthorsModified2019

systematic review teams very frequently (on the order of 70% of the time) need to contact authors for additional details (context) about reported findings synthesis

🌲 zettels

systematic reviews are a narrow subset of synthesis

@greenhalghTimeChallengeSpurious2018

Claim systematic reviews are typically narrowly focused and provide less insight

Z: Synthesis is creating a new whole out of component parts

Thought-Fragment Additionally, the IVEO Matrix isn't quite where we want it to be yet: it's more of a meta-analysis or systematic review, rather than a coherent new whole. There is no causal model or graph that ties it all together to support sophisticated reasoning and decision-making

@dowdWhyPublishSystematic2020

Systematic reviews provide more than just a summary of the research literature related to a particular topic or question; rather, they offer clear and compelling answers to questions related to the "who," "why," and "when" of studies. In this chapter, the authors draw on their experiences with systematic reviews (one as an editor of a highly regarded educational research journal, the other as a researcher and review author) to trace the growing popularity of systematic reviews in education literature and to pose a series of challenges to aspiring review authors to motivate and enliven their work. In particular, the authors stress the importance of melding scientific and rigorous review procedures with "stylish" academic writing that engages its audience through effective storytelling, attention to context (the people, places, policies, and practices represented in the studies under review), and clear implications for research and practice.

Z: How can we support explicit contention with evidence when synthesizing knowledge claims?

In the limit, systematic reviews do sophisticated computations over these quantitative "fixed" values of evidence level to reach an overall conclusion about the weight of evidence behind a single claim. These are extremely valuable! But they do presuppose a level of "fixedness" and consensus over what constitutes certainty or strength of evidence, which may or may not exist!
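As one concrete instance of the kind of computation over "fixed" evidence values meant here, a standard fixed-effect meta-analysis pools study estimates by inverse-variance weighting, which presupposes exactly the consensus noted above (that every study measures one common true effect). A minimal sketch, with invented effect sizes and variances:

```python
import math

# Fixed-effect inverse-variance pooling: each study's effect estimate is
# weighted by 1/variance, so more precise studies count for more. The
# numbers below are invented for illustration.
studies = [
    {"effect": 0.30, "variance": 0.04},
    {"effect": 0.10, "variance": 0.02},
    {"effect": 0.25, "variance": 0.08},
]

weights = [1.0 / s["variance"] for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

# 95% confidence interval under a normal approximation
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
print(round(pooled, 3), tuple(round(x, 3) for x in ci))
```

The "fixedness" assumption is baked into the single shared weighting scheme: the moment reviewers disagree about what a study's variance or evidence level should be, the pooled number loses its claim to being *the* weight of evidence.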

@hsiaoVisualizingEvidencebasedDisagreement2020

systematic reviews that came to different conclusions also frequently relied on very different (types of) evidence

@beelResearchPaperRecommender2013

as far as systematic review goes, not necessarily exhaustive: screening was done manually too, first by title, so there's a risk of false negatives if the relevant terms aren't in the title. Not sure whether Google Scholar did query expansion or soft matching at the time; that might have helped.

🌲 zettels

systematic reviews are infrequently updated, despite often being outdated very quickly

@turnerProducingCochraneSystematic2017

Title: Producing Cochrane systematic reviews—a qualitative study of current approaches and opportunities for innovation and improvement

@dowdWhyPublishSystematic2020

Publication: systematic reviews in Educational Research: Methodology, Perspectives and Application

@petrosino1999lead

systematic reviews can take 5-6 people more than 1000 hours without special tech (cited in @ervinMotivatingAuthorsUpdate2008) synthesis

@blakeCollaborativeInformationSynthesis2006

is an example-of how context is critical for synthesis (in this case, showing the kinds of context queries scientists make when trying to synthesize contradictory findings in a systematic review)

Z: Synthesis is creating a new whole out of component parts

It's also that these systematic reviews and meta-analyses are often laser-focused on just one edge in a causal model!

@knightEnslavedTrappedData2019

> The Data Extraction Process seemed particularly demanding to those involved in the corresponding tasks. The Coordinating Editor (P1) described systematic reviewers as being "enslaved to the trapped data", with reviewers "chiseling the mine" to get at the data they needed. (p.208) synthesis

@delaneySearchingEvidenceApproval2018

Despite recognition that database search alone is inadequate even within the health sciences, it appears that reviewers in fields that have adopted systematic review are choosing to rely primarily, or only, on database search for information retrieval. This commentary reminds readers of factors that call into question the appropriateness of default reliance on database searches particularly as systematic review is adapted for use in new and lower consensus fields. It then discusses alternative methods for information retrieval that require development, formalisation, and evaluation. Our goals are to encourage reviewers to reflect critically and transparently on their choice of information retrieval methods and to encourage investment in research on alternatives.

🌲 zettels

Most systematic reviews are done by temporary volunteers

@hsiaoVisualizingEvidencebasedDisagreement2020

different systematic reviews between 2002 and 2014 came to very different conclusions on dietary salt

🌲 zettels

Many systematic reviews are done in an ad-hoc mix of tools
