Science Surveyor


One of the biggest challenges facing science journalists is quickly contextualizing the peer-reviewed journal articles they report on deadline. A science reporter must rapidly get a sense of what has come before in a field they may not know well, understand whether the new paper represents a significant advance, and establish whether the finding is an outlier, falls within one camp, or is part of the field’s consensus.


SCIENCE SURVEYOR is an algorithm-based method to help science journalists rapidly and effectively characterize the rich literature for any topic they might cover, as a way to inform and assist news judgment and reporting. We see the development of these algorithms as having the potential to greatly improve science news coverage by making it more independent, contextualized and investigative. News stories could better convey the incremental character of science, its complexity, its nuance. If successful, the methodology we devise could have wider use. Scientists and other researchers could use it to improve communication about their fields and practice within their fields. Members of the public could use it to engage with the specialized literature in new ways.

To these ends, our goal for 2015/16 is to innovate on three fronts:

  1. To develop novel computational methodologies addressing the problem of scientific consensus.
  2. To create new ways of visualizing the growth and decay of shared knowledge.
  3. To apply our tools and methods to several controversial stories from the fields of climate research and neuroscience, as a model of data-driven journalism.


Our interdisciplinary, multi-departmental, bicoastal team consists of researchers at Columbia University and Stanford University:

  • Marguerite Holloway - Graduate School of Journalism, Columbia University
  • Dennis Tenen - Department of English and Comparative Literature, Columbia University
  • Laura Kurgan - Graduate School of Architecture, Planning and Preservation, Columbia University
  • Juan Francisco Saldarriaga - Graduate School of Architecture, Planning and Preservation, Columbia University
  • Dan Jurafsky - Department of Linguistics and Department of Computer Science, Stanford University
  • Dan McFarland - Graduate School of Education and Department of Sociology, Stanford University


Additional collaborators include:

  • Juan Alperin - Simon Fraser University
  • Shuheng Gong - Brown Institute for Media Innovation, Columbia University
  • Cheryl Holzmeyer - Graduate School of Education, Stanford University
  • Lauren Maggio - Lane Medical Library, Stanford University
  • Christopher Manning - Department of Computer Science, Stanford University
  • Laura Moorhead - Graduate School of Education, Stanford University
  • Phillip Polefrone - Department of English and Comparative Literature, Columbia University
  • Dragomir Radev - Department of Electrical Engineering and Computer Science, University of Michigan
  • John Willinsky - Graduate School of Education, Stanford University


2015 - 2016

In September SCIENCE SURVEYOR will begin its second year of research with an extremely generous Magic Grant from the Helen Gurley Brown Institute for Media Innovation. SCIENCE SURVEYOR was chosen as the Brown Institute’s first Flagship Project. The expanded team aims to develop a methodology and then test it in interactive visualizations of several case studies from climate science and neuroscience.

In August the principal investigators met at Stanford to plan the upcoming year and brainstorm about their approach. Several graduate students and post-docs in computer science, linguistics, sociology and education will soon join the team and start devising the methodologies.

In May Science Surveyor was described in the Communications of the ACM.

Please check back here for more updates soon!

2014 - 2015

SCIENCE SURVEYOR was awarded its initial Magic Grant for 2014-2015. Over the last year, the original team of Holloway, Tenen, Saldarriaga and Kurgan conducted an extensive literature review, surveying prior work on citation and network analysis and on characterizing and visualizing the scientific literature.

The team then began to experiment on a small, clean corpus of peer-reviewed papers, the Archive of Computational Linguistics, generously donated by Dragomir Radev. The team worked on devising ways of depicting “centrality,” a measure, based on language and citations, of how related and similar a paper is to its larger field, and of showing how that measure changes over time. With the help of Phillip Polefrone and Shuheng Gong, the team developed a methodology and created two innovative visualizations.
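
As a rough illustration of what a centrality measure built from language and citations might look like, here is a minimal Python sketch. The TF-IDF centroid similarity, the PageRank score over the citation graph, and the toy inputs are illustrative assumptions, not the team’s actual methodology.

```python
# Minimal sketch of a paper-level "centrality" score, assuming:
#   - text centrality     = cosine similarity to the field's TF-IDF centroid
#   - citation centrality = PageRank over a directed citation graph
# These choices are illustrative, not the SCIENCE SURVEYOR methodology.
import numpy as np
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def text_centrality(abstracts):
    """How close each paper's language is to the field-wide 'average' paper."""
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
    centroid = np.asarray(tfidf.mean(axis=0))   # 1 x n_terms centroid vector
    return cosine_similarity(tfidf, centroid).ravel()

def citation_centrality(edges):
    """PageRank of each paper in the citation graph; edges are (citing, cited) pairs."""
    return nx.pagerank(nx.DiGraph(edges))

# Toy placeholder data; a real run would slice the corpus by publication year
# and recompute the scores per slice to show how centrality changes over time.
abstracts = [
    "statistical machine translation with phrase-based models",
    "neural network models for part-of-speech tagging",
    "annotation guidelines for a new treebank corpus",
]
print(text_centrality(abstracts))
print(citation_centrality([("p2", "p1"), ("p3", "p1"), ("p3", "p2")]))
```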

In December 2014 the team reached out, via Chris Manning, to Dan Jurafsky and Dan McFarland, who have deep expertise in natural language processing, in characterizing the scientific literature, and in the sociology of science. The ensuing collaboration offers SCIENCE SURVEYOR its greatest chance of success.