Take your supplements

There's one thing I want to address before I get into writing a research article, and that's supplemental figures. I mentioned in my last post that high-impact journals such as Science, Nature, and Cell require vast amounts of data for publication. Fair enough. They want well-supported, in-depth studies, which require a lot of experiments. The issue is that they also have very strict space limitations, usually around 5 pages, and that's including space for the actual figures! That means there's very little room for providing background on the area of research or a detailed discussion of how the data impacts science. It can make the articles feel rushed when you're reading them, but it's also kind of refreshing to just get the information without all the blabbity-blah (says the blabbity-blah blogger).

The other consequence is that a lot of the "non-essential" information gets hidden in the supplemental material. This online-only section was originally intended for large data sets that simply aren't feasible to publish in print, such as a bacterial genome sequencing project. That's WAY too many A's, G's, C's, and T's for a printer to handle! The supplemental material can also contain details of the methods used to perform the experiments, which I completely agree belongs here. That's extra information that is important for anyone who wants to reproduce the experiments, but isn't necessarily essential to understanding the study (assuming the authors put enough detail in the main text).

The problem arises, however, when researchers are forced to put important supporting data into supplemental figures. Only recently have standard guidelines for writing and editing this section even been described, and it's debatable whether they will become common practice. The people who peer review manuscripts submitted for publication are researchers themselves, and their time is already limited by other demands. So the worry is that supplemental figures aren't reviewed as thoroughly, and problems with the quality of the experiments may be overlooked. That's not good; no one wants sloppy science.

The other issue is that sometimes the supplemental figures become a data dump. Negative data is very difficult to publish and only gets into low-end journals, so writing it up is often considered a waste of time. But it is important information for scientists. Kind of a PSA to save others time and funding: "Hey guys, don't bother testing this idea. It totes doesn't work!" So why not put that in the supplemental material? I don't have a good answer for that, and to be honest, I'm not sure it's a bad idea. Although a better idea has been broached by journals that exclusively publish negative results, such as the Journal of Negative Results in Biomedicine or the Journal of Negative Results: Ecology and Evolutionary Biology. I love this idea, but I don't feel like these have quite found their footing yet. In particular, the Biomedicine journal focuses on "unexpected, controversial, provocative and/or negative results", which misses the point a bit. But it's a start! Let's get those allegedly "failed" experiments out of the supplemental wasteland and into a proper manuscript.

But it all comes back to journal expectations. If 5 figures and double that number in the supplemental figures (I'm not even exaggerating) are what's expected to get a high-impact journal on your CV, then that's what you have to do. Publish or perish. The good news is that two fairly high-impact journals (that I know of, there may be more) have decided to eliminate supplemental data-dumping, either by restricting what is allowed or by getting rid of it altogether. I say good on them, change is good! Encyclopedias have given way to Wikipedia in our technological world; maybe it's time to reconsider how research data is published too.