The title of this post is copied from an article in The Scientist back in 2011, but is just as relevant now. I'm a big fan of sensitivity analyses: demonstrating that the findings are not due to a particular choice of parameters (preprocessing, classifier, searchlight diameter, etc.), but hold over a range of reasonable choices. Such sensitivity analyses strike me as a practical way to guard against false-positive or over-fished findings, and perhaps the only way to demonstrate robustness for highly complex studies like most MVPA. I really don't think a single statistical approach is going to solve all our problems; multiple approaches are necessary.
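For concreteness, here is a minimal sketch of what I mean, using scikit-learn on toy data. The data, classifiers, and parameter values are all hypothetical stand-ins for whatever choices actually matter in a given pipeline; the point is only the structure of the check: rerun the same analysis over a grid of reasonable options and see whether the answer moves.

```python
# A minimal sensitivity-analysis sketch: rerun the same decoding
# analysis under several reasonable parameter choices and check that
# the result is stable. All names and values here are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 100))   # 40 trials x 100 voxels (toy data)
y = np.repeat([0, 1], 20)            # two-class labels

classifiers = {
    "linear SVM, C=1":    SVC(kernel="linear", C=1.0),
    "linear SVM, C=0.01": SVC(kernel="linear", C=0.01),
    "logistic regression": LogisticRegression(max_iter=1000),
}

# If accuracy swings wildly across these rows, the "finding" may hinge
# on an arbitrary analysis choice rather than on signal in the data.
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:22s} mean CV accuracy = {acc:.3f}")
```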
Which brings me back to "supplemental or detrimental": there's no succinct way to describe and show the outcome of things like sensitivity analyses in a few sentences, or incorporate them into a short manuscript. Many, many methodological details just can't be explained briefly, aren't critical for the main discussion, but are very interesting for specialists and necessary for replication. For example, papers often simply state that "permutation testing was used", without explaining exactly how the relabeling was carried out. The precise permutation scheme sometimes makes a big difference in what's considered significant. But even I don't necessarily want to go through a half-page explanation of the permutation scheme in the main text; I'd rather see a sentence-or-two description, accompanied by a pointer to a complete description (maybe even with figures and code) in the Supplemental. Similarly, I might just want to see a sentence in the main Results mentioning that they found a similar pattern of results with a different group-level statistic, with the actual details in the Supplemental.
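To make that point concrete, here is a hypothetical toy example (not taken from any particular paper) contrasting two relabeling schemes: an unrestricted permutation of the labels versus one that shuffles labels only within each scanning run, respecting the dependence structure of the design.

```python
# A toy illustration of why the relabeling scheme matters. Everything
# here is hypothetical: a two-class design with 4 "runs", comparing an
# unrestricted label permutation against a within-run permutation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_runs, trials_per_run = 4, 10
runs = np.repeat(np.arange(n_runs), trials_per_run)
y = np.tile(np.repeat([0, 1], trials_per_run // 2), n_runs)
X = rng.standard_normal((len(y), 50))  # 50 voxels of pure noise

def null_accuracy(permute):
    """Cross-validated accuracy after relabeling with `permute`."""
    return cross_val_score(SVC(kernel="linear"), X, permute(y), cv=5).mean()

def unrestricted(y):
    # shuffle all labels together, ignoring run structure
    return rng.permutation(y)

def within_run(y):
    # shuffle labels separately inside each run
    y_new = y.copy()
    for r in range(n_runs):
        idx = np.where(runs == r)[0]
        y_new[idx] = rng.permutation(y[idx])
    return y_new

# The two schemes can produce different null distributions, and so
# different significance thresholds for the same observed accuracy.
for name, scheme in [("unrestricted", unrestricted), ("within-run", within_run)]:
    null = [null_accuracy(scheme) for _ in range(200)]
    print(f"{name:12s} null 95th percentile = {np.percentile(null, 95):.3f}")
```

A half-page description of this kind of detail, with code, is exactly what belongs in a Supplemental: too long for the main text, but indispensable for anyone trying to evaluate or replicate the result.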
But what's an author to do when journals have extremely restrictive policies like those of Neuron ("Each main figure or table can be associated with up to one supplemental figure. Therefore, the total number of supplemental figures should not exceed the number of main figures and tables.") or Nature Neuroscience ("Please note that Supplementary methods are not allowed. A supplementary note should be used only in consultation with the editors...")? None of the options are great: Authors can submit to journals without such restrictions (like NeuroImage), but other concerns sometimes make submitting to a particular journal necessary. Authors can strip out the "extra" information to fit space requirements, perhaps leaving oblique references to it ("single-subject information maps were consistent"), but that makes it impossible to judge what the hidden analyses actually were. Supplemental analyses could perhaps be hosted elsewhere (e.g. lab websites, personal blogs) in the hope that readers can find the material via Google, but such links are neither permanent nor tied to the article.
It strikes me as inherently contradictory to lament replicability problems while simultaneously having highly restrictive publication rules for supplementary information. Complex science is complex, and requires complex description. Good science can't always be properly conveyed by a few words, even by the best authors; and MVPA studies should be accompanied by additional details, even when such details make a publication "confusing and unwieldy".
Wednesday, November 6, 2013
travels: NIN and HDM
I'll be traveling a fair bit in the next month: I'll be at the Netherlands Institute of Neuroscience for the last two weeks of November, then giving a talk at the 1st International Workshop on High Dimensional Data Mining (HDM) in Dallas, TX on 7 December. Drop me a line if you'll be around for either and would like to meet.