MULTIPLE TESTING FOR MODERN DATA: STRUCTURE, CURATION, AND REPLICABILITY

EUGENE KATSEVICH – STANFORD UNIVERSITY

ABSTRACT:

Modern scientific investigations often start by testing a very large number of hypotheses in an effort to comprehensively mine the data for possible discoveries. Multiplicity adjustment strategies are employed to ensure replicability of the results of this broad search. Furthermore, in many cases, the discoveries are subject to a second round of selection, in which researchers identify the rejected hypotheses that best represent distinct and interpretable findings for reporting and follow-up. For example, in genetic studies, one DNA variant is often chosen to represent a group of neighboring polymorphisms, all apparently associated with a trait of interest. Unfortunately, the false discovery rate (FDR) control guarantees that may hold for the initial set of findings do not carry over to this subset, potentially inflating the FDR among the reported discoveries. To guarantee valid inference, we introduce Focused BH, a multiple testing procedure that allows the researcher to curate rejections by subsetting or prioritizing them according to pre-specified but possibly data-dependent rules (filters). Focused BH ensures FDR control on the selected discoveries under a range of assumptions on the filter and the p-value dependency structure; simulations illustrate that this is achieved without substantial power loss and that the procedure is robust to violations of our theoretical assumptions.
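To make the idea of filtering the rejection set concrete, the following is a minimal Python sketch of a Focused-BH-style procedure, written from the abstract's description rather than the paper's exact definition. The FDP estimate m*t/|filtered set|, the scan over candidate thresholds, and the example filter (one representative per group of neighboring hypotheses, mirroring the DNA-variant example) are illustrative assumptions; the names focused_bh, filter_fn, and one_per_group are hypothetical.

```python
import numpy as np

def focused_bh(pvals, filter_fn, q=0.1):
    """Illustrative sketch of a Focused-BH-style procedure (an assumption
    based on the abstract, not the paper's exact definition).

    For each candidate threshold t, reject R(t) = {i : p_i <= t}, apply the
    user-supplied filter to obtain the curated set U(R(t)), and estimate the
    FDP among curated discoveries as m * t / |U(R(t))|. The largest t whose
    estimate falls below q determines the reported set.
    """
    pvals = np.asarray(pvals)
    m = len(pvals)
    # Candidate thresholds: observed p-values, scanned from largest to smallest.
    for t in np.sort(pvals)[::-1]:
        rejected = set(np.flatnonzero(pvals <= t))
        curated = filter_fn(rejected)
        if len(curated) > 0 and m * t / len(curated) <= q:
            return curated  # curated discoveries at the chosen threshold
    return set()            # no threshold yields an acceptable FDP estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pvals = rng.uniform(size=100)
    pvals[:10] = rng.uniform(0, 1e-4, size=10)  # a cluster of strong signals
    groups = np.arange(100) // 10               # hypothetical grouping of neighbors

    def one_per_group(rejected):
        # Filter: keep only the smallest p-value within each group of neighbors.
        best = {}
        for i in rejected:
            g = groups[i]
            if g not in best or pvals[i] < pvals[best[g]]:
                best[g] = i
        return set(best.values())

    print(sorted(focused_bh(pvals, one_per_group, q=0.1)))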