ADAGES: Adaptive Aggregation with Stability for Distributed Feature Selection

EasyChair Preprint no. 3999

24 pages · Date: August 2, 2020

Abstract

In this era of big data, not only does the sheer volume of data motivate distributed computing, but concerns about data privacy also place new emphasis on distributed learning. To perform feature selection and control the false discovery rate (FDR) in a distributed setting, across multiple machines or institutions, an efficient aggregation method is necessary. In this paper, we propose an adaptive aggregation method called ADAGES, which can be flexibly applied on top of any machine-wise feature selection method. We show that our method controls the overall FDR with a theoretical guarantee while maintaining power comparable to the Union aggregation rule in practice.
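To make the aggregation setting concrete, here is a minimal illustrative sketch, not the ADAGES algorithm itself: each machine reports a set of selected features, and a central aggregator keeps features selected by at least `c` machines. With `c = 1` this reduces to the Union rule mentioned in the abstract; larger `c` is more conservative. The adaptive, stability-based choice of `c` that ADAGES proposes is not reproduced here.

```python
# Illustrative threshold aggregation of machine-wise feature selections.
# NOTE: this is a hypothetical sketch for intuition, not the ADAGES method;
# the function name and the fixed threshold c are assumptions of this example.
from collections import Counter

def threshold_aggregate(selected_sets, c):
    """Keep features selected by at least c of the machines."""
    votes = Counter(f for s in selected_sets for f in s)
    return {f for f, v in votes.items() if v >= c}

# Example: three machines each report a set of selected feature indices.
machine_selections = [{0, 1, 2}, {1, 2, 5}, {2, 3}]
union_rule = threshold_aggregate(machine_selections, c=1)     # Union rule
majority = threshold_aggregate(machine_selections, c=2)       # stricter vote
```

Raising `c` trades power for fewer false discoveries, which is why choosing it adaptively, as the paper's title suggests, matters in practice.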

Keyphrases: aggregation method, controlled feature selection, distributed learning, false discovery rate, stability

BibTeX entry
BibTeX does not have a dedicated entry type for preprints. The following is a workaround for producing a correct reference:
@booklet{EasyChair:3999,
  author = {Yu Gui},
  title = {ADAGES: Adaptive Aggregation with Stability for Distributed Feature Selection},
  howpublished = {EasyChair Preprint no. 3999},
  year = {EasyChair, 2020}}