AK    @ak92501    11/25/2021      

On the Unreasonable Effectiveness of Feature propagation in Learning on Graphs with Missing Node Features abs: https://t.co/aboq1hkhdU




Emanuele Rossi    @emaros96    12/1/2021      

We introduce Feature Propagation (FP), a new method to learn on graphs with partially missing node features. With 99% of features missing, FP loses only 4% rel. acc. compared to when all features are available. Bonus: It scales to graphs with >100M nodes https://t.co/ANdEGoBnoE

Emanuele Rossi    @emaros96    12/1/2021      

When evaluated on node classification, it outperforms all previous methods for any rate of missing features. Moreover, it can withstand surprisingly high rates of missing features: on average we observe only ~4% relative accuracy drop when 99% of the features are missing.
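As described in the paper, Feature Propagation iteratively diffuses features over the normalized graph adjacency, resetting the known entries after each step. A minimal sketch of that idea (the function name, iteration count, and the symmetric normalization choice are assumptions for illustration, not the authors' exact code):

```python
import numpy as np

def feature_propagation(adj, x, known_mask, num_iters=40):
    """Sketch of Feature Propagation: diffuse node features over the
    graph and clamp the known entries back after every step.

    adj        : (n, n) dense adjacency matrix
    x          : (n,) feature values (entries at missing nodes ignored)
    known_mask : (n,) boolean, True where the feature is observed
    """
    # Symmetrically normalized adjacency D^{-1/2} A D^{-1/2}
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    norm_adj = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

    out = np.where(known_mask, x, 0.0)  # missing entries start at zero
    for _ in range(num_iters):
        out = norm_adj @ out            # one diffusion step
        out = np.where(known_mask, x, out)  # reset known features
    return out
```

On a 4-node path graph with the feature known only at the endpoints, the missing interior values converge to a smooth interpolation between the two known values; each diffusion step is a sparse matrix-vector product, which is what makes the scheme cheap enough for very large graphs.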

Mathias Niepert    @Mniepert    11/26/2021      

Since GNNs for missing data are trending, I feel inclined to advertise our method published in 2018 (https://t.co/uW5bgAZZn6), which learns, for each node and feature type, one vector for available values and one for missing values. These are aggregated during message passing.
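One way to read the mechanism described in this tweet: each feature slot gets two learned embedding vectors, and a node's representation is built from the "available" embedding (scaled by the observed value) or the "missing" embedding, depending on which case applies, before aggregation. A hypothetical sketch under those assumptions (all names, the scaling choice, and the mean aggregation are illustrative, not the 2018 paper's actual formulation; the embedding tables stand in for learned parameters):

```python
import numpy as np

def embed_with_missing_indicators(x, known_mask, emb_avail, emb_missing):
    """Hypothetical sketch: per-feature 'available'/'missing' embeddings.

    x           : (n, f) raw feature values
    known_mask  : (n, f) boolean, True where the value is observed
    emb_avail   : (f, d) embedding per feature for observed values
    emb_missing : (f, d) embedding per feature for missing values
    Returns (n, d) node representations (mean over features), ready to
    feed into message passing.
    """
    n, f = x.shape
    d = emb_avail.shape[1]
    per_feature = np.empty((n, f, d))
    for j in range(f):
        per_feature[:, j, :] = np.where(
            known_mask[:, j, None],
            x[:, j, None] * emb_avail[j],  # observed: scale 'available' vector
            emb_missing[j],                # missing: use 'missing' vector
        )
    return per_feature.mean(axis=1)  # simple aggregation over feature slots
```

The point of the two-vector scheme is that missingness itself becomes a learnable signal rather than being imputed away as a zero or a mean.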

NAACL HLT    @naaclmeeting    12/5/2021      

Workshop at #NAACL2022 (@naaclmeeting): 📣 Deep Learning on Graphs for Natural Language Processing