DeepMind    @DeepMind    6/16/2020      

Moving away from negative pairs in self-supervised representation learning: our new SotA method, Bootstrap Your Own Latent (BYOL), narrows the gap between self-supervised & supervised methods simply by predicting previous versions of itself. See here: https://t.co/qyaSXnPQjN
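
The "predicting previous versions of itself" idea in the tweet above can be sketched in a few lines. This is a hedged NumPy illustration of BYOL's two core updates, not DeepMind's implementation; the names `byol_loss` and `ema_update` are ours:

```python
import numpy as np

def byol_loss(online_pred, target_proj):
    """Normalized MSE between the online network's prediction and the
    target network's projection of another augmented view.
    Equivalent to 2 - 2 * cosine similarity; no negative pairs needed."""
    p = online_pred / np.linalg.norm(online_pred, axis=-1, keepdims=True)
    z = target_proj / np.linalg.norm(target_proj, axis=-1, keepdims=True)
    return np.mean(np.sum((p - z) ** 2, axis=-1))

def ema_update(target_params, online_params, tau=0.99):
    """The target network is an exponential moving average of the online
    network -- i.e. a slowly-updated 'previous version of itself'."""
    return [tau * t + (1.0 - tau) * o
            for t, o in zip(target_params, online_params)]
```

Gradients flow only through the online network; the target is updated purely by the EMA step, which is what removes the need for contrastive negatives.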
  

 
  Related  

Chris J. Maddison    @cjmaddison    4 hours      

Although we pitched this as compression, there's nice math in this paper that can be reused to understand representation learning and self-supervised learning. Don't forget to check out the appendices! Come visit us at Poster Session 2.
  

Russ Salakhutdinov    @rsalakhu    12/4/2021      

SEAL: Self-supervised Embodied Active Learning using Exploration and 3D Consistency: Closing the Action-Perception Loop: improving object detection / instance segmentation by simply moving around in the physical world. Paper: https://t.co/2uBcBd902V Web: https://t.co/V4OHEm3QZT
  

MosaicML    @MosaicML    12/4/2021      

Transfer learning has become a key tool in efficiently training deep neural networks. Typically, a network is pretrained using a large amount of data on a related supervised or self-supervised task. Then, the final few layers (the "head") are removed and a new head is added.(2/8)
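
The head-swap step described in the tweet above can be sketched as follows. This is a minimal NumPy illustration under our own assumptions; `LinearHead` and `swap_head` are hypothetical helper names, not MosaicML's API:

```python
import numpy as np

class LinearHead:
    """A freshly initialized classification head for the new task."""
    def __init__(self, in_dim, n_classes, rng):
        self.W = rng.standard_normal((in_dim, n_classes)) * 0.01
        self.b = np.zeros(n_classes)

    def __call__(self, features):
        return features @ self.W + self.b

def swap_head(pretrained_body, in_dim, n_classes, seed=0):
    """Keep the pretrained body, discard its old head, and attach a new
    randomly initialized head sized for the target task."""
    rng = np.random.default_rng(seed)
    return pretrained_body, LinearHead(in_dim, n_classes, rng)
```

In practice the body's features are reused as-is (or lightly fine-tuned), and only the new head starts from scratch.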
  

Devendra Chaplot    @dchaplot    12/3/2021      

Presenting SEAL: Self-supervised Embodied Active Learning! #NeurIPS2021 SEAL is a self-supervised framework to close the action-perception loop. It improves perception & action models by just moving in the physical world w/o any human supervision. https://t.co/QeAMcp99Cp 1/N
  

Yann LeCun    @ylecun    12/4/2021      

SEAL: Self-Supervised Embodied Active Learning from #MetaAI
  