Sergey Levine    @svlevine    11/24/2021      

Value Function Spaces (VFS) uses low-level primitives to form a state representation in terms of their "affordances": the value functions of the primitives serve as the state. This turns out to really improve generalization in hierarchical RL! https://t.co/yJqJwwCT6r Short 🧵
  
    25         154
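
A minimal sketch, in PyTorch, of the idea as described in the tweet: each low-level primitive comes with a value network, and the high-level policy acts only on the vector of those value estimates (the "affordance" representation). The class, network sizes, and toy usage below are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class ValueFunctionSpace(nn.Module):
    """State representation = the values K pretrained primitives assign to the
    observation; the high-level policy never sees the raw observation."""

    def __init__(self, primitive_value_fns: nn.ModuleList, num_primitives: int):
        super().__init__()
        self.primitive_value_fns = primitive_value_fns  # each maps obs -> scalar value
        self.high_level_policy = nn.Sequential(
            nn.Linear(num_primitives, 64), nn.ReLU(),
            nn.Linear(64, num_primitives),  # logits over which primitive to execute
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        # Build the K-dimensional "affordance" vector from the primitives' values.
        values = torch.stack(
            [vf(obs).squeeze(-1) for vf in self.primitive_value_fns], dim=-1
        )  # shape: (batch, K)
        return self.high_level_policy(values)

# Toy usage: random value networks stand in for pretrained primitives.
obs_dim, K = 32, 4
value_fns = nn.ModuleList(
    [nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, 1)) for _ in range(K)]
)
vfs = ValueFunctionSpace(value_fns, K)
logits = vfs(torch.randn(8, obs_dim))  # (8, K) scores over primitives
```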



Elias Najarro    @enasmel    6 hours      

"Tailoring: encoding inductive biases by optimising unsupervised objectives at prediction time" by @FerranAlet. Gist: add an extra loss function at inference time; the new loss "tailors" the NN parameters to improve the network's representation, and hence its performance. Paper: https://t.co/hc678ebSAl
  
          3
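
A minimal sketch of the tailoring idea as summarized above: copy the trained model, take a few gradient steps on an unsupervised loss evaluated at the test input, then predict with the adapted parameters. Using entropy minimization as the unsupervised objective, and the `tailored_predict` helper with its step count and learning rate, are assumptions for illustration rather than the paper's exact setup.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def tailored_predict(model: nn.Module, x: torch.Tensor,
                     steps: int = 3, lr: float = 1e-3) -> torch.Tensor:
    """Adapt a copy of `model` on an unsupervised loss at prediction time,
    then return its predictions on the same input."""
    adapted = copy.deepcopy(model)  # leave the trained model untouched
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        probs = F.softmax(adapted(x), dim=-1)
        # Label-free "tailoring" loss: entropy of the predictions on this input.
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
        entropy.backward()
        opt.step()
    with torch.no_grad():
        return adapted(x)

# Toy usage: a small classifier whose parameters are tailored per test batch.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 5))
logits = tailored_predict(model, torch.randn(4, 16))  # (4, 5) tailored logits
```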



Alex Nichol    @unixpickle    11/25/2021      

I loved learning math as a kid because I knew I'd be consuming well-thought-out abstractions. Math is full of really useful primitives developed over thousands of years, and I knew I was in good hands if I learned them.
  
          3



AK    @ak92501    11/30/2021      

Improving Zero-shot Generalization in Offline Reinforcement Learning using Generalized Similarity Functions abs: https://t.co/5B163MSLHN
  
          15



Demis Hassabis    @demishassabis    12/3/2021      

So I’m really proud of our latest @Nature paper https://t.co/L4UTPV39PE (also on the cover!) detailing our fantastic collaborations with Maths Profs Williamson, Lackenby, and Juhasz to make surprising and significant discoveries in topology and representation theory
  
    12         93