Sergey Levine    @svlevine    11/24/2021      

Value Function Spaces (VFS) uses low-level primitives to form a state representation in terms of their "affordances" - the value functions of the primitives serve as the state. This turns out to really improve generalization in hierarchical RL! Short 🧵
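A minimal sketch of the VFS idea as described above: the high-level state representation is the vector of value estimates of K low-level primitives at the current observation. The primitive value functions below are toy stand-ins, not the paper's learned models.

```python
import numpy as np

def primitive_values(obs, value_fns):
    """Map an observation to its VFS representation: one value per primitive."""
    return np.array([v(obs) for v in value_fns])

# Toy "value functions" - each scores the observation differently,
# standing in for e.g. a "reach" and a "grasp" primitive's critic.
value_fns = [
    lambda o: float(o.sum()),
    lambda o: float((o ** 2).sum()),
]

obs = np.array([0.5, -0.25])
state = primitive_values(obs, value_fns)  # 2-dim affordance vector
print(state)  # → [0.25   0.3125]
```

The high-level policy then operates on `state` instead of raw observations, which is what the thread credits for the generalization gains.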


Elias Najarro    @enasmel    6 hours      

"Tailoring: encoding inductive biases by optimising unsupervised objectives at prediction time" by @FerranAlet Gist: Add extra loss function during inference. New loss "tailors" the NN parameters to improve NN representation and hence performance. Paper:
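The mechanism in the tweet - adapt the network's parameters to each test input by optimizing an unsupervised loss at prediction time - can be sketched as follows. This is an illustrative toy (linear model, hand-derived gradient), not the paper's implementation; `tailor_predict` and the objective are assumptions.

```python
import numpy as np

def tailor_predict(w, x, unsup_grad, steps=5, lr=0.1):
    """Adapt a copy of parameters w to input x via an unsupervised loss,
    then predict with the tailored parameters."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * unsup_grad(w, x)  # inner-loop adaptation at test time
    return w @ x

# Toy unsupervised objective 0.5 * (w @ x)^2; its gradient w.r.t. w
# is (w @ x) * x. A real tailoring loss would encode an inductive bias
# such as consistency or conservation constraints.
unsup_grad = lambda w, x: (w @ x) * x

w0 = np.array([1.0, 2.0])
x = np.array([1.0, 1.0])
y = tailor_predict(w0, x, unsup_grad)
```

Note the base parameters `w0` are untouched: tailoring happens per query, so each test input gets its own briefly adapted parameters.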

Alex Nichol    @unixpickle    11/25/2021      

I loved learning math as a kid because I knew I'd be consuming well thought-out abstractions. Math is full of really useful primitives developed over thousands of years, and I knew I was in good hands if I learned them.

AK    @ak92501    11/30/2021      

Improving Zero-shot Generalization in Offline Reinforcement Learning using Generalized Similarity Functions abs:

Demis Hassabis    @demishassabis    12/3/2021      

So I’m really proud of our latest @Nature paper (also on the cover!) detailing our fantastic collaborations with Maths Profs Williamson, Lackenby, and Juhasz to make surprising and significant discoveries in topology and representation theory