Alfredo Canziani    @alfcnz    11/24/2021      

Boltzmann machines are stochastic Hopfield nets with hidden units and can be used to learn the regularities of our data. The added noise allows the model to climb energy walls and land at wider, lower minima. Restricted BMs allow us to speed up inference and training…
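As a rough illustration of that "restricted" speed-up, here is a minimal sketch of a binary RBM trained with one step of contrastive divergence (CD-1). The layer sizes, learning rate, and toy input are assumptions for illustration, not details from the thread.

```python
import torch

# Hypothetical sizes: 784 visible units (e.g. flattened MNIST pixels), 128 hidden units.
n_vis, n_hid, lr = 784, 128, 1e-2
W = torch.randn(n_vis, n_hid) * 0.01   # visible-to-hidden weights
b_v = torch.zeros(n_vis)               # visible bias
b_h = torch.zeros(n_hid)               # hidden bias

def sample_h(v):
    """Sample hidden units given visible ones (no v-v or h-h connections, hence 'restricted')."""
    p = torch.sigmoid(v @ W + b_h)
    return p, torch.bernoulli(p)

def sample_v(h):
    """Sample visible units given hidden ones."""
    p = torch.sigmoid(h @ W.t() + b_v)
    return p, torch.bernoulli(p)

def cd1_step(v0):
    """One contrastive-divergence (CD-1) update on a batch of binary inputs."""
    ph0, h0 = sample_h(v0)          # positive phase
    pv1, v1 = sample_v(h0)          # one Gibbs step down...
    ph1, _  = sample_h(v1)          # ...and back up (negative phase)
    batch = v0.shape[0]
    W.add_(lr * (v0.t() @ ph0 - v1.t() @ ph1) / batch)
    b_v.add_(lr * (v0 - v1).mean(0))
    b_h.add_(lr * (ph0 - ph1).mean(0))

# Toy usage on a random binary batch.
cd1_step(torch.bernoulli(torch.rand(32, n_vis)))
```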
  

Alfredo Canziani    @alfcnz    11/17/2021      

But… where do these Hopfield nets come from?? 🧐 A spin glass 🔄🍸 is characterised by a metastable magnetic 🧲 config that can be used to encode “memories” 💭. The Ising model tells us how the magnetic moments 🧭 combine to give us an energy 🔋 that admits multiple minima 📉. https://t.co/asqeRrgF6x
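For concreteness, a hedged sketch of the Hopfield-style energy this describes: Hebbian weights store a few ±1 spin patterns, the energy is E(s) = -½ sᵀWs, and asynchronous sign updates only ever keep or lower the energy until the state settles into one of the stored "memories". The pattern count and network size are illustrative assumptions.

```python
import torch

# Illustrative setup: store a few random ±1 patterns in a Hopfield net.
n_units, n_patterns = 100, 3
patterns = torch.randint(0, 2, (n_patterns, n_units)).float() * 2 - 1

# Hebbian rule: W_ij = (1/N) * sum_mu x_i^mu x_j^mu, with no self-connections.
W = (patterns.t() @ patterns) / n_units
W.fill_diagonal_(0)

def energy(s):
    """Ising/Hopfield energy E(s) = -1/2 * s^T W s (zero external field)."""
    return -0.5 * s @ W @ s

def recall(s, n_sweeps=10):
    """Asynchronous updates; each flip can only keep the energy equal or lower it."""
    s = s.clone()
    for _ in range(n_sweeps):
        for i in torch.randperm(len(s)):
            s[i] = torch.sign(W[i] @ s)
            if s[i] == 0:
                s[i] = 1.0
    return s

flip = 1 - 2 * (torch.rand(n_units) < 0.1).float()     # flip ~10% of the spins
noisy = patterns[0] * flip
print(energy(noisy).item(), energy(recall(noisy)).item())  # energy decreases toward a minimum
```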
  



MosaicML    @MosaicML    12/2/2021      

Happy December! Today, we're looking back at Stochastic Weight Averaging (SWA), now a classic ML efficiency win! SWA is a simple method for improving accuracy with no increase in training time. It is built into fastai, PyTorch, PyTorch Lightning (PTL), and our Composer https://t.co/ky395uD1vH. (1/12)
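For reference, a minimal sketch of SWA using PyTorch's built-in torch.optim.swa_utils; the toy model, data, learning rates, and the epoch at which averaging starts are placeholder assumptions, not MosaicML's settings.

```python
import torch
from torch.optim.swa_utils import AveragedModel, SWALR, update_bn

model = torch.nn.Linear(10, 2)                     # placeholder model
loader = [(torch.randn(32, 10), torch.randint(0, 2, (32,))) for _ in range(20)]  # toy data
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

swa_model = AveragedModel(model)                   # running average of the weights
swa_scheduler = SWALR(optimizer, swa_lr=0.05)      # constant SWA learning rate
swa_start = 75                                     # assumed switch point: start averaging late in training

for epoch in range(100):
    for x, y in loader:
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    if epoch >= swa_start:
        swa_model.update_parameters(model)         # fold current weights into the average
        swa_scheduler.step()
    else:
        scheduler.step()

# Recompute BatchNorm statistics for the averaged model (a no-op here, needed when BN layers exist).
update_bn(loader, swa_model)
```

The averaging itself adds essentially no cost on top of ordinary training, which is the "no increase in training time" point the tweet makes.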
  



Dan Roy    @roydanroy    12/2/2021      

I've got energy for research, got energy for teaching, even got energy for service, but I need suggestions for how not to plop down in front of Netflix/Apple TV/YouTube at night instead of taking care of the nonacademic tasks piling up around me.
  



TheSequence    @TheSequenceAI    11/30/2021      

Modern deep neural networks are large and require incredibly large training datasets, so the traditional sequential approach is simply impractical. But we can use parallel training. The idea of parallelizable training is intuitive but hard to achieve in practice. (1/2)
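One common way to parallelize training is data parallelism: every worker holds a replica of the model, computes gradients on its own shard of the batch, and the gradients are averaged before each update. Below is a hedged single-machine sketch with PyTorch DistributedDataParallel; the toy model, data, and world size are assumptions for illustration.

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank, world_size):
    # Each process trains one model replica on its own shard of the data.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(10, 2))            # placeholder model, CPU replicas
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for _ in range(10):                            # toy local shard
        x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()                            # DDP all-reduces (averages) gradients here
        optimizer.step()                           # every replica applies the same update

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = 2                                 # assumed number of workers
    mp.spawn(worker, args=(world_size,), nprocs=world_size)
```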
  