Cambridge MLG    @CambridgeMLG    11/25/2021      

Check out this new post on our blog by our PhD student @siddharthswar about scaling up natural-gradient VI to large models and datasets!
  
  Related  

Joaquin Vanschoren    @joavanschoren    8 hours      

The #NeurIPS Datasets and Benchmarks Track #NeurIPS2021DandB is off to a great start: 8 orals on bias in data and models, responsible data usage, the pervasiveness of label errors, human-in-the-loop evaluation, physical simulation, and great new clinical datasets. @NeurIPSConf
  



AK    @ak92501    11/29/2021      

arxiv: https://t.co/ga00IjnZ6z By co-training PolyViT on a single modality, they achieved SOTA results on three video and two audio datasets, while reducing the total number of parameters linearly compared to single-task models.
  



Hugging Face    @huggingface    10/4/2021      

The @PyTorch-based pipelines in 🤗 Transformers now support native torch datasets. GPUs were often underutilized (30-50%). Pipelines now automatically use torch's `DataLoader` when possible, leading to much better GPU utilization (90+% on most models)! 🤯
  
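The speedup in the tweet above rests on torch's `DataLoader` batching examples before they reach the model, instead of feeding them one at a time. A minimal standalone sketch of that mechanism (no transformers dependency; the toy dataset, batch size, and padding collate function are illustrative, not the library's actual internals):

```python
# Sketch: batching a torch Dataset with DataLoader so the model sees
# dense padded batches rather than one example per forward pass.
import torch
from torch.utils.data import DataLoader, Dataset


class ToyTextDataset(Dataset):
    """Yields variable-length 'token id' tensors, like a tokenized corpus."""

    def __init__(self, lengths):
        self.items = [torch.arange(n) for n in lengths]

    def __len__(self):
        return len(self.items)

    def __getitem__(self, i):
        return self.items[i]


def pad_collate(batch):
    # Pad each batch to its longest sequence so the GPU gets one dense tensor.
    max_len = max(t.numel() for t in batch)
    out = torch.zeros(len(batch), max_len, dtype=torch.long)
    for row, t in zip(out, batch):
        row[: t.numel()] = t
    return out


loader = DataLoader(ToyTextDataset([3, 5, 2, 4]), batch_size=2, collate_fn=pad_collate)
shapes = [tuple(b.shape) for b in loader]
print(shapes)  # [(2, 5), (2, 4)]
```

Because each batch is one contiguous tensor, the GPU does one kernel launch per batch instead of per example, which is where the utilization gain comes from.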



Thomas Wolf    @Thom_Wolf    11/25/2021      

I see people commenting that this is either an unfortunate reality or better than releasing nothing. I disagree: I think we can find ways to share code/datasets/models. I also tend to think industry papers without any of these are closer to press releases than science reports.
  



Thomas Wolf    @Thom_Wolf    8/19/2021      

A few years ago I was mostly interested in models: creating 🤗transformers, adding BERT, GPT, T5… Over time I’ve seen my interests shift to data (sharing, evaluation, processing), leading to 🤗datasets. And I see many people around me follow a similar path. We are slowly maturing.
  