Robocrunch AI

Hugging Face    @huggingface   ·   9/14/2021
📢 Introducing 🤗 Optimum: a new open-source library to optimize 🤗 Transformers for production performance. 🏎 Quantize, prune, and optimize models easily, targeting hardware from our partners @intel @graphcoreai @Qualcomm! 🤩
Retweets: 101 · Likes: 372
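Quantization, one of the optimizations the tweet names, is easy to illustrate in isolation. The sketch below is not the Optimum API: it is a minimal pure-Python illustration of post-training affine int8 quantization, with a toy weight list standing in for real model tensors.

```python
# Minimal sketch of affine (asymmetric) int8 quantization: map floats onto
# the integer range [0, 255] via a scale and zero-point, then map back.
# Real toolkits apply this per-tensor or per-channel inside the model graph.

def quantize(values, num_bits=8):
    """Map floats to ints in [0, 2**num_bits - 1] with a scale and zero-point."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # avoid zero scale for constant input
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale + zero_point))) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized ints."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.5, -0.2, 0.0, 0.7, 1.5]
q, s, z = quantize(weights)
restored = dequantize(q, s, z)
# Round-trip error is bounded by the quantization step (the scale).
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

The memory win is the point: each float becomes one byte, at the cost of a bounded round-trip error of at most one quantization step.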

  Similar Tweets  

Hugging Face    @huggingface   ·   9/15/2021
We are excited to team up with @graphcoreai to make training of 🤗 Transformers on cutting-edge IPUs super easy! With our new library 🤗 Optimum, using IPUs will become plug-and-play. Read all about it in this guest blog post from the Graphcore team:
Retweets: 7 · Likes: 15

Sara Hooker    @sarahookr   ·   9/10/2021
+ The overfitting of hardware to a small list of open source models: "This is also why you shouldn’t read too much into MLPerf’s results. A popular model running really fast on a type of hardware doesn’t mean an arbitrary model will run really fast on that hardware."
Retweets: 1 · Likes: 5

Nils Reimers    @Nils_Reimers   ·   6/3/2021
Small & Fast Models 🏎️💨 We added several small & fast models, for optimal encoding speed on GPU & CPU. Multi-Lingual Models 🇺🇳 Multi-lingual models for 50+ languages are available. They achieve by far the best performance across all available multilingual models for many tasks.
Retweets: 2 · Likes: 21

Nils Reimers    @Nils_Reimers   ·   6/28/2021
Really happy about the launch of Sentence Transformers v2. All models are now hosted on the HF models hub. This makes sharing & finding your custom sentence embedding models extremely easy. Plus: You can directly interact with these models on the hub.
Retweets: 18 · Likes: 95

Nils Reimers    @Nils_Reimers   ·   7/19/2021
Have a look at this nice library showing how to use sentence embedding models for topic modeling. Works for many languages and also for zero-shot cross-lingual settings. Can easily be used with any sentence-transformers model.
Retweets: 23 · Likes: 94
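The embed-then-cluster recipe behind such libraries can be sketched without any of their dependencies. Below, toy 2-D vectors stand in for sentence-transformer embeddings, and a tiny k-means with deterministic initialization (an assumption for the sketch) stands in for the library's real clustering; each resulting cluster plays the role of a topic.

```python
# Sketch of embedding-based topic modeling: documents become vectors,
# vectors are clustered, and each cluster is treated as a "topic".
import math

def kmeans(points, k, iters=20):
    # Deterministic initialization for the sketch: spread centers across the input.
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        centers = [  # recompute each center as its cluster mean
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    return centers, clusters

# Two obvious "topics": document embeddings near (0, 0) and near (10, 10).
docs = [(0.1, 0.2), (0.0, -0.1), (0.2, 0.0), (9.9, 10.1), (10.2, 9.8), (10.0, 10.0)]
centers, clusters = kmeans(docs, k=2)
```

With real sentence embeddings the vectors have hundreds of dimensions and the cluster count is usually discovered rather than fixed, but the pipeline shape is the same.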

Jerome Pesenti    @an_open_mind   ·   7/17/2021
BlenderBot 2.0 improves on the fixed and "goldfish" memory of existing language generation models. Still not safe enough for production, but a great advance for #conversationalAI that we open source for the community to make progress in that direction
Retweets: 13 · Likes: 74

Patrick von Platen    @PatrickPlaten   ·   8/28/2021
ASR models trained on read-out audio alone are often not practical for real-world usage. Robust Wav2Vec2 by @facebookai now shows how more diverse pretraining data leads to more robust ASR models🔥 Available on 🤗 Transformers & Hub:
Retweets: 59 · Likes: 241

Sara Hooker    @sarahookr   ·   9/10/2021
fantastic blog post by @chipro "With so many new offerings for hardware to run ML models on, one question arises: how do we make a model built with an arbitrary framework run on arbitrary hardware?"
Retweets: 2 · Likes: 12

Daniel Levy    @daniellevy__   ·   9/14/2021
Great collaboration with @violet_zct et al. on DRO for multilingual translation! Two key ideas:
- surprisingly, the robust objective improves performance on *every* language pair (vs ERM)
- tailoring the optimization algorithm to the architecture (here transformers) matters a lot
Retweets: 2 · Likes: 6
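The robust objective can be sketched abstractly. Assuming the usual group-DRO formulation (a min-max over per-group losses, with group weights updated by exponentiated gradients), a toy version with made-up per-language-pair losses looks like:

```python
# Group DRO sketch: instead of averaging losses over groups (ERM),
# maintain a weight per group and up-weight the worst-performing ones.
import math

def dro_step(group_losses, weights, eta=0.5):
    """One exponentiated-gradient step on the group weights, then the robust loss."""
    new_w = [w * math.exp(eta * loss) for w, loss in zip(weights, group_losses)]
    total = sum(new_w)
    new_w = [w / total for w in new_w]  # normalize back onto the simplex
    robust_loss = sum(w * loss for w, loss in zip(new_w, group_losses))
    return robust_loss, new_w

# Three "language pairs"; the second is hardest (made-up numbers).
losses = [0.4, 1.2, 0.6]
weights = [1 / 3] * 3
for _ in range(10):
    robust, weights = dro_step(losses, weights)
# The weights concentrate on the hardest pair, so the robust loss
# approaches the worst-group loss rather than the ERM average.
```

In the real setting the model parameters are also updated against this weighted loss each step; only the weight dynamics are shown here.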

Google AI    @GoogleAI   ·   9/16/2021
As #NeuralNetwork models and training data size grow, training efficiency has become more important. Today we present two families of models for image recognition that train faster and achieve state-of-the-art performance. Learn more and grab the code
Retweets: 99 · Likes: 388

Christopher Manning    @chrmanning   ·   9/12/2021
people use an easily measurable objective in place of what is good. Maybe they’re right that an optimization mindset tends to focus on process rather than goals. This demands refocusing on goals and defining a good, multifaceted objective function before you start to optimize.
Retweets: 1 · Likes: 5

Hugging Face    @huggingface   ·   8/19/2021
20,000+ machine learning models connected to 3,000+ apps? Hugging Face meets Zapier! 🤯🤯🤯 With the Hugging Face API, you can now easily connect models right into apps like Gmail, Slack, Twitter, and more: [1/2]
Retweets: 89 · Likes: 469

François Chollet    @fchollet   ·   9/14/2021
Introducing TensorFlow Similarity: a toolbox to make it easy and fast to train similarity models.
Retweets: 3 · Likes: 33
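Independent of the TensorFlow Similarity API itself, the retrieval step a trained similarity model enables is easy to sketch: embed items, then rank candidates by cosine similarity. Toy hand-written vectors stand in for learned embeddings here.

```python
# Nearest-neighbor retrieval over embeddings by cosine similarity -
# the inference-time use of a trained similarity model.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest(query, index):
    """Return the key in `index` whose embedding is most similar to `query`."""
    return max(index, key=lambda k: cosine(query, index[k]))

# Toy 3-D embeddings standing in for learned ones.
index = {"cat": (1.0, 0.1, 0.0), "car": (0.0, 1.0, 0.2), "cap": (0.9, 0.2, 0.1)}
best = nearest((1.0, 0.0, 0.0), index)
```

Real systems swap the linear scan for an approximate nearest-neighbor index once the collection grows, but the ranking criterion is the same.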

TensorFlow    @TensorFlow   ·   6/30/2021
🏆⚡️ The latest MLPerf Benchmark results are out and Google's #TPU v4 has set new performance records! Now you can train some of the most common ML models in seconds. Learn more
Retweets: 42 · Likes: 175

  Relevant People  

Hugging Face
The AI community building the future. #BlackLivesMatter #stopasianhate

Nils Reimers
NLP researcher at @huggingface • Creator of SBERT

Sara Hooker
Research @ Google Brain, model compression, robustness + interpretability. @trustworthy_ml Founder of data for good non-profit @deltanalytics.

Niels Rogge
ML Engineer @huggingface. @KU_Leuven grad. General interest in machine learning, deep learning, NLP. Making AI more accessible for everyone!

PyTorch Lightning
The lightweight PyTorch AI research framework. Scale your models, not the boilerplate! Use our platform @gridai_ to scale models from your laptop to the cloud.

TensorFlow
TensorFlow is a fast, flexible, and scalable open-source machine learning library for research and production.

Google AI
Google AI is focused on bringing the benefits of AI to everyone. In conducting and applying our research, we advance the state-of-the-art in many domains.