Robocrunch AI

Tom Goldstein    @tomgoldsteincs     9/14/2021
A dragonfly responds to prey in 50 milliseconds using a neural circuit that is about 3 layers deep (10ms per layer, plus time for eyes and muscles to respond). That right there is some fancy network pruning.
Retweets: 2 · Likes: 7
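
The arithmetic behind the claim is worth spelling out: three layers at roughly 10 ms each accounts for 30 ms, leaving about 20 ms of the 50 ms budget for sensing and actuation. A quick sketch (numbers taken from the tweet, nothing more):

```python
# Back-of-envelope latency budget for the dragonfly claim above
# (a sketch of the tweet's arithmetic, not a neuroscience model):
reaction_ms = 50          # total observed response time
layers = 3                # approximate depth of the circuit
per_layer_ms = 10         # claimed per-layer latency

processing_ms = layers * per_layer_ms          # 30 ms of neural processing
sensorimotor_ms = reaction_ms - processing_ms  # ~20 ms left for eyes and muscles
```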

  Similar Tweets  

Arash Vahdat    @ArashVahdat     7/23/2021
📢 HANT: Hardware-Aware Network Transformation. How can we accelerate a trained network to meet efficiency requirements for deployment? Current network compression methods such as pruning, kernel fusion & quantization address this without changing the underlying network operations.
Retweets: 10 · Likes: 64
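
Of the compression methods named, magnitude pruning is the easiest to show in miniature. A sketch of the generic technique (not HANT itself; the function name and numbers are illustrative):

```python
import numpy as np

# Minimal magnitude-pruning sketch (assumption: this illustrates the
# generic technique named in the tweet, not the HANT method): zero out
# the weights with the smallest absolute values, leaving the layer's
# shape and operations unchanged.
def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    k = int(weights.size * sparsity)                    # how many weights to drop
    threshold = np.sort(np.abs(weights), axis=None)[k]  # k-th smallest magnitude
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

W = np.array([[0.9, -0.01],
              [0.05, -0.8]])
magnitude_prune(W, 0.5)  # keeps 0.9 and -0.8, zeroes the two small weights
```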

Brandon Rohrer    @_brohrer_     8/31/2021
At the office: This meeting could have been an email. In production: This deep neural network could have been k-nearest neighbors.
Retweets: 113 · Likes: 1172
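
The punchline works because a k-nearest-neighbors baseline really is a few lines. A minimal sketch (plain NumPy with Euclidean distance; the toy data is illustrative):

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    dists = np.linalg.norm(train_X - query, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]                  # indices of the k closest
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]                 # most common nearby label

# Toy data: two tight clusters with labels 0 and 1.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, 0, 1, 1])
knn_predict(X, y, np.array([0.05, 0.1]), k=3)  # → 0
```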

Jonathan Frankle    @jefrankle     5/18/2021
NEW WORKSHOP: Sparsity in Neural Networks: Advancing Understanding and Practice (July 8-9, 2021). This workshop will bring together members of the many communities working on neural network sparsity to share their perspectives and the latest cutting-edge research (Deadline: 6/15)
Retweets: 89 · Likes: 349

Pin-Yu Chen    @pinyuchenTW     6/3/2021
Very interesting work that studies the security of neural networks through the lens of internal representations at different layers and their roles
Retweets: 2 · Likes: 20

PyTorch Best Practices    @PyTorchPractice     9/12/2021
Neural network based solvers for partial differential equations and inverse problems 🌌. Implementation of physics-informed neural networks in PyTorch. #deeplearning #machinelearning #ml #ai #neuralnetworks #datascience #PyTorch
Retweets: 2 · Likes: 2
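
For readers unfamiliar with the idea, a physics-informed solver turns the differential equation itself into the training loss. A toy, dependency-light sketch (pure NumPy, with grid search standing in for gradient descent; the ansatz and ODE are illustrative assumptions, not the linked implementation):

```python
import numpy as np

# Toy "physics-informed" fit (assumption: this illustrates the PINN idea,
# not the implementation in the tweet): find the parameter a in the
# ansatz u(x) = exp(a * x) that minimizes the ODE residual
#     u'(x) + u(x) = 0,  u(0) = 1,
# whose exact solution is u = exp(-x), i.e. a = -1.
xs = np.linspace(0.0, 1.0, 50)

def residual_loss(a):
    u = np.exp(a * xs)
    du = a * u                     # analytic derivative of the ansatz
    return np.mean((du + u) ** 2)  # mean squared ODE residual

# Grid search over a; a real PINN would use autodiff and an optimizer.
best_a = min(np.arange(-2.0, 0.0, 0.01), key=residual_loss)
# best_a ≈ -1.0
```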

Simone Scardapane    @s_scardapane     6/18/2021
Twitter friends, time for suggestions! I have a neural network course that I teach with slides (Beamer) & notebooks. I'd like to innovate a little, but I am unsure about nice teaching tools that have math & code support & are collaborative. Any ideas? Am I too old?
Retweets: 1 · Likes: 16

Reza Zadeh    @Reza_Zadeh     9/15/2021
The Transformer is the most important neural network architecture progression of the past 9 years. Jakob Uszkoreit, its co-creator, will be presenting at ScaledML:
Retweets: 2 · Likes: 9

Bipin Krishnan    @bkrish_     6/1/2021
PyTorch reproducibility tip 💡 Is your neural network giving different results on different runs with the same hyperparameters❓ Solution: call this function before executing any other part of your code 👇
Retweets: 1 · Likes: 5
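
The function itself was in the tweet's screenshot and is not shown here; what follows is a typical seed-everything helper of the kind such tips describe (an assumption, with the PyTorch calls guarded so the sketch also runs without torch installed):

```python
import os
import random

import numpy as np

def seed_everything(seed: int = 42) -> None:
    """Seed all common RNG sources so repeated runs match.

    Sketch of a typical reproducibility helper; the exact function the
    tweet refers to is not shown on this page.
    """
    random.seed(seed)                         # Python's built-in RNG
    np.random.seed(seed)                      # NumPy RNG
    os.environ["PYTHONHASHSEED"] = str(seed)  # hash randomization
    try:
        import torch                          # PyTorch, if installed
        torch.manual_seed(seed)               # CPU (and default CUDA) RNG
        torch.cuda.manual_seed_all(seed)      # all GPU RNGs
        torch.backends.cudnn.deterministic = True  # deterministic conv kernels
        torch.backends.cudnn.benchmark = False     # disable autotuner nondeterminism
    except ImportError:
        pass
```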

Zach Mueller    @TheZachMueller     6/1/2021
fasterai, a @fastdotai library aimed at network compression via sparse neural training (and knowledge distillation)
Retweets: 24 · Likes: 119
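
Knowledge distillation, one of the two techniques named, reduces to a simple loss: the student matches the teacher's temperature-softened output distribution. A generic Hinton-style sketch (an assumption about the flavor of distillation meant; this is not fasterai's API):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax; higher T spreads probability mass."""
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=4.0):
    """KL divergence from teacher's soft targets to student's predictions."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))  # KL(p || q)

t = np.array([5.0, 1.0, 0.1])
distillation_loss(t, t)  # → 0.0 (identical logits give zero loss)
```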

David Pfau    @pfau     9/11/2021
"Can we approximate any intelligence with a sufficiently big neural network" is the new "can we approximate any function with a sufficiently big polynomial"
Retweets: 1 · Likes: 16

Jonathan Frankle    @jefrankle     7/20/2021
We found a scaling law that describes the error of entire families of pruned neural networks. For the night owls among you, check out our work "On the Predictability of Pruning Across Scales" at ICML (tonight, 11pm-2am Eastern). Led by @jonsrosenfeld!
Retweets: 17 · Likes: 114

Neuromorphic Computing and Engineering    @IOPneuromorphic     8/19/2021
Our Focus Issue on Extreme Edge Computing will be kicked off by this Accepted Manuscript from @_ArunAjayan and @apjsengnu (@iiitmk) - Edge to Quantum: Hybrid Quantum-Spiking Neural Network Image Classifier. Read it for free at
Retweets: 11 · Likes: 16

Bosch Center for Artificial Intelligence    @Bosch_AI     6/21/2021
In this #CVPR21 paper, we introduce a differentiable Similarity Guided Sampling (SGS) module, which can be plugged into any existing 3D Convolutional Neural Network (CNN) architecture. More on 📰 #computervision #AIresearch #BCAI
Retweets: 2 · Likes: 26

Cees Snoek    @cgmsnoek     8/20/2021
#ICCV2021 cam ready: "Motion-Augmented Self-Training for Video Recognition at Smaller Scale" w/ Kirill Gavrilyuk, Mihir Jain, @ikdeepl now available: TL;DR self-train a convolutional neural network using optical flow, but avoid flow during inference. 1/n
Retweets: 7 · Likes: 51

  Relevant People  

Tom Goldstein
Associate Professor at Maryland. I want my group to do theory and scientific computing but my students don’t listen to me so I guess we do deep learning 🤷‍♂️

Jonathan Frankle
PhD student: science of deep learning, tech policy @MIT_CSAIL. Research lead: stealth. Current focus: hypothesizing about lottery tickets. Cover: @RobertTLange

Brandon Rohrer
Getting answers from data. he/him

Arash Vahdat
Senior research scientist @nvidia research, working on generative learning, efficient neural nets, representation learning @NvidiaAI, previously @dwavesys, @SFU

Simone Scardapane
I fall in love with a new #machinelearning topic every month 🙄 Researcher @SapienzaRoma | Chairman @iaml_it | Co-host @SmarterPodcast | @GoogleDevExpert

PyTorch Best Practices
PyTorch Best Practices @ #deeplearning #machinelearning #pytorch #ml #ai #neuralnetworks

Reza Zadeh
ML at Matroid & Stanford. CEO @Matroid. Adjunct Professor @Stanford. Technical Advisor @Databricks. Creator. Private Aerobatic Pilot.