Robocrunch AI

Paul Kedrosky    @pkedrosky   ·   9/15/2021
Air pollution is the second largest cause of lung cancer worldwide, behind only smoking. I bang on here endlessly about PM2.5 and its health consequences, so a presentation on the topic today at the World Conference on Lung Cancer got my attention.
3 retweets · 30 likes

  Similar Tweets  

Sifted    @Siftedeu   ·   9/13/2021
London-based @getnuman has secured a $40m Series B, Europe's largest raise in the men's health space to date. The startup now aims to remove the stigma around conditions considered 'taboo', such as erectile dysfunction and hair loss. https://t.co/w16JJqQRLD
2 retweets · 6 likes


Thomas Kipf    @thomaskipf   ·   7/19/2021
Excited to see how far Slot Attention (+ compositional NeRF) can be pushed for object-centric modeling of 3D scenes! This work also introduces "background-aware Slot Attention" which fixes a common issue of backgrounds being split across multiple slots.
5 retweets · 52 likes



Simone Scardapane    @s_scardapane   ·   6/3/2021
*How Attentive are Graph Attention Networks?* If you have ever used GATs, this is an unmissable paper by Brody, @urialon1, and @yahave 👇 They show that the standard GAT formulation suffers from a significant limitation, which is easily solved by modifying the attention mechanism. 1/2
14 retweets · 43 likes
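The limitation the paper identifies can be seen in a few lines: in the original GAT, the score is a monotone transform of a fixed linear form, so every query node ranks its neighbors identically ("static" attention); GATv2 moves the attention vector outside the nonlinearity, making the ranking query-dependent ("dynamic"). A minimal NumPy sketch, with randomly initialized weights standing in for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, n = 4, 8, 6
nodes = rng.normal(size=(n, d_in))  # toy node features

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

# GAT: LeakyReLU is applied *after* the attention dot product, so the
# score is monotone in a term that depends only on the neighbor j.
W = rng.normal(size=(d_out, d_in))
a_src = rng.normal(size=d_out)
a_dst = rng.normal(size=d_out)

def gat_score(h_i, h_j):
    return leaky_relu(a_src @ (W @ h_i) + a_dst @ (W @ h_j))

# GATv2: the attention vector is applied *after* the nonlinearity,
# so different query nodes can rank the same neighbors differently.
W2 = rng.normal(size=(d_out, 2 * d_in))
a2 = rng.normal(size=d_out)

def gatv2_score(h_i, h_j):
    return a2 @ leaky_relu(W2 @ np.concatenate([h_i, h_j]))
```

With `gat_score`, sorting neighbors by score gives the same order for every query node; with `gatv2_score` it generally does not, which is the paper's central observation.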



Andrew Davison    @AjdDavison   ·   6/1/2021
An attention mechanism can be used within RL to tackle multi-stage tasks from few demos. Attention gives a focused volume crop around the current point of interest, which flicks around and reminds me of eye tracking experiments on humans. Dyson Robotics Lab at Imperial College.
3 retweets · 40 likes
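The crop itself is simple to picture: a fixed-size window is extracted around the current point of interest, which moves between task stages. A hypothetical NumPy helper for illustration (not the paper's actual implementation):

```python
import numpy as np

def attention_crop(volume, center, size):
    """Extract a size^3 cube centered on `center` from a 3D volume,
    zero-padding at the boundaries so the crop shape is always fixed.
    Hypothetical helper, not the paper's implementation."""
    half = size // 2
    padded = np.pad(volume, half)  # zero-pad every axis by `half`
    z, y, x = center
    # In padded coordinates the centered window starts at the original index.
    return padded[z:z + size, y:y + size, x:x + size]
```

The fixed output shape is the point: the policy always sees a small, constant-size input around the attended location rather than the full observation volume.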



Elias Najarro    @enasmel   ·   9/8/2021
The ALife presentation (by @SudhakaranShyam) of our 3D Neural Cellular Automata paper is now available. We show how to build functional 3D morphogenesis models with NCAs.
1 retweet · 3 likes



Tom Goldstein    @tomgoldsteincs   ·   6/12/2021
Code is now available for SAINT - attention networks for tabular data. https://t.co/1IcjHJWCQr
8 retweets · 37 likes



Hyunjik Kim    @hyunjik11   ·   6/17/2021
We apply self-attention to this set of features, using both the features and the group distance between each pair to compute the attention weights. This attention mechanism is used to replace self-attention in the Transformer architecture, giving an equivariant Transformer.
1 retweet · 6 likes
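As a rough sketch of the idea: the attention logits combine a content term (query-key dot product) with a term that depends on the pairwise group distance. Here the distance bias is an illustrative fixed penalty rather than the paper's learned function, and all weights are random stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5, 8
X = rng.normal(size=(n, d))          # per-element features
D = np.abs(rng.normal(size=(n, n)))  # stand-in pairwise "group distances"
D = (D + D.T) / 2                    # make the distances symmetric

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def distance_biased_attention(X, D, Wq, Wk, Wv, beta=1.0):
    """Self-attention whose logits mix content similarity with a
    pairwise-distance term; -beta * D is an illustrative choice of bias."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    logits = Q @ K.T / np.sqrt(d) - beta * D     # content + distance term
    logits -= logits.max(axis=1, keepdims=True)  # numerically stable softmax
    A = np.exp(logits)
    A /= A.sum(axis=1, keepdims=True)
    return A, A @ V

A, out = distance_biased_attention(X, D, Wq, Wk, Wv)
```

Because the distance term enters before the softmax, nearby pairs (small `D`) receive proportionally more attention mass, while the output retains the usual weighted-sum-of-values form and can drop into a Transformer layer in place of standard self-attention.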



Quanta Magazine    @QuantaMagazine   ·   9/5/2021
When both rails of DNA’s helical ladder are cut at the same position, genetic damage associated with cancer, neurodegeneration and aging can follow. Yet these breaks can also be constructive for immune and neural processes. https://t.co/YapTWpzBog
16 retweets · 55 likes



Chris Camillo    @ChrisCamillo   ·   9/5/2021
Love these people and the years we spent trying to convince Wall Street to pay attention to social data. Missing @jordan_mclain
2 retweets · 92 likes



Intel AI    @IntelAI   ·   6/10/2021
CTO of @JohnSnowLabs, @DavidTalby, explains how Natural Language Processing can spark breakthroughs in medical #AI. Watch the full presentation here. https://t.co/A8JdTJ9pzi
4 retweets · 12 likes


Nandan Thakur    @Nthakur20   ·   6/7/2021
A small reminder: the poster presentation starts in 30 minutes. If you are interested and attending #NAACL2021, feel free to join today and chat about #NLProc topics such as semantics or efficient sentence-level embeddings with SBERT.
4 retweets · 15 likes



François Chollet    @fchollet   ·   9/14/2021
New tutorial on https://t.co/m6mT8SrKDD: Graph Attention Networks. https://t.co/SBW1Var12l
17 retweets · 83 likes



Joel Simon    @_joelsimon   ·   9/14/2021
Very cool to see data-free / data-divergent methods getting more attention! My Dimensions of Dialogue is included too :)
2 retweets · 3 likes



Stanford HAI    @StanfordHAI   ·   8/24/2021
GPT-3, known for its scale and sophistication, is already being used for downstream applications such as reading and summarizing news articles. That could lead to serious consequences if bias in these models isn't remedied. https://t.co/Ejo4jpDICI
9 retweets · 26 likes

  Relevant People  

Paul Kedrosky
Ancient amateur. Tweets self-destruct. Partner at https://t.co/9bw1gCSeRS. Proprietor of @highwaydebris. Mailing list / audio: https://t.co/m8RyQMmZ0M / https://t.co/6VsMP9cLap

François Chollet
Deep learning @google. Creator of Keras. Author of 'Deep Learning with Python'. Opinions are my own.

Sifted
Get the latest European startup news and views in your inbox: https://t.co/6Y8HoMgUpF Follow us to get them live 🚀🇪🇺

Elias Najarro
Physicist interested in {brains, complexity, evolutionary computation & ALife}. RA at @RoboEvoArtLab

Balaji Srinivasan
Immutable money, infinite frontier, eternal life. #Bitcoin

Thomas Kipf
Research Scientist at @GoogleAI in the Brain Team. Deep Learning with Graphs, Abstractions & Objects; e.g. GCNs, Neural Relational Inference, Slot Attention.

Chris Camillo
investing with friends. The social arb guy in Jack Schwager’s Unknown Market Wizards




  Hashtags  

   #AI

   #NAACL2021

   #NLProc