Jian Tang
@tangjianpku
Assistant Professor at Mila. Working on geometric deep learning, AI for small molecule/protein design, knowledge graphs, generative models
Tweets by Jian Tang
Mila (https://t.co/cVXe4xYaEh) has multiple postdoc positions in ML for molecular modelling, with applications in drug and material discovery. Candidates will be co-supervised by Mila professors Yoshua Bengio, me, Guy Wolf, ... Please DM me or email me (tangjianpku@gmail.com).
Shared by Jian Tang at 5/3/2022
We particularly encourage candidates with ML background in geometric deep learning, graph neural networks, deep generative models, molecular modelling or domain experts in chemistry/physics/biology to apply.
Shared by Jian Tang at 5/3/2022
Another ICLR oral presentation from my group: "Neural Structured Prediction for Inductive Node Classification", which marries statistical relational models (e.g., CRFs) with GNNs for reasoning on graphs. Paper: https://t.co/fOxjvlc9NN Excellent work by amazing student @mengqumn
Shared by Jian Tang at 4/25/2022
Our latest work on learning protein representations by pretraining on the #AlphaFold2 structure database; it outperforms sequence-based methods.
Shared by Jian Tang at 3/15/2022
At this year's AAAI, my PhD students @mengqumn @zhu_zhaocheng and I gave a tutorial on #reasoning_on_knowledge_graphs, covering both neural and symbolic logic-rule-based approaches, their combinations, and logic rule induction methods. Slides: https://t.co/fKtdu7cQOg
Shared by Jian Tang at 2/24/2022
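To make the symbolic side of the tutorial concrete, here is a minimal sketch of logic-rule-based reasoning over a knowledge graph. The facts and the grandparent rule are made-up toy examples, not taken from the tutorial itself:

```python
# Toy illustration of applying a symbolic logic rule to a knowledge graph.
# Facts are (head_entity, relation, tail_entity) triples; all names here
# are hypothetical examples.
facts = {
    ("alice", "parent_of", "bob"),
    ("bob", "parent_of", "carol"),
    ("bob", "parent_of", "dave"),
}

def apply_rule(triples):
    """Horn rule: grandparent_of(x, z) <- parent_of(x, y) AND parent_of(y, z).

    Derives new triples by chaining two parent_of edges.
    """
    derived = set()
    for (x, r1, y) in triples:
        for (y2, r2, z) in triples:
            if r1 == "parent_of" and r2 == "parent_of" and y == y2:
                derived.add((x, "grandparent_of", z))
    return derived

print(sorted(apply_rule(facts)))
# alice is inferred to be a grandparent of both carol and dave
```

Neural approaches replace this hard rule matching with learned, differentiable scoring of candidate triples; rule induction methods go the other way and learn which rules to apply.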
Geometric deep learning for drug discovery is a really promising research direction to work on. I recently gave a tutorial on this topic. #GeometricDeepLearning #DrugDiscovery The slides are available here: https://t.co/LiCXil6Hcj
Shared by Jian Tang at 2/16/2022
Our new state-of-the-art algorithm NBFNet for knowledge graph reasoning (to appear @NeurIPSConf '21) is open sourced: https://t.co/DbByR5I0Gu. NBFNet is a general GNN framework for link prediction, and works in both the transductive and inductive settings, by @zhu_zhaocheng @Oxer22
Shared by Jian Tang at 11/25/2021
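The recurrence NBFNet builds on is the classical generalized Bellman-Ford iteration, where path information is combined with a "multiply" operator and aggregated with a "sum" operator. Below is a sketch of that iteration in plain Python on a made-up toy graph; with `min` and `+` it recovers shortest-path distances, whereas NBFNet replaces these operators with learned neural message and aggregation functions:

```python
# Generalized Bellman-Ford iteration over a semiring. The graph and edge
# weights are hypothetical toy data for illustration only.
import math

edges = {  # (u, v) -> edge weight
    ("a", "b"): 1.0, ("b", "c"): 2.0, ("a", "c"): 5.0, ("c", "d"): 1.0,
}
nodes = {"a", "b", "c", "d"}

def bellman_ford(source, aggregate=min, combine=lambda h, w: h + w):
    # With aggregate=min and combine=+ this computes shortest-path
    # distances from `source`; swapping the operators changes what
    # path statistic is propagated.
    dist = {v: (0.0 if v == source else math.inf) for v in nodes}
    for _ in range(len(nodes) - 1):  # relax edges until a fixed point
        for (u, v), w in edges.items():
            dist[v] = aggregate(dist[v], combine(dist[u], w))
    return dist

print(bellman_ford("a"))
# the a->b->c path (cost 3.0) beats the direct a->c edge (cost 5.0)
```

The pair representation NBFNet learns for (source, target) plays the role of `dist[target]` here, which is what lets it score links inductively on unseen entities.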
3) Xu et al. Joint Modeling of Visual Objects and Relations for Scene Graph Generation. A fully expressive probabilistic model for scene graph generation, which models relational reasoning among different objects.
Shared by Jian Tang at 10/6/2021
4) Deac et al. Neural Algorithmic Reasoners are Implicit Planners. Using GNNs for implicit planning and algorithmically aligning to Value Iteration.
Shared by Jian Tang at 10/6/2021
5) Xhonneux et al. How to transfer algorithmic reasoning knowledge to learn new algorithms? Studies the transferability of algorithmic reasoning knowledge across different algorithms.
Shared by Jian Tang at 10/6/2021
1) Zhu et al. Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction. https://t.co/GwrPcvyO67 A state-of-the-art algorithm for link prediction based on GNNs in both transductive and inductive settings.
Shared by Jian Tang at 10/6/2021
2) Luo and Shi et al. Predicting Molecular Conformation via Dynamic Graph Score Matching. A new approach for molecular conformation generation by modeling both short- and long-range interactions between atoms.
Shared by Jian Tang at 10/6/2021
Five papers related to GNNs from my group are accepted to #NeurIPS2021, spanning knowledge graph reasoning, drug discovery, scene graph generation, and algorithmic reasoning. Congrats to all my students and collaborators.
Shared by Jian Tang at 10/6/2021
We will give a tutorial on TorchDrug @DrugTorch . Join us if you are interested in machine learning for drug discovery.
Shared by Jian Tang at 10/5/2021
Interested in single-cell RNA sequencing analysis? We proposed the single-cell Embedded Topic Model, which learns a transferable cell encoder and an interpretable decoder, inferring both gene and topic embeddings. Published at Nature Communications: https://t.co/PX31xE55SB
Shared by Jian Tang at 9/7/2021
Our recent work on graph neural networks. We propose a flexible, scalable, explainable graph neural network framework for link prediction, which works in both transductive and inductive settings. State-of-the-art results on both homogeneous graphs and knowledge graphs.
Shared by Jian Tang at 6/15/2021
Another thing I forgot to mention is that this model is SE(3)-equivariant, i.e. equivariant under continuous 3D roto-translations. It has already been successfully applied to protein structure prediction in some very recent work: https://t.co/PUuzQWgIDo https://t.co/zWZBeO3ZGk
Shared by Jian Tang at 6/14/2021