Tweets by Oscar Sainz (@osainz59), PhD student in NLP.
Finally, we released the code, verbalizations, and pre-trained models so the results can be easily reproduced! Models: https://t.co/yYq2chDRkB Thanks to @huggingface for hosting the models!
Shared by Oscar Sainz at 5/10/2022
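As a minimal sketch of how such released entailment checkpoints can be queried, the snippet below scores a few candidate relation verbalizations against a sentence with the Hugging Face zero-shot classification pipeline. The model id, sentence, and templates are placeholders, since the actual checkpoint names sit behind the shortened link above.

```python
# Minimal sketch: scoring relation verbalizations with an NLI model from the Hub.
# The model id is a generic MNLI checkpoint used as a stand-in for the released
# models linked in the tweet; the sentence and hypotheses are illustrative.
from transformers import pipeline

nli = pipeline("zero-shot-classification", model="roberta-large-mnli")

premise = "Billy Mays, the boisterous pitchman, died at his home in Tampa."
# Candidate verbalizations (hypotheses) for the entity pair (Billy Mays, Tampa).
hypotheses = [
    "Billy Mays died in Tampa.",      # per:city_of_death
    "Billy Mays was born in Tampa.",  # per:city_of_birth
    "Billy Mays lived in Tampa.",     # per:cities_of_residence
]

# hypothesis_template="{}" makes the pipeline use each candidate as a full hypothesis.
result = nli(premise, candidate_labels=hypotheses, hypothesis_template="{}")
for label, score in zip(result["labels"], result["scores"]):
    print(f"{score:.3f}  {label}")
```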
(3) We compared the time spent annotating examples (a very optimistic approximation) with the time spent writing verbalizations. We concluded that it is worth spending some time upfront verbalizing the ontology, which is also more natural and rewarding than annotating examples.
Shared by Oscar Sainz at 5/10/2022
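To make "verbalizing the ontology" concrete, here is a small illustrative example: each relation type gets a one-line natural-language template, written once, which is then instantiated for every candidate entity pair instead of labeling many examples. The relation names and templates below are made up for illustration, not taken from the paper.

```python
# Sketch of an ontology verbalization: one template per relation type.
# Relation names and templates are illustrative, not the paper's exact ones.
VERBALIZATIONS = {
    "per:date_of_birth": "{subj} was born on {obj}.",
    "per:city_of_death": "{subj} died in {obj}.",
    "org:founded_by":    "{subj} was founded by {obj}.",
    "org:top_members":   "{obj} is a high-level member of {subj}.",
}

def hypotheses_for(subj: str, obj: str) -> dict:
    """Instantiate every template for a candidate entity pair."""
    return {rel: t.format(subj=subj, obj=obj) for rel, t in VERBALIZATIONS.items()}

print(hypotheses_for("Billy Mays", "Tampa"))
```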
(2) We performed an ablation study and concluded that using several NLI datasets, such as SNLI, ANLI, and FEVER-NLI, along with MNLI is significantly better than using MNLI alone in all scenarios.
Shared by Oscar Sainz at 5/10/2022
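A rough sketch of how such a pooled NLI training set could be assembled with the datasets library; the dataset ids and column handling are assumptions rather than the paper's exact preprocessing, and FEVER-NLI is omitted here because its Hub mirrors vary.

```python
# Sketch: pooling several NLI training sets before fine-tuning, as in the ablation.
# Dataset ids and column handling are assumptions; FEVER-NLI would be added the
# same way once a specific Hub mirror is chosen.
from datasets import load_dataset, concatenate_datasets

mnli = load_dataset("multi_nli", split="train")
snli = load_dataset("snli", split="train").filter(lambda ex: ex["label"] != -1)
anli = concatenate_datasets(
    [load_dataset("anli", split=s) for s in ("train_r1", "train_r2", "train_r3")]
)

# Keep only the columns shared by all corpora so they can be concatenated.
cols = ["premise", "hypothesis", "label"]
pooled = concatenate_datasets([d.select_columns(cols) for d in (mnli, snli, anli)])
print(pooled)
```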
The paper presents the following three contributions. (1) The TE task reduces the schema dependency of IE tasks and allows better knowledge transfer among them. We show this by improving the results (mostly zero-shot) on ACE using WikiEvents data, and vice versa.
Shared by Oscar Sainz at 5/10/2022
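A sketch of why the entailment formulation loosens the schema dependency: only the per-schema verbalizations change, while the scoring model stays the same, so templates written for one ontology (e.g. ACE-style) and another (e.g. WikiEvents-style) are queried in exactly the same way. The event type names, templates, and model id below are illustrative assumptions.

```python
# Sketch: the same NLI scorer serves any schema; only the verbalized templates
# differ. Type names, templates, and model id are illustrative assumptions.
from transformers import pipeline

nli = pipeline("zero-shot-classification", model="roberta-large-mnli")

def score_templates(sentence: str, templates: dict) -> dict:
    """Score every verbalized hypothesis of a schema against a sentence."""
    out = nli(sentence, candidate_labels=list(templates.values()),
              hypothesis_template="{}", multi_label=True)
    by_hypothesis = dict(zip(out["labels"], out["scores"]))
    return {etype: round(by_hypothesis[hyp], 3) for etype, hyp in templates.items()}

sentence = "The company laid off 300 workers in Seattle last month."

# ACE-style event verbalizations (illustrative).
ace_templates = {
    "Personnel.End-Position": "Someone's employment ended.",
    "Movement.Transport":     "Someone was transported somewhere.",
}
# WikiEvents-style verbalizations for comparable types (illustrative).
wikievents_templates = {
    "Personnel.EndPosition": "Workers lost their jobs.",
    "Justice.ArrestJail":    "Someone was arrested.",
}

print(score_templates(sentence, ace_templates))
print(score_templates(sentence, wikievents_templates))
```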
Our work on language models fine-tuned on entailment datasets yields state-of-the-art results on Information Extraction (IE) using only a small fraction of the annotations. We are happy to announce that we have two further papers accepted at #NAACL2022 on zero- and few-shot IE!
Shared by Oscar Sainz at 5/10/2022
We are pleased to present our new demonstration preprint: "ZS4IE: a toolkit for Zero-Shot Information Extraction with Simple Verbalizations." Preprint: https://t.co/WRy4fJiZre Code: https://t.co/gfNL0PVwlw A collaboration between @Hitz_zentroa and @BBNTechnologies.
Shared by Oscar Sainz at 3/29/2022