Published in 2022 in Computational Linguistics
DOI: 10.1162/coli_a_00462
Abstract: Specialized transformer-based models (such as BioBERT and BioMegatron) are adapted for the biomedical domain based on publicly available biomedical corpora. As such, they have the potential to encode large-scale biological knowledge. We investigate the…
Keywords: background knowledge; biomedical background; representation biomedical; knowledge