Huawei Cloud Wins Gold in an Authoritative International Information Retrieval Competition: A Full Analysis of Its Strengths (Part 4)

  learners[J]. OpenAI Blog, 2019, 1(8): 9.

  [5] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. (2018) BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805.

  [6] Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, and Jaewoo Kang. (2019) BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics.

  [7] Iz Beltagy, Kyle Lo, and Arman Cohan. (2019) SciBERT: A Pretrained Language Model for Scientific Text. arXiv preprint arXiv:1903.10676.

  [8] Rodrigo Nogueira and Kyunghyun Cho. (2019) Passage Re-ranking with BERT. arXiv preprint arXiv:1901.04085.

  [9] Emily Alsentzer, John R. Murphy, Willie Boag, et al. (2019) Publicly Available Clinical BERT Embeddings. arXiv preprint arXiv:1904.03323.

