Transformeur génératif pré-entraîné (French Wikipedia)

Analysis of the information sources cited in the references of the article "Transformeur génératif pré-entraîné" (generative pre-trained transformer) on the French-language Wikipedia.
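
A tally like this can be reproduced with the public MediaWiki API alone: request the article's external links with action=parse&prop=externallinks and count them by hostname. The sketch below is a minimal illustration under that assumption; the User-Agent string and the hostname-level grouping are illustrative choices, not part of the original analysis.

import json
from collections import Counter
from urllib.parse import quote, urlparse
from urllib.request import Request, urlopen

# Article on the French-language Wikipedia whose sources we tally.
PAGE = "Transformeur génératif pré-entraîné"
URL = ("https://fr.wikipedia.org/w/api.php?action=parse&prop=externallinks"
       f"&format=json&page={quote(PAGE)}")

# Wikimedia etiquette asks for a descriptive User-Agent (this name is hypothetical).
req = Request(URL, headers={"User-Agent": "ref-source-tally-sketch/0.1"})
with urlopen(req) as resp:
    links = json.load(resp)["parse"]["externallinks"]

# Count cited URLs per hostname, e.g. arxiv.org or proceedings.neurips.cc.
hosts = [urlparse(link).hostname for link in links]
counts = Counter(h for h in hosts if h)
for host, n in counts.most_common():
    print(f"{n:3d}  {host}")

Grouping by registered domain instead of full hostname (so that proceedings.neurips.cc and neurips.cc fall together, as in the listing below) would take an extra, suffix-aware normalisation step.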

Referenced websites by popularity rank:

Global rank      French rank
1st place        1st place
2nd place        3rd place
7th place        28th place
low place        low place
4th place        12th place
low place        low place
234th place      147th place
1,559th place    1,879th place
1,564th place    1,809th place
low place        low place
69th place       232nd place
54th place       149th place
616th place      1,521st place
low place        low place

arxiv.org

  • Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", Association for Computational Linguistics, 2019 (arXiv 1810.04805v2)

cmu.edu

cs.cmu.edu

cv-foundation.org

  • Yukun Zhu, Ryan Kiros, Rich Zemel and Ruslan Salakhutdinov, "Aligning Books and Movies: Towards Story-Like Visual Explanations by Watching Movies and Reading Books", IEEE International Conference on Computer Vision, 2015, pp. 19–27 (read online)

d2l.ai

doi.org

dx.doi.org

  • Luo R, Sun L, Xia Y, Qin T, Zhang S and Poon H, "BioGPT: generative pre-trained transformer for biomedical text generation and mining", Brief Bioinform, vol. 23, no. 6, 2022 (PMID 36156661, DOI 10.1093/bib/bbac409, read online)
  • N. Ferruz, S. Schmidt and B. Höcker, "ProtGPT2 is a deep unsupervised language model for protein design", Nature Communications, vol. 13, 2022 (DOI 10.1038/s41467-022-32007-7, read online)

forbes.com

nature.com

  • N. Ferruz, S. Schmidt and B. Höcker, "ProtGPT2 is a deep unsupervised language model for protein design", Nature Communications, vol. 13, 2022 (DOI 10.1038/s41467-022-32007-7, read online)

neurips.cc

proceedings.neurips.cc

  • Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin, "Attention is All you Need", Advances in Neural Information Processing Systems, vol. 30, Curran Associates, Inc., 2017 (read online)

nih.gov

ncbi.nlm.nih.gov

  • Luo R, Sun L, Xia Y, Qin T, Zhang S and Poon H, "BioGPT: generative pre-trained transformer for biomedical text generation and mining", Brief Bioinform, vol. 23, no. 6, 2022 (PMID 36156661, DOI 10.1093/bib/bbac409, read online)

nytimes.com

openai.com

cdn.openai.com

the-decoder.com

venturebeat.com

web.archive.org