Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", Association for Computational Linguistics (arXiv:1810.04805v2)
Yukun Zhu, Ryan Kiros, Rich Zemel and Ruslan Salakhutdinov, "Aligning Books and Movies: Towards Story-Like Visual Explanations by Watching Movies and Reading Books", IEEE International Conference on Computer Vision, pp. 19–27 (read online)
Luo R, Sun L, Xia Y, Qin T, Zhang S, Poon H, "BioGPT: generative pre-trained transformer for biomedical text generation and mining", Brief Bioinform, vol. 23, no. 6 (PMID 36156661, DOI 10.1093/bib/bbac409, read online)
Ferruz, N., Schmidt, S. and Höcker, B., "ProtGPT2 is a deep unsupervised language model for protein design", Nature Communications, vol. 13 (DOI 10.1038/s41467-022-32007-7, read online)
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N Gomez, Łukasz Kaiser and Illia Polosukhin, "Attention is All you Need", Curran Associates, Inc., vol. 30 (read online)
Kevin Roose, "The Brilliance and Weirdness of ChatGPT" [archive], The New York Times: "Like those tools, ChatGPT — which stands for generative pre-trained transformer — landed with a splash."