GPT-2 (Korean Wikipedia)

Analysis of information sources in the references of the Wikipedia article "GPT-2" in the Korean-language version.
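The tally on this page can be approximated against the live article. The following is a minimal sketch, not the tool that generated this report: it assumes the standard MediaWiki API on ko.wikipedia.org and uses prop=extlinks, which returns all external links in the article (not only those inside reference tags), then groups them by hostname.

```python
# Minimal sketch (assumption: the standard MediaWiki API at
# ko.wikipedia.org/w/api.php): fetch the external links of the
# Korean "GPT-2" article and tally them by hostname. Note that
# prop=extlinks covers every external link, not only <ref> cites,
# and a single request caps out at ellimit=max (500 links);
# continuation handling is omitted for brevity.
from collections import Counter
from urllib.parse import urlparse

import requests

API = "https://ko.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "titles": "GPT-2",
    "prop": "extlinks",
    "ellimit": "max",
    "format": "json",
}

resp = requests.get(API, params=params, timeout=30)
resp.raise_for_status()

counts = Counter()
for page in resp.json()["query"]["pages"].values():
    for link in page.get("extlinks", []):
        # formatversion=1 puts each URL under the "*" key;
        # protocol-relative URLs ("//host/...") still parse.
        host = urlparse(link["*"]).hostname or "unknown"
        counts[host] += 1

for host, n in counts.most_common():
    print(f"{n:3d}  {host}")
```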

[Table: refs / Website / Global rank / Korean rank. One row per cited website, giving its reference count and its global and Korean-Wikipedia rank; the Website column was lost in extraction, so the surviving rank pairs ("1st place" through "8,920th place", or "low place") can no longer be attributed to specific sites.]

arxiv.org

  • Hegde, Chaitra; Patil, Shrikumar (June 9, 2020). “Unsupervised Paraphrase Generation using Pre-trained Language Models”. arXiv:2006.05477 [cs.CL]. 
  • Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N.; Kaiser, Lukasz; Polosukhin, Illia (June 12, 2017). “Attention Is All You Need”. arXiv:1706.03762 [cs.CL]. 
  • Bahdanau, Dzmitry; Cho, Kyunghyun; Bengio, Yoshua (September 1, 2014). “Neural Machine Translation by Jointly Learning to Align and Translate”. arXiv:1409.0473 [cs.CL]. 
  • Luong, Minh-Thang; Pham, Hieu; Manning, Christopher D. (August 17, 2015). “Effective Approaches to Attention-based Neural Machine Translation”. arXiv:1508.04025 [cs.CL]. 

creativeengines.ai

distill.pub

doi.org

dx.doi.org

github.com

  • “gpt-2”. GitHub. Retrieved March 13, 2023. 

gizmodo.com

huggingface.co

transformer.huggingface.co

openai.com

cdn.openai.com

talktotransformer.com

techcrunch.com

technologyreview.com

theguardian.com

theregister.com

theverge.com

towardsdatascience.com

usgamer.net

venturebeat.com

vox.com

web.archive.org