BERT (Czech Wikipedia)

Analysis of the information sources cited in the references of the Czech-language version of the Wikipedia article "BERT".

Global rank      Czech rank
1,272nd place    5,345th place
69th place       196th place
8,920th place    7,263rd place
low place        low place
1,131st place    1,165th place
low place        low place
2,218th place    9,180th place
1st place        1st place
low place        low place
5th place        3rd place
2nd place        4th place
low place        low place

arxiv.org

  • DEVLIN, Jacob; CHANG, Ming-Wei; LEE, Kenton. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805v2 [cs]. 2019-05-24. Available online [accessed 2022-10-27]. 
  • ROGERS, Anna; KOVALEVA, Olga; RUMSHISKY, Anna. A Primer in BERTology: What we know about how BERT works. arXiv:2002.12327 [cs]. 2020-11-09. Available online [accessed 2022-10-27]. 
  • ZHU, Yukun; KIROS, Ryan; ZEMEL, Richard. Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books. arXiv:1506.06724 [cs]. 2015-06-22. Available online [accessed 2022-10-27]. 

blog.google

  • Understanding searches better than ever before. Google [online]. 2019-10-25 [accessed 2023-08-19]. Available online. (in English) 

doi.org

  • ABAS, Ahmed; ELHENAWY, Ibrahim; ZIDAN, Mahinda. BERT-CNN: A Deep Learning Model for Detecting Emotions from Text. Computers, Materials & Continua. 2021, vol. 71, no. 2, pp. 2943–2961. Available online [accessed 2023-08-19]. ISSN 1546-2218. doi:10.32604/cmc.2022.021671. (in English) 

googleblog.com

ai.googleblog.com

  • Transformer: A Novel Neural Network Architecture for Language Understanding. ai.googleblog.com [online]. 2017-08-31 [accessed 2023-08-20]. Available online. (in English) 
  • Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing. ai.googleblog.com [online]. [accessed 2022-10-27]. Available online. (in English) 
  • Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing. ai.googleblog.com [online]. 2018-11-02 [accessed 2023-08-20]. Available online. (in English) 

h2o.ai

  • BERT Basics: What It Is, Creation, and Uses in AI. h2o.ai [online]. [accessed 2022-10-27]. Available online. 

ibm.com

  • AJAYI, Demi. How BERT and GPT models change the game for NLP [online]. 2020-12-03 [accessed 2023-08-19]. Available online. (in English) 

invgate.com

blog.invgate.com

  • MOTTESI, Celeste. GPT-3 vs. BERT: Comparing the Two Most Popular Language Models. blog.invgate.com [online]. [accessed 2023-08-19]. Available online. (in English) 

techscience.com

  • ABAS, Ahmed; ELHENAWY, Ibrahim; ZIDAN, Mahinda. BERT-CNN: A Deep Learning Model for Detecting Emotions from Text. Computers, Materials & Continua. 2021, vol. 71, no. 2, pp. 2943–2961. Available online [accessed 2023-08-19]. ISSN 1546-2218. doi:10.32604/cmc.2022.021671. (in English) 

towardsdatascience.com

  • HOREV, Rani. BERT Explained: State of the art language model for NLP. Medium [online]. 2018-11-17 [accessed 2023-08-19]. Available online. (in English) 

vitalflux.com

  • KUMAR, Ajitesh. BERT vs GPT Models: Differences, Examples [online]. 2023-08-19 [accessed 2023-08-20]. Available online. (in English) 

web.archive.org

worldcat.org

  • ABAS, Ahmed; ELHENAWY, Ibrahim; ZIDAN, Mahinda. BERT-CNN: A Deep Learning Model for Detecting Emotions from Text. Computers, Materials & Continua. 2021, vol. 71, no. 2, pp. 2943–2961. Available online [accessed 2023-08-19]. ISSN 1546-2218. doi:10.32604/cmc.2022.021671. (in English)