Активациона функција ("Activation function", Serbian Wikipedia)

Analysis of the information sources cited in the references of the Serbian-language Wikipedia article "Активациона функција" ("Activation function").

[Table of ranked source websites. Columns: refs, Website, Global rank, Serbian rank. The refs and Website columns did not survive extraction; the recovered rank pairs (Global / Serbian) are: 69th / 240th, 1st / 1st, 18th / 28th, 3rd / 2nd, low / low, 149th / 283rd, 5th / 12th, 2nd / 4th, low / low, 207th / 370th, 6,214th / 6,799th, 652nd / 1,147th, 6,223rd / low, low / low, 1,185th / 1,193rd, 11th / 23rd, low / low.]

acm.org

dl.acm.org

arxiv.org

  • Abbott, L. F.; Sussillo, David (19 December 2014). "Random Walk Initialization for Training Very Deep Feedforward Networks" (in English).
  • Carlile, Brad; Delamarter, Guy; Kinney, Paul; Marti, Akiko; Whitney, Brian (9 November 2017). "Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs)". arXiv:1710.09967 [cs.LG].
  • Eidnes, Lars; Nøkland, Arild (2018). "Shifting Mean Activation Towards Zero with Bipolar Activation Functions". International Conference on Learning Representations (ICLR) Workshop. arXiv:1709.04054.
  • He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (6 February 2015). "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification". arXiv:1502.01852 [cs.CV].
  • Xu, Bing; Wang, Naiyan; Chen, Tianqi; Li, Mu (4 May 2015). "Empirical Evaluation of Rectified Activations in Convolutional Network". arXiv:1505.00853 [cs.LG].
  • Clevert, Djork-Arné; Unterthiner, Thomas; Hochreiter, Sepp (23 November 2015). "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)". arXiv:1511.07289 [cs.LG].
  • Klambauer, Günter; Unterthiner, Thomas; Mayr, Andreas; Hochreiter, Sepp (8 June 2017). "Self-Normalizing Neural Networks". Advances in Neural Information Processing Systems. 30 (2017). Bibcode:2017arXiv170602515K. arXiv:1706.02515.
  • Jin, Xiaojie; Xu, Chunyan; Feng, Jiashi; Wei, Yunchao; Xiong, Junjun; Yan, Shuicheng (22 December 2015). "Deep Learning with S-shaped Rectified Linear Activation Units". arXiv:1512.07030 [cs.CV].
  • Agostinelli, Forest; Hoffman, Matthew; Sadowski, Peter; Baldi, Pierre (21 December 2014). "Learning Activation Functions to Improve Deep Neural Networks". arXiv:1412.6830 [cs.NE].
  • Hendrycks, Dan; Gimpel, Kevin (2016). "Gaussian Error Linear Units (GELUs)". arXiv:1606.08415 [cs.LG].
  • Elfwing, Stefan; Uchibe, Eiji; Doya, Kenji (2017). "Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning". arXiv:1702.03118 [cs.LG].
  • Ramachandran, Prajit; Zoph, Barret; Le, Quoc V. (2017). "Searching for Activation Functions". arXiv:1710.05941 [cs.NE].
  • Godfrey, Luke B.; Gashler, Michael S. (3 February 2016). "A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks". 7th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management: KDIR. 1602: 481–486. Bibcode:2016arXiv160201321G. arXiv:1602.01321.
  • Klimek, Matthew D.; Perelstein, Maxim (26 October 2018). "Neural Network-Based Approach to Phase Space Integration". arXiv:1810.11509 [hep-ph].
  • Gashler, Michael S.; Ashmore, Stephen C. (9 May 2014). "Training Deep Fourier Neural Networks To Fit Time-Series Data". arXiv:1405.2262 [cs.NE].
  • Goodfellow, Ian J.; Warde-Farley, David; Mirza, Mehdi; Courville, Aaron; Bengio, Yoshua (2013). "Maxout Networks". JMLR Workshop and Conference Proceedings. 28 (3): 1319–1327. Bibcode:2013arXiv1302.4389G. arXiv:1302.4389.

books.google.com

doi.org

harvard.edu

adsabs.harvard.edu

  • Klambauer, Günter; Unterthiner, Thomas; Mayr, Andreas; Hochreiter, Sepp (8 June 2017). "Self-Normalizing Neural Networks". Advances in Neural Information Processing Systems. 30 (2017). Bibcode:2017arXiv170602515K. arXiv:1706.02515.
  • Godfrey, Luke B.; Gashler, Michael S. (3 February 2016). "A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks". 7th International Joint Conference on Knowledge Discovery, Knowledge Engineering and Knowledge Management: KDIR. 1602: 481–486. Bibcode:2016arXiv160201321G. arXiv:1602.01321.
  • Goodfellow, Ian J.; Warde-Farley, David; Mirza, Mehdi; Courville, Aaron; Bengio, Yoshua (2013). "Maxout Networks". JMLR Workshop and Conference Proceedings. 28 (3): 1319–1327. Bibcode:2013arXiv1302.4389G. arXiv:1302.4389.

ieee.org

ieeexplore.ieee.org

jmlr.org

mathworks.com

mlr.press

proceedings.mlr.press

psu.edu

citeseerx.ist.psu.edu

  • Elliot, David L. (1993). "A better activation function for artificial neural networks". ISR Technical Report TR 93-8. University of Maryland, College Park, MD 20742. CiteSeerX 10.1.1.46.7204.

sciencedirect.com

semanticscholar.org

pdfs.semanticscholar.org

umontreal.ca

iro.umontreal.ca

unicam.it

didattica.cs.unicam.it

web.archive.org

wikipedia.org

en.wikipedia.org

  • Elliot, David L. (1993). "A better activation function for artificial neural networks". ISR Technical Report TR 93-8. University of Maryland, College Park, MD 20742. CiteSeerX 10.1.1.46.7204.

worldcat.org