26. Liu Y, Chen Y, Wang J, Niu S, Liu D, Song H. Zero-bias deep neural network for quickest RF signal surveillance. arXiv preprint arXiv:2110.05797, 2021.
27. Hanin B, Rolnick D. How to start training: The effect of initialization and architecture. arXiv preprint arXiv:1803.01719, 2018.
28. Datta L. A survey on activation functions and their relation with Xavier and He normal initialization. arXiv preprint arXiv:2004.06632, 2020.
29. Bjorck J, Gomes C, Selman B, Weinberger KQ. Understanding batch normalization. arXiv preprint arXiv:1806.02375, 2018.
30. Santurkar S, Tsipras D, Ilyas A, Mądry A. How does batch normalization help optimization? In: Proceedings of the 32nd international conference on neural information processing systems. 2018, pp. 2488-98.
31. You H, Yu L, Tian S, et al. MC-Net: Multiple max-pooling integration module and cross multi-scale deconvolution network. Knowledge-Based Systems 2021;231:107456. DOI
32. Hinton GE, Srivastava N, Krizhevsky A, Sutskever I, Salakhutdinov R. Improving neural networks by preventing co-adaptation of feature detectors. CoRR 2012;abs/1207.0580. Available from http://arxiv.org/abs/1207.0580
33. Gal Y, Hron J, Kendall A. Concrete dropout. arXiv preprint arXiv:1705.07832, 2017.
34. Chen H, Chen A, Xu L, et al. A deep learning CNN architecture applied in smart near-infrared analysis of water pollution for agricultural irrigation resources. Agricultural Water Management 2020;240:106303. DOI
35. Goodfellow IJ, Erhan D, Luc Carrier P, et al. Challenges in representation learning: a report on three machine learning contests. Neural
Netw 2015;64:59-63. DOI
36. Song L, Gong D, Li Z, Liu C, Liu W. Occlusion robust face recognition based on mask learning with pairwise differential siamese network.
In: Proceedings of the IEEE/CVF International Conference on Computer Vision. IEEE, 2019, pp. 773-82.
37. Shorten C, Khoshgoftaar TM. A survey on image data augmentation for deep learning. J Big Data 2019;6:1-48. DOI
38. Gao X, Saha R, Prasad MR, et al. Fuzz testing based data augmentation to improve robustness of deep neural networks. In: 2020
IEEE/ACM 42nd International Conference on Software Engineering (ICSE). IEEE, 2020, pp. 1147-58.
39. Halgamuge MN, Daminda E, Nirmalathas A. Best optimizer selection for predicting bushfire occurrences using deep learning. Nat
Hazards 2020;103:845-60. DOI
40. Zhang Z, Sabuncu MR. Generalized cross entropy loss for training deep neural networks with noisy labels. In: 32nd Conference on
Neural Information Processing Systems (NeurIPS). 2018.
41. Han Z. Predict final total mark of students with ANN, RNN and BiLSTM. Available from http://users.cecs.anu.edu.au/~Tom.Gedeon/
conf/ABCs2020/paper/ABCs2020_paper_v2_135.pdf.
42. Li M, Soltanolkotabi M, Oymak S. Gradient descent with early stopping is provably robust to label noise for overparameterized neural
networks. In: International conference on artificial intelligence and statistics. PMLR, 2020, pp. 4313-24.
43. Lucey P, Cohn JF, Kanade T, et al. The extended Cohn-Kanade dataset (CK+): A complete dataset for action unit and emotion-specified expression. In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2010. IEEE, 2010, pp. 94-101. DOI
44. Cheng S, Zhou G. Facial expression recognition method based on improved VGG convolutional neural network. Int J Patt Recogn Artif
Intell 2020;34:2056003. DOI