Performed critical review, commentary and revision, and provided administrative, technical, and material support: Quek YT
Availability of data and materials
Not applicable.
Financial support and sponsorship
None.
Conflicts of interest
All authors declared that there are no conflicts of interest.
Ethical approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Copyright
© The Author(s) 2021.