[1] Kusi-Sarpong, S., Bai, C., Sarkis, J., and Wang, X. (2015). Green supply chain practices evaluation in the mining industry using a joint rough sets and fuzzy TOPSIS methodology. Resources Policy, 46, 86–100.
[2] Holmberg, K., Kivikytö-Reponen, P., Härkisaari, P., Valtonen, K., and Erdemir, A. (2017). Global energy consumption due to friction and wear in the mining industry. Tribology International, 115, 116–139.
[3] Ranängen, H., and Lindman, Å. (2017). A path towards sustainability for the Nordic mining industry. Journal of Cleaner Production, 151, 43–52.
[4] Sanmiquel, L., Rossell, J. M., and Vintró, C. (2015). Study of Spanish mining accidents using data mining techniques. Safety Science, 75, 49–55.
[5] Permana, H. (2012). Risk assessment as a strategy to prevent mine accidents in Indonesian mining. Revista de Minería. 18(4): 43–49.
[6] Kizil, M. (2003). Virtual reality applications in the Australian minerals industry. In Application of Computers and Operations Research in the Minerals Industries, South African Institute of Mining and Metallurgy, pp. 569–574.
[7] Kazan, E., and Usmen, M. A. (2018). Worker safety and injury severity analysis of earthmoving equipment accidents. Journal of safety research, 65, 73-81.
[8] Squelch, A. P. (2001). Virtual reality for mine safety training in South Africa. Journal of the Southern African Institute of Mining and Metallurgy, 101(4): 209-216.
[9] Kia, K., Fitch, S. M., Newsom, S. A., and Kim, J. H. (2020). Effect of whole-body vibration exposures on physiological stresses: Mining heavy equipment applications. Applied ergonomics, 85, 103065.
[10] Domingues, M. S., Baptista, A. L., and Diogo, M. T. (2017). Engineering complex systems applied to risk management in the mining industry. International journal of mining science and technology, 27(4): 611-616.
[11] Tichon, J., and Burgess-Limerick, R. (2011). A review of virtual reality as a medium for safety related training in mining. Journal of Health and Safety Research and Practice, 3(1): 33-40.
[12] Shahmoradi, J., Talebi, E., Roghanchi, P., and Hassanalian, M. (2020). A comprehensive review of applications of drone technology in the mining industry. Drones, 4(3): 34.
[13] Patrucco, M., Pira, E., Pentimalli, S., Nebbia, R., and Sorlini, A. (2021). Anti-collision systems in tunneling to improve effectiveness and safety in a system-quality approach: A review of the state of the art. Infrastructures, 6(3): 42.
[14] Kim, H., and Choi, Y. (2021). Autonomous driving robot that drives and returns along a planned route in underground mines by recognizing road signs. Applied Sciences, 11(21): 10235.
[15] Backman, S., Lindmark, D., Bodin, K., Servin, M., Mörk, J., and Löfgren, H. (2021). Continuous control of an underground loader using deep reinforcement learning. Machines, 9(10): 216.
[16] Kim, H., and Choi, Y. (2020). Comparison of three location estimation methods of an autonomous driving robot for underground mines. Applied Sciences, 10(14): 4831.
[17] Lööw, J., Abrahamsson, L., and Johansson, J. (2019). Mining 4.0—The impact of new technology from a work place perspective. Mining, Metallurgy and Exploration, 36, 701-707.
[18] Sishi, M., and Telukdarie, A. (2020). Implementation of Industry 4.0 technologies in the mining industry—a case study. International Journal of Mining and Mineral Engineering, 11(1): 1-22.
[19] Sánchez, F., and Hartlieb, P. (2020). Innovation in the mining industry: Technological trends and a case study of the challenges of disruptive innovation. Mining, Metallurgy and Exploration, 37(5): 1385-1399.
[20] Groves, W. A., Kecojevic, V. J., and Komljenovic, D. (2007). Analysis of fatalities and injuries involving mining equipment. Journal of Safety Research. 38(4): 461-470.
[21] Amponsah-Tawiah, K., Jain, A., Leka, S., Hollis, D., and Cox, T. (2013). Examining psychosocial and physical hazards in the Ghanaian mining industry and their implications for employees’ safety experience. Journal of safety research. 45: 75-84.
[22] Ruff, T., Coleman, P., and Martini, L. (2011). Machine-related injuries in the US mining industry and priorities for safety research. International journal of injury control and safety promotion. 18(1): 11-20.
[23] Imam, M., Baïna, K., Tabii, Y., Ressami, E. M., Adlaoui, Y., Benzakour, I., and Abdelwahed, E. H. (2023). The future of mine safety: a comprehensive review of anti-collision systems based on computer vision in underground mines. Sensors. 23(9): 4294.
[24] Ali, D., and Frimpong, S. (2020). Artificial intelligence, machine learning and process automation: Existing knowledge frontier and way forward for mining sector. Artificial Intelligence Review. 53(8): 6025-6042.
[25] Hyder, Z., Siau, K., and Nah, F. (2019). Artificial intelligence, machine learning, and autonomous technologies in mining industry. Journal of Database Management (JDM). 30(2): 67-79.
[26] Viola, P., and Jones, M. (2001). Rapid object detection using a boosted cascade of simple features. In Proceedings of the 2001 IEEE computer society conference on computer vision and pattern recognition. CVPR 2001, Vol. 1, pp. I-I.
[27] Dalal, N., and Triggs, B. (2005). Histograms of oriented gradients for human detection. In 2005 IEEE computer society conference on computer vision and pattern recognition. CVPR'05, Vol. 1, pp. 886-893.
[28] Felzenszwalb, P., McAllester, D., and Ramanan, D. (2008). A discriminatively trained, multiscale, deformable part model. In 2008 IEEE conference on computer vision and pattern recognition. pp. 1-8.
[29] Ren, S., He, K., Girshick, R., and Sun, J. (2017). Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence. 39(6): 1137-1149.
[30] Zou, Z., Chen, K., Shi, Z., Guo, Y., and Ye, J. (2023). Object detection in 20 years: A survey. Proceedings of the IEEE. 111(3): 257-276.
[31] Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C. Y., and Berg, A. C. (2016). SSD: Single shot multibox detector. In Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, October 11–14, 2016, Proceedings, Part I 14, Springer International Publishing. pp. 21-37.
[32] Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016). You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition. pp. 779-788.
[33] Lin, T., Goyal, P., Girshick, R., He, K., and Dollár, P. (2017). Focal loss for dense object detection. In Proceedings of the IEEE International Conference on Computer Vision. pp. 2980–2988.
[34] Duan, K., Bai, S., Xie, L., Qi, H., Huang, Q., and Tian, Q. (2019). Centernet: Keypoint triplets for object detection. In Proceedings of the IEEE/CVF international conference on computer vision. pp. 6569-6578.
[35] Redmon, J., and Farhadi, A. (2017). YOLO9000: better, faster, stronger. In Proceedings of the IEEE conference on computer vision and pattern recognition. pp. 7263-7271.
[36] Redmon, J., and Farhadi, A. (2018). YOLOv3: An incremental improvement. arXiv preprint arXiv:1804.02767.
[37] Bochkovskiy, A., Wang, C. Y., and Liao, H. Y. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934.
[38] Jocher, G., Chaurasia, A., Stoken, A., Borovec, J., Kwon, Y., Fang, J., Michael, K., Montes, D., Nadar, J., and Skalski, P. (2022). ultralytics/yolov5: v6.1—TensorRT, TensorFlow Edge TPU and OpenVINO export and inference. Zenodo.
[39] Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., and Desmaison, A. (2019). PyTorch: An imperative style, high-performance deep learning library. Advances in Neural Information Processing Systems, 32.
[40] Elfwing, S., Uchibe, E., and Doya, K. (2018). Sigmoid-weighted linear units for neural network function approximation in reinforcement learning. Neural Networks, 107, 3-11.
[41] Li, C., Li, L., Jiang, H., Weng, K., Geng, Y., Li, L., Ke, Z., Li, Q., Cheng, M., Nie, W. and Li, Y. (2022). YOLOv6: A single-stage object detection framework for industrial applications. arXiv preprint arXiv:2209.02976.
[42] Wang, C. Y., Bochkovskiy, A., and Liao, H. Y. M. (2023). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition. pp. 7464-7475.
[43] Jocher, G., Chaurasia, A., and Qiu, J. (2023). Ultralytics YOLOv8 (version 8.0.0).
[44] Zhang, S., Yang, H., Yang, C., Yuan, W., Li, X., Wang, X., Zhang, Y., Cai, X., Sheng, Y., Deng, X., and Huang, W. (2023). Edge device detection of tea leaves with one bud and two leaves based on ShuffleNetv2-YOLOv5-Lite-E. Agronomy, 13(2), 577.
[45] Powers, D. M. (2020). Evaluation: from precision, recall and F-measure to ROC, informedness, markedness and correlation. arXiv preprint arXiv:2010.16061.