Improved Rare Species Identification Using Focal Loss Based Deep Learning Models
The use of deep learning for species identification in camera trap images has revolutionised our ability to study, conserve and monitor species in a highly efficient and unobtrusive manner, with state-of-the-art models achieving accuracies surpassing that of manual human classification. The high class imbalance of camera trap datasets, however, results in poor accuracy for minority (rare or endangered) species, since these classes contribute little to the overall model accuracy. This paper investigates the use of Focal Loss, in comparison with the traditional Cross Entropy loss function, to improve the identification of minority species in the “225 Bird Species” dataset from Kaggle. The results show that, although Focal Loss slightly decreased the accuracy of the majority species, it increased the F1-score by 0.06, improved the identification of the bottom two, five and ten (minority) species by 37.5%, 15.7% and 10.8%, respectively, and improved overall accuracy by 2.96%.
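The mechanism by which Focal Loss addresses class imbalance can be illustrated with a minimal sketch (the γ = 2 and α = 0.25 defaults below are the commonly used values from Lin et al., not values taken from this paper): Focal Loss scales the standard cross-entropy term by (1 − p_t)^γ, shrinking the loss contribution of confidently classified examples so that hard, minority-class examples dominate training.

```python
import math

def cross_entropy(p_true: float) -> float:
    """Standard cross-entropy for the true-class probability p_true."""
    return -math.log(p_true)

def focal_loss(p_true: float, gamma: float = 2.0, alpha: float = 0.25) -> float:
    """Focal Loss (Lin et al.): cross-entropy scaled by alpha * (1 - p_t)^gamma.

    The (1 - p_t)^gamma factor down-weights well-classified examples,
    so abundant (majority) species stop dominating the total loss.
    With gamma = 0 and alpha = 1 it reduces to plain cross-entropy.
    """
    return -alpha * (1.0 - p_true) ** gamma * math.log(p_true)

# An easy example (p_t = 0.9) is down-weighted far more strongly
# than a hard example (p_t = 0.1):
easy_ratio = focal_loss(0.9) / cross_entropy(0.9)  # alpha * 0.1**2  = 0.0025
hard_ratio = focal_loss(0.1) / cross_entropy(0.1)  # alpha * 0.9**2  = 0.2025
```

The hard example thus retains roughly 80 times more of its cross-entropy loss than the easy one, which is the effect exploited here to lift minority-species performance.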