Performance Analysis of Feature-Based Automated Measurement of Mouse Social Behaviors
Abstract
With advances in machine learning and video tracking, automated analysis of social behavior in mammals has become an increasingly attractive alternative to traditional manual annotation. In this work, we study how different feature types perform with different classifiers for automated analysis of mouse behavior. We conduct experiments on the Caltech Resident-Intruder Mouse (CRIM13) dataset, which provides two types of features: trajectory features and spatio-temporal features. With these features, we train AdaBoost and Random Decision Forest (TreeBagger) classifiers to recognize the different mouse behaviors and determine which features perform best with which classifier. The experimental results show that the trajectory features are more informative and yield better accuracy than the widely used spatio-temporal features, and that the AdaBoost classifier outperforms the TreeBagger on both feature types.
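The comparison described in the abstract can be prototyped compactly. The following is a minimal sketch, not the authors' actual pipeline: the feature matrices X_traj and X_st and the label vector y are random placeholders standing in for the CRIM13 trajectory features, spatio-temporal features, and per-frame behavior labels, and scikit-learn's RandomForestClassifier is used as a stand-in for MATLAB's TreeBagger.

```python
# Minimal sketch (not the authors' pipeline): cross-validated comparison of
# AdaBoost and a random forest (a stand-in for MATLAB's TreeBagger) on two
# feature sets. X_traj, X_st, and y are random placeholders for CRIM13
# trajectory features, spatio-temporal features, and per-frame labels.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_frames = 1000
X_traj = rng.normal(size=(n_frames, 28))   # placeholder trajectory features
X_st = rng.normal(size=(n_frames, 500))    # placeholder spatio-temporal features
y = rng.integers(0, 13, size=n_frames)     # CRIM13 annotates 13 behavior classes

classifiers = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=0),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for feat_name, X in (("trajectory", X_traj), ("spatio-temporal", X_st)):
    for clf_name, clf in classifiers.items():
        acc = cross_val_score(clf, X, y, cv=3).mean()
        print(f"{clf_name:12s} on {feat_name:16s} features: {acc:.3f}")
```

With the real CRIM13 features in place of the placeholders, per-behavior accuracy is the more informative report, since the frame-level class distribution is heavily skewed toward the background "other" label.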
References
1. F. de Chaumont, E. Ey, N. Torquet, T. Lagache, S. Dallongeville, A. Imbert, T. Legou, A.-M. Le Sourd, P. Faure, T. Bourgeron et al., “Live mouse tracker: real-time behavioral analysis of groups of mice,” bioRxiv, p. 345132, 2018.
2. P. K. Thanos, C. Restif, J. R. O'Rourke, C. Y. Lam, and D. Metaxas, “Mouse social interaction test (MoST): a quantitative computer automated analysis of behavior,” Journal of Neural Transmission, vol. 124, no. 1, pp. 3–11, 2017.
3. S. Belongie, K. Branson, P. Dollár, and V. Rabaud, “Monitoring animal behavior in the smart vivarium,” in Measuring Behavior, Wageningen, The Netherlands, 2005, pp. 70–72.
4. R. Ulrich, S. Dulaney, M. Arnett, and K. Mueller, “An experimental analysis of nonhuman and human aggression,” in Control of Aggression. Routledge, 2017, pp. 79–111.
5. G. Lavee, E. Rivlin, and M. Rudzsky, “Understanding video events: a survey of methods for automatic interpretation of semantic occurrences in video,” IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 39, no. 5, pp. 489–504, 2009.
6. Y. Nie, I. Ishii, K. Yamamoto, T. Takaki, K. Orito, and H. Matsuda, “High-speed video analysis of laboratory rats behaviors in forced swim test,” in IEEE International Conference on Automation Science and Engineering, 2008, pp. 206–211.
7. H. Ishii, M. Ogura, S. Kurisu, A. Komura, A. Takanishi, N. Iida, and H. Kimura, “Development of autonomous experimental setup for behavior analysis of rats,” in IEEE/RSJ International Conference on Intelligent Robots and Systems, 2007, pp. 4152–4157.
8. X. Xue and T. C. Henderson, “Video-based animal behavior analysis from multiple cameras,” in IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2006, pp. 335–340.
9. X. P. Burgos-Artizzu, P. Dollár, D. Lin, D. J. Anderson, and P. Perona, “Social behavior recognition in continuous video,” in IEEE Conference on Computer Vision and Pattern Recognition, 2012, pp. 1322–1329.
10. L. Giancardo, D. Sona, H. Huang, S. Sannino, F. Managò, D. Scheggia, F. Papaleo, and V. Murino, “Automatic visual tracking and social behaviour analysis with multiple mice,” PLoS ONE, vol. 8, no. 9, p. e74557, 2013.
11. W. Hong, A. Kennedy, X. P. Burgos-Artizzu, M. Zelikowsky, S. G. Navonne, P. Perona, and D. J. Anderson, “Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning,” Proceedings of the National Academy of Sciences, vol. 112, no. 38, pp. E5351–E5360, 2015.
12. M. Kabra, A. A. Robie, M. Rivera-Alba, S. Branson, and K. Branson, “JAABA: interactive machine learning for automatic annotation of animal behavior,” Nature Methods, vol. 10, no. 1, p. 64, 2013.
13. R. E. Schapire, Y. Freund, P. Bartlett, W. S. Lee et al., “Boosting the margin: A new explanation for the effectiveness of voting methods,” The Annals of Statistics, vol. 26, no. 5, pp. 1651–1686, 1998.
14. L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
15. Z. Khan, T. Balch, and F. Dellaert, “MCMC-based particle filtering for tracking a variable number of interacting targets,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 11, pp. 1805–1819, 2005.
16. H. Dankert, L. Wang, E. D. Hoopfer, D. J. Anderson, and P. Perona, “Automated monitoring and analysis of social behavior in Drosophila,” Nature Methods, vol. 6, no. 4, p. 297, 2009.
17. E. Eyjolfsdottir, S. Branson, X. P. Burgos-Artizzu, E. D. Hoopfer, J. Schor, D. J. Anderson, and P. Perona, “Detecting social actions of fruit flies,” in European Conference on Computer Vision (ECCV), 2014, pp. 772–787.
18. K. Branson, A. A. Robie, J. Bender, P. Perona, and M. H. Dickinson, “High-throughput ethomics in large groups of Drosophila,” Nature Methods, vol. 6, no. 6, p. 451, 2009.
19. H.-Y. Tsai and Y.-W. Huang, “Image tracking study on courtship behavior of Drosophila,” PLoS ONE, vol. 7, no. 4, p. e34784, 2012.
20. A. Iyengar, J. Imoehl, A. Ueda, J. Nirschl, and C.-F. Wu, “Automated quantification of locomotion, social interaction, and mate preference in Drosophila mutants,” Journal of Neurogenetics, vol. 26, no. 3-4, pp. 306–316, 2012.
21. A. Gomez-Marin, N. Partoune, G. J. Stephens, and M. Louis, “Automated tracking of animal posture and movement during exploration and sensory orientation behaviors,” PLoS ONE, vol. 7, no. 8, p. e41642, 2012.
22. K. J. Kohlhoff, T. R. Jahn, D. A. Lomas, C. M. Dobson, D. C. Crowther, and M. Vendruscolo, “The iFly tracking system for an automated locomotor and behavioural analysis of Drosophila melanogaster,” Integrative Biology, vol. 3, no. 7, pp. 755–760, 2011.
23. E. I. Fontaine, F. Zabala, M. H. Dickinson, and J. W. Burdick, “Wing and body motion during flight initiation in Drosophila revealed by automated visual tracking,” Journal of Experimental Biology, vol. 212, no. 9, pp. 1307–1323, 2009.
24. G. Card and M. Dickinson, “Performance trade-offs in the flight initiation of Drosophila,” Journal of Experimental Biology, vol. 211, no. 3, pp. 341–353, 2008.
25. F. W. Wolf, A. R. Rodan, L. T.-Y. Tsai, and U. Heberlein, “High-resolution analysis of ethanol-induced locomotor stimulation in Drosophila,” Journal of Neuroscience, vol. 22, no. 24, pp. 11035–11044, 2002.
26. E. Grant and J. Mackintosh, “A comparison of the social postures of some common laboratory rodents,” Behaviour, vol. 21, no. 3, pp. 246–259, 1963.
27. G. Gheusi, R.-M. Bluthé, G. Goodall, and R. Dantzer, “Social and individual recognition in rodents: methodological aspects and neurobiological bases,” Behavioural Processes, vol. 33, no. 1-2, pp. 59–87, 1994.
28. A. Arac, P. Zhao, B. H. Dobkin, S. T. Carmichael, and P. Golshani, “DeepBehavior: a deep learning toolbox for automated analysis of animal and human behavior imaging data,” Frontiers in Systems Neuroscience, vol. 13, p. 20, 2019.
29. Z. Zhang, Y. Yang, and Z. Wu, “Social behavior recognition in mouse video using agent embedding and LSTM modelling,” in Chinese Conference on Pattern Recognition and Computer Vision (PRCV), 2019, pp. 530–541.
30. K. Branson and S. Belongie, “Tracking multiple mouse contours (without too many samples),” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 1, 2005, pp. 1039–1046.
31. K. Branson, “Tracking multiple mice through severe occlusions,” Ph.D. dissertation, UC San Diego, 2007.
32. H. Pistori, V. V. V. A. Odakura, J. B. O. Monteiro, W. N. Gonçalves, A. R. Roel, J. de Andrade Silva, and B. B. Machado, “Mice and larvae tracking using a particle filter with an auto-adjustable observation model,” Pattern Recognition Letters, vol. 31, no. 4, pp. 337–346, 2010.
33. P. Dollár, V. Rabaud, G. Cottrell, and S. Belongie, “Behavior recognition via sparse spatio-temporal features,” in 2nd Joint IEEE International Workshop on Visual Surveillance and Performance Evaluation of Tracking and Surveillance, 2005, pp. 65–72.
34. C. T. Hsu, P. Dollár, D. Chang, and A. D. Steele, “Daily timed sexual interaction induces moderate anticipatory activity in mice,” PLoS ONE, vol. 5, no. 11, p. e15429, 2010.
35. H. Jhuang, E. Garrote, X. Yu, V. Khilnani, T. Poggio, A. D. Steele, and T. Serre, “Automated home-cage behavioural phenotyping of mice,” Nature Communications, vol. 1, p. 68, 2010.
36. E. Kyzar, S. Gaikwad, A. Roth, J. Green, M. Pham, A. Stewart, Y. Liang, V. Kobla, and A. V. Kalueff, “Towards high-throughput phenotyping of complex patterned behaviors in rodents: focus on mouse self-grooming and its sequencing,” Behavioural Brain Research, vol. 225, no. 2, pp. 426–431, 2011.
37. F. De Chaumont, R. D.-S. Coura, P. Serreau, A. Cressant, J. Chabout, S. Granon, and J.-C. Olivo-Marin, “Computerized video analysis of social interactions in mice,” Nature Methods, vol. 9, no. 4, p. 410, 2012.
38. H. Wang, M. M. Ullah, A. Klaser, I. Laptev, and C. Schmid, “Evaluation of local spatio-temporal features for action recognition,” in British Machine Vision Conference (BMVC), 2009, pp. 124.1–124.11.
39. P. Viola and M. Jones, “Rapid object detection using a boosted cascade of simple features,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 1, 2001, pp. I-511–I-518.
