Journal Article | Open Access | Published November 30, 2021
Supplemental Material + Submitted + Published

The Mouse Action Recognition System (MARS) software pipeline for automated analysis of social behaviors in mice


The study of naturalistic social behavior requires quantification of animals' interactions. This is generally done through manual annotation—a highly time-consuming and tedious process. Recent advances in computer vision enable tracking the pose (posture) of freely behaving animals. However, automatically and accurately classifying complex social behaviors remains technically challenging. We introduce the Mouse Action Recognition System (MARS), an automated pipeline for pose estimation and behavior quantification in pairs of freely interacting mice. We compare MARS's annotations to human annotations and find that MARS's pose estimation and behavior classification achieve human-level performance. We also release the pose and annotation datasets used to train MARS to serve as community benchmarks and resources. Finally, we introduce the Behavior Ensemble and Neural Trajectory Observatory (BENTO), a graphical user interface for analysis of multimodal neuroscience datasets. Together, MARS and BENTO provide an end-to-end pipeline for behavior data extraction and analysis in a package that is user-friendly and easily modifiable.

Additional Information

© 2021, Segalin et al. This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

Preprinted: 27 July 2020; Received: 04 October 2020; Accepted: 14 October 2021; Published: 30 November 2021.

Acknowledgments: We are grateful to Grant Van Horn for providing the original TensorFlow implementation of the MSC-MultiBox detection library, Matteo Ronchi for his pose error diagnosis code, and Mark Zylka for providing the Cul3 and Chd8 mouse lines.

Funding: Research reported in this publication was supported by the National Institute of Mental Health of the National Institutes of Health under Award Numbers R01MH123612 and 5R01MH070053 (DJA), K99MH108734 (MZ), and K99MH117264 (AK), and by the Human Frontier Science Program (TK), the Helen Hay Whitney Foundation (AK), the Simons Foundation Autism Research Initiative (DJA), the Gordon and Betty Moore Foundation (PP), and a gift from Liying Huang and Charles Trimble (to PP). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Data availability: All data used to train and test MARS are hosted by the Caltech Library at data.caltech.edu. MARS code is publicly available on GitHub:

- Core end-user version of MARS: http://github.com/neuroethology/MARS
- Code for training new MARS models: http://github.com/neuroethology/MARS_Developer
- BENTO interface for browsing data: http://github.com/neuroethology/bentoMAT

Ethics: All experimental procedures involving the use of live animals or their tissues were performed in accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the National Institutes of Health. All animals were handled according to Institutional Animal Care and Use Committee (IACUC) protocols (IA18-1552); the protocol was approved by the Institutional Biosafety Committee at the California Institute of Technology (Caltech).

Attached Files

Published - elife-63720-v1.pdf

Submitted - 2020.07.26.222299v2.full.pdf

Supplemental Material - elife-63720-transrepform1-v1.docx

Supplemental Material - elife-63720-video1.mp4

Supplemental Material - elife-63720-video2.mp4


Files: 24.4 MB total

Additional details

August 20, 2023
October 23, 2023