Published March 19, 2025
Journal Article | Open Access

Parsing social context in auditory forebrain of male zebra finches

  • 1. California Institute of Technology
  • 2. Princeton University
  • 3. University of Maryland, College Park
  • 4. University of Massachusetts Amherst

Abstract

Understanding the influence of natural behavioral context on neural activity requires studying awake-behaving animals. Microdrive devices facilitate bridging behavior and physiology to examine neural dynamics across behavioral contexts. Impediments to long-term single-unit recordings in awake-behaving animals include tradeoffs between weight, functional flexibility, expense, and fabrication difficulty in microdrive devices. We describe a straightforward and low-cost method to fabricate versatile and lightweight microdrives that remain functional for months in awake-behaving zebra finches (Taeniopygia guttata). The vocal and gregarious nature of zebra finches provides an opportunity to investigate neural representations of social and behavioral context. Using microdrives, we report how auditory responses in an auditory association region of the pallium are modulated by two naturalistic contexts: self- vs. externally-generated song (behavioral context), and solitary vs. social listening (social context). While auditory neurons exhibited invariance across behavioral contexts, response strength and stimulus selectivity were greater in the social condition than in the solitary condition. We also report stimulus-specific correlates of audition in local field potentials. Using a versatile, lightweight, and accessible microdrive design for small animals, we find that the auditory forebrain represents social but not behavioral context in awake-behaving animals.
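The abstract compares stimulus selectivity between listening conditions (see also S1 File, which reports selectivity index values per neuron). The record does not give the paper's exact formula, but one common formulation is the activity-fraction (sparseness-based) selectivity index, shown here as an illustrative sketch only, not as the authors' method:

```python
import numpy as np

def selectivity_index(mean_rates):
    """Activity-fraction selectivity index over a neuron's mean firing
    rates to a set of stimuli. Returns 0 when the neuron responds
    equally to all stimuli, and approaches 1 when it responds to only
    one stimulus. This is a generic formulation, assumed for
    illustration; the paper may use a different index."""
    r = np.asarray(mean_rates, dtype=float)
    n = len(r)
    # Ratio of (mean response)^2 to mean of squared responses
    numerator = (r.sum() / n) ** 2
    denominator = (r ** 2).sum() / n
    if denominator == 0:
        return 0.0  # silent neuron: treat as unselective
    # Normalize so the index spans [0, 1] regardless of stimulus count
    return (1 - numerator / denominator) / (1 - 1 / n)

# A neuron responding equally to all stimuli is unselective:
print(selectivity_index([10, 10, 10, 10]))  # 0.0
# A neuron responding to only one stimulus is maximally selective:
print(selectivity_index([20, 0, 0, 0]))     # 1.0
```

Comparing per-neuron index values between the solitary and social conditions would then quantify the selectivity difference the abstract describes.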

Copyright and License

© 2025 Pollak et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding

This work was supported by NIH training grant T32 NS105595, awarded to MM; NIH grant R01 NS082179, awarded to LRH; and a University of Massachusetts Amherst Honors Research Assistant Fellowship and a University of Massachusetts Honors Research Grant, both awarded to DP. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

All raw and processed files are available from the CaltechDATA repository (https://doi.org/10.22002/h4r55-cdk54).

Acknowledgement

We thank B. Kaminska and T. Inbar for help implementing and adapting microdrive designs. We are grateful to H. Boyd, M. Manfredi, A. McGrath, M. Senft, V. Indeglia, Z. Alam, and A. Cashel for help with animal husbandry, and to H. Boyd for help with histology. We thank J. Bois for help with analysis, J. Spool and M. Fernandez-Peters for help with surgery and histology, and J. Bergman, G. Cormier, and C. Hrasna for their patient help with numerous small projects. Many thanks to K. Schroeder for help with statistical modeling in R.

Supplemental Material

Supplemental material includes:
  • S1 Fig.svg
  • S1 File.pdf
  • S1 Table.docx
  • S2 Fig.tif

S1 Fig. Filtered multiunit activity from microdrive design. Top: a one-second snippet of 9 of 16 channels. Spikes are denoted with colored triangles; different colors denote different units. Bottom: a 400 ms subset of the snippet above.
S2 Fig. Summary stimulus-triggered raster plot of all units for all stimuli. Each dot is a spike, each row is a stimulus presentation, and each block of rows is a unit. Red rasters are from the solitary condition; blue rasters are from the social condition.
S1 Table. Stimulus presentations for each experiment. Number of stimulus presentations for each experiment.
S1 File. Selectivity index values for each neuron for solitary and social audition.

Code Availability

Experiments were conducted using code from the ephysSuite GitHub repository (www.github.com/healeylab/ephysSuite). Analysis was conducted using the KS-Analysis GitHub repository (www.github.com/healeylab/KS-Analysis). Figures were generated using Python Jupyter notebooks, also available on GitHub (www.github.com/healeylab/Pollaketal2024).

Files

journal.pone.0314795.pdf
Files (4.4 MB total):
  • md5:19ff9b0f0b49f179337bee6fba930181 (3.2 MB)
  • md5:46846a6dc9b5f8edcd24d73b64f57f5c (1.2 MB)

Additional details

Created: March 25, 2025
Modified: March 25, 2025