From Dopamine to Song: An Analytical Pipeline
This repository investigates songbird vocalizations and the potential role of dopamine in maintaining them.
Here, we provide a comprehensive analytical pipeline that:
- tracks and characterizes changes in learned song components;
- pinpoints the relationship between dopaminergic signal manipulation and song maintenance;
- offers a workflow for integrating photometry and song data; and
- applies statistical and modeling approaches to understand their relationship.
While initially developed for a dopamine-focused manuscript (citation pending), we believe this analytical pipeline offers a valuable starting point for researchers studying songbird behavior and neural activity in broader contexts. We include step-by-step guides detailing how photometry and song data are wrangled and integrated, and we hope these tutorials facilitate further exploration of the neural mechanisms underlying vocal learning and maintenance.
Repo Organization
```
├── 01-Photometry_Preprocessing.ipynb -> Workflow tutorial to generate processed photometry data
├── 02-Photometry_Manual_Alignment.ipynb -> Alignment of photometry data with behavioral references
├── 03-Photometry_Modeling.ipynb -> Generate time-warping models of photometry data
├── 04-Variable_Classification.ipynb -> Feature selection and information-theory analysis
├── 05-Song_Analysis.ipynb -> Visualization of song changes after dopamine manipulation
├── README.md -> The top-level README for users
├── binder
│ ├── DESCRIPTION -> General scope and goals of the project
│ ├── apt.txt -> List of Ubuntu (apt) packages required to build the environment
│ ├── install.R -> List of R packages used
│ ├── postBuild -> Script to create the "d2s" conda environment for mybinder
│ ├── requirements.txt -> List of Python packages used
│ ├── runtime.txt -> R version and image date
├── data
│ ├── InfoTheo.xlsx -> Input file: data for notebook_04
├── example_NPM_data -> Input folder for notebook_01: raw data recorded via Bonsai
├── example_npy -> Output folder from notebook_01 (tutorial)
├── example_rds -> Output folder from notebook_01 (tutorial)
├── example_song_data -> Input folder for notebook_05: prepared song feature data from one bird
├── results
│ ├── npy -> Input folder for notebook_03: processed photometry data from singing and non-singing epochs
│ ├── rds -> Input folder for notebook_02: processed metadata, photometry, and song annotation data
├── scripts
│ ├── align.R -> Functions to align raw or transient photometry data
│ ├── comb_ZdF2F.R -> Functions to combine processed fiber photometry data
│ ├── get_transient.R -> Functions to extract photometry transients
│ ├── get_zdF2F.R -> Functions to get fiber photometry data from a single file
│ ├── infotheo.R -> Functions to perform variable classification
│ ├── plot.R -> Functions to plot in IRkernel
│ ├── plot_utils.py -> Functions to plot in IPykernel
│ ├── stat.R -> Functions to perform statistical analysis
```
Getting started
Option 1: Running Notebooks on Mybinder
Click the Launch binder badge above to launch the notebooks in a web browser with all the necessary dependencies pre-installed. Please be patient while JupyterLab is loading; it may take a few minutes to build the container image and launch JupyterLab on the cloud server.
Note: While the Mybinder service provides a convenient way to interact with the notebooks, it may not be suitable for handling large data files. Some example sections that take minutes to execute on Mybinder should run instantly on a local computer setup.
Option 2: Running Notebooks Locally or on HPC Environments
We have built a container image that includes all the necessary dependencies to run the notebooks in this repo. This guide shows you how to pull the container image from the UTSW GitLab registry and run the notebooks on your local machine or in an HPC environment.
1. Install Singularity
For local users: Install Singularity on your machine by following the official documentation at the Singularity Installation Guide. **Note:** The guide provides instructions for various operating systems (Linux, macOS, Windows); make sure to choose the instructions appropriate for your system.
For HPC users: On your HPC environment, use the following commands to check whether Singularity modules are available on the web-based visualization/GUI node:

```bash
module avail singularity
# If "module avail singularity" lists available Singularity modules, load one of them
module load singularity/<version>
```

Replace `<version>` with the specific Singularity version you want to use (as listed by `module avail singularity`).
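For example, on a cluster whose module list includes a (hypothetical) singularity/3.9.9 entry, the sequence would look like this:

```bash
module load singularity/3.9.9   # hypothetical version; pick one listed on your cluster
singularity --version           # confirm the module loaded correctly
```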
2. Pull the Container Image
Once Singularity is installed/loaded, you can pull the container image containing the Jupyter notebooks and environment. Navigate to the directory where you want to store the container image:
```bash
cd path/to/the/image/directory
mkdir d2s && cd d2s
```
For platforms with Intel chips:

```bash
singularity pull docker://git.biohpc.swmed.edu:5050/xlei/da2song/d2s:1.0.3
```
For platforms with Apple silicon chips (note: this container image is still under development):

```bash
singularity pull docker://git.biohpc.swmed.edu:5050/xlei/da2song/d2s:1.1.beta
```
This command will create a Singularity Image Format (SIF) file in your local 'd2s' folder, e.g., 'd2s_1.0.3.sif'. You will not need to run this command again if the same container image is already present in your local directory.
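To confirm that the pull succeeded, you can list the file and, optionally, inspect the image metadata; this is a quick sanity check, not a required step:

```bash
ls -lh d2s_1.0.3.sif                # the SIF file should appear in the d2s folder
singularity inspect d2s_1.0.3.sif   # optional: print the image's labels/metadata
```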
3. Run the Container
To run the container for the first time, or to reload the original files from this repo, use this command:

```bash
singularity run --no-home d2s_1.0.3.sif
```
Note:
- This command runs the container, mounts files from this repo to your local 'd2s' directory, and opens JupyterLab. Typically, it generates a URL like http://127.0.0.1:8888/lab?token=.... Clicking this URL opens JupyterLab.
- Use Ctrl + C in the terminal to stop the container (this shortcut applies on Windows, Linux, and macOS alike).
- After shutting down the container, all mounted files remain in your local directory, preserving any changes made to the notebooks. You can load your own data to the "d2s" directory, run the container, and generate your own results.
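If your data live somewhere other than your working directory, Singularity's standard --bind flag can make them visible inside the container. A minimal sketch, assuming placeholder paths (neither path is a convention of this repo):

```bash
# Mount an external data folder into the container alongside the repo files;
# /path/on/host and /mnt/my_data are placeholder paths, not repo conventions.
singularity run --no-home --bind /path/on/host:/mnt/my_data d2s_1.0.3.sif
```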
To run the container again, use this command:
```bash
singularity exec --no-home d2s_1.0.3.sif /bin/bash -c "source /DA2Song/entrypoint.sh --no-cp && jupyter lab --ip=0.0.0.0 --port=8888 --allow-root --LabApp.default_url=/lab"
```
Note: The "--no-cp" flag prevents copying original notebooks and other files to your local directory, preserving any previous changes you made in the notebooks or any data you loaded to the "d2s" directory.