Welcome to Stytra’s documentation!

Stytra is a package to build and run behavioral experiments.

Installation guide

Stytra was developed and tested using Python 3.7 installed as part of the Anaconda Python distribution. Other Python versions have not been tested. Make sure you have the latest version of Anaconda installed before proceeding with the installation. Installation with custom Python environments, Miniconda, or Anaconda virtual environments is possible but may cause dependency issues. The following instructions have been tested and work for an installation in the Anaconda root environment.

Installing stytra

Stytra relies on opencv for some of its fish tracking functions. If you don’t have it installed, open the Anaconda prompt and type:

conda install opencv

If you are using Windows, git (used for tracking software versions) might not be installed. Git can also be easily installed with conda:

conda install git

This should be everything you need to prepare before installing stytra.

Note

PyQt5 is not listed as an explicit requirement because it should come with the Anaconda package. If you are not using Anaconda, make sure you have it installed and updated before installing Stytra!

The simplest way to install Stytra is with pip:

pip install stytra

You can test the installation by running one of the examples in the stytra examples folder. To run a simple looming-stimulus experiment, you can type:

python -m stytra.examples.looming_exp

If the GUI opens correctly and pressing the play button starts the stimulus: congratulations, installation was successful! If it crashes, check if you have all dependencies correctly installed. If it still does not work, open an issue on the Stytra github page.

Editable installation

If instead you want to modify the internals of stytra or use unreleased features, clone or download stytra from GitHub and install it with:

pip install path_to_stytra/stytra

If you want to be able to change the stytra code and use the changed version, install using the -e argument:

pip install -e path_to_stytra/stytra

Now you can have a look at the stytra Examples gallery, or you can start Configuring a computer for Stytra experiments. In the second case, you might want to have a look at the camera APIs section below first.

Note

Stytra might raise an error after quitting because of a bug in the current version of pyqtgraph (a package we are using for online plotting). If you are annoyed by the error messages when closing the program, you can install the development version of pyqtgraph from their github repository. The problem will be resolved once the next pyqtgraph version is released.

Installing camera APIs

xiCam: Ximea

Download the Ximea SDK software package for your operating system. During the installation wizard, make sure that you select the Python API checkbox. After installation, copy the Python wrapper API (in the folder where you installed XIMEA, …\XIMEA\API\Python\v3\ximea) into the Python site-packages folder (for Anaconda, usually …\anaconda3\Lib\site-packages).

pymba: AVT

Go to the Allied Vision software webpage and download and install the Vimba SDK. Then install the python wrapper pymba. You can install it from source:

pip install git+https://github.com/morefigs/pymba.git

or, if you are using 64-bit Windows, you can grab the installation file from here. Open the terminal in the folder where you downloaded it and install:

pip install pymba-0.1-py3-none-any.whl

spinnaker: Point Grey / FLIR

Go to the FLIR support website and download the SDK and the Python API.

  1. Install the SDK by choosing the camera and OS and then downloading,
    e.g., Spinnaker 1.15.0.63 Full SDK - Windows (64-bit) — 07/27/2018 - 517.392MB, or the equivalent for your operating system.
  2. Install the Python module:
    pip install "path_to_extracted_zip/spinnaker_python-1.15.0.63-cp36-cp36m-win_amd64.whl"

(using the wheel file matching your OS and Python version)

National Instruments framegrabber with Mikrotron camera

Install the NI Vision SDK. For the Mikrotron MC1362 camera, you can use this camera file. The camera file usually needs to be put into C:\Users\Public\Public Documents\National Instruments\NI-IMAQ\Data. After putting the camera file there, it should be selected for the image acquisition device in NI MAX.

Stimulation

One of the main purposes of stytra is to provide a framework to design and run sequences of stimuli to be presented to the fish.

Stimuli and Protocols in stytra

The Stimulus class constitutes the building block for an experiment in stytra. A sequence of Stimuli is bundled together and parameterized by the Protocol class. See Create stimulus sequence for a description of how to create a protocol in stytra.

The ProtocolRunner class is used to keep track of time and set the Stimuli in the Protocol sequence at the proper pace.
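
As a minimal illustration of how these classes fit together, the sketch below defines a Protocol with a pause followed by a full-field flash and launches it with Stytra. The import paths (stytra.Protocol, stytra.Stytra, stytra.stimulation.stimuli.Pause and FullFieldVisualStimulus) should be checked against the Examples gallery. The snippets in the next section are get_stim_sequence methods of such Protocol subclasses and assume that numpy (as np), pandas (as pd) and the relevant stimulus classes have already been imported.

# Minimal Protocol sketch: a pause followed by a full-field flash.
from stytra import Stytra, Protocol
from stytra.stimulation.stimuli import Pause, FullFieldVisualStimulus


class FlashProtocol(Protocol):
    name = "flash_protocol"  # name used in the GUI and in the saved metadata

    def get_stim_sequence(self):
        # The ProtocolRunner displays these stimuli one after the other
        return [
            Pause(duration=4.0),
            FullFieldVisualStimulus(duration=1.0, color=(255, 255, 255)),
        ]


if __name__ == "__main__":
    # Instantiating Stytra with the protocol opens the GUI and the ProtocolRunner
    st = Stytra(protocol=FlashProtocol())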

Stimuli examples

Full-field luminance

    def get_stim_sequence(self):
        lum = pd.DataFrame(dict(t=[0, 1, 2], luminance=[0.0, 1.0, 0.0]))
        return [
            DynamicLuminanceStimulus(df_param=lum, clip_mask=(0.0, 0.0, 0.5, 0.5)),
            DynamicLuminanceStimulus(df_param=lum, clip_mask=(0.5, 0.5, 0.5, 0.5)),
        ]

Gratings

    def get_stim_sequence(self):
        Stim = type("stim", (InterpolatedStimulus, GratingStimulus), dict())
        return [
            Stim(df_param=pd.DataFrame(dict(t=[0, 2], vel_x=[10, 10], theta=np.pi / 4)))
        ]

OKR inducing rotating windmill stimulus

    def get_stim_sequence(self):
        Stim = type(
            "stim", (InterpolatedStimulus, WindmillStimulus), {}  # order is important!
        )
        return [Stim(df_param=pd.DataFrame(dict(t=[0, 2, 4], theta=[0, np.pi / 8, 0])))]

Seamlessly-tiled image

    def get_stim_sequence(self):
        Stim = type("stim", (SeamlessImageStimulus, InterpolatedStimulus), {})
        return [
            Stim(
                background="caustics.png",
                df_param=pd.DataFrame(dict(t=[0, 2], vel_x=[10, 10], vel_y=[5, 5])),
            )
        ]

Radial sine (freely-swimming fish centering stimulus)

    def get_stim_sequence(self):
        return [RadialSineStimulus(duration=2, period=10, velocity=5)]

Configuring a computer for Stytra experiments

By default, Stytra checks the user folder (on Windows usually C:/Users/user_name, ~ on Unix-based systems) for the stytra_setup_config.json file. You can put default settings for the current computer in it, specifying, e.g., the saving format, camera type and ROI, full-screen stimulus display, and anything else that can be specified when instantiating Stytra.

An example is provided below:

stytra_setup_config.json

{
    "display": {"full_screen": true},
    "dir_save": "J:/_Shared/experiments",
    "dir_assets": "J:/_Shared/stytra_resources",
    "log_format": "hdf5",
    "camera": {"type": "ximea", "rotation": -1, "roi": [0, 0, 784, 784]},
    "tracking": {"method": "fish"},
    "embedded": false
}
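
Since this file mirrors what can be passed when instantiating Stytra, the same settings can also be supplied directly in a script. The sketch below assumes that the keyword arguments match the JSON keys above and reuses the hypothetical FlashProtocol from the Stimulation section:

# Hedged sketch: keyword arguments assumed to mirror the JSON keys above.
from stytra import Stytra

st = Stytra(
    protocol=FlashProtocol(),  # hypothetical protocol from the Stimulation section
    display=dict(full_screen=True),
    camera=dict(type="ximea", rotation=-1, roi=[0, 0, 784, 784]),
    dir_save="J:/_Shared/experiments",
)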

Data and metadata saving

Data saving classes in Stytra

All streaming data (tracking, stimulus state) is collected by subclasses of the Accumulator. Accumulators collect named tuples of data points together with their timestamps. If the data format changes, the accumulator resets.

All other data (animal metadata, configuration information, GUI state, etc.) is collected inside the Experiment class via the DataCollector.

Configuring Stytra to update an external database

In addition to the JSON file, the metadata can be saved to a database, such as MongoDB. For this, an appropriate database class has to be created and passed to the Stytra class. This example uses PyMongo.

Example:

from stytra.utilities import Database, prepare_json
import pymongo


class PortuguesDatabase(Database):
    def __init__(self):
        # in the next line you have to put in the IP address and port of the
        # MongoDB instance
        self.client = pymongo.MongoClient("mongodb://192.???.???.???:????")
        # the database and collection are created in MongoDB before
        # the first use
        self.db = self.client.experiments
        self.collection = self.db["experiments"]

    def insert_experiment_data(self, exp_dict):
        """ Puts a record of the experiment in the default lab MongoDB database

        :param exp_dict: a dictionary from the experiment data collector
        :return: the database id of the inserted item
        """

        # we use the prepare_json function to clean the dictionary
        # before inserting into the database

        db_id = self.collection.insert_one(
            prepare_json(exp_dict, eliminate_df=True)
        ).inserted_id
        return str(db_id)
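
The database instance can then be passed when launching the experiment. The keyword name database in the sketch below is an assumption based on the description above, and FlashProtocol is again the hypothetical protocol from the Stimulation section:

# Hedged usage sketch: pass an instance of the database class to Stytra.
from stytra import Stytra

st = Stytra(protocol=FlashProtocol(), database=PortuguesDatabase())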

Calibration

Positioning and calibrating the monitor for visual stimuli

To calibrate the monitor for your experiment, first position the black stimulus screen on the monitor you are using for the experiment. Then, hit the show calibration button and drag around the ROI in the stytra GUI until the red rectangle covers the area you want to use for the stimulus and the cross is at the center. Finally, specify in the spin box the final size of the lateral edge of the calibrator in centimeters.

The calibration is saved in the last_stytra_config.json file, so once you have done it, the same calibration is kept for all subsequent experiments.

Calibration of the camera and monitor

To calibrate the camera image to the displayed image, the Circle Calibrator is used (it is enabled automatically for freely-swimming experiments).

[Screenshot: freely-swimming tracking]

After Stytra starts, turn off the IR illumination and remove the IR filter in front of the camera. Then, click the display calibration pattern button (a) and move the display window on the projector so that the 3 dots are clearly visible. Sometimes the camera exposure has to be adjusted as well (b) so that all 3 dots are visible. Due to screen or projector framerates, usually setting the camera framerate to 30 and the exposure to 10ms works well.

Then, click calibrate (c) and verify that the location of the camera image in the projected image makes sense. If not, try adjusting camera settings and calibrating again.

Triggering a Stytra protocol

Stytra is designed to be used in setups where the presentation of stimuli to the animal needs to be synchronized with an acquisition program running on a different computer, e.g. controlling a two-photon microscope. To this end, the triggering module provides classes to ensure communication with external devices to time the beginning of the experiment. Two methods are already supported in the triggering library:

TTL pulse triggering on a Labjack/NI board and serial ports.

In the first, simple configuration, Stytra waits for a TTL pulse received on a LabJack or NI board to start the experiment.

ZeroMQ

Stytra employs the ZeroMQ library to synchronize the beginning of the experiment through a message coming from the acquisition computer over the local network. ZeroMQ is supported in a number of programming languages and environments including LabView, and the exchange of the synchronizing message can easily be added to custom-made or open-source software. The messages can also be used to communicate to Stytra data such as the microscope configuration that will be logged together with the rest of experiment metadata.

A common framework to build custom software for hardware control is LabView. In our laboratory, a LabView program is used to control the scanning from the two-photon microscope. Below we report a screenshot of a very simple subVI that can be used together with Stytra for triggering the start of the stimulation. A ZMQ context is created and then used to send a JSON message with the microscope configuration over the network to the computer running Stytra, identified by its IP address. By default, Stytra listens for triggering messages on port 5555.

[Figure: LabView subVI for triggering Stytra over ZMQ]
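
If the acquisition software is written in Python, the same message can be sent with pyzmq. The minimal sketch below assumes a REQ/REP socket pair on Stytra's default port 5555; the IP address and the JSON fields are placeholders:

# Minimal pyzmq sketch of the acquisition-side trigger message (the REQ/REP
# pattern and the message fields are assumptions; only the default port 5555
# comes from the description above).
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://192.168.0.100:5555")  # placeholder IP of the Stytra computer

# Send the microscope configuration as JSON; Stytra logs it with the metadata
socket.send_json({"lsm": {"frame_rate": 2.0, "n_planes": 30}})
socket.recv()  # wait for Stytra's acknowledgement, then start the acquisition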

Additional methods

The triggering module is also designed to be extensible. It is possible to define new kinds of triggers, which consist of a process that continuously checks a condition. To define a new trigger, e.g. starting the acquisition when a new file is created in a folder, it is enough to write a method that uses the Python standard library to monitor the folder contents, as sketched below.
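
For example, a folder-watching trigger could be sketched as below. This assumes that stytra.triggering exposes a Trigger base class whose check_trigger method is polled by the triggering process; verify the actual interface in the module before relying on it.

from pathlib import Path

from stytra.triggering import Trigger  # assumed base class, see the module


class NewFileTrigger(Trigger):
    """Fires when a new file appears in the watched folder."""

    def __init__(self, watched_folder):
        super().__init__()
        self.folder = Path(watched_folder)
        self.known_files = set(self.folder.glob("*"))

    def check_trigger(self):
        # Return True as soon as a file that was not there before appears
        current_files = set(self.folder.glob("*"))
        new_files = current_files - self.known_files
        self.known_files = current_files
        return len(new_files) > 0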

Parameters in stytra

Various aspects of the experiments are parametrised using the lightparam package. For basic use patterns you can refer to the README of the package.
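
As a hedged illustration (the Param signature, including the limits argument, and the way values are read back should be checked against the lightparam README), protocol parameters are typically declared as Param attributes so that they become editable from the protocol settings dialog:

from lightparam import Param
from stytra import Protocol
from stytra.stimulation.stimuli import Pause, FullFieldVisualStimulus


class PeriodicFlashProtocol(Protocol):
    name = "periodic_flash"

    def __init__(self):
        super().__init__()
        # Param attributes appear in the protocol settings dialog of the GUI
        self.period_sec = Param(5.0, limits=(0.2, None))
        self.flash_duration = Param(1.0, limits=(0.0, None))

    def get_stim_sequence(self):
        # Parameter values are read back as plain numbers when building stimuli
        return [
            Pause(duration=self.period_sec - self.flash_duration),
            FullFieldVisualStimulus(
                duration=self.flash_duration, color=(255, 255, 255)
            ),
        ]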

A note on coordinate systems in Stytra

Stytra follows the common convention for displaying images on screens: the x axis increases to the right and the y axis increases downward, with (0,0) being the upper left corner. The same holds for the recorded coordinates. The angles correspondingly increase clockwise.
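
For example, converting a tracked point and heading angle to a conventional mathematical frame (y up, angles counterclockwise) only requires flipping the y axis and negating the angle; a small helper (not part of stytra) could look like this:

import numpy as np


def to_math_frame(x, y, theta, frame_height):
    """Convert from Stytra's image convention (origin top-left, y down,
    angles clockwise) to a math convention (y up, angles counterclockwise)."""
    return x, frame_height - y, -theta % (2 * np.pi)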

Hardware description

Below we provide a description of the setups in use in the lab together with a full list of components that can be used to assemble them.

Two configurations of our setups are described: the first one is for detailed kinematic tracking of eyes and tail in a fish with head restrained in agarose, the second for tracking freely swimming fish in a petri dish.

Finally, we present a cheap version of the behavioral setup that can be built for about 700 euros and assembled using cardboard, laser-cut parts, or other custom-made enclosures.

Head-restrained fish setup

This configuration requires high magnification, provided by a 50 mm macro objective. On the other hand, illumination needs to cover only a very small field and can be accomplished with a single IR LED.

[Figure: head-restrained setup]

Freely swimming fish setup

This configuration uses a camera with a larger field of view and a custom-built LED box to illuminate a large area homogeneously.

[Figure: freely-swimming setup]

List of components

Below we provide a list of all components required for building the two setups. Indicative prices in euros (Jul 2018) and links to supplier pages are provided as well.

Note

Many parts of the setup, such as the base, the stage and the holders can easily be replaced with custom solutions.


Head-restrained setup

Components for embedded configuration behavioral setup
  Component Manufacturer Part No Specs Amount Price (euros) Link Notes Replacement
Breadboard
1.1 Aluminum breadboard Thorlabs MB3045/M   1 170 https://www.thorlabs.com/thorproduct.cfm?partnumber=MB3045/M
1.2 Feet Thorlabs AV4/M   1 20.86 https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6421    
Rail                  
2.1 Construction rail Thorlabs XT66-750   1 80 https://www.thorlabs.com/thorProduct.cfm?partNumber=XT66-750    
2.2 Construction rail mount Thorlabs XT66P1   1 32.25 https://www.thorlabs.com/thorproduct.cfm?partnumber=XT66P1    
2.3 Rail carriage Thorlabs XT66P2/M   2 62.25 https://www.thorlabs.com/thorproduct.cfm?partnumber=XT66P2/M    
2.4 Post holder Thorlabs PH20/M   2 6.33 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH20/M    
Stage                  
3.1 Post Thorlabs TR250/M   4 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR250/M can be smaller for embedded prep  
3.2 Post holder Thorlabs PH75/M   4 7.44 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH75/M#ad-image-0    
3.3 Acrylic stage custom 133555   1 10 https://www.modulor.de/acrylglas-gs-transparent-farblos-6-00-x-250-x-500-mm.html
3.4 Screen custom              
Projector                  
4.1 Projector Asus P3E   1 534 https://www.asus.com/us/Projectors/P3E/    
4.2 Display cable (HDMI) Any              
4.3 Post Thorlabs TR40/M   1 4.52 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR40/M    
4.4 Post holder Thorlabs PH40/M   1 6.56 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH40/M#ad-image-0    
Camera                  
5.1 Camera Ximea MQ013MG-ON Python 1300 1 580 https://www.ximea.com/products/usb3-vision-cameras-xiq-line/mq013mg-on   PointGrey Blackfly S Mono 0.4 MP USB3 Vision (Sony IMX287) https://eu.ptgrey.com/blackfly-s-mono-04-mp-usb3-vision-sony-imx287
5.2 Camera cable e.g. Ximea CBL-U3-3M0 USB 3, 3m passive 1 17 https://www.ximea.com/en/products/usb3-vision-compliant-cameras-xiq/xiq-usb-30-accessories/1-m-usb-30-passive-cable    
5.3 Camera holder custom     1 0      
5.4 Post Thorlabs TR75/M   1 4.93 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR75/M    
5.5b Camera objective Navitar TC.5028 C mount 1 590 https://navitar.com/products/imaging-optics/telecentric/video-telecentric/    
IR filter                  
6.1 Post Thorlabs TR75/M   1 4.93 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR75/M    
6.2 Lens mount Thorlabs LMR2/M   1 23 https://www.thorlabs.com/thorproduct.cfm?partnumber=LMR2/M    
6.3 IR filter Edmund Optics 66-106 830 nm LP 1 69 https://www.edmundoptics.de/optics/optical-filters/longpass-edge-filters/rg-830-50mm-dia.-longpass-filter/    
Illumination                  
7.1 Power supply Conrad ESPS-1500 Voltcraft 1 15 https://www.conrad.com/ce/en/product/1380523/Mains-PSU-adjustable-voltage—–VOLTCRAFT—–ESPS-1500—–3-Vdc-45-Vdc-5-Vdc    
7.2b Post Thorlabs TR150/M   1 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR150/M    
7.3b Post Thorlabs TR50/M   2 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR50/M    
7.4b Post holder Thorlabs PH40/M   1 6.56 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH40/M#ad-image-0    
7.5b Right-angle clamp Thorlabs RA90/M   2 8.87 https://www.thorlabs.com/thorProduct.cfm?partNumber=RA90/M    
7.6b Lens mount Thorlabs LMR2/M   1 23 https://www.thorlabs.com/thorproduct.cfm?partnumber=LMR2/M    
7.7b cold mirror Edmund Optics #64-450   1 75 https://www.edmundoptics.de/optics/optical-mirrors/hot-cold-mirrors/45deg-aoi-50.0mm-diameter-cold-mirror/    
7.8b LED holder Thorlabs SMR1/M   1 17 https://www.thorlabs.com/thorproduct.cfm?partnumber=SMR1/M    
7.9b Cap for LED Thorlabs SM1CP2M   1 16 https://www.thorlabs.com/thorproduct.cfm?partnumber=SM1CP2M machined to accommodate wires
7.10b high power LED RS Components e.g. 796-1772 850 nm LED 1 10 https://de.rs-online.com/web/p/led-ir/7961772/ max 1 A power LED-tech , Osram Black 850 nm (LT-2418) (https://www.led-tech.de/de/OSRAM-Black-Series-850nm-auf-Star)
7.11b LED pad LED-tech LT-2418   10 0.4 https://www.led-tech.de/de/Waermeleitklebepad-fuer-16mm-Star    
7.12b buck pack Digikey RCD-24-1.00/W/X3   1 22 https://www.digikey.com/product-detail/en/recom-power/RCD-24-1.00-W-X3/945-1131-ND/2256311 buck pack that has max 1 A power wired and dimmable using PWM and/or analogue in
          Total 2586      

Freely-swimming setup

Components for freely swimming configuration behavioral setup
  Component Manufacturer Part No. Specs Amount Price (euros) Link Notes Replacement
Breadboard
1.1 Aluminum breadboard Thorlabs MB3045/M   1 170 https://www.thorlabs.com/thorproduct.cfm?partnumber=MB3045/M
1.2 Feet Thorlabs AV4/M   1 20.86 https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6421    
Rail                  
2.1 Construction rail Thorlabs XT66-750   1 80 https://www.thorlabs.com/thorProduct.cfm?partNumber=XT66-750    
2.2 Construction rail mount Thorlabs XT66P1   1 32.25 https://www.thorlabs.com/thorproduct.cfm?partnumber=XT66P1    
2.3 Rail carriage Thorlabs XT66P2/M   2 62.25 https://www.thorlabs.com/thorproduct.cfm?partnumber=XT66P2/M    
2.4 Post holder Thorlabs PH20/M   2 6.33 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH20/M    
Stage                  
3.1 Post Thorlabs TR250/M   4 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR250/M    
3.2 Post holder Thorlabs PH75/M   4 7.44 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH75/M#ad-image-0    
3.3 Acrylic stage custom     1 10 https://www.modulor.de/acrylglas-gs-transparent-farblos-6-00-x-250-x-500-mm.html From item n. 0133555 of modulor
3.4 Screen custom              
Projector                  
4.1 Projector Asus P3E   1 534 https://www.asus.com/us/Projectors/P3E/    
4.2 Display cable (HDMI) Any              
4.3 Post Thorlabs TR40/M   1 4.52 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR40/M    
4.4 Post holder Thorlabs PH40/M   1 6.56 https://www.thorlabs.com/thorproduct.cfm?partnumber=PH40/M#ad-image-0    
Camera                  
5.1 Camera Ximea MQ013MG-ON Python 1300 1 580 https://www.ximea.com/products/usb3-vision-cameras-xiq-line/mq013mg-on   Camera PointGrey Blackfly S Mono 0.4 MP USB3 Vision (Sony IMX287) 305 https://eu.ptgrey.com/blackfly-s-mono-04-mp-usb3-vision-sony-imx287
5.2 Camera cable e.g. Ximea CBL-U3-3M0 USB 3, 3m passive 1 17 https://www.ximea.com/en/products/usb3-vision-compliant-cameras-xiq/xiq-usb-30-accessories/1-m-usb-30-passive-cable    
5.3 Camera holder custom     1 0      
5.4 Post Thorlabs TR75/M   1 4.93 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR75/M    
5.5a Camera lens Edmund Optics 59-872 C mount 35mm 1 295 https://www.edmundoptics.com/imaging-lenses/fixed-focal-length-lenses/35mm-c-series-fixed-focal-length-lens/#specs    
IR filter                  
6.1 Post Thorlabs TR75/M   1 4.93 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR75/M    
6.2 Lens mount Thorlabs LMR2/M   1 23 https://www.thorlabs.com/thorproduct.cfm?partnumber=LMR2/M    
6.3 IR filter Edmund Optics 66-106 830 nm LP 1 69 https://www.edmundoptics.de/optics/optical-filters/longpass-edge-filters/rg-830-50mm-dia.-longpass-filter/    
Illumination                  
7.1 Power supply Conrad ESPS-1500 Voltcraft 1 15 https://www.conrad.com/ce/en/product/1380523/Mains-PSU-adjustable-voltage—–VOLTCRAFT—–ESPS-1500—–3-Vdc-45-Vdc-5-Vdc    
7.2a Post Thorlabs TR50/M   1 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR50/M    
7.3a Post Thorlabs TR150/M   1 8.12 https://www.thorlabs.com/thorproduct.cfm?partnumber=TR150/M    
7.4a Right-angle clamp Thorlabs RA90/M   3 8.87 https://www.thorlabs.com/thorProduct.cfm?partNumber=RA90/M    
7.5a Mirror holder Edmund Optics #54-997   1 70 https://www.edmundoptics.com/optomechanics/optical-mounts/optical-filter-mounts/40mm-sq.-fixed-filter-holder/    
7.6a Cold mirror Edmund Optics #62-633   1 20 https://www.edmundoptics.com/optics/optical-mirrors/hot-cold-mirrors/45deg-aoi-12.5mm-sq-cold-mirror/    
7.7a LED box custom     1        
          Total 2199      

Low-cost behavioral setup

A very cheap version of the behavioral setup can be built by replacing the projector with a small LED display and the camera lens with a fixed focal length objective. The dimensions of this setup are quite small, and parts can be kept in place with a basic custom-made frame that can be laser-cut or even made out of cardboard.

[Figure: low-cost behavioral setup]

Stytra user interface

[Screenshot: Stytra user interface]

The toolbar on top controls the running of protocols: starting and stopping, displaying progress, opening the protocol settings dialog, and changing the metadata and save destination.

The rest of the interface is reconfigured depending on the experiment type. Each panel can be moved separately and closed. To reopen a closed panel, you can right-click on the title bar of any panel and a list of all available panels will appear.

The camera panel buttons are for:

  • pausing and starting the camera feed
  • activating the replay (for a region selected when the camera is paused). Refer to Replaying the camera feed to refine tracking section for details.
  • adjusting camera settings (framerate, exposure and gain)
  • capturing the current image of the camera (without the tracking results superimposed)
  • turning on and off auto-scaling of the image brightness range.
  • selection box to display the image at a particular stage in the tracking pipeline
  • button for editing the tracking settings

The framerate display widget shows the current framerates of the stimulus display, camera and tracking. If minimum framerates for display or tracking are configured, the indicators turn red when the framerate drops below them. These are configured in the stytra_config dict of a protocol or in the stytra_setup_config.json file, in the following sections:

stytra_config = dict(
    display=dict(min_framerate=50),
    camera=dict(min_framerate=100),
)

The monitoring widget shows changing variables relating to the stimulus, tracking, or estimation of the animal state for closed-loop stimulation.

The status bar shows diagnostic messages from the camera or tracking.

Image processing pipelines

Image processing and tracking pipelines are defined by subclassing the Pipeline class. Pipelines are defined as trees of nodes, starting from the camera image, with each node parametrized using lightparam. The image processing nodes are subclasses of ImageToImageNode, whereas the terminal nodes are ImageToDataNode.

Attributes of pipelines are:

  • a tree of processing nodes
  • (optional) a subclass of the camera window which displays the tracking overlay
  • (optional) an extra plotting window class

The nodes can be set as attributes of the class, with names that are arbitrary except for how they are used by the display and plotting classes (see stytra.experiments.fish_pipelines for examples).

Processing nodes

There are two types of nodes: ImageToImageNode and ImageToDataNode.

Nodes must have:

  • A name
  • A _process function whose optional parameters are keyword arguments, annotated with Params for everything that can be changed from the user interface. The _process function has to output a NodeOutput named tuple (from stytra.tracking.pipelines), which contains a list of diagnostic messages (can be empty) and either an image, if the node is an ImageToImageNode, or a NamedTuple, if the node is an ImageToDataNode.

Optionally, if the processing function is stateful (depends on previous inputs), you can define a reset function which resets the state.
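
A heavily hedged sketch of a terminal node and a pipeline using it is shown below. The import path, the way name is set, the _output_type attribute, the Param annotations in _process and the parent=self.root chaining are assumptions reconstructed from the description above; consult stytra.tracking.pipelines and stytra.experiments.fish_pipelines for the real interfaces.

from collections import namedtuple

from lightparam import Param
from stytra.tracking.pipelines import ImageToDataNode, NodeOutput, Pipeline


class BrightnessNode(ImageToDataNode):
    """Terminal node returning the mean brightness of the thresholded image."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, name="brightness", **kwargs)
        self._output_type = namedtuple("o", ["mean_brightness"])

    def _process(self, im, threshold: Param(0, (0, 255)) = 0):
        # Param-annotated keyword arguments become editable tracking settings
        masked = im[im > threshold]
        value = float(masked.mean()) if masked.size else 0.0
        # NodeOutput bundles diagnostic messages (empty here) and the data tuple
        return NodeOutput([], self._output_type(mean_brightness=value))


class BrightnessPipeline(Pipeline):
    def __init__(self):
        super().__init__()
        # Nodes are set as attributes; attaching them to the camera image via
        # parent=self.root is an assumption about how the tree is built
        self.brightness = BrightnessNode(parent=self.root)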

Configuring tracking of freely-swimming fish

[Screenshot: freely-swimming tracking]
  1. Open the tracking settings window
  2. Input the number of fish in the dish
  3. Determine the parameters for background subtraction, bglearning_rate and bglearn_every. The diagnostic display can be invoked by setting display_processed to different states.
  4. Once you see the fish nicely, adjust the thresholded image so that the full fish, but nothing more, is white (bgdif_threshold).
  5. Adjust the eye threshold (threshold_eyes) so that the eyes and swim bladder are highlighted (check by changing the display_processed parameter).
  6. Adjust the target area: look at the biggest_area plot; if the background is correctly subtracted and a fish is in the field of view, the value should equal the current area of the fish. Choose a range that is comfortably around the current fish area.
  7. Adjust the tail length: the red line tracing the tail should not go over the actual tail.
  8. If the fish jumps around too much, adjust the prediction uncertainty.

Configuring tracking of embedded fish

  1. Ensure that the exposure time is not longer than 1.5 milliseconds, otherwise the tracking will not be correct for fast tail movements.
  2. Open the tracking settings window.
  3. Invert the image if the tail is dark with respect to the background.
  4. Set the camera display to filtered and adjust clipping until the fish is the only bright thing with respect to the background, which has to be completely black.
  5. Make the image as small as possible (with image_scale) as long as the tail is mostly more than 3 px wide, and filter it a bit (usually using filter_size=3).
  6. Adjust the number of tail segments; around 30 is a good number. Usually, not more than 10 n_output_segments are required.
  7. Tap the dish of the fish so that it moves, and ensure the tail is tracked correctly. You can use the replay function to ensure the whole movement is tracked smoothly.
  8. To ensure the tracking is correct, you can enable the plotting of the last bout in the windows.

Replaying the camera feed to refine tracking

The replay functionality allows a frame-by-frame view of the camera feed during a period of interest (e.g. a bout or a struggle). After an interesting event happens and you can see it in the plot, pause the camera with the pause button. Using the two gray bars in the plots, select the time period of interest. Then, enable the replay with the button underneath the camera and unpause the camera feed. Now the selected slice of time is replayed, and the framerate of the replay can be adjusted in the camera parameters. To go back to the live feed, toggle the replay button.

Modules

stytra
The root module, containing the Stytra class for running the experiment (selecting the appropriate experiment subtypes and setting the parameters)
stytra.experiments
The controller classes organizing different kinds of experiments (with and without behavioral tracking, closed loop stimulation and with video recording). The classes put together everything required for a particular kind of experiment
stytra.gui
Defines windows and widgets used for the different experiment types
stytra.hardware
Communication with external hardware, from cameras to NI boards
stytra.triggering
Communication with other equipment for starting or stopping experiments
stytra.metadata
Classes that manage the metadata
stytra.stimulation
Definitions of various stimuli and management of experimental protocols
stytra.calibration
Classes to register the camera view to the projector display and set physical dimensions
stytra.tracking
Fish, eye and tail tracking functions together with appropriate interfaces
