Ines Wichert: Machine learning based video segmentation for animal tracking in behavioral experiments
BCCN Berlin and TU Berlin
Abstract
In behavioral animal studies, locating and segmenting animals in video recordings of experiments is a common first step towards automated analysis. Textured backgrounds, backgrounds with low contrast against the animal, and lighting changes can complicate this task. Researchers are therefore often limited to either using clean, unnatural animal surroundings that might affect behavior, or performing laborious manual annotation. In this work, we present a segmentation method designed for behavioral animal experiments that requires very little effort or programming knowledge from the practitioner. We use a simple segmentation method that relies on temporal dynamics to generate a first, noisy segmentation. Its results are then passed as noisy training data to a neural network that learns to perform a better segmentation. We test and evaluate the model's performance on a benchmark dataset that we designed to cover various background types (such as fabrics at different shades of darkness, or pebbles), recording settings, and animal species (fish, mice, termites, and squids). The method widens the applicability of automated analysis to more complicated videos, giving researchers more freedom in their experimental design and making their analyses easier and faster.
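The first stage described above (a simple segmentation based on temporal dynamics) can be sketched, for example, as background subtraction against a per-pixel temporal median. This is a minimal illustrative sketch, not the authors' implementation; the function name, the threshold value, and the use of a median background model are assumptions.

```python
import numpy as np

def noisy_masks(video, thresh=0.2):
    """Rough foreground masks from temporal dynamics (illustrative sketch).

    video: (T, H, W) grayscale frames as floats in [0, 1].
    Returns a boolean (T, H, W) array of noisy animal masks.
    """
    # Per-pixel temporal median serves as a static-background estimate;
    # a moving animal occupies any given pixel only briefly, so the
    # median is dominated by background values.
    background = np.median(video, axis=0)
    # Pixels deviating strongly from the background are labeled foreground.
    # The result is noisy and would serve as training data for a network.
    return np.abs(video - background) > thresh

# Toy example: a bright 2x2 "animal" moving diagonally over a dark background.
T, H, W = 8, 16, 16
video = np.zeros((T, H, W))
for t in range(T):
    video[t, t:t + 2, t:t + 2] = 1.0
masks = noisy_masks(video)
```

In the paper's pipeline, such noisy masks would then be used as labels to train a segmentation network, which can learn to ignore the label noise and generalize to frames where simple background subtraction fails.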
Organized by
Pawel Romanczuk / Robert Martin
Location
BCCN Berlin, Seminar Room, Philippstr. 13 Haus 6, 10115 Berlin