Getting to know your herd or flock
On an entertainment level, I quite enjoy watching our herd and our flock. If and when our upstream connection is converted from wireless to fibre, we’ll be streaming live footage of our crew, as I suspect others will enjoy watching them as well.
However, on a research level, I’m interested in learning more about the tools available to analyze and monitor animals, especially tools that can be employed on site, without having to use the broader Internet, so that farms with limited connectivity can still take advantage of emerging technology.
Connecting different parts of a farm to each other, and enabling local video streaming, is relatively straightforward. If that video footage can then be put to work locally and autonomously, there may be interesting insights and data to be generated.
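As a concrete example, here’s a minimal sketch of what that local-first approach could look like, assuming an IP camera on the farm network that exposes an RTSP stream (the address below is hypothetical). Frames are pulled and analyzed entirely on site with OpenCV, so nothing has to leave the local network.

```python
# Minimal sketch: pull frames from a local IP camera over the farm's own
# network and process them on site, with no cloud dependency.
# The RTSP URL is hypothetical -- substitute your camera's actual address.
import cv2

RTSP_URL = "rtsp://192.168.1.20:554/stream1"  # hypothetical local camera

def watch_stream(url: str) -> None:
    capture = cv2.VideoCapture(url)
    if not capture.isOpened():
        raise RuntimeError(f"Could not open stream: {url}")
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break  # stream dropped; a real setup would reconnect
            # Placeholder for actual local analysis (motion detection, tracking, ...)
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            print(f"frame {frame.shape}, mean brightness {gray.mean():.1f}")
    finally:
        capture.release()

if __name__ == "__main__":
    watch_stream(RTSP_URL)
```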
Similarly, my interest in technology is rarely proprietary. A tool is only useful to me if other people use it openly so I can learn from them, and if I too can share the tool, and whatever I’ve learned from it. This is why open source has been essential to my professional career, and will be a pillar of the smart farm research and work I wish to do.
While farmers and the broader agricultural sector may not be as engaged in open source activity as I may desire, thankfully the broader research community is. A by-product perhaps of the economic precarity of graduate students.
I found the following research article upon initiating my search for open source animal tracking solutions:
A Special Issue from Journal of Neural Engineering presents the article Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai. https://t.co/ysPHMEYCPg
— IOP Publishing (@IOPPublishing) August 21, 2018
A major goal in systems neuroscience is to determine the causal relationship between neural activity and behavior. To this end, methods that combine monitoring neural activity, behavioral tracking, and targeted manipulation of neurons in closed-loop are powerful tools.
One of the appeals of this kind of research is better understanding the herd or flock mentality. This is one of the primary reasons I enjoy watching them. Their neural activity is clearly not just individual, but also networked. Members of the herd react to each other. Similarly, the flock exhibits a kind of collective and distributed intelligence that is fascinating.
If I could use these tools to make herd- or flock-based digital art, that in and of itself would be a worthwhile pursuit. I digress; back to the abstract:
However, commercial systems that allow these types of experiments are usually expensive and rely on non-standardized data formats and proprietary software which may hinder user-modifications for specific needs. In order to promote reproducibility and data-sharing in science, transparent software and standardized data formats are an advantage.
Word!
Here, we present an open source, low-cost, adaptable, and easy to set-up system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Approach. Based on the Open Ephys system (www.open-ephys.org) we developed multiple modules to include real-time tracking and behavior-based closed-loop stimulation. We describe the equipment and provide a step-by-step guide to set up the system. Combining the open source software Bonsai (bonsai-rx.org) for analyzing camera images in real time with the newly developed modules in Open Ephys, we acquire position information, visualize tracking, and perform tracking-based closed-loop stimulation experiments. To analyze the acquired data we provide an open source file reading package in Python.
Exciting! Though when they say “easy to set up,” I wonder whether that applies to a relative layperson like me?
I’m not presently interested in recording neural activity, but that would be a neat capability to use in the future, especially if I found a farmer interested in that.
Instead I see value in tracking the movements and patterns of the animals. This could help inform pasture management, but also overall herd and flock management. For example, the flock has a complex social structure that would be interesting to document and visualize. Similarly, I’d like to influence where the herd grazes and why, especially in the context of goats as brush and weed control.
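To make that concrete, here’s a rough sketch of the kind of grazing analysis I have in mind. It assumes the tracking system can export positions as a CSV of timestamp, animal id, and x/y coordinates, a format I’ve invented for illustration, and the zones are hypothetical rectangles standing in for real paddock boundaries.

```python
# Sketch: summarize grazing patterns from tracked positions.
# Assumes a CSV with columns timestamp, animal_id, x, y (metres) --
# a hypothetical export format, since every tracker has its own.
import csv
from collections import Counter

# Hypothetical pasture zones as axis-aligned rectangles: (x0, y0, x1, y1)
ZONES = {
    "water_trough": (0, 0, 10, 10),
    "brush_line": (10, 0, 40, 15),
    "open_pasture": (0, 15, 40, 60),
}

def zone_for(x: float, y: float) -> str:
    """Name the first zone containing the point, else 'elsewhere'."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "elsewhere"

def grazing_summary(path: str) -> dict:
    """Count position samples per animal per zone; at a fixed sample
    rate these counts are proportional to time spent in each zone."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            animal = row["animal_id"]
            zone = zone_for(float(row["x"]), float(row["y"]))
            counts.setdefault(animal, Counter())[zone] += 1
    return counts

if __name__ == "__main__":
    for animal, zones in grazing_summary("positions.csv").items():
        print(animal, dict(zones))
```

Swapping the rectangles for real paddock polygons would be the obvious next step, but even this crude version would show where the goats actually spend their day.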
The next tool I came across was idTracker:
New FREE software that can track multiple animals at once? #idTracker to the rescue http://t.co/9mcwGMevoW http://t.co/gZsYuwEqMz
— Mikel Delgado, PhD #WashYourHands #CoverYourFace (@mikel_maria) June 2, 2014
Animal tracking software has been developed before, but they're not perfect. A common glitch is mixing up the identities of two animals after one crosses in front of the other. The new system tries to get around this by extracting a unique visual "fingerprint" that can be used to identify a specific individual in every frame of a video. The fingerprints are based on comparing the intensity and distance between hundreds of pairs of pixels on each animal when it's swimming or crawling in isolation. The software then uses these numerical signatures of each animal to tell them apart. Unlike some methods, it works on unadorned animals; it doesn't require any paint or other marks that can potentially interfere with natural behavior.
Best of all, it seems to work really well. The researchers, led by Gonzalo de Polavieja at the Instituto Cajal in Madrid, Spain, tested their software on zebrafish (above), medaka fish, mice, fruit flies, and ants (below), and achieved 99.7 percent accuracy on average, they report yesterday in *Nature Methods*. It even works for siblings and other similar-looking animals that humans have a hard time telling apart. The implications for the sanity of behavioral science graduate students could be enormous.
Given some of the advances in machine learning and automatic pattern recognition, I was hoping a tool like idTracker would be available and open source. In the realm of machine learning it’s relatively basic, but it still offers tremendous potential, and it’s something I hope to experiment with.
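As a first experiment, here’s a toy version of the “fingerprint” idea as I understand it from the article above. To be clear, this is not idTracker’s actual algorithm, just the gist: characterize each animal by statistics over many random pixel pairs in its image patch, then compare those signatures.

```python
# Toy illustration of the "fingerprint" idea -- emphatically NOT
# idTracker's actual algorithm, just the gist: characterize an animal by
# statistics over many random pixel pairs in its (grayscale) image patch.
import numpy as np

def fingerprint(patch: np.ndarray, n_pairs: int = 500, bins: int = 16) -> np.ndarray:
    """Histogram over (pair distance, summed intensity) for random pixel
    pairs, flattened into a signature vector. A fixed seed keeps the
    sampling comparable between calls."""
    rng = np.random.default_rng(42)
    h, w = patch.shape
    ys = rng.integers(0, h, size=(n_pairs, 2))
    xs = rng.integers(0, w, size=(n_pairs, 2))
    dist = np.hypot(ys[:, 0] - ys[:, 1], xs[:, 0] - xs[:, 1])
    inten = patch[ys[:, 0], xs[:, 0]].astype(float) + patch[ys[:, 1], xs[:, 1]]
    hist, _, _ = np.histogram2d(dist, inten, bins=bins)
    return (hist / hist.sum()).ravel()

def looks_like_same_animal(sig_a: np.ndarray, sig_b: np.ndarray,
                           threshold: float = 0.5) -> bool:
    """Crude match: small L1 distance between normalized signatures."""
    return float(np.abs(sig_a - sig_b).sum()) < threshold
```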
Though it does raise the question of where more recent applications of machine learning or deep learning are being used with regard to animal tracking. This is how I found DeepLabCut:
🎉🤩DeepLabCut Model Zoo is live!!!! #WFH? Have a pet 🐶🐱?! Run #deeplabcut on them now! Runs in nearly real-time 😎

NO install 🥳 use our custom DeepLabCut Model Zoo @GoogleColab Notebook now! ➡️ https://t.co/8SovrmVZNN

The models, details: https://t.co/VLs4ScH0Wv https://t.co/4SIDg7KA3w

— Mackenzie Mathis (@TrackingActions) May 13, 2020
Admittedly a bit over my head, but something I hope to circle back to if and when my zoological literacies increase. 😎 Though if you do have a pet, and have video of that pet, why not click on the link above and follow the instructions? Let us know if you do!
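For the slightly more adventurous, here’s roughly what that workflow looks like as a local script rather than a Colab notebook, based on my reading of the DeepLabCut Model Zoo documentation. The exact function signature may differ between versions, and the video filename is a placeholder, so treat this as a starting point rather than a recipe.

```python
# Rough sketch of running a DeepLabCut Model Zoo model on a pet video.
# Based on my reading of the DeepLabCut docs at the time -- the exact
# function signature may differ between versions, so treat this as a
# starting point, not a recipe. The video filename is hypothetical.
import deeplabcut

video = "my_dog.mp4"  # hypothetical clip of your pet

# Build a project around a pretrained Model Zoo network and analyze the
# clip in one go, writing a labeled copy of the video alongside it.
deeplabcut.create_pretrained_project(
    "pet-tracking",        # project name
    "me",                  # experimenter
    [video],
    model="full_dog",      # pretrained dog model from the Model Zoo
    analyzevideo=True,
    createlabeledvideo=True,
)
```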
I found out about DeepLabCut via a fantastic yet intimidating resource called OpenBehavior, which “provides a centralized repository of open-source tools specific for behavioral neuroscience research.” Here’s a small sample of some of the tools I’ve found via their repository:
Need visual stimuli for your experiments? Here's an #opensource spatial visual stimulator from @kfrankelab @Chagas_AM et al. Can be used across species and is flexible to suit your experimental needs for vision research. See more on https://t.co/B7crm9byfu https://t.co/ftu2urLC04
— OpenBehavior (@OpenBehavior) May 7, 2020
1/4 In honor of #WBTC1, I'm pre-releasing ThruTracker--free open access software for 2-D and 3-D animal tracking.

Free Windows download, manual and tutorials are available here: https://t.co/te9K59KDbD

— Dr. Aaron Corcoran 🦇 (@AaronJCorcoran) May 27, 2020
ACRoBaT is an Automated Center-out Rodent Behavioral Trainer, which is a low-cost option for training rodents that requires little human oversight. The device is fully automated and build instructions are included! See details on https://t.co/B7crm9byfu https://t.co/znxU8EEI66
— OpenBehavior (@OpenBehavior) May 14, 2020
Hmm, I wonder if there’s a role for trained rodents on a farm? Probably not, but could make for some interesting cyberpunk fiction!
When I started searching for applications or tools to monitor and analyze animals, I assumed I’d find stuff focused on pets. While there is some basic stuff out there, the more advanced offerings remain rather primitive, entirely proprietary, and somewhat creepy.
What’s Your Pet Saying? These Machines Know https://t.co/D00SF3ibqr
— Dirk Strauss (@DirkStrauss) May 8, 2020
Furbo, a streaming camera that can dispense treats for your pet, snap photos and send you a notification if your dog is barking, provides a live feed of your home that you can check on a smartphone app.
In the coming months, Furbo is expected to roll out a new feature that allows it to differentiate among kinds of barking and alert owners if a dog’s behavior appears abnormal.
“That’s sort of why dogs were hired in the first place, to alert you of danger,” said Andrew Bleiman, the North America general manager for Tomofun, the company that makes Furbo. “So we can tell you not only is your dog barking, but also if your dog is howling or whining or frantically barking, and send you basically a real emergency alert.”
The ever-expanding world of pet-oriented technology now allows owners to toss treats, snap a dog selfie and play with the cat — all from afar. And the artificial intelligence used in such products is continuing to refine what we know about animal behavior.
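For what it’s worth, the recipe behind this kind of bark classification is fairly standard, even if Furbo’s actual method is proprietary and unknown to me. Here’s a toy sketch of the general approach: turn labeled audio clips into features, then train an off-the-shelf classifier. The filenames are hypothetical placeholders.

```python
# Toy sketch of how bark-type classification generally works -- certainly
# not Furbo's actual (proprietary) method, just the common recipe: turn
# labeled audio clips into features, then train an ordinary classifier.
# Filenames here are hypothetical placeholders.
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def clip_features(path: str) -> np.ndarray:
    """Summarize a clip as the mean of its MFCC frames."""
    audio, sr = librosa.load(path, sr=16_000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical labeled clips: filename -> kind of vocalization
labeled = {
    "bark1.wav": "bark",
    "howl1.wav": "howl",
    "whine1.wav": "whine",
    # ... a real model would need many examples of each
}

X = np.stack([clip_features(path) for path in labeled])
y = list(labeled.values())

model = RandomForestClassifier(n_estimators=200).fit(X, y)
print(model.predict([clip_features("mystery.wav")]))
```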
Most of these pet apps and devices follow the social media model: capture users, extract their data, and employ them (without pay) to train machine learning models so that they improve over time.
This only works if enough people participate, and in this case there will probably be enough pet owners who fall for these false promises. The irony is that this is not the AI they need: most pet owners do not have to rely upon AI to understand their animals.
I’d imagine the same is true of most farmers, though the larger the herd or flock, the easier it is to miss the little things. Similarly, the modelling has to be open source if it is to be trusted or adapted. For example, how many people trust Facebook, or have the power to modify the algorithms that run the platform?
On a more basic level, there are open source dog or pet collars that can track location. While more advanced proprietary ones are in various stages of Kickstarter campaigns, or in the trough of disillusionment that follows most crowdfunding initiatives, the basic stuff seems both doable and viable.
For example, Adafruit Industries put out a neat tutorial on how to make your own GPS dog collar (using components sold by Adafruit).
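The tutorial covers the hardware side; for the software, a minimal sketch of logging fixes from a serial GPS module could look like the following. The serial port name and baud rate are assumptions you’d match to your own module.

```python
# Minimal sketch of the software side of a DIY GPS collar: read NMEA
# sentences from a serial GPS module and log position fixes. The port
# name and baud rate are assumptions -- match them to your own module.
import serial   # pyserial
import pynmea2

def log_fixes(port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    with serial.Serial(port, baud, timeout=1) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="replace").strip()
            if not line.startswith(("$GPGGA", "$GNGGA")):
                continue  # only the fix-data sentences interest us here
            try:
                msg = pynmea2.parse(line)
            except pynmea2.ParseError:
                continue  # partial or corrupted sentence
            print(f"{msg.timestamp} lat={msg.latitude:.5f} lon={msg.longitude:.5f}")

if __name__ == "__main__":
    log_fixes()
```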
And there are developments for collars designed for other animals:
Congratulations Dr James Foley for first paper from your doctoral work. @WildCRU_Ox @OxZooDept @MarinoJorgelina

Open‐source, low‐cost modular GPS collars for monitoring and tracking wildlife https://t.co/ZGfnW5X0qZ

— Claudio Sillero (@ClaudioSillero) April 3, 2020
The collar above would work on medium sized animals, and the one below is designed for larger animals like elephants, rhinos, and lions:
OPENCOLLAR: OpenCollar is a conservation collaboration to design, support and deploy open-source tracking collar hardware and software https://t.co/QU5pEbke8S

— orovellotti (@orovellotti) May 26, 2020
Overall, these are some interesting tools to play with, and it will be fascinating to see how they evolve over time. That’s one of the primary benefits of open source: in addition to greater accessibility, the design of tools and solutions is collaborative and participatory.