Getting to know your herd or flock
On an entertainment level, I quite enjoy watching our herd and our flock. When and if our upstream connection is converted from wireless to fibre, we’ll be streaming live footage of our crew, as I suspect others will enjoy watching them as well.
However, on a research level, I’m interested in learning more about the tools available to analyze and monitor animals, especially tools that can be employed on site, without relying on the broader Internet, so that farms with limited connectivity can still take advantage of emerging technology.
Connecting different parts of a farm to each other, and enabling local video streaming, is relatively straightforward. If that video footage can then be employed locally and autonomously, there may be interesting insights and data to be generated.
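As a sketch of the kind of processing that could run locally and autonomously on a farm video feed, here is a minimal frame-differencing motion score in Python. The function name and threshold are my own illustrative choices, not from any tool mentioned here:

```python
import numpy as np

def motion_score(prev_frame: np.ndarray, frame: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels whose grayscale intensity changed by more than
    `threshold` between two frames (both 2-D uint8 arrays)."""
    # Cast to a signed type so the subtraction doesn't wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > threshold).mean())

# Toy frames: motion in one quadrant of a 100x100 image.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[:50, :50] = 200  # a quarter of the frame changed
print(motion_score(prev, curr))  # 0.25
```

In practice the frames would come from a local camera stream (e.g. via OpenCV), and a score above some threshold could trigger recording or an alert, all without touching the wider Internet.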
Similarly, my interest in technology is rarely proprietary. A tool is only useful to me if other people use it openly so I can learn from them, and if I too can share the tool, and whatever I’ve learned from it. This is why open source has been essential to my professional career, and will be a pillar of the smart farm research and work I wish to do.
While farmers and the broader agricultural sector may not be as engaged in open source activity as I might like, thankfully the broader research community is. Perhaps a by-product of the economic precarity of graduate students.
I found the following research article upon initiating my search for open source animal tracking solutions:
A major goal in systems neuroscience is to determine the causal relationship between neural activity and behavior. To this end, methods that combine monitoring neural activity, behavioral tracking, and targeted manipulation of neurons in closed-loop are powerful tools.
One of the appeals of this kind of research is better understanding the herd or flock mentality. This is one of the primary reasons I enjoy watching them. Their neural activity is clearly not just individual, but also networked. Members of the herd react to each other. Similarly, the flock exhibits a kind of collective and distributed intelligence that is fascinating.
If I could use these tools to make herd or flock based digital art, that in and of itself would be a worthwhile pursuit. I digress, back to the abstract:
However, commercial systems that allow these types of experiments are usually expensive and rely on non-standardized data formats and proprietary software which may hinder user-modifications for specific needs. In order to promote reproducibility and data-sharing in science, transparent software and standardized data formats are an advantage.
Word!
Here, we present an open source, low-cost, adaptable, and easy to set-up system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Approach. Based on the Open Ephys system (www.open-ephys.org) we developed multiple modules to include real-time tracking and behavior-based closed-loop stimulation. We describe the equipment and provide a step-by-step guide to set up the system. Combining the open source software Bonsai (bonsai-rx.org) for analyzing camera images in real time with the newly developed modules in Open Ephys, we acquire position information, visualize tracking, and perform tracking-based closed-loop stimulation experiments. To analyze the acquired data we provide an open source file reading package in Python.
Exciting! Though when they say “easy to set up”, I wonder if that applies to a relative layperson like myself?
I’m not presently interested in gathering the neural activity, but that would be a neat capability to use in the future, especially if I found a farmer interested in that.
Instead I see value in tracking the movements and patterns of the animals. This could help inform pasture management, but also overall herd and flock management. For example, the flock has a complex social structure that would be interesting to document and visualize. Similarly, I’d like to influence where the herd grazes and why, especially in the context of goats as brush and weed control.
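As a rough illustration of how tracked positions could inform pasture management, here is a toy grazing heatmap: binning a sequence of (x, y) positions into an occupancy grid. The paddock dimensions, bin counts, and function name are made up for the sketch:

```python
import numpy as np

def grazing_heatmap(positions, paddock_size=(100.0, 100.0), bins=(10, 10)):
    """Bin a sequence of (x, y) positions (in metres) into an occupancy
    grid, normalised to the fraction of observations per cell."""
    xs, ys = zip(*positions)
    grid, _, _ = np.histogram2d(
        xs, ys, bins=bins,
        range=[[0, paddock_size[0]], [0, paddock_size[1]]])
    return grid / grid.sum()

# Toy data: the herd lingering in one corner of the paddock.
positions = [(5, 5)] * 8 + [(95, 95)] * 2
heat = grazing_heatmap(positions)
print(heat[0, 0], heat[9, 9])  # 0.8 0.2
```

Cells with a persistently high fraction would flag over-grazed patches, and comparing heatmaps over time could show whether nudging the herd (fencing, water placement, targeted browse for the goats) is actually shifting where they graze.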
The next tool I came across was idTracker:
Animal tracking software has been developed before, but they're not perfect. A common glitch is mixing up the identities of two animals after one crosses in front of the other. The new system tries to get around this by extracting a unique visual "fingerprint" that can be used to identify a specific individual in every frame of a video. The fingerprints are based on comparing the intensity and distance between hundreds of pairs of pixels on each animal when it's swimming or crawling in isolation. The software then uses these numerical signatures of each animal to tell them apart. Unlike some methods, it works on unadorned animals; it doesn't require any paint or other marks that can potentially interfere with natural behavior.
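To make the fingerprint idea concrete, here is a toy sketch in the spirit of that description (not idTracker’s actual algorithm): compute a signature from intensity differences across a fixed set of pixel pairs, then identify an animal by the nearest reference signature. All names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def fingerprint(image, pairs):
    """Intensity differences across a fixed set of pixel pairs: a toy
    version of a pairwise-intensity signature."""
    return np.array([int(image[a]) - int(image[b]) for a, b in pairs])

def identify(image, references, pairs):
    """Name of the reference animal whose fingerprint is closest (L1 distance)."""
    fp = fingerprint(image, pairs)
    return min(references,
               key=lambda name: np.abs(fingerprint(references[name], pairs) - fp).sum())

# Toy 8x8 "animals" with distinct random intensity patterns.
pairs = [((rng.integers(8), rng.integers(8)),
          (rng.integers(8), rng.integers(8))) for _ in range(50)]
ewe = rng.integers(0, 255, (8, 8), dtype=np.uint8)
ram = rng.integers(0, 255, (8, 8), dtype=np.uint8)

# A slightly noisy re-observation of the ewe still matches her signature.
noisy_ewe = np.clip(ewe.astype(int) + rng.integers(-10, 10, (8, 8)),
                    0, 255).astype(np.uint8)
print(identify(noisy_ewe, {"ewe": ewe, "ram": ram}, pairs))  # ewe
```

The real system builds its signatures from many frames of each animal moving in isolation, which is what lets it survive occlusions and crossings; the sketch just shows why pairwise intensities are a usable identity cue at all.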
Best of all, it seems to work really well. The researchers, led by Gonzalo de Polavieja at the Instituto Cajal in Madrid, Spain, tested their software on zebrafish (above), medaka fish, mice, fruit flies, and ants (below), and achieved 99.7 percent accuracy on average, they report yesterday in *Nature Methods*. It even works for siblings and other similar-looking animals that humans have a hard time telling apart. The implications for the sanity of behavioral science graduate students could be enormous.
Given the advances in machine learning and automatic pattern recognition, I was hoping a tool like idTracker would be available and open source. In the realm of machine learning it’s relatively basic, but it still offers tremendous potential, and is something I hope to experiment with.
Though it does raise the question of where more recent applications of machine learning or deep learning are being used with regard to animal tracking. This is how I found DeepLabCut:
Admittedly a bit over my head, but something I hope to circle back to if and when my zoological literacies increase. 😎 Though if you do have a pet, and have video of that pet, why not click on the link above and follow the instructions? Let us know if you do!?
I found out about DeepLabCut via a fantastic yet intimidating resource called OpenBehavior, which “provides a centralized repository of open-source tools specific for behavioral neuroscience research.” Here’s a small sample of some of the tools I’ve found via their repository:
Hmm, I wonder if there’s a role for trained rodents on a farm? Probably not, but could make for some interesting cyberpunk fiction!
When I started searching for applications or tools to monitor and analyze animals, I assumed I’d find stuff focused on pets. While there is some basic stuff out there, the more advanced offerings are still fairly primitive, entirely proprietary, and somewhat creepy.
Furbo, a streaming camera that can dispense treats for your pet, snap photos and send you a notification if your dog is barking, provides a live feed of your home that you can check on a smartphone app.
In the coming months, Furbo is expected to roll out a new feature that allows it to differentiate among kinds of barking and alert owners if a dog’s behavior appears abnormal.
“That’s sort of why dogs were hired in the first place, to alert you of danger,” said Andrew Bleiman, the North America general manager for Tomofun, the company that makes Furbo. “So we can tell you not only is your dog barking, but also if your dog is howling or whining or frantically barking, and send you basically a real emergency alert.”
The ever-expanding world of pet-oriented technology now allows owners to toss treats, snap a dog selfie and play with the cat — all from afar. And the artificial intelligence used in such products is continuing to refine what we know about animal behavior.
Most of these pet apps and devices follow the social media model: capture users, extract their data, and employ them (without pay) to train the company’s machine learning models so that they improve over time.
This only works if enough people participate, and in this case there will probably be enough pet owners who fall for these false promises. The irony is that this is not the AI they need; most pet owners do not have to rely upon AI to understand their animals.
I’d imagine the same is true of most farmers, but the larger the herd or flock, the easier it is to miss little things. Similarly, the modelling has to be open source if it is to be trusted or adapted. For example, how many people trust Facebook, or have the power to modify the algorithms that run the platform?
On a more basic level, there are open source dog or pet collars that can track location. While there are more advanced proprietary ones in various stages of Kickstarter campaigns, or in the trough of disillusionment that follows most crowdfunding initiatives, the basic stuff seems both doable and viable.
For example, Adafruit Industries put out a neat tutorial on how to make your own GPS dog collar (using components sold by Adafruit):
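For a taste of what such a collar involves on the software side, here is a minimal parser for the NMEA GGA sentences a typical hobbyist GPS module emits. This is a generic sketch, not code from Adafruit’s tutorial:

```python
def parse_gpgga(sentence: str):
    """Extract (lat, lon) in decimal degrees from an NMEA GGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_decimal(value: str, hemisphere: str) -> float:
        # NMEA packs degrees and minutes together: ddmm.mmmm
        degrees, minutes = divmod(float(value), 100.0)
        decimal = degrees + minutes / 60.0
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_decimal(fields[2], fields[3])
    lon = to_decimal(fields[4], fields[5])
    return lat, lon

# A standard example GGA sentence (checksum omitted for brevity).
lat, lon = parse_gpgga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,")
print(round(lat, 4), round(lon, 4))  # 48.1173 11.5167
```

A collar build would read sentences like this off the module’s serial port, log them to an SD card or radio them to a base station, and the parsed coordinates could feed straight into something like the grazing analysis above.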
And there are developments for collars designed for other animals:
The collar above would work on medium-sized animals, and the one below is designed for larger animals like elephants, rhinos, and lions:
Overall some interesting tools to play with, and it will be interesting to see how these evolve over time. That’s one of the primary benefits of open source, that in addition to greater accessibility, the design of tools and solutions is collaborative and participatory.