Netflix for Monkeys

Our 2021 publication got a lot of coverage in the news, in Finland (HS, MTV, MTV meta about CNN) but also elsewhere, such as on Stephen Colbert’s late-night show (above)! Our study was often referred to as Netflix for monkeys.

In this post I give a somewhat wordy description of what doing our research project with white-faced sakis in the field of Animal-Computer Interaction actually looked like in practice for me. For more details and a formal presentation, find the paper here.

In summer 2020 I joined my first project in Animal-Computer Interaction with Ilyena Hirskyj-Douglas (ilyena.com). The aim was to build an enrichment device based on visual stimuli (videos) for white-faced sakis in the zoo. The purpose was to evaluate the interaction method, study the monkeys' responses (behaviours) to visual stimuli, and discover whether they preferred any particular content.

I started off with a tunnel-shaped device (below) that was already programmed to play different audio when a monkey entered it (Roosa and Ilyena’s first study here). The audio was triggered by three infrared (IR) sensors detecting the monkey inside. The sensors and speaker were controlled by a Raspberry Pi 3 Model B. I needed to add some new tech (a screen and a camera) to the device, which meant I also needed to adapt the structure.
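
To give a rough idea of how the original device worked, here is a minimal sketch of IR-triggered audio playback on a Raspberry Pi. This is not the actual project code: the pin numbers, file name and library choices (RPi.GPIO, pygame) are my assumptions for illustration.

```python
# Hypothetical sketch of the original audio-triggering logic; the pin numbers,
# audio file and the use of RPi.GPIO + pygame are assumptions for illustration.
import time

import pygame
import RPi.GPIO as GPIO

IR_PINS = [17, 27, 22]       # three IR sensors along the tunnel (assumed pins)
AUDIO_FILE = "stimulus.wav"  # placeholder audio clip

GPIO.setmode(GPIO.BCM)
for pin in IR_PINS:
    GPIO.setup(pin, GPIO.IN)

pygame.mixer.init()
sound = pygame.mixer.Sound(AUDIO_FILE)

try:
    while True:
        # If any sensor detects the monkey inside the tunnel, play the clip
        # (depending on the sensor type, a detection may read LOW instead).
        if any(GPIO.input(pin) for pin in IR_PINS):
            if not pygame.mixer.get_busy():
                sound.play()
        time.sleep(0.1)
finally:
    GPIO.cleanup()
```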

Partially dismantled original device

I felt privileged to be able to leverage the Fablab (equipment and staff’s expertise!) at Aalto University. I mainly just needed to laser cut pieces for a new compartment to cover all the tech. To do this, I used Adobe Illustrator to design the vector files, which were then used with the laser cutter to cut plywood. Furthermore, the new screen needed to be attached to the acrylic cover in some way, which we solved (with the Fablab staff’s help) by first creating a wooden 3D model of the screen. We used the wooden model to heat-mould plastic into a cover for the screen and for the small Raspberry Pi camera module.





As for the software, I quickly updated it to play the videos, record the interactions with the Raspberry Pi camera, and log details (time, duration, etc.) each time the device was triggered (via the IR sensors). Well, I also quickly learned that the coding was not done yet – in the end, 90% of the coding was fixing random Raspberry Pi configuration issues, resolving other small tasks like how to start the program at boot, and enhancing the existing code: fixing suboptimal behaviour, adding error handling, logging all data online (logs, recordings), and so on. And of course, there were also small bugs! They come to light during atypical interactions and in a new environment (a live zoo instead of an office). In hindsight, we set up the device at the zoo a bit too early, without the patience or experience to test it more thoroughly beforehand. Nevertheless, we learned from this for our future research.
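
For a concrete feel of what the core loop did, a simplified sketch is below. Again, this is not the study's actual code: the pin numbers, file paths, and the use of picamera and omxplayer are assumptions I'm making for illustration.

```python
# Hypothetical sketch of the interaction loop: detect a monkey via the IR
# sensors, play a video, record with the Pi camera, and log the interaction.
# Pin numbers, file paths and library choices are assumptions, not the real code.
import csv
import subprocess
import time
from datetime import datetime

import RPi.GPIO as GPIO
from picamera import PiCamera

IR_PINS = [17, 27, 22]        # assumed sensor pins
VIDEO_FILE = "stimulus.mp4"   # placeholder stimulus video
LOG_FILE = "interactions.csv"

GPIO.setmode(GPIO.BCM)
for pin in IR_PINS:
    GPIO.setup(pin, GPIO.IN)

camera = PiCamera()

def monkey_present():
    return any(GPIO.input(pin) for pin in IR_PINS)

try:
    while True:
        if monkey_present():
            start = datetime.now()
            camera.start_recording(f"rec_{start:%Y%m%d_%H%M%S}.h264")
            player = subprocess.Popen(["omxplayer", VIDEO_FILE])

            # Keep playing and recording while the monkey stays inside.
            while monkey_present():
                time.sleep(0.5)

            player.terminate()
            camera.stop_recording()
            end = datetime.now()

            # Append one row per interaction: start time and duration.
            with open(LOG_FILE, "a", newline="") as f:
                csv.writer(f).writerow(
                    [start.isoformat(), (end - start).total_seconds()]
                )
        time.sleep(0.1)
finally:
    camera.close()
    GPIO.cleanup()
```

(For starting a script like this at boot, a cron @reboot entry or a systemd service is a common approach on the Pi.)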




One of the sakis

Once the device was up and running, I spent a lot of time watching videos of monkeys. Our study consisted of two conditions: baseline and stimulus. We first measured a baseline of the sakis' interactions with the device, without playing anything when they interacted with it. After that, we played each of the six videos for one week. We had two cameras: one inside the device and one in the corner of the sakis' enclosure, as we wanted to code the sakis' behaviours both inside and around the device. Two cameras ensured we would have a good view of every interaction. In addition to the recordings, the device also logged data online constantly, so I started verifying and cleaning it as the study progressed.
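
The cleaning itself was nothing fancy. As an illustration of the kind of sanity checks I mean, a small sketch with pandas might look like the one below; the column names, file name and thresholds are made up for this example.

```python
# Hypothetical sanity checks on the accumulating interaction logs.
# Column names, file name and thresholds are assumptions for illustration.
import pandas as pd

logs = pd.read_csv(
    "interactions.csv",
    names=["start_time", "duration_s"],
    parse_dates=["start_time"],
)

# Drop exact duplicates (e.g. from re-uploads) and obviously broken rows.
logs = logs.drop_duplicates()
logs = logs[logs["duration_s"] > 0]

# Flag suspiciously long interactions for manual checking against the videos.
suspicious = logs[logs["duration_s"] > 600]
print(f"{len(logs)} interactions, {len(suspicious)} flagged for review")
```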

After all the data was gathered, I used R Studio to compute summary statistics and run tests comparing the sakis' interactions across the different videos, and the stimuli against the baseline data. The results indicated that the monkeys used the device much more at the beginning of the study, with usage declining towards the end (this is called a novelty effect). They preferred videos of mealworms (see the GIF below, trying to lick the screen!) and underwater scenes with fish over ‘abstract’ and ‘forest’ videos and videos of other animals.
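
The actual analysis was done in R Studio, but to show the gist of the comparisons, here is a rough Python equivalent. The column names, file name and the choice of non-parametric tests are my assumptions, not necessarily what the paper used.

```python
# Illustrative Python equivalent of comparing interaction durations across
# conditions; column names, file name and test choices are assumptions.
import pandas as pd
from scipy import stats

data = pd.read_csv("interactions_coded.csv")  # hypothetical coded dataset

# Compare interaction durations across the six video conditions.
groups = [
    g["duration_s"].values
    for _, g in data[data["condition"] != "baseline"].groupby("condition")
]
print(stats.kruskal(*groups))

# Compare one stimulus condition against the baseline.
mealworms = data.loc[data["condition"] == "mealworms", "duration_s"]
baseline = data.loc[data["condition"] == "baseline", "duration_s"]
print(stats.mannwhitneyu(mealworms, baseline))
```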


Licking mealworms from the screen

This was a fun project, and working with the animals was a great experience. The workload varied a lot, from hectic days of coding and fixing bugs to get our study running, to days full of watching monkey videos and repeatedly logging the details of their interactions.