After finishing my PhD, I worked for a few years as a computational neuroscientist at the Institute of Neuroscience at Newcastle University.
I joined a multi-disciplinary project called Man, Mantis and Machine (M3), which aimed at reverse-engineering the neural algorithms underpinning 3D vision in the praying mantis, the only invertebrate known to have 3D vision. The project received a lot of publicity thanks to a novel experimental technique we developed that involved putting 3D glasses on insects.
I'll try to describe what we were doing and why, using the picture below.
This is an individual of Sphodromantis lineola, the praying mantis species we studied. I took this picture in one of our experimental setups; it shows the mantis sitting in front of a computer monitor and viewing a visual stimulus (the mantis is actually hanging upside down).
Most of our experiments followed the same protocol: we presented a visual stimulus to the mantis and recorded its behavioral and electrophysiological responses. Mantises are stealth predators that rely on vision to hunt prey -- they show fun behavioral responses to many visual stimuli; check this out for one ...
In this video there's a virtual target (a bug) swirling around the screen and triggering the mantis's tracking response. The sudden shifts in the background are global motion stimuli that make the mantis think it's falling, triggering a stabilization mechanism known as the optomotor response. Such behavioral observations are very simple, but they can be quantitatively measured and used to reverse-engineer the architectures and parameters of the neural computations underpinning behavior.
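To give a flavour of what "quantitatively measured" means here, below is a minimal, hypothetical sketch (the data, contrast levels and parameter names are made up for illustration, not taken from our experiments) of fitting a psychometric function to how often a mantis tracks a target as the target's contrast varies:

```python
# Hypothetical sketch: fitting a psychometric function to behavioral response
# rates. All numbers below are made-up illustrative data.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(contrast, c50, slope):
    """Probability of a tracking response as a function of target contrast."""
    return contrast**slope / (contrast**slope + c50**slope)

# Made-up data: target contrast levels, trials per level, observed responses.
contrast = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
n_trials = np.array([40, 40, 40, 40, 40])
n_responses = np.array([4, 9, 21, 33, 38])

p_observed = n_responses / n_trials
params, _ = curve_fit(psychometric, contrast, p_observed, p0=[0.2, 2.0])
print(f"estimated c50={params[0]:.3f}, slope={params[1]:.2f}")
```

With real data, fitted quantities like these (here a hypothetical semi-saturation contrast and slope) become the numbers a candidate neural model has to reproduce.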
My particular role within the project was to build computational models that account for recorded experimental observations. This was overall a fun and productive ride that (despite the occasional months of experimental failures and negative results) culminated in several interesting discoveries, for example see ...
- Invisible noise obscures visible signal in insect motion detection (Tarawneh et al., Scientific Reports 2017)
- Insect stereopsis demonstrated using a 3D insect cinema (Nityananda et al., Scientific Reports 2016)
From the few years I spent building these models, my main personal insight is that life is usually more mechanical than one might initially assume. It's very intuitive to think, for example, that a mantis tracking prey like in the video above is an agent with a complicated understanding of what prey is, why it must be captured, and how the future will unfold afterwards. It's therefore somewhat surprising to find that this behavior can be accurately modeled by a simple feed-forward signal processing system consisting of elementary blocks such as adders, multipliers and first-order filters.
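To make "mechanical" a bit more concrete, here is a toy sketch in that spirit (not the actual models from the papers above): a Reichardt-style motion correlator, a classic insect motion-detection circuit built entirely from first-order low-pass filters, multipliers and an adder/subtractor:

```python
# Toy sketch of a feed-forward circuit made of elementary blocks:
# a Reichardt-style motion correlator (illustrative only, not the models
# from the papers above).
import numpy as np

def lowpass(signal, tau, dt):
    """First-order low-pass filter, implemented as a discrete leaky integrator."""
    out = np.zeros_like(signal)
    alpha = dt / (tau + dt)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def reichardt_correlator(left, right, tau=0.05, dt=0.001):
    """Opponent motion signal from two neighbouring photoreceptor inputs."""
    left_delayed = lowpass(left, tau, dt)    # first-order filter acts as a delay
    right_delayed = lowpass(right, tau, dt)
    # multiply each delayed signal with the undelayed neighbour, then subtract
    return left_delayed * right - right_delayed * left

# Example: a drifting sinusoidal luminance pattern sampled at two nearby points.
dt = 0.001
t = np.arange(0, 1, dt)
phase_offset = 0.5                           # spatial separation of the two inputs
left = np.sin(2 * np.pi * 4 * t)
right = np.sin(2 * np.pi * 4 * t - phase_offset)
motion_signal = reichardt_correlator(left, right, tau=0.05, dt=dt)
print(f"mean opponent output: {motion_signal.mean():.4f}")  # sign indicates direction
```

Nothing in this circuit "knows" about prey or falling; it's just filters and arithmetic, yet circuits built from exactly these kinds of blocks go a long way toward reproducing behaviors like the ones in the video.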