How Animals See
Daniel Hanley was four or five when he tried his first cup of plain yogurt with wheat germ. "It was an atypical snack," he says, but he remembers loving it instantly. Hanley walked into his family's backyard in upstate New York to eat his yogurt at the forest's edge, where he was visited by two adult robins.
"They were approaching me, calling to me with small chirps," he says. So Hanley chirped back. For 10 or 15 minutes, the boy and the birds kept up a spirited exchange. He says he was "really sure that they were trying to communicate with me."
That was the moment when Hanley first recalls being fascinated by the minds of animals and their perceptual world. "I spent a long time wondering about how they experience such encounters," he says. "What do they sense, perceive, and feel?"
Not quite four decades later, as a sensory ecologist at George Mason University and a National Geographic Explorer, Hanley remains just as enthralled with that question and the notion that "animals have value and wealth and their own perspectives and their own worldviews," as he told me. He longs to know how an animal's perception of its world impacts its judgments, behaviors, and decisions. Early in his career, he decided to focus much of his attention on bird eggs: the evolution of egg color and, more recently, cuckoo eggs in particular. These medium-sized birds lay their eggs in the nests of other species, forcing the host to raise the baby cuckoos, a phenomenon known as brood parasitism. Hanley wondered which features of a cuckoo's egg so effectively dupe the host birds into thinking it's one of theirs. The answer, he found, is color mimicry. "Many of the hosts are subjected to a fairly big cuckoo egg," he says. "The size and the shape and all that stuff doesn't matter near as much as color. So they're really tied to this world of color."
Hanley also chose to study bird eggs because they're straightforward to work with and experiment on; after all, "they don't get up and walk away." But Hanley knew that a bunch of eggs lying around a nest is an overly simplistic reduction of the kind of scenario animals usually navigate. Their visual world is a dynamic and fluid one that's constantly moving and evolving. An organism has to interact with all that shifting complexity every moment to stay alive. Imagine a tiger slinking through the forest; plants and flowers fluttering in the breeze and attracting pollinators; a bird-of-paradise strutting its feathers in a clearing; or a frilled lizard fanning out the fold of skin encircling its neck to startle a predator. When we use our naked human eye to watch any of these scenes unfold, we're not seeing the whole picture. To understand what's at stake and what's really happening from an animal's perspective, Hanley says we need to view the world through the organism's visual system.
This kind of picture was a far more challenging one for Hanley to capture, visualize, and interpret. "It's really hard to wrap your head around what another animal can see," he observes. His eggs would only get him so far.
Around this time, he connected with Vera Vasas, a computational biologist now at the University of Sussex who was facing a similar struggle. She had worked on building a computer model to understand how honeybees learn images and patterns. "The first question that I was supposed to put in the model," she says, "[was:] OK, so what are the bees seeing? Then I went, 'Oh shit.' We don't actually know, do we?"
Previously, to understand what a scene might look like to a honeybee, researchers assembled images of flowers "from digital photos taken through a UV, a blue, and a green filter matching the spectral sensitivity of the bees' photoreceptors." This approach yielded tantalizing but fairly limited and entirely static glimpses into the insect's view of its surroundings. "It's very restricted," Vasas says. "There's a lot that you are just suspecting [is] out there," but she had no way of seeing it for herself. What she wanted was a moving image: to see the world as a honeybee sees it, to know "how another animal [would] experience what I'm experiencing," she says.
So Hanley and Vasas teamed up with a dozen colleagues to build something that had never been attempted successfully before: a digital video camera that could be pointed at a complex, dynamic scene and that would instantaneously relay back that same view as though seen through the eyes of a mammal, bird, or insect.
*
This lofty idea soon became mired in the reality of executing it: "a frantic struggle of difficulties," according to Hanley. "It was quite a hard project. I feel as though [at] every step we had to relearn and we had to redo."
"The whole camera," says Vasas, "is more or less built on existing knowledge. But it had to be pulled together and unearthed and understood." Between the selection of the optics, aligning the images, sorting out the focus, teasing apart what the camera was doing internally, finessing the components they wanted to 3D print, and more, Vasas says it felt like the project required an infinite number of iterations. "Every time, there was another thing [that] made it not work."
But after years of trying, Vasas, Hanley, and their colleagues finally did it. They made a device to see what animals see.
Here's how it works. Two consumer Sony cameras are positioned perpendicular to one another. Light entering the system first hits a special piece of glass called a beam splitter. It routes ultraviolet light to a camera that's sensitive to UV but allows visible light to travel to the second camera (that has a sensor for red light, another for green light, and a third for blue light). "Once you hit go," says Hanley, "both cameras start rolling footage. And they're both seeing the same thing that's occurring in front of the lens at the same time." The result is footage of the scene, split into four streams: one based on red light only, one on green light only, one on blue light only, and one on UV light only. If you know an animal's sensitivity to different wavelengths of light, you can then compute the red, green, blue, and UV contributions to capture what that organism would see. And you can easily recompute the colors to render out the same scene according to the perspective of a different animal. There's a fair bit of software that's required to make the whole thing sing. "[Vera] worked her magic behind the scenes with Python," says Hanley, referring to the computer programming language.
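The recombination step described above can be sketched in a few lines of Python. This is only a toy illustration of the idea, not the team's actual pipeline: the receptor-sensitivity matrix and the false-color mapping below are made-up numbers standing in for measured spectral sensitivities.

```python
# Toy sketch: combine four camera channels (R, G, B, UV) into an
# "animal view," then map receptor responses to display RGB.
# All matrices here are illustrative assumptions, not measured data.
import numpy as np

def animal_view(frame, sensitivity, false_color):
    """Render one four-channel frame as a false-color animal view.

    frame:       (H, W, 4) array of camera responses in the order
                 [R, G, B, UV], each normalized to 0..1.
    sensitivity: (n_receptors, 4) matrix; row i is receptor i's
                 relative sensitivity to the R, G, B, and UV bands.
    false_color: (3, n_receptors) matrix mapping receptor responses
                 to display RGB so humans can see the result.
    """
    receptors = frame @ sensitivity.T      # (H, W, n_receptors)
    display = receptors @ false_color.T    # (H, W, 3)
    return np.clip(display, 0.0, 1.0)

# Hypothetical trichromatic viewer whose short-wave receptor also
# picks up UV; the UV contribution is shown as magenta (red + blue),
# much like the overlays described in the videos.
frame = np.random.rand(4, 4, 4)                 # tiny test frame
sensitivity = np.array([[1.0, 0.0, 0.0, 0.0],   # long-wave receptor
                        [0.0, 1.0, 0.0, 0.0],   # medium-wave receptor
                        [0.0, 0.0, 1.0, 0.5]])  # short-wave + UV
false_color = np.array([[1.0, 0.0, 0.5],        # UV tints red channel
                        [0.0, 1.0, 0.0],
                        [0.0, 0.0, 1.0]])
view = animal_view(frame, sensitivity, false_color)
```

Swapping in a different sensitivity matrix, per species, is what lets the same four streams of footage be re-rendered for a bee, a bird, or a mouse.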
The results are stunningly kaleidoscopic and even psychedelic in some cases. For Vasas, it was "a relief being able to see what I have been trying to imagine before." She says the camera allows her to skip a lot of the mental gymnastics to remember what should be visible. Now, "it's just there."
*
Check out this "bird's-eye view" of a museum specimen of the orange-barred sulphur butterfly, showcasing how brightly it reflects UV light, a wavelength to which most songbirds are sensitive. Here, it's rendered purple and magenta to make it visible to us.
In this next video of three male orange sulphur butterflies (a different species) as a songbird might see it, UV iridescence is displayed as purple. Males display this iridescence as a key mating signal, one that humans miss altogether.
Here's a video of two northern mockingbirds interacting in a tree as a songbird would view it. The UV signal has been overlaid as magenta to make it visible to the human eye.
"For all the animals that can perceive UV, the sky isn't really blue," says Vasas. "It's a much brighter UV because of the way the sunlight gets scattered in the atmosphere." As she and her coauthors state in the paper, "Thus, while the sky may appear blue to our eyes, it would appear UV-blue to many other organisms." This is something Vasas says she could have looked up. But she found it arresting to watch this video and see the sky through the eyes of a bird: as a "glowing orb of UV," to borrow Hanley's words.
Look at this video of a black swallowtail caterpillar being coaxed into an anti-predatory display as a honeybee might see things. The UV, blue, and green light streams are shown as blue, green, and red, respectively.
Note the two-pronged "osmeterium" that emerges from just behind the caterpillar's head, which it uses to ward off predators (in this case, the researcher's pair of forceps). Before this camera, an osmeterium "really wasn't something that was accessible," says Hanley. "It's [typically] inside their body, hidden. It's soft. It's only used in certain contexts and then it [goes] right back in." Osmeteria contain chemicals that help defend the caterpillar. This video also shows how the organ reflects UV light (depicted as magenta), possibly serving as a visual warning as well.
Finally, here arrayed in a grid is how four different species would view a rainbow: a mouse (A), a honeybee (B), a bird (C), and a human (D). Not only do other species pick up the UV portion of the rainbow (which is invisible to us), but the mouse's smaller number of different photoreceptors means it might view the same rainbow as being composed of fewer, broader bands.
"Our art teachers always said that there's a difference between looking and seeing," says Hanley. "When you actually see something, it's different than when you just simply look at it. And rainbows are a good case study of this."
*
If successful, this invention by Vasas, Hanley, and their colleagues could help researchers understand animal behavior and cognition using motion instead of still images: "the full display," as Hanley puts it. "Within my discipline," he says, "there's all sorts of questions that start with some type of color that's moving."
Take camouflage. To appreciate how an organism actually strives to conceal itself, Vasas says you have to know what's visible and invisible to both the camouflaging animal and the observer it's hiding from. There are agricultural applications as well. For instance, we rely on bees as pollinators, but they're not faring well globally. "Understanding what signals they detect and what it means to them," says Hanley, might give us insights into why their colonies are collapsing.
Then there are the more imaginative applications. Hanley considers installing one of these cameras at a children's museum where kids could toggle between, say, a snake's view and a hamster's view of the same scene. He says that kind of uninhibited exploration can unlock a deep and lasting curiosity about the natural world. Not unlike his early experience communicating with robins over yogurt, it gives young people an intuitive understanding that animals have their own way of perceiving and interacting with their surroundings, each one just as valid as the next. And Hanley, who's collaborated with filmmakers before, sees this camera as a new, flexible tool that cinematographers can use to represent an animal's visual perspective authentically in their movies. (These twin aims of rigorous science and artistic expression helped convince the National Geographic Society to fund the project.)
Ultimately, this invention reveals how arbitrary any visual scene really is, and could help end the long human habit of evaluating animal intelligence and acuity based on how we understand reality, rather than the way other species do. "Whatever you see in the world is very specific to you," explains Vasas. "Perception is deeper than just what's happening at the photoreceptor," adds Hanley.
"Maybe [this tool] can bring about some understanding of animals," Vasas says, "knowing that what they see at the basic level is so different to us. They experience things differently. They think differently."
"But," she says, "that doesn't necessarily mean that they are less than humans."