A new camera system lets us see the world through the eyes of the birds and the bees

A new camera system and software package allows researchers and filmmakers to capture video of the animals. Credit: Vasas et al., 2024.

Who hasn’t wondered how animals perceive the world, which is often quite different from how humans perceive it? Scientists, photographers, and filmmakers have tried various ways of reconstructing, say, the colors a bee sees as it searches for a ripe flower to pollinate. Now a multidisciplinary team has developed an innovative camera system that is faster and more flexible in terms of lighting conditions than existing systems, allowing it to capture moving images of animals in their natural habitat, according to a new paper published in the journal PLoS Biology.

“We have long been fascinated by the way animals see the world. New techniques in sensory ecology allow us to infer what static views might look like to an animal,” said co-author Daniel Hanley, a biologist at George Mason University in Fairfax, Virginia. “However, animals often make critical decisions about moving targets (e.g., detecting food items, evaluating the display of a potential mate, etc.). Here, we provide hardware and software tools for ecologists and filmmakers that can capture and display the colors that animals perceive during movement.”

Per Hanley and his co-authors, different animal species have unique sets of photoreceptors that are sensitive to a wide range of wavelengths, from ultraviolet to infrared, depending on the specific environmental needs of each animal. Some animals can even detect polarized light. So each species will perceive color a little differently. For example, honeybees and birds are sensitive to ultraviolet light, which is invisible to the human eye. “Since neither our eyes nor commercial cameras capture such variations in light, large swaths of visual fields remain unexplored,” the authors wrote. “This makes false-color images of animal vision powerful and compelling.”

However, the authors stress that current techniques for producing false-color images cannot capture the colors animals see while moving, an important limitation because movement is crucial to how animals use color signals to communicate and navigate the world around them. For example, traditional spectrophotometry relies on light reflected off the body to estimate how a given animal’s photoreceptors process that light, but it is a time-consuming method, and a lot of spatial and temporal information is lost.

Peacock feathers through the eyes of four different animals: (a) the peacock; (b) humans; (c) honey bees; and (d) dogs. Credit: Vasas et al., 2024.

Multispectral imaging takes a series of images across different wavelengths (including ultraviolet and infrared) and combines them into different color channels to extract camera-independent color measurements. This method trades some resolution for better spatial information, and is well suited for studying animal signals, for example, but it only works on stationary objects, so temporal information is not available.

This is a disadvantage because “animals provide and receive signals from complex shapes that cast shadows and generate light,” the researchers wrote. “These signals vary under ever-changing lighting and vantage points. Information about this interplay between background, illumination, and dynamic signals is scarce. However, it constitutes a critical aspect of the ways colors are used, and thus perceived, by free-living organisms in natural environments.”

So Hanley and his colleagues set out to develop a camera system capable of producing high-resolution videos of animals that capture the full complexity of visual signals as an animal would see them in a natural environment. They combined existing methods of multispectral imaging with new hardware and software designs. The camera records video in four color channels simultaneously (blue, green, red, and ultraviolet). Once this data is processed into “perceptual units,” the result is an accurate video of how different animals perceive a colored scene, based on what we know about their photoreceptors. The team’s system predicts perceived colors with up to 92 percent accuracy. The cameras are commercially available, and the software is open source so others can freely use and build upon it.
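The core idea of converting camera channels into "perceptual units" can be sketched as a linear mapping from the four recorded channels onto an animal's photoreceptor responses. The snippet below is a simplified illustration, not the team's actual pipeline: the pixel values and the honey-bee sensitivity matrix are made-up numbers chosen only to show the shape of the computation.

```python
import numpy as np

# Each row is one pixel's reflected light in the four camera channels
# (UV, blue, green, red), scaled 0-1. Values are hypothetical.
frame = np.array([
    [0.80, 0.30, 0.20, 0.10],   # a UV-bright flower petal
    [0.05, 0.10, 0.60, 0.55],   # green foliage
])

# Rows: honey bee photoreceptor types (UV, blue, green). Each row weights
# the contribution of the four camera channels. Illustrative values only;
# a real system would use calibrated spectral sensitivities.
bee_sensitivity = np.array([
    [0.90, 0.10, 0.00, 0.00],   # UV receptor
    [0.10, 0.80, 0.10, 0.00],   # blue receptor
    [0.00, 0.10, 0.70, 0.20],   # green receptor
])

# Photoreceptor "catch" = weighted sum of the camera channels per pixel.
catches = frame @ bee_sensitivity.T

# Normalize each pixel so its receptor responses sum to 1, giving a
# crude analogue of perceptual units for false-color rendering.
perceptual = catches / catches.sum(axis=1, keepdims=True)
print(perceptual.round(3))
```

Applied frame by frame, a transform like this is what lets a four-channel video be re-rendered as a false-color view for a species with a different set of photoreceptors.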

The video at the top of this article depicts the colors seen by honey bees as they watch fellow bees foraging for food and interacting (even fighting) over flowers, an example of the camera system’s ability to capture behavior in a natural setting. Below, Hanley applies UV-blocking sunscreen in the field. His light-colored skin appears roughly similar in human vision and in honey bee false-color vision “because skin reflectance gradually increases at longer wavelengths,” the researchers wrote.
