Scientists have combined a new camera system with open source software to create stunning videos of the world as different animals see it, including the specific colors they see.
From the deepest reds to streaks of ultraviolet light, the shots show different natural scenes, with some colors highlighted and others faded depending on the visual system of the animal being simulated.
One clip shows a zebra swallowtail butterfly (Protographium marcellus) feeding on flowers as honeybees (Apis mellifera) would see it. In total, the scientists published 12 video clips showing how birds, bees, mice and dogs see the world.
To produce the videos, the researchers set up cameras to capture raw footage and then applied post-processing software to predict the colors perceived by different species. This method, which they described in a research paper published Jan. 23 in the journal PLOS Biology, was 92% accurate when tested against conventional spectrophotometric techniques.
A zebra swallowtail butterfly feeding on flowers, as seen by honeybees. (Credit: Vasas V, et al., 2024, PLOS Biology, CC-BY 4.0)
“We have long been fascinated by how animals see the world,” Daniel Hanley, the study’s senior author and an assistant professor of biology at George Mason University in Virginia, said in a statement. “Modern techniques in sensory ecology allow us to infer what static scenes might look like to an animal; however, animals often make critical decisions about moving targets (e.g., detecting food items, evaluating the display of a potential mate, etc.). Here, we provide hardware and software tools to ecologists and filmmakers that can capture and display the colors perceived by animals in motion.”
Species see the world differently, partly because of the photoreceptors in their eyes and the neural architecture of their brains. Dogs' eyes, for example, are organized similarly to those of people with red-green color blindness, while insects such as honeybees can see ultraviolet light, the scientists noted in their paper.
To better understand how animals see the world, researchers have devised various methods to accurately reproduce the colors that animals see, but these techniques have only been able to generate static images.
For example, spectrophotometry estimates what an animal's photoreceptors are picking up from the light reflected off an object's surface. These methods have so far produced only static measurements, cannot infer spatial information and are time-consuming, the scientists said. Meanwhile, multispectral photography, which relies on taking a series of images in several wavelength bands, trades temporal resolution for spatial information, but it only works on stationary subjects.
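The spectrophotometric approach the article mentions can be sketched as a "quantum catch" calculation: the light reflected from a surface is weighted by each photoreceptor's sensitivity curve and summed over wavelength. The sensitivity curves, illuminant and reflectance below are illustrative placeholders, not data from the study, and the Gaussian receptor shapes are a simplifying assumption.

```python
import numpy as np

# Wavelength grid spanning ultraviolet through red (nm), 10 nm spacing.
wavelengths = np.arange(300, 701, 10)

def gaussian_sensitivity(peak, width=40.0):
    """Toy photoreceptor sensitivity curve centred on `peak` nm."""
    s = np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)
    return s / s.max()

# Hypothetical honeybee-like trichromat: UV, blue and green receptors.
receptors = {
    "uv": gaussian_sensitivity(350),
    "blue": gaussian_sensitivity(440),
    "green": gaussian_sensitivity(540),
}

# Placeholder flat illuminant and a UV-bright flower reflectance.
illuminant = np.ones_like(wavelengths, dtype=float)
reflectance = 0.2 + 0.6 * gaussian_sensitivity(360)

# Quantum catch: Riemann-sum approximation of the integral of
# reflectance * illuminant * sensitivity over wavelength (10 nm bins).
catches = {
    name: float(np.sum(reflectance * illuminant * sens) * 10.0)
    for name, sens in receptors.items()
}

# Normalise so the catches sum to one, giving relative receptor stimulation.
total = sum(catches.values())
relative = {name: c / total for name, c in catches.items()}
print(relative)
```

For this UV-bright surface, the UV receptor's relative catch dominates, which is why such a flower would look conspicuous to a bee but unremarkable to a human.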
To overcome these limitations, the researchers built the new system from commercially available Sony a6400 cameras, configured to record four color channels — red, green, blue and ultraviolet — simultaneously.
Next, they mounted the cameras on a 3D-printed chassis assembled from various pieces of photographic equipment, including a modular cage, beam-splitter mirror mounts and cone baffles (which reduce light leakage into the cameras).
This was the first step in a pipeline that began with capturing raw footage and ended with displaying the final clips. To render the video in the colors an animal perceives, the researchers applied their video2vision software — a set of conversion functions — to the raw footage. They then passed the data through "perceptual modules," akin to image filters, each adjusted to existing knowledge of a given species' photoreceptors to predict what that animal might see.
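The pipeline described above uses the authors' video2vision package. As a rough illustration of the underlying idea — not that package's actual API — a per-pixel "perceptual module" can be thought of as a linear map from the camera's four channels to an animal's photoreceptor channels. The 3x4 matrix below is a made-up placeholder; real coefficients would be fit from the camera's spectral calibration.

```python
import numpy as np

# A frame of raw footage: height x width x 4 camera channels (R, G, B, UV).
# Random values in [0, 1] stand in for real sensor data.
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 1.0, size=(4, 6, 4))

# Hypothetical 3x4 matrix mapping camera channels to an animal's three
# photoreceptor classes (e.g., a bee's UV, blue and green receptors).
# Each row's weights sum to 1, so outputs stay in the input range.
camera_to_receptors = np.array([
    [0.05, 0.00, 0.10, 0.85],  # UV receptor draws mostly on the UV channel
    [0.00, 0.15, 0.80, 0.05],  # blue receptor draws mostly on blue
    [0.20, 0.75, 0.05, 0.00],  # green receptor draws mostly on green
])

# Apply the linear map to every pixel at once via broadcasting.
receptor_frame = frame @ camera_to_receptors.T

print(receptor_frame.shape)  # one value per receptor per pixel
```

Running the same map over every frame of a video is what lets the technique handle moving targets, which the static methods above cannot.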
Scientists and filmmakers who study animals could use this setup to capture and process their own footage, the researchers said. In particular, viewing footage with animal-vision filters applied could reveal more about how particular species interact with their environment and respond to stimuli.