Our visual perception is more rational than we think

By Christoph Elhardt.

Key points

  • Rather than provide a complete view of the world, our retina already tries to process information in the most useful way possible.
  • This means that cognitive biases start long before we consciously think about something.
  • An artificial intelligence that had to solve the same tasks as human study participants helped confirm these findings.

Our visual perception depends more strongly on the utility of information than previously thought. This has been demonstrated in a series of experiments conducted by researchers at the Neuroscience Center Zurich. Cognitive biases can begin at the retina.

Are our senses there to provide us with the most complete representation of the world, or do they serve our survival? For a long time, the former was the dominant view in neuroscience. “Was” is the operative word here. Over the last 50 years, psychologists such as Nobel laureate Daniel Kahneman and his long-time collaborator Amos Tversky have shown that human perception is often anything but complete and is instead highly selective.

Experiments have since documented a whole list of cognitive biases. One of the most important is confirmation bias: we tend to process new information in a way that confirms our existing beliefs and expectations.

But up until now, researchers haven’t been able to fully explain under what conditions these distortions come into play and when exactly in the perceptual process they begin. A study by researchers led by University of Zurich Professor Todd Hare and ETH Professor Rafael Polania, recently published in the journal Nature Human Behaviour, now shows that the brain already adjusts the visual perception of things on the retina when it is in our interest to do so. Or, to put it another way, we unconsciously see things distorted when it comes to our survival, well-being, or other interests.

How slanted are the stripe patterns?

Hare and his coauthors were able to prove through a series of experiments that people perceive the same things differently when the decision context changes. The study’s 86 participants were asked to repeatedly compare two black-and-white striped patterns – known as Gabor patches – and say which pattern was closer to a 45-degree angle. The aim was to score as many points as possible.

In the first round, they received 15 points for every correct answer. In the second round, however, the decision context changed: it no longer mattered whether the answer was right or wrong. Instead, the score rose continuously with the chosen pattern’s angle, from 0 points at 0 degrees up to a maximum of 25 points at 45 degrees. The participants saw the same pairs in both rounds.
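The two payoff schemes can be sketched in a few lines of Python. The function names and the linear mapping from angle to points in the second round are our own simplifying assumptions; the study only states that the score rose continuously from 0 to 45 degrees up to a maximum of 25 points:

```python
def score_round1(correct: bool) -> int:
    """Round 1: a flat 15 points for each correct comparison."""
    return 15 if correct else 0

def score_round2(chosen_angle_deg: float) -> float:
    """Round 2: points depend only on the chosen pattern's angle,
    rising from 0 points at 0 degrees to 25 points at 45 degrees
    (linear mapping assumed for illustration)."""
    angle = min(max(chosen_angle_deg, 0.0), 45.0)
    return 25.0 * angle / 45.0
```

Under the second rule, picking the steeper-looking pattern always pays, whether or not it is actually closer to 45 degrees, which is precisely the incentive that shifted participants’ perception.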

They ought really to have reached the same conclusion both times. When we look at something, our retinas convert the reflected light into visual information, which travels to the brain via nerve pathways. There, it is matched with our prior knowledge and experience and processed into a three-dimensional image. The visual information was identical in both rounds.

The subjects first focus on the cross before comparing the inclination of the Gabor patches in two rounds. (Illustration: Schaffner et al. 2023)

What we see depends on the context

When the researchers evaluated the experiment, they realised that the participants had adjusted their perceptions in the second round to score as many points as possible. If they actually saw the world objectively, there shouldn’t be any differences between the two rounds.

Participants’ assessments of the Gabor patches’ angles ought to have been the same each time, irrespective of the decision context. But this wasn’t the case: “People flexibly and unconsciously adjust their perceptions when it works to their advantage,” the scientists write.

For the researchers, it would miss the point to dismiss these cognitive distortions as mere errors that lead to inaccurate or irrational judgements and decisions. “Since our cognitive abilities are limited, it actually makes sense that we perceive the world in a distorted or selective way,” says Rafael Polania.

Even the retina prioritises useful information

Our visual perception seems to depend more strongly on the potential utility of information than previously thought. In another experiment, the researchers were able to show that our retinas already try to process information in the most advantageous way possible.

“As soon as we look at something, we try to maximise our own benefit. This means that cognitive bias starts long before we consciously think about something,” Polania says.

This is because a lot of information is lost in perception. It’s therefore more efficient for the brain to filter, prioritise and select information as early on as possible.

AI filters visual information like humans

To determine where in the perceptual process visual information is distorted, a group of participants repeated the test with a variable score. Unlike in the first experiment, however, the Gabor patch pairs were displayed at the top of the visual test field. After this training round came the real task: the participants repeatedly saw a single Gabor patch at the top or bottom of the test area and had to estimate the angle of its stripes.

The researchers found that the participants assessed each patch differently depending on whether it appeared at the bottom or at the top of the test field. When subjects saw the patch at the top, their perceptions immediately adapted to the utility maximisation logic they had applied during the training round. This wasn’t the case when the patch appeared at the bottom.

The study’s authors also tested these results with an artificial intelligence (AI) agent that underwent the same experiments as the human subjects. To achieve the highest possible score, the AI agent likewise stopped trying to represent the world accurately as soon as it began processing the information, and it exhibited the same perceptual biases observed in humans.
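The intuition behind this result can be caricatured in a few lines of Python. In this toy sketch (the function names, the linear payoff, and the quadratic plausibility penalty are our own illustrative assumptions, not the authors’ model), an agent rewarded for utility rather than accuracy systematically overstates the angle it observed:

```python
def reward(report: float) -> float:
    """Round-2-style payoff: higher reported angles earn more points."""
    return 25.0 * min(max(report, 0.0), 45.0) / 45.0

def accuracy_agent(observation: float) -> float:
    """Reports the stimulus as faithfully as it can."""
    return observation

def utility_agent(observation: float, noise: float = 5.0) -> float:
    """Trades faithfulness against payoff: because steeper angles pay
    more, the best report is nudged above the actual observation."""
    candidates = [observation + d for d in range(-10, 11)]
    return max(candidates,
               key=lambda r: reward(r) - ((r - observation) / noise) ** 2)
```

Here `utility_agent(20.0)` reports roughly 27 degrees rather than 20: the payoff gradient pulls the estimate upward until the implausibility of the report outweighs the extra points, mirroring how the participants’ percepts drifted toward the reward-maximizing answer.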

The study participants rated the slope of the patches differently depending on whether they appeared at the bottom or top of the test field. (Illustration: Schaffner et al. 2023)

Biases are more deeply rooted than previously thought

The results of the study may also shed new light on the discussion of biases in humans and AI agents. Perhaps these distortions are so difficult to identify and change because they are an unconscious part of vision. They kick in long before we can think about what we see.

The fact that our perception is programmed to increase utility rather than to fully represent the world doesn’t make things any easier. Yet, the results of the study can also help us find new ways to identify and correct biases.

Schaffner et al., “Sensory perception relies on fitness-maximizing codes,” Nature Human Behaviour (2023).
https://www.nature.com/articles/s41562-023-01584-y

Source: University of Zurich, June 12, 2023