AI Sees Optical Illusions: What It Reveals About Human Vision

Artificial intelligence systems trained for computer vision have begun to experience optical illusions in ways remarkably similar to human perception.

When researchers tested deep neural networks on classic illusions—such as the scintillating grid, where white circles appear to contain black spots that do not physically exist—the AI systems reproduced the same perceptual distortions observed in human observers. This discovery marks a turning point in neuroscience, revealing fundamental truths about how brains construct reality.

The parallels between artificial and biological vision are not coincidental. Modern deep neural networks with feedback mechanisms implement predictive coding, a computational principle increasingly recognized as central to how biological brains process visual information. When these networks include recurrent connections that let information flow backward through layers, mimicking the feedback pathways found in the brain's visual cortex, they spontaneously perceive illusions.

Networks trained on natural images suddenly "see" the Kanizsa square, where four Pac-Man shapes create the perception of a complete white square in the empty space between them. The illusion emerges not from the visual stimulus itself but from the network's internal predictions about what should be there.
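The filling-in described above can be sketched as a simple inference loop. The toy model below is purely illustrative and not any specific published network: a fixed matrix `W` plays the role of learned top-down predictions, `r` is the network's internal estimate of hidden causes, and the loop adjusts `r` until its prediction matches the visible parts of the stimulus. Occluded entries end up filled in by the model's own prediction, much as the empty space in a Kanizsa figure is completed into a square.

```python
import numpy as np

# Toy predictive-coding sketch (illustrative names and sizes, not a real model):
# top-down weights W generate a sensory pattern from hidden causes r,
# and inference minimizes the prediction error on the *visible* input only.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 2))            # generative ("top-down") weights
r_true = np.array([1.0, -0.5])         # hidden causes behind the stimulus
s = W @ r_true                         # the full sensory pattern
visible = np.ones(8, dtype=bool)
visible[[2, 5]] = False                # two entries are occluded, like the gaps in a Kanizsa figure

r = np.zeros(2)                        # the network's internal estimate
for _ in range(3000):
    err = np.zeros(8)
    err[visible] = (s - W @ r)[visible]  # prediction error, computed only where input exists
    r += 0.05 * W.T @ err                # feedback update: nudge the estimate to reduce the error

percept = W @ r                        # the top-down reconstruction is the "percept"
# the occluded entries are supplied by prediction, not by the stimulus
```

The key point of the sketch is that `percept` is never read off the input directly: it is generated from the internal estimate, so the two occluded entries are reconstructed from learned structure, exactly the behavior the illusion experiments probe.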

Yet the similarities between human and machine vision mask profound differences. Vision-language models such as GPT-4, Claude 3, and Gemini sometimes hallucinate illusions that do not exist, reporting illusory effects in ordinary images that cause no perceptual distortion in humans.

Other AI systems fail entirely to perceive motion illusions, responding to the physical properties of a moving stimulus rather than the perceived motion that humans experience. These gaps between artificial and biological perception shed light on what makes human vision distinctive: the brain does not simply respond to sensory input. Instead, it actively constructs perception through an intricate dialogue between different brain regions.

Recent neuroscientific discoveries have begun to decode the mechanisms underlying this constructive process. Researchers at the Allen Institute and UC Berkeley identified specialized neurons in the primary visual cortex, called IC-encoder neurons (short for illusory-contour encoders), that selectively respond to illusory contours.

Strikingly, when these neurons were artificially stimulated without any visual input, they triggered the same brain activity patterns observed during illusion perception. This finding overturns the classical view of vision as a purely bottom-up process in which the eye passively collects information. Instead, the brain receives top-down instructions from higher visual areas and uses these signals to complete missing information in early sensory regions.

The IC-encoder neurons receive feedback from higher-order brain structures, which send predictions downward through the visual hierarchy. When sensory information is incomplete or ambiguous—as it inevitably is in a complex, dynamic world—these top-down signals guide the construction of perception.

The system functions like a manager providing instructions to subordinate staff: higher brain regions propose what should be perceived based on learned patterns and expectations, and lower visual areas execute that construction. This mechanism represents an evolutionary solution to a fundamental problem: processing vast amounts of visual information with limited neural resources.

The brain's reliance on predictive shortcuts explains why optical illusions exist at all. Our visual system evolved not to achieve perfect accuracy in every instance but to extract meaningful information efficiently.

The Moon illusion, where the Moon appears larger near the horizon than at its zenith despite unchanged physical properties, emerges from the brain's assumptions about distance and size based on surrounding visual context. Similar heuristics drive other illusions: the Hering illusion, where straight lines appear curved, results from the brain's tendency to infer 3D depth structure from 2D images.

Understanding these limitations in human perception opens new avenues for medical research. Disorders involving false perceptions—such as hallucinations in schizophrenia—may stem from disruptions in the feedback mechanisms that construct normal visual experience.

By studying how the brain generates illusory percepts, researchers gain insight into what goes awry in perceptual disorders.

The work of creating optical illusions using artificial intelligence further illuminates human perception. Using generative adversarial networks, researchers can now synthesize entirely novel illusions optimized to fool both human observers and specific computer vision models.
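Stripped of the GAN machinery, the core idea is an optimization loop: treat the image itself as free parameters and adjust its pixels to maximize a target model's response. The sketch below is a minimal stand-in under stated assumptions: the tiny fixed "vision model" (`W1`, `w2`, `response`) is invented for illustration, not drawn from any real network, and real illusion-synthesis work uses deep models and generative networks. The gradient-ascent principle, however, is the same.

```python
import numpy as np

# Toy stimulus synthesis by gradient ascent (hypothetical model for illustration):
# sculpt a 16-"pixel" stimulus to maximize the response of a small fixed network.
rng = np.random.default_rng(1)
W1 = rng.normal(size=(4, 16)) / 4.0    # one hidden layer of a stand-in "vision model"
w2 = rng.normal(size=4)                # readout weights

def response(x):
    # scalar score the model reports for a stimulus (stand-in for "illusion strength")
    return w2 @ np.tanh(W1 @ x)

def grad(x):
    # analytic gradient of the response with respect to the pixels
    h = np.tanh(W1 @ x)
    return W1.T @ (w2 * (1.0 - h ** 2))

x = np.zeros(16)                       # start from a blank stimulus
for _ in range(300):
    x += 0.05 * grad(x)                # gradient ascent on the pixels
    n = np.linalg.norm(x)
    if n > 3.0:
        x *= 3.0 / n                   # keep stimulus energy bounded

baseline = response(np.zeros(16))      # blank stimulus scores zero
optimized = response(x)                # the synthesized stimulus scores higher
```

Because the loop only ever queries the model's response and gradient, the same recipe works whether the target is this toy network, a deep vision model, or, via an adversarial objective, human observers rated through experiments.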

This process reveals the precise mathematical structure that triggers perceptual distortion. Some researchers have recently developed AI tools capable of generating visual anagrams—images containing multiple recognizable figures that change depending on their orientation—allowing systematic investigation of how the brain segregates visual information.

The fundamental insight emerging from these studies is that perception is not a simple reflection of reality. The eye may collect photons like a camera and transmit signals to the brain, yet what humans perceive bears only an approximate resemblance to the physical world.

The brain imposes structure, fills gaps, makes predictions, and reinterprets ambiguous signals based on accumulated experience. Optical illusions expose the mechanics of this process by revealing moments when these efficient shortcuts fail.

The fact that artificial neural networks with feedback mechanisms reproduce human illusions suggests that predictive coding represents a genuinely fundamental principle of visual processing. It is not unique to biological brains but rather reflects deeper computational principles governing how systems extract meaning from noisy, incomplete sensory data.

Yet the differences between how humans and AI perceive certain illusions—particularly motion illusions—indicate that brains possess capabilities that current artificial systems have not yet replicated, including the capacity for true pattern learning independent of classification tasks.

As research continues to reveal the mechanisms of visual perception through the lens of artificial intelligence, a clearer picture emerges of what the human brain accomplishes.

It does not passively receive the world but actively constructs a coherent model of reality through iterative feedback processes, guided by learned predictions and contextual expectations. Optical illusions mark the boundaries where these mechanisms falter, and it is precisely at these boundaries that neuroscience discovers the true complexity of human perception.

Eric Collins

Eric Collins is the News Editor, with over ten years dedicated to science communication. His expertise focuses on reporting the latest scientific breakthroughs, fun facts, and the crucial intersection of research with modern technology and innovation.