Your two-month-old is lying there, staring at a rubber duck. You show them another rubber duck. Different colour, different angle. Their expression remains the same… blank. You think: cute. Their brain thinks: ah yes, another one of those.
No one told them what a duck is. No flashcards were involved. They've been alive for eight weeks. And yet, somewhere inside that tiny head, their brain has already filed "rubber duck" into its own category, separate from cats, trees, and shopping trolleys. A landmark 2026 study put babies into an fMRI scanner to watch this happen in real time. And what the researchers found is that your baby's brain organises the world almost identically to how the most advanced artificial intelligence learns to see. Except your baby does it faster, and without Wi-Fi.
What researchers call visual categorisation (the ability to group objects by type, even when they look slightly different) is already happening in your two-month-old's brain. Specifically, it's happening in the ventral visual cortex, a region at the bottom and side of the brain that processes what things *are*, as opposed to where they are or what they're doing.
Think of it like this: your baby's brain is building invisible folders. One folder collects images of birds: feathered, beaked, varied sizes. Another collects trees: trunks, leaves, different colours. Another collects shopping trolleys. Each folder fills itself with examples that share certain features, even though no two examples look exactly the same. Your baby isn't doing this consciously. Their brain is doing it automatically, sorting and filing the world before they can even hold their own head up.
Based on Deen et al. (2026): 130 infants show distinct neural patterns for different object categories at just 8 weeks old.
A team from Trinity College Dublin, Queen's University Belfast, and Stanford University recruited 130 two-month-old infants and placed them inside an fMRI scanner, the same imaging technology that maps adult brains. This was a monumental undertaking. Babies don't sit still. So instead of the clinical white room you might imagine, researchers built a soft, comfortable beanbag nest, gave each baby sound-cancelling headphones, and showed them bright, colourful images for 15 to 20 minutes while the scanner measured their neural activity.
They tested twelve different categories: cat, bird, rubber duck, shopping trolley, tree, and others. What they found was decisive. When a baby looked at a rubber duck and then another rubber duck (a different toy, different angle, different lighting), the two patterns of brain activity in their visual cortex were more similar to each other than either was to the pattern that appeared when they looked at a cat. The brain was treating ducks as a single category. It was grouping them. The babies' neural activity patterns resembled those that appear in deep neural networks, the artificial intelligence systems that teach themselves to classify objects by identifying shared features. At eight weeks old, before any deliberate learning, your baby's brain was already doing something that mirrors the way AI learns to see.
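If you're curious how "more similar within a category than between" can be measured at all, here is a toy sketch in Python. This is not the study's actual analysis, and every number in it is invented for illustration; real fMRI patterns span thousands of voxels and are far noisier. The idea, though, is this simple: correlate the patterns and compare.

```python
import numpy as np

# Toy "brain activity patterns": each is a short vector of made-up voxel
# responses. Real data would have thousands of voxels, not five.
duck_view_1 = np.array([0.9, 0.1, 0.8, 0.2, 0.7])
duck_view_2 = np.array([0.8, 0.2, 0.9, 0.1, 0.6])  # a different duck, different angle
cat_view    = np.array([0.1, 0.9, 0.2, 0.8, 0.1])

def pattern_similarity(a, b):
    """Pearson correlation between two activity patterns."""
    return np.corrcoef(a, b)[0, 1]

within_category  = pattern_similarity(duck_view_1, duck_view_2)
between_category = pattern_similarity(duck_view_1, cat_view)

# If the brain files ducks into one category, duck patterns should
# correlate with each other more strongly than with the cat pattern.
print(within_category > between_category)  # prints True for these toy vectors
```

The same comparison, run across twelve categories and many babies, is the kind of evidence behind the claim that the infant brain is grouping, not just seeing.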
Here's the part that changes how you think about all of it. To train an artificial intelligence to tell ducks from cats, you need to feed it millions of labelled images. Your two-month-old has seen maybe a few hundred objects total. They haven't had tutors or learning apps. And yet, their brain has already learned to extract the essential features that make a duck a duck, matching the discrimination ability of AI systems trained on vastly more data.
This is built into the architecture of their visual system from birth, or even before. Other research shows that face perception starts earlier still: newborns orient to face-like patterns within hours of birth, and by 26 weeks of gestation, fetuses already move their eyes preferentially toward face-like shapes projected through the uterine wall. The ability to categorise doesn't emerge from experience alone. It emerges from the structure of the brain itself, working in concert with whatever sensory input arrives.
The researchers also conducted follow-up scans when these babies turned nine months old. What they found was a deepening of the same process. The category patterns in the visual cortex became stronger, more stable, more finely tuned. Like watching a rough pencil sketch gradually harden into precise ink lines.
We tend to think a child's comprehension starts just before they begin expressing it, using gestures or words. But this research says otherwise: the sorting, the categorising, the quiet organising of the entire visual world is happening as early as eight weeks, long before your baby can communicate any of it to you.
So the next time your two-month-old stares at a toy with that wide-eyed wonder, know that behind those beautiful eyes, the filing cabinets are open and busy.