Not long ago, a small monkey slipped into my house while I wasn't looking, snatched two cartons of milk, bit one open, spilled it everywhere, and drank happily.
Naturally, I scared it away.
However, it led me to reflect on some very interesting questions.
Over the past few years, I've become quite familiar with monkeys, seeing them in troops every day:
They can see and smell whether passersby are carrying food.
They judge the "strength" of pedestrians, usually targeting "women, children, and the elderly" to snatch items.
They operate in groups: older monkeys give orders, orchestrating pincer attacks.
But for someone like me, if I give them a sharp glare while passing by, they flinch and dodge, even if I'm openly carrying a takeout "Yuenyeung" (coffee-tea mix).
They frequently break into nearby residents' homes to "steal" things, just as I experienced.
Recently, some lone young monkeys have uncharacteristically tried to "snatch" drinks straight from my hand, but they were easily scared off. Their resolve is weak and their organization loose, which makes me suspect their elders haven't trained them well.
According to TV specials, when these monkeys break into homes, they also "steal" mirrors; perhaps they've begun to indulge in admiring themselves in the reflection.


As I connect these past observations with what I'm consciously observing now and "ruminate" on them, I can't help but pose a few questions:
Over their years of coexisting with humans, monkeys have evidently shown a noticeable increase in "IQ," even if it is not yet comparable to ours.
Does intelligence stem from society, be it a monkey society or a human one?
Given enough time, could monkeys evolve to a level of intelligence approaching that of humans?
It is precisely because of these questions that I have a fairly clear stance on Yann LeCun's earlier conclusions: I half agree and half disagree.
The part I agree with is: let's get AI to the intelligence level of a "cat" before discussing "Super AI." Yes, because mammalian "intelligence" is more tangible to us.
The part I disagree with is this: if the questions above point to the answer that society is a crucial driver of the evolution of intelligence, then could socializing AI (for example, through platforms like Moltbook) lead to the emergence of "intelligence"? After all, while biological social evolution takes thousands or tens of thousands of years or longer, the evolution of digital life faces no such time constraint as long as there is sufficient computing power.
But the prerequisite is that AI needs not only memory but also the ability to modify its own presets.
This prerequisite clearly hasn't been met yet. When will it be? Perhaps it's on the way, or perhaps it requires much more accumulation; perhaps no one has a definitive answer yet.
The issues discussed above are not new; they have been discussed in detail since perhaps the 1960s.
Maybe history means a lot to us, or maybe history is just a burden and a distraction. Who knows?