Beyond Compute: Thinking About the Upper and Lower Bounds of the "Bubble Theory"

Two very direct reasons prompt me to revisit this topic:

First, I have once again felt the acute frustration of limited compute. This happens every time Gemini releases a new model: each release broadens the boundary of what applications can do, and an "illusion" follows, the sense that many real-world tasks could be transformed in short order. Then the rate limits imposed by compute and budget constraints quickly pull those wandering thoughts back to reality.

Second, starting in August and September, my frameworks and calculations gradually converged with the market's points of anxiety. As more and more voices shout "bubble," we are well aware that although every burst bubble carries the hallmarks of a crisis triggered by liquidity exhaustion, the rapid iteration and penetration of AI have made this story considerably more complex.

Fundamentally, this is a story about two curves and three points of contention.

One curve is Capex; clearly, this is an almost exponentially rising curve.

The other curve is revenue. We hope it is an accelerating growth curve, but it seems inevitably to invite the worry that it resembles a logarithmic (ln) curve.

If we look at these two curves statically, the bubble theory clearly holds. Yet every three to six months we find that, on one hand, gravity pulls down the slope of the exponential Capex curve, while on the other, rapid iteration and penetration seem to pull up the slope of the revenue curve.

Therefore, this is essentially a complex periodic movement of two curves under "spring" constraints.
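The static version of the two-curve argument can be sketched numerically. In the toy model below, every parameter (base capex, growth rate, revenue scale) is a hypothetical placeholder, not a real estimate; the point is only to show why, held fixed, an exponential curve eventually dwarfs a logarithmic one, which is the core of the static bubble reading.

```python
import math

# Toy model: all parameters are hypothetical illustrations, not estimates.
def capex(t, c0=10.0, g=0.5):
    """Exponential capex curve: c0 * e^(g*t)."""
    return c0 * math.exp(g * t)

def revenue(t, a=30.0, b=2.0):
    """Log-like revenue curve: a * ln(1 + b*t)."""
    return a * math.log(1 + b * t)

# Under static assumptions the capex-revenue gap eventually widens
# without bound: exponential growth outruns logarithmic growth.
gap = [capex(t) - revenue(t) for t in range(0, 9, 2)]
```

The "spring" constraint in the text corresponds to the parameters themselves shifting every few months: `g` being pulled down and the revenue curve being pulled up, so neither static extrapolation holds for long.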

The first point of contention: Is the Scaling Law still working?

The second point of contention: Are there new architectures that can significantly save computing power?

The third point of contention: Is AI truly reliable?

Interestingly, the answer to all three questions seems to be the same: "it depends."
