I don’t know how everyone else feels watching their tokens drain while using Claude, but one thought keeps hitting me: this thing is exactly like a text-message bundle.
The Vanished “Word Count Anxiety” Is Back
I remember 20 years ago, mobile phone screens weren’t even as large as the camera island on some phones today. On one side, you had a 5 RMB 30MB data plan; on the other, a 5 RMB 100-message SMS bundle. Every month, you had to plan meticulously, counting every character to get by.
As it turns out, 20 years later, that feeling of being stretched thin has returned. Once your “phone credit” is gone, Claude stops replying to your messages.
Generally speaking, there are two mainstream ways to pay for AI tokens today: monthly subscription plans and prepaid pay-as-you-go billing.
As for monthly plans, the entry-level price on major Chinese platforms is basically around 40 RMB/month. The Zhipu GLM plan I’m using, for example, costs 49 RMB per month and allows roughly 400 messages per week. Run the numbers and it sounds cheap: only about 3 cents (fen) per message.
The catch is that the plan also caps you at 80 messages every 5 hours, which works out to 16 messages per hour. If you are a high-intensity AI user, you’ll basically enter an “AI cooling-off period” after just half an hour of use. If you really want to make the most of Claude, you have to upgrade to a plan costing 149 RMB or even 469 RMB per month to keep up.
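The back-of-envelope math above can be checked in a few lines. All figures come from the plan described in the text; the weeks-per-month factor is my own approximation:

```python
# Cost math for the 49 RMB/month plan described above.
# Plan figures are from the text; weeks_per_month is an approximation.
monthly_fee_rmb = 49
messages_per_week = 400
weeks_per_month = 4.33  # average weeks in a calendar month

messages_per_month = messages_per_week * weeks_per_month
cost_per_message = monthly_fee_rmb / messages_per_month
print(f"~{cost_per_message * 100:.1f} fen per message")  # roughly 3 fen

# The short-window rate limit: 80 messages per rolling 5-hour window.
window_limit, window_hours = 80, 5
print(f"{window_limit / window_hours:.0f} messages per hour")  # 16
```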
So, after all that, we’re back to where we started: meticulous calculation, trying to pack as much content as possible into every message, striving to “fill up all 70 characters” (the length of one Chinese SMS) before sending the next one.
Why I Suggest Translating Tokens as “Ai-Xin”
Our generation has lived through SMS (plain text), MMS (with images), Fetion (China Mobile’s cross-device messaging service), and WeChat (the mobile internet). We thought that by 2026, with data long since “all-you-can-eat” and video streaming ubiquitous, billing units would start at gigabytes. Instead, the arrival of AI has dragged us right back to the Stone Age of counting conversation turns.
I thought about it, and calling tokens “lingpai” (the literal Chinese word for “token”) feels too rigid; after all, even the APIs over there haven’t settled on that name. Calling them “dianshu” (“points”) feels too much like an online game. After much consideration, I feel “ai-xin” is the most vivid. Phonetically, it puns on “AI”: a message sent by an AI is an “ai-xin,” an AI letter. Semantically, it is communication driven by intelligence, every message carrying the warmth of computing power. Most importantly, it captures the pain point: as you watch consumption climb, it’s “ai xin bu xin” (“believe it or not”), because the money gets deducted either way.
AI Image Generation Today Is the “Extravagant MMS” of Yesteryear
If AI text dialogue is a 10-cent text message, then AI image and video generation is the “MMS” that everyone felt they couldn’t afford back in the day.
I checked, and for mainstream image generation, once you convert the price into tokens or computing points, a single high-quality image often costs dozens or even hundreds of times as much as a plain-text exchange. The burn rate is staggering.
Back then, an MMS was a 300KB image so blurry you couldn’t even see a face, costing 50 cents.
Now, an AI-generated 4K ultra-realistic portrait might cost between 1 to 2 RMB in computing power.
In actual use, it’s almost a case of not touching it unless absolutely necessary, and saving wherever you can.
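The then-vs-now comparison can be made concrete with the rough prices quoted above (10-cent SMS, 50-cent MMS, ~3 fen per AI text message, 1–2 RMB per AI image; all figures are the text’s own, in RMB):

```python
# The "rich media multiplier" over plain text, then vs. now.
# All prices are the rough figures quoted in the text, in RMB.
sms_price, mms_price = 0.10, 0.50       # 10-cent SMS, 50-cent MMS
ai_text_price = 0.03                    # ~3 fen per AI text message
ai_image_low, ai_image_high = 1.0, 2.0  # one 4K AI-generated portrait

print(f"MMS vs SMS: {mms_price / sms_price:.0f}x")
print(f"AI image vs AI text: {ai_image_low / ai_text_price:.0f}x "
      f"to {ai_image_high / ai_text_price:.0f}x")
```

By these numbers, the image-over-text premium today is several times steeper than the MMS-over-SMS premium ever was.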
What’s the worst part?
AI image and video generation is a bottomless demand: you keep revising prompts and fine-tuning to get the result you want. Cost-wise, this loop is even more painful than an MMS that failed to send or arrive but got billed anyway.
Of course, the “text-to-image” discussed here mainly refers to today’s top-tier AI models. The free trials and “downgraded” models various companies have launched aren’t even part of this discussion. After all, a truly useful AI has never been cheap.
