Tokens are the new commodity.
But nobody is governing what they produce.
On 18 March, Jensen Huang took the stage at Nvidia’s GTC conference and laid out a thought experiment: a $500,000 engineer should be consuming at least $250,000 in AI tokens per year. If that engineer’s annual token spend came in at $5,000, Huang said he’d “go ape.” Token budgets, he argued, are becoming a recruiting tool in Silicon Valley. Tokens are the new commodity.
Five days later, at the 2026 China Development Forum, Liu Liehong, head of China’s National Data Administration, went further. He called the token a “settlement unit” and a “value anchor for the intelligent era.” China even coined an official term for it: ciyuan, combining ci (word) with yuan (currency). Not a metaphor. A formal naming convention, designed to position tokens as a unit of economic exchange.
Two very different actors. Same week. Same conclusion.
What they’re actually saying
This isn’t about compensation models or technical architecture. It’s about pricing infrastructure.
Both Huang and Liu are making the same structural argument: tokens are units of productive output, not a cost line buried inside a SaaS subscription. When Huang says an engineer plus $250K in token consumption produces dramatically more than the same engineer working without that spend, he’s framing token spend as a capital investment with a measurable return. When Liu describes tokens as a settlement unit connecting supply with demand, he’s framing them as the basis of a functioning market.
The analogy both of them reach for is energy. GPU cycles and electricity become tokens the way crude oil becomes gasoline. Fungible at the base layer. Differentiated by grade. Lightweight inference is regular unleaded. Deep reasoning is premium. Multimodal is high-octane. What matters to the buyer is what the fuel produces, not the molecular composition.
It’s a good analogy. But it stops too early.
Gasoline doesn’t drive itself
Here’s where the token-as-commodity framing gets interesting, and where almost everyone talking about it stops short.
Crude oil becomes gasoline. Fine. But gasoline doesn’t become anything useful until someone puts it in an engine, inside a vehicle, on a road, going somewhere specific. The value isn’t in the fuel. It’s in what the fuel enables. And whether what it enables is worth the trip.
Tokens become content. Analysis. Code. Decisions. Marketing campaigns. Sales collateral. Thought leadership. Product documentation. Board reports. Internal communications. All of it, at speeds and volumes that would have been unimaginable three years ago.
And almost none of it is being governed.
Huang’s $250K token budget only works if there’s a system determining whether that $250K produced aligned, accurate, accountable output. Without that system, you’re not investing in productivity. You’re burning fuel and hoping the car is pointed in the right direction.
The return-per-token problem
Goldman Sachs found in March that AI delivers roughly 30% productivity gains on specific, localised use cases like software development and customer support. That number gets cited a lot. It does a lot of heavy lifting in investor presentations and vendor pitch decks.
But productivity measured how?
If an organisation produces 30% more content, and that content is inconsistent, unverified, or off-narrative, the “return” is negative in brand and trust terms. Faster is not the same as better. More is not the same as right. The Goldman number measures output velocity. It says nothing about output quality, narrative coherence, or whether anyone verified that the content was fit to publish before it went out.
PwC’s 2026 AI predictions made a related point that deserves far more attention: technology delivers roughly 20% of any AI initiative’s value. The other 80% comes from redesigning work. That’s the system around the tool. The governance. The standards. The accountability. The human judgement applied at the right moments.
Eighty per cent.
That number should concern every organisation currently budgeting for token consumption without budgeting for what happens after the tokens get consumed.
The content industry’s reckoning
If tokens really are becoming a commodity, priced and traded like energy, then every industry built on producing content by the unit is about to face a structural repricing.
Advertising and marketing agencies have historically valued creative output by the deliverable. The campaign. The asset. The video. The blog post. The unit cost of producing any of those things is collapsing. Chinese AI providers like MiniMax and Moonshot are already offering tokens at $2 to $3 per million output tokens, against $15 or more for comparable US models. The production cost of a piece of content is heading towards zero.
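As a back-of-envelope illustration of that collapse, here is a minimal sketch using the per-million rates quoted above. The rates and the daily output volume are illustrative assumptions, not live prices:

```python
# Back-of-envelope: annual spend for the same output volume at two
# illustrative per-million-output-token rates (figures quoted in the text).
RATE_LOW = 2.50    # USD per million output tokens (assumed low-cost provider)
RATE_HIGH = 15.00  # USD per million output tokens (assumed US-model rate)

def annual_cost(tokens_per_day: float, rate_per_million: float) -> float:
    """Annual spend in USD for a steady daily output volume."""
    return tokens_per_day * 365 * rate_per_million / 1_000_000

# Assume a content operation generating 5 million output tokens a day.
daily_tokens = 5_000_000
print(f"Low-rate provider:  ${annual_cost(daily_tokens, RATE_LOW):,.0f}/yr")
print(f"High-rate provider: ${annual_cost(daily_tokens, RATE_HIGH):,.0f}/yr")
```

Even at the higher rate, a year of heavy output costs less than a single junior hire, which is exactly why production cost is the wrong place to look for the real risk.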
But the cost of getting it wrong isn’t going anywhere.
A brand publishing inconsistent messaging across channels doesn’t save money by producing that messaging faster. A company pushing unverified claims into market-facing materials doesn’t benefit from the efficiency gains. An organisation whose AI-generated content bypasses editorial review because there is no editorial review protocol to bypass is not more productive. It’s more exposed.
The organisations already struggling with the problem we see every week, content that exists everywhere and works nowhere, will face it at ten times the current scale: more teams producing more content through more AI tools, with no common narrative, no governance layer, and no accountability for what goes out.
That’s not a volume problem. It’s an infrastructure problem. And volume is about to make it dramatically worse.
The missing layer
The token-as-commodity conversation is about the supply side. Who produces tokens. How they’re priced. What grades exist. Which markets are being built.
Nobody is talking about the demand side. Specifically: what happens after the tokens are consumed? Who verifies that the output is accurate? Who ensures it’s aligned with the organisation’s narrative? Who is accountable when it isn’t?
Right now, the answer in most organisations is: nobody.
That’s the gap. Not the technology gap. Not the pricing gap. The governance gap.
If organisations are about to budget for tokens like energy, they need to govern what those tokens produce with the same rigour they apply to any other capital investment. Energy companies don’t just buy crude. They refine it, quality-check it, and certify that what comes out of the pipeline meets spec before it reaches the customer. Token consumption needs the same discipline.
That means narrative infrastructure: a defined point of view that determines what “accurate and aligned” actually means before any AI output is judged against it. It means governance architecture: the standards, ownership models, and review protocols that determine what can go out without human review and what can’t. And it means workflow accountability: the sign-off gates and verification checkpoints that ensure nothing gets published that hasn’t been authorised.
Without all three, you’re not running a content operation. You’re running a burn rate.
The race nobody is watching
Jensen Huang and Liu Liehong are both right. Tokens are a commodity. They are a settlement unit. The pricing infrastructure they’re building will shape the economics of AI for the next decade.
But the race isn’t just about who prices the fuel. It’s about who builds the system that makes the fuel worth burning.
The organisations that capture the return on token investment will not be the ones that spend the most. They’ll be the ones that build the governance infrastructure to ensure that what comes out is worth standing behind.
That infrastructure is not a technology problem. It’s an editorial accountability problem. And it’s the part of the token economy that nobody seems to be building.
Except us.
Got something to add?
If you’d like to talk about how we could help your organisation or continue the conversation, reach out today.