Earlier this year, an employee at tech giant Meta built a system to track how much each staff member was using artificial intelligence (AI).
Named “Claudeonomics” after the Claude chatbot, the system created a leaderboard ranked by the number of tokens each user was exchanging with AI models, with leaders given titles such as “Token Legend”. (Tokens are tiny chunks of text, each around four characters long, that language models use for processing.)
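The mechanics behind such a leaderboard are simple. Here is a minimal sketch in Python, using the common rule of thumb that one token is roughly four characters of English text; real systems count tokens with the model's own tokeniser, and all the names, thresholds and figures below are hypothetical.

```python
# Sketch of a "Claudeonomics"-style leaderboard.
# Assumes the rough heuristic of ~4 characters per token;
# production systems would use the model's actual tokeniser.

def estimate_tokens(text: str) -> int:
    """Approximate token count via the ~4 characters/token heuristic."""
    return max(1, round(len(text) / 4))

def leaderboard(usage: dict[str, int]) -> list[tuple[str, int]]:
    """Rank users by total tokens exchanged, heaviest consumers first."""
    return sorted(usage.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical weekly token counts per employee.
usage = {"alice": 2_400_000, "bob": 150_000, "carol": 9_800_000}

for rank, (name, tokens) in enumerate(leaderboard(usage), start=1):
    # Hypothetical title threshold.
    title = "Token Legend" if tokens > 5_000_000 else "User"
    print(f"{rank}. {name}: {tokens:,} tokens ({title})")
```

The simplicity is the point: a raw count is trivial to compute and rank, which is precisely why it says nothing about the quality of what those tokens produced.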
Meta is not alone in its fascination with “tokenmaxxing”: AI labs OpenAI and Anthropic, e-commerce company Shopify, and tech investment firm Sequoia Capital are all reportedly monitoring AI usage and rewarding heavy users, some of whom burn billions of tokens in a week.
Reducing a person’s performance to a single metric can be appealing for management in large corporations. But the choice of what to measure isn’t a neutral one – and if we’re not careful, it can start to rewrite our vision of what we actually value.
One of the more full-throated advocates of tokenmaxxing is Jensen Huang, chief executive of chipmaker Nvidia, who envisions a future in which tech employees negotiate high token budgets and consume tokens at rates commensurate with their salaries. Around 80% of those tokens are currently processed via Nvidia’s chips, so Huang’s enthusiasm makes sense.
But is token consumption a helpful metric for those of us who do not profit directly from AI processing volume?
In a recent book, The Score, philosopher C. Thi Nguyen analyses the rise of metrics throughout modern society and offers some helpful insights.
As Nguyen emphasises, what we measure shapes our goals. We develop metrics as tools of convenience; they standardise our measurement of values so we can compare large numbers of otherwise disparate things.
This standardisation comes at the expense of variation and distinctiveness, Nguyen argues. In business, it can make workers seem interchangeable.
Determining which employees in a large organisation are consuming the most tokens in a week is fairly straightforward. But it tells us nothing about the quality or impact of their work.
In the past, questionable metrics have contributed to dramatically bad outcomes.
Prior to the 2008 global financial crisis, for example, many financial institutions had sophisticated systems of measures designed to incentivise selling as many loans as possible, as quickly as possible. Perhaps unsurprisingly, many of those loans turned out to be far riskier than anyone realised.
Nguyen emphasises that these types of metrics can tempt us into thinking they are unavoidable. But one of the central lessons of moral philosophy is that we ought to pause at moments like these and ask a couple of basic questions: what is a good life, and what values are actually worth chasing?
Huang and others usually don’t present tokenmaxxing as an answer to these questions. But that is how it functions. What is worth devoting your professional and creative energy to? Simple: grinding through tokens.

Silicon Valley has, of late, produced a striking number of manifestos and quasi-constitutions.
The Conversation