
highplainsdem

(62,626 posts)
Fri Apr 17, 2026, 03:08 PM

"Tokenmaxxing" is making developers less productive than they think (TechCrunch, April 17)

Another article on AI code generators not helping as much as many developers believe.

No paywall:

https://techcrunch.com/2026/04/17/tokenmaxxing-is-making-developers-less-productive-than-they-think/

-snip-

He says that engineering managers are seeing code acceptance rates of 80% to 90%—meaning the share of AI-generated code that developers approve and keep—but they're missing the churn that happens when engineers have to revise that code in the following weeks, which drives the real-world acceptance rate down to between 10% and 30% of generated code.

-snip-

GitClear, another company in this space, published a report in January that found AI tools increased productivity, but also that its data showed “regular AI users averaged 9.4x higher code churn than their non-AI counterparts”—more than double the productivity gains the tools provided.

Faros AI, an engineering analytics platform, drew on two years of customer data for its March 2026 report. The finding: code churn—lines of code deleted versus lines added—had increased 861% under high AI adoption.
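As an illustration (not from the article), the churn metric as defined above—lines deleted versus lines added—can be computed from per-commit line counts. The commit history and field names below are hypothetical:

```python
# Illustrative sketch: code churn as a ratio of lines deleted to
# lines added across a set of commits. Field names are made up.

def churn_ratio(commits):
    """Return deleted/added lines as a ratio; 0.0 if nothing was added."""
    added = sum(c["added"] for c in commits)
    deleted = sum(c["deleted"] for c in commits)
    return deleted / added if added else 0.0

# Hypothetical history: a large AI-generated change, then heavy rework.
history = [
    {"added": 500, "deleted": 40},   # initial AI-generated change
    {"added": 120, "deleted": 310},  # follow-up revisions in later weeks
]
print(churn_ratio(history))  # 350 of the 620 added lines were later deleted
```

In practice the added/deleted counts would come from something like `git log --numstat`; the point is just that a high ratio means much of the generated code didn't survive.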

Jellyfish, which bills itself as an intelligence platform for AI-integrated engineering, collected data on 7,548 engineers in the first quarter of 2026. The firm found that the engineers with the largest token budgets produced the most pull requests (proposed changes to a shared codebase), but the productivity improvement didn’t scale. They achieved two times the throughput at ten times the cost of tokens. In other words, the tools are generating volume, not value.
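The Jellyfish numbers imply a simple back-of-the-envelope cost comparison: two times the throughput at ten times the token spend means each pull request costs five times as many tokens. A hypothetical sketch (all figures below are made up for illustration):

```python
# Hypothetical sketch of the throughput-vs-token-cost comparison:
# 2x the pull requests at 10x the token spend = 5x the cost per PR.

def tokens_per_pr(tokens_spent, prs_merged):
    """Average token cost of one merged pull request."""
    return tokens_spent / prs_merged

baseline = tokens_per_pr(tokens_spent=1_000_000, prs_merged=50)
maxxed = tokens_per_pr(tokens_spent=10_000_000, prs_merged=100)  # 10x tokens, 2x PRs
print(maxxed / baseline)  # → 5.0
```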

-snip-
2 replies
"Tokenmaxxing" is making developers less productive than they think (TechCrunch, April 17) (Original Post) highplainsdem Friday OP
Having done my share of programming, I can say that a short conversation between engineers solves many problems. patphil Friday #1
I agree it's dangerous. But from what I've read, it's being done, a lot. highplainsdem Friday #2

patphil

(9,142 posts)
1. Having done my share of programming, I can say that a short conversation between engineers solves many problems.
Fri Apr 17, 2026, 03:40 PM
AI is inherently unable to have the kind of "Aha" moment where an idea gets understood, and the best possible solution is seen.
I don't see AI being anything more than a GIGO generator if it's not working with human engineers.
There's no elegance in AI solutions; no creativity, no sense of ownership of the code it creates. How does something with those kinds of limitations test the code? How can it really challenge what it has built?
End users are also a necessary part of the process.
We always had end users as part of our team, because nobody can break code better than someone whose expectations of use aren't always in the build documents.
AI may be there some day, but not for quite a while. Right now it's dangerous to allow AI to do complex coding without human review and intervention.
