
General Discussion


highplainsdem

(62,646 posts)
Fri Apr 17, 2026, 03:08 PM

"Tokenmaxxing" is making developers less productive than they think (TechCrunch, April 17) [View all]

Another article on AI code generators not helping as much as many developers believe.

No paywall:

https://techcrunch.com/2026/04/17/tokenmaxxing-is-making-developers-less-productive-than-they-think/

-snip-

He says that engineering managers are seeing code acceptance rates of 80% to 90% (meaning the share of AI-generated code that developers approve and keep), but they're missing the churn that happens when engineers have to revise that code in the following weeks, which drives the real-world acceptance rate down to between 10% and 30% of generated code.
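As a rough back-of-the-envelope illustration of the gap described above (the specific numbers here are hypothetical, not taken from the article):

```python
def retained_acceptance(initial_rate: float, churn_fraction: float) -> float:
    """Share of AI-generated code still in place after later revisions.

    initial_rate: fraction of generated code developers initially approve
                  (the 80%-90% managers see).
    churn_fraction: fraction of that approved code later rewritten or
                    deleted (the churn managers miss).
    """
    return initial_rate * (1 - churn_fraction)

# Hypothetical: 85% initial acceptance, but 75% of that code is later
# revised, leaving roughly 21% retained -- inside the 10%-30% range
# the article reports for real-world acceptance.
print(retained_acceptance(0.85, 0.75))
```

The point of the sketch is that a high headline acceptance rate and a low real-world rate are perfectly compatible once churn is counted.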

-snip-

GitClear, another company in this space, published a report in January that found AI tools increased productivity, but its data also showed that “regular AI users averaged 9.4x higher code churn than their non-AI counterparts”: a churn increase more than double the productivity gains the tools provided.

Faros AI, an engineering analytics platform, drew on two years of customer data for its March 2026 report. The finding: code churn—lines of code deleted versus lines added—had increased 861% under high AI adoption.
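The churn metric described above can be sketched as a simple ratio of deleted to added lines. The function names and sample figures below are illustrative only; they are not from the Faros AI report, beyond being chosen to reproduce an 861% increase:

```python
def churn_ratio(lines_deleted: int, lines_added: int) -> float:
    """Code churn as lines deleted per line added."""
    return lines_deleted / lines_added

def percent_increase(before: float, after: float) -> float:
    """Percent change relative to a baseline value."""
    return (after - before) / before * 100

# Hypothetical baseline vs. high-AI-adoption figures:
low_ai = churn_ratio(lines_deleted=50, lines_added=1000)      # 0.05
high_ai = churn_ratio(lines_deleted=4805, lines_added=10000)  # 0.4805
print(percent_increase(low_ai, high_ai))  # an 861% jump in churn
```

Under a metric like this, a team can add more lines than ever while the share of work spent deleting and redoing recent code balloons.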

Jellyfish, which bills itself as an intelligence platform for AI-integrated engineering, collected data on 7,548 engineers in the first quarter of 2026. The firm found that the engineers with the largest token budgets produced the most pull requests (proposed changes to a shared codebase), but productivity didn't scale with spending: they achieved two times the throughput at ten times the token cost. In other words, the tools are generating volume, not value.
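The scaling claim above reduces to a simple cost-per-output comparison. The figures below just restate the article's 2x/10x ratios in normalized units, not actual token counts:

```python
def cost_per_pr(tokens_spent: float, prs_produced: float) -> float:
    """Tokens consumed per pull request produced."""
    return tokens_spent / prs_produced

# Normalized units: a baseline engineer spends 1 unit of tokens for
# 1 unit of PR throughput; the biggest spenders use 10x the tokens
# for 2x the throughput.
baseline = cost_per_pr(tokens_spent=1.0, prs_produced=1.0)
top_spenders = cost_per_pr(tokens_spent=10.0, prs_produced=2.0)
print(top_spenders / baseline)  # 5.0 -- each PR costs 5x as many tokens
```

That fivefold jump in token cost per pull request is what "volume, not value" cashes out to.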

-snip-