General Discussion
AI can't stop recommending nuclear strikes in war game simulations

Link
https://www.newscientist.com/article/2516885-ais-cant-stop-recommending-nuclear-strikes-in-war-game-simulations/
RussBLib
(10,501 posts)
autonomous weapons without the need for human intervention, and we may indeed have a (human) extinction event, with insane motherfuckers in charge.
BootinUp
(51,091 posts)
Intelligence thinks AI should even be consulted.
sdfernando
(6,050 posts)they must be using the WOPR!
Layzeebeaver
(2,247 posts)Even not reading the article, as a user of AI, this smacks entirely about the scenario prompts and about actual rules (the ones humans operate under)
When I was a kid, we sometimes played war games that included nuclear weapons options, and it was a race to get them AND use them, because there usually were no rules that restricted their usage. You were free to play as you wanted.
What's missing is how the LLMs compete when given precise nuclear-use protocols vs. when they are not.
I feel the clickbait pseudoscience study force is strong.
To add a bit of nostalgia: back in the 80s, Yaquinto published a board game titled Ultimatum. The objective was to win either by investing in soft power and gaining control of global regions OR by investing in hard power (nuclear weapon systems). At some point, when you felt you were losing the soft-power side of things, you could decide to push the button, at which point the soft-power portion of the game stopped and was scrapped. What remained was a nuclear exchange aimed at cities, launch sites, military bases, etc. The player with the most population remaining was the winner. Very few games ended in soft-power mode.
TL;DR: it's not just AI that can run amok.