>>9100
>it's primarily Late Stage Capitalist states like the USA designing these things. It's already in progress.
So there's a chance it'll be an expensive boondoggle, you know, because of greedy arms contractors milking the AI hype.
Current AI models use vector spaces to represent conceptual links.
Words like brother, sister, or sibling have very similar vectors.
Likewise, words like car, bus, or vehicle have very similar vectors.
But the vectors in the sibling group and the vehicle group are not similar to each other; that's how the AI knows that you're not related to vehicles. It does this for entire phrases, not just single words.
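To make that concrete, here's a minimal sketch of the similarity check with cosine similarity. The vectors below are made-up toy numbers, not real embeddings (real ones would come from a trained model like word2vec or GloVe and have hundreds of dimensions), so only the relative comparisons matter.

```python
import numpy as np

# Toy 4-dimensional "embeddings" -- invented for illustration only.
vecs = {
    "brother": np.array([0.90, 0.80, 0.10, 0.00]),
    "sister":  np.array([0.80, 0.90, 0.10, 0.10]),
    "sibling": np.array([0.85, 0.85, 0.10, 0.05]),
    "car":     np.array([0.10, 0.00, 0.90, 0.80]),
    "bus":     np.array([0.00, 0.10, 0.80, 0.90]),
    "vehicle": np.array([0.05, 0.05, 0.85, 0.85]),
}

def cosine(a, b):
    # Cosine similarity: close to 1.0 = same direction, close to 0.0 = unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vecs["brother"], vecs["sister"]))   # high: same concept group
print(cosine(vecs["car"], vecs["bus"]))          # high: same concept group
print(cosine(vecs["brother"], vecs["vehicle"]))  # low: you're not related to vehicles
```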
This approach works very well and can create surprisingly good cognitive maps. However, it can't be used for conceptual reduction, which is what you need to derive a general understanding from a particular example.
That's why these AIs fail at common logic puzzles if you ask them to solve a riddle using uncommon wordings. Changing the words means they can't find the region of the vector space where the conceptual understanding is stored.
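You can see the phrase-level version of this by comparing the embedding of a standard riddle wording against an uncommon rewording. The sketch below assumes the sentence-transformers package and the all-MiniLM-L6-v2 model (my picks for illustration, not anything canonical); the exact scores depend on the model, the point is just that rewording shifts the vector away from the familiar region.

```python
# Sketch only: assumes `pip install sentence-transformers` and that the
# all-MiniLM-L6-v2 model can be downloaded on first run.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

standard  = "A farmer needs to cross a river with a wolf, a goat and a cabbage."
reworded  = "An agriculturalist must ferry a canine predator, a ruminant and a brassica over a waterway."
unrelated = "The stock market closed higher on Tuesday."

emb = model.encode([standard, reworded, unrelated])

# The uncommon rewording scores lower against the standard phrasing,
# illustrating how it drifts away from the region the model associates
# with the familiar riddle (though still far above an unrelated sentence).
print(util.cos_sim(emb[0], emb[1]))  # standard vs. uncommon rewording
print(util.cos_sim(emb[0], emb[2]))  # standard vs. unrelated sentence
```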
This doesn't really matter for civilian AI services, because people who use those services subconsciously compensate and learn which words they have to put in the prompt to get a useful reply out of the thing.
If they make a military AI intended for adversarial use, however, that's a pretty big weakness that can be exploited.
It's not clear whether human brains have to learn conceptual reduction; children already have it by the time they learn to speak, so it might be a hardware, or rather wetware, feature of the brain. If we don't have to learn it, it's very likely that it isn't part of the general intellect of society, and no amount of brute-force computational power can extract it from the data piles they scraped from the internet.
I wouldn't hand over command to AI generals just yet.