Monday, April 8, 2024

How AI elites spin the promise of AI and how the military can use it

The first video, by Jon Stewart, pieces together the reasoning and arguments that AI (artificial intelligence software) will be a good thing and that there is nothing to worry about. The AI segment starts at about 3:15 into the video. The comedy in it is great. The second video is a news report about how the Israeli military appears to be using an AI program named Lavender to target and kill Hamas fighters, along with their families. There is nothing funny in the second video. It was brought to my attention by PD in his post from yesterday.

[Embedded videos: Jon Stewart's AI segment and the news report on Lavender]

Together, these two videos give us a feel for how AI is going to be employed, and for how its dark effects and dark uses will be propagandized or hidden to the extent that people in power can spin and hide what is going on. The Israeli government will likely deny either that Lavender exists or that it is used indiscriminately.

Why post these two videos together?
Because this is important information. People really need to know at least something about how AI is going to be used, whether we like it or not. And enquiring minds just want to know.

It is probable that our captured and broken federal government is incapable of dealing with AI responsibly. We will very likely (~97% chance in the next two years?) be left mostly to (1) the whims of cynical and transparently mendacious AI billionaires like those Stewart interviewed, and (2) the brutality of authoritarian governments like Israel's.

The WaPo reported on this a couple of days ago:
It’s hard to concoct a more airy sobriquet than this one. A new report published by +972 magazine and Local Call indicates that Israel has allegedly used an AI-powered database to select suspected Hamas and other militant targets in the besieged Gaza Strip. According to the report, the tool, trained by Israeli military data scientists, sifted through a huge trove of surveillance data and other information to generate targets for assassination. It may have played a major role particularly in the early stages of the current war, as Israel conducted relentless waves of airstrikes on the territory, flattening homes and whole neighborhoods. At present count, according to the Gaza Health Ministry, more than 33,000 Palestinians, the majority being women and children, have been killed in the territory.

The AI tool’s name? “Lavender.”   
“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based,” Abraham wrote.
The use of AI technology is still only a small part of what has troubled human rights activists about Israel’s conduct in Gaza. But it points to a darker future. Lavender, observed Adil Haque, an expert on international law at Rutgers University, is “the nightmare of every international humanitarian lawyer come to life.”

[Image: Color-coded targets that AI can choose to obliterate]
