#habsora
Explore tagged Tumblr posts
Text
Maybe I’ve just missed it, but I haven’t seen anyone on here bring up this horrific information directly from Israeli intelligence and military officers. I’m including the entire post from the BDS Instagram account (@/bdsnationalcommittee) under the cut, but the most damning revelation, at least to me, is this:
Israel is literally using an AI program to generate names of “targets” that they can cite as excuses to bomb civilians. I cannot overstate this: these targets are LITERALLY MADE UP. THEY AREN’T EVEN GUARANTEED TO BE THE NAMES OF REAL PEOPLE.
There is no legitimate system for checking the validity of these accusations, or whether the generated identities are even real. Israel estimates that there are over 30,000 of these “targets” still in Gaza, and the list will likely grow beyond that number.
Keep in mind that this number only represents the supposed Hamas operatives, and Israel has openly admitted (including in quotes under the cut) that they are aware of the immense collateral deaths of innocent civilians, and fully consider it an acceptable sacrifice.
The death toll in Gaza is currently almost 22,000, and the real number is likely far greater due to how difficult it is to excavate and identify bodies.
#for some reason tumblr wont let me paste the link to the post smh. I’ll try to rb with it later#free gaza#gaza strip#gaza#free palestine#palestine#palestinian genocide#israel#genocide#ai#hamas#zionism#habsora ai#habsora
29 notes
Text
[embedded YouTube video]
#sub media#gaza#palestine#israel#operation iron sword#hamas#habsora#ai#israel defense forces#idf#google#amazon#youtube#dahiya doctrine#nizoz#storm clouds#iai harpy#project nimbus#automl#elbit systems#israeli occupation#genocide#war crimes
7 notes
Text
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.
uhh by "AI" do they mean the thing that routinely makes up stuff? that kind of AI?
(source)
631 notes
Text
this article is so illuminating and shows why so many of us believe this is a genocide-- according to the words and actions of IDF soldiers and the israeli govt themselves. they are admitting repeatedly that they sometimes intentionally target civilian areas, civilians, and cultural heritage sites, knowing hamas is not there, in a twisted attempt at creating civil pressure on hamas.
Compared to previous Israeli assaults on Gaza, the current war — which Israel has named “Operation Iron Swords,” and which began in the wake of the Hamas-led assault on southern Israel on October 7 — has seen the army significantly expand its bombing of targets that are not distinctly military in nature. These include private residences as well as public buildings, infrastructure, and high-rise blocks, which sources say the army defines as “power targets” (“matarot otzem”).

The bombing of power targets, according to intelligence sources who had first-hand experience with its application in Gaza in the past, is mainly intended to harm Palestinian civil society: to “create a shock” that, among other things, will reverberate powerfully and “lead civilians to put pressure on Hamas,” as one source put it.
theyre literally intentionally terrorising and killing palestinian civilians hoping it will somehow cause palestinians to do israel's job of getting hamas for them. instead of actually just.......idk.......trying to get hamas.
Several of the sources, who spoke to +972 and Local Call on the condition of anonymity, confirmed that the Israeli army has files on the vast majority of potential targets in Gaza — including homes — which stipulate the number of civilians who are likely to be killed in an attack on a particular target. This number is calculated and known in advance to the army’s intelligence units, who also know shortly before carrying out an attack roughly how many civilians are certain to be killed.

In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.

“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”
the usage of "we are not hamas" to say that they are intentionally choosing to kill civilians instead of doing so at random is.. insane. "we are not hamas" should be followed by being more humane, not.. "we decided killing hundreds of palestinian civilians is worth it to get 1 single hamas member!"
According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.
so, unshockingly, they are sometimes killing everyone within a building over some potential 1 hamas member, and sometimes there isnt a singular hamas member known in that building. so it could just be purely civilians being killed.
Another source said that a senior intelligence officer told his officers after October 7 that the goal was to “kill as many Hamas operatives as possible,” for which the criteria around harming Palestinian civilians were significantly relaxed. As such, there are “cases in which we shell based on a wide cellular pinpointing of where the target is, killing civilians. This is often done to save time, instead of doing a little more work to get a more accurate pinpointing,” said the source.
so they can be more accurate and precise with their attacks, as should be obvious for a highly sophisticated military, but they decide its better to just kill thousands of civilians if it saves them time.
From the first moment after the October 7 attack, decisionmakers in Israel openly declared that the response would be of a completely different magnitude to previous military operations in Gaza, with the stated aim of totally eradicating Hamas. “The emphasis is on damage and not on accuracy,” said IDF Spokesperson Daniel Hagari on Oct. 9. The army swiftly translated those declarations into actions.
The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices. The idea behind hitting such targets, say three intelligence sources who were involved in planning or conducting strikes on power targets in the past, is that a deliberate attack on Palestinian society will exert “civil pressure” on Hamas.
they are deliberately destroying palestinian culture and history and society, hoping it will somehow create more pressure on hamas. 0 regard for palestinians' well-beings and safety and existence and they keep saying this over & over again
The final category consists of “family homes” or “operatives’ homes.” The stated purpose of these attacks is to destroy private residences in order to assassinate a single resident suspected of being a Hamas or Islamic Jihad operative. However, in the current war, Palestinian testimonies assert that some of the families that were killed did not include any operatives from these organizations. In the early stages of the current war, the Israeli army appears to have given particular attention to the third and fourth categories of targets. According to statements on Oct. 11 by the IDF Spokesperson, during the first five days of fighting, half of the targets bombed — 1,329 out of a total 2,687 — were deemed power targets.
so half of their targets were specifically intended to terrorise palestinian civilians and weren't actually attacks on hamas.
“We are asked to look for high-rise buildings with half a floor that can be attributed to Hamas,” said one source who took part in previous Israeli offensives in Gaza. “Sometimes it is a militant group’s spokesperson’s office, or a point where operatives meet. I understood that the floor is an excuse that allows the army to cause a lot of destruction in Gaza. That is what they told us. “If they would tell the whole world that the [Islamic Jihad] offices on the 10th floor are not important as a target, but that its existence is a justification to bring down the entire high-rise with the aim of pressuring civilian families who live in it in order to put pressure on terrorist organizations, this would itself be seen as terrorism. So they do not say it,” the source added.
the goal of their destruction of residential buildings isn't even about getting a hamas member who may or may not be there, its terrorism against palestinians.
Various sources who served in IDF intelligence units said that at least until the current war, army protocols allowed for attacking power targets only when the buildings were empty of residents at the time of the strike. However, testimonies and videos from Gaza suggest that since October 7, some of these targets have been attacked without prior notice being given to their occupants, killing entire families as a result.
unshockingly its as palestinians in gaza have been saying: they get attacked with no warning and countless civilian deaths occur as a result.
According to the Israeli army, during the first five days of fighting it dropped 6,000 bombs on the Strip, with a total weight of about 4,000 tons. Media outlets reported that the army had wiped out entire neighborhoods; according to the Gaza-based Al Mezan Center for Human Rights, these attacks led to “the complete destruction of residential neighborhoods, the destruction of infrastructure, and the mass killing of residents.” As documented by Al Mezan and numerous images coming out of Gaza, Israel bombed the Islamic University of Gaza, the Palestinian Bar Association, a UN building for an educational program for outstanding students, a building belonging to the Palestine Telecommunications Company, the Ministry of National Economy, the Ministry of Culture, roads, and dozens of high-rise buildings and homes — especially in Gaza’s northern neighborhoods.
Yet despite the unbridled Israeli bombardment, the damage to Hamas’ military infrastructure in northern Gaza during the first days of the war appears to have been very minimal. Indeed, intelligence sources told +972 and Local Call that military targets that were part of power targets have previously been used many times as a fig leaf for harming the civilian population. “Hamas is everywhere in Gaza; there is no building that does not have something of Hamas in it, so if you want to find a way to turn a high-rise into a target, you will be able to do so,” said one former intelligence official.
they admit they use the excuse of hamas to justify attacking overwhelmingly civilian areas.
Indeed, according to sources who were involved in the compiling of power targets in previous wars, although the target file usually contains some kind of alleged association with Hamas or other militant groups, striking the target functions primarily as a “means that allows damage to civil society.” The sources understood, some explicitly and some implicitly, that damage to civilians is the real purpose of these attacks.
According to the doctrine — developed by former IDF Chief of Staff Gadi Eizenkot, who is now a Knesset member and part of the current war cabinet — in a war against guerrilla groups such as Hamas or Hezbollah, Israel must use disproportionate and overwhelming force while targeting civilian and government infrastructure in order to establish deterrence and force the civilian population to pressure the groups to end their attacks. The concept of “power targets” seems to have emanated from this same logic.

The first time the Israeli army publicly defined power targets in Gaza was at the end of Operation Protective Edge in 2014. The army bombed four buildings during the last four days of the war — three residential multi-story buildings in Gaza City, and a high-rise in Rafah. The security establishment explained at the time that the attacks were intended to convey to the Palestinians of Gaza that “nothing is immune anymore,” and to put pressure on Hamas to agree to a ceasefire. “The evidence we collected shows that the massive destruction [of the buildings] was carried out deliberately, and without any military justification,” stated an Amnesty report in late 2014.
Not only has the current war seen Israel attack an unprecedented number of power targets, it has also seen the army abandon prior policies that aimed at avoiding harm to civilians. Whereas previously the army’s official procedure was that it was possible to attack power targets only after all civilians had been evacuated from them, testimonies from Palestinian residents in Gaza indicate that, since October 7, Israel has attacked high-rises with their residents still inside, or without having taken significant steps to evacuate them, leading to many civilian deaths. Such attacks very often result in the killing of entire families, as experienced in previous offensives; according to an investigation by AP conducted after the 2014 war, about 89 percent of those killed in the aerial bombings of family homes were unarmed residents, and most of them were children and women.
However, evidence from Gaza suggests that some high-rises — which we assume to have been power targets — were toppled without prior warning. +972 and Local Call located at least two cases during the current war in which entire residential high-rises were bombed and collapsed without warning, and one case in which, according to the evidence, a high-rise building collapsed on civilians who were inside.
therefore palestinian civilians are being killed without even being given warnings, just for the sake of terrorising other palestinians and hopefully pressuring hamas.
Six days later, on Oct. 31, the eight-story Al-Mohandseen residential building was bombed without warning. Between 30 and 45 bodies were reportedly recovered from the ruins on the first day. One baby was found alive, without his parents. Journalists estimated that over 150 people were killed in the attack, as many remained buried under the rubble. The building used to stand in Nuseirat Refugee Camp, south of Wadi Gaza — in the supposed “safe zone” to which Israel directed the Palestinians who fled their homes in northern and central Gaza — and therefore served as temporary shelter for the displaced, according to testimonies.
so theyre also attacking "safe zones".
According to an investigation by Amnesty International, on Oct. 9, Israel shelled at least three multi-story buildings, as well as an open flea market on a crowded street in the Jabaliya Refugee Camp, killing at least 69 people. “The bodies were burned … I didn’t want to look, I was scared of looking at Imad’s face,” said the father of a child who was killed. “The bodies were scattered on the floor. Everyone was looking for their children in these piles. I recognized my son only by his trousers. I wanted to bury him immediately, so I carried my son and got him out.” According to Amnesty’s investigation, the army said that the attack on the market area was aimed at a mosque “where there were Hamas operatives.” However, according to the same investigation, satellite images do not show a mosque in the vicinity.
independent investigations are finding inconsistencies between IDF claims and reality.
According to the IDF Spokesperson, by Nov. 10, during the first 35 days of fighting, Israel attacked a total of 15,000 targets in Gaza. Based on multiple sources, this is a very high figure compared to the four previous major operations in the Strip. During Guardian of the Walls in 2021, Israel attacked 1,500 targets in 11 days. In Protective Edge in 2014, which lasted 51 days, Israel struck between 5,266 and 6,231 targets. During Pillar of Defense in 2012, about 1,500 targets were attacked over eight days. In “Cast Lead” in 2008, Israel struck 3,400 targets in 22 days. Intelligence sources who served in the previous operations also told +972 and Local Call that, for 10 days in 2021 and three weeks in 2014, an attack rate of 100 to 200 targets per day led to a situation in which the Israeli Air Force had no targets of military value left. Why, then, after nearly two months, has the Israeli army not yet run out of targets in the current war?
Israeli analysts have admitted that the military effectiveness of these kinds of disproportionate aerial attacks is limited. Two weeks after the start of the bombings in Gaza (and before the ground invasion) — after the bodies of 1,903 children, approximately 1,000 women, and 187 elderly men were counted in the Gaza Strip — Israeli commentator Avi Issacharoff tweeted: “As hard as it is to hear, on the 14th day of fighting, it does not appear that the military arm of Hamas has been significantly harmed. The most significant damage to the military leadership is the assassination of [Hamas commander] Ayman Nofal.”
i did not share all of the article so u can feel free to read all of it but it just confirms what many of us know to be the horrific and cruel acts of the IDF.
47 notes
Text
@GOOGLE & @AMAZON ARE ENABLING THE FIRST AI-POWERED GENOCIDE.
Over the last 100+ days, Israel has escalated its assault on Gaza in what's being called the first AI-facilitated genocide in human history. @amazon @google & companies across tech have powered the current genocide of Palestinians in Gaza & the surveillance & oppression of Palestinians across historic Palestine for years, revving up Israel's genocide machine that has led to the murder of 32K+ Palestinians, according to @euromedhr.
Last November, a @972mag investigation revealed the Israeli military's use of a new AI-based system called Habsora ("The Gospel") to automatically generate bombing targets & kill Palestinians in Gaza at an unprecedented rate.
Google & Amazon are also providing powerful AI tech to the Israeli military through the $1B Project Nimbus contract, which was signed while Israel dropped bombs on Gaza during its May 2021 assault.
In 2022, a @theintercept investigation confirmed @Google is offering advanced AI & machine-learning capabilities to Israel via Nimbus. The docs indicate that the new cloud would include facial detection tech & even sentiment analysis that claims to "assess the emotional content of pictures, speech & writing" to Israel. Any of these capabilities supercharge Israel's ability to surveil Palestinians & collect/process data on Palestinians—key strategies of the Israeli occupation.
Workers don't want their labor to be used to power genocide.
For 2+ yrs, Google & Amazon workers w/ community orgs have organized against the companies' ties to Israel. Last year, 100s of tech workers & community members protested at @googlecloud & @amazonwebservices conferences in SF & NYC. In 2022, tech workers & community members organized #NoTechForApartheid demonstrations in four tech hubs across the US in a historic show of unity & solidarity among workers across two of the biggest tech companies on the planet.
We won't stop organizing until @amazon @google drop Nimbus & the tech industry stops fueling state violence & genocide.
Take action: - Are you a tech worker? Get involved at t.ly/NotaGeneralIntake
Demand #NoTechForApartheid by emailing the CEOs at notechforapartheid.com.
#gaza#free gaza#gaza strip#palestine#gazagenocide#gaza news#gazaunderfire#gazaunderattack#free palestine#save gaza#technology#tech#no tech for apartheid#israeli apartheid#pro israel#boycott israel#israel news#isreal#palestinian#propaganda#from the river to the sea palestine will be free#palestinian genocide#pray for palestine#stand with palestine#israel palestine conflict#save palestine#long live palestine#palestine news#palestinian film#palestinians
25 notes
Text
Several of the sources, who spoke to +972 and Local Call on the condition of anonymity, confirmed that the Israeli army has files on the vast majority of potential targets in Gaza — including homes — which stipulate the number of civilians who are likely to be killed in an attack on a particular target. This number is calculated and known in advance to the army’s intelligence units, who also know shortly before carrying out an attack roughly how many civilians are certain to be killed.
In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.
“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”
15 notes
Text
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”

According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.

In the majority of cases, the sources added, military activity is not conducted from these targeted homes. “I remember thinking that it was like if [Palestinian militants] would bomb all the private residences of our families when [Israeli soldiers] go back to sleep at home on the weekend,” one source, who was critical of this practice, recalled.
12 notes
Text
"For air advisory missions, which I imagine involve intelligence sharing and training, specific domestic legal restrictions such as the Leahy law and the assassination ban would likely come into play,” McBrien said. But the Leahy vetting process is “reversed” for Israel; rather than vetting Israeli military units beforehand, the U.S. State Department sends aid and then waits for reports of violations, according to a recent article by Josh Paul, who resigned from his post as a State Department political-military officer over his concerns with U.S. support for Israel.
“As a general matter, U.S. officials who are providing support to another country during armed conflict would want to make sure they are not aiding and abetting war crimes,” Finucane told The Intercept. He emphasized that the same principle applies to weapons transfers and intelligence sharing.
The Israeli military intentionally strikes Palestinian civilian infrastructure, known as “power targets,” in order to “create a shock,” according to an investigation by the Israeli news website +972 Magazine. Targets are generated using an artificial intelligence system known as “Habsora,” Hebrew for “gospel.”
“Nothing happens by accident,” an Israeli military intelligence source told +972 Magazine. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”
4 notes
Text
#sub media#gaza#palestine#israel#operation iron sword#hamas#habsora#ai#israel defense forces#idf#google#amazon#youtube#dahiya doctrine#nizoz#storm clouds#iai harpy#project nimbus#automl#elbit systems#israeli occupation#genocide#war crimes
1 note
Text
Compared to previous Israeli assaults on Gaza, the current war — which Israel has named “Operation Iron Swords,” and which began in the wake of the Hamas-led assault on southern Israel on October 7 — has seen the army significantly expand its bombing of targets that are not distinctly military in nature. These include private residences as well as public buildings, infrastructure, and high-rise blocks, which sources say the army defines as “power targets” (“matarot otzem”).
Several of the sources, who spoke to +972 and Local Call on the condition of anonymity, confirmed that the Israeli army has files on the vast majority of potential targets in Gaza — including homes — which stipulate the number of civilians who are likely to be killed in an attack on a particular target. This number is calculated and known in advance to the army’s intelligence units, who also know shortly before carrying out an attack roughly how many civilians are certain to be killed.
Over 300 families have lost 10 or more family members in Israeli bombings in the past two months — a number that is 15 times higher than the figure from what was previously Israel’s deadliest war on Gaza, in 2014. At the time of writing, around 15,000 Palestinians have been reported killed in the war, and counting.
“If they would tell the whole world that the [Islamic Jihad] offices on the 10th floor are not important as a target, but that its existence is a justification to bring down the entire high-rise with the aim of pressuring civilian families who live in it in order to put pressure on terrorist organizations, this would itself be seen as terrorism. So they do not say it,” the source added.
…by the time the temporary ceasefire took hold on Nov. 23, Israel had killed 14,800 Palestinians in Gaza; approximately 6,000 of them were children and 4,000 were women…
In total, according to the UN, 1.7 million Palestinians, the vast majority of the Strip’s population, have been displaced within Gaza since October 7. The army claimed that the demand to evacuate the Strip’s north was intended to protect civilian lives. Palestinians, however, see this mass displacement as part of a “new Nakba” — an attempt to ethnically cleanse part or all of the territory.

The answer may lie in a statement from the IDF Spokesperson on Nov. 2, according to which it is using the AI system Habsora (“The Gospel”), which the spokesperson says “enables the use of automatic tools to produce targets at a fast pace, and works by improving accurate and high-quality intelligence material according to [operational] needs.” In the statement, a senior intelligence official is quoted as saying that thanks to Habsora, targets are created for precision strikes “while causing great damage to the enemy and minimal damage to non-combatants. Hamas operatives are not immune — no matter where they hide.”

According to intelligence sources, Habsora generates, among other things, automatic recommendations for attacking private residences where people suspected of being Hamas or Islamic Jihad operatives live. Israel then carries out large-scale assassination operations through the heavy shelling of these residential homes. Habsora, explained one of the sources, processes enormous amounts of data that “tens of thousands of intelligence officers could not process,” and recommends bombing sites in real time. Because most senior Hamas officials head into underground tunnels with the start of any military operation, the sources say, the use of a system like Habsora makes it possible to locate and attack the homes of relatively junior operatives.
6 notes
Text
An investigation by +972 Magazine and Local Call reveals that the Israeli army is disregarding precautions meant to protect civilians and entrusting the selection of its targets to artificial intelligence. This is why the attacks on the Gaza Strip are producing an unprecedented number of victims.
The investigation reveals that the Israeli army has authorized the bombing of non-military targets, loosened the restrictions on the number of tolerable civilian casualties, and used an artificial intelligence system called Habsora (“The Gospel”) to generate an unprecedented number of potential targets, thereby contributing to the destructive nature of the war in the Gaza Strip, one of the bloodiest military campaigns against Palestinians since the Nakba (the “catastrophe,” the expulsion of Palestinians from their lands in 1948).
The investigation is based on conversations with seven Israeli intelligence officers — including military and air force personnel involved in Israeli operations in the besieged Strip — as well as Palestinian testimonies, data and documentation from the Gaza Strip, and official statements by the army spokesperson and other Israeli institutions.
(Yuval Abraham, +972 Magazine, Israel)
4 notes
Text
Israel's Assassination Factory
[embedded YouTube video]
This video is an examination of "Israel's Mass Assassination Factory," a bombshell story broken by Yuval Abraham for +972 Magazine. After summarizing the current state of the conflict, we delve into the Habsora system, how it works, its use of the "Power Target" designations, and how all of these factors represent a serious escalation in violence from the already violent occupying colonial state of Israel.
Text
In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.
“Nothing happens by accident,” said another source. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target. We are not Hamas. These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.”
According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”
According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.
In the majority of cases, the sources added, military activity is not conducted from these targeted homes. “I remember thinking that it was like if [Palestinian militants] would bomb all the private residences of our families when [Israeli soldiers] go back to sleep at home on the weekend,” one source, who was critical of this practice, recalled.
...
The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices. The idea behind hitting such targets, say three intelligence sources who were involved in planning or conducting strikes on power targets in the past, is that a deliberate attack on Palestinian society will exert “civil pressure” on Hamas.
The final [fourth] category consists of “family homes” or “operatives’ homes.” The stated purpose of these attacks is to destroy private residences in order to assassinate a single resident suspected of being a Hamas or Islamic Jihad operative. However, in the current war, Palestinian testimonies assert that some of the families that were killed did not include any operatives from these organizations.
In the early stages of the current war, the Israeli army appears to have given particular attention to the third and fourth categories of targets. According to statements on Oct. 11 by the IDF Spokesperson, during the first five days of fighting, half of the targets bombed — 1,329 out of a total 2,687 — were deemed power targets.
...
“We are asked to look for high-rise buildings with half a floor that can be attributed to Hamas,” said one source who took part in previous Israeli offensives in Gaza. “Sometimes it is a militant group’s spokesperson’s office, or a point where operatives meet. I understood that the floor is an excuse that allows the army to cause a lot of destruction in Gaza. That is what they told us.
“If they would tell the whole world that the [Islamic Jihad] offices on the 10th floor are not important as a target, but that its existence is a justification to bring down the entire high-rise with the aim of pressuring civilian families who live in it in order to put pressure on terrorist organizations, this would itself be seen as terrorism. So they do not say it,” the source added.
...
“They will never just hit a high-rise that does not have something we can define as a military target,” said another intelligence source, who carried out previous strikes against power targets. “There will always be a floor in the high-rise [associated with Hamas]. But for the most part, when it comes to power targets, it is clear that the target doesn’t have military value that justifies an attack that would bring down the entire empty building in the middle of a city, with the help of six planes and bombs weighing several tons.”
Indeed, according to sources who were involved in the compiling of power targets in previous wars, although the target file usually contains some kind of alleged association with Hamas or other militant groups, striking the target functions primarily as a “means that allows damage to civil society.” The sources understood, some explicitly and some implicitly, that damage to civilians is the real purpose of these attacks.
#dystopian beyond all measures#truly psychopathic#israel#fuck israel#gaza genocide#ai#palestine#stop gaza genocide#war crimes#ai militarism#this is a reminder to all of you to become luddites#stop the genocide#nakba#mass murder#israeli war crimes#972mag#crimes against humanity#iof#targeting civilians#international law#israeli military#military ai#genocide#r/#gaza under attack#gaza under genocide#gaza strip#this is why israel always looses. the ppl are so far fucking gone from reality it's staggering#israeli terrorism#Dahiya Doctrine
Text
Sai Bourothu, a researcher on the Automated Decision Research team at the Stop Killer Robots coalition, has spoken to Al Jazeera about Israel’s use of the Lavender system, an AI-powered database, to identify targets for bombing in Gaza. She said the increasing use of such processing systems in conflict is “deeply concerning from a legal, moral and humanitarian perspective”.
“While the Habsora [or Gospel] system used AI to identify targets such as buildings and structures, the Lavender system generates human targets,” she said. The use of such a system by Israel in the Gaza Strip, she said, raises “grave concerns over the increasing use of autonomy in conflict, digital dehumanisation, artificial intelligence, automation bias and human control in the use of force”.

She noted that these systems’ rapid generation of targets and the Israeli military’s sweeping approval of recommended targets “brings into question compliance with international humanitarian law, particularly the principles of distinction and proportionality, and raises serious ethical concerns around responsibility and accountability for the use of force”.

“The reduction of people to data points through the use of these systems has contributed to accelerating digital dehumanisation in war,” she said.
-- "Israel’s Lavender AI concerning from a legal, moral, humanitarian perspective" by Nils Adler and Farah Najjar for Al Jazeera, 4 Apr 2024 16:05 GMT
Text
"This piece aims to identify the pitfalls in thinking about what is being called an ‘algorithmic genocide’ in Gaza. I’d like to push against the exceptionalism afforded to AI; for example pieces which set military uses of AI as distinct from previous iterations of techno-warfare. Rather, the spectre of ‘artificial intelligence’ is a reification—a set of social relations in the false appearance of concrete form; something made by us which has been cast as something outside of us. And the way in which AI has been talked about in the context of a potentially ‘AI-enabled’ genocide in Gaza poses a dangerous distraction. All of the actually interesting and hard problems about AI, besides all the math, lie in its capacity as an intangible social technology and rhetorical device which elides human intention, creating the space of epistemic indeterminacy through which people act.
...The data does not “speak for itself”, neither in the context of academic research or in military applications.
Any ML model is, from its beginning, bound to a human conceptual apparatus.
...
The reification of AI, which happens at all points on the political spectrum, is actively dangerous in the context of its being taken to its most extreme conclusion: in the ‘usage' of ‘AI’ for mass death, as in the case of Gospel (‘Habsora’, הבשורה, named after the infallible word of God) and Lavender. This reification gives cover for politicians and military officers to make decisions about human lives, faking a hand-off of responsibility to a pile of linear algebra and in doing so handing themselves a blank check to do whatever they want. The extent to which these “AI systems” are credible or actually used is irrelevant, because the main purpose they serve is ideological, with massive psychological benefits for those pressing the buttons. Talking about military AI shifts the focus from the social relations between people to the technologies used to implement them, a mystification which misdirects focus and propagates invincibility.
There are things which are horrifying and exceptional about the current genocide, but the deployment of technology is not in itself one of those things; the usage of data-driven methods to conduct warfare is neither ‘intelligent’ nor ‘artificial’, and moreover not even remotely novel. As prior reporting from Ars Technica has shown about the NSA’s SKYNET program in Pakistan, Lavender is not even the first machine learning-driven system of mass assassination. I recently read Nick Turse’s Kill Anything That Moves: The Real American War in Vietnam (2013) and was struck by the parallels to the current campaign of extermination in Gaza, down to the directed-from-above obsession with fulfilling ‘body count’ as well as the creation of anarchic spaces in which lower-level operatives are afforded opportunities to carry out atrocities which were not explicitly ordered, an observation which has also been made of the Shoah. Thinking about it in this way allows us to fold AI into other discourses of technological warfare over the past century, such as the US’s usage of IBM 360 mainframe computers in Vietnam to similarly produce lists of targets under Operation Igloo White. Using technology as rhetorical cover for bureaucratized violence is not new.
The Lavender piece by Yuval Abraham states that IDF soldiers rapidly rubber-stamped bombing targets “despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases”. But even if the error rate were 0.005% it wouldn’t matter, because the ‘precision’ canard is just laundering human intent through a justification-manufacturing apparatus which has zero technical component. Abraham reports that “sources who have used Lavender in recent months say human agency and precision were substituted by mass target creation and lethality,” but in reality exactly zero human agency has been removed. He writes that “once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence…AI did most of the work instead”, but this verbiage is a perverse reversal of cause and effect to create post-hoc justification.
...
Another line from the Gospel piece reads “the increasing use of AI based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale”. Emphasis mine—that word ‘allows’ is the hinge upon which this whole grotesque charade rests. The algorithm isn’t choosing anything; the choices already happened in the compiling and labeling of the dataset. The collecting and categorizing of data—which data on individuals’ social media or GPS movements or purchasing activity is to be used, which to be excluded—is in itself the construction of an elaborate ideological apparatus."
...
The purpose of a system is what it does, and science is a thing which people do
...We can expect the laundering of agency, whitewashed through the ideological device of 'the algorithm', to begin to be deployed in the arena of international law, given the ways in which Israel is already trying to sidestep the ‘genocidal intent’ it has been charged with at the ICJ. "The fetish of AI as a commodity allows companies and governments to sell it, particularly Israel, which still enjoys a fairly glowing reputation in the ML/AI industry and research world."
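The essay's core technical claim — that the "algorithm isn't choosing anything; the choices already happened in the compiling and labeling of the dataset" — can be illustrated with a deliberately trivial toy (no relation to any real system; every name and number here is invented for illustration). The same feature data under two different human labeling decisions yields opposite "algorithmic" outputs, showing that the model contributes no judgment of its own:

```python
# Toy 1-nearest-neighbour "classifier" over hand-labelled points.
# The model is the same in both runs; only the human labelling differs.

def nearest_label(dataset, x):
    """Return the label of the training point closest to x.

    dataset is a list of (feature, label) pairs.
    """
    return min(dataset, key=lambda pair: abs(pair[0] - x))[1]

features = [1.0, 2.0, 8.0, 9.0]  # arbitrary one-dimensional "signals"

# Two labellers, same data, opposite labelling decisions:
labels_a = list(zip(features, ["benign", "benign", "flagged", "flagged"]))
labels_b = list(zip(features, ["flagged", "flagged", "benign", "benign"]))

query = 8.5
print(nearest_label(labels_a, query))  # → flagged
print(nearest_label(labels_b, query))  # → benign
```

The "output" for the same query flips entirely with the labeling — the point the essay makes about any data-driven system: the decision is encoded upstream, in who labels what, not in the arithmetic that reproduces those labels.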
#no tech for genocide#no tech for apartheid#palestine#free palestine#gaza#isreal#genocide#apartheid#colonization#american imperialism#us politics#police state#settler police#settler colonialism#settler violence
Text
Yuval Abraham, the journalist in the picture, wrote about this:
"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”
The IDF admits to using the Gaza health ministry death toll