Israel/OPT: Israeli military reservists using expertise gained at Google, Meta and Microsoft to create vast AI surveillance tool
Revealed: Israeli military creating ChatGPT-like tool using vast collection of Palestinian surveillance data
Israel's military surveillance agency has used a vast collection of intercepted Palestinian communications to build a powerful artificial intelligence tool similar to ChatGPT that it hopes will transform its spying capabilities, an investigation by the Guardian can reveal.
The joint investigation with the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call has found that Unit 8200 trained the AI model to understand spoken Arabic using large volumes of telephone conversations and text messages, obtained through its extensive surveillance of the occupied territories.
[...]
A spokesperson for the Israel Defence Forces (IDF) declined to answer the Guardian's questions about the new LLM, but said the military "deploys various intelligence methods to identify and thwart terrorist activity by hostile organisations in the Middle East".
[...]
Initially, Israeli military intelligence struggled to build a model on this scale. "We had no clue how to train a foundation model," said Sayedoff, the former intelligence official, in his presentation. At one stage, officials sent an unsuccessful request to OpenAI to run ChatGPT on the military's secure systems (OpenAI declined to comment).
However, when the IDF mobilised hundreds of thousands of reservists in response to the Hamas-led 7 October attacks, a group of officers with expertise in building LLMs returned to the unit from the private sector. Some came from major US tech companies, such as Google, Meta and Microsoft. (Google said the work its employees do as reservists was "not connected" to the company. Meta and Microsoft declined to comment.)
[...]
The IDF did not respond to the Guardian's questions about how Unit 8200 ensures its machine learning models, including the new LLM being developed, do not exacerbate inaccuracies and biases. It also would not say how it protects the privacy rights of Palestinians when training models with sensitive personal data.
[...]