
Overcoming OSINT overload during chaotic events like the Israel-Hamas conflict

Satellite imagery of the Al-Ahli Arab Hospital in Gaza, showing the blast crater and damage to surrounding buildings.

On your average day, open source intelligence (OSINT) provides a valuable service, distilling social media posts and other data into credible information. But the Israel-Hamas conflict has challenged all but the most experienced OSINT teams to keep pace. Even some longtime OSINT social media accounts that once brought clarity to conflict have become part of the noise and misinformation problem themselves.

As with Russia’s invasion of Ukraine, the Hamas attack has fueled a furious volume and velocity of social media posts containing little concrete information. Beyond the sheer scale of reports, the Israel-Hamas war brings a new dimension: an intense, emotionally charged environment on a global scale.

“From a verification standpoint, Ukraine was far easier – although it was hectic given the speed at which the Russian offensive initially unfolded across a widely dispersed geographic area,” explained Alex Moore, an editor at the risk intelligence company Factal, which specializes in verifying critical events in real time. “There was more deference to trustworthy sources. But with Israel and Palestine, people are much more passionate, and they favor information that aligns with their beliefs – and everyone feels obligated to post about it.”

The result is a flood of opinions, propaganda and misinformation. Much of that misinformation – and a fair amount of disinformation – has consumed Twitter/X like never before. Even the platform’s own Community Notes system, a sort of crowdsourced fact-checking tool, has become overwhelmed.

“There’s no meaningful system of verification on X. That’s just a recipe for chaos,” said Casey Newton, editor of the technology newsletter Platformer. “But what’s interesting here is that [Elon] Musk has now created a financial incentive for people to do this because the more views they get on their posts, presumably the more they’ll be paid.”

Dramatic and sometimes gory photos and videos can generate thousands of views. Whether intentionally or not, recirculating old and unrelated media across Twitter/X, Telegram and TikTok has become commonplace. Some OSINT accounts are leaning into their newfound popularity: by capitalizing on fear or taking a point of view, they attract more attention – and cash.

“What the current war in Israel and Gaza has made clear in recent days is that there are many verified, popular accounts on Twitter that use the OSINT term to give legitimacy to shoddy work that only creates more confusion,” wrote 404 Media’s Joseph Cox and Emanuel Maiberg. “What exists now is a profit and engagement driven ecosystem of non-experts who in some cases may be spreading videos for the clout and cash, rather than to inform readers about what is actually true.”

Several times during the conflict, reports distributed by OSINT accounts have been picked up by regional news organizations with little, if any, independent verification. For example, three days after the attack, air raid sirens in northern Israel sparked reports of another infiltration by Hamas paragliders.

“Israeli, Palestinian, Lebanese and even Iranian and Iraqi verified media outlets were sharing these reports of paragliding individuals coming into Israel’s territory,” said Factal editor Agnese Boffano in the Global Security Briefing. An hour later, the IDF debunked the reports. “During that span of an hour, the amount of information that came out of verified accounts that were just being shared and picked up so quickly and so easily by a lot of media accounts, just goes to show the amount of mis- and disinformation around this conflict.”

While amateur and self-proclaimed OSINT accounts have either struggled to keep up or fanned the flames of war, professional operations continue to provide a critical service. They typically blend technology with sufficient staffing, ethical standards and a specific focus. Bellingcat, for example, has a team of experienced researchers who use a variety of tools under editorial standards and practices. It also maintains a specific focus: selective investigations, like its analysis of the explosion at a Gaza hospital. Some take a day or two to complete; others take months or more.

At Factal we focus on real-time verification. Our AI-powered detection platform does the heavy lifting, ingesting a diverse range of open data sources and identifying likely early-stage incidents. Then our newsroom of experienced journalists takes over, relying on pre-vetted sources with reliable track records – a critical tactic to avoid getting overwhelmed – and corroborating details under a code of ethics. If it’s unclear what’s happening in the moment, we’re transparent about what we know – and don’t know.

“Given the amount of eyes on Israel-Palestine and the speed at which developments reverberate across the region, it can be difficult for open source accounts, especially those with a particular bias, to wait for all of the puzzle pieces to become clear,” Moore said. “However, sometimes saying ‘we don’t know’ while waiting for additional corroborative evidence is the only way not to contribute to the spread of misinformation.”

(Factal’s members include many of the world’s largest companies as well as 250 humanitarian aid and disaster relief organizations, several of which have staff on the ground in Israel and Gaza. Qualified NGOs get free Factal access. Top image shared by the IDF. Second image shared by several OSINT accounts.)

Factal gives companies the facts they need in real time to protect people, avoid disruptions and drive automation when the unexpected happens.

Try Factal for free or talk with our sales team for a demo.