Digital Deception: Iran’s state TV admits F‑35 shootdown was AI‑generated fake

Fake AI-generated picture (IRIB)

After months of mounting criticism, IRIB chief Peyman Jebelli conceded that the reports were ‘not reliable’ and had ‘damaged our credibility’.

By Hezy Laing

In December 2025, Iran’s state broadcaster IRIB was forced into an embarrassing admission: its wartime claim that Israeli F‑35 stealth jets had been shot down was based on fabricated evidence.

The footage, aired at the height of the conflict, showed what appeared to be wreckage of advanced fighter aircraft, but analysts quickly identified glaring inconsistencies.

Some images bore the hallmarks of artificial intelligence generation, with distorted proportions and unrealistic details, while others were traced back to video game simulations.

After months of mounting criticism, IRIB chief Peyman Jebelli conceded that the reports were “not reliable” and had “damaged our credibility,” acknowledging that the broadcaster lacked the ability to verify such battlefield claims.

The confession underscored the growing role of misinformation in modern warfare, where AI‑generated visuals can be deployed as propaganda but are swiftly debunked by fact‑checkers.

Israel dismissed the original reports as “fake news,” and the eventual retraction highlighted the information war that ran parallel to the military conflict.

For Iranian citizens, the episode deepened mistrust in state media, with many admitting they turned to foreign outlets such as Al Jazeera for credible coverage.

The F‑35 incident was not the only case of fabricated claims during the war.

Earlier in the conflict, Iranian outlets circulated images purporting to show Israeli tanks destroyed near the Lebanese border.

Investigations revealed the visuals were lifted from archival footage of Syrian battlefields, repurposed to fit the narrative.

Another widely shared claim alleged that Iranian drones had struck Tel Aviv’s central train station, accompanied by dramatic video clips.

These were later exposed as doctored footage from unrelated explosions in Gaza.

A third example involved supposed evidence of mass Israeli troop surrenders, illustrated with photographs that were traced back to military exercises in Turkey years earlier.

Together, these episodes illustrate how quickly misinformation can spread in wartime and how damaging it can be when official sources are implicated in its dissemination.

The reliance on AI‑generated imagery and recycled footage reflects both the technological possibilities and the desperation of propaganda efforts.

Iran’s admission regarding the F‑35 case may mark a turning point, but it also serves as a cautionary tale: in the digital age, credibility is as vital a weapon as any missile or drone.
