Cornell researchers have proposed a way to help forensic experts distinguish AI-manipulated videos from genuine ones: specially designed light sources, installed at key events, that reveal when footage has been morphed.
A paper titled “Noise-Coded Illumination for Forensic and Photometric Video” describes how light sources featured in a video could be modulated with subtle, noise-like brightness fluctuations that carry a hidden code. In essence, this watermarks the light source itself, rather than trying to watermark every individual video shot at an event to keep those clips from being morphed.
These coded light sources carry a “secret code” that can be used to verify a video’s authenticity and reveal whether its visuals have been manipulated.
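In spirit, the encoding side resembles the minimal sketch below, which is a hypothetical illustration rather than the researchers’ implementation: a dimmable lamp is driven with a small, zero-mean pseudorandom sequence that serves as the secret code. The seed, frame count, base brightness, and roughly 2% modulation depth are all chosen here only for illustration.

```python
import numpy as np

# The "secret code": a zero-mean pseudorandom sequence known only to the verifier.
rng = np.random.default_rng(seed=42)              # seed shown only for reproducibility
num_frames = 900                                  # e.g. 30 s of video at 30 fps
code = rng.choice([-1.0, 1.0], size=num_frames)   # +/-1 fluctuations, zero mean

base_level = 0.8    # nominal lamp brightness (0..1), an illustrative value
depth = 0.02        # ~2% fluctuation: small enough to pass as ordinary noise

# Per-frame brightness the coded lamp would emit; a real system would drive the
# lamp hardware with each of these levels (hypothetical driver call omitted).
brightness_schedule = base_level + depth * code
```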

Cornell computer science graduate student Peter Michael led the work on Noise-Coded Illumination (NCI).
“Our approach effectively adds a temporal watermark to any video recorded under coded illumination. However, rather than encoding a specific message, this watermark encodes an image of the unmanipulated scene as it would appear lit only by the coded illumination,” stated the paper.
This tactic would allow forensic experts to compare a manipulated video to an easily accessible version of the original, instead of having to search for the source material manually.
“When an adversary manipulates video captured under coded illumination, they unwittingly change the code images contained therein. Knowing the codes used by each light source lets us recover and examine these code images, which we can use to identify and visualize manipulation,” stated the paper.
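The recovery side can be sketched in the same hedged spirit: correlating each pixel’s brightness over time with the known code yields a “code image” of the scene as lit by the coded lamp alone, and a suspect video’s code image can then be compared against the genuine one. The function names, array layout, and threshold below are illustrative assumptions, not the paper’s algorithm.

```python
import numpy as np

def recover_code_image(frames: np.ndarray, code: np.ndarray) -> np.ndarray:
    """Correlate each pixel's time series with the secret code.

    `frames` is a (T, H, W) array of grayscale frames; `code` is the known
    zero-mean sequence driving the lamp. Pixels genuinely lit by the coded
    lamp carry its fluctuations, so correlation recovers (up to scale) the
    scene as it would appear lit by that lamp alone.
    """
    frames = frames.astype(np.float64)
    mean_free = frames - frames.mean(axis=0, keepdims=True)   # remove static content
    # Code-weighted temporal average: contract the time axis of (T,) with (T, H, W).
    return np.tensordot(code, mean_free, axes=([0], [0])) / len(code)

def manipulation_map(suspect: np.ndarray, reference: np.ndarray,
                     thresh: float = 0.5) -> np.ndarray:
    """Flag regions where the suspect video's code image no longer matches."""
    diff = np.abs(suspect - reference)
    return diff > thresh * (np.abs(reference).max() + 1e-9)
```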
The paper noted that such an approach could be useful for public events and interviews, making it possible to detect when clips of these key moments have been morphed.
The approach’s success, however, depends on widespread adoption of the specially designed lights.
As AI-generated videos and AI-morphed clips become more realistic, experts are exploring more ways to watermark original content. The need of the hour is a watermarking method that even malicious attackers cannot strip from the videos they work with.
Published – August 13, 2025 02:35 pm IST