It isn’t hard for just about anyone to alter an image these days -- and that can be a problem.
It’s an issue researchers at the Defense Advanced Research Projects Agency want to put to rest with a new program called Media Forensics, or MediFor, which looks to build an algorithm-based platform that can detect image manipulation.
+More on Network World: Gartner: Get onboard the algorithm train!+
“The forensic tools used today lack robustness and scalability and address only some aspects of media authentication; an end‐to‐end platform to perform a complete and automated forensic analysis does not exist. Although there are a few applications for image manipulation detection in the commercial sector, they are typically limited to a yes/no decision about the source being an “original” asset, obtained directly from an imaging device. As a result, media authentication is typically performed manually using a variety of ad hoc methods that are often more art than science, and forensics analysts rely heavily on their own background and experience,” DARPA stated.
DARPA noted that many image manipulations are benign, performed for fun or for artistic value, but some are for adversarial purposes, such as propaganda or misinformation campaigns. Gaining a complete understanding of what manipulation was done is essential for analysts and systems to ultimately decide whether to use the image or video.
DARPA said it hopes the MediFor program levels this playing field, “which currently favors the image manipulator, by developing technologies for the automated assessment of the integrity of an image or video.”

+More on Network World: DARPA: Current DDoS protection isn’t cutting it+

“The program will integrate these technologies in a visual media forensics platform that, for a given image/video, will automatically detect manipulations, provide analysts and decision makers with detailed information about the types of manipulations performed, how they were performed, and their significance/importance in order to facilitate decisions regarding the intelligence value of the image/video. The MediFor platform will also automatically discover associations across visual media collections as another means for confirming the veracity of an image/video,” DARPA stated.
According to the agency, the MediFor program has adopted a model for image and video integrity composed of three elements:
- Digital Integrity Indicators: Are the pixels or the representation of the image or video inconsistent? Are there examples of pixel‐level features that cast doubt on the digital integrity such as edge discontinuities, blurred pixels, or repeated image regions? Do metadata and/or representation artifacts suggest manipulation?
- Physical Integrity Indicators: Are there image or video features that appear to violate the laws of physics? Do features from the 3D scene include shadows, reflections, and/or kinematics (video) that are inconsistent?
- Semantic Integrity Indicators: Do other information sources corroborate or contradict results of the digital or physical analyses or any assumptions made about the asset? Is there evidence that the date, time, or location is not correct using external knowledge, or that there are inconsistencies in digital or physical features within a group of assets? In attempting to discover if an asset may have been repurposed, is there other evidence that shows an asset is not what it is claimed to be?
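MediFor’s actual algorithms are not described in the announcement, but the “repeated image regions” check mentioned under Digital Integrity Indicators can be illustrated with a naive block-matching sketch. Everything below -- the function name, the fixed block size, and the representation of a grayscale image as a 2D list of pixel values -- is an illustrative assumption, not DARPA’s method:

```python
from collections import defaultdict

def find_repeated_blocks(image, block=4):
    """Crude copy-move indicator: find groups of identical
    block x block pixel regions at different positions.
    `image` is a 2D list of grayscale values (an assumption
    made for this sketch)."""
    h, w = len(image), len(image[0])
    seen = defaultdict(list)
    # Slide a block x block window over the image and hash its contents.
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            key = tuple(tuple(image[y + dy][x + dx] for dx in range(block))
                        for dy in range(block))
            seen[key].append((y, x))
    # Any pixel block appearing at more than one location is suspicious.
    return [locs for locs in seen.values() if len(locs) > 1]

# Build an 8x8 image with all-distinct pixels, then clone one 4x4 patch.
clean = [[r * 8 + c for c in range(8)] for r in range(8)]
tampered = [row[:] for row in clean]
for dy in range(4):
    for dx in range(4):
        tampered[4 + dy][4 + dx] = tampered[dy][dx]

print(find_repeated_blocks(clean))     # no duplicates in the clean image
print(find_repeated_blocks(tampered))  # flags the cloned region
```

Real forensic tools would have to be far more robust than this -- matching near-duplicates after recompression, scaling, or noise -- which is precisely the gap between ad hoc checks and the end-to-end platform DARPA describes.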