Image manipulation isn't something new; it has been happening almost since photographs were first invented. It's just that in recent years Artificial Intelligence has made it far easier to do, which has opened the floodgates for artificially manipulated images on the internet. Half the time I literally can't tell whether what I am seeing is real. I have started questioning every image I see; heck, I even question some of the photos my family sends me.
This... is not good. Manipulated images haven't just introduced something hard to catch; they have tarnished the credibility of real images. Images have always been a source of truth for us, something we use to visualise what would otherwise be mere text. Just look at history books: wouldn't they be boring without any images? But if manipulation existed before, were images ever really a source of truth? History shows this isn't a one-off problem, but a recurring failure of how we trust images.
One of the most widely circulated portraits of President Abraham Lincoln was later found to be a simple composite: his face had been placed onto another man's body. For decades the portrait's authenticity was never questioned, despite the fact that the body belonged to a slavery advocate, directly contradicting what Lincoln stood for. The manipulation wasn't subtle, nor was it digital, yet it went unquestioned for years. Not because it was true, but because verifying it was harder than believing it.
The image on the left is the altered portrait of President Lincoln; the image on the right is the original portrait of slavery advocate John Calhoun. (Image credit: Library of Congress)
Someone who understood the power of images was the notorious dictator of the USSR, Joseph Stalin. During his rule, photographs were not just historical records but tools for shaping them. Stalin's enemies were not only removed from public life but also expunged from the records, erasing them from history. This slowly became normalised and unquestioned; instead of images documenting the past, it was Stalin's state writing its own version of the past. This kind of manipulation shows how images can be used not just to distort the truth but to overwrite it.
Left shows the original photograph of Nikolai Antipov, Stalin, Sergei Kirov and Nikolai Shvernik in Leningrad, 1926. (Credit: Tate Archive by David King, 2016/Tate, London/Art Resource, NY)
The modern digital era marked a fundamental shift. With tools like Adobe Photoshop, image manipulation stopped being rare and became accessible to almost anyone. Editing no longer required significant skill, time, or resources, and more importantly, it became difficult to detect. While mechanisms like metadata were introduced to preserve authenticity, they were fragile and easily altered, offering only a thin layer of reassurance. For the first time, image creation began to scale faster than image verification. This wasn’t just an increase in fake images; it was the point where trust in images stopped keeping up.
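To see just how thin that layer of reassurance is, here is a minimal sketch in Python using the Pillow library (the file names are placeholders) of how easily EXIF metadata can be inspected, dropped, or rewritten:

```python
from PIL import Image  # pip install Pillow

# Open a photo and inspect whatever EXIF metadata it carries.
img = Image.open("photo.jpg")
exif = img.getexif()
print(dict(exif))  # camera model, timestamps, software, etc.

# Simply re-saving without passing the EXIF data drops all of it.
img.save("photo_stripped.jpg", quality=95)

# Or rewrite a field outright: EXIF tag 0x0110 is the camera model.
exif[0x0110] = "Totally Real Camera"
img.save("photo_rewritten.jpg", exif=exif.tobytes(), quality=95)
```

A few lines and no special tooling, and the "provenance" is either gone or lying, which is why metadata alone was never going to be a reliable anchor for trust.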
Come to the present day, and that imbalance has widened dramatically. With the rise of Artificial Intelligence for image generation and editing, creating convincing manipulations no longer requires technical skill or effort. Images can now be generated, altered, or refined through simple instructions, often in seconds. What once demanded time, expertise, and intent has been reduced to a conversational interaction. Meanwhile, verifying whether an image is authentic still requires forensic-level scrutiny, tools, and context. The result isn’t that images suddenly became fake; it’s that human judgment can no longer reliably keep up.
https://x.com/IndianTechGuide/status/2009256327596355938?s=20
In a recent podcast with Raj Shamani, Deepinder Goyal talked about how customers abuse AI-generated content to scam Zomato customer support.
I know I have been bashing image manipulation for the last few paragraphs, but it isn't inherently bad. In fact, it has enabled some of the most valuable visual work we have, from enhanced space imagery to complex visual design and creative expression. The problem isn’t that images are altered; it’s that viewers are rarely told how or why they were altered. What matters isn’t whether an image has been manipulated, but whether the truth behind it is knowable.
Image of the Cosmic Tarantula, which has been mapped from the infrared spectrum into the visible spectrum for better color visualization.
(Image Credit: NASA, ESA, CSA, STScI, Webb ERO Production Team)
To explore this problem, we’ve been building Xvertice, an attempt to give viewers more context instead of false certainty. Rather than labelling images as simply “real” or “fake,” Xvertice focuses on explainable image forensics, helping users understand how an image may have been created, altered, or processed over time. The goal isn’t to replace judgment, but to inform it.
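To make "explainable" a bit more concrete, here is one classic forensic signal that tools in this space can surface: error level analysis. This is a rough illustration of the general idea, not Xvertice's actual pipeline, and the file names are placeholders.

```python
from PIL import Image, ImageChops, ImageEnhance  # pip install Pillow
import io

def error_level_analysis(path: str, quality: int = 90, scale: int = 15) -> Image.Image:
    """Re-save the image as JPEG and amplify the pixel-wise difference.

    Regions edited after the photo's last save often recompress differently,
    so they show up as brighter patches in the resulting map.
    """
    original = Image.open(path).convert("RGB")

    # Recompress in memory at a fixed JPEG quality.
    buf = io.BytesIO()
    original.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)

    # Difference between the two versions, brightened so it is visible.
    diff = ImageChops.difference(original, resaved)
    return ImageEnhance.Brightness(diff).enhance(scale)

if __name__ == "__main__":
    error_level_analysis("photo.jpg").save("photo_ela.png")
```

The output is something a viewer can actually look at and reason about rather than a bare "real" or "fake" verdict, which is the kind of context this approach aims for.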
We’re launching an experimental demo to share this approach early. It’s not a finished product, and it’s not meant to deliver definitive answers, but it should make clear how we think about image trust and how we’re trying to close the gap between creation and verification. If you try it, your feedback will directly shape where it goes next.
Xvertice Links
Website: https://x-vertice.com/
Twitter/X: https://x.com/teamxvertice
Peerlist: https://peerlist.io/company/xvertice
LinkedIn: https://www.linkedin.com/company/xvertice



