How can images be protected from future manipulation by AIs? MIT researchers aim to answer this pressing question with an intriguing innovation: adjustments invisible to the human eye that make the images unintelligible to AI systems.
MIT researchers want to prevent AI image manipulation
Generative AIs have only recently become available to the masses, but AI systems’ abilities to create and manipulate images are advancing rapidly. This rapid development makes it necessary to think about solutions that not only identify real images with a kind of watermark but, ideally, also prevent them from being ingested and altered by AI systems in the first place. The MIT Computer Science and Artificial Intelligence Laboratory (MIT CSAIL) is now presenting such an approach with “PhotoGuard”.
As the research team describes, according to Engadget, the system is based on the idea that AIs can be thrown off by targeted “perturbations” and thereby lose their ability to recognize images. The basic principle: selected pixels in the image are altered in such a way that the artificial intelligence stumbles when processing the image through its algorithmic model, i.e. the mathematical representation of the position and color of each pixel. To humans, however, the changes are invisible.
Particularly exciting: according to the researchers, their method goes one step further. Using a computationally intensive “diffusion” attack, an image can be camouflaged for AIs in such a way that they simply perceive a different image altogether when trying to process it. “The encoder attack leads the model to believe that the input image (to be processed) is a different image (e.g., a grayscale image),” Hadi Salman, an MIT graduate student and the paper’s lead author, said in a statement to Engadget.
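The idea behind such an encoder attack can be illustrated with a toy sketch. This is not PhotoGuard’s actual code (the real system perturbs inputs to a large diffusion model’s encoder); here a fixed random linear map stands in for the encoder, and a signed-gradient (PGD-style) loop nudges an image, within a tiny per-pixel budget, so that its embedding drifts toward that of a decoy image:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an image encoder: a fixed random linear map.
# (Assumption for illustration; real encoders are deep networks.)
W = rng.normal(size=(8, 64))

def encode(x):
    return W @ x

image = rng.uniform(0, 1, size=64)   # flattened "image" to protect
decoy = rng.uniform(0, 1, size=64)   # decoy the AI should "see" instead

eps = 0.03    # max per-pixel change: the imperceptibility budget
step = 0.005  # size of each signed-gradient step
delta = np.zeros(64)

for _ in range(300):
    # Gradient of ||encode(image + delta) - encode(decoy)||^2 w.r.t. delta
    residual = encode(image + delta) - encode(decoy)
    grad = 2 * W.T @ residual
    delta -= step * np.sign(grad)        # move embedding toward the decoy's
    delta = np.clip(delta, -eps, eps)    # stay within the invisible budget

before = np.linalg.norm(encode(image) - encode(decoy))
after = np.linalg.norm(encode(image + delta) - encode(decoy))
print(f"embedding distance to decoy: {before:.3f} -> {after:.3f}")
```

The cloaked image differs from the original by at most `eps` per pixel, so a human sees essentially the same picture, while the (toy) encoder maps it noticeably closer to the decoy.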