An adversarial patch is a technique devised to fool machine learning models. A patch can be a physical object placed in the scene being photographed, or a digital pattern generated by an algorithm and pasted into the image. Computer vision models are typically trained on straightforward photos, though there can be different orientations or even different resolutions in the training…
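As an illustrative sketch of the algorithm-generated kind of patch (a toy not taken from the article): a patch can be optimised by gradient ascent so that pasting it onto an image pushes a model's score toward an attacker-chosen class. The tiny linear "classifier", its weights, and the image below are all hypothetical stand-ins for a real deep network and photo.

```python
import numpy as np

# Hypothetical linear "classifier" over a 4x4 grayscale image:
# score > 0 means the model predicts the attacker's target class.
w = np.array([
    [ 0.5, -0.2,  0.1,  0.3],
    [-0.4,  0.6, -0.1,  0.2],
    [ 0.3, -0.5,  0.4, -0.2],
    [ 0.1,  0.2, -0.3,  0.5],
])

def score(img):
    """Model's confidence in the target class (positive = predicted)."""
    return float(np.sum(w * img))

# A clean image the model scores negative: every pixel opposes the target.
image = -np.sign(w) * 0.5

# Optimise a 2x2 patch for the top-left corner. For this linear scorer,
# the gradient of the score w.r.t. the patch pixels is just the matching weights.
patch = np.zeros((2, 2))
for _ in range(50):
    patch += 0.5 * w[:2, :2]        # gradient ascent on the model's score
    patch = np.clip(patch, -1, 1)   # keep pixels in the valid range

patched = image.copy()
patched[:2, :2] = patch             # paste the patch onto the image

print(score(image), score(patched))  # negative before, positive after
```

Against a real network the principle is the same, except the gradient comes from backpropagation and the patch is additionally trained to survive printing, rotation, and scaling.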

The post What Are Adversarial Patches & Why Should We Worry appeared first on Analytics India Magazine.