
An Efficient Pre-processing Method to Eliminate Adversarial Effects (1905.08614v2)

Published 15 May 2019 in cs.CV

Abstract: Deep Neural Networks (DNNs) are vulnerable to adversarial examples, inputs with subtle perturbations that lead a model to predict incorrect outputs. Much of the current research on defending against adversarial examples pays little attention to real-world applications, suffering from either high computational complexity or poor defensive performance. Motivated by this observation, we develop an efficient preprocessing method to defend against adversarial images. Specifically, before an adversarial example is fed into the model, we apply two image transformations: WebP compression, which removes small adversarial noise, and a flip operation, which flips the image once along one side to destroy the specific spatial structure of the adversarial perturbation. The result is a de-perturbed sample that DNNs can classify correctly. Experimental results on ImageNet show that our method outperforms state-of-the-art defense methods: it effectively defends against adversarial attacks while incurring only a very small accuracy drop on normal images.
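
The two-step pipeline can be sketched in a few lines of Python. The Pillow-based helper below is a minimal illustration only: the function name, the WebP quality setting, and the choice of a horizontal flip are assumptions for the sketch, not values taken from the paper.

    import io
    from PIL import Image

    def deperturb(image: Image.Image, quality: int = 75) -> Image.Image:
        """WebP-compress, then flip, an input image before classification."""
        # Step 1: lossy WebP re-encoding removes small, high-frequency
        # adversarial noise. Quality 75 is an illustrative choice.
        buf = io.BytesIO()
        image.save(buf, format="WEBP", quality=quality)
        buf.seek(0)
        compressed = Image.open(buf).convert("RGB")
        # Step 2: a single flip along one side breaks the spatial alignment
        # between the remaining perturbation and the model's features.
        return compressed.transpose(Image.Transpose.FLIP_LEFT_RIGHT)

    # Usage: preprocess before feeding the image to the DNN classifier.
    # clean = deperturb(Image.open("example.png").convert("RGB"))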

Citations (1)
