# Adversarial Examples in the Physical World

## Kurakin, Goodfellow, Bengio

http://arxiv.org/pdf/1607.02533v1.pdf
* An adversarial example is a sample of input data that has been modified very slightly, in a way intended to cause a machine learning classifier to misclassify it (a minimal way of crafting one is sketched after this list).
* Adversarial examples pose security concerns because they could be used to perform an attack on machine learning systems, even if the adversary has no access to the underlying model.
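
Below is a minimal sketch of the fast gradient sign method (FGSM), the "fast" attack this line of work builds on, to make the definition concrete. It assumes a trained `tf.keras` classifier `model` that outputs logits, a single preprocessed `image` batch with pixel values in [0, 1], an integer `label`, and a hypothetical step size `epsilon`; it is an illustration, not the paper's exact procedure.

```python
import tensorflow as tf

def fgsm_example(model, image, label, epsilon=0.01):
    """Perturb `image` by epsilon in the direction that increases the classifier's loss."""
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
    image = tf.convert_to_tensor(image, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(image)                 # track gradients w.r.t. the input pixels, not the weights
        logits = model(image)
        loss = loss_fn(label, logits)
    gradient = tape.gradient(loss, image)  # d(loss) / d(input)
    adversarial = image + epsilon * tf.sign(gradient)
    return tf.clip_by_value(adversarial, 0.0, 1.0)  # keep pixels in the valid [0, 1] range
```

With these assumptions, running the model on `fgsm_example(model, image, label)` will often produce a different prediction than on the clean image once `epsilon` is large enough, even though the perturbation is barely visible.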