Added support for SNAL attack #2440
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

```
@@            Coverage Diff             @@
##           dev_1.19.0   #2440      +/- ##
==============================================
+ Coverage       85.25%   85.38%    +0.13%
==============================================
  Files             330      333        +3
  Lines           30470    30930      +460
  Branches         5228     5294       +66
==============================================
+ Hits            25977    26410      +433
- Misses           3042     3053       +11
- Partials         1451     1467       +16
```
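The headline figures in the table are easy to verify by recomputing coverage as hits divided by lines. The truncating (round-down) behaviour below is an assumption, since Codecov rounds down to two decimals by default:

```python
import math

def coverage(hits: int, lines: int) -> float:
    # Assumption: Codecov's default rounding mode, which truncates
    # (rounds down) the percentage to two decimal places.
    return math.floor(hits / lines * 10000) / 100

before = coverage(25977, 30470)  # Hits / Lines on dev_1.19.0
after = coverage(26410, 30930)   # Hits / Lines with #2440 merged
print(before, after, round(after - before, 2))  # 85.25 85.38 0.13
```

This reproduces the 85.25% → 85.38% (+0.13%) change reported above.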
Hi @CNOCycle Thank you very much for your pull request. I think it looks good; I have only found a few formatting and documentation issues. Could you please take a look and let me know if you have any questions?
```
@@ -0,0 +1,747 @@
#
# Copyright (C) The Adversarial Robustness Toolbox (ART) Authors 2018
```

Suggested change:

```diff
-# Copyright (C) The Adversarial Robustness Toolbox (ART) Authors 2018
+# Copyright (C) The Adversarial Robustness Toolbox (ART) Authors 2024
```
```
"""
This module implements the paper: "Steal Now and Attack Later: Evaluating Robustness of Object Detection against Black-box Adversarial Attacks"
| Paper link: https://arxiv.org/abs/2304.05370
"""
```
```
import logging
from typing import Optional, Tuple, TYPE_CHECKING
```

Suggested change:

```diff
 import logging
+import random
 from typing import Optional, Tuple, TYPE_CHECKING
```
```
from typing import Optional, Tuple, TYPE_CHECKING

import numpy as np
import random
```

Suggested change:

```diff
 import numpy as np
-import random
```
```
x_out[:, :, y1:y2, x1:x2] = updated

return x_out, tile_mat
```

Add empty line at end of file.
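For context, the line under review writes an updated tile back into the image batch. A minimal NumPy sketch of that paste, with hypothetical shapes, coordinates, and values:

```python
import numpy as np

# Batch of images in NCHW layout (batch, channels, height, width);
# all values here are hypothetical, chosen only for illustration.
x_out = np.zeros((1, 3, 8, 8))
updated = np.ones((1, 3, 2, 2))   # the adversarially updated tile
y1, y2, x1, x2 = 2, 4, 5, 7       # tile location inside the image

# Paste the tile back, mirroring the line under review.
x_out[:, :, y1:y2, x1:x2] = updated
print(x_out.sum())  # 12.0 -> 3 channels x 2 x 2 pixels set to 1
```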
```
import torch
TRIAL = 10
```
```
def __init__(
    self,
    estimator: "torch.nn.Module",
```

Suggested change:

```diff
-    estimator: "torch.nn.Module",
+    estimator: PYTORCH_OBJECT_DETECTION_TYPE,
```

and please add

```
PYTORCH_OBJECT_DETECTION_TYPE = Union[PyTorchObjectDetector]
```

to art/utils.py with the other type definitions at the top of that file.
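The suggested alias could look like the sketch below. `PyTorchObjectDetector` here is a stand-in class, since the real one lives in `art.estimators.object_detection`; note that `typing.Union` with a single member simply collapses to that member:

```python
from typing import Union

class PyTorchObjectDetector:
    """Stand-in for art.estimators.object_detection.PyTorchObjectDetector."""

# A single-member Union today; written as a Union so additional detector
# types can be appended later without touching annotated call sites.
PYTORCH_OBJECT_DETECTION_TYPE = Union[PyTorchObjectDetector]

print(PYTORCH_OBJECT_DETECTION_TYPE is PyTorchObjectDetector)  # True
```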
```
# pylint: disable=C0412
import torch

logger = logging.getLogger(__name__)
```

Could you please add docstrings to all functions in this module?
```
logger = logging.getLogger(__name__)

def _bbox_ioa(box1: "torch.Tensor",
```

Could you please include the license text of the file yolov5/utils/metrics.py in this docstring?
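For readers unfamiliar with the helper: `bbox_ioa` in yolov5 computes the intersection over the area of the second box, not IoU. A NumPy sketch of the idea (not the PR's torch implementation):

```python
import numpy as np

def bbox_ioa(box1: np.ndarray, box2: np.ndarray, eps: float = 1e-7) -> np.ndarray:
    """Intersection over box2 area; boxes are (x1, y1, x2, y2)."""
    b1_x1, b1_y1, b1_x2, b1_y2 = box1
    b2_x1, b2_y1, b2_x2, b2_y2 = box2.T

    # Overlap width x height, clipped at zero when the boxes are disjoint.
    inter = (np.minimum(b1_x2, b2_x2) - np.maximum(b1_x1, b2_x1)).clip(0) * (
        np.minimum(b1_y2, b2_y2) - np.maximum(b1_y1, b2_y1)
    ).clip(0)

    # Normalise by box2's own area, not by the union as IoU would.
    area2 = (b2_x2 - b2_x1) * (b2_y2 - b2_y1) + eps
    return inter / area2

print(bbox_ioa(np.array([0, 0, 10, 10]), np.array([[5, 5, 15, 15]])))  # ~[0.25]
```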
```
              block_size: int):
    """
    === NOTE ===
    This function is modified from torchvision (torchvision/ops/drop_block.py)
```

Could you please include the license text of the file torchvision/ops/drop_block.py in this docstring?
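For context on the borrowed technique: torchvision's drop_block samples seed points and dilates each into a `block_size` x `block_size` square, which is a natural way to pick contiguous tiles. A small NumPy sketch of that masking step (function and parameter names are assumptions, not the PR's code):

```python
import numpy as np

def block_mask(height: int, width: int, p: float, block_size: int, rng) -> np.ndarray:
    """DropBlock-style mask: sample seed points, grow each into a square block."""
    # Seed probability scaled so the expected masked fraction is roughly p.
    gamma = (p / block_size**2) * (height * width) / (
        (height - block_size + 1) * (width - block_size + 1)
    )
    # Seeds may only start where a full block still fits inside the image.
    seeds = rng.random((height - block_size + 1, width - block_size + 1)) < gamma

    mask = np.zeros((height, width), dtype=bool)
    for y, x in zip(*np.nonzero(seeds)):
        mask[y : y + block_size, x : x + block_size] = True  # dilate seed to block
    return mask
```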
art/attacks/evasion/steal_now_attack_later/steal_now_attack_later.py: Fixed (conversation resolved)
Hi @CNOCycle Thank you very much for your contribution of the SNAL attack to ART! |
Description
This pull request adds support for the SNAL attack proposed in [1].
[1] Steal Now and Attack Later: Evaluating Robustness of Object Detection against Black-box Adversarial Attacks. [Paper]
Fixes # (issue)
Type of change
Please check all relevant options.
Testing
Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.
Test Configuration:
Checklist