This demo allows you to compare the performance of traditional AI models (Non-XAI) with Explainable AI models (XAI). You can upload an image to test the models in real-time or choose from pre-existing images with different adversarial conditions (Camouflage, Patch). Each mode showcases how AI interprets objects under various scenarios.
- Camouflage attacks exploit the similarity between an object and its background, making detection challenging.
- Gradually blending the object into its surroundings makes it both visually and computationally difficult to distinguish (see the sketch after this list).
- This attack effectively reduces the contrast between the object and its background, tricking both human observers and AI models.
- It is particularly effective in environments with significant background interference, such as natural settings or complex urban areas.
- Camouflage attacks are crucial for understanding the limitations of object detection models, especially those relying heavily on color and texture differentiation.
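For intuition, here is a minimal sketch of how a blending-based camouflage effect could be implemented, assuming simple alpha compositing of the object region toward a coarse background estimate. The function, file names, and the strength parameter are illustrative assumptions, not the demo's actual code:

```python
import numpy as np
from PIL import Image

def apply_camouflage(image_path, mask_path, strength):
    """Blend the masked object toward a coarse background estimate.

    strength in [0, 1]: 0 leaves the object untouched, 1 replaces it
    entirely with the background estimate. (Illustrative sketch only.)
    """
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32)
    h, w = img.shape[:2]

    # Binary object mask (white = object pixels), broadcast over RGB channels.
    mask = np.asarray(Image.open(mask_path).convert("L"), dtype=np.float32) / 255.0
    mask = mask[..., None]

    # Crude background estimate: downsample and upsample the image, which
    # washes out the object's fine texture and color detail.
    bg = Image.open(image_path).convert("RGB").resize((w // 16, h // 16))
    bg = np.asarray(bg.resize((w, h), Image.BILINEAR), dtype=np.float32)

    # Alpha-blend object pixels toward the background; background pixels stay put.
    alpha = strength * mask
    blended = img * (1.0 - alpha) + bg * alpha
    return Image.fromarray(blended.clip(0, 255).astype(np.uint8))
```

Here, strength plays the same role as the camouflage-strength slider described in the steps below: higher values push the object's pixels further toward the background and lower its contrast with the scene.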
- Patch attacks aim to disrupt object recognition by obscuring key features with randomly shaped and sized patches.
- These patches introduce noise and occlude important regions of the object, making identification difficult.
- The patches can be placed in critical regions where the model expects key features (e.g., windows on airplanes, ship hulls, or vehicle headlights).
- By adjusting the size and selection rate of patches, researchers can evaluate how different configurations affect detection performance (see the sketch after this list).
- Patch attacks are an effective method for testing model robustness and improving adversarial resilience.
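As a rough illustration of the patch mechanic, the sketch below tiles an object's bounding box into a grid, randomly selects cells at a given rate, and fills them with noise. The parameter names (scale, rate) mirror the scaling factor and selection rate mentioned above, but the code itself is an assumption, not the demo's implementation:

```python
import numpy as np
from PIL import Image

def apply_patches(image, box, scale, rate, seed=None):
    """Occlude random cells inside a bounding box with noise patches.

    box   -- (x1, y1, x2, y2) object bounding box in pixels
    scale -- patch side length as a fraction of the box's shorter side
    rate  -- fraction of candidate cells that receive a patch
    (Parameter names are illustrative; the demo's sliders may differ.)
    """
    rng = np.random.default_rng(seed)
    arr = np.asarray(image.convert("RGB")).copy()
    x1, y1, x2, y2 = box
    side = max(1, int(min(x2 - x1, y2 - y1) * scale))

    # Tile the box into a grid of candidate patch locations.
    cells = [(x, y)
             for y in range(y1, y2 - side + 1, side)
             for x in range(x1, x2 - side + 1, side)]
    chosen = [c for c in cells if rng.random() < rate]

    # Fill each chosen cell with uniform random noise.
    for x, y in chosen:
        arr[y:y + side, x:x + side] = rng.integers(0, 256, (side, side, 3), dtype=np.uint8)
    return Image.fromarray(arr)
```

A larger scale produces bigger, more disruptive occlusions, while a higher rate covers more of the object; together they control how much of the model's expected evidence is removed.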
Testing Object Detection Models
To evaluate the impact of these attacks, users can test their models using datasets such as ShipRSImageNet and RarePlanes, which provide high-resolution imagery for ship and aircraft detection. These datasets allow for testing model performance in real-world and adversarial conditions.
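One way to quantify the impact of an attack is to run the same detector over clean and attacked copies of each image and compare how many confident detections survive. The sketch below uses a pretrained torchvision Faster R-CNN as a stand-in detector and assumes a hypothetical directory layout; neither reflects the demo's actual models or data paths:

```python
from pathlib import Path

import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Stand-in detector; the demo's Non-XAI and XAI models are not public here.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def count_confident_detections(image_path, threshold=0.5):
    """Run the detector on one image and count boxes above the score threshold."""
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]
    return int((out["scores"] > threshold).sum())

# Hypothetical layout: clean images and their attacked counterparts side by side.
for clean_path in sorted(Path("data/clean").glob("*.png")):
    attacked_path = Path("data/camouflaged") / clean_path.name
    print(clean_path.name,
          "clean:", count_confident_detections(clean_path),
          "attacked:", count_confident_detections(attacked_path))
```

A drop in the detection count (or in confidence scores) between the clean and attacked columns indicates how much the attack degrades the model.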
Experience our models in action with the sample test images provided below.
Upload an image to see the Non-XAI and XAI versions side by side.
Select an option: Choose whether you want to work with a plane or a ship.
Apply camouflage effects to test AI recognition.
Select an image: Choose an image from the slider below.
Adjust Camouflage Strength: Use the slider to set the level of camouflage applied to the image.
Apply the Camouflage: Click the Apply Camouflage button below to run the camouflage attack on the selected image.
Inference Hidden: If the bounding boxes are hidden, click Show Inference to display them again.
Apply patching to modify AI detection results.
Select an image: Choose an image from the slider below.
Adjust Patch Parameters: Use the sliders to set the scaling factor and selection rate.
Inference Hidden: If the bounding boxes are hidden, click Show Inference to display them again.