Compare commits: 1.8.0...a9289d8e47 (29 commits)
| SHA1 |
|---|
| a9289d8e47 |
| 67ec8af3ae |
| 691f45a1dc |
| 80197454a3 |
| 841cd09f2d |
| e7183864d0 |
| 12ac3dbf62 |
| e41ec93058 |
| 8a86e3142a |
| 9d003d12b4 |
| 63c638751b |
| a3e85b7207 |
| 4974be02b4 |
| 7fe2f9a5c1 |
| f4634d03bd |
| e1ca9fadcd |
| adb7700300 |
| ced0d4c2fc |
| f483c76638 |
| f65116aec8 |
| bdcb6de4b2 |
| af8e27d075 |
| accf926ff7 |
| acf387bd0e |
| ab7a91e020 |
| a1a0d6b757 |
| 2fb089805c |
| 6cf2a94431 |
| 162636e60f |
53 .github/ISSUE_TEMPLATE/bug_report.md (vendored)
@@ -1,42 +1,41 @@
 ---
-name: Bug report
+name: Rapport de bug
-about: Create a report to help us improve
+about: Créez un rapport pour nous aider à nous améliorer
 title: ''
 labels: ''
 assignees: ''
 
 ---
 
-**Describe the bug**
+**Décrivez le bug**
-A clear and concise description of what the bug is.
+Une description claire et concise de ce qu'est le bug.
 
-**Affected module**
+**Module affecté**
-Which one is the module that is not working as expected, e.g. Nessus, Qualys WAS, Qualys VM, OpenVAS, ELK, Jira...).
+Lequel des modules ne fonctionne pas comme prévu, par exemple, Nessus, Qualys WAS, Qualys VM, OpenVAS, ELK, Jira...
 
-**VulnWhisperer debug trail**
+**Trace de débogage de VulnWhisperer**
-If applicable, paste the VulnWhisperer debug trail of the execution for further detail (execute with '-d' flag).
+Si possible, veuillez joindre la trace de débogage de l'exécution pour une enquête plus approfondie (exécuter avec l'option `-d`).
 
-**To Reproduce**
+**Pour reproduire**
-Steps to reproduce the behavior:
+Étapes pour reproduire le comportement :
-1. Go to '...'
+1. Allez à '...'
-2. Click on '....'
+2. Cliquez sur '....'
-3. Scroll down to '....'
-4. See error
+3. Voir l'erreur
 
-**Expected behavior**
+**Comportement attendu**
-A clear and concise description of what you expected to happen.
+Une description claire et concise de ce à quoi vous vous attendiez.
 
-**Screenshots**
+**Captures d'écran**
-If applicable, add screenshots to help explain your problem.
+Si applicable, ajoutez des captures d'écran pour aider à expliquer votre problème.
 
-**System in which VulnWhisperer runs (please complete the following information):**
+**Système sur lequel VulnWhisperer s'exécute (veuillez compléter les informations suivantes) :**
-- OS: [e.g. Ubuntu Server]
+- OS : [ex. Ubuntu Server]
-- Version: [e.g. 18.04.2 LTS]
+- Version : [ex. 18.04.2 LTS]
-- VulnWhisperer Version: [e.g. 1.7.1]
+- Version de VulnWhisperer : [ex. 1.7.1]
 
-**Additional context**
+**Contexte additionnel**
-Add any other context about the problem here.
+Ajoutez tout autre contexte sur le problème ici.
 
-## Important Note
+**Note importante**
-As VulnWhisperer relies on ELK for the data aggregation, it is expected that you already have an ELK instance or the knowledge to deploy one.
+Comme VulnWhisperer s'appuie sur ELK pour l'agrégation de données, il est attendu que vous ayez déjà une instance ELK ou les connaissances pour en déployer une.
-In order to speed up deployment, we provide an updated and tested docker-compose file which deploys all the needed infrastructure and we will support its deployment, but we will not be giving support to ELK instances.
+Pour accélérer le déploiement, nous fournissons un fichier docker-compose à jour et testé qui déploie toute l'infrastructure nécessaire et nous supporterons son déploiement, mais nous ne donnerons pas de support pour les instances ELK.
@@ -1,4 +1,4 @@
-FROM centos:latest
+FROM centos:7
 
 MAINTAINER Justin Henderson justin@hasecuritysolutions.com
 
190 README.md
@@ -1,19 +1,15 @@
-<p align="center"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/vuln_whisperer_logo_s.png" width="400px"></p>
+<p align="center"><img src="https://git.gudita.com/Cyberdefense/VulnWhisperer/raw/branch/master/docs/source/vuln_whisperer_logo_s.png" width="400px"></p>
-<p align="center"> <i>Create <u><b>actionable data</b></u> from your vulnerability scans </i> </p>
+<p align="center"> <i>Créez des <u><b>données exploitables</b></u> à partir de vos scans de vulnérabilités</i> </p>
 
-<p align="center" style="width:400px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/vulnWhispererWebApplications.png" style="width:400px"></p>
+<p align="center" style="width:400px"><img src="https://git.gudita.com/Cyberdefense/VulnWhisperer/raw/branch/master/docs/source/vulnWhispererWebApplications.png" style="width:400px"></p>
 
 
-VulnWhisperer is a vulnerability management tool and report aggregator. VulnWhisperer will pull all the reports from the different Vulnerability scanners and create a file with a unique filename for each one, using that data later to sync with Jira and feed Logstash. Jira does a closed cycle full Sync with the data provided by the Scanners, while Logstash indexes and tags all of the information inside the report (see logstash files at /resources/elk6/pipeline/). Data is then shipped to ElasticSearch to be indexed, and ends up in a visual and searchable format in Kibana with already defined dashboards.
+VulnWhisperer est un outil de gestion des vulnérabilités et un agrégateur de rapports. VulnWhisperer récupère tous les rapports des différents scanners de vulnérabilités et crée un fichier avec un nom unique pour chacun, utilisant ensuite ces données pour se synchroniser avec Jira et alimenter Logstash. Jira effectue une synchronisation complète en cycle fermé avec les données fournies par les scanners, tandis que Logstash indexe et étiquette toutes les informations contenues dans le rapport (voir les fichiers logstash dans `/resources/elk6/pipeline/`). Les données sont ensuite envoyées à ElasticSearch pour être indexées, et finissent dans un format visuel et consultable dans Kibana avec des tableaux de bord déjà définis.
 
-[](https://travis-ci.org/HASecuritySolutions/VulnWhisperer)
+VulnWhisperer est un projet open-source financé par la communauté. VulnWhisperer est actuellement fonctionnel mais nécessite une refonte de la documentation et une revue de code. Si vous souhaitez de l'aide, si vous êtes intéressé par de nouvelles fonctionnalités, ou si vous recherchez un support payant, veuillez nous contacter à **info@sahelcyber.com**.
-[](http://choosealicense.com/licenses/mit/)
-[](https://twitter.com/VulnWhisperer)
 
-Currently Supports
------------------
 
-### Vulnerability Frameworks
+### Scanners de Vulnérabilités Supportés
 
 - [X] [Nessus (**v6**/**v7**/**v8**)](https://www.tenable.com/products/nessus/nessus-professional)
 - [X] [Qualys Web Applications](https://www.qualys.com/apps/web-app-scanning/)
@@ -26,143 +22,117 @@ Currently Supports
 - [ ] [NMAP](https://nmap.org/)
 - [ ] [Burp Suite](https://portswigger.net/burp)
 - [ ] [OWASP ZAP](https://www.zaproxy.org/)
-- [ ] More to come
+- [ ] Et d'autres à venir
 
-### Reporting Frameworks
+### Plateformes de Reporting Supportées
 
-- [X] [ELK](https://www.elastic.co/elk-stack)
+- [X] [Elastic Stack (**v6**/**v7**)](https://www.elastic.co/elk-stack)
+- [ ] [OpenSearch - Envisagé pour la prochaine mise à jour](https://opensearch.org/)
 - [X] [Jira](https://www.atlassian.com/software/jira)
 - [ ] [Splunk](https://www.splunk.com/)
 
-Getting Started
-===============
+## Démarrage
 
-1) Follow the [install requirements](#installreq)
+1) Suivez les [prérequis d'installation](#installreq)
-2) Fill out the section you want to process in <a href="https://github.com/HASecuritySolutions/VulnWhisperer/blob/master/configs/frameworks_example.ini">frameworks_example.ini file</a>
+2) Remplissez la section que vous souhaitez traiter dans le fichier <a href="https://git.gudita.com/Cyberdefense/VulnWhisperer/src/branch/master/configs/frameworks_example.ini">frameworks_example.ini</a>
-3) [JIRA] If using Jira, fill Jira config in the config file mentioned above.
+3) [JIRA] Si vous utilisez Jira, remplissez la configuration Jira dans le fichier de configuration mentionné ci-dessus.
-3) [ELK] Modify the IP settings in the <a href="https://github.com/HASecuritySolutions/VulnWhisperer/tree/master/resources/elk6/pipeline">Logstash files to accommodate your environment</a> and import them to your logstash conf directory (default is /etc/logstash/conf.d/)
+3) [ELK] Modifiez les paramètres IP dans les <a href="https://git.gudita.com/Cyberdefense/VulnWhisperer/src/branch/master/resources/elk6/pipeline">fichiers Logstash pour correspondre à votre environnement</a> et importez-les dans votre répertoire de configuration logstash (par défaut `/etc/logstash/conf.d/`)
-4) [ELK] Import the <a href="https://github.com/HASecuritySolutions/VulnWhisperer/blob/master/resources/elk6/kibana.json">Kibana visualizations</a>
+4) [ELK] Importez les <a href="https://git.gudita.com/Cyberdefense/VulnWhisperer/src/branch/master/resources/elk6/kibana.json">visualisations Kibana</a>
-5) [Run Vulnwhisperer](#run)
+5) [Exécutez Vulnwhisperer](#run)
 
+> **Note importante concernant les liens du Wiki :** La migration de Gitea ne transfère pas toujours le Wiki d'un projet GitHub (qui est techniquement un dépôt séparé). Si les liens vers le Wiki (comme le guide de déploiement ELK) ne fonctionnent pas, vous devrez peut-être recréer ces pages manuellement dans l'onglet "Wiki" de votre dépôt sur Gitea.
 
-Need assistance or just want to chat? Join our [slack channel](https://join.slack.com/t/vulnwhisperer/shared_invite/enQtNDQ5MzE4OTIyODU0LWQxZTcxYTY0MWUwYzA4MTlmMWZlYWY2Y2ZmM2EzNDFmNWVlOTM4MzNjYzI0YzdkMDA0YmQyYWRhZGI2NGUxNGI)
+Besoin d'aide ou juste envie de discuter ? Rejoignez notre [canal Slack](https://join.slack.com/t/vulnwhisperer/shared_invite/enQtNDQ5MzE4OTIyODU0LWQxZTcxYTY0MWUwYzA4MTlmMWZlYWY2Y2ZmM2EzNDFmNWVlOTM4MzNjYzI0YzdkMDA0YmQyYWRhZGI2NGUxNGI)
 
-Requirements
--------------
-####
-* Python 2.7
-* Vulnerability Scanner
-* Reporting System: Jira / ElasticStack 6.6
+## Prérequis
+* Python 2.7
+* Un Scanner de Vulnérabilités
+* Un Système de Reporting : Jira / ElasticStack 6.6
 
-<a id="installreq">Install Requirements-VulnWhisperer(may require sudo)</a>
---------------------
+<a id="installreq"></a>
+## Prérequis d'Installation - VulnWhisperer (peut nécessiter sudo)
 
-**Install OS packages requirement dependencies** (Debian-based distros, CentOS don't need it)
+**Installez les dépendances des paquets du système d'exploitation** (pour les distributions basées sur Debian, CentOS n'en a pas besoin)
 ```shell
 sudo apt-get install zlib1g-dev libxml2-dev libxslt1-dev
 ```
 
-**(Optional) Use a python virtualenv to not mess with host python libraries**
-```shell
-virtualenv venv (will create the python 2.7 virtualenv)
-source venv/bin/activate (start the virtualenv, now pip will run there and should install libraries without sudo)
-deactivate (for quitting the virtualenv once you are done)
-```
+(Optionnel) Utilisez un environnement virtuel python pour ne pas perturber les bibliothèques python de l'hôte
+```shell
+virtualenv venv # créera l'environnement virtuel python 2.7
+source venv/bin/activate # démarre l'environnement, pip s'exécutera ici et devrait installer les bibliothèques sans sudo
+deactivate # pour quitter l'environnement virtuel une fois que vous avez terminé
+```
 
-**Install python libraries requirements**
-```python
-pip install -r /path/to/VulnWhisperer/requirements.txt
-cd /path/to/VulnWhisperer
-python setup.py install
-```
+Installez les dépendances des bibliothèques python
+```shell
+pip install -r /chemin/vers/VulnWhisperer/requirements.txt
+cd /chemin/vers/VulnWhisperer
+python setup.py install
+```
 
-**(Optional) If using a proxy, add proxy URL as environment variable to PATH**
-```shell
-export HTTP_PROXY=http://example.com:8080
-export HTTPS_PROXY=http://example.com:8080
-```
+(Optionnel) Si vous utilisez un proxy, ajoutez l'URL du proxy comme variable d'environnement au PATH
+export HTTP_PROXY=[http://exemple.com:8080](http://exemple.com:8080)
+export HTTPS_PROXY=[http://exemple.com:8080](http://exemple.com:8080)
 
-Now you're ready to pull down scans. (see <a href="#run">run section</a>)
+Vous êtes maintenant prêt à télécharger les scans.
 
 Configuration
------
+Il y a quelques étapes de configuration pour mettre en place VulnWhisperer :
 
-There are a few configuration steps to setting up VulnWhisperer:
-* Configure Ini file
-* Setup Logstash File
-* Import ElasticSearch Templates
-* Import Kibana Dashboards
+Configurer le fichier Ini
+Configurer le fichier Logstash
+Importer les modèles ElasticSearch
+Importer les tableaux de bord Kibana
 
-<a href="https://github.com/austin-taylor/VulnWhisperer/blob/master/configs/frameworks_example.ini">frameworks_example.ini file</a>
-<p align="left" style="width:200px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/config_example.png" style="width:200px"></p>
 
-<a id="run">Run</a>
------
-To run, fill out the configuration file with your vulnerability scanner settings. Then you can execute from the command line.
-```python
-(optional flag: -F -> provides "Fancy" log colouring, good for comprehension when manually executing VulnWhisperer)
-vuln_whisperer -c configs/frameworks_example.ini -s nessus
-or
-vuln_whisperer -c configs/frameworks_example.ini -s qualys
-```
+Exécution
+Pour exécuter, remplissez le fichier de configuration avec les paramètres de votre scanner de vulnérabilités. Ensuite, vous pouvez l'exécuter depuis la ligne de commande.
+# (optionnel : -F -> fournit une coloration "Fantaisie" des logs, utile pour la compréhension lors de l'exécution manuelle de VulnWhisperer)
+vuln_whisperer -c configs/frameworks_example.ini -s nessus
+# ou
+vuln_whisperer -c configs/frameworks_example.ini -s qualys
 
-If no section is specified (e.g. -s nessus), vulnwhisperer will check on the config file for the modules that have the property `enabled=true` and run them sequentially.
+Si aucune section n'est spécifiée (ex. -s nessus), vulnwhisperer vérifiera dans le fichier de configuration les modules ayant la propriété enabled=true et les exécutera séquentiellement.
 
-<p align="center" style="width:300px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/running_vuln_whisperer.png" style="width:400px"></p>
-Next you'll need to import the visualizations into Kibana and setup your logstash config. You can either follow the sample setup instructions [here](https://github.com/HASecuritySolutions/VulnWhisperer/wiki/Sample-Guide-ELK-Deployment) or go for the `docker-compose` solution we offer.
 
 Docker-compose
------
-ELK is a whole world by itself, and for newcomers to the platform, it requires basic Linux skills and usually a bit of troubleshooting until it is deployed and working as expected. As we are not able to provide support for each users ELK problems, we put together a docker-compose which includes:
+ELK est un monde en soi, et pour les nouveaux venus sur la plateforme, cela nécessite des compétences de base sous Linux et généralement un peu de dépannage jusqu'à ce qu'il soit déployé et fonctionne comme prévu. Comme nous ne sommes pas en mesure de fournir un support pour les problèmes ELK de chaque utilisateur, nous avons mis en place un docker-compose qui inclut :
 
-- VulnWhisperer
-- Logstash 6.6
-- ElasticSearch 6.6
-- Kibana 6.6
+VulnWhisperer
+Logstash 6.6
+ElasticSearch 6.6
+Kibana 6.6
 
-The docker-compose just requires specifying the paths where the VulnWhisperer data will be saved, and where the config files reside. If ran directly after `git clone`, with just adding the Scanner config to the VulnWhisperer config file ([/resources/elk6/vulnwhisperer.ini](https://github.com/HASecuritySolutions/VulnWhisperer/blob/master/resources/elk6/vulnwhisperer.ini)), it will work out of the box.
+Le docker-compose nécessite simplement de spécifier les chemins où les données de VulnWhisperer seront sauvegardées, et où se trouvent les fichiers de configuration. S'il est exécuté directement après un git clone, en ajoutant simplement la configuration du scanner au fichier de configuration de VulnWhisperer (/resources/elk6/vulnwhisperer.ini), il fonctionnera immédiatement.
 
-It also takes care to load the Kibana Dashboards and Visualizations automatically through the API, which needs to be done manually otherwise at Kibana's startup.
+Il se charge également de charger automatiquement les tableaux de bord et les visualisations Kibana via l'API, ce qui doit être fait manuellement autrement au démarrage de Kibana.
 
-For more info about the docker-compose, check on the [docker-compose wiki](https://github.com/HASecuritySolutions/VulnWhisperer/wiki/docker-compose-Instructions) or the [FAQ](https://github.com/HASecuritySolutions/VulnWhisperer/wiki).
+Pour plus d'informations sur le docker-compose, consultez le wiki docker-compose ou la FAQ.
 
-Getting Started
-===============
-
-Our current Roadmap is as follows:
-- [ ] Create a Vulnerability Standard
-- [ ] Map every scanner results to the standard
-- [ ] Create Scanner module guidelines for easy integration of new scanners (consistency will allow #14)
-- [ ] Refactor the code to reuse functions and enable full compatibility among modules
-- [ ] Change Nessus CSV to JSON (Consistency and Fix #82)
-- [ ] Adapt single Logstash to standard and Kibana Dashboards
-- [ ] Implement Detectify Scanner
-- [ ] Implement Splunk Reporting/Dashboards
+Feuille de route
+Notre feuille de route actuelle est la suivante :
+[ ] Créer un standard de vulnérabilité
+[ ] Mapper les résultats de chaque scanner au standard
+[ ] Créer des directives de module de scanner pour une intégration facile de nouveaux scanners
+[ ] Refactoriser le code pour réutiliser les fonctions et permettre une compatibilité totale entre les modules
+[ ] Changer Nessus CSV en JSON
+[ ] Adapter le Logstash unique au standard et aux tableaux de bord Kibana
+[ ] Implémenter le scanner Detectify
+[ ] Implémenter le reporting/tableaux de bord Splunk
 
-On top of this, we try to focus on fixing bugs as soon as possible, which might delay the development. We also very welcome PR's, and once we have the new standard implemented, it will be very easy to add compatibility with new scanners.
+En plus de cela, nous essayons de nous concentrer sur la correction des bugs dès que possible, ce qui peut retarder le développement. Nous accueillons également très volontiers les PR (Pull Requests), et une fois que nous aurons implémenté le nouveau standard, il sera très facile d'ajouter la compatibilité avec de nouveaux scanners.
 
-The Vulnerability Standard will initially be a new simple one level JSON with all the information that matches from the different scanners having standardized variable names, while maintaining the rest of the variables as they are. In the future, once everything is implemented, we will evaluate moving to an existing standard like ECS or AWS Vulnerability Schema; we prioritize functionality over perfection.
+Le standard de vulnérabilité sera initialement un nouveau JSON simple à un niveau avec toutes les informations correspondantes des différents scanners ayant des noms de variables standardisés, tout en conservant le reste des variables telles quelles.
 
-Video Walkthrough -- Featured on ElasticWebinar
-----------------------------------------------
-<a href="http://www.youtube.com/watch?feature=player_embedded&v=zrEuTtRUfNw?start=30
-" target="_blank"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/elastic_webinar.png"
-alt="Elastic presentation on VulnWhisperer" border="10" /></a>
 
-Authors
-------
-- [Austin Taylor (@HuntOperator)](https://github.com/austin-taylor)
-- [Justin Henderson (@smapper)](https://github.com/SMAPPER)
 
-Contributors
-------------
-- [Quim Montal (@qmontal)](https://github.com/qmontal)
-- [@pemontto](https://github.com/pemontto)
-- [@cybergoof](https://github.com/cybergoof)
 
-AS SEEN ON TV
--------------
-<p align="center" style="width:400px"><a href="https://twitter.com/MalwareJake/status/935654519471353856"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/as_seen_on_tv.png" style="width:400px"></a></p>
@@ -83,14 +83,17 @@ def main():
         enabled_sections = config.get_sections_with_attribute('enabled')
 
         for section in enabled_sections:
-            vw = vulnWhisperer(config=args.config,
-                               profile=section,
-                               verbose=args.verbose,
-                               username=args.username,
-                               password=args.password,
-                               source=args.source,
-                               scanname=args.scanname)
-            exit_code += vw.whisper_vulnerabilities()
+            try:
+                vw = vulnWhisperer(config=args.config,
+                                   profile=section,
+                                   verbose=args.verbose,
+                                   username=args.username,
+                                   password=args.password,
+                                   source=args.source,
+                                   scanname=args.scanname)
+                exit_code += vw.whisper_vulnerabilities()
+            except Exception as e:
+                logger.error("VulnWhisperer was unable to perform the processing on '{}'".format(args.source))
     else:
         logger.info('Running vulnwhisperer for section {}'.format(args.section))
         vw = vulnWhisperer(config=args.config,
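The hunk above wraps each enabled section's run in a try/except so that one failing scanner module no longer aborts the whole loop. A minimal sketch of the pattern, isolated from the codebase (`whisper_all` and `run_section` are illustrative names, not VulnWhisperer's API; the real loop calls `vulnWhisperer(...).whisper_vulnerabilities()`):

```python
import logging

logger = logging.getLogger("vulnwhisperer")

def whisper_all(sections, run_section):
    """Run every enabled section, logging failures instead of aborting.

    Mirrors the patched loop in main(): an exception raised by one
    section is logged and the remaining sections still run.
    """
    exit_code = 0
    for section in sections:
        try:
            exit_code += run_section(section)
        except Exception:
            logger.error("VulnWhisperer was unable to perform the processing on '%s'", section)
    return exit_code

# The first section raises, the second still runs.
attempted = []
def fake_run(section):
    attempted.append(section)
    if section == "nessus":
        raise RuntimeError("scanner unreachable")
    return 0

print(whisper_all(["nessus", "qualys_web"], fake_run), attempted)
```

Note that, as in the patch, a failure only produces a log line; the aggregated exit code is unchanged by the failing section.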
@@ -2,6 +2,8 @@
 enabled=true
 hostname=localhost
 port=8834
+access_key=
+secret_key=
 username=nessus_username
 password=nessus_password
 write_path=/opt/VulnWhisperer/data/nessus/
@@ -13,6 +15,8 @@ verbose=true
 enabled=true
 hostname=cloud.tenable.com
 port=443
+access_key=
+secret_key=
 username=tenable.io_username
 password=tenable.io_password
 write_path=/opt/VulnWhisperer/data/tenable/
@@ -37,7 +41,7 @@ max_retries = 10
 template_id = 126024
 
 [qualys_vuln]
-#Reference https://www.qualys.com/docs/qualys-was-api-user-guide.pdf to find your API
+#Reference https://www.qualys.com/docs/qualys-api-vmpc-user-guide.pdf to find your API
 enabled = true
 hostname = qualysapi.qg2.apps.qualys.com
 username = exampleuser
@@ -2,6 +2,8 @@
 enabled=true
 hostname=nessus
 port=443
+access_key=
+secret_key=
 username=nessus_username
 password=nessus_password
 write_path=/opt/VulnWhisperer/data/nessus/
@@ -13,6 +15,8 @@ verbose=true
 enabled=true
 hostname=tenable
 port=443
+access_key=
+secret_key=
 username=tenable.io_username
 password=tenable.io_password
 write_path=/opt/VulnWhisperer/data/tenable/
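The config hunks above add empty `access_key=`/`secret_key=` options next to the existing username/password pairs, the credential style used by Nessus and Tenable.io API keys. A hedged sketch of how a consumer of this ini might prefer the key pair when both values are filled in (the `pick_credentials` helper is hypothetical, not VulnWhisperer's API; shown with Python 3's `configparser`, while the project itself targets Python 2.7):

```python
from configparser import ConfigParser

SAMPLE = """\
[nessus]
enabled=true
hostname=localhost
port=8834
access_key=0123abcd
secret_key=4567efgh
username=nessus_username
password=nessus_password
"""

def pick_credentials(section):
    # Hypothetical helper: use the API-key pair when both new options are
    # non-empty, otherwise fall back to basic username/password auth.
    if section.get("access_key") and section.get("secret_key"):
        return ("api_key", section["access_key"], section["secret_key"])
    return ("basic", section["username"], section["password"])

config = ConfigParser()
config.read_string(SAMPLE)
print(pick_credentials(config["nessus"])[0])  # api_key
```

Because the new options default to empty strings, existing configs that only set username/password keep working unchanged.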
@@ -1,12 +1,12 @@
 pandas==0.20.3
 setuptools==40.4.3
 pytz==2017.2
-Requests==2.18.3
+Requests==2.20.0
-lxml==4.1.1
+lxml==4.6.5
 future-fstrings
 bs4
 jira
 bottle
 coloredlogs
-qualysapi>=5.1.0
+qualysapi==6.0.0
 httpretty
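The requirements bumps above (Requests 2.18.3→2.20.0, lxml 4.1.1→4.6.5, qualysapi pinned to exactly 6.0.0) move off older releases and pin versions precisely; the Requests and lxml bumps in particular are likely security-motivated, though the diff itself does not say so. A small standard-library-only sketch for checking an installed version against such a pin (naive dotted-integer comparison, enough for pins like these but not for pre-release tags):

```python
def version_tuple(v):
    # Naive dotted-version parser; handles pins like "2.20.0" only.
    return tuple(int(part) for part in v.split("."))

# Pins taken from the requirements.txt hunk above.
PINS = {"Requests": "2.20.0", "lxml": "4.6.5", "qualysapi": "6.0.0"}

def meets_pin(name, installed):
    return version_tuple(installed) >= version_tuple(PINS[name])

print(meets_pin("Requests", "2.18.3"))  # False
print(meets_pin("lxml", "4.6.5"))       # True
```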
@@ -2,7 +2,7 @@
 # Email: austin@hasecuritysolutions.com
 # Last Update: 03/04/2018
 # Version 0.3
-# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash
+# Description: Take in Openvas web scan reports from vulnWhisperer and pumps into logstash
 
 input {
   file {
231 resources/elk6/logstash-vulnwhisperer-template_elk7.json (Executable file)
@@ -0,0 +1,231 @@
+{
+  "index_patterns": "logstash-vulnwhisperer-*",
+  "mappings": {
+    "properties": {
+      "@timestamp": {
+        "type": "date"
+      },
+      "@version": {
+        "type": "keyword"
+      },
+      "asset": {
+        "type": "text",
+        "norms": false,
+        "fields": {
+          "keyword": {
+            "type": "keyword",
+            "ignore_above": 256
+          }
+        }
+      },
+      "asset_uuid": {
+        "type": "keyword"
+      },
+      "assign_ip": {
+        "type": "ip"
+      },
+      "category": {
+        "type": "keyword"
+      },
+      "cve": {
+        "type": "keyword"
+      },
+      "cvss_base": {
+        "type": "float"
+      },
+      "cvss_temporal_vector": {
+        "type": "keyword"
+      },
+      "cvss_temporal": {
+        "type": "float"
+      },
+      "cvss_vector": {
+        "type": "keyword"
+      },
+      "cvss": {
+        "type": "float"
+      },
+      "cvss3_base": {
+        "type": "float"
+      },
+      "cvss3_temporal_vector": {
+        "type": "keyword"
+      },
+      "cvss3_temporal": {
+        "type": "float"
+      },
+      "cvss3_vector": {
+        "type": "keyword"
+      },
+      "cvss3": {
+        "type": "float"
+      },
+      "description": {
+        "fields": {
+          "keyword": {
+            "ignore_above": 256,
+            "type": "keyword"
+          }
+        },
+        "norms": false,
+        "type": "text"
+      },
+      "dns": {
+        "type": "keyword"
+      },
+      "exploitability": {
+        "fields": {
+          "keyword": {
+            "ignore_above": 256,
+            "type": "keyword"
+          }
+        },
+        "norms": false,
+        "type": "text"
+      },
+      "fqdn": {
+        "type": "keyword"
+      },
+      "geoip": {
+        "dynamic": true,
+        "type": "object",
+        "properties": {
+          "ip": {
+            "type": "ip"
+          },
+          "latitude": {
+            "type": "float"
+          },
+          "location": {
+            "type": "geo_point"
+          },
+          "longitude": {
+            "type": "float"
+          }
+        }
+      },
+      "history_id": {
+        "type": "keyword"
+      },
+      "host": {
+        "type": "keyword"
+      },
+      "host_end": {
+        "type": "date"
+      },
+      "host_start": {
+        "type": "date"
+      },
+      "impact": {
+        "fields": {
+          "keyword": {
+            "ignore_above": 256,
+            "type": "keyword"
+          }
+        },
+        "norms": false,
+        "type": "text"
+      },
+      "ip_status": {
+        "type": "keyword"
+      },
+      "ip": {
+        "type": "ip"
+      },
+      "last_updated": {
+        "type": "date"
+      },
+      "operating_system": {
+        "type": "keyword"
+      },
+      "path": {
+        "type": "keyword"
+      },
+      "pci_vuln": {
+        "type": "keyword"
+      },
+      "plugin_family": {
+        "type": "keyword"
+      },
+      "plugin_id": {
+        "type": "keyword"
+      },
+      "plugin_name": {
+        "type": "keyword"
+      },
+      "plugin_output": {
+        "fields": {
+          "keyword": {
+            "ignore_above": 256,
+            "type": "keyword"
+          }
+        },
+        "norms": false,
+        "type": "text"
+      },
+      "port": {
+        "type": "integer"
+      },
+      "protocol": {
+        "type": "keyword"
+      },
+      "results": {
+        "type": "text"
+      },
+      "risk_number": {
+        "type": "integer"
+      },
+      "risk_score_name": {
+        "type": "keyword"
+      },
+      "risk_score": {
+        "type": "float"
+      },
+      "risk": {
+        "type": "keyword"
+      },
+      "scan_id": {
+        "type": "keyword"
+      },
+      "scan_name": {
+        "type": "keyword"
+      },
+      "scan_reference": {
+        "type": "keyword"
+      },
+      "see_also": {
+        "type": "keyword"
+      },
+      "solution": {
+        "type": "keyword"
+      },
+      "source": {
+        "type": "keyword"
+      },
+      "ssl": {
+        "type": "keyword"
+      },
+      "synopsis": {
+        "type": "keyword"
+      },
+      "system_type": {
+        "type": "keyword"
+      },
+      "tags": {
+        "type": "keyword"
+      },
+      "threat": {
+        "type": "text"
+      },
+      "type": {
+        "type": "keyword"
+      },
+      "vendor_reference": {
+        "type": "keyword"
+      },
+      "vulnerability_state": {
+        "type": "keyword"
+      }
+    }
+  }
+}
@@ -31,7 +31,7 @@ class vwConfig(object):
         for section in self.config.sections():
             try:
                 if self.get(section, attribute) in check:
                     sections.append(section)
             except:
                 self.logger.warn("Section {} has no option '{}'".format(section, attribute))
         return sections
@@ -45,7 +45,7 @@ class vwConfig(object):
         return True
 
     def update_jira_profiles(self, profiles):
         # create JIRA profiles in the ini config file
         self.logger.debug('Updating Jira profiles: {}'.format(str(profiles)))
 
         for profile in profiles:
@@ -67,7 +67,7 @@ class vwConfig(object):
             self.config.set(section_name, 'min_critical_to_report', 'high')
             self.config.set(section_name, '; automatically report, boolean value ')
             self.config.set(section_name, 'autoreport', 'false')
 
         # TODO: try/catch this
         # writing changes back to file
         with open(self.config_in, 'w') as configfile:
@@ -24,15 +24,19 @@ class NessusAPI(object):
     EXPORT_STATUS = EXPORT + '/{file_id}/status'
     EXPORT_HISTORY = EXPORT + '?history_id={history_id}'
 
-    def __init__(self, hostname=None, port=None, username=None, password=None, verbose=True):
+    def __init__(self, hostname=None, port=None, username=None, password=None, verbose=True, profile=None, access_key=None, secret_key=None):
         self.logger = logging.getLogger('NessusAPI')
         if verbose:
             self.logger.setLevel(logging.DEBUG)
-        if username is None or password is None:
-            raise Exception('ERROR: Missing username or password.')
+        if not all((username, password)) and not all((access_key, secret_key)):
+            raise Exception('ERROR: Missing username, password or API keys.')
 
+        self.profile = profile
         self.user = username
         self.password = password
+        self.api_keys = False
+        self.access_key = access_key
+        self.secret_key = secret_key
         self.base = 'https://{hostname}:{port}'.format(hostname=hostname, port=port)
         self.verbose = verbose
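The reworked constructor accepts either a username/password pair or an access-key/secret-key pair, and only fails when neither pair is complete. A minimal standalone sketch of that validation logic (function name and sample values are illustrative, not from the changeset):

```python
# Return True when at least one complete credential pair is present,
# mirroring the `not all(...) and not all(...)` guard in __init__.
def has_valid_credentials(username=None, password=None,
                          access_key=None, secret_key=None):
    return all((username, password)) or all((access_key, secret_key))

# A missing password no longer fails when API keys are supplied.
print(has_valid_credentials(access_key="ak", secret_key="sk"))  # True
print(has_valid_credentials(username="admin"))                  # False
```

Note that `all(())` on a pair containing `None` or `""` is falsy, which is why empty-string credentials are also rejected.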
@@ -52,7 +56,13 @@ class NessusAPI(object):
             'X-Cookie': None
         }
 
-        self.login()
+        if all((self.access_key, self.secret_key)):
+            self.logger.debug('Using {} API keys'.format(self.profile))
+            self.api_keys = True
+            self.session.headers['X-ApiKeys'] = 'accessKey={}; secretKey={}'.format(self.access_key, self.secret_key)
+        else:
+            self.login()
 
         self.scans = self.get_scans()
         self.scan_ids = self.get_scan_ids()
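When both keys are present, the session is authenticated by a single `X-ApiKeys` header instead of a login-issued token. A minimal sketch of that setup (hostname and key values are placeholders, not from this changeset):

```python
import requests

# Key-based Nessus/Tenable authentication: one static header on the
# session, so no token refresh (login()) is needed on retries.
session = requests.Session()
access_key, secret_key = "ACCESS", "SECRET"
session.headers["X-ApiKeys"] = "accessKey={}; secretKey={}".format(access_key, secret_key)

# Every request through this session now carries the API keys, e.g.:
# session.get("https://nessus.example.local:8834/scans", verify=False)
```

This is why the retry loop later in the patch can simply `continue` when `self.api_keys` is set: there is no session token to re-acquire.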
@@ -67,7 +77,7 @@ class NessusAPI(object):
     def request(self, url, data=None, headers=None, method='POST', download=False, json_output=False):
         timeout = 0
         success = False
 
         method = method.lower()
         url = self.base + url
         self.logger.debug('Requesting to url {}'.format(url))
@@ -78,8 +88,10 @@ class NessusAPI(object):
             if url == self.base + self.SESSION:
                 break
             try:
-                self.login()
                 timeout += 1
+                if self.api_keys:
+                    continue
+                self.login()
                 self.logger.info('Token refreshed')
             except Exception as e:
                 self.logger.error('Could not refresh token\nReason: {}'.format(str(e)))
@@ -114,7 +126,7 @@ class NessusAPI(object):
         data = self.request(self.SCAN_ID.format(scan_id=scan_id), method='GET', json_output=True)
         return data['history']
 
-    def download_scan(self, scan_id=None, history=None, export_format="", profile=""):
+    def download_scan(self, scan_id=None, history=None, export_format=""):
         running = True
         counter = 0
 
@@ -127,7 +139,8 @@ class NessusAPI(object):
         req = self.request(query, data=json.dumps(data), method='POST', json_output=True)
         try:
             file_id = req['file']
-            token_id = req['token'] if 'token' in req else req['temp_token']
+            if self.profile == 'nessus':
+                token_id = req['token'] if 'token' in req else req['temp_token']
         except Exception as e:
             self.logger.error('{}'.format(str(e)))
         self.logger.info('Download for file id {}'.format(str(file_id)))
@@ -143,7 +156,7 @@ class NessusAPI(object):
             if counter % 60 == 0:
                 self.logger.info("Completed: {}".format(counter))
         self.logger.info("Done: {}".format(counter))
-        if profile == 'tenable':
+        if self.profile == 'tenable' or self.api_keys:
             content = self.request(self.EXPORT_FILE_DOWNLOAD.format(scan_id=scan_id, file_id=file_id), method='GET', download=True)
         else:
             content = self.request(self.EXPORT_TOKEN_DOWNLOAD.format(token_id=token_id), method='GET', download=True)
@@ -152,7 +165,7 @@ class NessusAPI(object):
     def get_utc_from_local(self, date_time, local_tz=None, epoch=True):
         date_time = datetime.fromtimestamp(date_time)
         if local_tz is None:
-            local_tz = pytz.timezone('US/Central')
+            local_tz = pytz.timezone('UTC')
         else:
             local_tz = pytz.timezone(local_tz)
         local_time = local_tz.normalize(local_tz.localize(date_time))
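The default timezone change matters: with no explicit `local_tz`, scanner timestamps are now treated as UTC instead of US/Central. A minimal sketch of the conversion path under that assumption (function name is illustrative):

```python
from datetime import datetime
import pytz

# Convert a naive epoch timestamp to an aware UTC datetime, assuming
# the source clock is UTC unless a timezone name is given.
def utc_from_local(epoch, local_tz_name=None):
    dt = datetime.fromtimestamp(epoch)
    tz = pytz.timezone(local_tz_name or "UTC")
    # localize() attaches the zone; normalize() fixes DST edge cases.
    local_time = tz.normalize(tz.localize(dt))
    return local_time.astimezone(pytz.utc)
```

`pytz` requires `localize()` rather than passing `tzinfo=` to the constructor; that is why the original code uses the `normalize(localize(...))` idiom.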
@@ -428,7 +428,7 @@ class qualysScanReport:
 
         merged_df = merged_df.drop(['QID_y', 'QID_x'], axis=1)
         merged_df = merged_df.rename(columns={'Id': 'QID'})
 
         merged_df = merged_df.assign(**df_dict['SCAN_META'].to_dict(orient='records')[0])
 
         merged_df = pd.merge(merged_df, df_dict['CATEGORY_HEADER'], how='left', left_on=['Category', 'Severity Level'],
@@ -67,18 +67,23 @@ class JiraAPI(object):
         if not exists:
             self.logger.error("Error creating Ticket: component {} not found".format(component))
             return 0
 
-        new_issue = self.jira.create_issue(project=project,
-                                           summary=title,
-                                           description=desc,
-                                           issuetype={'name': 'Bug'},
-                                           labels=labels,
-                                           components=components_ticket)
-
-        self.logger.info("Ticket {} created successfully".format(new_issue))
+        try:
+            new_issue = self.jira.create_issue(project=project,
+                                               summary=title,
+                                               description=desc,
+                                               issuetype={'name': 'Bug'},
+                                               labels=labels,
+                                               components=components_ticket)
+
+            self.logger.info("Ticket {} created successfully".format(new_issue))
 
-        if attachment_contents:
-            self.add_content_as_attachment(new_issue, attachment_contents)
+            if attachment_contents:
+                self.add_content_as_attachment(new_issue, attachment_contents)
+
+        except Exception as e:
+            self.logger.error("Failed to create ticket on Jira Project '{}'. Error: {}".format(project, e))
+            new_issue = False
 
         return new_issue
@@ -226,32 +231,44 @@ class JiraAPI(object):
     def ticket_get_unique_fields(self, ticket):
         title = ticket.raw.get('fields', {}).get('summary').encode("ascii").strip()
         ticketid = ticket.key.encode("ascii")
-        assets = []
-        try:
-            affected_assets_section = ticket.raw.get('fields', {}).get('description').encode("ascii").split("{panel:title=Affected Assets}")[1].split("{panel}")[0]
-            assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets_section)))
-        except Exception as e:
-            self.logger.error("Ticket IPs regex failed. Ticket ID: {}. Reason: {}".format(ticketid, e))
-            assets = []
-
-        try:
-            if not assets:
-                #check if attachment, if so, get assets from attachment
-                affected_assets_section = self.check_ips_attachment(ticket)
-                if affected_assets_section:
-                    assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets_section)))
-        except Exception as e:
-            self.logger.error("Ticket IPs Attachment regex failed. Ticket ID: {}. Reason: {}".format(ticketid, e))
+        assets = self.get_assets_from_description(ticket)
+        if not assets:
+            #check if attachment, if so, get assets from attachment
+            assets = self.get_assets_from_attachment(ticket)
 
         return ticketid, title, assets
 
-    def check_ips_attachment(self, ticket):
-        affected_assets_section = []
+    def get_assets_from_description(self, ticket, _raw = False):
+        # Get the assets as a string "host - protocol/port - hostname" separated by "\n"
+        # structure the text to have the same structure as the assets from the attachment
+        affected_assets = ""
+        try:
+            affected_assets = ticket.raw.get('fields', {}).get('description').encode("ascii").split("{panel:title=Affected Assets}")[1].split("{panel}")[0].replace('\n','').replace(' * ','\n').replace('\n', '', 1)
+        except Exception as e:
+            self.logger.error("Unable to process the Ticket's 'Affected Assets'. Ticket ID: {}. Reason: {}".format(ticket, e))
+
+        if affected_assets:
+            if _raw:
+                # from line 406 check if the text in the panel corresponds to having added an attachment
+                if "added as an attachment" in affected_assets:
+                    return False
+                return affected_assets
+
+            try:
+                # if _raw is not true, we return only the IPs of the affected assets
+                return list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets)))
+            except Exception as e:
+                self.logger.error("Ticket IPs regex failed. Ticket ID: {}. Reason: {}".format(ticket, e))
+        return False
+
+    def get_assets_from_attachment(self, ticket, _raw = False):
+        # Get the assets as a string "host - protocol/port - hostname" separated by "\n"
+        affected_assets = []
         try:
             fields = self.jira.issue(ticket.key).raw.get('fields', {})
             attachments = fields.get('attachment', {})
-            affected_assets_section = ""
+            affected_assets = ""
             #we will make sure we get the latest version of the file
             latest = ''
             attachment_id = ''
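Both helpers fall back to the same IPv4 regex when `_raw` is not set. It can be exercised standalone; the sample asset text below is hypothetical:

```python
import re

# The IPv4 extraction used by get_assets_from_description /
# get_assets_from_attachment; set() collapses duplicate hosts.
def extract_ips(text):
    return list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", text)))

assets = ("10.0.0.1 - tcp/443 - web01\n"
          "10.0.0.2 - tcp/22 - db01\n"
          "10.0.0.1 - udp/53 - web01")
print(sorted(extract_ips(assets)))  # ['10.0.0.1', '10.0.0.2']
```

Note the pattern is deliberately loose (it would also match `999.999.999.999`); it relies on the ticket text already containing well-formed addresses.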
@@ -265,12 +282,44 @@ class JiraAPI(object):
                 if latest < item.get('created'):
                     latest = item.get('created')
                     attachment_id = item.get('id')
-            affected_assets_section = self.jira.attachment(attachment_id).get()
+            affected_assets = self.jira.attachment(attachment_id).get()
 
         except Exception as e:
             self.logger.error("Failed to get assets from ticket attachment. Ticket ID: {}. Reason: {}".format(ticket, e))
 
-        return affected_assets_section
+        if affected_assets:
+            if _raw:
+                return affected_assets
+
+            try:
+                # if _raw is not true, we return only the IPs of the affected assets
+                affected_assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets)))
+                return affected_assets
+            except Exception as e:
+                self.logger.error("Ticket IPs Attachment regex failed. Ticket ID: {}. Reason: {}".format(ticket, e))
+
+        return False
+
+    def parse_asset_to_json(self, asset):
+        hostname, protocol, port = "", "", ""
+        asset_info = asset.split(" - ")
+        ip = asset_info[0]
+        proto_port = asset_info[1]
+        # in case there is some case where hostname is not reported at all
+        if len(asset_info) == 3:
+            hostname = asset_info[2]
+        if proto_port != "N/A/N/A":
+            protocol, port = proto_port.split("/")
+            port = int(float(port))
+
+        asset_dict = {
+            "host": ip,
+            "protocol": protocol,
+            "port": port,
+            "hostname": hostname
+        }
+
+        return asset_dict
 
     def clean_old_attachments(self, ticket):
         fields = ticket.raw.get('fields')
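The new `parse_asset_to_json` helper can be run standalone; below is a module-level copy fed a hypothetical "host - protocol/port - hostname" line (the `443.0` port shows why the code goes through `int(float(...))`):

```python
# Standalone copy of the parse_asset_to_json helper added above.
def parse_asset_to_json(asset):
    hostname, protocol, port = "", "", ""
    asset_info = asset.split(" - ")
    ip = asset_info[0]
    proto_port = asset_info[1]
    # hostname may be missing entirely from the asset line
    if len(asset_info) == 3:
        hostname = asset_info[2]
    if proto_port != "N/A/N/A":
        protocol, port = proto_port.split("/")
        port = int(float(port))  # tolerate "443.0"-style ports
    return {"host": ip, "protocol": protocol, "port": port, "hostname": hostname}

print(parse_asset_to_json("10.0.0.1 - tcp/443.0 - web01"))
# {'host': '10.0.0.1', 'protocol': 'tcp', 'port': 443, 'hostname': 'web01'}
```

When the protocol/port field is the literal `N/A/N/A`, both `protocol` and `port` stay empty strings rather than failing the split.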
@@ -441,7 +490,7 @@ class JiraAPI(object):
             if transition.get('name') == self.JIRA_REOPEN_ISSUE:
                 self.logger.debug("Ticket is reopenable")
                 return True
-        self.logger.warn("Ticket can't be opened. Check Jira transitions.")
+        self.logger.error("Ticket {} can't be opened. Check Jira transitions.".format(ticket_obj))
         return False
 
     def is_ticket_closeable(self, ticket_obj):
@@ -449,7 +498,7 @@ class JiraAPI(object):
         for transition in transitions:
             if transition.get('name') == self.JIRA_CLOSE_ISSUE:
                 return True
-        self.logger.warn("Ticket can't closed. Check Jira transitions.")
+        self.logger.error("Ticket {} can't closed. Check Jira transitions.".format(ticket_obj))
         return False
 
     def is_ticket_resolved(self, ticket_obj):
@@ -522,7 +571,7 @@ class JiraAPI(object):
     def close_obsolete_tickets(self):
         # Close tickets older than 12 months, vulnerabilities not solved will get created a new ticket
         self.logger.info("Closing obsolete tickets older than {} months".format(self.max_time_tracking))
-        jql = "labels=vulnerability_management AND created <startOfMonth(-{}) and resolution=Unresolved".format(self.max_time_tracking)
+        jql = "labels=vulnerability_management AND NOT labels=advisory AND created <startOfMonth(-{}) and resolution=Unresolved".format(self.max_time_tracking)
         tickets_to_close = self.jira.search_issues(jql, maxResults=0)
 
         comment = '''This ticket is being closed for hygiene, as it is more than {} months old.
@@ -553,8 +602,35 @@ class JiraAPI(object):
             return True
         try:
             self.logger.info("Saving locally tickets from the last {} months".format(self.max_time_tracking))
-            jql = "labels=vulnerability_management AND created >=startOfMonth(-{})".format(self.max_time_tracking)
+            jql = "labels=vulnerability_management AND NOT labels=advisory AND created >=startOfMonth(-{})".format(self.max_time_tracking)
             tickets_data = self.jira.search_issues(jql, maxResults=0)
 
+            #TODO process tickets, creating a new field called "_metadata" with all the affected assets well structured
+            # for future processing in ELK/Splunk; this includes downloading attachments with assets and processing them
+
+            processed_tickets = []
+
+            for ticket in tickets_data:
+                assets = self.get_assets_from_description(ticket, _raw=True)
+                if not assets:
+                    # check if attachment, if so, get assets from attachment
+                    assets = self.get_assets_from_attachment(ticket, _raw=True)
+                # process the affected assets to save them as json structure on a new field from the JSON
+                _metadata = {"affected_hosts": []}
+                if assets:
+                    if "\n" in assets:
+                        for asset in assets.split("\n"):
+                            assets_json = self.parse_asset_to_json(asset)
+                            _metadata["affected_hosts"].append(assets_json)
+                    else:
+                        assets_json = self.parse_asset_to_json(assets)
+                        _metadata["affected_hosts"].append(assets_json)
+
+                temp_ticket = ticket.raw.get('fields')
+                temp_ticket['_metadata'] = _metadata
+
+                processed_tickets.append(temp_ticket)
+
             #end of line needed, as writelines() doesn't add it automatically, otherwise one big line
             to_save = [json.dumps(ticket.raw.get('fields'))+"\n" for ticket in tickets_data]
@@ -566,7 +642,7 @@ class JiraAPI(object):
 
         except Exception as e:
             self.logger.error("Tickets could not be saved locally: {}.".format(e))
 
         return False
 
     def decommission_cleanup(self):
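The `_metadata` assembly above takes the raw newline-separated asset text and turns each line into a structured host record. A self-contained sketch of that loop, using a simplified stand-in for `parse_asset_to_json` (names and sample data are illustrative):

```python
# Stand-in for the changeset's parse_asset_to_json helper, reduced to
# the fields needed here.
def parse_asset(asset):
    ip, proto_port = asset.split(" - ")[:2]
    protocol, port = proto_port.split("/")
    return {"host": ip, "protocol": protocol, "port": int(float(port))}

# Mirror of the _metadata construction: one affected_hosts entry per
# asset line, handling both single- and multi-line asset strings.
def build_metadata(assets):
    _metadata = {"affected_hosts": []}
    if assets:
        lines = assets.split("\n") if "\n" in assets else [assets]
        for asset in lines:
            _metadata["affected_hosts"].append(parse_asset(asset))
    return _metadata

print(len(build_metadata("10.0.0.1 - tcp/443 - a\n10.0.0.2 - tcp/22 - b")["affected_hosts"]))  # 2
```

The resulting `_metadata` field is what makes the locally saved tickets directly usable downstream in ELK/Splunk, as the TODO comment describes.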
@@ -35,13 +35,13 @@ class mockAPI(object):
         elif 'fetch' in request.parsed_body['action']:
             try:
                 response_body = open('{}/{}'.format(
                     self.qualys_vuln_path,
                     request.parsed_body['scan_ref'][0].replace('/', '_'))
                 ).read()
             except:
                 # Can't find the file, just send an empty response
                 response_body = ''
         return [200, response_headers, response_body]
 
     def create_nessus_resource(self, framework):
         for filename in self.get_files('{}/{}'.format(self.mock_dir, framework)):
@@ -60,7 +60,7 @@ class mockAPI(object):
             httpretty.GET,
             'https://{}:443/{}'.format(framework, 'msp/about.php'),
             body='')
 
         self.logger.debug('Adding mocked {} endpoint {} {}'.format(framework, 'POST', 'api/2.0/fo/scan'))
         httpretty.register_uri(
             httpretty.POST, 'https://{}:443/{}'.format(framework, 'api/2.0/fo/scan/'),
@@ -55,8 +55,12 @@ class vulnWhispererBase(object):
         except:
             self.enabled = False
         self.hostname = self.config.get(self.CONFIG_SECTION, 'hostname')
-        self.username = self.config.get(self.CONFIG_SECTION, 'username')
-        self.password = self.config.get(self.CONFIG_SECTION, 'password')
+        try:
+            self.username = self.config.get(self.CONFIG_SECTION, 'username')
+            self.password = self.config.get(self.CONFIG_SECTION, 'password')
+        except:
+            self.username = None
+            self.password = None
         self.write_path = self.config.get(self.CONFIG_SECTION, 'write_path')
         self.db_path = self.config.get(self.CONFIG_SECTION, 'db_path')
         self.verbose = self.config.getbool(self.CONFIG_SECTION, 'verbose')
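The try/except makes `username`/`password` optional, since API-key configurations omit them. The standard library's `configparser` expresses the same idea with the `fallback` argument; the section and option names below are illustrative, not the project's actual config:

```python
import configparser

# Credentials are optional when API keys are configured; fallback=None
# replaces the try/except-around-get pattern used above.
config = configparser.ConfigParser()
config.read_string("""
[nessus]
hostname = nessus.example.local
access_key = ACCESS
secret_key = SECRET
""")

username = config.get("nessus", "username", fallback=None)
password = config.get("nessus", "password", fallback=None)
print(username, password)  # None None
```

(The project targets Python 2's `ConfigParser`, which lacks `fallback`; that is presumably why the patch uses a bare try/except instead.)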
@@ -144,7 +148,7 @@ class vulnWhispererBase(object):
 
     def record_insert(self, record):
         #for backwards compatibility with older versions without "reported" field
 
         try:
             #-1 to get the latest column, 1 to get the column name (old version would be "processed", new "reported")
             #TODO delete backward compatibility check after some versions
@@ -171,7 +175,7 @@ class vulnWhispererBase(object):
             return True
         except Exception as e:
             self.logger.error('Failed while setting scan with file {} as processed'.format(filename))
 
         return False
 
     def retrieve_uuids(self):
@@ -200,7 +204,8 @@ class vulnWhispererBase(object):
     def get_latest_results(self, source, scan_name):
         processed = 0
         results = []
+        reported = ""
 
         try:
             self.conn.text_factory = str
             self.cur.execute('SELECT filename FROM scan_history WHERE source="{}" AND scan_name="{}" ORDER BY last_modified DESC LIMIT 1;'.format(source, scan_name))
@@ -218,11 +223,12 @@ class vulnWhispererBase(object):
 
         except Exception as e:
             self.logger.error("Error when getting latest results from {}.{} : {}".format(source, scan_name, e))
 
         return results, reported
 
     def get_scan_profiles(self):
         # Returns a list of source.scan_name elements from the database
 
         # we get the list of sources
         try:
             self.conn.text_factory = str
@@ -231,7 +237,7 @@ class vulnWhispererBase(object):
         except:
             sources = []
             self.logger.error("Process failed at executing 'SELECT DISTINCT source FROM scan_history;'")
 
         results = []
 
         # we get the list of scans within each source
@@ -274,6 +280,8 @@ class vulnWhispererNessus(vulnWhispererBase):
 
         self.develop = True
         self.purge = purge
+        self.access_key = None
+        self.secret_key = None
 
         if config is not None:
             try:
@@ -283,24 +291,36 @@ class vulnWhispererNessus(vulnWhispererBase):
                     'trash')
 
                 try:
-                    self.logger.info('Attempting to connect to nessus...')
+                    self.access_key = self.config.get(self.CONFIG_SECTION,'access_key')
+                    self.secret_key = self.config.get(self.CONFIG_SECTION,'secret_key')
+                except:
+                    pass
+
+                try:
+                    self.logger.info('Attempting to connect to {}...'.format(self.CONFIG_SECTION))
                     self.nessus = \
                         NessusAPI(hostname=self.hostname,
                                   port=self.nessus_port,
                                   username=self.username,
-                                  password=self.password)
+                                  password=self.password,
+                                  profile=self.CONFIG_SECTION,
+                                  access_key=self.access_key,
+                                  secret_key=self.secret_key
+                                  )
                     self.nessus_connect = True
-                    self.logger.info('Connected to nessus on {host}:{port}'.format(host=self.hostname,
+                    self.logger.info('Connected to {} on {host}:{port}'.format(self.CONFIG_SECTION, host=self.hostname,
                                                                                    port=str(self.nessus_port)))
                 except Exception as e:
                     self.logger.error('Exception: {}'.format(str(e)))
                     raise Exception(
-                        'Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
+                        'Could not connect to {} -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
+                            self.CONFIG_SECTION,
                             config=self.config.config_in,
                             e=e))
             except Exception as e:
                 self.logger.error('Could not properly load your config!\nReason: {e}'.format(e=e))
-                sys.exit(1)
+                return False
+                #sys.exit(1)
@@ -435,20 +455,21 @@ class vulnWhispererNessus(vulnWhispererBase):
                 try:
                     file_req = \
                         self.nessus.download_scan(scan_id=scan_id, history=history_id,
-                                                  export_format='csv', profile=self.CONFIG_SECTION)
+                                                  export_format='csv')
                 except Exception as e:
                     self.logger.error('Could not download {} scan {}: {}'.format(self.CONFIG_SECTION, scan_id, str(e)))
                     self.exit_code += 1
                     continue

                 clean_csv = \
                     pd.read_csv(io.StringIO(file_req.decode('utf-8')))
                 if len(clean_csv) > 2:
                     self.logger.info('Processing {}/{} for scan: {}'.format(scan_count, len(scan_list), scan_name.encode('utf8')))
-                    columns_to_cleanse = ['CVSS','CVE','Description','Synopsis','Solution','See Also','Plugin Output']
+                    columns_to_cleanse = ['CVSS','CVE','Description','Synopsis','Solution','See Also','Plugin Output', 'MAC Address']

                     for col in columns_to_cleanse:
-                        clean_csv[col] = clean_csv[col].astype(str).apply(self.cleanser)
+                        if col in clean_csv:
+                            clean_csv[col] = clean_csv[col].astype(str).apply(self.cleanser)

                     clean_csv.to_csv(relative_path_name, index=False)
                     record_meta = (
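The membership guard added above (`if col in clean_csv:`) matters when a scanner export lacks one of the listed columns, such as the newly added 'MAC Address': indexing a missing column would raise a `KeyError` and abort the scan. A minimal standalone sketch of the same pattern (the sample CSV and `cleanser` body are hypothetical stand-ins, not the module's own):

```python
import io
import pandas as pd

# Hypothetical sample: a Nessus-style CSV export that lacks the 'MAC Address' column.
csv_data = "CVSS,CVE,Description\n9.8,CVE-2021-44228,Apache Log4j2 JNDI lookup\n"
clean_csv = pd.read_csv(io.StringIO(csv_data))

columns_to_cleanse = ['CVSS', 'CVE', 'Description', 'MAC Address']

def cleanser(value):
    # Stand-in for the module's cleanser: strip characters that break CSV re-parsing.
    return value.replace(',', ';').replace('\n', ' ')

for col in columns_to_cleanse:
    # Membership test on a DataFrame checks column labels, so a missing
    # column is silently skipped instead of raising KeyError.
    if col in clean_csv:
        clean_csv[col] = clean_csv[col].astype(str).apply(cleanser)

print(list(clean_csv.columns))  # → ['CVSS', 'CVE', 'Description']
```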
@@ -555,8 +576,11 @@ class vulnWhispererQualys(vulnWhispererBase):
         self.logger = logging.getLogger('vulnWhispererQualys')
         if debug:
             self.logger.setLevel(logging.DEBUG)
+        try:
             self.qualys_scan = qualysScanReport(config=config)
+        except Exception as e:
+            self.logger.error("Unable to establish connection with Qualys scanner. Reason: {}".format(e))
+            return False
         self.latest_scans = self.qualys_scan.qw.get_all_scans()
         self.directory_check()
         self.scans_to_process = None
@@ -642,8 +666,7 @@ class vulnWhispererQualys(vulnWhispererBase):

             if cleanup:
                 self.logger.info('Removing report {} from Qualys Database'.format(generated_report_id))
-                cleaning_up = \
-                    self.qualys_scan.qw.delete_report(generated_report_id)
+                cleaning_up = self.qualys_scan.qw.delete_report(generated_report_id)
                 os.remove(self.path_check(str(generated_report_id) + '.csv'))
                 self.logger.info('Deleted report from local disk: {}'.format(self.path_check(str(generated_report_id))))
             else:
@@ -728,10 +751,14 @@ class vulnWhispererOpenVAS(vulnWhispererBase):
         self.develop = True
         self.purge = purge
         self.scans_to_process = None
-        self.openvas_api = OpenVAS_API(hostname=self.hostname,
-                                       port=self.port,
-                                       username=self.username,
-                                       password=self.password)
+        try:
+            self.openvas_api = OpenVAS_API(hostname=self.hostname,
+                                           port=self.port,
+                                           username=self.username,
+                                           password=self.password)
+        except Exception as e:
+            self.logger.error("Unable to establish connection with OpenVAS scanner. Reason: {}".format(e))
+            return False

     def whisper_reports(self, output_format='json', launched_date=None, report_id=None, cleanup=True):
         report = None
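The hunks above (and the matching Qualys ones) replace hard exits during scanner-client construction with a logged error and an early return, so one unreachable scanner no longer aborts the whole run. A minimal sketch of that guard pattern, with hypothetical names (`FlakyScannerAPI`, `connect_scanner` are illustrations, not the project's API):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('connection_guard')

class FlakyScannerAPI:
    """Hypothetical client whose constructor raises on bad settings."""
    def __init__(self, hostname):
        if not hostname:
            raise ValueError('no hostname configured')
        self.hostname = hostname

def connect_scanner(hostname):
    # Guard pattern from the diff: log the failure and hand back a falsy
    # value so the caller can test `if client:` and keep processing.
    try:
        return FlakyScannerAPI(hostname)
    except Exception as e:
        logger.error('Unable to establish connection with scanner. Reason: {}'.format(e))
        return False

client = connect_scanner('')             # falsy: constructor raised
backup = connect_scanner('ov.internal')  # real client object
```

The caller-side check mirrors the `if vw:` guards added in the dispatch hunk at the bottom of this diff.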
@@ -837,13 +864,16 @@ class vulnWhispererQualysVuln(vulnWhispererBase):
                  username=None,
                  password=None,
                  ):

         super(vulnWhispererQualysVuln, self).__init__(config=config)
         self.logger = logging.getLogger('vulnWhispererQualysVuln')
         if debug:
             self.logger.setLevel(logging.DEBUG)
+        try:
             self.qualys_scan = qualysVulnScan(config=config)
+        except Exception as e:
+            self.logger.error("Unable to create connection with Qualys. Reason: {}".format(e))
+            return False
         self.directory_check()
         self.scans_to_process = None

@@ -854,7 +884,7 @@ class vulnWhispererQualysVuln(vulnWhispererBase):
                         scan_reference=None,
                         output_format='json',
                         cleanup=True):
-        launched_date
+
         if 'Z' in launched_date:
             launched_date = self.qualys_scan.utils.iso_to_epoch(launched_date)
         report_name = 'qualys_vuln_' + report_id.replace('/','_') \
@@ -966,8 +996,15 @@ class vulnWhispererJIRA(vulnWhispererBase):
         self.config_path = config
         self.config = vwConfig(config)
         self.host_resolv_cache = {}
-        self.directory_check()
+        self.host_no_resolv = []
+        self.no_resolv_by_team_dict = {}
+        #Save locally those assets without DNS entry for flag to system owners
+        self.no_resolv_fname="no_resolv.txt"
+        if os.path.isfile(self.no_resolv_fname):
+            with open(self.no_resolv_fname, "r") as json_file:
+                self.no_resolv_by_team_dict = json.load(json_file)
+        self.directory_check()

         if config is not None:
             try:
                 self.logger.info('Attempting to connect to jira...')
@@ -983,17 +1020,18 @@ class vulnWhispererJIRA(vulnWhispererBase):
                 raise Exception(
                     'Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
                         config=self.config.config_in, e=e))
-                sys.exit(1)
+                return False
+                #sys.exit(1)

         profiles = []
         profiles = self.get_scan_profiles()

         if not self.config.exists_jira_profiles(profiles):
             self.config.update_jira_profiles(profiles)
             self.logger.info("Jira profiles have been created in {config}, please fill the variables before rerunning the module.".format(config=self.config_path))
             sys.exit(0)


     def get_env_variables(self, source, scan_name):
         # function returns an array with [jira_project, jira_components, datafile_path]

@@ -1004,32 +1042,32 @@ class vulnWhispererJIRA(vulnWhispererBase):
         if project == "":
             self.logger.error('JIRA project is missing on the configuration file!')
             sys.exit(0)

         # check that project actually exists
         if not self.jira.project_exists(project):
             self.logger.error("JIRA project '{project}' doesn't exist!".format(project=project))
             sys.exit(0)

         components = self.config.get(jira_section,'components').split(',')

         #cleaning empty array from ''
         if not components[0]:
             components = []

         min_critical = self.config.get(jira_section,'min_critical_to_report')
         if not min_critical:
             self.logger.error('"min_critical_to_report" variable on config file is empty.')
             sys.exit(0)

         #datafile path
         filename, reported = self.get_latest_results(source, scan_name)
         fullpath = ""

         # search data files under user specified directory
         for root, dirnames, filenames in os.walk(vwConfig(self.config_path).get(source,'write_path')):
             if filename in filenames:
                 fullpath = "{}/{}".format(root,filename)

         if reported:
             self.logger.warn('Last Scan of "{scan_name}" for source "{source}" has already been reported; will be skipped.'.format(scan_name=scan_name, source=source))
             return [False] * 5
@@ -1037,7 +1075,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
         if not fullpath:
             self.logger.error('Scan of "{scan_name}" for source "{source}" has not been found. Please check that the scanner data files are in place.'.format(scan_name=scan_name, source=source))
             sys.exit(1)

         dns_resolv = self.config.get('jira','dns_resolv')
         if dns_resolv in ('False', 'false', ''):
             dns_resolv = False
@@ -1051,22 +1089,22 @@ class vulnWhispererJIRA(vulnWhispererBase):


     def parse_nessus_vulnerabilities(self, fullpath, source, scan_name, min_critical):

         vulnerabilities = []

         # we need to parse the CSV
         risks = ['none', 'low', 'medium', 'high', 'critical']
         min_risk = int([i for i,x in enumerate(risks) if x == min_critical][0])

         df = pd.read_csv(fullpath, delimiter=',')

         #nessus fields we want - ['Host','Protocol','Port', 'Name', 'Synopsis', 'Description', 'Solution', 'See Also']
         for index in range(len(df)):
             # filtering vulnerabilities by criticality, discarding low risk
             to_report = int([i for i,x in enumerate(risks) if x == df.loc[index]['Risk'].lower()][0])
             if to_report < min_risk:
                 continue

             if not vulnerabilities or df.loc[index]['Name'] not in [entry['title'] for entry in vulnerabilities]:
                 vuln = {}
                 #vulnerabilities should have all the info for creating all JIRA labels
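The parser above turns the configured `min_critical_to_report` name into a numeric threshold via its position in the ordered `risks` list, then drops any row whose own risk index falls below it. A standalone sketch of that filter (sample findings are hypothetical; `list.index` is used as a shorthand equivalent of the enumerate comprehension in the source):

```python
risks = ['none', 'low', 'medium', 'high', 'critical']
min_critical = 'high'

# Position in the ordered list is the severity rank, same trick as the parser.
min_risk = risks.index(min_critical)

findings = [
    {'Name': 'TLS 1.0 enabled', 'Risk': 'Medium'},
    {'Name': 'Apache RCE', 'Risk': 'Critical'},
    {'Name': 'SMBv1 enabled', 'Risk': 'High'},
]

# Keep only findings at or above the configured threshold.
reported = [f for f in findings if risks.index(f['Risk'].lower()) >= min_risk]
print([f['Name'] for f in reported])  # → ['Apache RCE', 'SMBv1 enabled']
```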
@@ -1080,7 +1118,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
                 vuln['ips'] = []
                 vuln['ips'].append("{} - {}/{}".format(df.loc[index]['Host'], df.loc[index]['Protocol'], df.loc[index]['Port']))
                 vuln['risk'] = df.loc[index]['Risk'].lower()

                 # Nessus "nan" value gets automatically casted to float by python
                 if not (type(df.loc[index]['See Also']) is float):
                     vuln['references'] = df.loc[index]['See Also'].split("\\n")
@@ -1093,24 +1131,24 @@ class vulnWhispererJIRA(vulnWhispererBase):
             for vuln in vulnerabilities:
                 if vuln['title'] == df.loc[index]['Name']:
                     vuln['ips'].append("{} - {}/{}".format(df.loc[index]['Host'], df.loc[index]['Protocol'], df.loc[index]['Port']))

         return vulnerabilities

     def parse_qualys_vuln_vulnerabilities(self, fullpath, source, scan_name, min_critical, dns_resolv = False):
         #parsing of the qualys vulnerabilities schema
         #parse json
         vulnerabilities = []

         risks = ['info', 'low', 'medium', 'high', 'critical']
         # +1 as array is 0-4, but score is 1-5
         min_risk = int([i for i,x in enumerate(risks) if x == min_critical][0])+1

         try:
             data=[json.loads(line) for line in open(fullpath).readlines()]
         except Exception as e:
             self.logger.warn("Scan has no vulnerabilities, skipping.")
             return vulnerabilities

         #qualys fields we want - []
         for index in range(len(data)):
             if int(data[index]['risk']) < min_risk:
@@ -1119,7 +1157,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
             elif data[index]['type'] == 'Practice' or data[index]['type'] == 'Ig':
                 self.logger.debug("Vulnerability '{vuln}' ignored, as it is 'Practice/Potential', not verified.".format(vuln=data[index]['plugin_name']))
                 continue

             if not vulnerabilities or data[index]['plugin_name'] not in [entry['title'] for entry in vulnerabilities]:
                 vuln = {}
                 #vulnerabilities should have all the info for creating all JIRA labels
@@ -1132,12 +1170,12 @@ class vulnWhispererJIRA(vulnWhispererBase):
                 vuln['solution'] = data[index]['solution'].replace('\\n',' ')
                 vuln['ips'] = []
                 #TODO ADDED DNS RESOLUTION FROM QUALYS! \n SEPARATORS INSTEAD OF \\n!

                 vuln['ips'].append("{ip} - {protocol}/{port} - {dns}".format(**self.get_asset_fields(data[index], dns_resolv)))

                 #different risk system than Nessus!
                 vuln['risk'] = risks[int(data[index]['risk'])-1]

                 # Nessus "nan" value gets automatically casted to float by python
                 if not (type(data[index]['vendor_reference']) is float or data[index]['vendor_reference'] == None):
                     vuln['references'] = data[index]['vendor_reference'].split("\\n")
@@ -1155,8 +1193,8 @@ class vulnWhispererJIRA(vulnWhispererBase):
     def get_asset_fields(self, vuln, dns_resolv):
         values = {}
         values['ip'] = vuln['ip']
         values['protocol'] = vuln['protocol']
         values['port'] = vuln['port']
         values['dns'] = ''
         if dns_resolv:
             if vuln['dns']:
@@ -1173,6 +1211,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
                     self.logger.debug("Hostname found: {hostname}.".format(hostname=values['dns']))
                 except:
                     self.host_resolv_cache[values['ip']] = ''
+                    self.host_no_resolv.append(values['ip'])
                     self.logger.debug("Hostname not found for: {ip}.".format(ip=values['ip']))

         for key in values.keys():
@@ -1206,18 +1245,31 @@ class vulnWhispererJIRA(vulnWhispererBase):
         #***Qualys VM parsing***
         if source == "qualys_vuln":
             vulnerabilities = self.parse_qualys_vuln_vulnerabilities(fullpath, source, scan_name, min_critical, dns_resolv)

         #***JIRA sync***
-        if vulnerabilities:
-            self.logger.info('{source} data has been successfuly parsed'.format(source=source.upper()))
-            self.logger.info('Starting JIRA sync')
-            self.jira.sync(vulnerabilities, project, components)
-        else:
-            self.logger.info("[{source}.{scan_name}] No vulnerabilities or vulnerabilities not parsed.".format(source=source, scan_name=scan_name))
-            self.set_latest_scan_reported(fullpath.split("/")[-1])
+        try:
+            if vulnerabilities:
+                self.logger.info('{source} data has been successfuly parsed'.format(source=source.upper()))
+                self.logger.info('Starting JIRA sync')
+                self.jira.sync(vulnerabilities, project, components)
+            else:
+                self.logger.info("[{source}.{scan_name}] No vulnerabilities or vulnerabilities not parsed.".format(source=source, scan_name=scan_name))
+                self.set_latest_scan_reported(fullpath.split("/")[-1])
+                return False
+        except Exception as e:
+            self.logger.error("Error: {}".format(e))
             return False


+        #writing to file those assets without DNS resolution
+        #if its not empty
+        if self.host_no_resolv:
+            #we will replace old list of non resolved for the new one or create if it doesn't exist already
+            self.no_resolv_by_team_dict[scan_name] = self.host_no_resolv
+            with open(self.no_resolv_fname, 'w') as outfile:
+                json.dump(self.no_resolv_by_team_dict, outfile)

         self.set_latest_scan_reported(fullpath.split("/")[-1])
         return True

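The constructor hunk earlier loads `no_resolv.txt` when it already exists, and the sync hunk above rewrites the current scan's entry after each run, so unresolved assets survive across executions. The round-trip can be sketched as follows (the temp path, scan name, and IPs are hypothetical sample data):

```python
import json
import os
import tempfile

# Hypothetical stand-in for no_resolv.txt: a JSON dict of scan_name -> unresolved IPs.
no_resolv_fname = os.path.join(tempfile.mkdtemp(), 'no_resolv.txt')

# Constructor side: load the previous run's dict if the file exists.
no_resolv_by_team_dict = {}
if os.path.isfile(no_resolv_fname):
    with open(no_resolv_fname, 'r') as json_file:
        no_resolv_by_team_dict = json.load(json_file)

# Sync side: replace this scan's list of hosts that failed DNS resolution.
host_no_resolv = ['10.0.0.5', '10.0.0.9']
if host_no_resolv:
    no_resolv_by_team_dict['weekly_dmz_scan'] = host_no_resolv
    with open(no_resolv_fname, 'w') as outfile:
        json.dump(no_resolv_by_team_dict, outfile)

with open(no_resolv_fname) as f:
    print(json.load(f))  # → {'weekly_dmz_scan': ['10.0.0.5', '10.0.0.9']}
```

Keying the file by scan name means each team's stale list is overwritten on its next run without clobbering other scans' entries.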
@@ -1226,7 +1278,13 @@ class vulnWhispererJIRA(vulnWhispererBase):

         if autoreport_sections:
             for scan in autoreport_sections:
-                self.jira_sync(self.config.get(scan, 'source'), self.config.get(scan, 'scan_name'))
+                try:
+                    self.jira_sync(self.config.get(scan, 'source'), self.config.get(scan, 'scan_name'))
+                except Exception as e:
+                    self.logger.error(
+                        "VulnWhisperer wasn't able to report the vulnerabilities from the '{}'s source, section {}.\
+                        \nError: {}".format(
+                            self.config.get(scan, 'source'), self.config.get(scan, 'scan_name'), e))
             return True
         return False

@@ -1258,43 +1316,43 @@ class vulnWhisperer(object):

        if self.profile == 'nessus':
            vw = vulnWhispererNessus(config=self.config,
-                                     username=self.username,
-                                     password=self.password,
-                                     verbose=self.verbose,
                                     profile=self.profile)
-            self.exit_code += vw.whisper_nessus()
+            if vw:
+                self.exit_code += vw.whisper_nessus()

        elif self.profile == 'qualys_web':
            vw = vulnWhispererQualys(config=self.config)
-            self.exit_code += vw.process_web_assets()
+            if vw:
+                self.exit_code += vw.process_web_assets()

        elif self.profile == 'openvas':
-            vw_openvas = vulnWhispererOpenVAS(config=self.config)
-            self.exit_code += vw_openvas.process_openvas_scans()
+            vw = vulnWhispererOpenVAS(config=self.config)
+            if vw:
+                self.exit_code += vw.process_openvas_scans()

        elif self.profile == 'tenable':
            vw = vulnWhispererNessus(config=self.config,
-                                     username=self.username,
-                                     password=self.password,
-                                     verbose=self.verbose,
                                     profile=self.profile)
-            self.exit_code += vw.whisper_nessus()
+            if vw:
+                self.exit_code += vw.whisper_nessus()

        elif self.profile == 'qualys_vuln':
            vw = vulnWhispererQualysVuln(config=self.config)
-            self.exit_code += vw.process_vuln_scans()
+            if vw:
+                self.exit_code += vw.process_vuln_scans()

        elif self.profile == 'jira':
            #first we check config fields are created, otherwise we create them
            vw = vulnWhispererJIRA(config=self.config)
-            if not (self.source and self.scanname):
-                self.logger.info('No source/scan_name selected, all enabled scans will be synced')
-                success = vw.sync_all()
-                if not success:
-                    self.logger.error('All scans sync failed!')
-                    self.logger.error('Source scanner and scan name needed!')
-                    return 0
-            else:
-                vw.jira_sync(self.source, self.scanname)
+            if vw:
+                if not (self.source and self.scanname):
+                    self.logger.info('No source/scan_name selected, all enabled scans will be synced')
+                    success = vw.sync_all()
+                    if not success:
+                        self.logger.error('All scans sync failed!')
+                        self.logger.error('Source scanner and scan name needed!')
+                        return 0
+                else:
+                    vw.jira_sync(self.source, self.scanname)

        return self.exit_code