
Create actionable data from your vulnerability scans
VulnWhisperer is a vulnerability data and report aggregator. VulnWhisperer pulls the reports from your scanners and writes each one to a file with a unique filename, which is then fed into Logstash. Logstash extracts data from the filename and tags all of the information inside the report (see the logstash_vulnwhisp.conf file). The data is then shipped to Elasticsearch to be indexed.
Currently Supports
Vulnerability Frameworks
- Nessus (v6 & v7)
- Qualys Web Applications
- Qualys Vulnerability Management
- OpenVAS
- Tenable.io
- Detectify
- Nexpose
- Insight VM
- NMAP
- Burp Suite
- OWASP ZAP
- More to come
Reporting Frameworks
- JIRA
Getting Started
- Follow the install requirements
- Fill out the section you want to process in the frameworks_example.ini file
- Modify the IP settings in the logstash files to accommodate your environment and import them to your logstash conf directory (default is /etc/logstash/conf.d/)
- Import the kibana visualizations
- Run Vulnwhisperer
Need assistance or just want to chat? Join our Slack channel
Requirements
- ElasticStack 5.x
- Python 2.7
- Vulnerability Scanner
- Optional: Message broker such as Kafka or RabbitMQ
Install Requirements - VulnWhisperer (may require sudo)
First, install the dependency packages:
sudo apt-get install zlib1g-dev libxml2-dev libxslt1-dev
Then install the Python requirements:
pip install -r /path/to/VulnWhisperer/requirements.txt
cd /path/to/VulnWhisperer
python setup.py install
Now you're ready to pull down scans. (see run section)
Install Requirements-ELK Node *SAMPLE*
The following instructions should be used as a sample guide in the absence of an existing ELK cluster/node. They cover a Debian example install of a stand-alone Elasticsearch & Kibana node.
While Logstash is included in this install guide, it is recommended to run Logstash on a separate host that pulls the VulnWhisperer data and ships it to the Elasticsearch node.
Please note there is a docker-compose.yml available as well.
Debian: (https://www.elastic.co/guide/en/elasticsearch/reference/5.6/deb.html)
sudo apt-get install -y default-jre
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update && sudo apt-get install elasticsearch kibana logstash
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
sudo /bin/systemctl enable kibana.service
sudo /bin/systemctl enable logstash.service
Elasticsearch & Kibana Sample Config Notes
Utilizing your favorite text editor:
- Grab your host IP and set it in your /etc/elasticsearch/elasticsearch.yml file (the network.host setting; this defaults to 'localhost')
- Validate Elasticsearch is set to run on port 9200 (the default)
- Grab your host IP and set it in your /etc/kibana/kibana.yml file (the server.host setting; this defaults to 'localhost'), and validate that Kibana is pointing to the correct Elasticsearch IP (set in the previous step)
- Validate Kibana is set to run on port 5601 (the default)
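As a sketch, the relevant lines in the two config files might look like the following, assuming the standard Elasticsearch/Kibana 5.x setting names; 192.0.2.10 is a placeholder for your host IP:

```yaml
# /etc/elasticsearch/elasticsearch.yml
network.host: 192.0.2.10
http.port: 9200

# /etc/kibana/kibana.yml
server.host: 192.0.2.10
server.port: 5601
elasticsearch.url: "http://192.0.2.10:9200"
```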
Start Elasticsearch and Kibana, then validate that they are running and communicating with one another:
sudo service elasticsearch start
sudo service kibana start
OR
sudo systemctl start elasticsearch.service
sudo systemctl start kibana.service
Logstash Sample Config Notes
- Copy/Move the Logstash .conf files from /VulnWhisperer/logstash/ to /etc/logstash/conf.d/
- Validate that the input section of the Logstash .conf files contains the correct location of the VulnWhisperer scans in the input.file.path directive, as identified below:
input {
  file {
    path => "/opt/vulnwhisperer/nessus/**/*"
    start_position => "beginning"
    tags => "nessus"
    type => "nessus"
  }
}
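A similar input block can be added for the other modules; for example, a hypothetical Qualys input (the path is an assumption here, adjust it to wherever your config writes Qualys results) might look like:

```conf
input {
  file {
    path => "/opt/vulnwhisperer/qualys/**/*"
    start_position => "beginning"
    tags => "qualys"
    type => "qualys"
  }
}
```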
- Validate that the output section of the Logstash .conf files contains the correct Elasticsearch IP set in the previous step above (this defaults to localhost):
output {
  if "nessus" in [tags] or [type] == "nessus" {
    #stdout { codec => rubydebug }
    elasticsearch {
      hosts => [ "localhost:9200" ]
      index => "logstash-vulnwhisperer-%{+YYYY.MM}"
    }
  }
}
- Validate that Logstash has the correct file permissions to read the location of the VulnWhisperer scans
Once configured, run Logstash. Running Logstash as a service will pick up all of the files in /etc/logstash/conf.d/; if you would like to run only one Logstash file, please reference the command below:
Logstash as a service:
sudo service logstash start
OR
sudo systemctl start logstash.service
Single Logstash file:
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/1000_nessus_process_file.conf
Configuration
There are a few configuration steps to setting up VulnWhisperer:
- Configure Ini file
- Setup Logstash File
- Import ElasticSearch Templates
- Import Kibana Dashboards
Run
To run, fill out the configuration file with your vulnerability scanner settings. Then you can execute from the command line.
vuln_whisperer -c configs/frameworks_example.ini -s nessus
or
vuln_whisperer -c configs/frameworks_example.ini -s qualys
If no section is specified (e.g. -s nessus), VulnWhisperer will check the config file for the modules that have the property enabled=true and run them sequentially.
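As a sketch, an enabled module section in frameworks_example.ini might look like this; the key names below (hostname, port, username, password) are assumptions for illustration, so check the shipped example file for the exact keys your scanner needs:

```ini
[nessus]
enabled=true
hostname=nessus.example.com
port=8834
username=nessus_user
password=nessus_password
```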
Next you'll need to import the visualizations into Kibana and set up your logstash config. A more thorough README with setup instructions is underway.
Docker-compose
The docker-compose file has been tested on an Ubuntu 18.04 environment with docker-ce v18.06. Its purpose is to store the data from the scanners locally, letting VulnWhisperer update the records and Logstash feed them to Elasticsearch, so it requires a local storage folder.
- It will run out of the box if you create, in the root directory of VulnWhisperer, a folder named "data", which needs read/write/execute permissions for other users in order to sync:
mkdir data && chmod -R 666 data #data/database/report_tracker.db will need 777 to use with local vulnwhisperer
Otherwise, the users running inside the docker containers will not be able to work with it properly. If you don't apply chmod recursively, the data sync will still work, but only the root user on localhost will have access to the created data (and running a local vulnwhisperer against the same data will break).
- The docker/logstash.yml file needs wider read/write permissions in order for the logstash container to use the configuration file; you'll need to run:
chmod 666 docker/logstash.yml
- You will need to rebuild the vulnwhisperer Dockerfile before launching docker-compose: as it is currently written, it doesn't pull the latest version of the VulnWhisperer code from GitHub, due to how docker layer caching works. The best way to do this is:
wget https://raw.githubusercontent.com/qmontal/docker_vulnwhisperer/master/Dockerfile
docker build --no-cache -t hasecuritysolutions/docker_vulnwhisperer -f Dockerfile . --network=host
This will create the image hasecuritysolutions/docker_vulnwhisperer:latest from scratch with the latest updates. This will be fixed in the next VulnWhisperer version.
- The vulnwhisperer container inside docker-compose uses network_mode=host instead of the default bridge mode. This is due to issues encountered when the container tries to pull data from scanners on a different VLAN than the one you are currently on. The host network mode uses the DNS and interfaces from the host itself, fixing those issues, but it breaks the network isolation of the container (docker creates bridge interfaces to route the traffic, which blocks both the container's and the host's network). If you change this to bridge, you might need to add your DNS servers to the config in order to resolve internal hostnames.
- Elasticsearch requires the value vm.max_map_count to be at least 262144; otherwise, it will probably fail at launch. Please check https://elk-docker.readthedocs.io/#prerequisites for how to set it.
- If you want to change the "data" folder used for storing the results, remember to change it in both the docker-compose.yml file and the logstash files in the root "docker/" folder.
- Hostnames do NOT allow _ (underscores); if you change the hostname configuration in the docker-compose file and add underscores, the logstash config files will fail.
- If you are having issues with the connection between hosts, you can troubleshoot them by spawning a shell in the container in question as follows:
docker ps #check the images from the containers
docker exec -i -t 665b4a1e17b6 /bin/bash #where 665b4a1e17b6 is the container image you want to troubleshoot
You can also make sure that all ELK components are working with curl: curl -i host:9200 (Elasticsearch), curl -i host:5601 (Kibana), curl -i host:9600 (Logstash). WARNING! It is possible that Logstash does not expose its port to the external network, only to its internal docker network "esnet".
- If Kibana is not showing results, check that you are searching across the whole time range of the index; by default it shows only the last 15 minutes of logs (you can select up to the last 5 years).
- X-Pack has been disabled by default due to the noise it generates, plus it being a trial version. You can enable it by modifying the docker-compose.yml and docker/logstash.conf files. logstash.conf contains the default credentials for the X-Pack-enabled ES.
- In the Logstash container, "/usr/share/logstash/pipeline/" is the default path for pipelines and "/usr/share/logstash/config/" for the logstash.yml file, instead of "/etc/logstash/conf.d/" and "/etc/logstash/".
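As a sketch of how those container paths might be mapped, the relevant volumes section of docker-compose.yml could look like the following; the service name and host-side paths here are assumptions, so adjust them to your layout:

```yaml
services:
  logstash:
    volumes:
      # Pipelines go to the container's pipeline dir, not /etc/logstash/conf.d/
      - ./docker/1000_nessus_process_file.conf:/usr/share/logstash/pipeline/1000_nessus_process_file.conf
      # logstash.yml goes to the container's config dir, not /etc/logstash/
      - ./docker/logstash.yml:/usr/share/logstash/config/logstash.yml
      # Local scan data shared with the vulnwhisperer container
      - ./data:/opt/vulnwhisperer
```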
- In order to make vulnwhisperer run periodically, add the following to crontab:
0 8 * * * /usr/bin/docker-compose run vulnwhisp-vulnwhisperer
To launch docker-compose, do:
docker-compose -f docker-compose.yml up
Running Nightly
If you're running Linux, be sure to set up a cronjob to remove old files that get stored in the database. Be sure to change .csv to .json if you're using json output.
Set up crontab -e with the following config (modify it for your environment); this will run VulnWhisperer each night at 01:30:
00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.csv' -ctime +3 -exec rm {} \;
30 1 * * * /usr/local/bin/vuln_whisperer -c /opt/vulnwhisp/configs/example.ini
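If you're using json output instead of csv, the cleanup line above would become the following (same schedule, only the file extension changes):

```conf
00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.json' -ctime +3 -exec rm {} \;
```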
For Windows, you may need to type the full path of the vuln_whisperer binary located in the bin directory.