9 Commits
1.7.0 ... 1.7.1

Author SHA1 Message Date
078bd9559e Update docker-compose.yml (#93)
increase file descriptors to allow elasticsearch to start.
2018-09-04 01:58:36 -04:00
fc5f9b5b7c Fix docker-compose logstash config (#92)
* ignore nessus requests warnings

* docker-compose fully working with vulnwhisperer integrated

* remove comments docker-compose

* documenting docker-compose

* Readme corrections

* fix after recheck everything works out of the box

* fix exits that break the no specified section execution mode

* fix docker qualysapi issue, updated README

* revert change on deps/qualysapi/qualysapi/util.py (no effect)

* temporarily changed Dockerfile link to the working one

* fix docker-compose logstash config

* permissions needed for logstash container to work

* changing default path qualys, there are no folders
2018-08-20 09:20:58 -04:00
a159d5b06f Update README.md 2018-08-19 12:03:38 -04:00
7b4202de52 Update README.md 2018-08-18 14:29:23 -04:00
8336b72314 Docker-compose fully working with vulnwhisperer integrated (#90)
* ignore nessus requests warnings

* docker-compose fully working with vulnwhisperer integrated

* remove comments docker-compose

* documenting docker-compose

* Readme corrections

* fix after recheck everything works out of the box

* fix exits that break the no specified section execution mode

* fix docker qualysapi issue, updated README

* revert change on deps/qualysapi/qualysapi/util.py (no effect)

* temporarily changed Dockerfile link to the working one
2018-08-17 08:51:28 -04:00
5b879e13c7 Silence requests warnings 2018-08-14 06:23:18 -04:00
a84576b551 No need to specify section to run (#88)
* Add Qualys vulnerability scans

* Use non-zero exit codes for failures

* Convert to strings for Logstash

* Update logstash config for vulnerability scans

* Update README

* Grab all scans statuses

* Add Qualys vulnerability scans

* Use non-zero exit codes for failures

* Convert to strings for Logstash

* Update logstash config for vulnerability scans

* Update README

* Grab all scans statuses

* Fix error: "Cannot convert non-finite values (NA or inf) to integer"

When trying to download the results of Qualys Vulnerability Management scans, the following error pops up:

[FAIL] - Could not process scan/xxxxxxxxxx.xxxxx - Cannot convert non-finite values (NA or inf) to integer

This error comes from pandas processing the scan results JSON file: the last element of the JSON doesn't fit the schema of the rest of the response. That element is "target_distribution_across_scanner_appliances", which contains the scanners used and the IP ranges that each scanner went through.

Taking out the last line solves the issue.

Also adds the qualys_vuln section to frameworks_example.ini

* No need to specify section to run

Until now vulnwhisperer would not run if a section was not specified, but each module's
config has an "enabled" variable, so it now checks which modules are enabled and runs
them sequentially.

Done mainly to allow automation with the docker-compose setup, as the vulnwhisperer
docker image (https://github.com/HASecuritySolutions/docker_vulnwhisperer)
runs that command at the end.

* added to readme + detectify
2018-08-09 16:39:57 -07:00
46be3c71ef example.ini is frameworks_example.ini (#77) 2018-07-06 22:18:26 -07:00
608a49d178 Update README.md 2018-07-05 13:47:22 -04:00
13 changed files with 692 additions and 42 deletions


@ -11,7 +11,6 @@ VulnWhisperer is a vulnerability data and report aggregator. VulnWhisperer will
[![MIT License](https://img.shields.io/badge/license-MIT-blue.svg?style=flat)](http://choosealicense.com/licenses/mit/)
[![Twitter](https://img.shields.io/twitter/follow/VulnWhisperer.svg?style=social&label=Follow)](https://twitter.com/VulnWhisperer)
Currently Supports
-----------------
@ -19,9 +18,10 @@ Currently Supports
- [X] [Nessus (v6 & **v7**)](https://www.tenable.com/products/nessus/nessus-professional)
- [X] [Qualys Web Applications](https://www.qualys.com/apps/web-app-scanning/)
- [X] [Qualys Vulnerability Management (Need license)](https://www.qualys.com/apps/vulnerability-management/)
- [X] [Qualys Vulnerability Management](https://www.qualys.com/apps/vulnerability-management/)
- [X] [OpenVAS](http://www.openvas.org/)
- [X] [Tenable.io](https://www.tenable.com/products/tenable-io)
- [ ] [Detectify](https://detectify.com/)
- [ ] [Nexpose](https://www.rapid7.com/products/nexpose/)
- [ ] [Insight VM](https://www.rapid7.com/products/insightvm/)
- [ ] [NMAP](https://nmap.org/)
@ -36,6 +36,8 @@ Getting Started
4) Import the <a href="https://github.com/austin-taylor/VulnWhisperer/tree/master/kibana/vuln_whisp_kibana">kibana visualizations</a>
5) [Run Vulnwhisperer](#run)
Need assistance or just want to chat? Join our [slack channel](https://t.co/xlrqzLb3vY)
Requirements
-------------
####
@ -162,7 +164,7 @@ There are a few configuration steps to setting up VulnWhisperer:
* Import ElasticSearch Templates
* Import Kibana Dashboards
<a href="https://github.com/austin-taylor/VulnWhisperer/blob/master/configs/frameworks_example.ini">example.ini file</a>
<a href="https://github.com/austin-taylor/VulnWhisperer/blob/master/configs/frameworks_example.ini">frameworks_example.ini file</a>
<p align="left" style="width:200px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/config_example.png" style="width:200px"></p>
@ -171,14 +173,60 @@ There are a few configuration steps to setting up VulnWhisperer:
To run, fill out the configuration file with your vulnerability scanner settings. Then you can execute from the command line.
```python
vuln_whisperer -c configs/example.ini -s nessus
vuln_whisperer -c configs/frameworks_example.ini -s nessus
or
vuln_whisperer -c configs/example.ini -s qualys
vuln_whisperer -c configs/frameworks_example.ini -s qualys
```
If no section is specified (e.g. -s nessus), vulnwhisperer will check the config file for the modules that have the property enabled=true and run them sequentially.
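For example, a single call with no -s flag runs every enabled module in sequence (a minimal example; which modules run depends on your own config file):
```shell
vuln_whisperer -c configs/frameworks_example.ini
```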
<p align="center" style="width:300px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/running_vuln_whisperer.png" style="width:400px"></p>
Next you'll need to import the visualizations into Kibana and setup your logstash config. A more thorough README is underway with setup instructions.
Docker-compose
-----
The docker-compose file has been tested on an Ubuntu 18.04 environment with docker-ce v18.06. Its purpose is to store the scanner data locally, letting vulnwhisperer update the records and Logstash feed them to ElasticSearch, so it requires a local storage folder.
- It will run out of the box if you create a folder named "data" in the root directory of VulnWhisperer, with read/write/execute permissions for other users so the containers can sync to it:
```shell
mkdir data && chmod -R 666 data #data/database/report_tracker.db will need 777 to use with local vulnwhisperer
```
otherwise the users running inside the docker containers will not be able to work with it properly. If you don't apply chmod recursively, syncing the data will still work, but only the root user on the host will have access to the created data (and running a local vulnwhisperer against the same data will break).
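If you also run a local (non-docker) vulnwhisperer against the same data folder, the report tracker database mentioned in the comment above needs wider permissions, for example (path assumes the default layout):
```shell
chmod 777 data/database/report_tracker.db
```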
- The docker/logstash.yml file needs read/write permissions for other users so that the logstash container can use the configuration file; you'll need to run:
```shell
chmod 666 docker/logstash.yml
```
- You will need to rebuild the vulnwhisperer image before launching docker-compose, because the way it is currently built it doesn't pull the latest VulnWhisperer code from Github (due to how docker layer caching works). The best way to do this is:
```shell
wget https://raw.githubusercontent.com/qmontal/docker_vulnwhisperer/master/Dockerfile
docker build --no-cache -t hasecuritysolutions/docker_vulnwhisperer -f Dockerfile . --network=host
```
This will create the image hasecuritysolutions/docker_vulnwhisperer:latest from scratch with the latest updates. This will be fixed in the next VulnWhisperer version.
- The vulnwhisperer container in docker-compose uses network_mode=host instead of the default bridge mode; this is due to issues encountered when the container tries to pull data from scanners on a VLAN different from the one you are on. Host network mode uses the host's own DNS and interfaces, fixing those issues, but it breaks the container's network isolation (docker's bridge interfaces that route the traffic can block both the container's and the host's network). If you change this to bridge, you may need to add your DNS servers to the config in order to resolve internal hostnames.
- ElasticSearch requires vm.max_map_count to be at least 262144; otherwise it will probably break at launch. Please check https://elk-docker.readthedocs.io/#prerequisites for details.
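On most Linux hosts this can be raised with sysctl, for example (add it to /etc/sysctl.conf if you need it to persist across reboots):
```shell
sudo sysctl -w vm.max_map_count=262144
```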
- If you want to change the "data" folder used to store the results, remember to change it both in the docker-compose.yml file and in the logstash files under the "docker/" folder.
- Hostnames do NOT allow underscores; if you change the hostname configuration in the docker-compose file and add underscores, the logstash config files will fail.
- If you are having connection issues between containers, you can troubleshoot by spawning a shell in the affected container:
```shell
docker ps #list the running containers
docker exec -i -t 665b4a1e17b6 /bin/bash #where 665b4a1e17b6 is the ID of the container you want to troubleshoot
```
You can also check that all ELK components are working by running curl -i against host:9200 (elasticsearch), host:5601 (kibana) and host:9600 (logstash). WARNING! Logstash may not expose its port to the external network, only to its internal docker network "esnet".
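For example, from the docker host with the default port mappings:
```shell
curl -i localhost:9200   # elasticsearch
curl -i localhost:5601   # kibana
curl -i localhost:9600   # logstash (may only answer on the internal "esnet" network)
```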
- If Kibana is not showing any results, check that you are searching over the whole time range, as by default it only shows the last 15 minutes (you can select up to the last 5 years).
- X-Pack has been disabled by default due to the noise, plus being a trial version. You can enable it by modifying the docker-compose.yml and docker/logstash.conf files; logstash.conf contains the default credentials for the X-Pack-enabled ES.
- In the Logstash container, "/usr/share/logstash/pipeline/" is the default path for pipelines and "/usr/share/logstash/config/" for the logstash.yml file, instead of "/etc/logstash/conf.d/" and "/etc/logstash/".
- To make vulnwhisperer run periodically, add the following to crontab:
```shell
0 8 * * * /usr/bin/docker-compose run vulnwhisp-vulnwhisperer
```
To launch docker-compose, do:
```shell
docker-compose -f docker-compose.yml up
```
Running Nightly
---------------
If you're running Linux, be sure to set up a cronjob to remove old files that get stored in the database. Be sure to change .csv if you're using json.
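A hypothetical cleanup job (adjust the path, extension and retention to your setup):
```shell
# Delete CSV scan exports older than 35 days every night at 01:00
0 1 * * * find /opt/VulnWhisperer/data -type f -name '*.csv' -mtime +35 -delete
```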
@ -198,10 +246,14 @@ Video Walkthrough -- Featured on ElasticWebinar
" target="_blank"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/elastic_webinar.png"
alt="Elastic presentation on VulnWhisperer" border="10" /></a>
Credit
Authors
------
Big thank you to <a href="https://github.com/SMAPPER">Justin Henderson</a> for his contributions to vulnWhisperer!
- [Austin Taylor (@HuntOperator)](https://github.com/austin-taylor)
- [Justin Henderson (@smapper)](https://github.com/SMAPPER)
Contributors
------------
- [Quim Montal (@qmontal)](https://github.com/qmontal)
AS SEEN ON TV
-------------


@ -5,6 +5,7 @@ __author__ = 'Austin Taylor'
from vulnwhisp.vulnwhisp import vulnWhisperer
from vulnwhisp.utils.cli import bcolors
from vulnwhisp.base.config import vwConfig
import os
import argparse
import sys
@ -31,10 +32,25 @@ def main():
try:
if args.config and not args.section:
print('{red} ERROR: {error}{endc}'.format(red=bcolors.FAIL,
error='Please specify a section using -s. \
print('{yellow}WARNING: {warning}{endc}'.format(yellow=bcolors.WARNING,
warning='No section was specified, vulnwhisperer will scrape enabled modules from config file. \
\nPlease specify a section using -s. \
\nExample vuln_whisperer -c config.ini -s nessus',
endc=bcolors.ENDC))
config = vwConfig(config_in=args.config)
enabled_sections = config.get_enabled()
for section in enabled_sections:
vw = vulnWhisperer(config=args.config,
profile=section,
verbose=args.verbose,
username=args.username,
password=args.password)
vw.whisper_vulnerabilities()
sys.exit(1)
else:
vw = vulnWhisperer(config=args.config,
profile=args.section,


@ -4,8 +4,8 @@ hostname=localhost
port=8834
username=nessus_username
password=nessus_password
write_path=/opt/vulnwhisperer/nessus/
db_path=/opt/vulnwhisperer/database
write_path=/opt/VulnWhisperer/data/nessus/
db_path=/opt/VulnWhisperer/data/database
trash=false
verbose=true
@ -15,8 +15,8 @@ hostname=cloud.tenable.com
port=443
username=tenable.io_username
password=tenable.io_password
write_path=/opt/vulnwhisperer/tenable/
db_path=/opt/vulnwhisperer/database
write_path=/opt/VulnWhisperer/data/tenable/
db_path=/opt/VulnWhisperer/data/database
trash=false
verbose=true
@ -26,8 +26,8 @@ enabled = true
hostname = qualysapi.qg2.apps.qualys.com
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisperer/qualys/
db_path=/opt/vulnwhisperer/database
write_path=/opt/VulnWhisperer/data/qualys/
db_path=/opt/VulnWhisperer/data/database
verbose=true
# Set the maximum number of retries each connection should attempt.
@ -42,8 +42,8 @@ enabled = true
hostname = qualysapi.qg2.apps.qualys.com
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisperer/qualys/
db_path=/opt/vulnwhisperer/database
write_path=/opt/VulnWhisperer/data/qualys/
db_path=/opt/VulnWhisperer/data/database
verbose=true
# Set the maximum number of retries each connection should attempt.
@ -52,14 +52,26 @@ max_retries = 10
# Template ID will need to be retrieved for each document. Please follow the reference guide above for instructions on how to get your template ID.
template_id = 126024
[detectify]
#Reference https://developer.detectify.com/
enabled = false
hostname = api.detectify.com
#username variable used as apiKey
username = exampleuser
#password variable used as secretKey
password = examplepass
write_path =/opt/VulnWhisperer/data/detectify/
db_path = /opt/VulnWhisperer/data/database
verbose = true
[openvas]
enabled = false
hostname = localhost
port = 4000
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisperer/openvas/
db_path=/opt/vulnwhisperer/database
write_path=/opt/VulnWhisperer/data/openvas/
db_path=/opt/VulnWhisperer/data/database
verbose=true
#[proxy]


@ -1,8 +1,8 @@
version: '2'
services:
vulnwhisp_es1:
vulnwhisp-es1:
image: docker.elastic.co/elasticsearch/elasticsearch:5.6.2
container_name: vulnwhisp_es1
container_name: vulnwhisp-es1
environment:
- cluster.name=vulnwhisperer
- bootstrap.memory_lock=true
@ -11,27 +11,58 @@ services:
memlock:
soft: -1
hard: -1
mem_limit: 1g
nofile:
soft: 65536
hard: 65536
mem_limit: 8g
volumes:
- esdata1:/usr/share/elasticsearch/data
ports:
- 19200:9200
- 9200:9200
environment:
- xpack.security.enabled=false
restart: always
networks:
- esnet
vulnwhisp_ks1:
esnet:
aliases:
- vulnwhisp-es1.local
vulnwhisp-ks1:
image: docker.elastic.co/kibana/kibana:5.6.2
environment:
SERVER_NAME: vulnwhisp_ks1
ELASTICSEARCH_URL: http://vulnwhisp_es1:9200
SERVER_NAME: vulnwhisp-ks1
ELASTICSEARCH_URL: http://vulnwhisp-es1:9200
ports:
- 15601:5601
- 5601:5601
depends_on:
- vulnwhisp-es1
networks:
- esnet
vulnwhisp_ls1:
esnet:
aliases:
- vulnwhisp-ks1.local
vulnwhisp-ls1:
image: docker.elastic.co/logstash/logstash:5.6.2
container_name: vulnwhisp-ls1
volumes:
- ./docker/1000_nessus_process_file.conf:/usr/share/logstash/pipeline/1000_nessus_process_file.conf
- ./docker/2000_qualys_web_scans.conf:/usr/share/logstash/pipeline/2000_qualys_web_scans.conf
- ./docker/3000_openvas.conf:/usr/share/logstash/pipeline/3000_openvas.conf
- ./docker/logstash.yml:/usr/share/logstash/config/logstash.yml
- ./data/:/opt/VulnWhisperer/data
environment:
- xpack.monitoring.enabled=false
depends_on:
- vulnwhisp-es1
networks:
- esnet
esnet:
aliases:
- vulnwhisp-ls1.local
vulnwhisp-vulnwhisperer:
image: hasecuritysolutions/docker_vulnwhisperer:latest
container_name: vulnwhisp-vulnwhisperer
volumes:
- ./data/:/opt/VulnWhisperer/data
- ./configs/frameworks_example.ini:/opt/VulnWhisperer/frameworks_example.ini
network_mode: host
volumes:
esdata1:
driver: local


@ -0,0 +1,220 @@
# Author: Austin Taylor and Justin Henderson
# Email: email@austintaylor.io
# Last Update: 12/20/2017
# Version 0.3
# Description: Take in nessus reports from vulnWhisperer and pumps into logstash
input {
file {
path => "/opt/VulnWhisperer/data/nessus/**/*"
start_position => "beginning"
tags => "nessus"
type => "nessus"
}
file {
path => "/opt/VulnWhisperer/data/tenable/*.csv"
start_position => "beginning"
tags => "tenable"
type => "tenable"
}
}
filter {
if "nessus" in [tags] or "tenable" in [tags] {
# Drop the header column
if [message] =~ "^Plugin ID" { drop {} }
csv {
# columns => ["plugin_id", "cve", "cvss", "risk", "asset", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output"]
columns => ["plugin_id", "cve", "cvss", "risk", "asset", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output", "asset_uuid", "vulnerability_state", "ip", "fqdn", "netbios", "operating_system", "mac_address", "plugin_family", "cvss_base", "cvss_temporal", "cvss_temporal_vector", "cvss_vector", "cvss3_base", "cvss3_temporal", "cvss3_temporal_vector", "cvss_vector", "system_type", "host_start", "host_end"]
separator => ","
source => "message"
}
ruby {
code => "if event.get('description')
event.set('description', event.get('description').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('synopsis')
event.set('synopsis', event.get('synopsis').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('solution')
event.set('solution', event.get('solution').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('see_also')
event.set('see_also', event.get('see_also').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('plugin_output')
event.set('plugin_output', event.get('plugin_output').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end"
}
#If using filebeats as your source, you will need to replace the "path" field with "source"
grok {
match => { "path" => "(?<scan_name>[a-zA-Z0-9_.\-]+)_%{INT:scan_id}_%{INT:history_id}_%{INT:last_updated}.csv$" }
tag_on_failure => []
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => ["last_updated"]
}
if [risk] == "None" {
mutate { add_field => { "risk_number" => 0 }}
}
if [risk] == "Low" {
mutate { add_field => { "risk_number" => 1 }}
}
if [risk] == "Medium" {
mutate { add_field => { "risk_number" => 2 }}
}
if [risk] == "High" {
mutate { add_field => { "risk_number" => 3 }}
}
if [risk] == "Critical" {
mutate { add_field => { "risk_number" => 4 }}
}
if ![cve] or [cve] == "nan" {
mutate { remove_field => [ "cve" ] }
}
if ![cvss] or [cvss] == "nan" {
mutate { remove_field => [ "cvss" ] }
}
if ![cvss_base] or [cvss_base] == "nan" {
mutate { remove_field => [ "cvss_base" ] }
}
if ![cvss_temporal] or [cvss_temporal] == "nan" {
mutate { remove_field => [ "cvss_temporal" ] }
}
if ![cvss_temporal_vector] or [cvss_temporal_vector] == "nan" {
mutate { remove_field => [ "cvss_temporal_vector" ] }
}
if ![cvss_vector] or [cvss_vector] == "nan" {
mutate { remove_field => [ "cvss_vector" ] }
}
if ![cvss3_base] or [cvss3_base] == "nan" {
mutate { remove_field => [ "cvss3_base" ] }
}
if ![cvss3_temporal] or [cvss3_temporal] == "nan" {
mutate { remove_field => [ "cvss3_temporal" ] }
}
if ![cvss3_temporal_vector] or [cvss3_temporal_vector] == "nan" {
mutate { remove_field => [ "cvss3_temporal_vector" ] }
}
if ![description] or [description] == "nan" {
mutate { remove_field => [ "description" ] }
}
if ![mac_address] or [mac_address] == "nan" {
mutate { remove_field => [ "mac_address" ] }
}
if ![netbios] or [netbios] == "nan" {
mutate { remove_field => [ "netbios" ] }
}
if ![operating_system] or [operating_system] == "nan" {
mutate { remove_field => [ "operating_system" ] }
}
if ![plugin_output] or [plugin_output] == "nan" {
mutate { remove_field => [ "plugin_output" ] }
}
if ![see_also] or [see_also] == "nan" {
mutate { remove_field => [ "see_also" ] }
}
if ![synopsis] or [synopsis] == "nan" {
mutate { remove_field => [ "synopsis" ] }
}
if ![system_type] or [system_type] == "nan" {
mutate { remove_field => [ "system_type" ] }
}
mutate {
remove_field => [ "message" ]
add_field => { "risk_score" => "%{cvss}" }
}
mutate {
convert => { "risk_score" => "float" }
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Compensating controls - adjust risk_score
# Adobe and Java are not allowed to run in browser unless whitelisted
# Therefore, lower score by dividing by 3 (score is subjective to risk)
#Modify and uncomment when ready to use
#if [risk_score] != 0 {
# if [plugin_name] =~ "Adobe" and [risk_score] > 6 or [plugin_name] =~ "Java" and [risk_score] > 6 {
# ruby {
# code => "event.set('risk_score', event.get('risk_score') / 3)"
# }
# mutate {
# add_field => { "compensating_control" => "Adobe and Flash removed from browsers unless whitelisted site." }
# }
# }
#}
# Add tags for reporting based on assets or criticality
if [asset] == "dc01" or [asset] == "dc02" or [asset] == "pki01" or [asset] == "192.168.0.54" or [asset] =~ "^192\.168\.0\." or [asset] =~ "^42.42.42." {
mutate {
add_tag => [ "critical_asset" ]
}
}
#if [asset] =~ "^192\.168\.[45][0-9][0-9]\.1$" or [asset] =~ "^192.168\.[50]\.[0-9]{1,2}\.1$"{
# mutate {
# add_tag => [ "has_hipaa_data" ]
# }
#}
#if [asset] =~ "^192\.168\.[45][0-9][0-9]\." {
# mutate {
# add_tag => [ "hipaa_asset" ]
# }
#}
if [asset] =~ "^hr" {
mutate {
add_tag => [ "pci_asset" ]
}
}
#if [asset] =~ "^10\.0\.50\." {
# mutate {
# add_tag => [ "web_servers" ]
# }
#}
}
}
output {
if "nessus" in [tags] or "tenable" in [tags] or [type] in [ "nessus", "tenable" ] {
# stdout { codec => rubydebug }
elasticsearch {
hosts => [ "vulnwhisp-es1.local:9200" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}


@ -0,0 +1,153 @@
# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 12/30/2017
# Version 0.3
# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash
input {
file {
path => "/opt/VulnWhisperer/data/qualys/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "qualys" ]
}
}
filter {
if "qualys" in [tags] {
grok {
match => { "path" => [ "(?<tags>qualys_vuln)_scan_%{DATA}_%{INT:last_updated}.json$", "(?<tags>qualys_web)_%{INT:app_id}_%{INT:last_updated}.json$" ] }
tag_on_failure => []
}
mutate {
replace => [ "message", "%{message}" ]
#gsub => [
# "message", "\|\|\|", " ",
# "message", "\t\t", " ",
# "message", " ", " ",
# "message", " ", " ",
# "message", " ", " ",
# "message", "nan", " ",
# "message",'\n',''
#]
}
if "qualys_web" in [tags] {
mutate {
add_field => { "asset" => "%{web_application_name}" }
add_field => { "risk_score" => "%{cvss}" }
}
} else if "qualys_vuln" in [tags] {
mutate {
add_field => { "asset" => "%{ip}" }
add_field => { "risk_score" => "%{cvss}" }
}
}
if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}
mutate {
remove_field => "message"
}
if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
if [asset] =~ "\.yourdomain\.(com|net)$" {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "qualys" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "vulnwhisp-es1.local:9200" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}

docker/3000_openvas.conf Normal file

@ -0,0 +1,146 @@
# Author: Austin Taylor and Justin Henderson
# Email: austin@hasecuritysolutions.com
# Last Update: 03/04/2018
# Version 0.3
# Description: Take in openvas scan reports from vulnWhisperer and pumps into logstash
input {
file {
path => "/opt/VulnWhisperer/data/openvas/*.json"
type => json
codec => json
start_position => "beginning"
tags => [ "openvas_scan", "openvas" ]
}
}
filter {
if "openvas_scan" in [tags] {
mutate {
replace => [ "message", "%{message}" ]
gsub => [
"message", "\|\|\|", " ",
"message", "\t\t", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " ",
"message", "nan", " ",
"message",'\n',''
]
}
grok {
match => { "path" => "openvas_scan_%{DATA:scan_id}_%{INT:last_updated}.json$" }
tag_on_failure => []
}
mutate {
add_field => { "risk_score" => "%{cvss}" }
}
if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}
mutate { replace => { "risk" => "info" }}
}
if [risk] == "2" {
mutate { add_field => { "risk_number" => 1 }}
mutate { replace => { "risk" => "low" }}
}
if [risk] == "3" {
mutate { add_field => { "risk_number" => 2 }}
mutate { replace => { "risk" => "medium" }}
}
if [risk] == "4" {
mutate { add_field => { "risk_number" => 3 }}
mutate { replace => { "risk" => "high" }}
}
if [risk] == "5" {
mutate { add_field => { "risk_number" => 4 }}
mutate { replace => { "risk" => "critical" }}
}
mutate {
remove_field => "message"
}
if [first_time_detected] {
date {
match => [ "first_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_detected"
}
}
if [first_time_tested] {
date {
match => [ "first_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "first_time_tested"
}
}
if [last_time_detected] {
date {
match => [ "last_time_detected", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_detected"
}
}
if [last_time_tested] {
date {
match => [ "last_time_tested", "dd MMM yyyy HH:mma 'GMT'ZZ", "dd MMM yyyy HH:mma 'GMT'" ]
target => "last_time_tested"
}
}
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => "last_updated"
}
mutate {
convert => { "plugin_id" => "integer"}
convert => { "id" => "integer"}
convert => { "risk_number" => "integer"}
convert => { "risk_score" => "float"}
convert => { "total_times_detected" => "integer"}
convert => { "cvss_temporal" => "float"}
convert => { "cvss" => "float"}
}
if [risk_score] == 0 {
mutate {
add_field => { "risk_score_name" => "info" }
}
}
if [risk_score] > 0 and [risk_score] < 3 {
mutate {
add_field => { "risk_score_name" => "low" }
}
}
if [risk_score] >= 3 and [risk_score] < 6 {
mutate {
add_field => { "risk_score_name" => "medium" }
}
}
if [risk_score] >=6 and [risk_score] < 9 {
mutate {
add_field => { "risk_score_name" => "high" }
}
}
if [risk_score] >= 9 {
mutate {
add_field => { "risk_score_name" => "critical" }
}
}
# Add your critical assets by subnet or by hostname. Comment this field out if you don't want to tag any, but the asset panel will break.
if [asset] =~ "^10\.0\.100\." {
mutate {
add_tag => [ "critical_asset" ]
}
}
}
}
output {
if "openvas" in [tags] {
stdout { codec => rubydebug }
elasticsearch {
hosts => [ "vulnwhisp-es1.local:9200" ]
index => "logstash-vulnwhisperer-%{+YYYY.MM}"
}
}
}

docker/logstash.yml Normal file

@ -0,0 +1,5 @@
path.config: /usr/share/logstash/pipeline/
xpack.monitoring.elasticsearch.password: changeme
xpack.monitoring.elasticsearch.url: vulnwhisp-es1.local:9200
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.enabled: false


@ -6,7 +6,7 @@
input {
file {
path => "/opt/vulnwhisperer/qualys/scans/**/*.json"
path => "/opt/vulnwhisperer/qualys/*.json"
type => json
codec => json
start_position => "beginning"


@ -20,3 +20,11 @@ class vwConfig(object):
    def getbool(self, section, option):
        return self.config.getboolean(section, option)

    def get_enabled(self):
        enabled = []
        check = ["true", "True", "1"]
        for section in self.config.sections():
            if self.get(section, "enabled") in check:
                enabled.append(section)
        return enabled


@ -1,10 +1,17 @@
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
import pytz
from datetime import datetime
import json
import sys
import time
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
class NessusAPI(object):
SESSION = '/session'


@ -292,7 +292,7 @@ class vulnWhispererNessus(vulnWhispererBase):
if not scan_list:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
exit(0)
return 0
# Create scan subfolders
@ -612,7 +612,7 @@ class vulnWhispererQualys(vulnWhispererBase):
else:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
self.conn.close()
exit(0)
return 0
class vulnWhispererOpenVAS(vulnWhispererBase):
@ -748,7 +748,7 @@ class vulnWhispererOpenVAS(vulnWhispererBase):
else:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
self.conn.close()
exit(0)
return 0
class vulnWhispererQualysVuln(vulnWhispererBase):
@ -872,7 +872,7 @@ class vulnWhispererQualysVuln(vulnWhispererBase):
else:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
self.conn.close()
exit(0)
return 0
class vulnWhisperer(object):