12 Commits
1.5.0 ... 1.70

Author SHA1 Message Date
7f2c59f531 Qualys Vulnerability Management integration (#74)
* Add Qualys vulnerability scans

* Use non-zero exit codes for failures

* Convert to strings for Logstash

* Update logstash config for vulnerability scans

* Update README

* Grab all scan statuses

* Fix error: "Cannot convert non-finite values (NA or inf) to integer"

When trying to download the results of Qualys Vulnerability Management scans, the following error pops up:

[FAIL] - Could not process scan/xxxxxxxxxx.xxxxx - Cannot convert non-finite values (NA or inf) to integer

This error comes from pandas processing the scan results JSON file: the last element of the JSON doesn't fit the rest of the response's schema. That element is "target_distribution_across_scanner_appliances", which lists the scanners used and the IP ranges each scanner covered.

Dropping that last element before processing solves the issue (see the sketch after this entry).

Also adds the qualys_vuln section to frameworks_example.ini
2018-07-05 10:34:02 -07:00
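
A minimal sketch of that workaround, mirroring the `.iloc[2:-1]` slice added to qualys_vuln.py later in this diff; the helper name is illustrative and the severity cast is taken from the same file:

```python
import pandas as pd

def load_scan_report(scan_json):
    # The first two entries are request metadata we already have, and the last
    # entry is "target_distribution_across_scanner_appliances", which does not
    # follow the finding schema and produces the NA/inf values pandas chokes on.
    report = pd.read_json(scan_json).iloc[2:-1]
    # With the offending element gone, the integer casts succeed again.
    report['severity'] = report['severity'].astype(int).astype(str)
    return report
```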
3ac9a8156a Update template to version 5.x (#73)
* Update template to Elasticsearch 5.x

* Update template to Elasticsearch 5.x

I think the _all field is no longer needed as of ES 5.x, since queries are executed across all fields when _all is disabled (see the sketch after this entry)
2018-06-30 13:25:29 -07:00
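
A small illustration of that reasoning, assuming Elasticsearch 5.x behavior where query_string falls back to searching all mapped fields once _all is disabled; the index name and search term here are made up:

```python
import requests

# Illustrative only: with "_all": {"enabled": false} in the template (as in
# this change), a query_string query with no default_field still matches
# across all mapped fields on Elasticsearch 5.x.
query = {"query": {"query_string": {"query": "openssl"}}}

resp = requests.get("http://localhost:9200/logstash-vulnwhisperer-*/_search",
                    json=query)
print(resp.json()["hits"]["total"])
```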
9a08acb2d6 Update README.md 2018-06-26 13:04:40 -04:00
38d2eec065 Tenable.io support (#70)
* Basic tenable.io support

* Add tenable config section

* Use existing variable

* Fix indent

* Fix paren

* Use ternary syntax

* Update Logstash config for tenable.io
2018-06-26 13:03:08 -04:00
9b10711d34 Nessus bugfixes (#68)
* Handle cases where no scans are present

* Prevent infinite login loop with incorrect creds

* Print actual config file path

* Don't overwrite Nessus Synopsis with Description
2018-06-13 02:56:06 -04:00
9049b1ff0f Fix to apt-get install 2018-06-04 20:23:17 -04:00
d1d679b12f Update vulnwhisp.py 2018-05-04 10:03:58 -04:00
8ca1c3540d Removed no longer supported InsecureRequestWarning workaround. (#55)
* Removed no longer supported InsecureRequestWarning workaround.

* Add dependencies to README.md
2018-04-17 13:27:23 -04:00
e4e9ed7f28 Preserving newlines & carriage returns (#48)
* Preserve newlines & carriage returns

* Convert '\n' & '\r' to newlines & carriage returns
2018-04-10 08:54:21 -04:00
0982e26197 Updating config to be consistent with conf files 2018-04-02 17:53:24 -04:00
9fc9af37f7 VulnFramework Links (#39)
Quick update regarding issue #33
2018-03-07 14:21:15 -05:00
3984c879cd Update vulnwhisp.py 2018-03-05 07:03:49 -05:00
12 changed files with 431 additions and 111 deletions

BIN
.DS_Store vendored Normal file

Binary file not shown.

@ -17,13 +17,14 @@ Currently Supports
### Vulnerability Frameworks
- [X] Nessus (v6 & **v7**)
- [X] Qualys Web Applications
- [ ] Qualys Vulnerability Management (Need license)
- [X] OpenVAS
- [ ] Nexpose
- [ ] Insight VM
- [ ] NMAP
- [X] [Nessus (v6 & **v7**)](https://www.tenable.com/products/nessus/nessus-professional)
- [X] [Qualys Web Applications](https://www.qualys.com/apps/web-app-scanning/)
- [X] [Qualys Vulnerability Management (Need license)](https://www.qualys.com/apps/vulnerability-management/)
- [X] [OpenVAS](http://www.openvas.org/)
- [X] [Tenable.io](https://www.tenable.com/products/tenable-io)
- [ ] [Nexpose](https://www.rapid7.com/products/nexpose/)
- [ ] [Insight VM](https://www.rapid7.com/products/insightvm/)
- [ ] [NMAP](https://nmap.org/)
- [ ] More to come
Getting Started
@ -45,8 +46,13 @@ Requirements
<a id="installreq">Install Requirements-VulnWhisperer(may require sudo)</a>
--------------------
**First, install requirement dependencies**
```shell
**First, install dependant modules**
sudo apt-get install zlib1g-dev libxml2-dev libxslt1-dev
```
**Second, install dependant modules**
```python
cd deps/qualysapi
@ -54,7 +60,7 @@ python setup.py install
```
**Second, install requirements**
**Third, install requirements**
```python
pip install -r /path/to/VulnWhisperer/requirements.txt

@ -4,8 +4,19 @@ hostname=localhost
port=8834
username=nessus_username
password=nessus_password
write_path=/opt/vulnwhisp/nessus/
db_path=/opt/vulnwhisp/database
write_path=/opt/vulnwhisperer/nessus/
db_path=/opt/vulnwhisperer/database
trash=false
verbose=true
[tenable]
enabled=true
hostname=cloud.tenable.com
port=443
username=tenable.io_username
password=tenable.io_password
write_path=/opt/vulnwhisperer/tenable/
db_path=/opt/vulnwhisperer/database
trash=false
verbose=true
@ -15,8 +26,24 @@ enabled = true
hostname = qualysapi.qg2.apps.qualys.com
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisp/qualys/
db_path=/opt/vulnwhisp/database
write_path=/opt/vulnwhisperer/qualys/
db_path=/opt/vulnwhisperer/database
verbose=true
# Set the maximum number of retries each connection should attempt.
#Note, this applies only to failed connections and timeouts, never to requests where the server returns a response.
max_retries = 10
# Template ID will need to be retrieved for each document. Please follow the reference guide above for instructions on how to get your template ID.
template_id = 126024
[qualys_vuln]
#Reference https://www.qualys.com/docs/qualys-was-api-user-guide.pdf to find your API
enabled = true
hostname = qualysapi.qg2.apps.qualys.com
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisperer/qualys/
db_path=/opt/vulnwhisperer/database
verbose=true
# Set the maximum number of retries each connection should attempt.
@ -26,13 +53,13 @@ max_retries = 10
template_id = 126024
[openvas]
enabled = true
enabled = false
hostname = localhost
port = 4000
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisp/openvas/
db_path=/opt/vulnwhisp/database
write_path=/opt/vulnwhisperer/openvas/
db_path=/opt/vulnwhisperer/database
verbose=true
#[proxy]

@ -21,8 +21,7 @@
"mappings": {
"_default_": {
"_all": {
"enabled": true,
"norms": false
"enabled": false
},
"dynamic_templates": [
{
@ -57,28 +56,23 @@
"type": "integer"
},
"last_updated": {
"type": "date",
"doc_values": true
"type": "date"
},
"geoip": {
"dynamic": true,
"type": "object",
"properties": {
"ip": {
"type": "ip",
"doc_values": true
"type": "ip"
},
"latitude": {
"type": "float",
"doc_values": true
"type": "float"
},
"location": {
"type": "geo_point",
"doc_values": true
"type": "geo_point"
},
"longitude": {
"type": "float",
"doc_values": true
"type": "float"
}
}
},
@ -86,44 +80,34 @@
"type": "float"
},
"source": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"synopsis": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"see_also": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"@timestamp": {
"type": "date",
"doc_values": true
"type": "date"
},
"cve": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"solution": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"port": {
"index": "not_analyzed",
"type": "integer"
},
"host": {
"type": "string"
"type": "text"
},
"@version": {
"index": "not_analyzed",
"type": "string",
"doc_values": true
"type": "keyword"
},
"risk": {
"index": "not_analyzed",
"type": "string"
"type": "keyword"
},
"assign_ip": {
"type": "ip"

@ -12,29 +12,44 @@ input {
tags => "nessus"
type => "nessus"
}
file {
path => "/opt/vulnwhisperer/tenable/*.csv"
start_position => "beginning"
tags => "tenable"
type => "tenable"
}
}
filter {
if "nessus" in [tags]{
if "nessus" in [tags] or "tenable" in [tags] {
# Drop the header column
if [message] =~ "^Plugin ID" { drop {} }
mutate {
gsub => [
"message", "\|\|\|", " ",
"message", "\t\t", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " "
]
}
csv {
columns => ["plugin_id", "cve", "cvss", "risk", "asset", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output"]
# columns => ["plugin_id", "cve", "cvss", "risk", "asset", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output"]
columns => ["plugin_id", "cve", "cvss", "risk", "asset", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output", "asset_uuid", "vulnerability_state", "ip", "fqdn", "netbios", "operating_system", "mac_address", "plugin_family", "cvss_base", "cvss_temporal", "cvss_temporal_vector", "cvss_vector", "cvss3_base", "cvss3_temporal", "cvss3_temporal_vector", "cvss_vector", "system_type", "host_start", "host_end"]
separator => ","
source => "message"
}
ruby {
code => "if event.get('description')
event.set('description', event.get('description').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('synopsis')
event.set('synopsis', event.get('synopsis').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('solution')
event.set('solution', event.get('solution').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('see_also')
event.set('see_also', event.get('see_also').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end
if event.get('plugin_output')
event.set('plugin_output', event.get('plugin_output').gsub(92.chr + 'n', 10.chr).gsub(92.chr + 'r', 13.chr))
end"
}
#If using filebeats as your source, you will need to replace the "path" field to "source"
grok {
match => { "path" => "(?<scan_name>[a-zA-Z0-9_.\-]+)_%{INT:scan_id}_%{INT:history_id}_%{INT:last_updated}.csv$" }
@ -63,24 +78,57 @@ filter {
mutate { add_field => { "risk_number" => 4 }}
}
if [cve] == "nan" {
if ![cve] or [cve] == "nan" {
mutate { remove_field => [ "cve" ] }
}
if [cvss] == "nan" {
if ![cvss] or [cvss] == "nan" {
mutate { remove_field => [ "cvss" ] }
}
if [see_also] == "nan" {
mutate { remove_field => [ "see_also" ] }
if ![cvss_base] or [cvss_base] == "nan" {
mutate { remove_field => [ "cvss_base" ] }
}
if [description] == "nan" {
if ![cvss_temporal] or [cvss_temporal] == "nan" {
mutate { remove_field => [ "cvss_temporal" ] }
}
if ![cvss_temporal_vector] or [cvss_temporal_vector] == "nan" {
mutate { remove_field => [ "cvss_temporal_vector" ] }
}
if ![cvss_vector] or [cvss_vector] == "nan" {
mutate { remove_field => [ "cvss_vector" ] }
}
if ![cvss3_base] or [cvss3_base] == "nan" {
mutate { remove_field => [ "cvss3_base" ] }
}
if ![cvss3_temporal] or [cvss3_temporal] == "nan" {
mutate { remove_field => [ "cvss3_temporal" ] }
}
if ![cvss3_temporal_vector] or [cvss3_temporal_vector] == "nan" {
mutate { remove_field => [ "cvss3_temporal_vector" ] }
}
if ![description] or [description] == "nan" {
mutate { remove_field => [ "description" ] }
}
if [plugin_output] == "nan" {
if ![mac_address] or [mac_address] == "nan" {
mutate { remove_field => [ "mac_address" ] }
}
if ![netbios] or [netbios] == "nan" {
mutate { remove_field => [ "netbios" ] }
}
if ![operating_system] or [operating_system] == "nan" {
mutate { remove_field => [ "operating_system" ] }
}
if ![plugin_output] or [plugin_output] == "nan" {
mutate { remove_field => [ "plugin_output" ] }
}
if [synopsis] == "nan" {
if ![see_also] or [see_also] == "nan" {
mutate { remove_field => [ "see_also" ] }
}
if ![synopsis] or [synopsis] == "nan" {
mutate { remove_field => [ "synopsis" ] }
}
if ![system_type] or [system_type] == "nan" {
mutate { remove_field => [ "system_type" ] }
}
mutate {
remove_field => [ "message" ]
@ -162,7 +210,7 @@ filter {
}
output {
if "nessus" in [tags] or [type] == "nessus" {
if "nessus" in [tags] or "tenable" in [tags] or [type] in [ "nessus", "tenable" ] {
# stdout { codec => rubydebug }
elasticsearch {
hosts => [ "localhost:9200" ]

@ -10,12 +10,17 @@ input {
type => json
codec => json
start_position => "beginning"
tags => [ "qualys_web", "qualys" ]
tags => [ "qualys" ]
}
}
filter {
if "qualys_web" in [tags] {
if "qualys" in [tags] {
grok {
match => { "path" => [ "(?<tags>qualys_vuln)_scan_%{DATA}_%{INT:last_updated}.json$", "(?<tags>qualys_web)_%{INT:app_id}_%{INT:last_updated}.json$" ] }
tag_on_failure => []
}
mutate {
replace => [ "message", "%{message}" ]
#gsub => [
@ -29,16 +34,17 @@ filter {
#]
}
grok {
match => { "path" => "qualys_web_%{INT:app_id}_%{INT:last_updated}.json$" }
tag_on_failure => []
}
if "qualys_web" in [tags] {
mutate {
add_field => { "asset" => "%{web_application_name}" }
add_field => { "risk_score" => "%{cvss}" }
}
} else if "qualys_vuln" in [tags] {
mutate {
add_field => { "asset" => "%{ip}" }
add_field => { "risk_score" => "%{cvss}" }
}
}
if [risk] == "1" {
mutate { add_field => { "risk_number" => 0 }}

BIN
vulnwhisp/.DS_Store vendored Normal file

Binary file not shown.

@ -1,7 +1,4 @@
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
import pytz
from datetime import datetime
import json
@ -72,6 +69,8 @@ class NessusAPI(object):
while (timeout <= 10) and (not success):
data = methods[method](url, data=data, headers=self.headers, verify=False)
if data.status_code == 401:
if url == self.base + self.SESSION:
break
try:
self.login()
timeout += 1
@ -105,7 +104,7 @@ class NessusAPI(object):
def get_scan_ids(self):
scans = self.get_scans()
scan_ids = [scan_id['id'] for scan_id in scans['scans']]
scan_ids = [scan_id['id'] for scan_id in scans['scans']] if scans['scans'] else []
return scan_ids
def count_scan(self, scans, folder_id):
@ -150,7 +149,7 @@ class NessusAPI(object):
req = self.request(query, data=data, method='POST')
return req
def download_scan(self, scan_id=None, history=None, export_format="", chapters="", dbpasswd=""):
def download_scan(self, scan_id=None, history=None, export_format="", chapters="", dbpasswd="", profile=""):
running = True
counter = 0
@ -163,7 +162,7 @@ class NessusAPI(object):
req = self.request(query, data=json.dumps(data), method='POST', json=True)
try:
file_id = req['file']
token_id = req['token']
token_id = req['token'] if 'token' in req else req['temp_token']
except Exception as e:
print("[ERROR] %s" % e)
print('Download for file id ' + str(file_id) + '.')
@ -179,6 +178,9 @@ class NessusAPI(object):
print("")
print("")
if profile=='tenable':
content = self.request(self.EXPORT_FILE_DOWNLOAD.format(scan_id=scan_id, file_id=file_id), method='GET', download=True)
else:
content = self.request(self.EXPORT_TOKEN_DOWNLOAD.format(token_id=token_id), method='GET', download=True)
return content

@ -8,11 +8,8 @@ import io
import pandas as pd
import requests
from bs4 import BeautifulSoup
from requests.packages.urllib3.exceptions import InsecureRequestWarning
from ..utils.cli import bcolors
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
class OpenVAS_API(object):
OMP = '/omp'

@ -9,10 +9,6 @@ import pandas as pd
import qualysapi
import qualysapi.config as qcconf
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
import sys
import os
import csv

@ -0,0 +1,114 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
__author__ = 'Nathan Young'
import xml.etree.ElementTree as ET
import pandas as pd
import qualysapi
import requests
import sys
import os
import dateutil.parser as dp
class qualysWhisperAPI(object):
SCANS = 'api/2.0/fo/scan'
def __init__(self, config=None):
self.config = config
try:
self.qgc = qualysapi.connect(config)
# Fail early if we can't make a request or auth is incorrect
self.qgc.request('about.php')
print('[SUCCESS] - Connected to Qualys at %s' % self.qgc.server)
except Exception as e:
print('[ERROR] Could not connect to Qualys - %s' % e)
exit(1)
def scan_xml_parser(self, xml):
all_records = []
root = ET.XML(xml)
for child in root.find('.//SCAN_LIST'):
all_records.append({
'name': child.find('TITLE').text,
'id': child.find('REF').text,
'date': child.find('LAUNCH_DATETIME').text,
'type': child.find('TYPE').text,
'duration': child.find('DURATION').text,
'status': child.find('.//STATE').text,
})
return pd.DataFrame(all_records)
def get_all_scans(self):
parameters = {
'action': 'list',
'echo_request': 0,
'show_op': 0,
'launched_after_datetime': '0001-01-01'
}
scans_xml = self.qgc.request(self.SCANS, parameters)
return self.scan_xml_parser(scans_xml)
def get_scan_details(self, scan_id=None):
parameters = {
'action': 'fetch',
'echo_request': 0,
'output_format': 'json_extended',
'mode': 'extended',
'scan_ref': scan_id
}
scan_json = self.qgc.request(self.SCANS, parameters)
# First two columns are metadata we already have
# Last column corresponds to "target_distribution_across_scanner_appliances" element
# which doesn't follow the schema and breaks the pandas data manipulation
return pd.read_json(scan_json).iloc[2:-1]
class qualysUtils:
def __init__(self):
pass
def iso_to_epoch(self, dt):
return dp.parse(dt).strftime('%s')
class qualysVulnScan:
def __init__(
self,
config=None,
file_in=None,
file_stream=False,
delimiter=',',
quotechar='"',
):
self.file_in = file_in
self.file_stream = file_stream
self.report = None
self.utils = qualysUtils()
if config:
try:
self.qw = qualysWhisperAPI(config=config)
except Exception as e:
print('Could not load config! Please check settings for %s' \
% e)
if file_stream:
self.open_file = file_in.splitlines()
elif file_in:
self.open_file = open(file_in, 'rb')
self.downloaded_file = None
def process_data(self, scan_id=None):
"""Downloads a file from Qualys and normalizes it"""
print('[ACTION] - Downloading scan ID: %s' % scan_id)
scan_report = self.qw.get_scan_details(scan_id=scan_id)
keep_columns = ['category', 'cve_id', 'cvss3_base', 'cvss3_temporal', 'cvss_base', 'cvss_temporal', 'dns', 'exploitability', 'fqdn', 'impact', 'ip', 'ip_status', 'netbios', 'os', 'pci_vuln', 'port', 'protocol', 'qid', 'results', 'severity', 'solution', 'ssl', 'threat', 'title', 'type', 'vendor_reference']
scan_report = scan_report.filter(keep_columns)
scan_report['severity'] = scan_report['severity'].astype(int).astype(str)
scan_report['qid'] = scan_report['qid'].astype(int).astype(str)
return scan_report

@ -5,6 +5,7 @@ __author__ = 'Austin Taylor'
from base.config import vwConfig
from frameworks.nessus import NessusAPI
from frameworks.qualys import qualysScanReport
from frameworks.qualys_vuln import qualysVulnScan
from frameworks.openvas import OpenVAS_API
from utils.cli import bcolors
import pandas as pd
@ -88,7 +89,7 @@ class vulnWhispererBase(object):
else:
self.vprint('{fail} Please specify a database to connect to!'.format(fail=bcolors.FAIL))
exit(0)
exit(1)
self.table_columns = [
'scan_name',
@ -131,7 +132,7 @@ class vulnWhispererBase(object):
self.create_table()
def cleanser(self, _data):
repls = (('\n', '|||'), ('\r', '|||'), (',', ';'))
repls = (('\n', r'\n'), ('\r', r'\r'))
data = reduce(lambda a, kv: a.replace(*kv), repls, _data)
return data
@ -176,7 +177,7 @@ class vulnWhispererBase(object):
class vulnWhispererNessus(vulnWhispererBase):
CONFIG_SECTION = 'nessus'
CONFIG_SECTION = None
def __init__(
self,
@ -187,7 +188,10 @@ class vulnWhispererNessus(vulnWhispererBase):
debug=False,
username=None,
password=None,
profile='nessus'
):
self.CONFIG_SECTION=profile
super(vulnWhispererNessus, self).__init__(config=config)
self.port = int(self.config.get(self.CONFIG_SECTION, 'port'))
@ -217,13 +221,13 @@ class vulnWhispererNessus(vulnWhispererBase):
self.vprint(e)
raise Exception(
'{fail} Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
config=self.config,
config=self.config.config_in,
fail=bcolors.FAIL, e=e))
except Exception as e:
self.vprint('{fail} Could not properly load your config!\nReason: {e}'.format(fail=bcolors.FAIL,
e=e))
sys.exit(0)
sys.exit(1)
@ -275,7 +279,7 @@ class vulnWhispererNessus(vulnWhispererBase):
if self.nessus_connect:
scan_data = self.nessus.get_scans()
folders = scan_data['folders']
scans = scan_data['scans']
scans = scan_data['scans'] if scan_data['scans'] else []
all_scans = self.scan_count(scans)
if self.uuids:
scan_list = [scan for scan in all_scans if scan['uuid']
@ -332,8 +336,10 @@ class vulnWhispererNessus(vulnWhispererBase):
folder_id = s['folder_id']
scan_history = self.nessus.get_scan_history(scan_id)
folder_name = next(f['name'] for f in folders if f['id'
] == folder_id)
if self.CONFIG_SECTION == 'tenable':
folder_name = ''
else:
folder_name = next(f['name'] for f in folders if f['id'] == folder_id)
if status == 'completed':
file_name = '%s_%s_%s_%s.%s' % (scan_name, scan_id,
history_id, norm_time, 'csv')
@ -361,8 +367,8 @@ class vulnWhispererNessus(vulnWhispererBase):
filename=relative_path_name))
else:
file_req = \
self.nessus.download_scan(scan_id=scan_id,
history=history_id, export_format='csv')
self.nessus.download_scan(scan_id=scan_id, history=history_id,
export_format='csv', profile=self.CONFIG_SECTION)
clean_csv = \
pd.read_csv(io.StringIO(file_req.decode('utf-8'
)))
@ -375,11 +381,7 @@ class vulnWhispererNessus(vulnWhispererBase):
for col in columns_to_cleanse:
clean_csv[col] = clean_csv[col].astype(str).apply(self.cleanser)
clean_csv['Synopsis'] = \
clean_csv['Description'
].astype(str).apply(self.cleanser)
clean_csv.to_csv(relative_path_name,
index=False)
clean_csv.to_csv(relative_path_name, index=False)
record_meta = (
scan_name,
scan_id,
@ -562,6 +564,7 @@ class vulnWhispererQualys(vulnWhispererBase):
if output_format == 'json':
with open(relative_path_name, 'w') as f:
f.write(vuln_ready.to_json(orient='records', lines=True))
f.write('\n')
elif output_format == 'csv':
vuln_ready.to_csv(relative_path_name, index=False, header=True) # add when timestamp occured
@ -715,6 +718,7 @@ class vulnWhispererOpenVAS(vulnWhispererBase):
if output_format == 'json':
with open(relative_path_name, 'w') as f:
f.write(vuln_ready.to_json(orient='records', lines=True))
f.write('\n')
print('{success} - Report written to %s'.format(success=bcolors.SUCCESS) \
% report_name)
@ -747,6 +751,129 @@ class vulnWhispererOpenVAS(vulnWhispererBase):
exit(0)
class vulnWhispererQualysVuln(vulnWhispererBase):
CONFIG_SECTION = 'qualys'
COLUMN_MAPPING = {'cvss_base': 'cvss',
'cvss3_base': 'cvss3',
'cve_id': 'cve',
'os': 'operating_system',
'qid': 'plugin_id',
'severity': 'risk',
'title': 'plugin_name'}
def __init__(
self,
config=None,
db_name='report_tracker.db',
purge=False,
verbose=None,
debug=False,
username=None,
password=None,
):
super(vulnWhispererQualysVuln, self).__init__(config=config)
self.qualys_scan = qualysVulnScan(config=config)
self.directory_check()
self.scans_to_process = None
def whisper_reports(self,
report_id=None,
launched_date=None,
scan_name=None,
scan_reference=None,
output_format='json',
cleanup=True):
try:
launched_date
if 'Z' in launched_date:
launched_date = self.qualys_scan.utils.iso_to_epoch(launched_date)
report_name = 'qualys_vuln_' + report_id.replace('/','_') \
+ '_{last_updated}'.format(last_updated=launched_date) \
+ '.json'
relative_path_name = self.path_check(report_name)
if os.path.isfile(relative_path_name):
#TODO Possibly make this optional to sync directories
file_length = len(open(relative_path_name).readlines())
record_meta = (
scan_name,
scan_reference,
launched_date,
report_name,
time.time(),
file_length,
self.CONFIG_SECTION,
report_id,
1,
)
self.record_insert(record_meta)
self.vprint('{info} File {filename} already exist! Updating database'.format(info=bcolors.INFO, filename=relative_path_name))
else:
print('Processing report ID: %s' % report_id)
vuln_ready = self.qualys_scan.process_data(scan_id=report_id)
vuln_ready['scan_name'] = scan_name
vuln_ready['scan_reference'] = report_id
vuln_ready.rename(columns=self.COLUMN_MAPPING, inplace=True)
record_meta = (
scan_name,
scan_reference,
launched_date,
report_name,
time.time(),
vuln_ready.shape[0],
self.CONFIG_SECTION,
report_id,
1,
)
self.record_insert(record_meta)
if output_format == 'json':
with open(relative_path_name, 'w') as f:
f.write(vuln_ready.to_json(orient='records', lines=True))
f.write('\n')
print('{success} - Report written to %s'.format(success=bcolors.SUCCESS) \
% report_name)
except Exception as e:
print('{error} - Could not process %s - %s'.format(error=bcolors.FAIL) % (report_id, e))
def identify_scans_to_process(self):
self.latest_scans = self.qualys_scan.qw.get_all_scans()
if self.uuids:
self.scans_to_process = self.latest_scans.loc[
(~self.latest_scans['id'].isin(self.uuids))
& (self.latest_scans['status'] == 'Finished')]
else:
self.scans_to_process = self.latest_scans
self.vprint('{info} Identified {new} scans to be processed'.format(info=bcolors.INFO,
new=len(self.scans_to_process)))
def process_vuln_scans(self):
counter = 0
self.identify_scans_to_process()
if self.scans_to_process.shape[0]:
for app in self.scans_to_process.iterrows():
counter += 1
r = app[1]
print('Processing %s/%s' % (counter, len(self.scans_to_process)))
self.whisper_reports(report_id=r['id'],
launched_date=r['date'],
scan_name=r['name'],
scan_reference=r['type'])
else:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
self.conn.close()
exit(0)
class vulnWhisperer(object):
@ -770,7 +897,8 @@ class vulnWhisperer(object):
vw = vulnWhispererNessus(config=self.config,
username=self.username,
password=self.password,
verbose=self.verbose)
verbose=self.verbose,
profile=self.profile)
vw.whisper_nessus()
elif self.profile == 'qualys':
@ -780,3 +908,15 @@ class vulnWhisperer(object):
elif self.profile == 'openvas':
vw_openvas = vulnWhispererOpenVAS(config=self.config)
vw_openvas.process_openvas_scans()
elif self.profile == 'tenable':
vw = vulnWhispererNessus(config=self.config,
username=self.username,
password=self.password,
verbose=self.verbose,
profile=self.profile)
vw.whisper_nessus()
elif self.profile == 'qualys_vuln':
vw = vulnWhispererQualysVuln(config=self.config)
vw.process_vuln_scans()