60 Commits
1.0.0 ... 1.1.0

Author SHA1 Message Date
bb776bd9f2 Update to requirements.txt 2017-12-29 02:32:15 -05:00
55c0713baf Update to README 2017-12-28 23:52:53 -05:00
caa64b4ca2 Update to README 2017-12-28 23:50:09 -05:00
fb9f86634e Update to README 2017-12-28 23:49:16 -05:00
24417cd1bb Update README.md 2017-12-28 23:39:31 -05:00
34638bcf42 Fix for str casting 2017-12-28 23:25:05 -05:00
c041693018 Column update for scans and N/A cleanup 2017-12-28 22:47:58 -05:00
d03ba15772 Addition of category class and special class for Qualys Scanning Reports. Also added additional enrichments to reports 2017-12-28 21:57:21 -05:00
a274341d23 Removing commented code 2017-12-27 10:42:08 -05:00
ee1e79dcd5 Refactored classes to be more modular, update to ini file and submodules 2017-12-27 10:39:25 -05:00
2997e2d2b6 Refactored classes to be more modular, update to ini file and submodules 2017-12-27 10:38:44 -05:00
abe8925ebc Addition of submodules, update to connectors, base class start 2017-12-27 02:18:28 -05:00
b26ff7d9c9 Addition of submodules, update to connectors, base class start 2017-12-27 02:17:56 -05:00
cec794daa8 Addition of submodules, update to connectors, base class start 2017-12-27 02:17:01 -05:00
bf537df475 Field Cleanup 2017-12-26 07:53:46 -05:00
4f6003066e Adding custom version of QualysAPI 2017-12-25 22:47:34 -05:00
61ba3f0804 Fixed multiple bugs, cleaned up formatting, produces solid csv output for Qualys Web App scans 2017-12-25 22:44:30 -05:00
10f8809723 Merge branch 'master' of github.com:austin-taylor/VulnWhisperer 2017-12-22 17:28:38 -05:00
796db314f3 Addition of Qualys WebApp Processing 2017-12-22 17:28:33 -05:00
d9ff8532ee Addition of Qualys WebApp Processing 2017-12-22 17:28:01 -05:00
dc2491e8b0 Merge pull request #12 from HASecuritySolutions/master
Fork Sync
2017-12-20 01:39:26 -07:00
a9a21c2e90 Allow for any directories to be monitored 2017-12-20 03:00:04 -05:00
16369f0e40 Update README.md 2017-12-20 01:11:28 -05:00
2d8a50d1ad Merge pull request #10 from cybergoof/trim-input
Trim input
2017-12-07 22:43:11 -07:00
4657241b70 Merge pull request #9 from cybergoof/file-test
Checks to make sure config file exists.  Provides descriptive error if it doesn't
2017-12-07 22:42:43 -07:00
c1c4a45562 remove leading and trailing spaces around all input switches. Fixes austi-taylor/VulnWhisperer#6 2017-12-08 00:40:25 -05:00
fcd938b75a Put in a check to make sure that the config file exists. FIXES austin-taylor/VulnWhisperer#4 2017-12-08 00:25:15 -05:00
f8905e8c4b Silence NoneType object 2017-12-07 01:47:06 -05:00
ac61390b88 Update README.md 2017-12-07 01:19:45 -05:00
eb22d9475c Merge pull request #8 from cybergoof/passwords-argument
Added an argument for username and password
2017-11-28 22:18:50 -08:00
35b7093762 Added an argument for username and password, which takes precendece over nessus. Fixed #5 2017-11-27 10:02:53 -05:00
fedbb18bb2 Visualizations for elastic 5.5+ 2017-10-06 15:35:59 -04:00
8808b9e458 Update 9000_output_nessus.conf 2017-10-06 14:33:11 -05:00
b108c1fbeb Create docker-compose.yml 2017-10-06 14:25:09 -05:00
6ea508503d Renaming template 2017-10-06 15:14:25 -04:00
39662cc4cc Merge pull request #3 from alias454/master
Add logstash config for local file pickup
2017-08-06 16:30:07 -04:00
a63f69b3d4 Add logstash config for local file pickup 2017-08-06 15:15:09 -05:00
8be2527ff4 Merge pull request #2 from alias454/master
Add requirements file
2017-08-06 15:29:04 -04:00
d35645363d Update README 2017-08-06 14:08:01 -05:00
df03e7b928 Add requirements file 2017-08-06 14:03:46 -05:00
6a29cb7b84 Addition of logstash configs 2017-07-25 12:23:47 -04:00
dab91faff8 Update to README 2017-07-10 01:51:17 -04:00
db5ab0a265 Update to README 2017-07-10 01:48:38 -04:00
11d2e91321 Update to README 2017-07-10 01:48:04 -04:00
be3938daa5 Update to README 2017-07-10 01:45:11 -04:00
e8c7c5e13e Update to README 2017-07-10 01:42:53 -04:00
e645c33eea Update to README 2017-07-10 01:41:27 -04:00
8cd4e0cc19 Update to README 2017-07-10 01:38:39 -04:00
aed171de81 Update to README 2017-07-10 01:38:18 -04:00
72e6043b09 Update to README 2017-07-10 01:35:16 -04:00
68116635a2 Update to README 2017-07-10 01:32:38 -04:00
4df413d11f Update to README 2017-07-10 01:32:04 -04:00
3b64f7c27d Kibana Dashboards, filebeat and logstash configs 2017-07-10 00:20:17 -04:00
034b204255 Kibana Dashboards, filebeat and logstash configs 2017-07-10 00:15:28 -04:00
ccf774099f Only retrieves completed scans 2017-06-22 02:05:03 -04:00
34d4821c24 Update to README 2017-06-20 02:02:33 -04:00
0d96664209 Update to README 2017-06-19 22:30:54 -04:00
35c5119390 Update to README 2017-06-19 22:26:11 -04:00
69230af210 Update to README 2017-06-19 22:25:42 -04:00
14a451a492 Update to README and removed uneeded modules 2017-06-19 22:24:34 -04:00
44 changed files with 4639 additions and 139 deletions

3
.gitmodules vendored Normal file

@@ -0,0 +1,3 @@
[submodule "qualysapi"]
path = deps/qualysapi
url = git@github.com:austin-taylor/qualysapi.git


@@ -1,2 +1,78 @@
# VulnWhisperer
Create actionable data from your Vulnerability Scans
<p align="center"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/vuln_whisperer_logo_s.png" width="400px"></p>
<p align="center"> <i>Create <u><b>actionable data</b></u> from your vulnerability scans </i> </p>
<p align="center" style="width:400px"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/vulnwhisp_dashboard.jpg" style="width:400px"></p>
VulnWhisperer is a vulnerability report aggregator. It pulls all of your reports
and writes each one to a file with a unique filename, which is then fed into logstash. Logstash extracts data from the filename and tags all of the information inside the report (see the logstash_vulnwhisp.conf file). The data is then shipped to elasticsearch to be indexed.
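As a rough illustration of that flow (the helper and filename scheme below are hypothetical, not VulnWhisperer's exact implementation):

```python
import os

def write_report(write_path, scan_name, scan_id, csv_data):
    # Hypothetical scheme: bake the scan name and id into the filename so
    # logstash can extract them later and tag every record in the report.
    filename = '{}_{}.csv'.format(scan_name.replace(' ', '_'), scan_id)
    with open(os.path.join(write_path, filename), 'w') as f:
        f.write(csv_data)
    return filename
```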
Requirements
-------------
* ElasticStack
* Python 2.7
* Vulnerability Scanner
* Optional: Message broker such as Kafka or RabbitMQ
Currently Supports
-------------
* Elasticsearch 2.x
* Python 2.7
* Nessus
* Qualys - Web Application Scanner
Setup
===============
```shell
# Install pip:
sudo <pkg-manager> install python-pip
sudo pip install --upgrade pip

# Manually install requirements:
sudo pip install pytz
sudo pip install pandas

# Or use the requirements file:
sudo pip install -r /path/to/VulnWhisperer/requirements.txt

# Install VulnWhisperer:
cd /path/to/VulnWhisperer
sudo python setup.py install
```
Configuration
-----
There are a few configuration steps to set up VulnWhisperer (a config sanity-check sketch follows this list):
* Configure Ini file
* Setup Logstash File
* Import ElasticSearch Templates
* Import Kibana Dashboards
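A minimal pre-flight check of the ini file might look like the sketch below (the path and section names are assumptions based on configs/example.ini):

```python
import os
import sys
from six.moves.configparser import ConfigParser  # plain ConfigParser also works on Python 2.7

config_path = 'configs/example.ini'  # assumed path
if not os.path.exists(config_path):
    sys.exit('Config file {} does not exist'.format(config_path))

config = ConfigParser()
config.read(config_path)
for section in ('nessus', 'qualys'):
    if not config.has_section(section):
        print('Warning: section [{}] is missing from {}'.format(section, config_path))
```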
Run
-----
To run, fill out the configuration file with your vulnerability scanner settings, then execute from the command line:
```shell
vuln_whisperer -c configs/example.ini -s nessus
# or
vuln_whisperer -c configs/example.ini -s qualys
```
Next you'll need to import the visualizations into Kibana and set up your logstash config. A more thorough README with setup instructions is underway.
_On Windows, you may need to type the full path of the vuln_whisperer binary located in the bin directory._
Credit
------
Big thank you to <a href="https://github.com/SMAPPER">Justin Henderson</a> for his contributions to vulnWhisperer!
AS SEEN ON TV
-------------
<p align="center" style="width:400px"><a href="https://twitter.com/MalwareJake/status/935654519471353856"><img src="https://github.com/austin-taylor/vulnwhisperer/blob/master/docs/source/as_seen_on_tv.png" style="width:400px"></a></p>


@@ -1,39 +1,58 @@
#!/usr/bin/env python
#!/usr/bin/python
# -*- coding: utf-8 -*-
__author__ = 'Austin Taylor'
#Written by Austin Taylor
#www.austintaylor.io
from vulnwhisp.vulnwhisp import vulnWhisperer
from vulnwhisp.utils.cli import bcolors
import os
import argparse
import sys
def isFileValid(parser, arg):
if not os.path.exists(arg):
parser.error("The file %s does not exist!" % arg)
else:
return arg
def main():
parser = argparse.ArgumentParser(description=""" VulnWhisperer is designed to create actionable data from\
your vulnerability scans through aggregation of historical scans.""")
parser.add_argument('-c', '--config', dest='config', required=False, default='frameworks.ini',
help='Path of config file')
help='Path of config file', type=lambda x: isFileValid(parser, x.strip()))
parser.add_argument('-s', '--section', dest='section', required=False,
help='Section in config')
parser.add_argument('-v', '--verbose', dest='verbose', action='store_true', default=True,
help='Prints status out to screen (defaults to True)')
parser.add_argument('-u', '--username', dest='username', required=False, default=None, type=lambda x: x.strip(), help='The NESSUS username')
parser.add_argument('-p', '--password', dest='password', required=False, default=None, type=lambda x: x.strip(), help='The NESSUS password')
args = parser.parse_args()
vw = vulnWhisperer(config=args.config,
profile=args.section,
verbose=args.verbose,
username=args.username,
password=args.password)
vw.whisper_vulnerabilities()
'''
try:
vw = vulnWhisperer(config=args.config,
verbose=args.verbose)
profile=args.section,
verbose=args.verbose,
username=args.username,
password=args.password)
vw.whisper_nessus()
vw.whisper_vulnerabilities()
sys.exit(1)
except Exception as e:
if args.verbose:
print('{red} ERROR: {error}{endc}'.format(red=bcolors.FAIL, error=e, endc=bcolors.ENDC))
sys.exit(2)
'''
if __name__ == '__main__':
main()


@@ -4,8 +4,36 @@ hostname=localhost
port=8834
username=nessus_username
password=nessus_password
write_path=/opt/vulnwhisp/scans
write_path=/opt/vulnwhisp/nessus/
db_path=/opt/vulnwhisp/database
trash=false
verbose=true
[qualys]
#Reference https://www.qualys.com/docs/qualys-was-api-user-guide.pdf to find your API
enabled = true
hostname = qualysapi.qg2.apps.qualys.com
username = exampleuser
password = examplepass
write_path=/opt/vulnwhisp/qualys/
db_path=/opt/vulnwhisp/database
verbose=true
# Set the maximum number of retries each connection should attempt.
# Note, this applies only to failed connections and timeouts, never to requests where the server returns a response.
max_retries = 10
template_id = 126024
#[proxy]
; This section is optional. Leave it out if you're not using a proxy.
; You can use environmental variables as well: http://www.python-requests.org/en/latest/user/advanced/#proxies
; proxy_protocol set to https, if not specified.
#proxy_url = proxy.mycorp.com
; proxy_port will override any port specified in proxy_url
#proxy_port = 8080
; proxy authentication
#proxy_username = proxyuser
#proxy_password = proxypass
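The bundled qualysapi fork reads these [qualys] options (see deps/qualysapi/qualysapi/config.py later in this diff). A minimal sketch of reading the same options directly, assuming the file lives at configs/example.ini:

```python
from six.moves.configparser import ConfigParser

config = ConfigParser()
config.read('configs/example.ini')  # assumed path

hostname = config.get('qualys', 'hostname')
template_id = config.getint('qualys', 'template_id')  # must be an integer to pull CSV reports
max_retries = config.getint('qualys', 'max_retries')
write_path = config.get('qualys', 'write_path')
```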

47
deps/qualysapi/.gitignore vendored Normal file

@@ -0,0 +1,47 @@
*.py[cod]
# C extensions
*.so
# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64
# Installer logs
pip-log.txt
# Unit test / coverage reports
.coverage
.tox
nosetests.xml
# Translations
*.mo
# Mr Developer
.mr.developer.cfg
.project
.pydevproject
# Mac
.DS_Store
# Authentication configuration
*.qcrc
config.qcrc
config.ini
# PyCharm
.idea
.qcrc.swp

2
deps/qualysapi/MANIFEST.in vendored Normal file

@@ -0,0 +1,2 @@
include README.md
recursive-include examples *.py

107
deps/qualysapi/README.md vendored Normal file

@@ -0,0 +1,107 @@
qualysapi
=========
qualysapi is a Python package that makes calling any Qualys API very simple. Qualys API versions v1 and v2, plus WAS and AM (asset management), are all supported.
My focus was on making the API super easy to use. The only parameters the user needs to provide are the call and, optionally, the data. It automates the following (a sketch of the dispatch follows this list):
* Automatically identifies API version through the call requested.
* Automatically identifies url from the above step.
* Automatically identifies http method as POST or GET for the request per Qualys documentation.
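The dispatch heuristics can be seen in which_api_version in deps/qualysapi/qualysapi/connector.py, later in this diff; a standalone sketch that mirrors them:

```python
def which_api_version(api_call):
    # The shape of the call determines which Qualys API it belongs to.
    if api_call.endswith('.php'):
        return 1            # API v1
    if api_call.startswith('api/2.0/'):
        return 2            # API v2
    if '/am/' in api_call:
        return 'am'         # Asset Management API
    if '/was/' in api_call:
        return 'was'        # WAS API
    return False            # Unrecognized call

print(which_api_version('about.php'))          # 1
print(which_api_version('/count/was/webapp'))  # was
```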
Usage
=====
Check out the example scripts in the [/examples directory](https://github.com/paragbaxi/qualysapi/blob/master/examples/).
Example
-------
A detailed example can be found at [qualysapi-example.py](https://github.com/paragbaxi/qualysapi/blob/master/examples/qualysapi-example.py).
A shorter sample is below.
```python
>>> import qualysapi
>>> a = qualysapi.connect()
QualysGuard Username: my_username
QualysGuard Password:
>>> print a.request('about.php')
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE ABOUT SYSTEM "https://qualysapi.qualys.com/about.dtd">
<ABOUT>
<API-VERSION MAJOR="1" MINOR="4" />
<WEB-VERSION>7.10.61-1</WEB-VERSION>
<SCANNER-VERSION>7.1.10-1</SCANNER-VERSION>
<VULNSIGS-VERSION>2.2.475-2</VULNSIGS-VERSION>
</ABOUT>
<!-- Generated for username="my_username" date="2013-07-03T10:31:57Z" -->
<!-- CONFIDENTIAL AND PROPRIETARY INFORMATION. Qualys provides the QualysGuard Service "As Is," without any warranty of any kind. Qualys makes no warranty that the information contained in this report is complete or error-free. Copyright 2013, Qualys, Inc. //-->
```
Installation
============
Use pip to install:
```Shell
pip install qualysapi
```
NOTE: If you would like to experiment without installing globally, look into 'virtualenv'.
Requirements
------------
* requests (http://docs.python-requests.org)
* lxml (http://lxml.de/)
Tested successfully on Python 2.7.
Configuration
=============
By default, the package asks at the command prompt for a username and password, and connects to the Qualys documented host (qualysapi.qualys.com).
You can override these settings, and avoid retyping credentials, in any of the following ways (a usage sketch follows this list):
1. By running the following Python, `qualysapi.connect(remember_me=True)`. This automatically generates a .qcrc file in your current working directory, scoping the configuration to that directory.
2. By running the following Python, `qualysapi.connect(remember_me_always=True)`. This automatically generates a .qcrc file in your home directory, scoping the configuration to all calls to qualysapi, regardless of the directory.
3. By creating a file called '.qcrc' (for Windows, the default filename is 'config.ini') in your home directory or the directory of the Python script.
4. Multiple configuration files are supported. Just pass the filename in your call: qualysapi.connect('config.txt').
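Putting those options together (credentials and filenames are placeholders):

```python
import qualysapi

# Prompt for credentials interactively (the default).
conn = qualysapi.connect()

# Persist credentials to a .qcrc in the current working directory.
conn = qualysapi.connect(remember_me=True)

# Use an explicit, alternate config file.
conn = qualysapi.connect('config.txt')
```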
Example config file
-------------------
```INI
; Note, it should be possible to omit any of these entries.
[info]
hostname = qualysapi.serviceprovider.com
username = jerry
password = I<3Elaine
# Set the maximum number of retries each connection should attempt. Note, this applies only to failed connections and timeouts, never to requests where the server returns a response.
max_retries = 10
[proxy]
; This section is optional. Leave it out if you're not using a proxy.
; You can use environmental variables as well: http://www.python-requests.org/en/latest/user/advanced/#proxies
; proxy_protocol set to https, if not specified.
proxy_url = proxy.mycorp.com
; proxy_port will override any port specified in proxy_url
proxy_port = 8080
; proxy authentication
proxy_username = kramer
proxy_password = giddy up!
```
License
=======
Apache License, Version 2.0
http://www.apache.org/licenses/LICENSE-2.0.html
Acknowledgements
================
Special thank you to Colin Bell for qualysconnect.

12
deps/qualysapi/changelog.txt vendored Normal file

@@ -0,0 +1,12 @@
3.5.0
- Retooled authentication.
3.4.0
- Allows choice of configuration filenames. Easy to support those with multiple Qualys accounts, and need to automate tasks.
3.3.0
- Remove curl capability. Requests 2.0 and latest urllib3 can handle https proxy.
- Workaround for audience that does not have lxml. Warning: cannot handle lxml.builder E objects for AM & WAS APIs.
3.0.0
- Proxy support.

1
deps/qualysapi/examples/__init__.py vendored Normal file

@@ -0,0 +1 @@
__author__ = 'pbaxi'


@@ -0,0 +1,113 @@
__author__ = 'Parag Baxi <parag.baxi@gmail.com>'
__license__ = 'Apache License 2.0'
import qualysapi
from lxml import objectify
from lxml.builder import E
# Setup connection to QualysGuard API.
qgc = qualysapi.connect('config.txt')
#
# API v1 call: Scan the New York & Las Vegas asset groups
# The call is our request's first parameter.
call = 'scan.php'
# The parameters to append to the url is our request's second parameter.
parameters = {'scan_title': 'Go big or go home', 'asset_groups': 'New York&Las Vegas', 'option': 'Initial+Options'}
# Note qualysapi will automatically convert spaces into plus signs for API v1 & v2.
# Let's call the API and store the result in xml_output.
xml_output = qgc.request(call, parameters, concurrent_scans_retries=2, concurrent_scans_retry_delay=600)
# concurrent_scans_retries: Retry the call this many times if your subscription hits the concurrent scans limit.
# concurrent_scans_retry_delay: Delay in seconds between retries when your subscription hits the concurrent scans limit.
# Example XML response when this happens below:
# <?xml version="1.0" encoding="UTF-8"?>
# <ServiceResponse xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://localhost:50205/qps/rest/app//xsd/3.0/was/wasscan.xsd">
# <responseCode>INVALID_REQUEST</responseCode>
# <responseErrorDetails>
# <errorMessage>You have reached the maximum number of concurrent running scans (10) for your account</errorMessage>
# <errorResolution>Please wait until your previous scans have completed</errorResolution>
# </responseErrorDetails>
#
print(xml_output)
#
# API v1 call: Print out all IPs associated with asset group "Looneyville Texas".
# Note that the question mark at the end is optional.
call = 'asset_group_list.php?'
# We can still use strings for the data (not recommended).
parameters = 'title=Looneyville Texas'
# Let's call the API and store the result in xml_output.
xml_output = qgc.request(call, parameters)
# Let's objectify the xml_output string.
root = objectify.fromstring(xml_output)
# Print out the IPs.
print(root.ASSET_GROUP.SCANIPS.IP.text)
# Prints out:
# 10.0.0.102
#
# API v2 call: Print out DNS name for a range of IPs.
call = '/api/2.0/fo/asset/host/'
parameters = {'action': 'list', 'ips': '10.0.0.10-10.0.0.11'}
xml_output = qgc.request(call, parameters)
root = objectify.fromstring(xml_output)
# Iterate hosts and print out DNS name.
for host in root.RESPONSE.HOST_LIST.HOST:
print(host.IP.text, host.DNS.text)
# Prints out:
# 10.0.0.10 mydns1.qualys.com
# 10.0.0.11 mydns2.qualys.com
#
# API v3 WAS call: Print out number of webapps.
call = '/count/was/webapp'
# Note that this call does not have a payload so we don't send any data parameters.
xml_output = qgc.request(call)
root = objectify.fromstring(xml_output)
# Print out count of webapps.
print(root.count.text)
# Prints out:
# 89
#
# API v3 WAS call: Print out number of webapps containing title 'Supafly'.
call = '/count/was/webapp'
# We can send a string XML for the data.
parameters = '<ServiceRequest><filters><Criteria operator="CONTAINS" field="name">Supafly</Criteria></filters></ServiceRequest>'
xml_output = qgc.request(call, parameters)
root = objectify.fromstring(xml_output)
# Print out count of webapps.
print(root.count.text)
# Prints out:
# 3
#
# API v3 WAS call: Print out number of webapps containing title 'Lightsabertooth Tiger'.
call = '/count/was/webapp'
# We can also send an lxml.builder E object.
parameters = (
E.ServiceRequest(
E.filters(
E.Criteria('Lightsabertooth Tiger', field='name', operator='CONTAINS'))))
xml_output = qgc.request(call, parameters)
root = objectify.fromstring(xml_output)
# Print out count of webapps.
print(root.count.text)
# Prints out:
# 0
# Too bad, because that is an awesome webapp name!
#
# API v3 Asset Management call: Count tags.
call = '/count/am/tag'
xml_output = qgc.request(call)
root = objectify.fromstring(xml_output)
# We can use XPATH to find the count.
print(root.xpath('count')[0].text)
# Prints out:
# 840
#
# API v3 Asset Management call: Find asset by name.
call = '/search/am/tag'
parameters = '''<ServiceRequest>
<preferences>
<limitResults>10</limitResults>
</preferences>
<filters>
<Criteria field="name" operator="CONTAINS">PB</Criteria>
</filters>
</ServiceRequest>'''
xml_output = qgc.request(call, parameters)


@@ -0,0 +1,42 @@
#!/usr/bin/env python
import sys
import logging
import qualysapi
# Questions? See:
# https://bitbucket.org/uWaterloo_IST_ISS/python-qualysconnect
if __name__ == '__main__':
# Basic command line processing.
if len(sys.argv) != 2:
print('A single IPv4 address is expected as the only argument')
sys.exit(2)
# Set the MAXIMUM level of log messages displayed @ runtime.
logging.basicConfig(level=logging.INFO)
# Call helper that creates a connection w/ HTTP-Basic to QualysGuard API.
qgs = qualysapi.connect()
# Logging must be set after instantiation of the connector class.
logger = logging.getLogger('qualysapi.connector')
logger.setLevel(logging.DEBUG)
# Log to sys.stdout.
logger_console = logging.StreamHandler()
logger_console.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
logger_console.setFormatter(formatter)
logger.addHandler(logger_console)
# Formulate a request to the QualysGuard V1 API.
# docs @
# https://community.qualys.com/docs/DOC-1324
# http://www.qualys.com/docs/QualysGuard_API_User_Guide.pdf
#
# Old way still works:
# ret = qgs.request(1,'asset_search.php', "target_ips=%s&"%(sys.argv[1]))
# New way is cleaner:
ret = qgs.request('asset_search.php', {'target_ips': sys.argv[1]})
print(ret)


@@ -0,0 +1,37 @@
#!/usr/bin/env python
import sys
import logging
import qualysapi
if __name__ == '__main__':
# Basic command line processing.
if len(sys.argv) != 3:
print('A report template and scan reference respectively are expected as the only arguments.')
sys.exit(2)
# Set the MAXIMUM level of log messages displayed @ runtime.
logging.basicConfig(level=logging.DEBUG)
# Call helper that creates a connection w/ HTTP-Basic to QualysGuard v1 API
qgs = qualysapi.connect()
# Logging must be set after instantiation of the connector class.
logger = logging.getLogger('qualysapi.connector')
logger.setLevel(logging.DEBUG)
# Log to sys.stdout.
logger_console = logging.StreamHandler()
logger_console.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
logger_console.setFormatter(formatter)
logger.addHandler(logger_console)
# Formulate a request to the QualysGuard V1 API
# docs @
# https://community.qualys.com/docs/DOC-1324
# http://www.qualys.com/docs/QualysGuard_API_User_Guide.pdf
#
ret = qgs.request('/api/2.0/fo/report',{'action': 'launch', 'report_refs': sys.argv[2], 'output_format': 'xml', 'template_id': sys.argv[1], 'report_type': 'Scan'})
print(ret)


@@ -0,0 +1,43 @@
#!/usr/bin/env python
import sys
import logging
import qualysapi
# Questions? See:
# https://bitbucket.org/uWaterloo_IST_ISS/python-qualysconnect
if __name__ == '__main__':
# Basic command line processing.
if len(sys.argv) != 2:
print('A single IPv4 address is expected as the only argument.')
sys.exit(2)
# Set the MAXIMUM level of log messages displayed @ runtime.
logging.basicConfig(level=logging.INFO)
# Call helper that creates a connection w/ HTTP-Basic to QualysGuard v1 API
qgs = qualysapi.connect()
# Logging must be set after instantiation of the connector class.
logger = logging.getLogger('qualysapi.connector')
logger.setLevel(logging.DEBUG)
# Log to sys.stdout.
logger_console = logging.StreamHandler()
logger_console.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
logger_console.setFormatter(formatter)
logger.addHandler(logger_console)
# Formulate a request to the QualysGuard V1 API
# docs @
# https://community.qualys.com/docs/DOC-1324
# http://www.qualys.com/docs/QualysGuard_API_User_Guide.pdf
#
# Old way still works:
# ret = qgs.request(2, "asset/host","?action=list&ips=%s&"%(sys.argv[1]))
# New way is cleaner:
ret = qgs.request('/api/2.0/fo/asset/host',{'action': 'list', 'ips': sys.argv[1]})
print(ret)

201
deps/qualysapi/license vendored Normal file

@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2017 Parag Baxi
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

10
deps/qualysapi/qualysapi/__init__.py vendored Normal file

@@ -0,0 +1,10 @@
# This is the version string assigned to the entire egg post
# setup.py install
# Ownership and Copyright Information.
from __future__ import absolute_import
__author__ = "Parag Baxi <parag.baxi@gmail.com>"
__copyright__ = "Copyright 2011-2013, Parag Baxi"
__license__ = "BSD-new"
from qualysapi.util import connect

181
deps/qualysapi/qualysapi/api_actions.py vendored Normal file

@@ -0,0 +1,181 @@
from __future__ import absolute_import
from lxml import objectify
import qualysapi.api_objects
from qualysapi.api_objects import *
class QGActions(object):
def getHost(self, host):
call = '/api/2.0/fo/asset/host/'
parameters = {'action': 'list', 'ips': host, 'details': 'All'}
hostData = objectify.fromstring(self.request(call, parameters)).RESPONSE
try:
hostData = hostData.HOST_LIST.HOST
return Host(hostData.DNS, hostData.ID, hostData.IP, hostData.LAST_VULN_SCAN_DATETIME, hostData.NETBIOS, hostData.OS, hostData.TRACKING_METHOD)
except AttributeError:
return Host("", "", host, "never", "", "", "")
def getHostRange(self, start, end):
call = '/api/2.0/fo/asset/host/'
parameters = {'action': 'list', 'ips': start + '-' + end}
hostData = objectify.fromstring(self.request(call, parameters))
hostArray = []
for host in hostData.RESPONSE.HOST_LIST.HOST:
hostArray.append(Host(host.DNS, host.ID, host.IP, host.LAST_VULN_SCAN_DATETIME, host.NETBIOS, host.OS, host.TRACKING_METHOD))
return hostArray
def listAssetGroups(self, groupName=''):
call = 'asset_group_list.php'
if groupName == '':
agData = objectify.fromstring(self.request(call))
else:
agData = objectify.fromstring(self.request(call, 'title=' + groupName)).RESPONSE
groupsArray = []
scanipsArray = []
scandnsArray = []
scannersArray = []
for group in agData.ASSET_GROUP:
try:
for scanip in group.SCANIPS:
scanipsArray.append(scanip.IP)
except AttributeError:
scanipsArray = [] # No IPs defined to scan.
try:
for scanner in group.SCANNER_APPLIANCES.SCANNER_APPLIANCE:
scannersArray.append(scanner.SCANNER_APPLIANCE_NAME)
except AttributeError:
scannersArray = [] # No scanner appliances defined for this group.
try:
for dnsName in group.SCANDNS:
scandnsArray.append(dnsName.DNS)
except AttributeError:
scandnsArray = [] # No DNS names assigned to group.
groupsArray.append(AssetGroup(group.BUSINESS_IMPACT, group.ID, group.LAST_UPDATE, scanipsArray, scandnsArray, scannersArray, group.TITLE))
return groupsArray
def listReportTemplates(self):
call = 'report_template_list.php'
rtData = objectify.fromstring(self.request(call))
templatesArray = []
for template in rtData.REPORT_TEMPLATE:
templatesArray.append(ReportTemplate(template.GLOBAL, template.ID, template.LAST_UPDATE, template.TEMPLATE_TYPE, template.TITLE, template.TYPE, template.USER))
return templatesArray
def listReports(self, id=0):
call = '/api/2.0/fo/report'
if id == 0:
parameters = {'action': 'list'}
repData = objectify.fromstring(self.request(call, parameters)).RESPONSE
reportsArray = []
for report in repData.REPORT_LIST.REPORT:
reportsArray.append(Report(report.EXPIRATION_DATETIME, report.ID, report.LAUNCH_DATETIME, report.OUTPUT_FORMAT, report.SIZE, report.STATUS, report.TYPE, report.USER_LOGIN))
return reportsArray
else:
parameters = {'action': 'list', 'id': id}
repData = objectify.fromstring(self.request(call, parameters)).RESPONSE.REPORT_LIST.REPORT
return Report(repData.EXPIRATION_DATETIME, repData.ID, repData.LAUNCH_DATETIME, repData.OUTPUT_FORMAT, repData.SIZE, repData.STATUS, repData.TYPE, repData.USER_LOGIN)
def notScannedSince(self, days):
call = '/api/2.0/fo/asset/host/'
parameters = {'action': 'list', 'details': 'All'}
hostData = objectify.fromstring(self.request(call, parameters))
hostArray = []
today = datetime.date.today()
for host in hostData.RESPONSE.HOST_LIST.HOST:
last_scan = str(host.LAST_VULN_SCAN_DATETIME).split('T')[0]
last_scan = datetime.date(int(last_scan.split('-')[0]), int(last_scan.split('-')[1]), int(last_scan.split('-')[2]))
if (today - last_scan).days >= days:
hostArray.append(Host(host.DNS, host.ID, host.IP, host.LAST_VULN_SCAN_DATETIME, host.NETBIOS, host.OS, host.TRACKING_METHOD))
return hostArray
def addIP(self, ips, vmpc):
# 'ips' parameter accepts comma-separated list of IP addresses.
# 'vmpc' parameter accepts 'vm', 'pc', or 'both'. (Vulnerability Management, Policy Compliance, or both)
call = '/api/2.0/fo/asset/ip/'
enablevm = 1
enablepc = 0
if vmpc == 'pc':
enablevm = 0
enablepc = 1
elif vmpc == 'both':
enablevm = 1
enablepc = 1
parameters = {'action': 'add', 'ips': ips, 'enable_vm': enablevm, 'enable_pc': enablepc}
self.request(call, parameters)
def listScans(self, launched_after="", state="", target="", type="", user_login=""):
# 'launched_after' parameter accepts a date in the format: YYYY-MM-DD
# 'state' parameter accepts "Running", "Paused", "Canceled", "Finished", "Error", "Queued", and "Loading".
# 'target' parameter accepts a string
# 'type' parameter accepts "On-Demand", and "Scheduled".
# 'user_login' parameter accepts a user name (string)
call = '/api/2.0/fo/scan/'
parameters = {'action': 'list', 'show_ags': 1, 'show_op': 1, 'show_status': 1}
if launched_after != "":
parameters['launched_after_datetime'] = launched_after
if state != "":
parameters['state'] = state
if target != "":
parameters['target'] = target
if type != "":
parameters['type'] = type
if user_login != "":
parameters['user_login'] = user_login
scanlist = objectify.fromstring(self.request(call, parameters))
scanArray = []
for scan in scanlist.RESPONSE.SCAN_LIST.SCAN:
try:
agList = []
for ag in scan.ASSET_GROUP_TITLE_LIST.ASSET_GROUP_TITLE:
agList.append(ag)
except AttributeError:
agList = []
scanArray.append(Scan(agList, scan.DURATION, scan.LAUNCH_DATETIME, scan.OPTION_PROFILE.TITLE, scan.PROCESSED, scan.REF, scan.STATUS, scan.TARGET, scan.TITLE, scan.TYPE, scan.USER_LOGIN))
return scanArray
def launchScan(self, title, option_title, iscanner_name, asset_groups="", ip=""):
# TODO: Add ability to scan by tag.
call = '/api/2.0/fo/scan/'
parameters = {'action': 'launch', 'scan_title': title, 'option_title': option_title, 'iscanner_name': iscanner_name, 'ip': ip, 'asset_groups': asset_groups}
if ip == "":
parameters.pop("ip")
if asset_groups == "":
parameters.pop("asset_groups")
scan_ref = objectify.fromstring(self.request(call, parameters)).RESPONSE.ITEM_LIST.ITEM[1].VALUE
call = '/api/2.0/fo/scan/'
parameters = {'action': 'list', 'scan_ref': scan_ref, 'show_status': 1, 'show_ags': 1, 'show_op': 1}
scan = objectify.fromstring(self.request(call, parameters)).RESPONSE.SCAN_LIST.SCAN
try:
agList = []
for ag in scan.ASSET_GROUP_TITLE_LIST.ASSET_GROUP_TITLE:
agList.append(ag)
except AttributeError:
agList = []
return Scan(agList, scan.DURATION, scan.LAUNCH_DATETIME, scan.OPTION_PROFILE.TITLE, scan.PROCESSED, scan.REF, scan.STATUS, scan.TARGET, scan.TITLE, scan.TYPE, scan.USER_LOGIN)
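Since QGConnector inherits from QGActions, these helpers are available directly on a connected session; a brief usage sketch, with connection details assumed:

```python
import qualysapi

qgc = qualysapi.connect()  # prompts for credentials by default

# Hosts with no vulnerability scan in the last 30 days.
for host in qgc.notScannedSince(30):
    print(host.ip, host.last_scan)

# Finished scans launched since the start of 2017.
for scan in qgc.listScans(launched_after='2017-01-01', state='Finished'):
    print(scan.ref, scan.title, scan.status)
```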

155
deps/qualysapi/qualysapi/api_methods.py vendored Normal file

@@ -0,0 +1,155 @@
from __future__ import absolute_import
__author__ = 'pbaxi'
from collections import defaultdict
api_methods = defaultdict(set)
api_methods['1'] = set([
'about.php',
'action_log_report.php',
'asset_data_report.php',
'asset_domain.php',
'asset_domain_list.php',
'asset_group_delete.php',
'asset_group_list.php',
'asset_ip_list.php',
'asset_range_info.php',
'asset_search.php',
'get_host_info.php',
'ignore_vuln.php',
'iscanner_list.php',
'knowledgebase_download.php',
'map-2.php',
'map.php',
'map_report.php',
'map_report_list.php',
'password_change.php',
'scan.php',
'scan_cancel.php',
'scan_options.php',
'scan_report.php',
'scan_report_delete.php',
'scan_report_list.php',
'scan_running_list.php',
'scan_target_history.php',
'scheduled_scans.php',
'ticket_delete.php',
'ticket_edit.php',
'ticket_list.php',
'ticket_list_deleted.php',
'time_zone_code.php',
'user.php',
'user_list.php',
])
# API v1 POST methods.
api_methods['1 post'] = set([
'action_log_report.php',
'asset_group.php',
'asset_ip.php',
'ignore_vuln.php',
'knowledgebase_download.php',
'map-2.php',
'map.php',
'password_change.php',
'scan.php',
'scan_report.php',
'scan_target_history.php',
'scheduled_scans.php',
'ticket_delete.php',
'ticket_edit.php',
'ticket_list.php',
'ticket_list_deleted.php',
'user.php',
'user_list.php',
])
# API v2 methods (they're all POST).
api_methods['2'] = set([
'api/2.0/fo/appliance/',
'api/2.0/fo/asset/excluded_ip/',
'api/2.0/fo/asset/excluded_ip/history/',
'api/2.0/fo/asset/host/',
'api/2.0/fo/asset/host/cyberscope/',
'api/2.0/fo/asset/host/cyberscope/fdcc/policy/',
'api/2.0/fo/asset/host/cyberscope/fdcc/scan/',
'api/2.0/fo/asset/host/vm/detection/',
'api/2.0/fo/asset/ip/',
'api/2.0/fo/asset/ip/v4_v6/',
'api/2.0/fo/asset/vhost/',
'api/2.0/fo/auth/',
# 'api/2.0/fo/auth/{type}/', # Added below.
'api/2.0/fo/compliance/',
'api/2.0/fo/compliance/control',
'api/2.0/fo/compliance/fdcc/policy',
'api/2.0/fo/compliance/policy/',
'api/2.0/fo/compliance/posture/info/',
'api/2.0/fo/compliance/scap/arf/',
'api/2.0/fo/knowledge_base/vuln/',
'api/2.0/fo/report/',
'api/2.0/fo/report/scorecard/',
'api/2.0/fo/scan/',
'api/2.0/fo/scan/compliance/',
'api/2.0/fo/session/',
'api/2.0/fo/setup/restricted_ips/',
])
for auth_type in set([
'ibm_db2',
'ms_sql',
'oracle',
'oracle_listener',
'snmp',
'unix',
'windows',
]):
api_methods['2'].add('api/2.0/fo/auth/%s/' % auth_type)
# WAS GET methods when no POST data.
api_methods['was no data get'] = set([
'count/was/report',
'count/was/wasscan',
'count/was/wasscanschedule',
'count/was/webapp',
'download/was/report/',
'download/was/wasscan/',
])
# WAS GET methods.
api_methods['was get'] = set([
'download/was/report/',
'download/was/wasscan/',
'get/was/report/',
'get/was/wasscan/',
'get/was/wasscanschedule/',
'get/was/webapp/',
'status/was/report/',
'status/was/wasscan/',
])
# Asset Management GET methods.
api_methods['am get'] = set([
'count/am/asset',
'count/am/hostasset',
'count/am/tag',
'get/am/asset/',
'get/am/hostasset/',
'get/am/tag/',
])
# Asset Management v2 GET methods.
api_methods['am2 get'] = set([
'get/am/asset/',
'get/am/hostasset/',
'get/am/tag/',
'get/am/hostinstancevuln/',
'get/am/assetdataconnector/',
'get/am/awsassetdataconnector/',
'get/am/awsauthrecord/',
])
# Keep track of methods with ending slashes to autocorrect user when they forgot slash.
api_methods_with_trailing_slash = defaultdict(set)
for method_group in set(['1', '2', 'was', 'am', 'am2']):
for method in api_methods[method_group]:
if method[-1] == '/':
# Add applicable method with api_version preceding it.
# Example:
# WAS API has 'get/was/webapp/'.
# method_group = 'was get'
# method_group.split()[0] = 'was'
# Take off slash to match user provided method.
# api_methods_with_trailing_slash['was'] contains 'get/was/webapp'
api_methods_with_trailing_slash[method_group.split()[0]].add(method[:-1])

120
deps/qualysapi/qualysapi/api_objects.py vendored Normal file

@@ -0,0 +1,120 @@
from __future__ import absolute_import
import datetime
from lxml import objectify
class Host(object):
def __init__(self, dns, id, ip, last_scan, netbios, os, tracking_method):
self.dns = str(dns)
self.id = int(id)
self.ip = str(ip)
last_scan = str(last_scan).replace('T', ' ').replace('Z', '').split(' ')
date = last_scan[0].split('-')
time = last_scan[1].split(':')
self.last_scan = datetime.datetime(int(date[0]), int(date[1]), int(date[2]), int(time[0]), int(time[1]), int(time[2]))
self.netbios = str(netbios)
self.os = str(os)
self.tracking_method = str(tracking_method)
class AssetGroup(object):
def __init__(self, business_impact, id, last_update, scanips, scandns, scanner_appliances, title):
self.business_impact = str(business_impact)
self.id = int(id)
self.last_update = str(last_update)
self.scanips = scanips
self.scandns = scandns
self.scanner_appliances = scanner_appliances
self.title = str(title)
def addAsset(self, conn, ip):
call = '/api/2.0/fo/asset/group/'
parameters = {'action': 'edit', 'id': self.id, 'add_ips': ip}
conn.request(call, parameters)
self.scanips.append(ip)
def setAssets(self, conn, ips):
call = '/api/2.0/fo/asset/group/'
parameters = {'action': 'edit', 'id': self.id, 'set_ips': ips}
conn.request(call, parameters)
class ReportTemplate(object):
def __init__(self, isGlobal, id, last_update, template_type, title, type, user):
self.isGlobal = int(isGlobal)
self.id = int(id)
self.last_update = str(last_update).replace('T', ' ').replace('Z', '').split(' ')
self.template_type = template_type
self.title = title
self.type = type
self.user = user.LOGIN
class Report(object):
def __init__(self, expiration_datetime, id, launch_datetime, output_format, size, status, type, user_login):
self.expiration_datetime = str(expiration_datetime).replace('T', ' ').replace('Z', '').split(' ')
self.id = int(id)
self.launch_datetime = str(launch_datetime).replace('T', ' ').replace('Z', '').split(' ')
self.output_format = output_format
self.size = size
self.status = status.STATE
self.type = type
self.user_login = user_login
def download(self, conn):
call = '/api/2.0/fo/report'
parameters = {'action': 'fetch', 'id': self.id}
if self.status == 'Finished':
return conn.request(call, parameters)
class Scan(object):
def __init__(self, assetgroups, duration, launch_datetime, option_profile, processed, ref, status, target, title, type, user_login):
self.assetgroups = assetgroups
self.duration = str(duration)
launch_datetime = str(launch_datetime).replace('T', ' ').replace('Z', '').split(' ')
date = launch_datetime[0].split('-')
time = launch_datetime[1].split(':')
self.launch_datetime = datetime.datetime(int(date[0]), int(date[1]), int(date[2]), int(time[0]), int(time[1]), int(time[2]))
self.option_profile = str(option_profile)
self.processed = int(processed)
self.ref = str(ref)
self.status = str(status.STATE)
self.target = str(target).split(', ')
self.title = str(title)
self.type = str(type)
self.user_login = str(user_login)
def cancel(self, conn):
cancelled_statuses = ['Cancelled', 'Finished', 'Error']
if self.status in cancelled_statuses:
raise ValueError("Scan cannot be cancelled because its status is " + self.status)
else:
call = '/api/2.0/fo/scan/'
parameters = {'action': 'cancel', 'scan_ref': self.ref}
conn.request(call, parameters)
parameters = {'action': 'list', 'scan_ref': self.ref, 'show_status': 1}
self.status = objectify.fromstring(conn.request(call, parameters)).RESPONSE.SCAN_LIST.SCAN.STATUS.STATE
def pause(self, conn):
if self.status != "Running":
raise ValueError("Scan cannot be paused because its status is " + self.status)
else:
call = '/api/2.0/fo/scan/'
parameters = {'action': 'pause', 'scan_ref': self.ref}
conn.request(call, parameters)
parameters = {'action': 'list', 'scan_ref': self.ref, 'show_status': 1}
self.status = objectify.fromstring(conn.request(call, parameters)).RESPONSE.SCAN_LIST.SCAN.STATUS.STATE
def resume(self, conn):
if self.status != "Paused":
raise ValueError("Scan cannot be resumed because its status is " + self.status)
else:
call = '/api/2.0/fo/scan/'
parameters = {'action': 'resume', 'scan_ref': self.ref}
conn.request(call, parameters)
parameters = {'action': 'list', 'scan_ref': self.ref, 'show_status': 1}
self.status = objectify.fromstring(conn.request(call, parameters)).RESPONSE.SCAN_LIST.SCAN.STATUS.STATE
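Because each control method re-queries the scan list after acting, a Scan object keeps its status current. A short sketch, assuming conn is a qualysapi connection and scan came from listScans():

```python
# Pause a running scan, then resume it; each call refreshes scan.status.
if scan.status == 'Running':
    scan.pause(conn)
if scan.status == 'Paused':
    scan.resume(conn)
```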

221
deps/qualysapi/qualysapi/config.py vendored Normal file

@@ -0,0 +1,221 @@
""" Module providing a single class (QualysConnectConfig) that parses a config
file and provides the information required to build QualysGuard sessions.
"""
from __future__ import absolute_import
from __future__ import print_function
import os
import stat
import getpass
import logging
from six.moves import input
from six.moves.configparser import *
import qualysapi.settings as qcs
# Setup module level logging.
logger = logging.getLogger(__name__)
# try:
# from requests_ntlm import HttpNtlmAuth
# except ImportError, e:
# logger.warning('Warning: Cannot support NTLM authentication.')
__author__ = "Parag Baxi <parag.baxi@gmail.com> & Colin Bell <colin.bell@uwaterloo.ca>"
__updated_by__ = "Austin Taylor <vulnWhisperer@austintaylor.io>"
__copyright__ = "Copyright 2011-2013, Parag Baxi & University of Waterloo"
__license__ = "BSD-new"
class QualysConnectConfig:
""" Class to create a ConfigParser and read user/password details
from an ini file.
"""
def __init__(self, filename=qcs.default_filename, remember_me=False, remember_me_always=False):
self._cfgfile = None
# Prioritize local directory filename.
# Check for file existence.
if os.path.exists(filename):
self._cfgfile = filename
elif os.path.exists(os.path.join(os.path.expanduser("~"), filename)):
# Set home path for file.
self._cfgfile = os.path.join(os.path.expanduser("~"), filename)
# create ConfigParser to combine defaults and input from config file.
self._cfgparse = ConfigParser(qcs.defaults)
if self._cfgfile:
self._cfgfile = os.path.realpath(self._cfgfile)
mode = stat.S_IMODE(os.stat(self._cfgfile)[stat.ST_MODE])
# apply bitmask to current mode to check ONLY user access permissions.
if (mode & (stat.S_IRWXG | stat.S_IRWXO)) != 0:
logging.warning('%s permissions allow more than user access.' % (filename,))
self._cfgparse.read(self._cfgfile)
# If the 'qualys' section doesn't exist, create it.
if not self._cfgparse.has_section('qualys'):
self._cfgparse.add_section('qualys')
# Use default hostname (if one isn't provided).
if not self._cfgparse.has_option('qualys', 'hostname'):
if self._cfgparse.has_option('DEFAULT', 'hostname'):
hostname = self._cfgparse.get('DEFAULT', 'hostname')
self._cfgparse.set('qualys', 'hostname', hostname)
else:
raise Exception("No 'hostname' set. QualysConnect does not know who to connect to.")
# Use default max_retries (if one isn't provided).
if not self._cfgparse.has_option('qualys', 'max_retries'):
self.max_retries = qcs.defaults['max_retries']
else:
self.max_retries = self._cfgparse.get('qualys', 'max_retries')
try:
self.max_retries = int(self.max_retries)
except Exception:
logger.error('Value max_retries must be an integer.')
print('Value max_retries must be an integer.')
exit(1)
self._cfgparse.set('qualys', 'max_retries', str(self.max_retries))
self.max_retries = int(self.max_retries)
# Get template ID... the user will need to set this to pull back CSV reports
if not self._cfgparse.has_option('qualys', 'template_id'):
self.report_template_id = qcs.defaults['template_id']
else:
self.report_template_id = self._cfgparse.get('qualys', 'template_id')
try:
self.report_template_id = int(self.report_template_id)
except Exception:
logger.error('Report template ID must be set and be an integer.')
print('Value template ID must be an integer.')
exit(1)
self._cfgparse.set('qualys', 'template_id', str(self.report_template_id))
self.report_template_id = int(self.report_template_id)
# Proxy support
proxy_config = proxy_url = proxy_protocol = proxy_port = proxy_username = proxy_password = None
# User requires proxy?
if self._cfgparse.has_option('proxy', 'proxy_url'):
proxy_url = self._cfgparse.get('proxy', 'proxy_url')
# Remove protocol prefix from url if included.
for prefix in ('http://', 'https://'):
if proxy_url.startswith(prefix):
proxy_protocol = prefix
proxy_url = proxy_url[len(prefix):]
# Default proxy protocol is https.
if not proxy_protocol:
proxy_protocol = 'https://'
# Check for proxy port request.
if ':' in proxy_url:
# Proxy port already specified in url.
# Set proxy port.
proxy_port = proxy_url[proxy_url.index(':') + 1:]
# Remove proxy port from proxy url.
proxy_url = proxy_url[:proxy_url.index(':')]
if self._cfgparse.has_option('proxy', 'proxy_port'):
# Proxy requires specific port.
if proxy_port:
# Warn that a proxy port was already specified in the url.
proxy_port_url = proxy_port
proxy_port = self._cfgparse.get('proxy', 'proxy_port')
logger.warning('Proxy port from url overwritten by specified proxy_port from config:')
logger.warning('%s --> %s' % (proxy_port_url, proxy_port))
else:
proxy_port = self._cfgparse.get('proxy', 'proxy_port')
if not proxy_port:
# No proxy port specified.
if proxy_protocol == 'http://':
# Use default HTTP Proxy port.
proxy_port = '8080'
else:
# Use default HTTPS Proxy port.
proxy_port = '443'
# Check for proxy authentication request.
if self._cfgparse.has_option('proxy', 'proxy_username'):
# Proxy requires username & password.
proxy_username = self._cfgparse.get('proxy', 'proxy_username')
proxy_password = self._cfgparse.get('proxy', 'proxy_password')
# Not sure if this use case below is valid.
# # Support proxy with username and empty password.
# try:
# proxy_password = self._cfgparse.get('proxy','proxy_password')
# except NoOptionError, e:
# # Set empty password.
# proxy_password = ''
# Sample proxy config:
# 'http://user:pass@10.10.1.10:3128'
if proxy_url:
# Proxy requested.
proxy_config = proxy_url
if proxy_port:
# Proxy port requested.
proxy_config += ':' + proxy_port
if proxy_username:
# Proxy authentication requested.
proxy_config = proxy_username + ':' + proxy_password + '@' + proxy_config
# Prefix by proxy protocol.
proxy_config = proxy_protocol + proxy_config
# Set up proxy if applicable.
if proxy_config:
self.proxies = {'https': proxy_config}
else:
self.proxies = None
# ask username (if one doesn't exist)
if not self._cfgparse.has_option('qualys', 'username'):
username = input('QualysGuard Username: ')
self._cfgparse.set('qualys', 'username', username)
# ask password (if one doesn't exist)
if not self._cfgparse.has_option('qualys', 'password'):
password = getpass.getpass('QualysGuard Password: ')
self._cfgparse.set('qualys', 'password', password)
logging.debug(self._cfgparse.items('qualys'))
if remember_me or remember_me_always:
# Let's create that config file for next time...
# Where to store this?
if remember_me:
# Store in current working directory.
config_path = filename
if remember_me_always:
# Store in home directory.
config_path = os.path.join(os.path.expanduser("~"), filename)
if not os.path.exists(config_path):
# Write the file only if it doesn't already exist.
# http://stackoverflow.com/questions/5624359/write-file-with-specific-permissions-in-python
mode = stat.S_IRUSR | stat.S_IWUSR # This is 0o600 in octal and 384 in decimal.
umask_original = os.umask(0)
try:
config_file = os.fdopen(os.open(config_path, os.O_WRONLY | os.O_CREAT, mode), 'w')
finally:
os.umask(umask_original)
# Add the settings to the structure of the file, and lets write it out...
self._cfgparse.write(config_file)
config_file.close()
def get_config_filename(self):
return self._cfgfile
def get_config(self):
return self._cfgparse
def get_auth(self):
''' Returns username from the configfile. '''
return (self._cfgparse.get('qualys', 'username'), self._cfgparse.get('qualys', 'password'))
def get_hostname(self):
''' Returns hostname. '''
return self._cfgparse.get('qualys', 'hostname')
def get_template_id(self):
return self._cfgparse.get('qualys','template_id')

deps/qualysapi/qualysapi/connector.py (vendored, new file, 363 lines)
@@ -0,0 +1,363 @@
from __future__ import absolute_import
from __future__ import print_function
__author__ = 'Parag Baxi <parag.baxi@gmail.com>'
__copyright__ = 'Copyright 2013, Parag Baxi'
__license__ = 'Apache License 2.0'
""" Module that contains classes for setting up connections to QualysGuard API
and requesting data from it.
"""
import logging
import time
try:
from urllib.parse import urlparse, parse_qs
except ImportError:
from urlparse import urlparse, parse_qs
from collections import defaultdict
import requests
import qualysapi.version
import qualysapi.api_methods
import qualysapi.api_actions
import qualysapi.api_actions as api_actions
# Setup module level logging.
logger = logging.getLogger(__name__)
try:
from lxml import etree
except ImportError as e:
logger.warning(
'Warning: Cannot consume lxml.builder E objects without lxml. Send XML strings for AM & WAS API calls.')
class QGConnector(api_actions.QGActions):
""" Qualys Connection class which allows requests to the QualysGuard API using HTTP-Basic Authentication (over SSL).
"""
def __init__(self, auth, server='qualysapi.qualys.com', proxies=None, max_retries=3):
# Read username & password from file, if possible.
self.auth = auth
# Remember QualysGuard API server.
self.server = server
# Remember rate limits per call.
self.rate_limit_remaining = defaultdict(int)
# api_methods: Define method algorithm in a dict of set.
# Naming convention: api_methods[api_version optional_blah] due to api_methods_with_trailing_slash testing.
self.api_methods = qualysapi.api_methods.api_methods
#
# Keep track of methods with trailing slashes to autocorrect users who forget the slash.
self.api_methods_with_trailing_slash = qualysapi.api_methods.api_methods_with_trailing_slash
self.proxies = proxies
logger.debug('proxies = \n%s' % proxies)
# Set up requests max_retries.
logger.debug('max_retries = \n%s' % max_retries)
self.session = requests.Session()
http_max_retries = requests.adapters.HTTPAdapter(max_retries=max_retries)
https_max_retries = requests.adapters.HTTPAdapter(max_retries=max_retries)
self.session.mount('http://', http_max_retries)
self.session.mount('https://', https_max_retries)
def __call__(self):
return self
def format_api_version(self, api_version):
""" Return QualysGuard API version for api_version specified.
"""
# Convert to int.
if type(api_version) == str:
api_version = api_version.lower()
if api_version[0] == 'v' and api_version[1].isdigit():
# Remove first 'v' in case the user typed 'v1' or 'v2', etc.
api_version = api_version[1:]
# Check for input matching Qualys modules.
if api_version in ('asset management', 'assets', 'tag', 'tagging', 'tags'):
# Convert to Asset Management API.
api_version = 'am'
elif api_version == 'am2':
# Convert to Asset Management API v2
api_version = 'am2'
elif api_version in ('webapp', 'web application scanning', 'webapp scanning'):
# Convert to WAS API.
api_version = 'was'
elif api_version in ('pol', 'pc'):
# Convert PC module to API number 2.
api_version = 2
else:
api_version = int(api_version)
return api_version
def which_api_version(self, api_call):
""" Return QualysGuard API version for api_call specified.
"""
# Leverage patterns of calls to API methods.
if api_call.endswith('.php'):
# API v1.
return 1
elif api_call.startswith('api/2.0/'):
# API v2.
return 2
elif '/am/' in api_call:
# Asset Management API.
return 'am'
elif '/was/' in api_call:
# WAS API.
return 'was'
return False
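# Illustration of the pattern matching above (example calls only):
#   'scan.php'         -> 1
#   'api/2.0/fo/scan/' -> 2
#   'search/am/asset'  -> 'am'
#   'count/was/webapp' -> 'was'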
def url_api_version(self, api_version):
""" Return base API url string for the QualysGuard api_version and server.
"""
# Set base url depending on API version.
if api_version == 1:
# QualysGuard API v1 url.
url = "https://%s/msp/" % (self.server,)
elif api_version == 2:
# QualysGuard API v2 url.
url = "https://%s/" % (self.server,)
elif api_version == 'was':
# QualysGuard REST v3 API url (Portal API).
url = "https://%s/qps/rest/3.0/" % (self.server,)
elif api_version == 'am':
# QualysGuard REST v1 API url (Portal API).
url = "https://%s/qps/rest/1.0/" % (self.server,)
elif api_version == 'am2':
# QualysGuard REST v1 API url (Portal API).
url = "https://%s/qps/rest/2.0/" % (self.server,)
else:
raise Exception("Unknown QualysGuard API Version Number (%s)" % (api_version,))
logger.debug("Base url =\n%s" % (url))
return url
def format_http_method(self, api_version, api_call, data):
""" Return QualysGuard API http method, with POST preferred..
"""
# Define get methods for automatic http request methodology.
#
# All API v2 requests are POST methods.
if api_version == 2:
return 'post'
elif api_version == 1:
if api_call in self.api_methods['1 post']:
return 'post'
else:
return 'get'
elif api_version == 'was':
# WAS API call.
# Because WAS API enables user to GET API resources in URI, let's chop off the resource.
# '/download/was/report/18823' --> '/download/was/report/'
api_call_endpoint = api_call[:api_call.rfind('/') + 1]
if api_call_endpoint in self.api_methods['was get']:
return 'get'
# Post calls with no payload will result in HTTPError: 415 Client Error: Unsupported Media Type.
if data is None:
# No post data. Some calls change to GET with no post data.
if api_call_endpoint in self.api_methods['was no data get']:
return 'get'
else:
return 'post'
else:
# Call with post data.
return 'post'
else:
# Asset Management API call.
if api_call in self.api_methods['am get']:
return 'get'
else:
return 'post'
def preformat_call(self, api_call):
""" Return properly formatted QualysGuard API call.
"""
# Remove possible starting slashes or trailing question marks in call.
api_call_formatted = api_call.lstrip('/')
api_call_formatted = api_call_formatted.rstrip('?')
if api_call != api_call_formatted:
# Show difference
logger.debug('api_call post strip =\n%s' % api_call_formatted)
return api_call_formatted
def format_call(self, api_version, api_call):
""" Return properly formatted QualysGuard API call according to api_version etiquette.
"""
# Remove possible starting slashes or trailing question marks in call.
api_call = api_call.lstrip('/')
api_call = api_call.rstrip('?')
logger.debug('api_call post strip =\n%s' % api_call)
# Make sure call always ends in slash for API v2 calls.
if (api_version == 2 and api_call[-1] != '/'):
# Add slash.
logger.debug('Adding "/" to api_call.')
api_call += '/'
if api_call in self.api_methods_with_trailing_slash[api_version]:
# Add slash.
logger.debug('Adding "/" to api_call.')
api_call += '/'
return api_call
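# Illustration, assuming the call is not in api_methods_with_trailing_slash:
#   format_call(2, '/api/2.0/fo/report?') -> 'api/2.0/fo/report/'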
def format_payload(self, api_version, data):
""" Return appropriate QualysGuard API call.
"""
# Check if payload is for API v1 or API v2.
if (api_version in (1, 2)):
# Check if string type.
if type(data) == str:
# Convert to dictionary.
logger.debug('Converting string to dict:\n%s' % data)
# Remove possible starting question mark & ending ampersands.
data = data.lstrip('?')
data = data.rstrip('&')
# Convert query string to dictionary (urlparse would return a ParseResult, not the payload dict).
data = parse_qs(data)
logger.debug('Converted:\n%s' % str(data))
elif api_version in ('am', 'was', 'am2'):
if type(data) == etree._Element:
logger.debug('Converting lxml.builder.E to string')
data = etree.tostring(data)
logger.debug('Converted:\n%s' % data)
return data
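# Illustration: format_payload(2, '?action=list&state=Running')
#   -> {'action': ['list'], 'state': ['Running']} (parse_qs maps keys to lists)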
def request(self, api_call, data=None, api_version=None, http_method=None, concurrent_scans_retries=0,
concurrent_scans_retry_delay=0):
""" Return QualysGuard API response.
"""
logger.debug('api_call =\n%s' % api_call)
logger.debug('api_version =\n%s' % api_version)
logger.debug('data %s =\n %s' % (type(data), str(data)))
logger.debug('http_method =\n%s' % http_method)
logger.debug('concurrent_scans_retries =\n%s' % str(concurrent_scans_retries))
logger.debug('concurrent_scans_retry_delay =\n%s' % str(concurrent_scans_retry_delay))
concurrent_scans_retries = int(concurrent_scans_retries)
concurrent_scans_retry_delay = int(concurrent_scans_retry_delay)
#
# Determine API version.
# Preformat call.
api_call = self.preformat_call(api_call)
if api_version:
# API version specified, format API version inputted.
api_version = self.format_api_version(api_version)
else:
# API version not specified, determine automatically.
api_version = self.which_api_version(api_call)
#
# Set up base url.
url = self.url_api_version(api_version)
#
# Set up headers.
headers = {"X-Requested-With": "QualysAPI (python) v%s - VulnWhisperer" % (qualysapi.version.__version__,)}
logger.debug('headers =\n%s' % (str(headers)))
# Portal API takes in XML text, requiring custom header.
if api_version in ('am', 'was', 'am2'):
headers['Content-type'] = 'text/xml'
#
# Set up http request method, if not specified.
if not http_method:
http_method = self.format_http_method(api_version, api_call, data)
logger.debug('http_method =\n%s' % http_method)
#
# Format API call.
api_call = self.format_call(api_version, api_call)
logger.debug('api_call =\n%s' % (api_call))
# Append api_call to url.
url += api_call
#
# Format data, if applicable.
if data is not None:
data = self.format_payload(api_version, data)
# Make request at least once (more if concurrent_retry is enabled).
retries = 0
#
# set a warning threshold for the rate limit
rate_warn_threshold = 10
while retries <= concurrent_scans_retries:
# Make request.
logger.debug('url =\n%s' % (str(url)))
logger.debug('data =\n%s' % (str(data)))
logger.debug('headers =\n%s' % (str(headers)))
if http_method == 'get':
# GET
logger.debug('GET request.')
request = self.session.get(url, params=data, auth=self.auth, headers=headers, proxies=self.proxies)
else:
# POST
logger.debug('POST request.')
# Make POST request.
request = self.session.post(url, data=data, auth=self.auth, headers=headers, proxies=self.proxies)
logger.debug('response headers =\n%s' % (str(request.headers)))
#
# Remember how many calls the user has left against this api_call.
try:
self.rate_limit_remaining[api_call] = int(request.headers['x-ratelimit-remaining'])
logger.debug('rate limit for api_call, %s = %s' % (api_call, self.rate_limit_remaining[api_call]))
if (self.rate_limit_remaining[api_call] <= rate_warn_threshold) and (self.rate_limit_remaining[api_call] > 0):
logger.warning('Rate limit is about to be reached (remaining api calls = %s)' % self.rate_limit_remaining[api_call])
elif self.rate_limit_remaining[api_call] <= 0:
logger.critical('ATTENTION! RATE LIMIT HAS BEEN REACHED (remaining api calls = %s)!' % self.rate_limit_remaining[api_call])
except KeyError as e:
# Likely a bad api_call.
logger.debug(e)
pass
except TypeError as e:
# Likely an asset search api_call.
logger.debug(e)
pass
# Response received.
response = request.text  # .text, not str(.content), so Python 3 does not produce a "b'...'" string
logger.debug('response text =\n%s' % (response))
# Keep track of how many retries.
retries += 1
# Check for concurrent scans limit.
if not ('<responseCode>INVALID_REQUEST</responseCode>' in response and
'<errorMessage>You have reached the maximum number of concurrent running scans' in response and
'<errorResolution>Please wait until your previous scans have completed</errorResolution>' in response):
# Did not hit concurrent scan limit.
break
else:
# Hit concurrent scan limit.
logger.critical(response)
# If trying again, delay next try by concurrent_scans_retry_delay.
if retries <= concurrent_scans_retries:
logger.warning('Waiting %d seconds until next try.' % concurrent_scans_retry_delay)
time.sleep(concurrent_scans_retry_delay)
# Inform user of how many retries.
logger.critical('Retry #%d' % retries)
else:
# Ran out of retries. Let user know.
print('Alert! Ran out of concurrent_scans_retries!')
logger.critical('Alert! Ran out of concurrent_scans_retries!')
return False
# Check to see if there was an error.
try:
request.raise_for_status()
except requests.HTTPError as e:
# Error
print('Error! Received a 4XX client error or 5XX server error response.')
print('Content = \n', response)
logger.error('Content = \n%s' % response)
print('Headers = \n', request.headers)
logger.error('Headers = \n%s' % str(request.headers))
request.raise_for_status()
if '<RETURN status="FAILED" number="2007">' in response:
print('Error! Your IP address is not in the list of secure IPs. Manager must include this IP (QualysGuard VM > Users > Security).')
print('Content = \n', response)
logger.error('Content = \n%s' % response)
print('Headers = \n', request.headers)
logger.error('Headers = \n%s' % str(request.headers))
return False
return response
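For orientation, a minimal request sketch against this connector; the credentials and the scan-list call are placeholders:

from qualysapi.connector import QGConnector

qgc = QGConnector(('username', 'password'))
# The 'api/2.0/' prefix routes this to API v2 (see which_api_version), which is always POSTed.
xml_output = qgc.request('api/2.0/fo/scan/', {'action': 'list'})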

deps/qualysapi/qualysapi/contrib.py (vendored, new file, 290 lines)
@@ -0,0 +1,290 @@
# File for 3rd party contributions.
from __future__ import absolute_import
from __future__ import print_function
import six
from six.moves import range
__author__ = 'Parag Baxi <parag.baxi@gmail.com>'
__license__ = 'Apache License 2.0'
# csv, os, re, OrderedDict and lxml.html are used by the helpers below.
import csv
import logging
import os
import re
import time
import types
import unicodedata
from collections import OrderedDict, defaultdict
import lxml.html
from lxml import etree, objectify
# Set module level logger.
logger = logging.getLogger(__name__)
def generate_vm_report(self, report_details, startup_delay=60, polling_delay=30, max_checks=10):
''' Spool and download QualysGuard VM report.
startup_delay: Time in seconds to wait before initially checking.
polling_delay: Time in seconds to wait between checks.
max_checks: Maximum number of times to check for report spooling completion.
'''
# Merge parameters.
report_details['action'] = 'launch'
logger.debug(report_details)
xml_output = self.request('report', report_details, api_version=2)  # self is the QGConnector instance
report_id = etree.XML(xml_output).find('.//VALUE').text
logger.debug('report_id: %s' % (report_id))
# Wait for report to finish spooling.
# max_checks caps the polling loop (about 10 minutes with the defaults).
logger.info('Report sent to spooler. Checking for report in %s seconds.' % (startup_delay))
time.sleep(startup_delay)
for n in range(0, max_checks):
# Check to see if report is done.
xml_output = self.request('report', {'action': 'list', 'id': report_id}, api_version=2)
tag_status = etree.XML(xml_output).findtext(".//STATE")
logger.debug('tag_status: %s' % (tag_status))
if tag_status is not None:
# Report is showing up in the Report Center.
if tag_status == 'Finished':
# Report creation complete.
break
# Report not finished, wait.
logger.info('Report still spooling. Trying again in %s seconds.' % (polling_delay))
time.sleep(polling_delay)
# We now have to fetch the report. Use the report id.
report_xml = self.request('report', {'action': 'fetch', 'id': report_id}, api_version=2)
return report_xml
def qg_html_to_ascii(qg_html_text):
"""Convert and return QualysGuard's quasi HTML text to ASCII text."""
text = qg_html_text
# Handle tagged line breaks (<p>, <br>)
text = re.sub(r'(?i)<br>[ ]*', '\n', text)
text = re.sub(r'(?i)<p>[ ]*', '\n', text)
# Remove consecutive line breaks
text = re.sub(r"^\s+", "", text, flags=re.MULTILINE)
# Remove empty lines at the end.
text = re.sub('[\n]+$', '', text)
# Store anchor tags href attribute
links = list(lxml.html.iterlinks(text))
# Remove anchor tags
html_element = lxml.html.fromstring(text)
# Convert anchor tags to "link_text (link: link_url )".
logging.debug('Converting anchor tags...')
text = html_element.text_content().encode('ascii', 'ignore')
# Convert each link.
for l in links:
# Find and replace each link.
link_text = l[0].text_content().encode('ascii', 'ignore').strip()
link_url = l[2].strip()
# Replacing link_text
if link_text != link_url:
# Link text is different, most likely a description.
text = text.replace(link_text, '%s (link: %s )' % (link_text, link_url))
else:
# Link text is the same as the href. No need to duplicate link.
text = text.replace(link_text, '%s' % (link_url))
logging.debug('Done.')
return text
def qg_parse_informational_qids(xml_report):
"""Return vulnerabilities of severity 1 and 2 levels due to a restriction of
QualysGuard's inability to report them in the internal ticketing system.
"""
# asset_group's vulnerability data map:
# {'qid_number': {
# # CSV info
# 'hosts': [{'ip': '10.28.0.1', 'dns': 'hostname', 'netbios': 'blah', 'vuln_id': 'remediation_ticket_number'}, {'ip': '10.28.0.3', 'dns': 'hostname2', 'netbios': '', 'vuln_id': 'remediation_ticket_number'}, ...],
# 'solution': '',
# 'impact': '',
# 'threat': '',
# 'severity': '',
# }
# 'qid_number2': ...
# }
# Add all vulnerabilities to list of dictionaries.
# Use defaultdict in case a new QID is encountered.
info_vulns = defaultdict(dict)
# Parse vulnerabilities in xml string.
tree = objectify.fromstring(xml_report)
# Write IP, DNS, & Result into each QID CSV file.
logging.debug('Parsing report...')
# TODO: Check against c_args.max to prevent creating CSV content for QIDs that we won't use.
for host in tree.HOST_LIST.HOST:
# Extract possible extra hostname information.
try:
netbios = unicodedata.normalize('NFKD', six.text_type(host.NETBIOS)).encode('ascii', 'ignore').strip()
except AttributeError:
netbios = ''
try:
dns = unicodedata.normalize('NFKD', six.text_type(host.DNS)).encode('ascii', 'ignore').strip()
except AttributeError:
dns = ''
ip = unicodedata.normalize('NFKD', six.text_type(host.IP)).encode('ascii', 'ignore').strip()
# Extract vulnerabilities host is affected by.
for vuln in host.VULN_INFO_LIST.VULN_INFO:
try:
result = unicodedata.normalize('NFKD', six.text_type(vuln.RESULT)).encode('ascii', 'ignore').strip()
except AttributeError:
result = ''
qid = unicodedata.normalize('NFKD', six.text_type(vuln.QID)).encode('ascii', 'ignore').strip()
# Add host to QID's list of affected hosts.
host_entry = {'ip': '%s' % (ip),
'dns': '%s' % (dns),
'netbios': '%s' % (netbios),
# Informational QIDs do not have vuln_id numbers. This is a flag to write the CSV file.
'vuln_id': '',
'result': '%s' % (result), }
if 'hosts' not in info_vulns[qid]:
# New QID.
logging.debug('New QID found: %s' % (qid))
info_vulns[qid]['hosts'] = []
info_vulns[qid]['hosts'].append(host_entry)
# All vulnerabilities added.
# Add all vulnerability information.
for vuln_details in tree.GLOSSARY.VULN_DETAILS_LIST.VULN_DETAILS:
qid = unicodedata.normalize('NFKD', six.text_type(vuln_details.QID)).encode('ascii', 'ignore').strip()
info_vulns[qid]['title'] = unicodedata.normalize('NFKD', six.text_type(vuln_details.TITLE)).encode('ascii',
'ignore').strip()
info_vulns[qid]['severity'] = unicodedata.normalize('NFKD', six.text_type(vuln_details.SEVERITY)).encode('ascii',
'ignore').strip()
info_vulns[qid]['solution'] = qg_html_to_ascii(
unicodedata.normalize('NFKD', six.text_type(vuln_details.SOLUTION)).encode('ascii', 'ignore').strip())
info_vulns[qid]['threat'] = qg_html_to_ascii(
unicodedata.normalize('NFKD', six.text_type(vuln_details.THREAT)).encode('ascii', 'ignore').strip())
info_vulns[qid]['impact'] = qg_html_to_ascii(
unicodedata.normalize('NFKD', six.text_type(vuln_details.IMPACT)).encode('ascii', 'ignore').strip())
# Ready to report informational vulnerabilities.
return info_vulns
# TODO: Implement required function qg_remediation_tickets(asset_group, status, qids)
# TODO: Remove static 'report_template' value. Parameterize and document required report template.
def qg_ticket_list(asset_group, severity, qids=None):
"""Return dictionary of each vulnerability reported against asset_group of severity."""
global asset_group_details
# All vulnerabilities imported to list of dictionaries.
vulns = qg_remediation_tickets(asset_group, 'OPEN', qids) # vulns now holds all open remediation tickets.
if not vulns:
# No tickets to report.
return False
#
# Sort the vulnerabilities in order of prevalence -- number of hosts affected.
vulns = OrderedDict(sorted(list(vulns.items()), key=lambda t: len(t[1]['hosts'])))
logging.debug('vulns sorted = %s' % (vulns))
#
# Remove QIDs that have duplicate patches.
#
# Read in patch report.
# TODO: Allow for lookup of report_template.
# Report template is Patch report "Sev 5 confirmed patchable".
logging.debug('Retrieving patch report from QualysGuard.')
print('Retrieving patch report from QualysGuard.')
report_template = '1063695'
# Call QualysGuard for patch report.
csv_output = qg_command(2, 'report', {'action': 'launch', 'output_format': 'csv',
'asset_group_ids': asset_group_details['qg_asset_group_id'],
'template_id': report_template,
'report_title': 'QGIR Patch %s' % (asset_group)})
logging.debug('csv_output =')
logging.debug(csv_output)
logging.debug('Improving remediation efficiency by removing unneeded, redundant patches.')
print('Improving remediation efficiency by removing unneeded, redundant patches.')
# Find the line for Patches by Host data.
logging.debug('Header found at %s.' % (csv_output.find('Patch QID, IP, DNS, NetBIOS, OS, Vulnerability Count')))
starting_pos = csv_output.find('Patch QID, IP, DNS, NetBIOS, OS, Vulnerability Count') + 52
logging.debug('starting_pos = %s' % str(starting_pos))
# Data resides between line ending in 'Vulnerability Count' and a blank line.
patches_by_host = csv_output[starting_pos:csv_output[starting_pos:].find(
'Host Vulnerabilities Fixed by Patch') + starting_pos - 3]
logging.debug('patches_by_host =')
logging.debug(patches_by_host)
# Read in string patches_by_host csv to a dictionary.
f = patches_by_host.split(os.linesep)
reader = csv.DictReader(f, ['Patch QID', 'IP', 'DNS', 'NetBIOS', 'OS', 'Vulnerability Count'], delimiter=',')
# Mark Patch QIDs that fix multiple vulnerabilities with associated IP addresses.
redundant_qids = defaultdict(list)
for row in reader:
if int(row['Vulnerability Count']) > 1:
# Add to list of redundant QIDs.
redundant_qids[row['Patch QID']].append(row['IP'])
logging.debug('%s, %s, %s, %s' % (
row['Patch QID'],
row['IP'],
int(row['Vulnerability Count']),
redundant_qids[row['Patch QID']]))
# Log for debugging.
logging.debug('len(redundant_qids) = %s, redundant_qids =' % (len(redundant_qids)))
for patch_qid in list(redundant_qids.keys()):
logging.debug('%s, %s' % (str(patch_qid), str(redundant_qids[patch_qid])))
# Extract redundant QIDs with associated IP addresses.
# Find the line for Patches by Host data.
starting_pos = csv_output.find('Patch QID, IP, QID, Severity, Type, Title, Instance, Last Detected') + 66
# Data resides between line ending in 'Vulnerability Count' and end of string.
host_vulnerabilities_fixed_by_patch = csv_output[starting_pos:]
# Read in string host_vulnerabilities_fixed_by_patch csv to a dictionary.
f = host_vulnerabilities_fixed_by_patch.split(os.linesep)
reader = csv.DictReader(f, ['Patch QID', 'IP', 'QID', 'Severity', 'Type', 'Title', 'Instance', 'Last Detected'],
delimiter=',')
# Remove IP addresses associated with redundant QIDs.
qids_to_remove = defaultdict(list)
for row in reader:
# If the row's IP address's Patch QID was found to have multiple vulnerabilities...
if len(redundant_qids[row['Patch QID']]) > 0 and redundant_qids[row['Patch QID']].count(row['IP']) > 0:
# Add the QID column to the list of dictionaries {QID: [IP address, IP address, ...], QID2: [IP address], ...}
qids_to_remove[row['QID']].append(row['IP'])
# Log for debugging.
logging.debug('len(qids_to_remove) = %s, qids_to_remove =' % (len(qids_to_remove)))
for a_qid in list(qids_to_remove.keys()):
logging.debug('%s, %s' % (str(a_qid), str(qids_to_remove[a_qid])))
#
# Diff vulns against qids_to_remove and against open incidents.
#
vulns_length = len(vulns)
# Iterate over list of keys rather than original dictionary as some keys may be deleted changing the size of the dictionary.
for a_qid in list(vulns.keys()):
# Debug log original qid's hosts.
logging.debug('Before diffing vulns[%s] =' % (a_qid))
logging.debug(vulns[a_qid]['hosts'])
# Pop each host.
# The [:] returns a "slice" of x, which happens to contain all its elements, and is thus effectively a copy of x.
for host in vulns[a_qid]['hosts'][:]:
# If the QID for the host is a dupe, or if there is an open Reaction incident.
if qids_to_remove[a_qid].count(host['ip']) > 0 or reaction_open_issue(host['vuln_id']):
# Remove the host from the QID's list of target hosts.
logging.debug('Removing remediation ticket %s.' % (host['vuln_id']))
vulns[a_qid]['hosts'].remove(host)
else:
# Do not remove this vuln
logging.debug('Will report remediation %s.' % (host['vuln_id']))
# Debug log diff'd qid's hosts.
logging.debug('After diffing vulns[%s]=' % (a_qid))
logging.debug(vulns[a_qid]['hosts'])
# If there are no more hosts left to patch for the qid.
if len(vulns[a_qid]['hosts']) == 0:
# Remove the QID.
logging.debug('Deleting vulns[%s].' % (a_qid))
del vulns[a_qid]
# Diff completed
if vulns_length != len(vulns):
print('%s vulnerabilities were consolidated to %s vulnerabilities, a reduction of %s%%.' % (
int(vulns_length),
int(len(vulns)),
int(round((int(vulns_length) - int(len(vulns))) / float(vulns_length) * 100))))
# Return vulns to report.
logging.debug('vulns =')
logging.debug(vulns)
return vulns
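A hedged sketch of calling generate_vm_report above with a connector instance; the config path and every report parameter here are placeholders, not values the module prescribes:

import qualysapi.util

qgc = qualysapi.util.connect('config.ini')
report_xml = generate_vm_report(qgc, {'report_type': 'Scan',
                                      'template_id': '00000',
                                      'output_format': 'xml',
                                      'report_title': 'Example VM report'})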

deps/qualysapi/qualysapi/settings.py (vendored, new file, 21 lines)
@@ -0,0 +1,21 @@
''' Module to hold global settings reused throughout qualysapi. '''
from __future__ import absolute_import
__author__ = "Colin Bell <colin.bell@uwaterloo.ca>"
__copyright__ = "Copyright 2011-2013, University of Waterloo"
__license__ = "BSD-new"
import os
global defaults
global default_filename
if os.name == 'nt':
default_filename = "config.ini"
else:
default_filename = ".qcrc"
defaults = {'hostname': 'qualysapi.qualys.com',
'max_retries': '3',
'template_id': '00000'}
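For reference, a trivial check of how these module-level defaults read back:

import qualysapi.settings as qcs

print(qcs.default_filename)      # '.qcrc' on POSIX, 'config.ini' on Windows
print(qcs.defaults['hostname'])  # 'qualysapi.qualys.com'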

deps/qualysapi/qualysapi/util.py (vendored, new file, 29 lines)
@@ -0,0 +1,29 @@
""" A set of utility functions for QualysConnect module. """
from __future__ import absolute_import
import logging
import qualysapi.config as qcconf
import qualysapi.connector as qcconn
import qualysapi.settings as qcs
__author__ = "Parag Baxi <parag.baxi@gmail.com> & Colin Bell <colin.bell@uwaterloo.ca>"
__copyright__ = "Copyright 2011-2013, Parag Baxi & University of Waterloo"
__license__ = 'Apache License 2.0'
# Set module level logger.
logger = logging.getLogger(__name__)
def connect(config_file=qcs.default_filename, remember_me=False, remember_me_always=False):
""" Return a QGAPIConnect object for v1 API pulling settings from config
file.
"""
# Retrieve login credentials.
conf = qcconf.QualysConnectConfig(filename=config_file, remember_me=remember_me,
remember_me_always=remember_me_always)
connect = qcconn.QGConnector(conf.get_auth(),
conf.get_hostname(),
conf.proxies,
conf.max_retries)
logger.info("Finished building connector.")
return connect
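A one-liner consumer sketch for connect(); the config path is a placeholder:

import qualysapi.util

qgc = qualysapi.util.connect('config.ini')
print(qgc.server)  # hostname picked up from the config file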

deps/qualysapi/qualysapi/version.py (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
__author__ = 'Austin Taylor'
__pkgname__ = 'qualysapi'
__version__ = '4.1.0'

deps/qualysapi/setup.py (vendored, new file, 51 lines)
@@ -0,0 +1,51 @@
#!/usr/bin/env python
from __future__ import absolute_import
import os
import setuptools
try:
from setuptools import setup
except ImportError:
from distutils.core import setup
__author__ = 'Austin Taylor <vulnWhisperer@austintaylor.io>'
__copyright__ = 'Copyright 2017, Austin Taylor'
__license__ = 'BSD-new'
# Make pyflakes happy.
__pkgname__ = None
__version__ = None
exec(compile(open('qualysapi/version.py').read(), 'qualysapi/version.py', 'exec'))
# A utility function to read the README file into the long_description field.
def read(fname):
""" Takes a filename and returns the contents of said file relative to
the current directory.
"""
return open(os.path.join(os.path.dirname(__file__), fname)).read()
setup(name=__pkgname__,
version=__version__,
author='Austin Taylor',
author_email='vulnWhisperer@austintaylor.io',
description='QualysGuard(R) Qualys API Package modified for VulnWhisperer',
license='BSD-new',
keywords='Qualys QualysGuard API helper network security',
url='https://github.com/austin-taylor/qualysapi',
package_dir={'': '.'},
#packages=setuptools.find_packages(),
packages=['qualysapi',],
# package_data={'qualysapi':['LICENSE']},
# scripts=['src/scripts/qhostinfo.py', 'src/scripts/qscanhist.py', 'src/scripts/qreports.py'],
long_description=read('README.md'),
classifiers=[
'Development Status :: 5 - Production/Stable',
'Topic :: Utilities',
'License :: OSI Approved :: Apache Software License',
'Intended Audience :: Developers',
],
install_requires=[
'requests',
],
)

docker-compose.yml (new file, 40 lines)
@@ -0,0 +1,40 @@
version: '2'
services:
vulnwhisp_es1:
image: docker.elastic.co/elasticsearch/elasticsearch:5.6.2
container_name: vulnwhisp_es1
environment:
- cluster.name=vulnwhisperer
- bootstrap.memory_lock=true
- "ES_JAVA_OPTS=-Xms512m -Xmx512m"
ulimits:
memlock:
soft: -1
hard: -1
mem_limit: 1g
volumes:
- esdata1:/usr/share/elasticsearch/data
ports:
- 19200:9200
networks:
- esnet
vulnwhisp_ks1:
image: docker.elastic.co/kibana/kibana:5.6.2
environment:
SERVER_NAME: vulnwhisp_ks1
ELASTICSEARCH_URL: http://vulnwhisp_es1:9200
ports:
- 15601:5601
networks:
- esnet
vulnwhisp_ls1:
image: docker.elastic.co/logstash/logstash:5.6.2
networks:
- esnet
volumes:
esdata1:
driver: local
networks:
esnet:

(Two binary image files added, not shown in the diff: 356 KiB and 48 KiB.)
(file path not shown; new file, 244 lines)
@@ -0,0 +1,244 @@
{
"order": 0,
"template": "logstash-nessus-*",
"settings": {
"index": {
"routing": {
"allocation": {
"total_shards_per_node": "2"
}
},
"mapping": {
"total_fields": {
"limit": "3000"
}
},
"refresh_interval": "5s",
"number_of_shards": "1",
"number_of_replicas": "1"
}
},
"mappings": {
"_default_": {
"dynamic_templates": [
{
"message_field": {
"mapping": {
"fielddata": {
"format": "disabled"
},
"index": "analyzed",
"omit_norms": true,
"type": "string"
},
"match_mapping_type": "string",
"match": "message"
}
},
{
"string_fields": {
"mapping": {
"fielddata": {
"format": "disabled"
},
"index": "analyzed",
"omit_norms": true,
"type": "string",
"fields": {
"raw": {
"ignore_above": 256,
"index": "not_analyzed",
"type": "string",
"doc_values": true
}
}
},
"match_mapping_type": "string",
"match": "*"
}
},
{
"ip_address_fields": {
"mapping": {
"type": "ip"
},
"match": "*_ip"
}
},
{
"ipv6_address_fields": {
"mapping": {
"index": "not_analyzed",
"type": "string"
},
"match": "*_ipv6"
}
},
{
"float_fields": {
"mapping": {
"type": "float",
"doc_values": true
},
"match_mapping_type": "float",
"match": "*"
}
},
{
"double_fields": {
"mapping": {
"type": "double",
"doc_values": true
},
"match_mapping_type": "double",
"match": "*"
}
},
{
"byte_fields": {
"mapping": {
"type": "byte",
"doc_values": true
},
"match_mapping_type": "byte",
"match": "*"
}
},
{
"short_fields": {
"mapping": {
"type": "short",
"doc_values": true
},
"match_mapping_type": "short",
"match": "*"
}
},
{
"integer_fields": {
"mapping": {
"type": "integer",
"doc_values": true
},
"match_mapping_type": "integer",
"match": "*"
}
},
{
"long_fields": {
"mapping": {
"type": "long",
"doc_values": true
},
"match_mapping_type": "long",
"match": "*"
}
},
{
"date_fields": {
"mapping": {
"type": "date",
"doc_values": true
},
"match_mapping_type": "date",
"match": "*"
}
},
{
"geo_point_fields": {
"mapping": {
"type": "geo_point",
"doc_values": true
},
"match_mapping_type": "geo_point",
"match": "*"
}
}
],
"_all": {
"omit_norms": true,
"enabled": true
},
"properties": {
"plugin_id": {
"type": "integer"
},
"last_updated": {
"type": "date",
"doc_values": true
},
"geoip": {
"dynamic": true,
"type": "object",
"properties": {
"ip": {
"type": "ip",
"doc_values": true
},
"latitude": {
"type": "float",
"doc_values": true
},
"location": {
"type": "geo_point",
"doc_values": true
},
"longitude": {
"type": "float",
"doc_values": true
}
}
},
"risk_score": {
"type": "float"
},
"source": {
"index": "not_analyzed",
"type": "string"
},
"synopsis": {
"index": "not_analyzed",
"type": "string"
},
"see_also": {
"index": "not_analyzed",
"type": "string"
},
"@timestamp": {
"type": "date",
"doc_values": true
},
"cve": {
"index": "not_analyzed",
"type": "string"
},
"solution": {
"index": "not_analyzed",
"type": "string"
},
"port": {
"type": "integer"
},
"host": {
"type": "ip"
},
"@version": {
"index": "not_analyzed",
"type": "string",
"doc_values": true
},
"risk": {
"index": "not_analyzed",
"type": "string"
},
"assign_ip": {
"type": "ip"
},
"cvss": {
"type": "float"
}
}
}
},
"aliases": {}
}
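A hedged loader sketch for the Elasticsearch 5.x index template above; the node URL, template name, and file path are assumptions, not part of this repo:

import json
import requests

with open('logstash-nessus-template.json') as f:  # path assumed
    template = json.load(f)
resp = requests.put('http://localhost:9200/_template/logstash-nessus', json=template)
resp.raise_for_status()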

filebeat/filebeat.yml (executable, new file, 116 lines)
@@ -0,0 +1,116 @@
###################### Filebeat Configuration Example #########################
# This file is an example configuration file highlighting only the most common
# options. The filebeat.full.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html
#=========================== Filebeat prospectors =============================
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.
- input_type: log
# Paths that should be crawled and fetched. Glob based paths.
paths:
# Linux Example
#- /var/log/*.log
#Windows Example
- c:\nessus\My Scans\*
# Exclude lines. A list of regular expressions to match. It drops the lines that are
# matching any regular expression from the list.
#exclude_lines: ["^DBG"]
# Include lines. A list of regular expressions to match. It exports the lines that are
# matching any regular expression from the list.
#include_lines: ["^ERR", "^WARN"]
# Exclude files. A list of regular expressions to match. Filebeat drops the files that
# are matching any regular expression from the list. By default, no files are dropped.
#exclude_files: [".gz$"]
# Optional additional fields. These field can be freely picked
# to add additional information to the crawled log files for filtering
#fields:
# level: debug
# review: 1
### Multiline options
# Multiline can be used for log messages spanning multiple lines. This is common
# for Java stack traces or C-line continuation.
# The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
#multiline.pattern: ^\[
# Defines if the pattern set under pattern should be negated or not. Default is false.
#multiline.negate: false
# Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
# that was (not) matched before or after, or as long as a pattern is not matched based on negate.
# Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
#multiline.match: after
#================================ General =====================================
# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:
# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]
# Optional fields that you can specify to add additional information to the
# output.
#fields:
# env: staging
#================================ Outputs =====================================
# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
# Array of hosts to connect to.
# hosts: ["logstash01:9200"]
# Optional protocol and basic auth credentials.
#protocol: "https"
#username: "elastic"
#password: "changeme"
#----------------------------- Logstash output --------------------------------
output.logstash:
# The Logstash hosts
hosts: ["logstashserver1:5044", "logstashserver2:5044", "logstashserver3:5044"]
# Optional SSL. By default is off.
# List of root certificates for HTTPS server verifications
#ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
# Certificate for SSL client authentication
#ssl.certificate: "/etc/pki/client/cert.pem"
# Client Certificate Key
#ssl.key: "/etc/pki/client/cert.key"
#================================ Logging =====================================
# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
#logging.level: debug
# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

(file path not shown; new file, 28 lines)
@@ -0,0 +1,28 @@
[
{
"_id": "54648700-3f74-11e7-852e-69207a3d0726",
"_type": "search",
"_source": {
"title": "Nessus - Saved Search",
"description": "",
"hits": 0,
"columns": [
"host",
"risk",
"risk_score",
"cve",
"plugin_name",
"solution",
"plugin_output"
],
"sort": [
"@timestamp",
"desc"
],
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[],\"highlight\":{\"pre_tags\":[\"@kibana-highlighted-field@\"],\"post_tags\":[\"@/kibana-highlighted-field@\"],\"fields\":{\"*\":{}},\"require_field_match\":false,\"fragment_size\":2147483647}}"
}
}
}
]
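A hedged import sketch for an export like the one above: Kibana 5.x keeps saved objects as documents in the .kibana Elasticsearch index, so they can be PUT directly (host and filename are assumptions; the Kibana UI import under Management > Saved Objects works as well):

import json
import requests

with open('nessus_saved_search.json') as f:  # filename assumed
    objects = json.load(f)
for obj in objects:
    url = 'http://localhost:9200/.kibana/%s/%s' % (obj['_type'], obj['_id'])
    requests.put(url, json=obj['_source']).raise_for_status()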

(file path not shown; new file, 548 lines)
@@ -0,0 +1,548 @@
[
{
"_id": "7e7fbc90-3df2-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-PluginID",
"visState": "{\"title\":\"Nessus-PluginID\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"plugin_id.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "c786bc20-3df4-11e7-a3dd-33f478b7be91",
"_type": "visualization",
"_source": {
"title": "Nessus-RiskPie",
"visState": "{\"aggs\":[{\"enabled\":true,\"id\":\"1\",\"params\":{},\"schema\":\"metric\",\"type\":\"count\"},{\"enabled\":true,\"id\":\"2\",\"params\":{\"field\":\"risk.raw\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":50},\"schema\":\"segment\",\"type\":\"terms\"},{\"enabled\":true,\"id\":\"3\",\"params\":{\"field\":\"name.raw\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":50},\"schema\":\"segment\",\"type\":\"terms\"},{\"enabled\":true,\"id\":\"4\",\"params\":{\"field\":\"synopsis.raw\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":50},\"schema\":\"segment\",\"type\":\"terms\"},{\"enabled\":true,\"id\":\"5\",\"params\":{\"field\":\"host\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":50},\"schema\":\"segment\",\"type\":\"terms\"}],\"listeners\":{},\"params\":{\"addLegend\":true,\"addTooltip\":true,\"isDonut\":true,\"legendPosition\":\"right\"},\"title\":\"Nessus-RiskPie\",\"type\":\"pie\"}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"!(None)\"}},\"filter\":[]}"
}
}
},
{
"_id": "5a3c0340-3eb3-11e7-a192-93f36fbd9d05",
"_type": "visualization",
"_source": {
"title": "Nessus-CVSSHeatmap",
"visState": "{\"title\":\"Nessus-CVSSHeatmap\",\"type\":\"heatmap\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"enableHover\":false,\"legendPosition\":\"right\",\"times\":[],\"colorsNumber\":4,\"colorSchema\":\"Yellow to Red\",\"setColorRange\":false,\"colorsRange\":[],\"invertColors\":false,\"percentageMode\":false,\"valueAxes\":[{\"show\":false,\"id\":\"ValueAxis-1\",\"type\":\"value\",\"scale\":{\"type\":\"linear\",\"defaultYExtents\":false},\"labels\":{\"show\":false,\"rotate\":0,\"color\":\"#555\"}}]},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"segment\",\"params\":{\"field\":\"host\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\"}},{\"id\":\"3\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"group\",\"params\":{\"field\":\"cvss\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"_term\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"defaultColors\":{\"0 - 3500\":\"rgb(255,255,204)\",\"3500 - 7000\":\"rgb(254,217,118)\",\"7000 - 10500\":\"rgb(253,141,60)\",\"10500 - 14000\":\"rgb(227,27,28)\"}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "60418690-3eb1-11e7-90cb-918f9cb01e3d",
"_type": "visualization",
"_source": {
"title": "Nessus-TopPorts",
"visState": "{\"title\":\"Nessus-TopPorts\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"port\",\"size\":20,\"order\":\"desc\",\"orderBy\":\"1\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "983687e0-3df2-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Protocol",
"visState": "{\"title\":\"Nessus-Protocol\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"protocol.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Protocol\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "995e2280-3df3-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Host",
"visState": "{\"title\":\"Nessus-Host\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"host\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Host IP\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "87338510-3df2-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-PluginOutput",
"visState": "{\"title\":\"Nessus-PluginOutput\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"plugin_output.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Plugin Output\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "068d4bc0-3df3-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-SeeAlso",
"visState": "{\"title\":\"Nessus-SeeAlso\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"see_also.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"See Also\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "1de9e550-3df1-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Description",
"visState": "{\"title\":\"Nessus-Description\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"description.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Description\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "1e59fa50-3df3-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Synopsis",
"visState": "{\"title\":\"Nessus-Synopsis\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"synopsis.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Synopsis\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "13c7d4e0-3df3-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Solution",
"visState": "{\"title\":\"Nessus-Solution\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"solution.raw\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Solution\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "69765d50-3f5e-11e7-98cc-d924fd28047d",
"_type": "visualization",
"_source": {
"title": "Nessus-CVE",
"visState": "{\"title\":\"Nessus-CVE\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"cve.raw\",\"size\":10,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"CVE ID\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"!(nan)\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "852816e0-3eb1-11e7-90cb-918f9cb01e3d",
"_type": "visualization",
"_source": {
"title": "Nessus-CVSS",
"visState": "{\"title\":\"Nessus-CVSS\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"cvss\",\"size\":20,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"CVSS Score\"}},{\"id\":\"4\",\"enabled\":true,\"type\":\"cardinality\",\"schema\":\"metric\",\"params\":{\"field\":\"host\",\"customLabel\":\"# of Hosts\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":0,\"direction\":\"desc\"}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "099a3820-3f68-11e7-a6bd-e764d950e506",
"_type": "visualization",
"_source": {
"title": "Timelion Nessus Example",
"visState": "{\"type\":\"timelion\",\"title\":\"Timelion Nessus Example\",\"params\":{\"expression\":\".es(index=logstash-nessus-*,q=risk:high).label(\\\"Current High Risk\\\"),.es(index=logstash-nessus-*,q=risk:high,offset=-1y).label(\\\"Last 1 Year High Risk\\\"),.es(index=logstash-nessus-*,q=risk:medium).label(\\\"Current Medium Risk\\\"),.es(index=logstash-nessus-*,q=risk:medium,offset=-1y).label(\\\"Last 1 Year Medium Risk\\\")\",\"interval\":\"auto\"}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{}"
}
}
},
{
"_id": "297df800-3f7e-11e7-bd24-6903e3283192",
"_type": "visualization",
"_source": {
"title": "Nessus - Plugin Name",
"visState": "{\"title\":\"Nessus - Plugin Name\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"plugin_name.raw\",\"size\":10,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Plugin Name\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "de1a5f40-3f85-11e7-97f9-3777d794626d",
"_type": "visualization",
"_source": {
"title": "Nessus - ScanName",
"visState": "{\"title\":\"Nessus - ScanName\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"scan_name.raw\",\"size\":20,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Scan Name\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "ecbb99c0-3f84-11e7-97f9-3777d794626d",
"_type": "visualization",
"_source": {
"title": "Nessus - Total",
"visState": "{\"title\":\"Nessus - Total\",\"type\":\"metric\",\"params\":{\"handleNoResults\":true,\"fontSize\":60},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"Total\"}}],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "471a3580-3f6b-11e7-88e7-df1abe6547fb",
"_type": "visualization",
"_source": {
"title": "Nessus - Vulnerabilities by Tag",
"visState": "{\"title\":\"Nessus - Vulnerabilities by Tag\",\"type\":\"table\",\"params\":{\"perPage\":3,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"3\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"bucket\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"tags:has_hipaa_data\",\"analyze_wildcard\":true}}},\"label\":\"Systems with HIPAA data\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"tags:pci_asset\",\"analyze_wildcard\":true}}},\"label\":\"PCI Systems\"},{\"input\":{\"query\":{\"query_string\":{\"query\":\"tags:hipaa_asset\",\"analyze_wildcard\":true}}},\"label\":\"HIPAA Systems\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "35b6d320-3f7f-11e7-bd24-6903e3283192",
"_type": "visualization",
"_source": {
"title": "Nessus - Residual Risk",
"visState": "{\"title\":\"Nessus - Residual Risk\",\"type\":\"table\",\"params\":{\"perPage\":15,\"showPartialRows\":false,\"showMeticsAtAllLevels\":false,\"sort\":{\"columnIndex\":0,\"direction\":\"desc\"},\"showTotal\":false,\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"risk_score\",\"size\":50,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Risk Number\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":0,\"direction\":\"desc\"}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "a9225930-3df2-11e7-a44e-c79ca8efb780",
"_type": "visualization",
"_source": {
"title": "Nessus-Risk",
"visState": "{\"title\":\"Nessus-Risk\",\"type\":\"table\",\"params\":{\"perPage\":4,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"risk\",\"size\":10,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Risk Severity\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "2f979030-44b9-11e7-a818-f5f80dfc3590",
"_type": "visualization",
"_source": {
"title": "Nessus - ScanBarChart",
"visState": "{\"aggs\":[{\"enabled\":true,\"id\":\"1\",\"params\":{},\"schema\":\"metric\",\"type\":\"count\"},{\"enabled\":true,\"id\":\"2\",\"params\":{\"customLabel\":\"Scan Name\",\"field\":\"scan_name.raw\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":10},\"schema\":\"segment\",\"type\":\"terms\"}],\"listeners\":{},\"params\":{\"addLegend\":true,\"addTimeMarker\":false,\"addTooltip\":true,\"defaultYExtents\":false,\"legendPosition\":\"right\",\"mode\":\"stacked\",\"scale\":\"linear\",\"setYExtents\":false,\"times\":[]},\"title\":\"Nessus - ScanBarChart\",\"type\":\"histogram\"}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "67d432e0-44ec-11e7-a05f-d9719b331a27",
"_type": "visualization",
"_source": {
"title": "Nessus - TL-Critical Risk",
"visState": "{\"title\":\"Nessus - TL-Critical Risk\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*',q='(risk_score:>=9 AND risk_score:<=10)').label(\\\"Original\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=9 AND risk_score:<=10)',offset=-1w).label(\\\"One week offset\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=9 AND risk_score:<=10)').subtract(.es(index='logstash-nessus-*',q='(risk_score:>=9 AND risk_score:<=10)',offset=-1w)).label(\\\"Difference\\\").lines(steps=3,fill=2,width=1)\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "a91b9fe0-44ec-11e7-a05f-d9719b331a27",
"_type": "visualization",
"_source": {
"title": "Nessus - TL-Medium Risk",
"visState": "{\"title\":\"Nessus - TL-Medium Risk\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*',q='(risk_score:>=4 AND risk_score:<7)').label(\\\"Original\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=4 AND risk_score:<7)',offset=-1w).label(\\\"One week offset\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=4 AND risk_score:<7)').subtract(.es(index='logstash-nessus-*',q='(risk_score:>=4 AND risk_score:<7)',offset=-1w)).label(\\\"Difference\\\").lines(steps=3,fill=2,width=1)\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "8d9592d0-44ec-11e7-a05f-d9719b331a27",
"_type": "visualization",
"_source": {
"title": "Nessus - TL-High Risk",
"visState": "{\"title\":\"Nessus - TL-High Risk\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*',q='(risk_score:>=7 AND risk_score:<9)').label(\\\"Original\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=7 AND risk_score:<9)',offset=-1w).label(\\\"One week offset\\\"),.es(index='logstash-nessus-*',q='(risk_score:>=7 AND risk_score:<9)').subtract(.es(index='logstash-nessus-*',q='(risk_score:>=7 AND risk_score:<9)',offset=-1w)).label(\\\"Difference\\\").lines(steps=3,fill=2,width=1)\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "a2d66660-44ec-11e7-a05f-d9719b331a27",
"_type": "visualization",
"_source": {
"title": "Nessus - TL-Low Risk",
"visState": "{\"title\":\"Nessus - TL-Low Risk\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*',q='(risk_score:>0 AND risk_score:<4)').label(\\\"Original\\\"),.es(index='logstash-nessus-*',q='(risk_score:>0 AND risk_score:<4)',offset=-1w).label(\\\"One week offset\\\"),.es(index='logstash-nessus-*',q='(risk_score:>0 AND risk_score:<4)').subtract(.es(index='logstash-nessus-*',q='(risk_score:>0 AND risk_score:<4)',offset=-1w)).label(\\\"Difference\\\").lines(steps=3,fill=2,width=1)\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "fb6eb020-49ab-11e7-8f8c-57ad64ec48a6",
"_type": "visualization",
"_source": {
"title": "Nessus - Critical Risk Score for Tagged Assets",
"visState": "{\"title\":\"Nessus - Critical Risk Score for Tagged Assets\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index=logstash-nessus-*,q='risk_score:>9 AND tags:hipaa_asset').label(\\\"HIPAA Assets\\\"),.es(index=logstash-nessus-*,q='risk_score:>9 AND tags:pci_asset').label(\\\"PCI Systems\\\"),.es(index=logstash-nessus-*,q='risk_score:>9 AND tags:has_hipaa_data').label(\\\"Has HIPAA Data\\\")\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "80158c90-57c1-11e7-b484-a970fc9d150a",
"_type": "visualization",
"_source": {
"title": "Nessus - HIPAA TL",
"visState": "{\"type\":\"timelion\",\"title\":\"Nessus - HIPAA TL\",\"params\":{\"expression\":\".es(index=logstash-nessus-*,q='risk_score:>9 AND tags:pci_asset').label(\\\"PCI Assets\\\"),.es(index=logstash-nessus-*,q='risk_score:>9 AND tags:has_hipaa_data').label(\\\"Has HIPAA Data\\\"),.es(index=logstash-nessus-*,q='risk_score:>9 AND tags:hipaa_asset').label(\\\"HIPAA Assets\\\")\",\"interval\":\"auto\"}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{}"
}
}
},
{
"_id": "a6508640-897a-11e7-bbc0-33592ce0be1e",
"_type": "visualization",
"_source": {
"title": "Nessus - Critical Assets Aggregated",
"visState": "{\"title\":\"Nessus - Critical Assets Aggregated\",\"type\":\"heatmap\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"enableHover\":true,\"legendPosition\":\"right\",\"times\":[],\"colorsNumber\":4,\"colorSchema\":\"Green to Red\",\"setColorRange\":true,\"colorsRange\":[{\"from\":0,\"to\":3},{\"from\":3,\"to\":7},{\"from\":7,\"to\":9},{\"from\":9,\"to\":11}],\"invertColors\":false,\"percentageMode\":false,\"valueAxes\":[{\"show\":false,\"id\":\"ValueAxis-1\",\"type\":\"value\",\"scale\":{\"type\":\"linear\",\"defaultYExtents\":false},\"labels\":{\"show\":true,\"rotate\":0,\"color\":\"white\"}}]},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"max\",\"schema\":\"metric\",\"params\":{\"field\":\"risk_score\",\"customLabel\":\"Residual Risk Score\"}},{\"id\":\"3\",\"enabled\":true,\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{},\"customLabel\":\"Date\"}},{\"id\":\"4\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"group\",\"params\":{\"field\":\"host\",\"size\":10,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Critical Asset IP\"}},{\"id\":\"5\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"split\",\"params\":{\"field\":\"plugin_name.raw\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\",\"row\":true}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"colors\":{\"0 - 3\":\"#7EB26D\",\"3 - 7\":\"#EAB839\",\"7 - 9\":\"#EF843C\",\"8 - 10\":\"#BF1B00\",\"9 - 11\":\"#BF1B00\"},\"defaultColors\":{\"0 - 3\":\"rgb(0,104,55)\",\"3 - 7\":\"rgb(135,203,103)\",\"7 - 9\":\"rgb(255,255,190)\",\"9 - 11\":\"rgb(249,142,82)\"},\"legendOpen\":false}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[{\"$state\":{\"store\":\"appState\"},\"meta\":{\"alias\":\"Critical Asset\",\"disabled\":false,\"index\":\"logstash-nessus-*\",\"key\":\"tags\",\"negate\":false,\"type\":\"phrase\",\"value\":\"critical_asset\"},\"query\":{\"match\":{\"tags\":{\"query\":\"critical_asset\",\"type\":\"phrase\"}}}}]}"
}
}
},
{
"_id": "465c5820-8977-11e7-857e-e1d56b17746d",
"_type": "visualization",
"_source": {
"title": "Nessus - Critical Assets",
"visState": "{\"title\":\"Nessus - Critical Assets\",\"type\":\"heatmap\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"enableHover\":true,\"legendPosition\":\"right\",\"times\":[],\"colorsNumber\":4,\"colorSchema\":\"Green to Red\",\"setColorRange\":true,\"colorsRange\":[{\"from\":0,\"to\":3},{\"from\":3,\"to\":7},{\"from\":7,\"to\":9},{\"from\":9,\"to\":11}],\"invertColors\":false,\"percentageMode\":false,\"valueAxes\":[{\"show\":false,\"id\":\"ValueAxis-1\",\"type\":\"value\",\"scale\":{\"type\":\"linear\",\"defaultYExtents\":false},\"labels\":{\"show\":true,\"rotate\":0,\"color\":\"white\"}}]},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"max\",\"schema\":\"metric\",\"params\":{\"field\":\"risk_score\",\"customLabel\":\"Residual Risk Score\"}},{\"id\":\"2\",\"enabled\":false,\"type\":\"terms\",\"schema\":\"split\",\"params\":{\"field\":\"host\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\",\"row\":true}},{\"id\":\"3\",\"enabled\":true,\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{},\"customLabel\":\"Date\"}},{\"id\":\"4\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"group\",\"params\":{\"field\":\"host\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\",\"customLabel\":\"Critical Asset IP\"}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"defaultColors\":{\"0 - 3\":\"rgb(0,104,55)\",\"3 - 7\":\"rgb(135,203,103)\",\"7 - 9\":\"rgb(255,255,190)\",\"9 - 11\":\"rgb(249,142,82)\"},\"colors\":{\"8 - 10\":\"#BF1B00\",\"9 - 11\":\"#BF1B00\",\"7 - 9\":\"#EF843C\",\"3 - 7\":\"#EAB839\",\"0 - 3\":\"#7EB26D\"},\"legendOpen\":false}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[{\"meta\":{\"index\":\"logstash-nessus-*\",\"negate\":false,\"disabled\":false,\"alias\":\"Critical Asset\",\"type\":\"phrase\",\"key\":\"tags\",\"value\":\"critical_asset\"},\"query\":{\"match\":{\"tags\":{\"query\":\"critical_asset\",\"type\":\"phrase\"}}},\"$state\":{\"store\":\"appState\"}}]}"
}
}
},
{
"_id": "56f0f5f0-3ebe-11e7-a192-93f36fbd9d05",
"_type": "visualization",
"_source": {
"title": "Nessus-RiskOverTime",
"visState": "{\"aggs\":[{\"enabled\":true,\"id\":\"1\",\"params\":{},\"schema\":\"metric\",\"type\":\"count\"},{\"enabled\":true,\"id\":\"2\",\"params\":{\"customInterval\":\"2h\",\"extended_bounds\":{},\"field\":\"@timestamp\",\"interval\":\"auto\",\"min_doc_count\":1},\"schema\":\"segment\",\"type\":\"date_histogram\"},{\"enabled\":true,\"id\":\"3\",\"params\":{\"field\":\"risk\",\"order\":\"desc\",\"orderBy\":\"1\",\"size\":5},\"schema\":\"group\",\"type\":\"terms\"}],\"listeners\":{},\"params\":{\"addLegend\":true,\"addTimeMarker\":false,\"addTooltip\":true,\"categoryAxes\":[{\"id\":\"CategoryAxis-1\",\"labels\":{\"show\":true,\"truncate\":100},\"position\":\"bottom\",\"scale\":{\"type\":\"linear\"},\"show\":true,\"style\":{},\"title\":{},\"type\":\"category\"}],\"defaultYExtents\":false,\"drawLinesBetweenPoints\":true,\"grid\":{\"categoryLines\":false,\"style\":{\"color\":\"#eee\"},\"valueAxis\":\"ValueAxis-1\"},\"interpolate\":\"linear\",\"legendPosition\":\"right\",\"orderBucketsBySum\":false,\"radiusRatio\":9,\"scale\":\"linear\",\"seriesParams\":[{\"data\":{\"id\":\"1\",\"label\":\"Count\"},\"drawLinesBetweenPoints\":true,\"interpolate\":\"linear\",\"mode\":\"normal\",\"show\":\"true\",\"showCircles\":true,\"type\":\"line\",\"valueAxis\":\"ValueAxis-1\"}],\"setYExtents\":false,\"showCircles\":true,\"times\":[],\"valueAxes\":[{\"id\":\"ValueAxis-1\",\"labels\":{\"filter\":false,\"rotate\":0,\"show\":true,\"truncate\":100},\"name\":\"LeftAxis-1\",\"position\":\"left\",\"scale\":{\"mode\":\"normal\",\"type\":\"linear\"},\"show\":true,\"style\":{},\"title\":{\"text\":\"Count\"},\"type\":\"value\"}]},\"title\":\"Nessus-RiskOverTime\",\"type\":\"line\"}",
"uiStateJSON": "{\"vis\":{\"colors\":{\"Critical\":\"#E24D42\",\"High\":\"#E0752D\",\"Low\":\"#7EB26D\",\"Medium\":\"#F2C96D\"}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "479deab0-8a39-11e7-a58a-9bfcb3761a3d",
"_type": "visualization",
"_source": {
"title": "Nessus - TL - TaggedAssetsPluginNames",
"visState": "{\"title\":\"Nessus - TL - TaggedAssetsPluginNames\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*', q='tags:critical_asset OR tags:hipaa_asset OR tags:pci_asset', split=\\\"plugin_name.raw:10\\\").bars(width=4).label(regex=\\\".*:(.+)>.*\\\",label=\\\"$1\\\")\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "84f5c370-8a38-11e7-a58a-9bfcb3761a3d",
"_type": "visualization",
"_source": {
"title": "Nessus - TL - CriticalAssetsPluginNames",
"visState": "{\"title\":\"Nessus - TL - CriticalAssetsPluginNames\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*', q='tags:critical_asset', split=\\\"plugin_name.raw:10\\\").bars(width=4).label(regex=\\\".*:(.+)>.*\\\",label=\\\"$1\\\")\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "307cdae0-8a38-11e7-a58a-9bfcb3761a3d",
"_type": "visualization",
"_source": {
"title": "Nessus - TL - PluginNames",
"visState": "{\"title\":\"Nessus - TL - PluginNames\",\"type\":\"timelion\",\"params\":{\"expression\":\".es(index='logstash-nessus-*', split=\\\"plugin_name.raw:25\\\").bars(width=4).label(regex=\\\".*:(.+)>.*\\\",label=\\\"$1\\\")\",\"interval\":\"auto\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "d048c220-80b3-11e7-8790-73b60225f736",
"_type": "visualization",
"_source": {
"title": "Nessus - Risk: High",
"visState": "{\"title\":\"Nessus - Risk: High\",\"type\":\"goal\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"type\":\"gauge\",\"gauge\":{\"verticalSplit\":false,\"autoExtend\":false,\"percentageMode\":false,\"gaugeType\":\"Metric\",\"gaugeStyle\":\"Full\",\"backStyle\":\"Full\",\"orientation\":\"vertical\",\"useRanges\":false,\"colorSchema\":\"Green to Red\",\"gaugeColorMode\":\"Background\",\"colorsRange\":[{\"from\":0,\"to\":1000}],\"invertColors\":false,\"labels\":{\"show\":false,\"color\":\"black\"},\"scale\":{\"show\":true,\"labels\":false,\"color\":\"#333\",\"width\":2},\"type\":\"simple\",\"style\":{\"bgFill\":\"white\",\"bgColor\":true,\"labelColor\":false,\"subText\":\"\",\"fontSize\":\"34\"},\"extendRange\":true}},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"High Risk\"}},{\"id\":\"2\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"risk:High\",\"analyze_wildcard\":true}}},\"label\":\"\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"defaultColors\":{\"0 - 1000\":\"rgb(0,104,55)\"},\"legendOpen\":true,\"colors\":{\"0 - 10000\":\"#EF843C\",\"0 - 1000\":\"#E0752D\"}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "c1361da0-80b3-11e7-8790-73b60225f736",
"_type": "visualization",
"_source": {
"title": "Nessus - Risk: Medium",
"visState": "{\"title\":\"Nessus - Risk: Medium\",\"type\":\"goal\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"type\":\"gauge\",\"gauge\":{\"verticalSplit\":false,\"autoExtend\":false,\"percentageMode\":false,\"gaugeType\":\"Metric\",\"gaugeStyle\":\"Full\",\"backStyle\":\"Full\",\"orientation\":\"vertical\",\"useRanges\":false,\"colorSchema\":\"Green to Red\",\"gaugeColorMode\":\"Background\",\"colorsRange\":[{\"from\":0,\"to\":10000}],\"invertColors\":false,\"labels\":{\"show\":false,\"color\":\"black\"},\"scale\":{\"show\":true,\"labels\":false,\"color\":\"#333\",\"width\":2},\"type\":\"simple\",\"style\":{\"bgFill\":\"white\",\"bgColor\":true,\"labelColor\":false,\"subText\":\"\",\"fontSize\":\"34\"},\"extendRange\":false}},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"Medium Risk\"}},{\"id\":\"2\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"risk:Medium\",\"analyze_wildcard\":true}}},\"label\":\"Medium Risk\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":true,\"colors\":{\"0 - 10000\":\"#EAB839\"}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "e46ff7f0-897d-11e7-934b-67cec0a7da65",
"_type": "visualization",
"_source": {
"title": "Nessus - Risk: Low",
"visState": "{\"title\":\"Nessus - Risk: Low\",\"type\":\"goal\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"type\":\"gauge\",\"gauge\":{\"verticalSplit\":false,\"autoExtend\":false,\"percentageMode\":false,\"gaugeType\":\"Metric\",\"gaugeStyle\":\"Full\",\"backStyle\":\"Full\",\"orientation\":\"vertical\",\"useRanges\":false,\"colorSchema\":\"Green to Red\",\"gaugeColorMode\":\"Background\",\"colorsRange\":[{\"from\":0,\"to\":10000}],\"invertColors\":false,\"labels\":{\"show\":false,\"color\":\"black\"},\"scale\":{\"show\":true,\"labels\":false,\"color\":\"#333\",\"width\":2},\"type\":\"simple\",\"style\":{\"bgFill\":\"white\",\"bgColor\":true,\"labelColor\":false,\"subText\":\"\",\"fontSize\":\"34\"},\"extendRange\":false}},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"Low Risk\"}},{\"id\":\"2\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"risk:Low\",\"analyze_wildcard\":true}}},\"label\":\"Low Risk\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":true,\"colors\":{\"0 - 10000\":\"#629E51\"}}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
},
{
"_id": "db55bce0-80b3-11e7-8790-73b60225f736",
"_type": "visualization",
"_source": {
"title": "Nessus - Risk: Critical",
"visState": "{\"title\":\"Nessus - Risk: Critical\",\"type\":\"goal\",\"params\":{\"addLegend\":true,\"addTooltip\":true,\"gauge\":{\"autoExtend\":false,\"backStyle\":\"Full\",\"colorSchema\":\"Green to Red\",\"colorsRange\":[{\"from\":0,\"to\":10000}],\"gaugeColorMode\":\"Background\",\"gaugeStyle\":\"Full\",\"gaugeType\":\"Metric\",\"invertColors\":false,\"labels\":{\"color\":\"black\",\"show\":false},\"orientation\":\"vertical\",\"percentageMode\":false,\"scale\":{\"color\":\"#333\",\"labels\":false,\"show\":true,\"width\":2},\"style\":{\"bgColor\":true,\"bgFill\":\"white\",\"fontSize\":\"34\",\"labelColor\":false,\"subText\":\"Risk\"},\"type\":\"simple\",\"useRanges\":false,\"verticalSplit\":false},\"type\":\"gauge\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"Critical Risk\"}},{\"id\":\"2\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"risk:Critical\",\"analyze_wildcard\":true}}},\"label\":\"Critical\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"colors\":{\"0 - 10000\":\"#BF1B00\"},\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "b2f2adb0-897f-11e7-a2d2-c57bca21b3aa",
"_type": "visualization",
"_source": {
"title": "Nessus - Risk: Total",
"visState": "{\"title\":\"Nessus - Risk: Total\",\"type\":\"goal\",\"params\":{\"addLegend\":true,\"addTooltip\":true,\"gauge\":{\"autoExtend\":false,\"backStyle\":\"Full\",\"colorSchema\":\"Green to Red\",\"colorsRange\":[{\"from\":0,\"to\":10000}],\"gaugeColorMode\":\"Background\",\"gaugeStyle\":\"Full\",\"gaugeType\":\"Metric\",\"invertColors\":false,\"labels\":{\"color\":\"black\",\"show\":false},\"orientation\":\"vertical\",\"percentageMode\":false,\"scale\":{\"color\":\"#333\",\"labels\":false,\"show\":true,\"width\":2},\"style\":{\"bgColor\":true,\"bgFill\":\"white\",\"fontSize\":\"34\",\"labelColor\":false,\"subText\":\"Risk\"},\"type\":\"simple\",\"useRanges\":false,\"verticalSplit\":false},\"type\":\"gauge\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{\"customLabel\":\"Total\"}},{\"id\":\"2\",\"enabled\":true,\"type\":\"filters\",\"schema\":\"group\",\"params\":{\"filters\":[{\"input\":{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}}},\"label\":\"Critical\"}]}}],\"listeners\":{}}",
"uiStateJSON": "{\"vis\":{\"colors\":{\"0 - 10000\":\"#64B0C8\"},\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"index\":\"logstash-nessus-*\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}},\"filter\":[]}"
}
}
},
{
"_id": "5093c620-44e9-11e7-8014-ede06a7e69f8",
"_type": "visualization",
"_source": {
"title": "Nessus - Mitigation Readme",
"visState": "{\"title\":\"Nessus - Mitigation Readme\",\"type\":\"markdown\",\"params\":{\"markdown\":\"** Legend **\\n\\n* [Common Vulnerability Scoring System (CVSS)](https://nvd.nist.gov/vuln-metrics/cvss) is the NIST vulnerability scoring system\\n* Risk Number is residual risk score calculated from CVSS, which is adjusted to be specific to Heartland which accounts for services not in use such as Java and Flash\\n* Vulnerabilities by Tag are systems tagged with HIPAA and PCI identification.\\n\\n\\n** Workflow **\\n* Select 10.0 under Risk Number to identify Critical Vulnerabilities. \\n* For more information about a CVE, scroll down and click the CVE link.\\n* To filter by tags, use one of the following filters:\\n** tags:has_hipaa_data, tags:pci_asset, tags:hipaa_asset, tags:critical_asset**\"},\"aggs\":[],\"listeners\":{}}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
}
}
}
]
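These saved objects can be imported through the Kibana UI (Management > Saved Objects > Import). A minimal scripted alternative is sketched below; it assumes Kibana 5.x, which stores saved objects in the .kibana Elasticsearch index, and uses a placeholder filename for this export:

import json
import requests

ES = 'http://localhost:9200'  # assumption: Elasticsearch reachable locally

# 'nessus_visualizations.json' is a placeholder name for this export file.
with open('nessus_visualizations.json') as f:
    for obj in json.load(f):
        # Kibana 5.x keeps saved objects in .kibana as <type>/<id> documents.
        url = '%s/.kibana/%s/%s' % (ES, obj['_type'], obj['_id'])
        requests.put(url, json=obj['_source']).raise_for_status()
        print('Imported %s %s' % (obj['_type'], obj['_id']))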

View File

@@ -0,0 +1,50 @@
[
{
"_id": "5dba30c0-3df3-11e7-a44e-c79ca8efb780",
"_type": "dashboard",
"_source": {
"title": "Nessus - Risk Mitigation",
"hits": 0,
"description": "",
"panelsJSON": "[{\"col\":11,\"id\":\"995e2280-3df3-11e7-a44e-c79ca8efb780\",\"panelIndex\":20,\"row\":7,\"size_x\":2,\"size_y\":6,\"type\":\"visualization\"},{\"col\":1,\"id\":\"852816e0-3eb1-11e7-90cb-918f9cb01e3d\",\"panelIndex\":21,\"row\":8,\"size_x\":3,\"size_y\":5,\"type\":\"visualization\"},{\"col\":4,\"id\":\"297df800-3f7e-11e7-bd24-6903e3283192\",\"panelIndex\":27,\"row\":8,\"size_x\":3,\"size_y\":5,\"type\":\"visualization\"},{\"col\":9,\"id\":\"35b6d320-3f7f-11e7-bd24-6903e3283192\",\"panelIndex\":28,\"row\":7,\"size_x\":2,\"size_y\":6,\"type\":\"visualization\"},{\"col\":1,\"id\":\"471a3580-3f6b-11e7-88e7-df1abe6547fb\",\"panelIndex\":30,\"row\":4,\"size_x\":3,\"size_y\":2,\"type\":\"visualization\"},{\"col\":7,\"id\":\"de1a5f40-3f85-11e7-97f9-3777d794626d\",\"panelIndex\":31,\"row\":8,\"size_x\":2,\"size_y\":5,\"type\":\"visualization\"},{\"col\":9,\"id\":\"5093c620-44e9-11e7-8014-ede06a7e69f8\",\"panelIndex\":37,\"row\":4,\"size_x\":4,\"size_y\":3,\"type\":\"visualization\"},{\"col\":1,\"columns\":[\"host\",\"risk\",\"risk_score\",\"cve\",\"plugin_name\",\"solution\",\"plugin_output\"],\"id\":\"54648700-3f74-11e7-852e-69207a3d0726\",\"panelIndex\":38,\"row\":13,\"size_x\":12,\"size_y\":6,\"sort\":[\"@timestamp\",\"desc\"],\"type\":\"search\"},{\"col\":1,\"id\":\"fb6eb020-49ab-11e7-8f8c-57ad64ec48a6\",\"panelIndex\":39,\"row\":6,\"size_x\":3,\"size_y\":2,\"type\":\"visualization\"},{\"col\":4,\"id\":\"465c5820-8977-11e7-857e-e1d56b17746d\",\"panelIndex\":40,\"row\":4,\"size_x\":5,\"size_y\":4,\"type\":\"visualization\"},{\"col\":7,\"id\":\"db55bce0-80b3-11e7-8790-73b60225f736\",\"panelIndex\":41,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":1,\"id\":\"e46ff7f0-897d-11e7-934b-67cec0a7da65\",\"panelIndex\":42,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":5,\"id\":\"d048c220-80b3-11e7-8790-73b60225f736\",\"panelIndex\":43,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":3,\"id\":\"c1361da0-80b3-11e7-8790-73b60225f736\",\"panelIndex\":44,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":9,\"id\":\"b2f2adb0-897f-11e7-a2d2-c57bca21b3aa\",\"panelIndex\":45,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"size_x\":2,\"size_y\":3,\"panelIndex\":46,\"type\":\"visualization\",\"id\":\"56f0f5f0-3ebe-11e7-a192-93f36fbd9d05\",\"col\":11,\"row\":1}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-11\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-2\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-20\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-21\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":0,\"direction\":\"desc\"}}}},\"P-27\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-28\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":0,\"direction\":\"desc\"}}}},\"P-3\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":0,\"direction\":\"asc\"}}}},\"P-30\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-31\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-40\":{\"vis\":{\"defaultColors\":{\"0 - 3\":\"rgb(0,104,55)\",\"3 - 7\":\"rgb(135,203,103)\",\"7 - 9\":\"rgb(255,255,190)\",\"9 - 11\":\"rgb(249,142,82)\"}}},\"P-41\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"}}},\"P-42\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-43\":{\"vis\":{\"defaultColors\":{\"0 - 1000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-44\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-5\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-6\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-8\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-45\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"}}},\"P-46\":{\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": true,
"timeTo": "now",
"timeFrom": "now-30d",
"refreshInterval": {
"display": "Off",
"pause": false,
"value": 0
},
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}],\"highlightAll\":true,\"version\":true}"
}
}
},
{
"_id": "72051530-448e-11e7-a818-f5f80dfc3590",
"_type": "dashboard",
"_source": {
"title": "Nessus - Reporting",
"hits": 0,
"description": "",
"panelsJSON": "[{\"col\":1,\"id\":\"2f979030-44b9-11e7-a818-f5f80dfc3590\",\"panelIndex\":5,\"row\":12,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"},{\"col\":1,\"id\":\"8d9592d0-44ec-11e7-a05f-d9719b331a27\",\"panelIndex\":12,\"row\":8,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"},{\"col\":7,\"id\":\"67d432e0-44ec-11e7-a05f-d9719b331a27\",\"panelIndex\":14,\"row\":4,\"size_x\":6,\"size_y\":4,\"type\":\"visualization\"},{\"col\":10,\"id\":\"297df800-3f7e-11e7-bd24-6903e3283192\",\"panelIndex\":15,\"row\":8,\"size_x\":3,\"size_y\":4,\"type\":\"visualization\"},{\"col\":7,\"id\":\"471a3580-3f6b-11e7-88e7-df1abe6547fb\",\"panelIndex\":20,\"row\":8,\"size_x\":3,\"size_y\":2,\"type\":\"visualization\"},{\"col\":11,\"id\":\"995e2280-3df3-11e7-a44e-c79ca8efb780\",\"panelIndex\":22,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":9,\"id\":\"b2f2adb0-897f-11e7-a2d2-c57bca21b3aa\",\"panelIndex\":23,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":7,\"id\":\"db55bce0-80b3-11e7-8790-73b60225f736\",\"panelIndex\":25,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":5,\"id\":\"d048c220-80b3-11e7-8790-73b60225f736\",\"panelIndex\":26,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":1,\"id\":\"e46ff7f0-897d-11e7-934b-67cec0a7da65\",\"panelIndex\":27,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"col\":3,\"id\":\"c1361da0-80b3-11e7-8790-73b60225f736\",\"panelIndex\":28,\"row\":1,\"size_x\":2,\"size_y\":3,\"type\":\"visualization\"},{\"size_x\":6,\"size_y\":4,\"panelIndex\":29,\"type\":\"visualization\",\"id\":\"479deab0-8a39-11e7-a58a-9bfcb3761a3d\",\"col\":1,\"row\":4}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-15\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-20\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-21\":{\"vis\":{\"defaultColors\":{\"0 - 100\":\"rgb(0,104,55)\"}}},\"P-22\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}},\"P-23\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"}}},\"P-24\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"}}},\"P-25\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"}}},\"P-26\":{\"vis\":{\"defaultColors\":{\"0 - 1000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-27\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-28\":{\"vis\":{\"defaultColors\":{\"0 - 10000\":\"rgb(0,104,55)\"},\"legendOpen\":false}},\"P-5\":{\"vis\":{\"legendOpen\":false}}}",
"version": 1,
"timeRestore": true,
"timeTo": "now",
"timeFrom": "now-30d",
"refreshInterval": {
"display": "Off",
"pause": false,
"value": 0
},
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}}],\"highlightAll\":true,\"version\":true}"
}
}
}
]

14
logstash/0001_input_beats.conf Executable file
View File

@@ -0,0 +1,14 @@
input {
beats {
port => 5044
tags => "beats"
}
}
filter {
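# Tag events coming from the Filebeat instance that ships Nessus CSVs;
# replace "filebeathost" with the actual [beat][hostname] value in your
# environment.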
if [beat][hostname] == "filebeathost" {
mutate {
add_tag => ["nessus"]
}
}
}

View File

@@ -0,0 +1,139 @@
# Author: Austin Taylor and Justin Henderson
# Email: email@austintaylor.io
# Last Update: 12/20/2017
# Version 0.3
# Description: Takes in Nessus reports from VulnWhisperer and pumps them into Logstash
input {
file {
path => "/opt/vulnwhisp/scans/**/*"
start_position => "beginning"
tags => "nessus"
type => "nessus"
}
}
filter {
if "nessus" in [tags]{
mutate {
gsub => [
"message", "\|\|\|", " ",
"message", "\t\t", " ",
"message", " ", " ",
"message", " ", " ",
"message", " ", " "
]
}
csv {
columns => ["plugin_id", "cve", "cvss", "risk", "host", "protocol", "port", "plugin_name", "synopsis", "description", "solution", "see_also", "plugin_output"]
separator => ","
source => "message"
}
grok {
match => { "path" => "(?<scan_name>[a-zA-Z0-9_.\-]+)_%{INT:scan_id}_%{INT:history_id}_%{INT:last_updated}.csv$" }
tag_on_failure => []
}
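# The grok above assumes VulnWhisperer's CSV naming convention, e.g.
# "My_Scan_17_215_1514500000.csv" -> scan_name, scan_id, history_id and
# last_updated (epoch seconds); the date filter below turns last_updated
# into @timestamp.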
date {
match => [ "last_updated", "UNIX" ]
target => "@timestamp"
remove_field => ["last_updated"]
}
if [risk] == "None" {
mutate { add_field => { "risk_number" => 0 }}
}
if [risk] == "Low" {
mutate { add_field => { "risk_number" => 1 }}
}
if [risk] == "Medium" {
mutate { add_field => { "risk_number" => 2 }}
}
if [risk] == "High" {
mutate { add_field => { "risk_number" => 3 }}
}
if [risk] == "Critical" {
mutate { add_field => { "risk_number" => 4 }}
}
if [cve] == "nan" {
mutate { remove_field => [ "cve" ] }
}
if [see_also] == "nan" {
mutate { remove_field => [ "see_also" ] }
}
if [description] == "nan" {
mutate { remove_field => [ "description" ] }
}
if [plugin_output] == "nan" {
mutate { remove_field => [ "plugin_output" ] }
}
if [synopsis] == "nan" {
mutate { remove_field => [ "synopsis" ] }
}
mutate {
remove_field => [ "message" ]
add_field => { "risk_score" => "%{cvss}" }
}
mutate {
convert => { "risk_score" => "float" }
}
# Compensating controls - adjust risk_score
# Adobe and Java are not allowed to run in browser unless whitelisted
# Therefore, lower score by dividing by 3 (score is subjective to risk)
#Modify and uncomment when ready to use
#if [risk_score] != 0 {
# if [plugin_name] =~ "Adobe" and [risk_score] > 6 or [plugin_name] =~ "Java" and [risk_score] > 6 {
# ruby {
# code => "event.set('risk_score', event.get('risk_score') / 3)"
# }
# mutate {
# add_field => { "compensating_control" => "Adobe and Flash removed from browsers unless whitelisted site." }
# }
# }
#}
# Add tags for reporting based on assets or criticality
#if [host] == "192.168.0.1" or [host] == "192.168.0.50" or [host] =~ "^192\.168\.10\." or [host] =~ "^42.42.42." {
# mutate {
# add_tag => [ "critical_asset" ]
# }
#}
#if [host] =~ "^192\.168\.[45][0-9][0-9]\.1$" or [host] =~ "^192.168\.[50]\.[0-9]{1,2}\.1$"{
# mutate {
# add_tag => [ "has_hipaa_data" ]
# }
#}
#if [host] =~ "^192\.168\.[45][0-9][0-9]\." {
# mutate {
# add_tag => [ "hipaa_asset" ]
# }
#}
#if [host] =~ "^192\.168\.5\." {
# mutate {
# add_tag => [ "pci_asset" ]
# }
#}
#if [host] =~ "^10\.0\.50\." {
# mutate {
# add_tag => [ "web_servers" ]
# }
#}
}
}
output {
if "nessus" in [tags] or [type] == "nessus" {
#stdout { codec => rubydebug }
elasticsearch {
hosts => [ "localhost:9200" ]
index => "logstash-nessus-%{+YYYY.MM}"
}
}
}
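For a quick local check that a scan filename will match the grok pattern in this config before Logstash picks it up, here is a minimal Python sketch (not part of the repository; the sample path is illustrative):

import re

# Mirrors the grok pattern above: filename -> scan_name/scan_id/history_id/last_updated.
PATTERN = re.compile(
    r'(?P<scan_name>[a-zA-Z0-9_.\-]+)_(?P<scan_id>\d+)_'
    r'(?P<history_id>\d+)_(?P<last_updated>\d+)\.csv$'
)

def parse_scan_path(path):
    """Return the fields Logstash would extract, or None if no match."""
    match = PATTERN.search(path)
    return match.groupdict() if match else None

print(parse_scan_path('/opt/vulnwhisp/scans/nessus/My_Scan_17_215_1514500000.csv'))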

View File

@@ -0,0 +1,14 @@
# Author: Austin Taylor
# Email: email@austintaylor.io
# Last Update: 05/21/2017
# Creates logstash-nessus
output {
if "nessus" in [tags] or [type] == "nessus" {
#stdout { codec => rubydebug }
elasticsearch {
hosts => "localhost:9200"
index => "logstash-nessus-%{+YYYY.MM}"
}
}
}

View File

@@ -0,0 +1,13 @@
input {
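# Consumes Nessus events from a RabbitMQ buffer host; "buffer01" and the
# credentials below are placeholders. The matching shipper-side rabbitmq
# output is in the next file.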
rabbitmq {
key => "nessus"
queue => "nessus"
durable => true
exchange => "nessus"
user => "logstash"
password => "yourpassword"
host => "buffer01"
port => 5672
tags => [ "queue_nessus", "rabbitmq" ]
}
}

View File

@@ -0,0 +1,16 @@
output {
if "nessus" in [tags]{
rabbitmq {
key => "nessus"
exchange => "nessus"
exchange_type => "direct"
user => "logstash"
password => "yourbufferpassword"
host => "buffer01"
port => 5672
durable => true
persistent => true
}
}
}

6
requirements.txt Normal file
View File

@@ -0,0 +1,6 @@
pandas==0.20.3
setuptools==0.9.8
pytz==2017.2
Requests==2.18.3
qualysapi==4.1.0
lxml==4.1.1

View File

@@ -0,0 +1 @@
from utils.cli import bcolors

View File

@@ -2,15 +2,11 @@ import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
import pandas as pd
from pandas.io.json import json_normalize
import pytz
from datetime import datetime
import json
import sys
import os
import time
import io
class NessusAPI(object):
@@ -39,7 +35,7 @@ class NessusAPI(object):
'Origin': self.base,
'Accept-Encoding': 'gzip, deflate, br',
'Accept-Language': 'en-US,en;q=0.8',
'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.96 Safari/537.36',
'User-Agent': 'VulnWhisperer for Nessus',
'Content-Type': 'application/json',
'Accept': 'application/json, text/javascript, */*; q=0.01',
'Referer': self.base,

View File

@@ -0,0 +1,836 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
__author__ = 'Austin Taylor'
from lxml import objectify
from lxml.builder import E
import xml.etree.ElementTree as ET
import pandas as pd
import qualysapi
import qualysapi.config as qcconf
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
import sys
import os
import csv
import dateutil.parser as dp
class qualysWhisperAPI(object):
COUNT_WEBAPP = '/count/was/webapp'
COUNT_WASSCAN = '/count/was/wasscan'
DELETE_REPORT = '/delete/was/report/{report_id}'
GET_WEBAPP_DETAILS = '/get/was/webapp/{was_id}'
QPS_REST_3 = '/qps/rest/3.0'
REPORT_DETAILS = '/get/was/report/{report_id}'
REPORT_STATUS = '/status/was/report/{report_id}'
REPORT_CREATE = '/create/was/report'
REPORT_DOWNLOAD = '/download/was/report/{report_id}'
SCAN_DETAILS = '/get/was/wasscan/{scan_id}'
SCAN_DOWNLOAD = '/download/was/wasscan/{scan_id}'
SEARCH_REPORTS = '/search/was/report'
SEARCH_WEB_APPS = '/search/was/webapp'
SEARCH_WAS_SCAN = '/search/was/wasscan'
VERSION = '/qps/rest/portal/version'
def __init__(self, config=None):
self.config = config
try:
self.qgc = qualysapi.connect(config)
print('[SUCCESS] - Connected to Qualys at %s' % self.qgc.server)
except Exception as e:
print('[ERROR] Could not connect to Qualys - %s' % e)
self.headers = {
"content-type": "text/xml"}
self.config_parse = qcconf.QualysConnectConfig(config)
try:
self.template_id = self.config_parse.get_template_id()
except Exception as e:
print('[ERROR] - Could not retrieve template ID - %s' % e)
def request(self, path, method='get', data=None):
methods = {'get': requests.get,
'post': requests.post}
base = 'https://' + self.qgc.server + path
req = methods[method](base, auth=self.qgc.auth, data=data, headers=self.headers).content
return req
def get_version(self):
return self.request(self.VERSION)
def get_scan_count(self, scan_name):
parameters = (
E.ServiceRequest(
E.filters(
E.Criteria({'field': 'name', 'operator': 'CONTAINS'}, scan_name))))
xml_output = self.qgc.request(self.COUNT_WEBAPP, parameters)
root = objectify.fromstring(xml_output)
return root.count.text
def get_was_scan_count(self, status):
parameters = (
E.ServiceRequest(
E.filters(
E.Criteria({'field': 'status', 'operator': 'EQUALS'}, status))))
xml_output = self.qgc.request(self.COUNT_WASSCAN, parameters)
root = objectify.fromstring(xml_output)
return root.count.text
def get_reports(self):
return self.qgc.request(self.SEARCH_REPORTS)
def xml_parser(self, xml, dupfield=None):
all_records = []
root = ET.XML(xml)
for i, child in enumerate(root):
for subchild in child:
record = {}
dup_tracker = 0
for p in subchild:
record[p.tag] = p.text
for o in p:
if o.tag in record:
dup_tracker += 1
record[o.tag + '_%s' % dup_tracker] = o.text
else:
record[o.tag] = o.text
all_records.append(record)
return pd.DataFrame(all_records)
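# xml_parser (above) flattens two levels of a Qualys ServiceResponse into
# one record per element; when a nested tag repeats within a record it is
# kept with a _1, _2, ... suffix instead of being overwritten.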
def get_report_list(self):
"""Returns a dataframe of reports"""
return self.xml_parser(self.get_reports(), dupfield='user_id')
def get_web_apps(self):
"""Returns webapps available for account"""
return self.qgc.request(self.SEARCH_WEB_APPS)
def get_web_app_list(self):
"""Returns dataframe of webapps"""
return self.xml_parser(self.get_web_apps(), dupfield='user_id')
def get_web_app_details(self, was_id):
"""Get webapp details - use to retrieve app ID tag"""
return self.qgc.request(self.GET_WEBAPP_DETAILS.format(was_id=was_id))
def get_scans_by_app_id(self, app_id):
data = self.generate_app_id_scan_XML(app_id)
return self.qgc.request(self.SEARCH_WAS_SCAN, data)
def get_scan_info(self, limit=1000, offset=1, status='FINISHED'):
""" Returns XML of ALL WAS Scans"""
data = self.generate_scan_result_XML(limit=limit, offset=offset, status=status)
return self.qgc.request(self.SEARCH_WAS_SCAN, data)
def get_all_scans(self, limit=1000, offset=1, status='FINISHED'):
qualys_api_limit = limit
dataframes = []
_records = []
total = int(self.get_was_scan_count(status=status))
print('Processing %s total scans' % total)
for i in range(0, total):
if i % limit == 0:
if (total - i) < limit:
qualys_api_limit = total - i
print('Making a request with a limit of %s at offset %s' % (str(qualys_api_limit), str(i + 1)))
scan_info = self.get_scan_info(limit=qualys_api_limit, offset=i + 1, status=status)
_records.append(scan_info)
print('Converting XML to DataFrame')
dataframes = [self.xml_parser(xml) for xml in _records]
return pd.concat(dataframes, axis=0).reset_index().drop('index', axis=1)
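# get_all_scans (above) pages through every FINISHED scan in limit-sized
# requests, shrinking the final request to the remainder, then
# concatenates the parsed pages into a single DataFrame.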
def get_scan_details(self, scan_id):
return self.qgc.request(self.SCAN_DETAILS.format(scan_id=scan_id))
def get_report_details(self, report_id):
return self.qgc.request(self.REPORT_DETAILS.format(report_id=report_id))
def get_report_status(self, report_id):
return self.qgc.request(self.REPORT_STATUS.format(report_id=report_id))
def download_report(self, report_id):
return self.qgc.request(self.REPORT_DOWNLOAD.format(report_id=report_id))
def download_scan_results(self, scan_id):
return self.qgc.request(self.SCAN_DOWNLOAD.format(scan_id=scan_id))
def generate_scan_result_XML(self, limit=1000, offset=1, status='FINISHED'):
report_xml = E.ServiceRequest(
E.filters(
E.Criteria({'field': 'status', 'operator': 'EQUALS'}, status
),
),
E.preferences(
E.startFromOffset(str(offset)),
E.limitResults(str(limit))
),
)
return report_xml
def generate_scan_report_XML(self, scan_id):
"""Generates a CSV report for an asset based on template defined in .ini file"""
report_xml = E.ServiceRequest(
E.data(
E.Report(
E.name('<![CDATA[API Scan Report generated by VulnWhisperer]]>'),
E.description('<![CDATA[CSV Scanning report for VulnWhisperer]]>'),
E.format('CSV'),
E.type('WAS_SCAN_REPORT'),
E.template(
E.id(self.template_id)
),
E.config(
E.scanReport(
E.target(
E.scans(
E.WasScan(
E.id(scan_id)
)
),
),
),
)
)
)
)
return report_xml
def generate_webapp_report_XML(self, app_id):
"""Generates a CSV report for an asset based on template defined in .ini file"""
report_xml = E.ServiceRequest(
E.data(
E.Report(
E.name('<![CDATA[API Web Application Report generated by VulnWhisperer]]>'),
E.description('<![CDATA[CSV WebApp report for VulnWhisperer]]>'),
E.format('CSV'),
E.template(
E.id(self.template_id)
),
E.config(
E.webAppReport(
E.target(
E.webapps(
E.WebApp(
E.id(app_id)
)
),
),
),
)
)
)
)
return report_xml
def generate_app_id_scan_XML(self, app_id):
report_xml = E.ServiceRequest(
E.filters(
E.Criteria({'field': 'webApp.id', 'operator': 'EQUALS'}, app_id
),
),
)
return report_xml
def create_report(self, report_id, kind='scan'):
mapper = {'scan': self.generate_scan_report_XML,
'webapp': self.generate_webapp_report_XML}
try:
# print lxml.etree.tostring(mapper[kind](report_id), pretty_print=True)
data = mapper[kind](report_id)
except Exception as e:
print(e)
return self.qgc.request(self.REPORT_CREATE, data)
def delete_report(self, report_id):
return self.qgc.request(self.DELETE_REPORT.format(report_id=report_id))
class qualysReportFields:
CATEGORIES = ['VULNERABILITY',
'SENSITIVECONTENT',
'INFORMATION_GATHERED']
# URL Vulnerability Information
VULN_BLOCK = [
CATEGORIES[0],
'ID',
'QID',
'Url',
'Param',
'Function',
'Form Entry Point',
'Access Path',
'Authentication',
'Ajax Request',
'Ajax Request ID',
'Ignored',
'Ignore Reason',
'Ignore Date',
'Ignore User',
'Ignore Comments',
'First Time Detected',
'Last Time Detected',
'Last Time Tested',
'Times Detected',
'Payload #1',
'Request Method #1',
'Request URL #1',
'Request Headers #1',
'Response #1',
'Evidence #1',
]
INFO_HEADER = [
'Vulnerability Category',
'ID',
'QID',
'Response #1',
'Last Time Detected',
]
INFO_BLOCK = [
CATEGORIES[2],
'ID',
'QID',
'Results',
'Detection Date',
]
QID_HEADER = [
'QID',
'Id',
'Title',
'Category',
'Severity Level',
'Groups',
'OWASP',
'WASC',
'CWE',
'CVSS Base',
'CVSS Temporal',
'Description',
'Impact',
'Solution',
]
GROUP_HEADER = ['GROUP', 'Name', 'Category']
OWASP_HEADER = ['OWASP', 'Code', 'Name']
WASC_HEADER = ['WASC', 'Code', 'Name']
SCAN_META = ['Web Application Name', 'URL', 'Owner', 'Scope', 'Operating System']
CATEGORY_HEADER = ['Category', 'Severity', 'Level', 'Description']
class qualysUtils:
def __init__(self):
pass
def grab_section(
self,
report,
section,
end=[],
pop_last=False,
):
temp_list = []
max_col_count = 0
with open(report, 'rb') as csvfile:
q_report = csv.reader(csvfile, delimiter=',', quotechar='"')
for line in q_report:
if set(line) == set(section):
break
# Reads text until the end of the block:
for line in q_report: # This keeps reading the file
temp_list.append(line)
if line in end:
break
if pop_last and len(temp_list) > 1:
temp_list.pop(-1)
return temp_list
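# grab_section (above) seeks to the row matching the section header, then
# collects rows until it hits one of the end markers; pop_last drops that
# trailing marker row from the returned block.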
def iso_to_epoch(self, dt):
return dp.parse(dt).strftime('%s')
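# Note: strftime('%s') relies on a platform-specific (glibc) extension to
# emit epoch seconds; it is not portable to e.g. Windows.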
def cleanser(self, _data):
repls = (('\n', '|||'), ('\r', '|||'), (',', ';'), ('\t', '|||'))
if _data:
_data = reduce(lambda a, kv: a.replace(*kv), repls, str(_data))
return _data
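# cleanser keeps each record on a single CSV line by mapping newlines,
# carriage returns and tabs to '|||' and commas to ';'. The Logstash gsub
# filter shown earlier replaces the '|||' marker with spaces on ingest.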
class qualysWebAppReport:
# URL Vulnerability Information
WEB_APP_VULN_BLOCK = list(qualysReportFields.VULN_BLOCK)
WEB_APP_VULN_BLOCK.insert(0, 'Web Application Name')
WEB_APP_VULN_BLOCK.insert(WEB_APP_VULN_BLOCK.index('Ignored'), 'Status')
WEB_APP_VULN_HEADER = list(WEB_APP_VULN_BLOCK)
WEB_APP_VULN_HEADER[WEB_APP_VULN_BLOCK.index(qualysReportFields.CATEGORIES[0])] = \
'Vulnerability Category'
WEB_APP_SENSITIVE_HEADER = list(WEB_APP_VULN_HEADER)
WEB_APP_SENSITIVE_HEADER.insert(WEB_APP_SENSITIVE_HEADER.index('Url'), 'Content')
WEB_APP_SENSITIVE_BLOCK = list(WEB_APP_SENSITIVE_HEADER)
WEB_APP_SENSITIVE_BLOCK[WEB_APP_SENSITIVE_BLOCK.index('Vulnerability Category')] = qualysReportFields.CATEGORIES[1]
WEB_APP_INFO_HEADER = list(qualysReportFields.INFO_HEADER)
WEB_APP_INFO_HEADER.insert(0, 'Web Application Name')
WEB_APP_INFO_BLOCK = list(qualysReportFields.INFO_BLOCK)
WEB_APP_INFO_BLOCK.insert(0, 'Web Application Name')
QID_HEADER = list(qualysReportFields.QID_HEADER)
GROUP_HEADER = list(qualysReportFields.GROUP_HEADER)
OWASP_HEADER = list(qualysReportFields.OWASP_HEADER)
WASC_HEADER = list(qualysReportFields.WASC_HEADER)
SCAN_META = list(qualysReportFields.SCAN_META)
CATEGORY_HEADER = list(qualysReportFields.CATEGORY_HEADER)
def __init__(
self,
config=None,
file_in=None,
file_stream=False,
delimiter=',',
quotechar='"',
):
self.file_in = file_in
self.file_stream = file_stream
self.report = None
self.utils = qualysUtils()
if config:
try:
self.qw = qualysWhisperAPI(config=config)
except Exception as e:
print('Could not load config! Please check settings: %s' % e)
if file_stream:
self.open_file = file_in.splitlines()
elif file_in:
self.open_file = open(file_in, 'rb')
self.downloaded_file = None
def get_hostname(self, report):
host = ''
with open(report, 'rb') as csvfile:
q_report = csv.reader(csvfile, delimiter=',', quotechar='"')
for x in q_report:
if 'Web Application Name' in x[0]:
host = q_report.next()[0]
return host
def get_scanreport_name(self, report):
scan_name = ''
with open(report, 'rb') as csvfile:
q_report = csv.reader(csvfile, delimiter=',', quotechar='"')
for x in q_report:
if 'Scans' in x[0]:
scan_name = x[1]
return scan_name
def grab_sections(self, report):
all_dataframes = []
with open(report, 'rb') as csvfile:
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.WEB_APP_VULN_BLOCK,
end=[self.WEB_APP_SENSITIVE_BLOCK,
self.WEB_APP_INFO_BLOCK],
pop_last=True),
columns=self.WEB_APP_VULN_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.WEB_APP_SENSITIVE_BLOCK,
end=[self.WEB_APP_INFO_BLOCK,
self.WEB_APP_SENSITIVE_BLOCK],
pop_last=True),
columns=self.WEB_APP_SENSITIVE_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.WEB_APP_INFO_BLOCK,
end=[self.QID_HEADER],
pop_last=True),
columns=self.WEB_APP_INFO_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.QID_HEADER,
end=[self.GROUP_HEADER],
pop_last=True),
columns=self.QID_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.GROUP_HEADER,
end=[self.OWASP_HEADER],
pop_last=True),
columns=self.GROUP_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.OWASP_HEADER,
end=[self.WASC_HEADER],
pop_last=True),
columns=self.OWASP_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.WASC_HEADER, end=[['APPENDIX']],
pop_last=True),
columns=self.WASC_HEADER))
all_dataframes.append(pd.DataFrame(self.utils.grab_section(report,
self.CATEGORY_HEADER),
columns=self.CATEGORY_HEADER))
return all_dataframes
def data_normalizer(self, dataframes):
"""
Merge and clean data
:param dataframes:
:return:
"""
merged_df = pd.concat([dataframes[0], dataframes[1],
dataframes[2]], axis=0,
ignore_index=False)
merged_df = pd.merge(merged_df, dataframes[3], left_on='QID',
right_on='Id')
if 'Content' not in merged_df:
merged_df['Content'] = ''
columns_to_cleanse = ['Payload #1', 'Request Method #1', 'Request URL #1',
'Request Headers #1', 'Response #1', 'Evidence #1',
'Description', 'Impact', 'Solution', 'Url', 'Content']
for col in columns_to_cleanse:
merged_df[col] = merged_df[col].astype(str).apply(self.utils.cleanser)
merged_df = merged_df.drop(['QID_y', 'QID_x'], axis=1)
merged_df = merged_df.rename(columns={'Id': 'QID'})
merged_df = merged_df.replace('N/A','').fillna('')
try:
merged_df = merged_df[~merged_df.Title.str.contains('Links Crawled|External Links Discovered')]
except Exception as e:
print(e)
return merged_df
def download_file(self, file_id):
report = self.qw.download_report(file_id)
filename = str(file_id) + '.csv'
file_out = open(filename, 'w')
for line in report.splitlines():
file_out.write(line + '\n')
file_out.close()
print('[ACTION] - File written to %s' % filename)
return filename
def remove_file(self, filename):
os.remove(filename)
def process_data(self, file_id, scan=True, cleanup=True):
"""Downloads a file from qualys and normalizes it"""
download_file = self.download_file(file_id)
print('[ACTION] - Downloading file ID: %s' % file_id)
report_data = self.grab_sections(download_file)
merged_data = self.data_normalizer(report_data)
if scan:
scan_name = self.get_scanreport_name(download_file)
merged_data['ScanName'] = scan_name
# TODO cleanup old data (delete)
return merged_data
def whisper_reports(self, report_id, updated_date, cleanup=False):
"""
report_id: App ID
updated_date: Last time the scan was run for app_id
"""
vuln_ready = None
try:
if 'Z' in updated_date:
updated_date = self.utils.iso_to_epoch(updated_date)
report_name = 'qualys_web_' + str(report_id) \
+ '_{last_updated}'.format(last_updated=updated_date) \
+ '.csv'
if os.path.isfile(report_name):
print('[ACTION] - File already exists! Skipping...')
pass
else:
print('[ACTION] - Generating report for %s' % report_id)
status = self.qw.create_report(report_id)
root = objectify.fromstring(status)
if root.responseCode == 'SUCCESS':
print('[INFO] - Successfully generated report for webapp: %s' \
% report_id)
generated_report_id = root.data.Report.id
print('[INFO] - New Report ID: %s' % generated_report_id)
vuln_ready = self.process_data(generated_report_id)
vuln_ready.to_csv(report_name, index=False, header=True) # add when timestamp occurred
print('[SUCCESS] - Report written to %s' \
% report_name)
if cleanup:
print('[ACTION] - Removing report %s' \
% generated_report_id)
cleaning_up = \
self.qw.delete_report(generated_report_id)
self.remove_file(str(generated_report_id) + '.csv')
print('[ACTION] - Deleted report: %s' \
% generated_report_id)
else:
print('Could not process report ID %s - API response: %s' % (report_id, status))
except Exception as e:
print('[ERROR] - Could not process %s - %s' % (report_id, e))
return vuln_ready
class qualysScanReport:
# URL Vulnerability Information
WEB_SCAN_VULN_BLOCK = list(qualysReportFields.VULN_BLOCK)
WEB_SCAN_VULN_BLOCK.insert(WEB_SCAN_VULN_BLOCK.index('QID'), 'Detection ID')
WEB_SCAN_VULN_HEADER = list(WEB_SCAN_VULN_BLOCK)
WEB_SCAN_VULN_HEADER[WEB_SCAN_VULN_BLOCK.index(qualysReportFields.CATEGORIES[0])] = \
'Vulnerability Category'
WEB_SCAN_SENSITIVE_HEADER = list(WEB_SCAN_VULN_HEADER)
WEB_SCAN_SENSITIVE_HEADER.insert(WEB_SCAN_SENSITIVE_HEADER.index('Url'), 'Content')
WEB_SCAN_SENSITIVE_BLOCK = list(WEB_SCAN_SENSITIVE_HEADER)
WEB_SCAN_SENSITIVE_BLOCK.insert(WEB_SCAN_SENSITIVE_BLOCK.index('QID'), 'Detection ID')
WEB_SCAN_SENSITIVE_BLOCK[WEB_SCAN_SENSITIVE_BLOCK.index('Vulnerability Category')] = qualysReportFields.CATEGORIES[1]
WEB_SCAN_INFO_HEADER = list(qualysReportFields.INFO_HEADER)
WEB_SCAN_INFO_HEADER.insert(WEB_SCAN_INFO_HEADER.index('QID'), 'Detection ID')
WEB_SCAN_INFO_BLOCK = list(qualysReportFields.INFO_BLOCK)
WEB_SCAN_INFO_BLOCK.insert(WEB_SCAN_INFO_BLOCK.index('QID'), 'Detection ID')
QID_HEADER = list(qualysReportFields.QID_HEADER)
GROUP_HEADER = list(qualysReportFields.GROUP_HEADER)
OWASP_HEADER = list(qualysReportFields.OWASP_HEADER)
WASC_HEADER = list(qualysReportFields.WASC_HEADER)
SCAN_META = list(qualysReportFields.SCAN_META)
CATEGORY_HEADER = list(qualysReportFields.CATEGORY_HEADER)
def __init__(
self,
config=None,
file_in=None,
file_stream=False,
delimiter=',',
quotechar='"',
):
self.file_in = file_in
self.file_stream = file_stream
self.report = None
self.utils = qualysUtils()
if config:
try:
self.qw = qualysWhisperAPI(config=config)
except Exception as e:
print('Could not load config! Please check settings: %s' % e)
if file_stream:
self.open_file = file_in.splitlines()
elif file_in:
self.open_file = open(file_in, 'rb')
self.downloaded_file = None
def grab_sections(self, report):
all_dataframes = []
dict_tracker = {}
with open(report, 'rb') as csvfile:
dict_tracker['WEB_SCAN_VULN_BLOCK'] = pd.DataFrame(self.utils.grab_section(report,
self.WEB_SCAN_VULN_BLOCK,
end=[
self.WEB_SCAN_SENSITIVE_BLOCK,
self.WEB_SCAN_INFO_BLOCK],
pop_last=True),
columns=self.WEB_SCAN_VULN_HEADER)
dict_tracker['WEB_SCAN_SENSITIVE_BLOCK'] = pd.DataFrame(self.utils.grab_section(report,
self.WEB_SCAN_SENSITIVE_BLOCK,
end=[
self.WEB_SCAN_INFO_BLOCK,
self.WEB_SCAN_SENSITIVE_BLOCK],
pop_last=True),
columns=self.WEB_SCAN_SENSITIVE_HEADER)
dict_tracker['WEB_SCAN_INFO_BLOCK'] = pd.DataFrame(self.utils.grab_section(report,
self.WEB_SCAN_INFO_BLOCK,
end=[self.QID_HEADER],
pop_last=True),
columns=self.WEB_SCAN_INFO_HEADER)
dict_tracker['QID_HEADER'] = pd.DataFrame(self.utils.grab_section(report,
self.QID_HEADER,
end=[self.GROUP_HEADER],
pop_last=True),
columns=self.QID_HEADER)
dict_tracker['GROUP_HEADER'] = pd.DataFrame(self.utils.grab_section(report,
self.GROUP_HEADER,
end=[self.OWASP_HEADER],
pop_last=True),
columns=self.GROUP_HEADER)
dict_tracker['OWASP_HEADER'] = pd.DataFrame(self.utils.grab_section(report,
self.OWASP_HEADER,
end=[self.WASC_HEADER],
pop_last=True),
columns=self.OWASP_HEADER)
dict_tracker['WASC_HEADER'] = pd.DataFrame(self.utils.grab_section(report,
self.WASC_HEADER, end=[['APPENDIX']],
pop_last=True),
columns=self.WASC_HEADER)
dict_tracker['SCAN_META'] = pd.DataFrame(self.utils.grab_section(report,
self.SCAN_META,
end=[self.CATEGORY_HEADER],
pop_last=True),
columns=self.SCAN_META)
dict_tracker['CATEGORY_HEADER'] = pd.DataFrame(self.utils.grab_section(report,
self.CATEGORY_HEADER),
columns=self.CATEGORY_HEADER)
all_dataframes.append(dict_tracker)
return all_dataframes
def data_normalizer(self, dataframes):
"""
Merge and clean data
:param dataframes:
:return:
"""
df_dict = dataframes[0]
merged_df = pd.concat([df_dict['WEB_SCAN_VULN_BLOCK'], df_dict['WEB_SCAN_SENSITIVE_BLOCK'],
df_dict['WEB_SCAN_INFO_BLOCK']], axis=0,
ignore_index=False)
merged_df = pd.merge(merged_df, df_dict['QID_HEADER'], left_on='QID',
right_on='Id')
if 'Content' not in merged_df:
merged_df['Content'] = ''
columns_to_cleanse = ['Payload #1', 'Request Method #1', 'Request URL #1',
'Request Headers #1', 'Response #1', 'Evidence #1',
'Description', 'Impact', 'Solution', 'Url', 'Content']
for col in columns_to_cleanse:
merged_df[col] = merged_df[col].apply(self.utils.cleanser)
merged_df = merged_df.drop(['QID_y', 'QID_x'], axis=1)
merged_df = merged_df.rename(columns={'Id': 'QID'})
merged_df = merged_df.assign(**df_dict['SCAN_META'].to_dict(orient='records')[0])
merged_df = pd.merge(merged_df, df_dict['CATEGORY_HEADER'], how='left', left_on=['Category', 'Severity Level'],
right_on=['Category', 'Severity'], suffixes=('Severity', 'CatSev'))
merged_df = merged_df.replace('N/A', '').fillna('')
try:
merged_df = merged_df[~merged_df.Title.str.contains('Links Crawled|External Links Discovered')]
except Exception as e:
print(e)
return merged_df
def download_file(self, path='', file_id=None):
report = self.qw.download_report(file_id)
filename = path + str(file_id) + '.csv'
file_out = open(filename, 'w')
for line in report.splitlines():
file_out.write(line + '\n')
file_out.close()
print('[ACTION] - File written to %s' % filename)
return filename
def remove_file(self, filename):
os.remove(filename)
def process_data(self, path='', file_id=None, cleanup=True):
"""Downloads a file from qualys and normalizes it"""
print('[ACTION] - Downloading file ID: %s' % file_id)
download_file = self.download_file(path=path, file_id=file_id)
report_data = self.grab_sections(download_file)
merged_data = self.data_normalizer(report_data)
merged_data.sort_index(axis=1, inplace=True)
# TODO cleanup old data (delete)
return merged_data
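# Illustrative usage (the config path and file_id below are hypothetical):
#   scan = qualysScanReport(config='frameworks.ini')
#   df = scan.process_data(path='/tmp/', file_id='123456')
#   df.to_csv('qualys_web_123456.csv', index=False)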
def whisper_reports(self, report_id, updated_date, cleanup=False):
"""
report_id: App ID
updated_date: Last time the scan was run for app_id
"""
vuln_ready = None
try:
if 'Z' in updated_date:
updated_date = self.utils.iso_to_epoch(updated_date)
report_name = 'qualys_web_' + str(report_id) + '_{last_updated}'.format(last_updated=updated_date) + '.csv'
if os.path.isfile(report_name):
print('[ACTION] - File already exists! Skipping...')
else:
print('[ACTION] - Generating report for %s' % report_id)
status = self.qw.create_report(report_id)
root = objectify.fromstring(status)
if root.responseCode == 'SUCCESS':
print('[INFO] - Successfully generated report for webapp: %s' % report_id)
generated_report_id = root.data.Report.id
print('[INFO] - New Report ID: %s' % generated_report_id)
vuln_ready = self.process_data(file_id=generated_report_id)
vuln_ready.to_csv(report_name, index=False, header=True)  # add when timestamp occurred
print('[SUCCESS] - Report written to %s' % report_name)
if cleanup:
print('[ACTION] - Removing report %s' % generated_report_id)
cleaning_up = self.qw.delete_report(generated_report_id)
self.remove_file(str(generated_report_id) + '.csv')
print('[ACTION] - Deleted report: %s' % generated_report_id)
else:
print('Could not process report ID: %s' % status)
except Exception as e:
print('[ERROR] - Could not process %s - %s' % (report_id, e))
return vuln_ready
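# Raise the csv module's field-size limit so very large cells (e.g. full HTTP
# responses in a report) don't raise "field larger than field limit".
# csv.field_size_limit() raises OverflowError when the value doesn't fit in a
# C long (common on Windows), so step maxInt down by 10x until it is accepted.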
maxInt = sys.maxsize
decrement = True
while decrement:
decrement = False
try:
csv.field_size_limit(maxInt)
except OverflowError:
maxInt = int(maxInt/10)
decrement = True

View File

@ -12,5 +12,6 @@ class bcolors:
UNDERLINE = '\033[4m'
INFO = '{info}[INFO]{endc}'.format(info=OKBLUE, endc=ENDC)
ACTION = '{info}[ACTION]{endc}'.format(info=OKBLUE, endc=ENDC)
SUCCESS = '{green}[SUCCESS]{endc}'.format(green=OKGREEN, endc=ENDC)
FAIL = '{red}[FAIL]{endc}'.format(red=FAIL, endc=ENDC)
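# note: this rebinds FAIL - the right-hand FAIL is the raw color code defined
# earlier in the class; afterwards bcolors.FAIL is the formatted "[FAIL]" tag.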

View File

@ -1,8 +1,13 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
__author__ = 'Austin Taylor'
from base.config import vwConfig
from frameworks.nessus import NessusAPI
from frameworks.qualys import qualysScanReport
from utils.cli import bcolors
import pandas as pd
from lxml import objectify
import sys
import os
import io
@ -10,103 +15,102 @@ import time
import sqlite3
# TODO Create logging option which stores data about scan
import logging
class vulnWhispererBase(object):
class vulnWhisperer(object):
CONFIG_SECTION = None
def __init__(self, config=None, db_name='report_tracker.db', purge=False, verbose=None, debug=False):
def __init__(
self,
config=None,
db_name='report_tracker.db',
purge=False,
verbose=None,
debug=False,
username=None,
password=None,
section=None,
):
self.verbose = verbose
self.nessus_connect = False
self.develop = True
if self.CONFIG_SECTION is None:
raise Exception('Implementing class must define CONFIG_SECTION')
self.db_name = db_name
self.purge = purge
if config is not None:
try:
self.config = vwConfig(config_in=config)
self.nessus_enabled = self.config.getbool('nessus', 'enabled')
self.config = vwConfig(config_in=config)
self.enabled = self.config.get(self.CONFIG_SECTION, 'enabled')
self.hostname = self.config.get(self.CONFIG_SECTION, 'hostname')
self.username = self.config.get(self.CONFIG_SECTION, 'username')
self.password = self.config.get(self.CONFIG_SECTION, 'password')
self.write_path = self.config.get(self.CONFIG_SECTION, 'write_path')
self.db_path = self.config.get(self.CONFIG_SECTION, 'db_path')
self.verbose = self.config.getbool(self.CONFIG_SECTION, 'verbose')
if self.nessus_enabled:
self.nessus_hostname = self.config.get('nessus', 'hostname')
self.nessus_port = self.config.get('nessus', 'port')
self.nessus_username = self.config.get('nessus', 'username')
self.nessus_password = self.config.get('nessus', 'password')
self.nessus_writepath = self.config.get('nessus', 'write_path')
self.nessus_dbpath = self.config.get('nessus', 'db_path')
self.nessus_trash = self.config.getbool('nessus', 'trash')
self.verbose = self.config.getbool('nessus', 'verbose')
try:
self.vprint(
'{info} Attempting to connect to nessus...'.format(info=bcolors.INFO))
self.nessus = NessusAPI(hostname=self.nessus_hostname,
port=self.nessus_port,
username=self.nessus_username,
password=self.nessus_password)
self.nessus_connect = True
self.vprint(
'{success} Connected to nessus on {host}:{port}'.format(success=bcolors.SUCCESS,
host=self.nessus_hostname,
port=str(self.nessus_port)))
except Exception as e:
self.vprint(e)
raise Exception(
"{fail} Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}".format(config=self.config,
fail=bcolors.FAIL,
e=e))
except Exception as e:
self.vprint('{fail} Could not properly load your config!\nReason: {e}'.format(fail=bcolors.FAIL, e=e))
sys.exit(1)
if db_name is not None:
if self.nessus_dbpath:
self.database = os.path.join(self.nessus_dbpath, db_name)
if self.db_name is not None:
if self.db_path:
self.database = os.path.join(self.db_path, db_name)
else:
self.database = os.path.abspath(os.path.join(os.path.dirname( __file__ ), 'database', db_name))
self.database = os.path.abspath(os.path.join(os.path.dirname(__file__), 'database', db_name))
try:
self.conn = sqlite3.connect(self.database)
self.cur = self.conn.cursor()
self.vprint("{info} Connected to database at {loc}".format(info=bcolors.INFO, loc=self.database))
self.vprint('{info} Connected to database at {loc}'.format(info=bcolors.INFO, loc=self.database))
except Exception as e:
self.vprint("{fail} Could not connect to database at {loc}\nReason: {e} - Please ensure the path exist".format(e=e, fail=bcolors.FAIL, loc=self.database))
self.vprint(
'{fail} Could not connect to database at {loc}\nReason: {e} - Please ensure the path exist'.format(
e=e,
fail=bcolors.FAIL, loc=self.database))
else:
self.vprint('{fail} Please specify a database to connect to!'.format(fail=bcolors.FAIL))
sys.exit(1)
self.table_columns = ['scan_name',
'scan_id',
'last_modified',
'filename',
'download_time',
'record_count',
'source',
'uuid',
'processed']
self.table_columns = [
'scan_name',
'scan_id',
'last_modified',
'filename',
'download_time',
'record_count',
'source',
'uuid',
'processed',
]
self.init()
self.uuids = self.retrieve_uuids()
self.processed = 0
self.skipped = 0
self.scan_list = []
def vprint(self, msg):
if self.verbose:
print(msg)
def create_table(self):
self.cur.execute("create table if not exists scan_history (id integer primary key, scan_name text, scan_id integer, last_modified date, filename text, download_time date, record_count integer, source text, uuid text, processed integer)")
self.cur.execute(
'CREATE TABLE IF NOT EXISTS scan_history (id INTEGER PRIMARY KEY,'
' scan_name TEXT, scan_id INTEGER, last_modified DATE, filename TEXT,'
' download_time DATE, record_count INTEGER, source TEXT,'
' uuid TEXT, processed INTEGER)'
)
self.conn.commit()
def delete_table(self):
self.cur.execute('drop table if exists scan_history')
self.cur.execute('DROP TABLE IF EXISTS scan_history')
self.conn.commit()
def init(self):
@ -115,15 +119,88 @@ class vulnWhisperer(object):
self.create_table()
def cleanser(self, _data):
repls = ('\n', '|||'), ('\r', '|||'), (',',';')
repls = (('\n', '|||'), ('\r', '|||'), (',', ';'))
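# reduce() applies each (old, new) pair in turn, e.g. 'a,b\nc' -> 'a;b|||c',
# keeping embedded newlines and commas out of the CSV output.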
data = reduce(lambda a, kv: a.replace(*kv), repls, _data)
return data
def path_check(self, _data):
if self.nessus_writepath:
data = self.nessus_writepath + '/' + _data
if self.write_path:
data = self.write_path + '/' + _data
else:
data = _data
return data
def record_insert(self, record):
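# record must be a 9-tuple in the same order as self.table_columns.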
self.cur.execute('insert into scan_history({table_columns}) values (?,?,?,?,?,?,?,?,?)'.format(
table_columns=', '.join(self.table_columns)),
record)
self.conn.commit()
def retrieve_uuids(self):
"""
Retrieves UUIDs from database and checks list to determine which files need to be processed.
:return:
"""
try:
self.conn.text_factory = str
self.cur.execute('SELECT uuid FROM scan_history WHERE source = ?', (self.CONFIG_SECTION,))
results = frozenset([r[0] for r in self.cur.fetchall()])
except Exception:
results = []
return results
class vulnWhispererNessus(vulnWhispererBase):
CONFIG_SECTION = 'nessus'
def __init__(
self,
config=None,
db_name='report_tracker.db',
purge=False,
verbose=None,
debug=False,
username=None,
password=None,
):
super(vulnWhispererNessus, self).__init__(config=config)
self.port = int(self.config.get(self.CONFIG_SECTION, 'port'))
self.develop = True
self.purge = purge
if config is not None:
try:
#if self.enabled:
self.nessus_port = self.config.get(self.CONFIG_SECTION, 'port')
self.nessus_trash = self.config.getbool(self.CONFIG_SECTION,
'trash')
try:
self.vprint('{info} Attempting to connect to nessus...'.format(info=bcolors.INFO))
self.nessus = \
NessusAPI(hostname=self.hostname,
port=self.nessus_port,
username=self.username,
password=self.password)
self.nessus_connect = True
self.vprint('{success} Connected to nessus on {host}:{port}'.format(success=bcolors.SUCCESS,
host=self.hostname,
port=str(self.nessus_port)))
except Exception as e:
self.vprint(e)
raise Exception(
'{fail} Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
config=self.config,
fail=bcolors.FAIL, e=e))
except Exception as e:
self.vprint('{fail} Could not properly load your config!\nReason: {e}'.format(fail=bcolors.FAIL,
e=e))
sys.exit(1)
def scan_count(self, scans, completed=False):
"""
@ -131,6 +208,7 @@ class vulnWhisperer(object):
:param completed: Only return completed scans
:return:
"""
self.vprint('{info} Gathering all scan data... this may take a while...'.format(info=bcolors.INFO))
scan_records = []
for s in scans:
@ -148,14 +226,18 @@ class vulnWhisperer(object):
record['uuid'] = h.get('uuid', '')
record['status'] = h.get('status', '')
record['history_id'] = h.get('history_id', '')
record['last_modification_date'] = h.get('last_modification_date', '')
record['norm_time'] = self.nessus.get_utc_from_local(int(record['last_modification_date']),
local_tz=self.nessus.tz_conv(record['timezone']))
record['last_modification_date'] = h.get('last_modification_date', '')
record['norm_time'] = self.nessus.get_utc_from_local(int(record['last_modification_date']),
local_tz=self.nessus.tz_conv(record['timezone']))
scan_records.append(record.copy())
except Exception as e:
print(e)
# Generates an error each time a NoneType is encountered.
# print(e)
pass
if completed:
@ -163,21 +245,6 @@ class vulnWhisperer(object):
return scan_records
def record_insert(self, record):
self.cur.execute("insert into scan_history({table_columns}) values (?,?,?,?,?,?,?,?,?)".format(
table_columns=', '.join(self.table_columns)), record)
def retrieve_uuids(self):
"""
Retrieves UUIDs from database and checks list to determine which files need to be processed.
:return:
"""
self.conn.text_factory = str
self.cur.execute('select uuid from scan_history')
results = frozenset([r[0] for r in self.cur.fetchall()])
return results
def whisper_nessus(self):
if self.nessus_connect:
scan_data = self.nessus.get_scans()
@ -185,12 +252,20 @@ class vulnWhisperer(object):
scans = scan_data['scans']
all_scans = self.scan_count(scans)
if self.uuids:
scan_list = [scan for scan in all_scans if scan['uuid'] not in self.uuids]
scan_list = [scan for scan in all_scans if scan['uuid'] not in self.uuids and scan['status'] == 'completed']
else:
scan_list = all_scans
self.vprint("{info} Identified {new} scans to be processed".format(info=bcolors.INFO, new=len(scan_list)))
self.vprint('{info} Identified {new} scans to be processed'.format(info=bcolors.INFO, new=len(scan_list)))
if not scan_list:
self.vprint('{info} No new scans to process. Exiting...'.format(info=bcolors.INFO))
exit(0)
# Create scan subfolders
for f in folders:
if not os.path.exists(self.path_check(f['name'])):
if f['name'] == 'Trash' and self.nessus_trash:
@ -200,26 +275,43 @@ class vulnWhisperer(object):
else:
self.vprint('{info} Directory already exists for {scan} - Skipping creation'.format(
scan=self.path_check(f['name']), info=bcolors.INFO))
# try to download and save scans into the folders they belong to
scan_count = 0
# TODO Rewrite this part to go through the scans that have already been processed
for s in scan_list:
scan_count += 1
#self.vprint('%s/%s' % (scan_count, len(scan_list)))
scan_name, scan_id, history_id,\
norm_time, status, uuid = s['scan_name'], s['scan_id'], s['history_id'],\
s['norm_time'], s['status'], s['uuid']
(
scan_name,
scan_id,
history_id,
norm_time,
status,
uuid,
) = (
s['scan_name'],
s['scan_id'],
s['history_id'],
s['norm_time'],
s['status'],
s['uuid'],
)
# TODO Create directory sync function which scans the directory for files that exist already and populates the database
folder_id = s['folder_id']
scan_history = self.nessus.get_scan_history(scan_id)
folder_name = next(f['name'] for f in folders if f['id'] == folder_id)
if status == 'completed':
file_name = '%s_%s_%s_%s.%s' % (scan_name, scan_id, history_id, norm_time, 'csv')
repls = ('\\', '_'), ('/', '_'), ('/', '_'), (' ', '_')
file_name = '%s_%s_%s_%s.%s' % (scan_name, scan_id,
history_id, norm_time, 'csv')
repls = (('\\', '_'), ('/', '_'), ('/', '_'), (' ', '_'))
file_name = reduce(lambda a, kv: a.replace(*kv), repls, file_name)
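# e.g. 'My Scan/Prod 1' -> 'My_Scan_Prod_1', so the scan name is safe to use as a filename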
relative_path_name = self.path_check(folder_name + '/' + file_name)
@ -227,40 +319,211 @@ class vulnWhisperer(object):
if self.develop:
csv_in = pd.read_csv(relative_path_name)
record_meta = (
scan_name, scan_id, norm_time, file_name, time.time(), csv_in.shape[0], 'nessus', uuid, 1)
scan_name,
scan_id,
norm_time,
file_name,
time.time(),
csv_in.shape[0],
'nessus',
uuid,
1,
)
self.record_insert(record_meta)
self.vprint("{info} File {filename} already exists! Updating database".format(info=bcolors.INFO, filename=relative_path_name))
self.conn.commit()
self.vprint('{info} File {filename} already exists! Updating database'.format(info=bcolors.INFO, filename=relative_path_name))
else:
file_req = self.nessus.download_scan(scan_id=scan_id, history=history_id, export_format='csv')
clean_csv = pd.read_csv(io.StringIO(file_req.decode('utf-8')))
file_req = self.nessus.download_scan(scan_id=scan_id, history=history_id, export_format='csv')
clean_csv = pd.read_csv(io.StringIO(file_req.decode('utf-8')))
if len(clean_csv) > 2:
self.vprint("Processing %s/%s for scan: %s" % (scan_count, len(scan_history), scan_name))
clean_csv['CVSS'] = clean_csv['CVSS'].astype(str).apply(self.cleanser)
clean_csv['CVE'] = clean_csv['CVE'].astype(str).apply(self.cleanser)
clean_csv['Description'] = clean_csv['Description'].astype(str).apply(self.cleanser)
clean_csv['Synopsis'] = clean_csv['Description'].astype(str).apply(self.cleanser)
clean_csv['Solution'] = clean_csv['Solution'].astype(str).apply(self.cleanser)
clean_csv['See Also'] = clean_csv['See Also'].astype(str).apply(self.cleanser)
clean_csv['Plugin Output'] = clean_csv['Plugin Output'].astype(str).apply(self.cleanser)
clean_csv.to_csv(relative_path_name, index=False)
self.vprint('Processing %s/%s for scan: %s' % (scan_count, len(scan_list), scan_name))
columns_to_cleanse = ['CVSS', 'CVE', 'Description', 'Synopsis', 'Solution', 'See Also', 'Plugin Output']
for col in columns_to_cleanse:
clean_csv[col] = clean_csv[col].astype(str).apply(self.cleanser)
clean_csv.to_csv(relative_path_name, index=False)
record_meta = (
scan_name, scan_id, norm_time, file_name, time.time(), clean_csv.shape[0], 'nessus', uuid,
1)
scan_name,
scan_id,
norm_time,
file_name,
time.time(),
clean_csv.shape[0],
'nessus',
uuid,
1,
)
self.record_insert(record_meta)
self.vprint("{info} {filename} records written to {path} ".format(info=bcolors.INFO, filename=clean_csv.shape[0], path=file_name))
self.conn.commit()
self.vprint('{info} {filename} records written to {path} '.format(info=bcolors.INFO, filename=clean_csv.shape[0], path=file_name))
else:
record_meta = (
scan_name, scan_id, norm_time, file_name, time.time(), clean_csv.shape[0], 'nessus', uuid,
1)
scan_name,
scan_id,
norm_time,
file_name,
time.time(),
clean_csv.shape[0],
'nessus',
uuid,
1,
)
self.record_insert(record_meta)
self.vprint(file_name + ' has no host available... Updating database and skipping!')
self.conn.commit()
self.vprint(file_name + ' has no host available... Updating database and skipping!')
self.conn.close()
"{success} Scan aggregation complete! Connection to database closed.".format(success=bcolors.SUCCESS)
'{success} Scan aggregation complete! Connection to database closed.'.format(success=bcolors.SUCCESS)
else:
self.vprint('{fail} Failed to use scanner at {host}'.format(fail=bcolors.FAIL, host=self.nessus_hostname+':'+self.nessus_port))
self.vprint('{fail} Failed to use scanner at {host}'.format(fail=bcolors.FAIL, host=self.hostname + ':' + self.nessus_port))
class vulnWhispererQualys(vulnWhispererBase):
CONFIG_SECTION = 'qualys'
def __init__(
self,
config=None,
db_name='report_tracker.db',
purge=False,
verbose=None,
debug=False,
username=None,
password=None,
):
super(vulnWhispererQualys, self).__init__(config=config)
self.qualys_scan = qualysScanReport(config=config)
self.latest_scans = self.qualys_scan.qw.get_all_scans()
self.directory_check()
def directory_check(self):
if not os.path.exists(self.write_path):
os.makedirs(self.write_path)
self.vprint('{info} Directory created at {scan}'.format(
scan=self.write_path, info=bcolors.INFO))
else:
self.vprint('{info} Directory already exists for {scan} - Skipping creation'.format(
scan=self.write_path, info=bcolors.INFO))
def whisper_reports(self, report_id, updated_date, cleanup=True):
"""
report_id: App ID
updated_date: Last time the scan was run for app_id
"""
vuln_ready = None
try:
if 'Z' in updated_date:
updated_date = self.qualys_scan.utils.iso_to_epoch(updated_date)
report_name = 'qualys_web_' + str(report_id) + '_{last_updated}'.format(last_updated=updated_date) + '.csv'
"""
record_meta = (
scan_name,
app_id,
norm_time,
report_name,
time.time(),
clean_csv.shape[0],
'qualys',
uuid,
1,
)
"""
#self.record_insert(record_meta)
if os.path.isfile(self.path_check(report_name)):
print('{action} - File already exists! Skipping...'.format(action=bcolors.ACTION))
else:
print('{action} - Generating report for %s'.format(action=bcolors.ACTION) % report_id)
status = self.qualys_scan.qw.create_report(report_id)
root = objectify.fromstring(status)
if root.responseCode == 'SUCCESS':
print('{info} - Successfully generated report for webapp: %s'.format(info=bcolors.INFO) % report_id)
generated_report_id = root.data.Report.id
print('{info} - New Report ID: %s'.format(info=bcolors.INFO) % generated_report_id)
vuln_ready = self.qualys_scan.process_data(path=self.write_path, file_id=generated_report_id)
vuln_ready.to_csv(self.path_check(report_name), index=False, header=True)  # add when timestamp occurred
print('{success} - Report written to %s'.format(success=bcolors.SUCCESS) % report_name)
if cleanup:
print('{action} - Removing report %s'.format(action=bcolors.ACTION) % generated_report_id)
cleaning_up = self.qualys_scan.qw.delete_report(generated_report_id)
os.remove(self.path_check(str(generated_report_id) + '.csv'))
print('{action} - Deleted report: %s'.format(action=bcolors.ACTION) % generated_report_id)
else:
print('{error} Could not process report ID: %s'.format(error=bcolors.FAIL) % status)
except Exception as e:
print('{error} - Could not process %s - %s'.format(error=bcolors.FAIL) % (report_id, e))
return vuln_ready
def process_web_assets(self):
counter = 0
for app in self.latest_scans.iterrows():
counter += 1
print('Processing %s/%s' % (counter, len(self.latest_scans)))
self.whisper_reports(app[1]['id'], app[1]['launchedDate'])
class vulnWhisperer(object):
def __init__(self,
profile=None,
verbose=None,
username=None,
password=None,
config=None):
self.profile = profile
self.config = config
self.username = username
self.password = password
self.verbose = verbose
def whisper_vulnerabilities(self):
if self.profile == 'nessus':
vw = vulnWhispererNessus(config=self.config,
username=self.username,
password=self.password,
verbose=self.verbose)
vw.whisper_nessus()
elif self.profile == 'qualys':
vw = vulnWhispererQualys(config=self.config)
vw.process_web_assets()
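# Illustrative usage of the dispatcher (profile and config path are examples):
#   vw = vulnWhisperer(profile='nessus', config='configs/example.ini', verbose=True)
#   vw.whisper_vulnerabilities()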