28 Commits

Author SHA1 Message Date
1a3165cfc6 Bump lxml from 4.6.5 to 4.9.1
Bumps [lxml](https://github.com/lxml/lxml) from 4.6.5 to 4.9.1.
- [Release notes](https://github.com/lxml/lxml/releases)
- [Changelog](https://github.com/lxml/lxml/blob/master/CHANGES.txt)
- [Commits](https://github.com/lxml/lxml/compare/lxml-4.6.5...lxml-4.9.1)

---
updated-dependencies:
- dependency-name: lxml
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-07-06 21:50:09 +00:00
691f45a1dc Merge pull request #232 from HASecuritySolutions/dependabot/pip/lxml-4.6.5
Bump lxml from 4.1.1 to 4.6.5
2022-06-11 20:39:14 -05:00
80197454a3 Update README.md 2022-02-03 10:33:12 -06:00
841cd09f2d Bump lxml from 4.1.1 to 4.6.5
Bumps [lxml](https://github.com/lxml/lxml) from 4.1.1 to 4.6.5.
- [Release notes](https://github.com/lxml/lxml/releases)
- [Changelog](https://github.com/lxml/lxml/blob/master/CHANGES.txt)
- [Commits](https://github.com/lxml/lxml/compare/lxml-4.1.1...lxml-4.6.5)

---
updated-dependencies:
- dependency-name: lxml
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2021-12-13 19:44:12 +00:00
e7183864d0 Merge pull request #216 from Yashvendra/patch-1
Updated 3000_openvas.conf
2020-07-20 10:45:00 +02:00
12ac3dbf62 Merge pull request #217 from andrew-bailey/patch-1
Update README.md
2020-07-20 10:43:09 +02:00
e41ec93058 Update README.md
Fix license badge from MIT to Apache 2.0 which is the current license applied in Github
2020-07-20 11:57:36 +09:30
8a86e3142a Update 3000_openvas.conf
Fixed Description
2020-07-19 14:41:21 +05:30
9d003d12b4 improved error logging and excepcions 2020-04-08 12:01:47 +02:00
63c638751b Merge pull request #207 from spasaintk/patch-1
Update vulnwhisp.py
2020-02-29 20:05:51 +01:00
a3e85b7207 Update vulnwhisp.py
Code triggers a crash:
ERROR:root:main:local variable 'vw' referenced before assignment
ERROR: local variable 'vw' referenced before assignment

Proposed fix deals with the issue.
After fix:
INFO:vulnWhispererOpenVAS:process_openvas_scans:Processing complete
2020-02-28 00:33:38 +01:00
4974be02b4 fix of fix... 2020-02-21 16:17:00 +01:00
7fe2f9a5c1 casting port from jira local download to an int 2020-02-21 16:09:25 +01:00
f4634d03bd Merge pull request #206 from HASecuritySolutions/jira_ticket_download_attachment_data
Jira ticket download attachment data
2020-02-21 15:58:05 +01:00
e1ca9fadcd fixed issue where when actioning all actions, if one failed it exited the program 2020-02-21 15:50:14 +01:00
adb7700300 added on Jira local download an extra field with affected assets in json format for further processing in Splunk/ELK 2020-02-21 11:00:07 +01:00
ced0d4c2fc Hotfix #190 2020-02-04 16:47:37 +01:00
f483c76638 latest qualysapi version that supports python 2 is 6.0.0 2020-01-13 11:34:21 +01:00
f65116aec8 fix requirements issue, new version of qualysapi to be reviewed 2020-01-13 11:03:04 +01:00
bdcb6de4b2 Target CentOS 7 (issue #199) (#200) 2019-12-03 16:21:48 +01:00
af8e27d075 Bump requests from 2.18.3 to 2.20.0 (#196)
Bumps [requests](https://github.com/requests/requests) from 2.18.3 to 2.20.0.
- [Release notes](https://github.com/requests/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/master/HISTORY.md)
- [Commits](https://github.com/requests/requests/compare/v2.18.3...v2.20.0)

Signed-off-by: dependabot[bot] <support@github.com>
2019-12-03 16:20:36 +01:00
accf926ff7 fixed ELK7 logstash compatibility, #187 2019-09-16 15:35:34 +02:00
acf387bd0e added ELK versions supported (6 and 7) 2019-08-24 15:06:33 +02:00
ab7a91e020 Update frameworks_example.ini (#186) 2019-08-10 05:32:19 +02:00
a1a0d6b757 Merge pull request #182 from HASecuritySolutions/save_assets_no_DNS_record
[JIRA] added local file save with assets not resolving hostname
2019-06-18 12:05:49 +02:00
2fb089805c [JIRA] added local file save with assets not resolving hostname 2019-06-18 10:53:55 +02:00
6cf2a94431 Support tenable API keys (#176)
* support tenable API keys

* more flexible config support

* add nessus API key support

* fix whitespace
2019-05-02 10:26:51 +02:00
162636e60f Fix newlines in MAC Address field output (#178)
* fix newlines in all MAC Address field

* remove newline

* only cleanse if col exists
2019-05-02 08:58:18 +02:00
14 changed files with 555 additions and 163 deletions

View File

@@ -1,4 +1,4 @@
-FROM centos:latest
+FROM centos:7
 MAINTAINER Justin Henderson justin@hasecuritysolutions.com

View File

@@ -6,8 +6,10 @@
 VulnWhisperer is a vulnerability management tool and report aggregator. VulnWhisperer will pull all the reports from the different Vulnerability scanners and create a file with a unique filename for each one, using that data later to sync with Jira and feed Logstash. Jira does a closed cycle full Sync with the data provided by the Scanners, while Logstash indexes and tags all of the information inside the report (see logstash files at /resources/elk6/pipeline/). Data is then shipped to ElasticSearch to be indexed, and ends up in a visual and searchable format in Kibana with already defined dashboards.
+VulnWhisperer is an open-source community funded project. VulnWhisperer currently works but is due for a documentation overhaul and code review. This is on the roadmap for the next month or two (February or March of 2022 - hopefully). Please note, crowd funding is an option. If you would like help getting VulnWhisperer up and running, are interested in new features, or are looking for paid support (for those of you that require commercial support contracts to implement open-source solutions), please reach out to **info@hasecuritysolutions.com**.
 [![Build Status](https://travis-ci.org/HASecuritySolutions/VulnWhisperer.svg?branch=master)](https://travis-ci.org/HASecuritySolutions/VulnWhisperer)
-[![MIT License](https://img.shields.io/badge/license-MIT-blue.svg?style=flat)](http://choosealicense.com/licenses/mit/)
+[![GitHub license](https://img.shields.io/github/license/HASecuritySolutions/VulnWhisperer)](https://github.com/HASecuritySolutions/VulnWhisperer/blob/master/LICENSE)
 [![Twitter](https://img.shields.io/twitter/follow/VulnWhisperer.svg?style=social&label=Follow)](https://twitter.com/VulnWhisperer)
 Currently Supports
@@ -30,7 +32,8 @@ Currently Supports
 ### Reporting Frameworks
-- [X] [ELK](https://www.elastic.co/elk-stack)
+- [X] [Elastic Stack (**v6**/**v7**)](https://www.elastic.co/elk-stack)
+- [ ] [OpenSearch - Being considered for next update](https://opensearch.org/)
 - [X] [Jira](https://www.atlassian.com/software/jira)
 - [ ] [Splunk](https://www.splunk.com/)

View File

@@ -83,6 +83,7 @@ def main():
         enabled_sections = config.get_sections_with_attribute('enabled')
         for section in enabled_sections:
+            try:
                 vw = vulnWhisperer(config=args.config,
                                    profile=section,
                                    verbose=args.verbose,
@@ -91,6 +92,8 @@ def main():
                                    source=args.source,
                                    scanname=args.scanname)
                 exit_code += vw.whisper_vulnerabilities()
+            except Exception as e:
+                logger.error("VulnWhisperer was unable to perform the processing on '{}'".format(args.source))
     else:
         logger.info('Running vulnwhisperer for section {}'.format(args.section))
         vw = vulnWhisperer(config=args.config,
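The hunk above wraps each enabled section in a try/except so that one failing scanner no longer aborts the whole run. A minimal standalone sketch of that pattern (the `process` function, section names, and the simulated failure are hypothetical stand-ins for `vulnWhisperer(...).whisper_vulnerabilities()`):

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger('vulnwhisperer')

def process(section):
    # Hypothetical stand-in for vulnWhisperer(...).whisper_vulnerabilities();
    # one section is made to fail to show the loop keeps going.
    if section == 'qualys_vuln':
        raise RuntimeError('connection refused')
    return 0

exit_code = 0
for section in ['nessus', 'qualys_vuln', 'openvas']:
    try:
        exit_code += process(section)
    except Exception:
        # Log and continue instead of letting one section kill the loop
        logger.error("unable to process section '%s'", section)
        exit_code += 1

print(exit_code)  # → 1 (only the failing section contributes)
```

The remaining sections still run, and the accumulated exit code reports how many failed.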

View File

@@ -2,6 +2,8 @@
 enabled=true
 hostname=localhost
 port=8834
+access_key=
+secret_key=
 username=nessus_username
 password=nessus_password
 write_path=/opt/VulnWhisperer/data/nessus/
@@ -13,6 +15,8 @@ verbose=true
 enabled=true
 hostname=cloud.tenable.com
 port=443
+access_key=
+secret_key=
 username=tenable.io_username
 password=tenable.io_password
 write_path=/opt/VulnWhisperer/data/tenable/
@@ -37,7 +41,7 @@ max_retries = 10
 template_id = 126024
 [qualys_vuln]
-#Reference https://www.qualys.com/docs/qualys-was-api-user-guide.pdf to find your API
+#Reference https://www.qualys.com/docs/qualys-api-vmpc-user-guide.pdf to find your API
 enabled = true
 hostname = qualysapi.qg2.apps.qualys.com
 username = exampleuser
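The new `access_key=`/`secret_key=` options ship empty, and the code paths in this PR decide between credentials and API keys with `all((...))`, which treats empty strings as missing. A small sketch of how such a section reads back (the stdlib module is `configparser` on Python 3; the Python 2 code the repo targets would use `ConfigParser`):

```python
import configparser

# Inline copy of the tenable section from the example config above
ini = u"""
[tenable]
enabled=true
hostname=cloud.tenable.com
port=443
access_key=
secret_key=
username=tenable.io_username
password=tenable.io_password
"""

config = configparser.ConfigParser()
config.read_string(ini)

access_key = config.get('tenable', 'access_key')
secret_key = config.get('tenable', 'secret_key')
username = config.get('tenable', 'username')
password = config.get('tenable', 'password')

# Empty INI values come back as '', which is falsy, so all() correctly
# reports the API keys as absent and the username/password as present.
use_api_keys = all((access_key, secret_key))
use_password = all((username, password))
print(use_api_keys, use_password)  # → False True
```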

View File

@@ -2,6 +2,8 @@
 enabled=true
 hostname=nessus
 port=443
+access_key=
+secret_key=
 username=nessus_username
 password=nessus_password
 write_path=/opt/VulnWhisperer/data/nessus/
@@ -13,6 +15,8 @@ verbose=true
 enabled=true
 hostname=tenable
 port=443
+access_key=
+secret_key=
 username=tenable.io_username
 password=tenable.io_password
 write_path=/opt/VulnWhisperer/data/tenable/

View File

@@ -1,12 +1,12 @@
 pandas==0.20.3
 setuptools==40.4.3
 pytz==2017.2
-Requests==2.18.3
-lxml==4.1.1
+Requests==2.20.0
+lxml==4.9.1
 future-fstrings
 bs4
 jira
 bottle
 coloredlogs
-qualysapi>=5.1.0
+qualysapi==6.0.0
 httpretty

View File

@@ -2,7 +2,7 @@
 # Email: austin@hasecuritysolutions.com
 # Last Update: 03/04/2018
 # Version 0.3
-# Description: Take in qualys web scan reports from vulnWhisperer and pumps into logstash
+# Description: Take in Openvas web scan reports from vulnWhisperer and pumps into logstash
 input {
   file {

View File

@@ -0,0 +1,231 @@
{
"index_patterns": "logstash-vulnwhisperer-*",
"mappings": {
"properties": {
"@timestamp": {
"type": "date"
},
"@version": {
"type": "keyword"
},
"asset": {
"type": "text",
"norms": false,
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"asset_uuid": {
"type": "keyword"
},
"assign_ip": {
"type": "ip"
},
"category": {
"type": "keyword"
},
"cve": {
"type": "keyword"
},
"cvss_base": {
"type": "float"
},
"cvss_temporal_vector": {
"type": "keyword"
},
"cvss_temporal": {
"type": "float"
},
"cvss_vector": {
"type": "keyword"
},
"cvss": {
"type": "float"
},
"cvss3_base": {
"type": "float"
},
"cvss3_temporal_vector": {
"type": "keyword"
},
"cvss3_temporal": {
"type": "float"
},
"cvss3_vector": {
"type": "keyword"
},
"cvss3": {
"type": "float"
},
"description": {
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
},
"norms": false,
"type": "text"
},
"dns": {
"type": "keyword"
},
"exploitability": {
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
},
"norms": false,
"type": "text"
},
"fqdn": {
"type": "keyword"
},
"geoip": {
"dynamic": true,
"type": "object",
"properties": {
"ip": {
"type": "ip"
},
"latitude": {
"type": "float"
},
"location": {
"type": "geo_point"
},
"longitude": {
"type": "float"
}
}
},
"history_id": {
"type": "keyword"
},
"host": {
"type": "keyword"
},
"host_end": {
"type": "date"
},
"host_start": {
"type": "date"
},
"impact": {
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
},
"norms": false,
"type": "text"
},
"ip_status": {
"type": "keyword"
},
"ip": {
"type": "ip"
},
"last_updated": {
"type": "date"
},
"operating_system": {
"type": "keyword"
},
"path": {
"type": "keyword"
},
"pci_vuln": {
"type": "keyword"
},
"plugin_family": {
"type": "keyword"
},
"plugin_id": {
"type": "keyword"
},
"plugin_name": {
"type": "keyword"
},
"plugin_output": {
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
},
"norms": false,
"type": "text"
},
"port": {
"type": "integer"
},
"protocol": {
"type": "keyword"
},
"results": {
"type": "text"
},
"risk_number": {
"type": "integer"
},
"risk_score_name": {
"type": "keyword"
},
"risk_score": {
"type": "float"
},
"risk": {
"type": "keyword"
},
"scan_id": {
"type": "keyword"
},
"scan_name": {
"type": "keyword"
},
"scan_reference": {
"type": "keyword"
},
"see_also": {
"type": "keyword"
},
"solution": {
"type": "keyword"
},
"source": {
"type": "keyword"
},
"ssl": {
"type": "keyword"
},
"synopsis": {
"type": "keyword"
},
"system_type": {
"type": "keyword"
},
"tags": {
"type": "keyword"
},
"threat": {
"type": "text"
},
"type": {
"type": "keyword"
},
"vendor_reference": {
"type": "keyword"
},
"vulnerability_state": {
"type": "keyword"
}
}
}
}
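The new file above is an Elasticsearch index template in the legacy (ES 6/7) shape: top-level `index_patterns` plus typeless `mappings`, installed via `PUT /_template/<name>`. A small sketch that parses a fragment of it and checks a couple of the field mappings (the fragment is excerpted from the template above; the full file's path in the repo is not shown here):

```python
import json

# Excerpt of the logstash-vulnwhisperer template shown above
template = json.loads("""
{
  "index_patterns": "logstash-vulnwhisperer-*",
  "mappings": {
    "properties": {
      "ip": {"type": "ip"},
      "port": {"type": "integer"},
      "risk_score": {"type": "float"},
      "plugin_name": {"type": "keyword"}
    }
  }
}
""")

props = template['mappings']['properties']
# ports index as integers, risk scores as floats, names as exact-match keywords
print(props['port']['type'])        # → integer
print(props['risk_score']['type'])  # → float
```

Loading it into a cluster would be a single `PUT` of this JSON body to `/_template/logstash-vulnwhisperer` (template name assumed) on an ES 6/7 node.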

View File

@@ -24,15 +24,19 @@ class NessusAPI(object):
     EXPORT_STATUS = EXPORT + '/{file_id}/status'
     EXPORT_HISTORY = EXPORT + '?history_id={history_id}'

-    def __init__(self, hostname=None, port=None, username=None, password=None, verbose=True):
+    def __init__(self, hostname=None, port=None, username=None, password=None, verbose=True, profile=None, access_key=None, secret_key=None):
         self.logger = logging.getLogger('NessusAPI')
         if verbose:
             self.logger.setLevel(logging.DEBUG)
-        if username is None or password is None:
-            raise Exception('ERROR: Missing username or password.')
+        if not all((username, password)) and not all((access_key, secret_key)):
+            raise Exception('ERROR: Missing username, password or API keys.')
+        self.profile = profile
         self.user = username
         self.password = password
+        self.api_keys = False
+        self.access_key = access_key
+        self.secret_key = secret_key
         self.base = 'https://{hostname}:{port}'.format(hostname=hostname, port=port)
         self.verbose = verbose
@@ -52,7 +56,13 @@ class NessusAPI(object):
             'X-Cookie': None
         }
+        if all((self.access_key, self.secret_key)):
+            self.logger.debug('Using {} API keys'.format(self.profile))
+            self.api_keys = True
+            self.session.headers['X-ApiKeys'] = 'accessKey={}; secretKey={}'.format(self.access_key, self.secret_key)
+        else:
-        self.login()
+            self.login()
         self.scans = self.get_scans()
         self.scan_ids = self.get_scan_ids()
@@ -78,8 +88,10 @@ class NessusAPI(object):
                 if url == self.base + self.SESSION:
                     break
                 try:
-                    self.login()
                     timeout += 1
+                    if self.api_keys:
+                        continue
+                    self.login()
                     self.logger.info('Token refreshed')
                 except Exception as e:
                     self.logger.error('Could not refresh token\nReason: {}'.format(str(e)))
@@ -114,7 +126,7 @@ class NessusAPI(object):
         data = self.request(self.SCAN_ID.format(scan_id=scan_id), method='GET', json_output=True)
         return data['history']

-    def download_scan(self, scan_id=None, history=None, export_format="", profile=""):
+    def download_scan(self, scan_id=None, history=None, export_format=""):
         running = True
         counter = 0
@@ -127,6 +139,7 @@ class NessusAPI(object):
         req = self.request(query, data=json.dumps(data), method='POST', json_output=True)
         try:
             file_id = req['file']
+            if self.profile == 'nessus':
-            token_id = req['token'] if 'token' in req else req['temp_token']
+                token_id = req['token'] if 'token' in req else req['temp_token']
         except Exception as e:
             self.logger.error('{}'.format(str(e)))
@@ -143,7 +156,7 @@ class NessusAPI(object):
             if counter % 60 == 0:
                 self.logger.info("Completed: {}".format(counter))
         self.logger.info("Done: {}".format(counter))
-        if profile == 'tenable':
+        if self.profile == 'tenable' or self.api_keys:
             content = self.request(self.EXPORT_FILE_DOWNLOAD.format(scan_id=scan_id, file_id=file_id), method='GET', download=True)
         else:
             content = self.request(self.EXPORT_TOKEN_DOWNLOAD.format(token_id=token_id), method='GET', download=True)
@@ -152,7 +165,7 @@ class NessusAPI(object):
     def get_utc_from_local(self, date_time, local_tz=None, epoch=True):
         date_time = datetime.fromtimestamp(date_time)
         if local_tz is None:
-            local_tz = pytz.timezone('US/Central')
+            local_tz = pytz.timezone('UTC')
         else:
             local_tz = pytz.timezone(local_tz)
         local_time = local_tz.normalize(local_tz.localize(date_time))
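The API-key path above skips the session login entirely and instead attaches a static `X-ApiKeys` header to every request, which is why no token refresh is needed. A self-contained sketch of that header-selection logic using a plain dict (the key values are placeholders, and the `X-Cookie` token would normally come from a `POST /session` login):

```python
access_key = 'aaaa'  # placeholder value
secret_key = 'bbbb'  # placeholder value
username = None
password = None

headers = {'Content-type': 'application/json', 'X-Cookie': None}

if all((access_key, secret_key)):
    # Static header auth: no /session login and no token to refresh
    headers['X-ApiKeys'] = 'accessKey={}; secretKey={}'.format(access_key, secret_key)
elif all((username, password)):
    # Session auth: the token would be obtained from POST /session
    headers['X-Cookie'] = 'token=<session token>'
else:
    raise Exception('ERROR: Missing username, password or API keys.')

print(headers['X-ApiKeys'])  # → accessKey=aaaa; secretKey=bbbb
```

Because the header never expires the way a session token does, the retry loop in `request()` can simply `continue` when `api_keys` is set instead of re-logging in.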

View File

@@ -68,6 +68,7 @@ class JiraAPI(object):
             self.logger.error("Error creating Ticket: component {} not found".format(component))
             return 0

+        try:
             new_issue = self.jira.create_issue(project=project,
                                                summary=title,
                                                description=desc,
@@ -80,6 +81,10 @@ class JiraAPI(object):
             if attachment_contents:
                 self.add_content_as_attachment(new_issue, attachment_contents)
+        except Exception as e:
+            self.logger.error("Failed to create ticket on Jira Project '{}'. Error: {}".format(project, e))
+            new_issue = False
         return new_issue

     #Basic JIRA Metrics
@@ -226,32 +231,44 @@ class JiraAPI(object):
     def ticket_get_unique_fields(self, ticket):
         title = ticket.raw.get('fields', {}).get('summary').encode("ascii").strip()
         ticketid = ticket.key.encode("ascii")
-        assets = []
-        try:
-            affected_assets_section = ticket.raw.get('fields', {}).get('description').encode("ascii").split("{panel:title=Affected Assets}")[1].split("{panel}")[0]
-            assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets_section)))
-        except Exception as e:
-            self.logger.error("Ticket IPs regex failed. Ticket ID: {}. Reason: {}".format(ticketid, e))
-            assets = []
-        try:
+        assets = self.get_assets_from_description(ticket)
         if not assets:
             #check if attachment, if so, get assets from attachment
-            affected_assets_section = self.check_ips_attachment(ticket)
-            if affected_assets_section:
-                assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets_section)))
-        except Exception as e:
-            self.logger.error("Ticket IPs Attachment regex failed. Ticket ID: {}. Reason: {}".format(ticketid, e))
+            assets = self.get_assets_from_attachment(ticket)
         return ticketid, title, assets

-    def check_ips_attachment(self, ticket):
-        affected_assets_section = []
+    def get_assets_from_description(self, ticket, _raw = False):
+        # Get the assets as a string "host - protocol/port - hostname" separated by "\n"
+        # structure the text to have the same structure as the assets from the attachment
+        affected_assets = ""
+        try:
+            affected_assets = ticket.raw.get('fields', {}).get('description').encode("ascii").split("{panel:title=Affected Assets}")[1].split("{panel}")[0].replace('\n','').replace(' * ','\n').replace('\n', '', 1)
+        except Exception as e:
+            self.logger.error("Unable to process the Ticket's 'Affected Assets'. Ticket ID: {}. Reason: {}".format(ticket, e))
+        if affected_assets:
+            if _raw:
+                # from line 406 check if the text in the panel corresponds to having added an attachment
+                if "added as an attachment" in affected_assets:
+                    return False
+                return affected_assets
+            try:
+                # if _raw is not true, we return only the IPs of the affected assets
+                return list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets)))
+            except Exception as e:
+                self.logger.error("Ticket IPs regex failed. Ticket ID: {}. Reason: {}".format(ticket, e))
+        return False
+
+    def get_assets_from_attachment(self, ticket, _raw = False):
+        # Get the assets as a string "host - protocol/port - hostname" separated by "\n"
+        affected_assets = []
         try:
             fields = self.jira.issue(ticket.key).raw.get('fields', {})
             attachments = fields.get('attachment', {})
-            affected_assets_section = ""
+            affected_assets = ""
             #we will make sure we get the latest version of the file
             latest = ''
             attachment_id = ''
@@ -265,12 +282,44 @@ class JiraAPI(object):
                 if latest < item.get('created'):
                     latest = item.get('created')
                     attachment_id = item.get('id')
-            affected_assets_section = self.jira.attachment(attachment_id).get()
+            affected_assets = self.jira.attachment(attachment_id).get()
         except Exception as e:
             self.logger.error("Failed to get assets from ticket attachment. Ticket ID: {}. Reason: {}".format(ticket, e))
-        return affected_assets_section
+        if affected_assets:
+            if _raw:
+                return affected_assets
+            try:
+                # if _raw is not true, we return only the IPs of the affected assets
+                affected_assets = list(set(re.findall(r"\b\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\b", affected_assets)))
+                return affected_assets
+            except Exception as e:
+                self.logger.error("Ticket IPs Attachment regex failed. Ticket ID: {}. Reason: {}".format(ticket, e))
+        return False
+
+    def parse_asset_to_json(self, asset):
+        hostname, protocol, port = "", "", ""
+        asset_info = asset.split(" - ")
+        ip = asset_info[0]
+        proto_port = asset_info[1]
+        # in case there is some case where hostname is not reported at all
+        if len(asset_info) == 3:
+            hostname = asset_info[2]
+        if proto_port != "N/A/N/A":
+            protocol, port = proto_port.split("/")
+            port = int(float(port))
+        asset_dict = {
+            "host": ip,
+            "protocol": protocol,
+            "port": port,
+            "hostname": hostname
+        }
+        return asset_dict

     def clean_old_attachments(self, ticket):
         fields = ticket.raw.get('fields')
@@ -441,7 +490,7 @@ class JiraAPI(object):
             if transition.get('name') == self.JIRA_REOPEN_ISSUE:
                 self.logger.debug("Ticket is reopenable")
                 return True
-        self.logger.warn("Ticket can't be opened. Check Jira transitions.")
+        self.logger.error("Ticket {} can't be opened. Check Jira transitions.".format(ticket_obj))
         return False

     def is_ticket_closeable(self, ticket_obj):
@@ -449,7 +498,7 @@ class JiraAPI(object):
         for transition in transitions:
             if transition.get('name') == self.JIRA_CLOSE_ISSUE:
                 return True
-        self.logger.warn("Ticket can't closed. Check Jira transitions.")
+        self.logger.error("Ticket {} can't closed. Check Jira transitions.".format(ticket_obj))
         return False

     def is_ticket_resolved(self, ticket_obj):
@@ -522,7 +571,7 @@ class JiraAPI(object):
     def close_obsolete_tickets(self):
         # Close tickets older than 12 months, vulnerabilities not solved will get created a new ticket
         self.logger.info("Closing obsolete tickets older than {} months".format(self.max_time_tracking))
-        jql = "labels=vulnerability_management AND created <startOfMonth(-{}) and resolution=Unresolved".format(self.max_time_tracking)
+        jql = "labels=vulnerability_management AND NOT labels=advisory AND created <startOfMonth(-{}) and resolution=Unresolved".format(self.max_time_tracking)
         tickets_to_close = self.jira.search_issues(jql, maxResults=0)
         comment = '''This ticket is being closed for hygiene, as it is more than {} months old.
@@ -553,9 +602,36 @@ class JiraAPI(object):
             return True
         try:
             self.logger.info("Saving locally tickets from the last {} months".format(self.max_time_tracking))
-            jql = "labels=vulnerability_management AND created >=startOfMonth(-{})".format(self.max_time_tracking)
+            jql = "labels=vulnerability_management AND NOT labels=advisory AND created >=startOfMonth(-{})".format(self.max_time_tracking)
             tickets_data = self.jira.search_issues(jql, maxResults=0)
-            #TODO process tickets, creating a new field called "_metadata" with all the affected assets well structured
-            # for future processing in ELK/Splunk; this includes downloading attachments with assets and processing them
+            processed_tickets = []
+            for ticket in tickets_data:
+                assets = self.get_assets_from_description(ticket, _raw=True)
+                if not assets:
+                    # check if attachment, if so, get assets from attachment
+                    assets = self.get_assets_from_attachment(ticket, _raw=True)
+                # process the affected assets to save them as json structure on a new field from the JSON
+                _metadata = {"affected_hosts": []}
+                if assets:
+                    if "\n" in assets:
+                        for asset in assets.split("\n"):
+                            assets_json = self.parse_asset_to_json(asset)
+                            _metadata["affected_hosts"].append(assets_json)
+                    else:
+                        assets_json = self.parse_asset_to_json(assets)
+                        _metadata["affected_hosts"].append(assets_json)
+                temp_ticket = ticket.raw.get('fields')
+                temp_ticket['_metadata'] = _metadata
+                processed_tickets.append(temp_ticket)
             #end of line needed, as writelines() doesn't add it automatically, otherwise one big line
             to_save = [json.dumps(ticket.raw.get('fields'))+"\n" for ticket in tickets_data]
             with open(fname, 'w') as outfile:
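The new `parse_asset_to_json` helper above expects each asset as a `host - protocol/port - hostname` string, with `N/A/N/A` marking a missing protocol/port and the hostname segment sometimes absent. A self-contained copy of the same parsing logic with invented sample inputs:

```python
def parse_asset_to_json(asset):
    # Input shape: "host - protocol/port - hostname"; the hostname segment
    # may be missing, and "N/A/N/A" means no protocol/port was reported.
    hostname, protocol, port = "", "", ""
    asset_info = asset.split(" - ")
    ip = asset_info[0]
    proto_port = asset_info[1]
    # in case the hostname is not reported at all
    if len(asset_info) == 3:
        hostname = asset_info[2]
    if proto_port != "N/A/N/A":
        protocol, port = proto_port.split("/")
        port = int(float(port))  # tolerate "443.0"-style values
    return {"host": ip, "protocol": protocol, "port": port, "hostname": hostname}

print(parse_asset_to_json("10.0.0.5 - tcp/443 - web01.example.com"))
print(parse_asset_to_json("10.0.0.9 - N/A/N/A"))
```

The first call yields a fully populated dict with `port` as an integer; the second leaves `protocol`, `port`, and `hostname` empty, which is what lands in the `_metadata.affected_hosts` field for ELK/Splunk.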

View File

@ -55,8 +55,12 @@ class vulnWhispererBase(object):
except: except:
self.enabled = False self.enabled = False
self.hostname = self.config.get(self.CONFIG_SECTION, 'hostname') self.hostname = self.config.get(self.CONFIG_SECTION, 'hostname')
try:
self.username = self.config.get(self.CONFIG_SECTION, 'username') self.username = self.config.get(self.CONFIG_SECTION, 'username')
self.password = self.config.get(self.CONFIG_SECTION, 'password') self.password = self.config.get(self.CONFIG_SECTION, 'password')
except:
self.username = None
self.password = None
self.write_path = self.config.get(self.CONFIG_SECTION, 'write_path') self.write_path = self.config.get(self.CONFIG_SECTION, 'write_path')
self.db_path = self.config.get(self.CONFIG_SECTION, 'db_path') self.db_path = self.config.get(self.CONFIG_SECTION, 'db_path')
self.verbose = self.config.getbool(self.CONFIG_SECTION, 'verbose') self.verbose = self.config.getbool(self.CONFIG_SECTION, 'verbose')
@ -200,6 +204,7 @@ class vulnWhispererBase(object):
def get_latest_results(self, source, scan_name): def get_latest_results(self, source, scan_name):
processed = 0 processed = 0
results = [] results = []
reported = ""
try: try:
self.conn.text_factory = str self.conn.text_factory = str
@ -218,6 +223,7 @@ class vulnWhispererBase(object):
except Exception as e: except Exception as e:
self.logger.error("Error when getting latest results from {}.{} : {}".format(source, scan_name, e)) self.logger.error("Error when getting latest results from {}.{} : {}".format(source, scan_name, e))
return results, reported return results, reported
def get_scan_profiles(self): def get_scan_profiles(self):
@ -274,6 +280,8 @@ class vulnWhispererNessus(vulnWhispererBase):
self.develop = True self.develop = True
self.purge = purge self.purge = purge
self.access_key = None
self.secret_key = None
if config is not None: if config is not None:
try: try:
@ -283,24 +291,36 @@ class vulnWhispererNessus(vulnWhispererBase):
'trash') 'trash')
try: try:
self.logger.info('Attempting to connect to nessus...') self.access_key = self.config.get(self.CONFIG_SECTION,'access_key')
self.secret_key = self.config.get(self.CONFIG_SECTION,'secret_key')
except:
pass
try:
self.logger.info('Attempting to connect to {}...'.format(self.CONFIG_SECTION))
self.nessus = \ self.nessus = \
NessusAPI(hostname=self.hostname, NessusAPI(hostname=self.hostname,
port=self.nessus_port, port=self.nessus_port,
username=self.username, username=self.username,
password=self.password) password=self.password,
profile=self.CONFIG_SECTION,
access_key=self.access_key,
secret_key=self.secret_key
)
self.nessus_connect = True self.nessus_connect = True
self.logger.info('Connected to nessus on {host}:{port}'.format(host=self.hostname, self.logger.info('Connected to {} on {host}:{port}'.format(self.CONFIG_SECTION, host=self.hostname,
port=str(self.nessus_port))) port=str(self.nessus_port)))
except Exception as e: except Exception as e:
self.logger.error('Exception: {}'.format(str(e))) self.logger.error('Exception: {}'.format(str(e)))
raise Exception( raise Exception(
'Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format( 'Could not connect to {} -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
self.CONFIG_SECTION,
config=self.config.config_in, config=self.config.config_in,
e=e)) e=e))
except Exception as e: except Exception as e:
self.logger.error('Could not properly load your config!\nReason: {e}'.format(e=e)) self.logger.error('Could not properly load your config!\nReason: {e}'.format(e=e))
sys.exit(1) return False
#sys.exit(1)
@@ -435,7 +455,7 @@ class vulnWhispererNessus(vulnWhispererBase):
             try:
                 file_req = \
                     self.nessus.download_scan(scan_id=scan_id, history=history_id,
-                                              export_format='csv', profile=self.CONFIG_SECTION)
+                                              export_format='csv')
             except Exception as e:
                 self.logger.error('Could not download {} scan {}: {}'.format(self.CONFIG_SECTION, scan_id, str(e)))
                 self.exit_code += 1
@@ -445,9 +465,10 @@ class vulnWhispererNessus(vulnWhispererBase):
                     pd.read_csv(io.StringIO(file_req.decode('utf-8')))
                 if len(clean_csv) > 2:
                     self.logger.info('Processing {}/{} for scan: {}'.format(scan_count, len(scan_list), scan_name.encode('utf8')))
-                    columns_to_cleanse = ['CVSS','CVE','Description','Synopsis','Solution','See Also','Plugin Output']
+                    columns_to_cleanse = ['CVSS','CVE','Description','Synopsis','Solution','See Also','Plugin Output', 'MAC Address']
                     for col in columns_to_cleanse:
+                        if col in clean_csv:
                             clean_csv[col] = clean_csv[col].astype(str).apply(self.cleanser)
                     clean_csv.to_csv(relative_path_name, index=False)
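The new `if col in clean_csv:` guard matters because not every export contains every column (the newly added 'MAC Address' in particular); membership testing on a pandas DataFrame checks column labels. A small self-contained sketch, with a hypothetical `cleanser` standing in for the class method:

```python
import pandas as pd

def cleanser(cell):
    # Illustrative stand-in for self.cleanser: strip characters that
    # would corrupt the downstream CSV pipeline
    return str(cell).replace('\n', ' ').replace(',', ';')

clean_csv = pd.DataFrame({'CVSS': [9.8], 'Description': ['buffer\noverflow']})
columns_to_cleanse = ['CVSS', 'CVE', 'Description', 'Synopsis',
                      'Solution', 'See Also', 'Plugin Output', 'MAC Address']

for col in columns_to_cleanse:
    if col in clean_csv:  # skip columns this export doesn't have
        clean_csv[col] = clean_csv[col].astype(str).apply(cleanser)

print(clean_csv['Description'][0])
# -> buffer overflow
```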
@@ -555,8 +576,11 @@ class vulnWhispererQualys(vulnWhispererBase):
         self.logger = logging.getLogger('vulnWhispererQualys')
         if debug:
             self.logger.setLevel(logging.DEBUG)
+        try:
             self.qualys_scan = qualysScanReport(config=config)
+        except Exception as e:
+            self.logger.error("Unable to establish connection with Qualys scanner. Reason: {}".format(e))
+            return False
         self.latest_scans = self.qualys_scan.qw.get_all_scans()
         self.directory_check()
         self.scans_to_process = None
@@ -642,8 +666,7 @@ class vulnWhispererQualys(vulnWhispererBase):
         if cleanup:
             self.logger.info('Removing report {} from Qualys Database'.format(generated_report_id))
-            cleaning_up = \
-                self.qualys_scan.qw.delete_report(generated_report_id)
+            cleaning_up = self.qualys_scan.qw.delete_report(generated_report_id)
             os.remove(self.path_check(str(generated_report_id) + '.csv'))
             self.logger.info('Deleted report from local disk: {}'.format(self.path_check(str(generated_report_id))))
         else:
@@ -728,10 +751,14 @@ class vulnWhispererOpenVAS(vulnWhispererBase):
         self.develop = True
         self.purge = purge
         self.scans_to_process = None
+        try:
             self.openvas_api = OpenVAS_API(hostname=self.hostname,
                                            port=self.port,
                                            username=self.username,
                                            password=self.password)
+        except Exception as e:
+            self.logger.error("Unable to establish connection with OpenVAS scanner. Reason: {}".format(e))
+            return False

     def whisper_reports(self, output_format='json', launched_date=None, report_id=None, cleanup=True):
         report = None
@@ -842,8 +869,11 @@ class vulnWhispererQualysVuln(vulnWhispererBase):
         self.logger = logging.getLogger('vulnWhispererQualysVuln')
         if debug:
             self.logger.setLevel(logging.DEBUG)
+        try:
             self.qualys_scan = qualysVulnScan(config=config)
+        except Exception as e:
+            self.logger.error("Unable to create connection with Qualys. Reason: {}".format(e))
+            return False
         self.directory_check()
         self.scans_to_process = None
@@ -854,7 +884,7 @@ class vulnWhispererQualysVuln(vulnWhispererBase):
                         scan_reference=None,
                         output_format='json',
                         cleanup=True):
-        launched_date
         if 'Z' in launched_date:
             launched_date = self.qualys_scan.utils.iso_to_epoch(launched_date)
         report_name = 'qualys_vuln_' + report_id.replace('/','_') \
@@ -966,6 +996,13 @@ class vulnWhispererJIRA(vulnWhispererBase):
         self.config_path = config
         self.config = vwConfig(config)
         self.host_resolv_cache = {}
+        self.host_no_resolv = []
+        self.no_resolv_by_team_dict = {}
+        # Save locally the assets without a DNS entry, to flag them to system owners
+        self.no_resolv_fname = "no_resolv.txt"
+        if os.path.isfile(self.no_resolv_fname):
+            with open(self.no_resolv_fname, "r") as json_file:
+                self.no_resolv_by_team_dict = json.load(json_file)
         self.directory_check()

         if config is not None:
@ -983,7 +1020,8 @@ class vulnWhispererJIRA(vulnWhispererBase):
raise Exception( raise Exception(
'Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format( 'Could not connect to nessus -- Please verify your settings in {config} are correct and try again.\nReason: {e}'.format(
config=self.config.config_in, e=e)) config=self.config.config_in, e=e))
sys.exit(1) return False
#sys.exit(1)
profiles = [] profiles = []
profiles = self.get_scan_profiles() profiles = self.get_scan_profiles()
@@ -1173,6 +1211,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
                     self.logger.debug("Hostname found: {hostname}.".format(hostname=values['dns']))
                 except:
                     self.host_resolv_cache[values['ip']] = ''
+                    self.host_no_resolv.append(values['ip'])
                     self.logger.debug("Hostname not found for: {ip}.".format(ip=values['ip']))

         for key in values.keys():
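The hunk above records unresolved IPs alongside the existing resolution cache. A self-contained sketch of that cache-plus-miss-list pattern (the function and variable names are illustrative; the real code keeps these as class attributes):

```python
import socket

host_resolv_cache = {}
host_no_resolv = []

def resolve(ip):
    """Reverse-resolve an IP, caching results; misses are cached as ''
    and the IP is remembered so it can be reported later."""
    if ip in host_resolv_cache:
        return host_resolv_cache[ip]
    try:
        hostname = socket.gethostbyaddr(ip)[0]
        host_resolv_cache[ip] = hostname
    except OSError:  # socket.herror/gaierror both derive from OSError
        host_resolv_cache[ip] = ''
        host_no_resolv.append(ip)
    return host_resolv_cache[ip]

# Simulate a previously resolved host: the second lookup is a pure cache hit
host_resolv_cache['192.0.2.10'] = 'db01.example.com'
print(resolve('192.0.2.10'))
# -> db01.example.com
```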
@@ -1208,6 +1247,7 @@ class vulnWhispererJIRA(vulnWhispererBase):
             vulnerabilities = self.parse_qualys_vuln_vulnerabilities(fullpath, source, scan_name, min_critical, dns_resolv)

         #***JIRA sync***
+        try:
             if vulnerabilities:
                 self.logger.info('{source} data has been successfuly parsed'.format(source=source.upper()))
                 self.logger.info('Starting JIRA sync')
@@ -1217,6 +1257,18 @@ class vulnWhispererJIRA(vulnWhispererBase):
                 self.logger.info("[{source}.{scan_name}] No vulnerabilities or vulnerabilities not parsed.".format(source=source, scan_name=scan_name))
                 self.set_latest_scan_reported(fullpath.split("/")[-1])
                 return False
+        except Exception as e:
+            self.logger.error("Error: {}".format(e))
+            return False
+
+        # Write the assets without DNS resolution to file, if any;
+        # the old list for this scan is replaced by the new one, or created if missing
+        if self.host_no_resolv:
+            self.no_resolv_by_team_dict[scan_name] = self.host_no_resolv
+            with open(self.no_resolv_fname, 'w') as outfile:
+                json.dump(self.no_resolv_by_team_dict, outfile)
+
         self.set_latest_scan_reported(fullpath.split("/")[-1])
         return True
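The constructor hunk earlier loads `no_resolv.txt` when it exists, and the block above rewrites the entry for the current scan after each sync. The load/replace/save round trip can be sketched as follows (the temporary path and scan name are illustrative):

```python
import json
import os
import tempfile

no_resolv_fname = os.path.join(tempfile.mkdtemp(), 'no_resolv.txt')

# Constructor side: load the previous run's data if the file exists
no_resolv_by_team = {}
if os.path.isfile(no_resolv_fname):
    with open(no_resolv_fname, 'r') as json_file:
        no_resolv_by_team = json.load(json_file)

# Sync side: replace (or create) this scan's list and persist it
host_no_resolv = ['10.0.0.5', '10.0.0.9']
if host_no_resolv:
    no_resolv_by_team['team_a_scan'] = host_no_resolv
    with open(no_resolv_fname, 'w') as outfile:
        json.dump(no_resolv_by_team, outfile)

with open(no_resolv_fname) as f:
    print(json.load(f))
# -> {'team_a_scan': ['10.0.0.5', '10.0.0.9']}
```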
@@ -1226,7 +1278,13 @@ class vulnWhispererJIRA(vulnWhispererBase):
         if autoreport_sections:
             for scan in autoreport_sections:
+                try:
                     self.jira_sync(self.config.get(scan, 'source'), self.config.get(scan, 'scan_name'))
+                except Exception as e:
+                    self.logger.error(
+                        "VulnWhisperer wasn't able to report the vulnerabilities from source '{}', section '{}'.\nError: {}".format(
+                            self.config.get(scan, 'source'), self.config.get(scan, 'scan_name'), e))
             return True
         return False
@@ -1258,35 +1316,35 @@ class vulnWhisperer(object):
         if self.profile == 'nessus':
             vw = vulnWhispererNessus(config=self.config,
-                                     username=self.username,
-                                     password=self.password,
-                                     verbose=self.verbose,
                                      profile=self.profile)
+            if vw:
                 self.exit_code += vw.whisper_nessus()
         elif self.profile == 'qualys_web':
             vw = vulnWhispererQualys(config=self.config)
+            if vw:
                 self.exit_code += vw.process_web_assets()
         elif self.profile == 'openvas':
-            vw_openvas = vulnWhispererOpenVAS(config=self.config)
-            self.exit_code += vw_openvas.process_openvas_scans()
+            vw = vulnWhispererOpenVAS(config=self.config)
+            if vw:
+                self.exit_code += vw.process_openvas_scans()
         elif self.profile == 'tenable':
             vw = vulnWhispererNessus(config=self.config,
-                                     username=self.username,
-                                     password=self.password,
-                                     verbose=self.verbose,
                                      profile=self.profile)
+            if vw:
                 self.exit_code += vw.whisper_nessus()
         elif self.profile == 'qualys_vuln':
             vw = vulnWhispererQualysVuln(config=self.config)
+            if vw:
                 self.exit_code += vw.process_vuln_scans()
         elif self.profile == 'jira':
             #first we check config fields are created, otherwise we create them
             vw = vulnWhispererJIRA(config=self.config)
+            if vw:
                 if not (self.source and self.scanname):
                     self.logger.info('No source/scan_name selected, all enabled scans will be synced')
                     success = vw.sync_all()
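The `if vw:` guards above assume a falsy object comes back when a constructor takes its `return False` path; in Python, though, a non-None `return` from `__init__` raises `TypeError`, and a plain instance is always truthy. A hedged sketch of the explicit-flag alternative (the class and field names here are hypothetical, not part of VulnWhisperer):

```python
class Scanner:
    def __init__(self, config):
        self.connected = False
        try:
            if 'host' not in config:
                raise ValueError('missing host')
            # ... establish the real connection here ...
            self.connected = True
        except Exception as e:
            print('Connection failed: {}'.format(e))

vw = Scanner({})        # the connection attempt fails
print(bool(vw))         # -> True: an instance is always truthy
print(vw.connected)     # -> False: the flag carries the real state
```

Callers would then check `vw.connected` rather than the instance itself.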