Zscaler Splunk App - Design and Installation documentation

Zscaler is pleased to release the attached document in conjunction with the latest version of the Zscaler Splunk App. This new version adds some great new capabilities, with Zscaler APIs being used to retrieve Admin Audit Logs (ZIA) and detailed Cloud Sandbox detonation correlation and reporting.

Splunk Design and Install - 20200402.docx

The Splunk App and Technical Add-On can be downloaded from Splunkbase.

Your feedback is always welcome; please feel free to comment here or contact splunk-support@zscaler.com


Can we stream the logs directly from the Zscaler Cloud to Splunk (on-prem), or do we still need a Zscaler NSS VM to stream them to the Splunk app?
This is an on-premises environment.

Yes, NSS is still required.


Seems like the ‘zscalernss-tunnel’ sourcetype is not defined in the current version of the app. Is there any workaround for this, in order to process tunnel logs in Splunk?

The missing sourcetype has now been added, version 2.0.4 contains the fix.

The Admin Audit Logs works great.

Do we have any documents about QRadar/NSS integration?


We are currently working on a refresh to the DSM and App directly with the IBM team. CC @roguerunner, @rahim888


Does anyone know if this app works with Splunk 8? Python 3?
I think there may be an issue with the audit log.


This has not been validated yet. The plan is to get this done before Python 2.7’s end-of-life date.

The underlying Zscaler SDK is Python 2/3 cross-compatible, so we’re not expecting issues, but we do need to validate before marking the app as compatible.

It seems to be failing for us. The _raw is below; it looks like I get a 200 from the server but then a socket error.

12-16-2019 16:43:32.959 -0500 WARN HttpListener - Socket error from while accessing /servicesNS/nobody/search/data/inputs/zscalerapi-zia-audit/zscaler_audit/: Broken pipe
12-16-2019 16:43:32.945 -0500 INFO ExecProcessor - Removing status item "/opt/splunk/etc/apps/TA-Zscaler_CIM/bin/zscalerapi-zia-audit.py (zscalerapi-zia-audit://zscaler_audit) (isModInput=yes) - splunk-system-user [16/Dec/2019:16:43:32.906 -0500] "GET /services/data/inputs/zscalerapi-zia-audit/zscaler_audit HTTP/1.0" 200 5204 - - - 14ms

This is interesting; I believe it indicates a Splunk internal error, as the TA does not use any loopback calls. Can you please review this on Splunk Answers and see if the mentioned fix works in your environment?


FYI - V8 and Py3 are now confirmed. Splunkbase has been updated for both the TA and the App.


Feature request: this app could use HTTP proxy support out of the box!

If your local Splunk infrastructure cannot connect to the internet directly, here’s a quick’n’dirty hack to add HTTP proxy support to the session handler for fetching Audit logs and Sandbox results.

Make the following modifications to the file TA-Zscaler_CIM\bin\zscaler_python_sdk\Session.py:

  1. On line 8, add a definition for your local proxy:

myproxy = {'https': 'http://my.proxy.net:3128'}

  2. Around line 93, add a proxies= parameter to the _perform_get_request function referencing your proxy definition:

def _perform_get_request(self, uri, header):
    res = self.session.get(..., proxies=myproxy)

  3. Repeat for the _perform_post_request (around line 119) and _perform_delete_request (around line 148) functions.

Reference: https://2.python-requests.org/en/master/user/advanced/#proxies
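Put together, the three edits can be sketched as below. This is a hedged illustration, not the actual Session.py source: the class skeleton, method signatures, and the optional session-injection parameter are assumptions reconstructed from the snippets above; only the proxies= additions reflect the change being described.

```python
# Sketch of the proxy patch for TA-Zscaler_CIM/bin/zscaler_python_sdk/Session.py.
# Class layout is an assumption; only the proxies= arguments are the described change.

# Local proxy definition (replace host/port with your own)
myproxy = {'https': 'http://my.proxy.net:3128'}

class Session:
    def __init__(self, session=None):
        # session injection is added here purely so the sketch can be
        # exercised without network access; the real SDK builds its own
        if session is None:
            import requests  # the SDK uses the requests library
            session = requests.Session()
        self.session = session

    def _perform_get_request(self, uri, header):
        # proxies= routes the request through the local HTTP proxy
        return self.session.get(uri, headers=header, proxies=myproxy)

    def _perform_post_request(self, uri, header, body):
        return self.session.post(uri, headers=header, json=body, proxies=myproxy)

    def _perform_delete_request(self, uri, header):
        return self.session.delete(uri, headers=header, proxies=myproxy)
```

Passing proxies= per call (rather than setting session.proxies once) keeps the change local to the three methods the post identifies, which makes the hack easy to diff and revert.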


Great share, thank you. I’ll look at baking this into the App.

@Eddie_Parra for SDK input.

Many thanks in advance for looking into this. :slightly_smiling_face: You might want to check out the proxy support implementation in the Office 365 Reporting Add-On in Splunkbase (https://splunkbase.splunk.com/app/3720/), it’s pretty similar to your app and also uses the App Builder framework.


Working on the proxy feed. The Zscaler-recommended feed format and the one in the PDF have some differences, besides the addition of several other fields. Several fields are surrounded by double quotes, including "%s{ereferer}" and most of the reqsize fields. Are these double quotes truly needed?

Also, when I enable the tunnel feed, the proxy feed seems to stop. Is there something special I need to do when sending Proxy, Tunnel, and Alert feeds to the NSS AMI?

Hi @Dan_Smart, please use the fields in the design document, these are tested and known to work. We’ve found the quotes are needed to avoid some KV extraction issues with query strings.
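For illustration, quoting a field in the NSS feed format looks like the fragment below. This is a hand-written fragment, not the tested format from the design document; only the ereferer and reqsize field names come from the post above, and the "..." elisions stand for the rest of the format string:

```
... referer="%s{ereferer}"	reqsize="%d{reqsize}" ...
```

The surrounding double quotes keep Splunk's automatic KV extraction from splitting a referrer URL's query string into spurious fields.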

For the latter issue regarding the tunnel feed, please open a support ticket. Enabling the tunnel feed shouldn’t stop/break anything.


Just to be clear, the three feeds can share the same listener port on the syslog/heavy forwarder. Correct?

In my environment I use a dedicated port for each sourcetype, going direct into the forwarder.

If you use Splunk Connect for Syslog (SC4S) you can leverage a single port.
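For reference, the dedicated-port-per-sourcetype approach might look like the inputs.conf sketch below on the heavy forwarder. The port numbers, index name, and the web/alert sourcetype strings are illustrative assumptions, not values from the design document (only zscalernss-tunnel appears earlier in this thread):

```ini
# inputs.conf on the syslog/heavy forwarder -- illustrative sketch only;
# check the design document for the actual sourcetype names and ports
[tcp://5514]
sourcetype = zscalernss-web
index = zscaler

[tcp://5515]
sourcetype = zscalernss-tunnel
index = zscaler

[tcp://5516]
sourcetype = zscalernss-alerts
index = zscaler
```

One port per feed avoids having to re-route events to the right sourcetype at parse time, at the cost of a few extra NSS feed definitions.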
