Inspect JSON and API content for deep analysis of non-sanctioned cloud storage

(Ben Lyons) #1

Hi Guys,

I am looking for a solution that can give me greater visibility into my users uploading files to non-enterprise-owned cloud storage tenants. 99% of the time this would be OK; however, there are instances where we may need to monitor or retrospectively review cloud storage activity. The tenant-level analysis is key to my requirement.

e.g. an accountant uploads a tax return to their own private Google Drive storage, which is not part of the corporate network (we have a corporate DMS and are Office 365-centric). This document does not contain any DLP tags or other identifiers to make it a known “type”. I simply want to be able to know that a user uploaded a file called random.pdf to a non-firm cloud storage provider.

We are currently using MS 365 E5, with a strong focus on Defender and the MS ATP stack, and are incrementally building integration into the Sentinel engine. Our current forward proxy logs are shipped to MS CASB, providing us with high-level information.

Is this possible now or in the near future? Any feedback would be appreciated.


(Thomas Quinlan) #2

Welcome Ben!

The scenario you’re describing would best be served by our DLP offering. It comes in two forms: Patterns/Regular Expressions, and Exact Data Match.

We don’t scan Office 365 traffic (per Microsoft’s recommendation) so you would need to use existing Microsoft scanning for Office traffic.

For everything else, setting up DLP with patterns/regular expressions and/or Exact Data Match would fit the scenario you describe. For instance, using existing patterns/dictionaries (such as for US SSNs) you could search with varying confidence levels (vs. potential false positives, as would be the case in any DLP solution). Alternatively, if you know the data you want to search for (something specific to the tax returns), you could describe that data in the EDM index tool (an on-prem VM), and all traffic through Zscaler would be searched for that data. You could choose to monitor and alert, block, or forward information to a third-party DLP tool (typically on-prem).
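To make the patterns/dictionaries idea concrete, here is a minimal sketch of the kind of SSN-style pattern a DLP dictionary might contain. This is illustrative only, not Zscaler's actual pattern: real DLP engines add area-number validation and context keywords to raise confidence and reduce false positives.

```python
import re

# Illustrative SSN-like pattern (hypothetical, not a vendor dictionary):
# three digits, two digits, four digits, excluding a few invalid prefixes.
SSN_PATTERN = re.compile(r"\b(?!000|666|9\d{2})\d{3}-\d{2}-\d{4}\b")

def find_ssn_candidates(text: str) -> list[str]:
    """Return substrings that match the SSN-like pattern."""
    return SSN_PATTERN.findall(text)

print(find_ssn_candidates("Return filed for 123-45-6789; invoice 999-99-9999."))
# → ['123-45-6789']  (999-xx-xxxx is excluded as an invalid prefix)
```

In a real deployment you would tune thresholds (e.g. number of matches per file) to trade detection confidence against false positives, exactly as described above.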

We also have an integration with MCAS through our Nanologging Service. I don’t believe it fits the particular requirements outlined, but I wanted to mention it so you’re aware.

(Ben Lyons) #3

Hi Thomas,

Thanks for the response.

I can see how the DLP tech will provide the functionality if I am able to proactively describe the various patterns/regex. We already have various regex, DLP tagging, and information rights management tools to assist with “known” content; it is the flow of data to external sources that we wish to log.

My primary goal is to see which tenants they are accessing on Google Drive, OneDrive, Box, Slack, Hangouts, or any other SaaS. This information is exposed through the HTTP headers and JSON queries when interacting with the various sites. I do not wish to block traffic to these sites using MS CASB-integrated policies with Zscaler, as there will always be legitimate use scenarios.

Using the ZIA proxy to inspect headers and log them, or to assign different rules based on whether traffic is bound for our own tenant or a sanctioned tenant of the various cloud services, is key.

The following MS and Google articles partially describe the concept, but they are policy-driven hard rules rather than providing the ability to review logs after an event. I understand a competitor product which sounds just like an old web browser can currently provide this functionality.
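For reference, the policy-driven mechanism those vendor articles describe is header injection at the forward proxy: the proxy adds a tenant-restriction header, and the cloud service itself refuses sign-ins to tenants not on the list. The header names below are, to my understanding, the documented ones for Microsoft 365 tenant restrictions and Google Workspace; the domain value is a placeholder.

```python
# Tenant-restriction headers a forward proxy would inject (values are
# placeholders). This enforces hard allow-lists at sign-in time, which is
# exactly the "hard rules" behaviour noted above, not after-the-event logging.
TENANT_RESTRICTION_HEADERS = {
    # Microsoft 365 tenant restrictions (evaluated by Azure AD):
    "Restrict-Access-To-Tenants": "ourfirm.com",
    # Google Workspace equivalent:
    "X-GoogApps-Allowed-Domains": "ourfirm.com",
}
```

This contrasts with the logging-first approach asked for above: header injection blocks unsanctioned tenants outright, whereas the requirement here is to permit the traffic and record which tenant was used.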