
Seamless Log Shipping: Unlocking Netskope’s Native Integration with Microsoft Sentinel

In today’s security landscape, visibility and real-time insights into your organization’s data are critical for effective threat hunting and incident management. To meet these needs, Netskope has developed a native integration with Microsoft Sentinel using the Codeless Connector Platform—allowing organizations to easily stream all Netskope Web Transaction Logs into Microsoft’s cloud-native SIEM.


This blog explores how the new integration seamlessly pulls all of your organization's Web Transactions into a single Log Analytics table in near real time.


Architecture



In this architecture, data is pushed from Netskope to the customer's Blob Storage container. Every time a file containing Web Transactions or Netskope Alerts and Events is written, a blob notification is sent to Event Grid. Event Grid then forwards the blob URI to a Storage Queue, so the Scuba workers (Microsoft Sentinel CCP connectors) know what to pull from Blob Storage and where to find it; the workers achieve this by popping messages from the queue. The Scuba workers then ingest the data and send the Netskope Web Transactions and Events to the Data Collection Rule created by the ARM template, after which the records appear in your Log Analytics table inside Sentinel.
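To make the pop-message pattern concrete, here is a minimal, purely illustrative Python sketch of the loop the Scuba workers follow. You do not deploy this code yourself; the managed CCP connector performs all of these steps for you. The storage account and queue names are hypothetical placeholders.

```python
# Illustrative only: the notification-driven pull pattern described above.
# The managed CCP connector performs these steps for you.
import json

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient
from azure.storage.queue import QueueClient, TextBase64DecodePolicy

credential = DefaultAzureCredential()

# Event Grid delivers BlobCreated notifications to a Storage Queue
# (base64-encoded, hence the decode policy). Names are placeholders.
queue = QueueClient(
    account_url="https://<storage-account>.queue.core.windows.net",
    queue_name="<name>-notification",
    credential=credential,
    message_decode_policy=TextBase64DecodePolicy(),
)

# "Pop" pending notifications: each one points at a newly written log file.
for message in queue.receive_messages():
    event = json.loads(message.content)
    blob_url = event["data"]["url"]  # URI of the new Web Transactions file

    # Pull the file straight from Blob Storage...
    blob = BlobClient.from_blob_url(blob_url, credential=credential)
    payload = blob.download_blob().readall()

    # ...then ship the parsed rows to the Data Collection Rule endpoint,
    # which lands them in the Log Analytics table inside Sentinel.
    # (Omitted here; the connector handles DCR ingestion internally.)

    queue.delete_message(message)  # remove the handled notification
```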


How It Works: A Seamless Process for Log Management

Netskope’s native integration allows for rapid log shipping without any additional deployment overhead. Here’s how you can enable the integration today:


  1. Navigate to the Azure Portal, select Deploy a custom template, then choose Build your own template in the editor and paste the JSON template below into the editor. (A scripted alternative is sketched after these steps.)

For Web Transactions, paste the JSON from the following gist: https://gist.githubusercontent.com/mitchellgulledge2/a463b7e13bfbcffffa43e5e978cd6d13/raw/1d906d9cc8517a438ba962684eafa8974ed74350/webtx_all_columns_netskope_blob.json


  2. Once pasted, click Next, then fill out the workspace location, region, and resource group on the following screen.


  3. After clicking Review + create, navigate to your Sentinel instance under Data Connectors, where you will see NetskopeWebTxConnector.
  4. Select Open connector page and fill out the following parameters:
    1. The service principal ID of the Sentinel Azure Storage enterprise application
    2. The blob container URL you want to collect data from
    3. The blob container's storage account location
    4. The blob container's storage account resource group name
    5. The blob container's storage account subscription ID
    6. The Event Grid topic name of the blob container's storage account, if one exists; otherwise leave it empty
  5. Once the parameters are filled in, click Connect, and data will start flowing into your Log Analytics table.
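If you prefer scripting step 1 instead of clicking through the Portal, the same deployment can be driven with the Azure SDK. The sketch below is a rough equivalent, not the documented procedure: it assumes you have saved the gist above locally as webtx_template.json, and the subscription, resource group, and template parameters are placeholders you must fill in for your environment.

```python
# A scripted alternative to the Portal's "Deploy a custom template" flow.
# Assumes the gist has been saved locally as webtx_template.json.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<subscription-id>"          # placeholder
resource_group = "<sentinel-resource-group>"   # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

with open("webtx_template.json") as f:
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    resource_group,
    "netskope-webtx-connector",  # arbitrary deployment name
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            # "parameters": {...},  # workspace name/location, per the template
        }
    },
)
print(poller.result().properties.provisioning_state)  # expect "Succeeded"
```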

Validation:
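With the connector showing as connected, confirm that records are actually landing in the workspace. In the portal, run a simple KQL query against the destination table; the minimal sketch below does the same thing programmatically. The table name NetskopeWebtxData_CL is an assumption, so substitute whatever table the solution created in your workspace.

```python
# Minimal validation sketch: run the same KQL you'd use in the Logs blade.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",   # placeholder
    query="NetskopeWebtxData_CL | take 10",        # hypothetical table name
    timespan=timedelta(hours=1),
)
for table in response.tables:
    print(f"{len(table.rows)} rows returned")

# For troubleshooting (step 19 below), the same client can run
# 'SentinelHealth | where TimeGenerated > ago(1d)' for connector health clues.
```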

Troubleshooting:

  1. Perform all the steps below first in a "clean room" environment. That is, create a new resource group containing only the minimal set of resources: a Sentinel-enabled Log Analytics workspace, a Storage Account, and a Blob Container with the expected name.
  2. Ensure diagnostic logs are enabled on the Sentinel workspace. Do this before anything else.


  3. Deploy ONLY ONE Netskope Log Streaming Blob solution template into this workspace. Check resource group deployments to ensure that it deployed without error. After deploying, if you make any changes to the solution template, start over from step 1 in a new clean resource group.
  4. Connect the connector. Use the storage account in the same resource group as the Sentinel workspace. For the Netskope connector, leave the Event Grid topic blank, and be sure to set a folder name.


  5. Check resource group deployments to ensure that it deployed without error.
  6. Check that the storage account has a "Microsoft.Storage.BlobCreated" Event Subscription with the expected folder prefix.
  7. Click into the Event Subscription details. Ensure the "Endpoint" is {name}-notification.
  8. Check that the Storage account has two queues, {name}-dlq and {name}-notification.
  9. Check the role assignments on each queue. Ensure the expected App Registration (e.g. "ScubaSentinelToStorageProd") has the Storage Queue Data Contributor role. You can verify it's the correct app by clicking on its name and comparing the Object ID on the "Enterprise Application Overview" page with the Service Principal ID in the connector deployment page.
  10. Manually add a piece of test data to the blob container (see the upload sketch after this list). Be sure the data is formatted as the DCR expects, is in the correct file format (e.g. .csv.gz), and is placed in the expected folder.
  11. Ensure the blob was added in the correct folder, with the expected name.
  12. Check the metrics on the Events page for the Storage Account. Look for Published Events and Delivered Events to jump up to 1. Refresh as needed.
  13. Check the {name}-notification queue quickly. Look for a message to arrive in that queue with a folder and file name matching the file you just added. Refresh regularly for a few minutes; delivery can take up to 10. The message should then disappear from the queue because it has been picked up by the connector.
  14. Check the metrics for the DCR. Look for "Log Ingestion Requests per minute" to come up.
  15. Check the Sentinel Logs: run a simple KQL query for just the table name (as in the Validation sketch above) and look for matching data to come up.
  16. Visit the connector page. Look for a green status and an indication that the "last log received" timestamp is recent.
  17. Repeat steps 10-15 a few more times.
  18. Now turn on your blob data source, and repeat steps 11-15.
  19. Problems? Check the SentinelHealth table for clues.
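For step 10, a quick way to drop a correctly formatted test file into the container is a short script along these lines. This is a minimal sketch under stated assumptions: the storage account, container, folder, and CSV columns are placeholders, so match them to your connector configuration and to the schema your DCR expects.

```python
# Upload a gzipped CSV test blob into the folder the connector watches.
# Container/folder names and CSV columns are placeholders; adjust to match
# your connector configuration and the DCR's expected schema.
import gzip

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

csv_rows = (
    b"timestamp,user,url,action\n"
    b"2024-01-01T00:00:00Z,test@example.com,https://example.com,allow\n"
)

blob = BlobClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    container_name="<container>",
    blob_name="<expected-folder>/test_webtx.csv.gz",  # expected folder + name
    credential=DefaultAzureCredential(),
)
blob.upload_blob(gzip.compress(csv_rows), overwrite=True)
print("Test blob uploaded; watch the Events metrics and notification queue.")
```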

