
Integrating Web Transactions from Netskope Log Streaming to Microsoft Sentinel

  • April 30, 2025

Kmaheshwari
Netskope Employee

(Article updated on 15 December 2025; see the change log at the bottom.)

Netskope is actively partnering with Microsoft to bring CCF connectivity to Azure Blob Storage into public preview. The CCF connector for web transaction logs (streamed to blob) will be published to the Content Hub separately, as Microsoft advises. In the meantime, follow this article if you would like to proceed with the custom template from the Netskope OSS GitHub repo, which Netskope has vetted with Microsoft and several Netskope customers.

Customers that have adopted CCF can also use the CCF connector for retrieving event and alert logs (instead of using Cloud Exchange or Azure Functions) via the Netskope RESTful API gateway. Learn more here:

 


This document outlines the integration of Netskope Web Transaction Logs with Microsoft Sentinel using the Codeless Connector Framework (CCF). This native integration provides near real-time visibility for effective threat hunting and incident management. To configure Netskope Log Streaming to send logs to your Blob storage, see: https://docs.netskope.com/en/stream-logs-to-azure-blob

🏗️ Integration Architecture Overview

The integration leverages several Azure services to ensure reliable and scalable log delivery:

  1. Netskope to Azure Storage Blob: Netskope pushes Web Transaction Logs (and Alerts/Events) into a customer-owned Azure Storage Blob Container.

  2. Event Grid Notification: Every time a new file (blob) is written, an automatic notification is sent to Azure Event Grid.

  3. Storage Queue Messaging: Event Grid pushes the blob URI to a Storage Queue. This acts as a work queue for the connector.

  4. Microsoft Sentinel CCF Connector (Scuba Workers): These connectors continuously pull (pop) messages from the Storage Queue.

  5. Data Ingestion: The Scuba workers retrieve the data from the Blob Storage, process it, and ingest it into the Log Analytics Workspace via a Data Collection Rule (DCR).

  6. Log Analytics: The Netskope Web Transactions and Events appear in the specified Log Analytics table within Microsoft Sentinel.

This article explores how this new integration pulls all of your organization's web transactions, in near real time, into a single Log Analytics table.
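To observe this flow end to end, you can peek at the notification queue that Event Grid feeds. Below is a minimal PowerShell sketch, assuming the Az.Storage module; the storage account and queue names are illustrative placeholders (the connector's queue is named {name}-notification, as described under Troubleshooting below):

```powershell
# Peek (without dequeuing) at pending Event Grid messages; each message
# body carries the URI of a newly written blob. Names are placeholders.
$ctx   = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
$queue = Get-AzStorageQueue -Name "netskopewebtx-notification" -Context $ctx

$queue.QueueClient.PeekMessages(5).Value | ForEach-Object {
    $_.Body.ToString()
}
```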

 

🔒 Prerequisites

To successfully deploy the custom template and configure the Netskope Data Connector, the user or Service Principal performing the steps must have the following minimum permissions assigned:

1. Microsoft Entra ID (Tenant-Level Role)

The deployment process involves working with Enterprise Applications (Service Principals) to set up the connector.

Role: Application Administrator
Scope: Microsoft Entra ID (Tenant)
Required Action: Create, manage, and configure all aspects of application registrations and Enterprise Applications (Service Principals).
Minimum Prerequisite Role for Assignment: Privileged Role Administrator

 

2. Azure Role-Based Access Control (Subscription-Level Role)

The connector deployment uses an ARM template that creates several Azure resources (Data Collection Rule, Event Subscription, etc.) and requires the ability to deploy resources.

Role: Contributor
Scope: Azure Subscription
Required Action: Grants full access to manage all resources (including deploying the ARM template and creating the required resources), excluding the ability to assign roles (such as Owner or User Access Administrator) to others.
Minimum Prerequisite Role for Assignment: Owner or User Access Administrator (at the same scope)
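For illustration, a minimal sketch of granting the Contributor role with the Az PowerShell module, assuming the caller already holds Owner or User Access Administrator at the subscription scope; the sign-in name and subscription ID are placeholders:

```powershell
# Grant Contributor at subscription scope to the deploying identity.
# The sign-in name and subscription ID below are placeholders.
New-AzRoleAssignment `
    -SignInName "deployer@contoso.com" `
    -RoleDefinitionName "Contributor" `
    -Scope "/subscriptions/00000000-0000-0000-0000-000000000000"

# Verify the assignment took effect.
Get-AzRoleAssignment -SignInName "deployer@contoso.com" |
    Select-Object RoleDefinitionName, Scope
```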

 

 

🚀 How to Enable the Integration

Netskope’s native integration simplifies log shipping with minimal deployment overhead. Follow these steps to enable the connector:

1. Deploy the Azure Sentinel Connector Template

  1. Navigate to the Azure Portal.

  2. Search for and select Deploy a Custom Template.

  3. Select Build your own template in the editor.

  4. Paste the template's JSON file URL (from the Netskope OSS GitHub repo) into the editor:

  5. Click Next.

  6. Fill in the required deployment parameters (Workspace, Location/Region, and Resource Group where your Sentinel instance is located).

  7. Click Review + Create and then Create to deploy the necessary Azure resources (Event Subscription, Storage Queues, Data Collection Rule, etc.).
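The same deployment can be scripted. Below is a minimal sketch with the Az.Resources module; <template-url> stands in for the JSON file URL from the Netskope OSS repo, and any template parameters not supplied here are prompted for interactively:

```powershell
# Deploy the custom template into the resource group that hosts Sentinel.
# Replace <template-url> with the JSON file URL from the Netskope OSS repo.
New-AzResourceGroupDeployment `
    -ResourceGroupName "sentinel-rg" `
    -TemplateUri "<template-url>" `
    -Verbose
```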

2. Configure the Data Connector in Sentinel

  1. Navigate to your Microsoft Sentinel Instance.

  2. Under Content management, select Data Connectors.

  3. Locate the new connector named NetskopeWebTxConnector.

  4. Select Open connector page.

  5. Fill out the required parameters:

Parameter: Sentinel azure storage enterprise application - service principal ID
Description: Grant tenant admin consent so the Service Principal can be created. If the Service Principal already exists, this field will be auto-populated.

Parameter: The blob container URL you want to collect data from
Description: The full URL of the Netskope container (e.g., https://[storageaccountname].blob.core.windows.net/[containername]).

Parameter: The blob container's storage account location
Description: The Azure region of the Storage Account.

Parameter: The blob container's storage account resource group name
Description: The Resource Group name of the Storage Account.

Parameter: The blob container's storage account subscription id
Description: The Azure Subscription ID of the Storage Account.

Parameter: The event grid topic name of the blob container's storage account
Description: If an Event Grid System Topic already exists for the Storage Account, enter its name; if not, leave this empty. You can find the existing System Topic name under Events in the Blob Storage account.
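To check whether a System Topic already exists for the storage account, a minimal sketch assuming the Az.EventGrid module; the resource group and storage account names are placeholders:

```powershell
# List Event Grid System Topics and match on the storage account's
# resource ID. Names below are placeholders.
Get-AzEventGridSystemTopic -ResourceGroupName "netskope-rg" |
    Where-Object { $_.Source -like "*mystorageacct*" } |
    Select-Object Name, Source
```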

 

If the Service Principal doesn't exist, the connector page will show a prompt similar to the screenshot below; grant tenant admin consent.

 

The Service Principal is then created, as shown below, and its details are auto-populated as shown in the other screenshot.

 

  6. Before clicking Connect, assign the Storage Blob Data Contributor and Storage Queue Data Contributor roles to the Service Principal on the storage account (a PowerShell sketch follows these steps).
     

     

  7. Click Connect to finalize the integration.
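Step 6 can also be scripted. Below is a minimal sketch assuming the Az.Resources and Az.Storage modules; the Service Principal display name and resource names are placeholders (verify them against the Service Principal ID shown on the connector page):

```powershell
# Look up the connector's Service Principal and the target storage account.
# Display and resource names below are placeholders.
$sp      = Get-AzADServicePrincipal -DisplayName "ScubaSentinelToStorageProd"
$storage = Get-AzStorageAccount -ResourceGroupName "netskope-rg" -Name "mystorageacct"

# Grant both data-plane roles at the storage account scope.
foreach ($role in "Storage Blob Data Contributor", "Storage Queue Data Contributor") {
    New-AzRoleAssignment -ObjectId $sp.Id -RoleDefinitionName $role -Scope $storage.Id
}
```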

     

 

📝 Note: Data will typically begin appearing in your Log Analytics workspace within 20 minutes after successful connection.

 

  8. After connecting, you can view the data in the table NetskopeWebTransaction_CL (a query sketch follows).
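A minimal sketch for checking the table from PowerShell, assuming the Az.OperationalInsights module; the workspace and resource group names are placeholders:

```powershell
# Resolve the workspace GUID and run a simple KQL query against the table.
$ws = Get-AzOperationalInsightsWorkspace -ResourceGroupName "sentinel-rg" -Name "sentinel-ws"
Invoke-AzOperationalInsightsQuery `
    -WorkspaceId $ws.CustomerId `
    -Query "NetskopeWebTransaction_CL | take 10" |
    Select-Object -ExpandProperty Results
```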

     

🔒 Security Enhancement: Whitelisting Scuba Worker IPs (Only Needed If You Have Enabled the Blob Storage Firewall)

 

We still do not recommend enabling IP-based whitelisting, because the Microsoft Scuba IPs change frequently.

If your Azure Storage Account's networking is restricted using a firewall (e.g., set to "Enabled from selected virtual networks and IP addresses"), you must grant access to the Microsoft Sentinel Codeless Connector Framework (CCF) workers (Scuba workers) for successful data ingestion. You can download the Scuba IPs from the Microsoft Documentation: https://www.microsoft.com/en-us/download/details.aspx?id=56519

Additionally, you can use the following script to update the IPs on the Blob Storage firewall: https://raw.githubusercontent.com/netskopeoss/Netskope_Web_Transactions_Azure_Sentinel/refs/heads/main/Ipwhitelistscript.ps1

You may see an error like the one below for some of the IPs; ignore it and continue updating the remaining IPs:

Values for request parameters are invalid: networkAcls.ipRule[*].value. For more information, see - https://aka.ms/storagenetworkruleset
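For reference, a simplified sketch of what such a script does, assuming the Az.Storage module; the input file and resource names are placeholders:

```powershell
# Read Scuba IP ranges (one IP or CIDR per line, extracted from the
# service-tags download) and add each as a storage firewall rule.
$scubaIps = Get-Content ".\scuba-ips.txt"
foreach ($ip in $scubaIps) {
    try {
        Add-AzStorageAccountNetworkRule -ResourceGroupName "netskope-rg" `
            -Name "mystorageacct" -IPAddressOrRange $ip | Out-Null
    } catch {
        # Some ranges are rejected with the networkAcls.ipRule error shown
        # above; skip them and continue with the rest.
        Write-Warning "Skipped $ip : $($_.Exception.Message)"
    }
}
```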


Troubleshooting:

  1. Perform all the steps below first in a "clean room" environment. That is, create a new resource group that contains only the minimal set of resources: a Sentinel-enabled Log Analytics workspace, a Storage Account, and a Blob Container with the expected name.
  2. Ensure diagnostic logs are enabled on the Sentinel workspace. Do this before anything else.
  3. Deploy ONLY ONE Netskope Log Streaming Blob solution template into this workspace. Check resource group deployments to ensure that it deployed without error. If you change the solution template after deploying, start over from step 1 in a new clean resource group.
  4. Connect the connector. Use the storage account in the same resource group as the Sentinel workspace. For the Netskope connector, leave the event grid topic blank, and be sure to set a folder name.
  5. Check resource group deployments to ensure that the connector deployed without error.
  6. Check that the storage account has a "Microsoft.Storage.BlobCreated" Event Subscription with the expected prefix folder.
  7. Click into the Event Subscription details. Ensure the "Endpoint" is {name}-notification.
  8. Check that the storage account has two queues, {name}-dlq and {name}-notification.
  9. Check the role assignments on each queue. Ensure the expected App Registration (e.g., "ScubaSentinelToStorageProd") has the Storage Queue Data Contributor role. You can verify it is the correct app by clicking on its name and comparing the Object ID on the "Enterprise Application Overview" page with the Service Principal ID on the connector deployment page.
  10. Manually add a piece of test data to the blob container (see the upload sketch after this list). Be sure the data is formatted as the DCR expects, is in the correct format (e.g., .csv.gz), and is in the expected folder.
  11. Ensure the blob was added in the correct folder, with the expected name.
  12. Check the metrics on the Events page for the Storage Account. Look for Published Events and Delivered Events to jump up to 1. Refresh as needed.
  13. Check the {name}-notification queue quickly. Look for a message to arrive in that queue with the folder and file name matching the file you just added. Refresh regularly for a few minutes; it could take up to 10. The message should disappear from the queue once it has been picked up by the connector.
  14. Check the metrics for the DCR. Look for "Log Ingestion Requests per minute" to come up.
  15. Check the Sentinel Logs: run a simple KQL query for just the table name. Look for matching data to come up.
  16. Visit the connector page. Look for a green status and an indication that "last log received" was recent.
  17. Repeat steps 10-15 a few more times.
  18. Now turn on your blob data source, and repeat steps 11-15.
  19. Problems? Check the SentinelHealth table for clues.
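For step 10, a minimal sketch for uploading a test blob with the Az.Storage module; file, container, and folder names are placeholders (match the folder and format your DCR expects):

```powershell
# Upload a correctly formatted test file into the expected folder of the
# monitored container. Names below are placeholders.
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
Set-AzStorageBlobContent `
    -File ".\sample.csv.gz" `
    -Container "netskopewebtx" `
    -Blob "expected-folder/sample.csv.gz" `
    -Context $ctx
```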

 

 

Common Errors:

  • If you see the error: Deployment Failed. At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-deployment-operations for usage details.
    { "error": { "code": "BadRequest", "message": "Connectivity check failed. ConnectorId: NetskopeWebTransactions, Status code:S3B40010, Message:An unknown exception occurred." } }
    This is usually caused by a missing role assignment on the Service Principal, a Service Principal that was not created properly, or a Blob storage account that is restricted to selected networks without the Scuba IPs whitelisted. All of these steps are covered in the documentation above.

  • Failed to create required resources for data connector Invalid output table schema {0}: The following columns which exist in the current schema do not exist in the new schema or have different types : {1} - This kind of error is seen when you have changed the template schema, or changed the table schema directly in your environment.
  • If you see the error: { "code": "BadRequest", "message": "{\r\n \"error\": {\r\n \"code\": \"InvalidRequest\",\r\n \"message\": \"System topic source cannot be modified.\"\r\n }\r\n}" } - Your system topic already exists. Copy the existing system topic name from Blob Storage > Events and use it in the connector parameters.


Change Log:

  • Updated the template to deploy only the connector. All role assignments must now be done manually, because role assignment requires Owner-level access; the delay steps previously used for the role assignments have been removed, as they were causing deployment issues in some customer environments.
  • The documentation above has been fully updated to the new flow and captures the related updates.
  • Additionally, customers who have the firewall enabled on their Blob storage can use the Scuba IP whitelisting step, with a script for whitelisting hundreds of Microsoft Scuba IPs. The script has been uploaded to the same GitHub repo, and the link is included in the documentation above.
  • Recently, many Bad Request errors were seen while performing the connection. The error carries no details, but we have captured the likely causes above, along with the other kinds of errors seen while deploying or after deploying the template.