Integrating the F5 BIG-IP with Azure Sentinel

So here’s the deal: I have a few F5 BIG-IP VEs deployed across the globe protecting my cloud-hosted applications. It sure would be nice if there were a way to send all that event and statistical data to my Azure Sentinel workspace. Well, guess what? There is, and yes, it is nice.

The Application Services 3 (AS3) extension is a relatively new mechanism for declaratively configuring application-specific resources on a BIG-IP system. This involves posting a JSON declaration to the system’s API endpoint (https://<BIG-IP>/mgmt/shared/appsvcs/declare).

Telemetry Streaming (TS) is an F5 iControl LX extension that, when installed on the BIG-IP, enables you to declaratively aggregate, normalize, and forward statistics and events from the BIG-IP. This control-plane data can be streamed to Azure Log Analytics by posting a single TS JSON declaration to TS’s API endpoint (https://<BIG-IP>/mgmt/shared/telemetry/declare).
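
Both extensions follow the same pattern: build a JSON declaration and POST it to the relevant endpoint. As a rough Python sketch of that pattern (the host name, credentials, and helper names here are hypothetical, and the actual POST needs the third-party `requests` package plus BIG-IP admin credentials):

```python
# Endpoint paths for the two iControl LX extensions, per the text above.
ENDPOINTS = {
    "as3": "/mgmt/shared/appsvcs/declare",
    "ts": "/mgmt/shared/telemetry/declare",
}

def declare_url(bigip_host: str, extension: str) -> str:
    """Build the full declaration URL for an AS3 or TS POST."""
    return f"https://{bigip_host}{ENDPOINTS[extension]}"

def post_declaration(url: str, declaration: dict, auth) -> None:
    """Sketch of the POST itself; requires `requests` and admin
    credentials for the BIG-IP (not exercised here)."""
    import requests  # third-party: pip install requests
    requests.post(url, json=declaration, auth=auth, verify=False)

print(declare_url("bigip.example.com", "ts"))
# https://bigip.example.com/mgmt/shared/telemetry/declare
```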

Events and stats can be collected and aggregated from multiple BIG-IPs regardless of whether they reside in Azure, on-premises, or in other public/private clouds.

Let’s take a quick look at how I set up my BIG-IP and Azure Sentinel. Since this post is not meant to be prescriptive guidance, I have included links to relevant guidance where appropriate. Okay, let’s have some fun!

So I don’t want to sound too biased here but, with that said, the F5 crew has put out some excellent guidance on Telemetry Streaming. The CloudDocs site includes information for various cloud-related F5 technologies and integrations. Refer to the installation section for detailed guidance.

Install the Plug-in

The TS plug-in RPM can be downloaded from the F5 Networks GitHub repo.

  1. From the BIG-IP management GUI, I navigated to iApps –> Package Management LX and selected ‘Import’.
  2. I selected ‘Choose File’, then browsed to and selected the downloaded RPM.

With the TS extension installed, I can now configure streaming via the newly created REST API endpoint. You may have noticed that I previously installed the Application Services 3 (AS3) extension. AS3 is a powerful F5 extension that enables application-specific configuration of the BIG-IP via a declarative JSON REST interface.

Configure Logging Profiles and Streaming on BIG-IP

As I mentioned above, I could make use of the AS3 extension to configure my BIG-IP with the necessary logging resources. With AS3, I can post a single JSON declaration (I used Postman to apply it) that configures event listeners for my various deployed modules. In my deployment, I’m currently using Local Traffic Manager and Advanced WAF. That said, I went a little “old school” and configured the BIG-IP via the management GUI and TMSH CLI. Regardless of the method you prefer, the installation instructions provide detailed guidance for each log configuration method.

LTM Logging

To enable LTM request logging, I ran the following two TMSH commands. Afterwards, I enabled request logging on the virtual server to begin streaming data to Azure Log Analytics.

  1. Create Listener Pool - create ltm pool telemetry-local monitor tcp members replace-all-with { }
  2. Create LTM Request Log Profile - create ltm profile request-log telemetry request-log-pool telemetry-local request-log-protocol mds-tcp request-log-template event_source=\"request_logging\",hostname=\"$BIGIP_HOSTNAME\", client_ip=\"$CLIENT_IP\",server_ip=\"$SERVER_IP\", http_method=\"$HTTP_METHOD\", http_uri=\"$HTTP_URI\", virtual_name=\"$VIRTUAL_NAME\",event_timestamp=\"$DATE_HTTP\" request-logging enabled
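
The request-log template above emits each request as a line of comma-separated key="value" pairs. To make that concrete, here is a small sketch that parses such a line back into its fields — the sample line and its values are illustrative, not captured output:

```python
import re

# A sample line shaped like the request-log template above would emit it
# (hostnames, IPs, and URI here are made up for illustration).
sample = ('event_source="request_logging",hostname="bigip-east-1",'
          'client_ip="203.0.113.10",server_ip="10.0.1.20",'
          'http_method="GET",http_uri="/index.html",'
          'virtual_name="/Common/app_vs",'
          'event_timestamp="Wed, 12 Jun 2019 17:00:00 GMT"')

def parse_event(line: str) -> dict:
    """Split a key="value" event line into a field dict.
    Quoted matching keeps commas inside values (like the timestamp) intact."""
    return dict(re.findall(r'(\w+)="([^"]*)"', line))

fields = parse_event(sample)
print(fields["http_method"], fields["virtual_name"])
```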

ASM (Advanced WAF) Logging

To enable ASM event logging, I ran the following TMSH command. Afterwards, I simply needed to associate my security logging profile with my application virtual servers.

  1. Create Security Log Profile – create security log profile telemetry application replace-all-with { telemetry { filter replace-all-with { request-type { values replace-all-with { all } } } logger-type remote remote-storage splunk servers replace-all-with { {} } } }
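
The security log profile forwards events over TCP to the Telemetry Streaming listener (port 6514 in this walkthrough). As a way to visualize that flow without a BIG-IP, here is a self-contained sketch where a tiny local TCP server stands in for the TS listener and receives one ASM-style event — the port, field names, and values are illustrative:

```python
import socket
import threading

received = []

def tiny_listener(server_sock):
    """Stand-in for the TS Telemetry_Listener: accept one connection,
    record what arrives."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(conn.recv(4096).decode())

server = socket.socket()
server.bind(("127.0.0.1", 0))   # ephemeral port instead of 6514
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=tiny_listener, args=(server,))
t.start()

# An ASM-style security event, shaped like what the log profile forwards.
event = ('unit_hostname="bigip-east-1",policy_name="/Common/asm_policy",'
         'request_status="blocked"')
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(event.encode())

t.join()
server.close()
print(received[0])
```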

Streaming Data to Azure Log Analytics

With my BIG-IP configured for remote logging, I was now ready to configure my BIG-IPs to stream event data to my Azure Log Analytics workspace. This is accomplished by posting a JSON declaration to the TS API endpoint. The declaration (see example below) includes settings specifying the workspace ID, access passphrase, polling interval, etc. This information can be gathered from the Azure portal or via the Azure CLI. With the declaration applied to the BIG-IP, event/stat data now streams to my Azure workspace.
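
If you prefer scripting the declaration rather than hand-editing JSON, a small helper can assemble it before posting. This is a sketch: the class and property names follow the F5 Telemetry Streaming schema for an Azure_Log_Analytics consumer, the poller interval and listener port mirror this walkthrough, and the workspace ID and key shown are placeholders you would pull from the Azure portal:

```python
import json

def azure_ts_declaration(workspace_id: str, primary_key: str) -> dict:
    """Assemble a TS declaration with an Azure_Log_Analytics consumer."""
    return {
        "class": "Telemetry",
        "My_Poller": {"class": "Telemetry_System_Poller", "interval": 60},
        "My_Listener": {"class": "Telemetry_Listener", "port": 6514},
        "My_Consumer": {
            "class": "Telemetry_Consumer",
            "type": "Azure_Log_Analytics",
            "workspaceId": workspace_id,
            "passphrase": {"cipherText": primary_key},
        },
    }

# Placeholder values; substitute your real workspace ID and primary key.
decl = azure_ts_declaration("00000000-0000-0000-0000-000000000000",
                            "<INSERT PRIMARY KEY>")
print(json.dumps(decl, indent=2))
```

Posting the resulting dict to the TS endpoint (for example with Postman, curl, or `requests`) applies it to the BIG-IP.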

Utilize Azure Sentinel for Global Visibility and Analytics

With events and stats now streaming into my previously created OMS workspace from my BIG-IP(s), I can start to visualize and work with the aggregated data. From the OMS workspace I can aggregate data from my BIG-IPs as well as other sources and perform complex queries. I can then use the results to populate one or more custom dashboards.

Additionally, to get started quickly, I can deploy a pre-defined dashboard directly from the Azure OMS workspace. As of this post, F5 has a pre-canned dashboard for visualizing Advanced WAF and basic LTM event data.


Now I have a single pane of glass that can be pinned to my Azure portal for quick, near-real-time visibility of my globally deployed applications. Pretty cool, huh? Here’s the overall order and some relevant links:

  1. Setup Azure Sentinel and OMS Workspace
  2. Install and Configure Telemetry Streaming onto the BIG-IP(s)
  3. Configure logging on BIG-IP(s)


Published Jun 12, 2019
Version 1.0


  • Randyj:

    It appears that Sentinel has deprecated the use of ‘Dashboards’ in favor of ‘Workbooks’. I note there is an ‘F5 BIG-IP ASM’ workbook available; however, it appears that one of the ‘Required data types’ (F5Telemetry_LTM_CL) is not included in the logging we’ve set up by following this article.


    Data types shown in my 'Custom Logs' node in my Log Analytics workspace are:

    • F5Telemetry_ASM_CL
    • F5Telemetry_clientSslProfiles_CL
    • F5Telemetry_deviceGroups_CL
    • F5Telemetry_httpProfiles_CL
    • F5Telemetry_iRules_CL
    • F5Telemetry_ltmPolicies_CL
    • F5Telemetry_networkTunnels_CL
    • F5Telemetry_pools_CL
    • F5Telemetry_serverSslProfiles_CL
    • F5Telemetry_sslCerts_CL
    • F5Telemetry_system_CL
    • F5Telemetry_telemetryEventCategory_CL
    • F5Telemetry_telemetryServiceInfo_CL
    • F5Telemetry_virtualServers_CL

    Any thoughts on how to further configure or troubleshoot?

  • Hi,

    I am trying to follow these steps, but I got stuck at “Streaming Data to Azure Log Analytics,” where I need to create the JSON for sending logs to my workspace. Could you share a sample for performing this activity? The example above doesn’t include these instructions, which is why I’m asking.





  • Hello Roberto,

    My apologies for the delayed response. Below is a sample of the JSON I believe you are looking for to hook up TS streaming with an Azure workspace. This sample should follow the above walkthrough. With that said, Azure has now migrated from dashboards to Azure workbooks, which can now be found within Azure Sentinel.



       "class": "Telemetry",

       "controls": {

         "class": "Controls",

         "logLevel": "info"


       "My_Poller": {

         "class": "Telemetry_System_Poller",

         "interval": 60


       "My_Listener": {

         "class": "Telemetry_Listener",

         "port": 6514


       "My_Consumer": {

         "class": "Telemetry_Consumer",

         "type": "Azure_Log_Analytics",

         "workspaceId": "<INSERT WORKSPACE ID>",

         "passphrase": {

           "cipherText": "<INSERT PRIMARY KEY>"




  • What version of F5 software is required to support the telemetry streaming iApp?

  • Hi,

    Thanks for the great walkthrough!

    However, I'm having an issue with the F5Telemetry_LTM_CL connector.

    I run 2x 1-NIC BIG-IPs with an Azure LB in front.


    I'm not allowed to create the LTM pool since it's referencing a self IP.

    I've tried following this on

    But it's possible I'm doing something wrong, the workaround is not very detailed so I'm asking if anyone could elaborate on it.

    Created VS with listener.

    Any help is appreciated!