We are excited to announce that Kafka Input and Output with Azure Stream Analytics is now generally available! This marks a major milestone in our commitment to empowering our users with robust and innovative solutions. With Stream Analytics Kafka Connectors, users can natively read and write data to and from Kafka topics. This enables users to fully leverage Stream Analytics' rich capabilities and features even when the data resides outside of Azure. Azure Stream Analytics is a job service, so you do not have to spend time managing clusters, and downtime concerns are alleviated with a 99.99% SLA (Service Level Agreement) at the job level.
Key Benefits:
- A Stream Analytics job can ingest Kafka events from anywhere, process them, and output the results to any number of Azure services as well as to other Kafka clusters.
- No need for workarounds such as MirrorMaker or the Kafka extension for Azure Functions to process Kafka data with Azure Stream Analytics.
- The solution is low-code and fully managed by the Azure Stream Analytics team at Microsoft.
Getting Started:
To get started with the Stream Analytics Kafka input and output connectors, follow the steps below:
You can add a Kafka input or output to a new or existing Stream Analytics job in a few clicks. To add a Kafka input, go to Inputs under Job topology, select Add input, and choose Kafka. For a Kafka output, go to Outputs under Job topology, select Add output, and choose Kafka. You will then be presented with the Kafka connection configuration; once it is filled in, you can test the connection to your Kafka cluster.
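To illustrate, the connection configuration captures values along the lines of the sketch below. This is only a hypothetical example: the broker address, topic name, consumer group, and credential placeholders are illustrative, and the exact field names and allowed values for the security protocol and SASL mechanism are those shown in the portal and in the ARM schema later in this post.

    {
      "bootstrapServers": "kafka-broker.contoso.com:9093",
      "topicName": "clickstream-events",
      "consumerGroupId": "asa-consumer-group",
      "securityProtocol": "SaslSsl",
      "sasl": {
        "mechanism": "Plain",
        "username": "<username>",
        "password": "<password or Key Vault secret reference>"
      }
    }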
VNET Integration:
You can connect to a Kafka cluster from Azure Stream Analytics whether it is in the cloud or on-premises, as long as it exposes a public endpoint. You can also securely connect to a Kafka cluster inside a virtual network with Azure Stream Analytics. Visit the Run your Azure Stream Analytics job in an Azure Virtual Network documentation for more information.
Automated Deployment with ARM Templates:
ARM templates allow for quick, repeatable deployment of Stream Analytics jobs. To deploy a Stream Analytics job with a Kafka input or output, include the following sample snippet in the job's ARM template.
"type": "Kafka",
"properties": {
"consumerGroupId": "string",
"bootstrapServers": "string",
"topicName": "string",
"securityProtocol": "string",
"securityProtocolKeyVaultName": "string",
"sasl": {
"mechanism": "string",
"username": "string",
"password": "string"
},
"tls": {
"keystoreKey": "string",
"keystoreCertificateChain": "string",
"keyPassword": "string",
"truststoreCertificates": "string"
}
}
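For context, here is a minimal sketch of how that snippet might sit inside a full ARM template as the datasource of a streaming input. The resource name, apiVersion, serialization settings, and example values are illustrative assumptions, not taken from the product documentation, so adjust them to match your own job definition.

    {
        "type": "Microsoft.StreamAnalytics/streamingjobs/inputs",
        "apiVersion": "2021-10-01-preview",
        "name": "myStreamAnalyticsJob/kafka-input",
        "properties": {
            "type": "Stream",
            "datasource": {
                "type": "Kafka",
                "properties": {
                    "consumerGroupId": "asa-consumer-group",
                    "bootstrapServers": "kafka-broker.contoso.com:9093",
                    "topicName": "clickstream-events",
                    "securityProtocol": "SaslSsl",
                    "sasl": {
                        "mechanism": "Plain",
                        "username": "<username>",
                        "password": "<password>"
                    }
                }
            },
            "serialization": {
                "type": "Json",
                "properties": {
                    "encoding": "UTF8"
                }
            }
        }
    }

Once the template is assembled, it can be deployed like any other Stream Analytics job template, for example with az deployment group create.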
We can’t wait to see what you’ll build with Azure Stream Analytics Kafka input and output connectors. Try it out today and let us know your feedback. Stay tuned for more updates as we continue to innovate and enhance this feature.
Call to Action:
- For direct help with using the Azure Stream Analytics Kafka input, please reach out to askasa@microsoft.com.
- To learn more, visit the Azure Stream Analytics documentation.