Apache Kafka
Azure Stream Analytics Kafka Connectors is Now Generally Available!
We are excited to announce that Kafka input and output for Azure Stream Analytics is now generally available! This marks a major milestone in our commitment to empowering our users with robust and innovative solutions. With the Stream Analytics Kafka connectors, users can natively read and write data to and from Kafka topics, taking full advantage of Stream Analytics' rich capabilities and features even when the data resides outside of Azure. Azure Stream Analytics is a job service, so you do not have to spend time managing clusters, and downtime concerns are alleviated with a 99.99% SLA (Service Level Agreement) at the job level.

Key Benefits:
- A Stream Analytics job can ingest Kafka events from anywhere, process them, and output them to any number of Azure services as well as to other Kafka clusters.
- There is no need for workarounds such as MirrorMaker or Kafka extensions with Azure Functions to process Kafka data with Azure Stream Analytics.
- The solution is low code and entirely managed by the Azure Stream Analytics team at Microsoft.

Getting Started:
To get started with the Stream Analytics Kafka input and output connectors, refer to the links below:
- Stream data from Kafka into Azure Stream Analytics
- Kafka output from Azure Stream Analytics

You can add Kafka input or output to a new or an existing Stream Analytics job in a few simple clicks. To add Kafka input, go to Inputs under Job topology, click Add input, and select Kafka. For Kafka output, go to Outputs under Job topology, click Add output, and select Kafka. Next, you will be presented with the Kafka connection configuration. Once it is filled in, you can test the connection to the Kafka cluster.

VNET Integration:
You can connect to a Kafka cluster from Azure Stream Analytics whether it is in the cloud or on premises, provided it has a public endpoint. You can also securely connect to a Kafka cluster inside a virtual network with Azure Stream Analytics. Visit the Run your Azure Stream Analytics job in an Azure Virtual Network documentation for more information.

Automated deployment with ARM templates:
ARM templates allow for quick and automated deployment of Stream Analytics jobs. To deploy a Stream Analytics job with Kafka input or output quickly and automatically, include the following sample snippet in the Stream Analytics job ARM template:

```json
{
  "type": "Kafka",
  "properties": {
    "consumerGroupId": "string",
    "bootstrapServers": "string",
    "topicName": "string",
    "securityProtocol": "string",
    "securityProtocolKeyVaultName": "string",
    "sasl": {
      "mechanism": "string",
      "username": "string",
      "password": "string"
    },
    "tls": {
      "keystoreKey": "string",
      "keystoreCertificateChain": "string",
      "keyPassword": "string",
      "truststoreCertificates": "string"
    }
  }
}
```

We can't wait to see what you'll build with the Azure Stream Analytics Kafka input and output connectors. Try it out today and let us know your feedback. Stay tuned for more updates as we continue to innovate and enhance this feature.

Call to Action:
For direct help with using the Azure Stream Analytics Kafka input, please reach out to askasa@microsoft.com. To learn more about Azure Stream Analytics, click here.
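Once a Kafka input is configured and the connection test passes, a quick way to verify the end-to-end flow is to publish a few test events to the topic the job reads from. Below is a minimal producer sketch using the open-source kafka-python client; the broker address, topic name, and SASL credentials are placeholders, and your cluster's security configuration may differ:

```python
# pip install kafka-python
import json
from kafka import KafkaProducer

# Placeholder connection details -- substitute your own broker, topic,
# and credentials. SASL_SSL with the PLAIN mechanism is shown as one
# common setup; adjust to match your cluster.
producer = KafkaProducer(
    bootstrap_servers="my-kafka-broker:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="my-user",
    sasl_plain_password="my-password",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Send a few JSON test events for the Stream Analytics job to pick up.
for i in range(5):
    producer.send("my-topic", {"deviceId": i, "temperature": 20 + i})

producer.flush()  # block until all queued events are delivered
```

If these events show up in the job's input preview, the connector is wired up correctly.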
Dynamically adding partitions to a topic/event hub under Event Hubs namespace

Event Hubs provides partitions to scale consumers for parallel processing. The concept of partitions belongs to topics under an Event Hubs namespace: topics help categorize the incoming messages, and each consumer in a consumer group processes events from one of the topic's partitions. The number of partitions is specified when a topic is created. In some special cases, however, you may have to add partitions after the topic has been created, which requires the service to accommodate partition additions dynamically. This blog describes the behavior of adding partitions to an existing topic with Event Hubs. Dynamic addition of partitions is available only on Dedicated Event Hubs clusters, not on a Standard Event Hubs namespace.
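The post describes service behavior rather than a specific client, but because Event Hubs exposes a Kafka-compatible endpoint, one way to grow an existing topic's partition count is the standard Kafka admin API. Here is a minimal sketch, assuming the kafka-python package and a Dedicated-tier cluster; the namespace, topic name, and connection string are placeholders:

```python
# pip install kafka-python
from kafka.admin import KafkaAdminClient, NewPartitions

# Placeholder values -- use your Dedicated-tier namespace's Kafka
# endpoint (port 9093) and a connection string with Manage rights.
admin = KafkaAdminClient(
    bootstrap_servers="mynamespace.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=<keyname>;SharedAccessKey=<key>",
)

# Raise the partition count of an existing topic (event hub) to 12.
# Partition counts can only grow; they can never be reduced.
admin.create_partitions({"my-topic": NewPartitions(total_count=12)})
```

Keep in mind that increasing the partition count changes how keyed events map to partitions, which can affect per-key ordering for consumers.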
New Akka Streams and Apache Flink tutorials/samples for Event Hubs for Apache Kafka Ecosystems

First published on Jun 13, 2018.

Introducing two new Event Hubs for Apache Kafka Ecosystems tutorials: Akka Streams and Apache Flink! Event Hubs for Kafka Ecosystems combines the power and simplicity of Event Hubs with the wide ecosystem of Apache Kafka.
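Both tutorials target JVM frameworks, but the underlying point is that any Apache Kafka client can talk to Event Hubs by pointing at the namespace's Kafka endpoint. As an illustration only (not taken from the tutorials), here is a minimal consumer sketch using the kafka-python package; the namespace, topic, consumer group, and connection string are placeholders:

```python
# pip install kafka-python
from kafka import KafkaConsumer

# Placeholder namespace and connection string. Event Hubs accepts Kafka
# clients on port 9093 using SASL_SSL with the PLAIN mechanism, where the
# username is the literal string "$ConnectionString" and the password is
# the namespace connection string.
consumer = KafkaConsumer(
    "my-topic",
    bootstrap_servers="mynamespace.servicebus.windows.net:9093",
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=<keyname>;SharedAccessKey=<key>",
    group_id="my-consumer-group",
    auto_offset_reset="earliest",
)

# Print each received event's coordinates and raw payload.
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)
```

The Akka Streams and Flink tutorials configure this same endpoint and SASL pattern in their own clients.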