Microsoft Ignite 2024
Azure Stream Analytics Virtual Network Integration Goes GA!
We are thrilled to announce that the highly anticipated capability of running your Azure Stream Analytics (ASA) job in an Azure Virtual Network (VNET) is now generally available (GA)! This feature, previously in public preview, changes how you secure and manage your ASA jobs by leveraging the power of virtual networks.

What Does This Mean for You?

With VNET integration, you can now lock down access to your ASA jobs within your virtual network infrastructure. This provides enhanced security through network isolation, ensuring that your data remains protected and accessible only within your private network. By deploying a containerized instance of your ASA job inside your VNET, you can privately access your resources using:

- Private Endpoints: Connect your VNET-injected ASA job to your data sources privately via Azure Private Link. Your data traffic remains within the Azure backbone network, reducing exposure to the public internet and enhancing security.
- Service Endpoints: Connect your data sources directly to your VNET-injected ASA job, simplifying the network architecture through direct connectivity.
- Service Tags: Manage network security by defining rules that allow or deny traffic to Azure Stream Analytics, controlling which services can communicate with your ASA jobs.

Overall, VNET integration enhances the security of your ASA jobs by leveraging Azure's robust networking features.

Expanded Regional Availability

We are also excited to announce that this capability is now available in additional regions! Along with the existing regions (West US, Canada Central, East US, East US 2, Central US, West Europe, and North Europe), you can now enable VNET integration in the following regions:

- Australia East
- France Central
- North Central US
- Southeast Asia
- Brazil South
- Japan East
- UK South
- Central India

These regions were added in response to customer feedback. If you have suggestions for additional regions, please complete this form: https://forms.office.com/r/NFKdb3W6ti?origin=lprLink

This expansion ensures that more customers around the globe can benefit from the enhanced security and network isolation provided by VNET integration.

Getting Started

To get started with VNET integration for your ASA jobs, follow these steps:

1. Set up your VNET: Create or use an existing Azure Virtual Network.
2. Create a subnet: Add a dedicated subnet for your ASA job within the VNET.
3. Set up an Azure NAT Gateway or disable outbound connectivity: Enhance security and reliability by setting up an Azure NAT Gateway or disabling default outbound connectivity.
4. Associate a storage account: Ensure you have a General Purpose v2 (GPv2) storage account linked to your ASA job.
5. Configure your ASA job:
   - Azure portal: Go to Networking and select "Run this job in virtual network." Follow the prompts to configure and save.
   - Visual Studio Code: In the JobConfig.json file, set up the VirtualNetworkConfiguration to reference the subnet (see the JobConfig.json sketch below).
6. Check permissions: Make sure you have the necessary role-based access control permissions on the subnet or higher.

For detailed instructions and requirements, refer to the official documentation: Run your Stream Analytics in Azure virtual network - Azure Stream Analytics | Microsoft Learn.
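For reference, here is a minimal sketch of what the VirtualNetworkConfiguration section of a JobConfig.json might look like. The property names and placeholder values below are illustrative assumptions, not verbatim schema; consult the linked documentation for the exact format expected by the Visual Studio Code tools.

```json
{
  "//": "Illustrative sketch only - property names are assumptions; see the linked docs for the exact JobConfig.json schema.",
  "VirtualNetworkConfiguration": {
    "SubscriptionId": "<subscription-id-of-the-vnet>",
    "ResourceGroupName": "<resource-group-of-the-vnet>",
    "VirtualNetworkName": "<vnet-name>",
    "SubnetName": "<dedicated-asa-subnet-name>"
  }
}
```

The subnet referenced here should be the dedicated subnet created in step 2 above.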
Join the Revolution

Stay tuned for more updates and exciting features as we continue to innovate and improve Azure Stream Analytics. Our other Ignite releases include Azure Stream Analytics Kafka Connectors Are Now Generally Available! (see below). If you have any questions or need assistance, feel free to reach out to us at askasa@microsoft.com. Happy streaming!
Azure Stream Analytics Kafka Connectors Are Now Generally Available!

We are excited to announce that Kafka input and output with Azure Stream Analytics is now generally available! This marks a major milestone in our commitment to empowering our users with robust and innovative solutions. With the Stream Analytics Kafka connectors, users can natively read and write data to and from Kafka topics. This enables users to fully leverage Stream Analytics' rich capabilities and features even when the data resides outside of Azure. Azure Stream Analytics is a job service, so you do not have to spend time managing clusters, and downtime concerns are alleviated with a 99.99% Service Level Agreement (SLA) at the job level.

Key Benefits:

- A Stream Analytics job can ingest Kafka events from anywhere, process them, and output them to any number of Azure services as well as to other Kafka clusters.
- No need for workarounds such as MirrorMaker or Kafka extensions with Azure Functions to process Kafka data with Azure Stream Analytics.
- The solution is low code and entirely managed by the Azure Stream Analytics team at Microsoft.

Getting Started:

To get started with the Stream Analytics Kafka input and output connectors, refer to the links below:

- Stream data from Kafka into Azure Stream Analytics
- Kafka output from Azure Stream Analytics

You can add Kafka input or output to a new or an existing Stream Analytics job in a few simple clicks. To add Kafka input, go to Inputs under Job topology, click Add input, and select Kafka. For Kafka output, go to Outputs under Job topology, click Add output, and select Kafka. Next, you will be presented with the Kafka connection configuration. Once it is filled in, you can test the connection to the Kafka cluster.

VNET Integration:

You can connect to a Kafka cluster from Azure Stream Analytics whether it is in the cloud or on premises, provided it has a public endpoint. You can also securely connect to a Kafka cluster inside a virtual network with Azure Stream Analytics. Visit the Run your Azure Stream Analytics job in an Azure Virtual Network documentation for more information.

Automated Deployment with ARM Templates

ARM templates allow for quick and automated deployment of Stream Analytics jobs. To deploy a Stream Analytics job with Kafka input or output quickly and automatically, users can include the following sample snippet in their Stream Analytics job ARM template:

```json
"type": "Kafka",
"properties": {
  "consumerGroupId": "string",
  "bootstrapServers": "string",
  "topicName": "string",
  "securityProtocol": "string",
  "securityProtocolKeyVaultName": "string",
  "sasl": {
    "mechanism": "string",
    "username": "string",
    "password": "string"
  },
  "tls": {
    "keystoreKey": "string",
    "keystoreCertificateChain": "string",
    "keyPassword": "string",
    "truststoreCertificates": "string"
  }
}
```

A sketch showing where this datasource snippet sits inside a complete input resource appears at the end of this post.

We can't wait to see what you'll build with the Azure Stream Analytics Kafka input and output connectors. Try it out today and let us know your feedback. Stay tuned for more updates as we continue to innovate and enhance this feature.

Call to Action:

For direct help with using the Azure Stream Analytics Kafka input, please reach out to askasa@microsoft.com. To learn more about Azure Stream Analytics click here.
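As a companion to the datasource snippet above, here is a minimal sketch of a complete Kafka input resource in an ARM template. This follows the general shape of Stream Analytics input resources (a Stream input with a datasource and a serialization block); the apiVersion, resource names, and serialization settings shown are illustrative assumptions, so verify them against the official ARM reference before deploying.

```json
{
  "type": "Microsoft.StreamAnalytics/streamingjobs/inputs",
  "apiVersion": "2021-10-01-preview",
  "name": "<job-name>/<kafka-input-name>",
  "properties": {
    "type": "Stream",
    "datasource": {
      "type": "Kafka",
      "properties": {
        // Values below are placeholders; fill in your cluster's details.
        "consumerGroupId": "<consumer-group>",
        "bootstrapServers": "<host1:9093,host2:9093>",
        "topicName": "<topic>",
        "securityProtocol": "SASL_SSL",
        "sasl": {
          "mechanism": "PLAIN",
          "username": "<username>",
          "password": "<password>"
        }
      }
    },
    "serialization": {
      "type": "Json",
      "properties": { "encoding": "UTF8" }
    }
  }
}
```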
Generally Available: Protocol Buffers (Protobuf) with Azure Stream Analytics

We are excited to announce that the Azure Stream Analytics built-in Protobuf deserializer is now generally available! With the built-in deserializer, Azure Stream Analytics supports Protobuf out of the box, alongside the other event serialization formats we support, including JSON and Avro. Using Protobuf, you can handle data streams in a compact binary format, making it ideal for high-throughput, low-latency applications.

Protobuf

Protocol Buffers (Protobuf) is a language-neutral, platform-neutral mechanism for serializing structured data, developed by Google. It is widely used for communication between services and for saving data in a compact and efficient format.

Configure Your Stream Analytics Job

The Protobuf deserializer is simple to configure. When setting up your Stream Analytics input, select Protobuf as the event serialization format, then upload the Protobuf definition file (.proto file). Complete your configuration by specifying the message type and prefix style.

Steps to Configure a Stream Analytics Job

To set up your Stream Analytics job to deserialize events in Protobuf:

1. After creating your job, go to Inputs.
2. Click Add input and choose the desired input to configure.
3. Under Event serialization format, select Protobuf from the dropdown list.

The Protobuf-specific settings are:

- Protobuf definition file: A file that specifies the structure and data types of your Protobuf events.
- Message type: In a .proto schema, each message represents a data structure with specific fields. Enter the message type that you want to deserialize (see the sample .proto sketch at the end of this post).
- Prefix style: Determines how the message length is encoded, so that Protobuf events can be deserialized correctly.

We can't wait to see what you'll build with the Azure Stream Analytics built-in Protobuf deserializer. Try it out today and let us know your feedback. Stay tuned for more updates as we continue to innovate and enhance this feature.

Call to Action:

For direct help with using the Azure Stream Analytics Protobuf deserializer, please reach out to askasa@microsoft.com. To learn more about Azure Stream Analytics click here. To learn more about the Azure Stream Analytics Protobuf deserializer click here.
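As a reference for the settings above, here is a small example of a .proto definition file. The package, message, and field names are hypothetical; your actual schema will define your own messages.

```proto
// Hypothetical schema for illustration only - define your own messages.
syntax = "proto3";

package telemetry;

// "SensorReading" is what you would enter in the "Message type" setting
// when configuring the Protobuf input.
message SensorReading {
  string device_id = 1;   // unique identifier for the sensor
  double temperature = 2; // reading in degrees Celsius
  int64 event_time = 3;   // Unix epoch milliseconds
}
```

Each top-level message in the file is a candidate for the Message type setting; the deserializer maps that message's fields to columns in the Stream Analytics input.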