REST
Sending Messages to Confluent Cloud topic using Logic App
With Logic Apps, we can create workflows that connect to various services and systems, allowing us to automate tasks and streamline business operations. In this blog post, we will explore how to use Azure Logic Apps to send messages to a Kafka Confluent topic. Currently, there is no out-of-the-box Kafka Confluent connector in Logic Apps. However, Kafka Confluent provides a REST API (see the Confluent Cloud API Reference Documentation). This sample shows how to use the HTTP action in a workflow to call the Kafka Confluent API that produces records to a topic.

Prerequisites

- An Azure account and access to the Azure Portal.
- Access to Confluent Cloud. Confluent Cloud is a fully managed, pay-as-you-go Kafka service. You can get a free trial here.

Setup the Kafka cluster and topic

If you are new to Confluent Kafka, you can check their tutorials: Quick Start for Confluent Cloud | Confluent Documentation.

1. Create a new Kafka cluster on Confluent Cloud.
2. Navigate to the cluster and click Cluster settings. Note the REST endpoint; we will use this endpoint in this example.
3. Create a new Kafka topic called "LAtest" using the default topic settings.
4. Create a new API key and secret: navigate to the cluster and, from the left menu, select Cluster Overview -> API Keys. Click "Create key" and follow the prompts to create a Global access API key. Note down the value of the key and secret.

To communicate with the REST API, we need to use this API key ID and the corresponding secret to create the base64-encoded string for the Authorization header that will be included in the REST calls. To learn more, see the Authentication section of the API documentation, which describes Cloud and Cluster API keys and base64 encoding.

Create the Logic App workflow

To produce a message to a topic, we need to provide JSON data and a base64-encoded API key and secret to the REST Produce endpoint: /kafka/v3/clusters/<cluster-id>/topics/<topic-name>/records.
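As a side note, the base64-encoded value for the Authorization header described above is simply the API key and secret joined with a colon and encoded. A minimal Python sketch, using made-up placeholder credentials rather than a real key pair:

```python
import base64

# Placeholder credentials -- substitute your Confluent Cloud API key and secret.
api_key = "MYKEY"
api_secret = "MYSECRET"

# HTTP Basic auth expects "key:secret" encoded as base64.
token = base64.b64encode(f"{api_key}:{api_secret}".encode("ascii")).decode("ascii")
auth_header = f"Basic {token}"
print(auth_header)
```

The resulting string is what goes after "Basic " in the Authorization header of the REST calls below.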
Below is a sample REST call (non-streaming mode):

```
curl \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic <base64-encoded API key and secret>" \
  https://xxx.gcp.confluent.cloud:443/kafka/v3/clusters/lkc-mxpx52/topics/LAtest/records \
  -d '{"value":{"type":"JSON","data":"Hello World!"}}'
```

In the Logic App workflow, we can add a "When a HTTP request is received" trigger to fire the workflow for testing. Then we add an HTTP action configured to match the call above (POST method, the Produce endpoint URI, the Content-Type and Authorization headers, and the JSON body).

Run the workflow

We can send the body message to the target topic successfully. View the Messages tab of the LAtest topic on the Confluent Cloud UI to confirm the record arrived.

Deploy Logic App Standard with Application Routing Feature Based on Terraform and Azure Pipeline
Due to Terraform's cross-cloud compatibility, automation, and efficient execution, among many other advantages, more and more customers use it to deploy integration solutions based on Azure Logic App Standard. However, despite extensive contributions from the community and individual contributors providing Terraform templates and VNET integration solutions for Logic App Standard, there are still very few Terraform templates covering the "Application routing" and "Configuration routing" settings.

This article shares a mature plan to deploy a Logic App Standard and then set the mentioned routing features automatically. It is based on a Terraform template and an Azure DevOps pipeline.

Code reference: https://github.com/serenaliqing/LAStandardTerraformDeployment/tree/main/Terraform-Deployment-Demo

About the Terraform template

Please find the template in the directory Terraform/LAStandard.tf. It includes the Terraform definitions for the Logic App Standard, the backend storage account, Application Insights, the virtual network, and the VNET integration settings.

About the VNET routing configuration

Because there are no Terraform examples available for VNET routing, we add the VNET settings by sending a PATCH request to the ARM REST API endpoint for the Logic App Standard site:

https://management.azure.com/subscriptions/<Your subscription id>/resourceGroups/$(deployRG)/providers/Microsoft.Web/sites/$(deployLA)?api-version=2022-03-01

We figured out the required request body from a network trace; it has the following format:

```
{
  "properties": {
    "vnetContentShareEnabled": false,
    "vnetImagePullEnabled": true,
    "vnetRouteAllEnabled": false,
    "vnetBackupRestoreEnabled": false
  }
}
```

Please find the YAML file in TerraformPipeline/logicappstandard-terraform.yml. Within the YAML file, the "AzureCLI@2" task is used to send the PATCH request via an Azure CLI command.
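To make the shape of that PATCH request concrete, here is a small Python sketch that assembles the URL and body described above. The subscription ID, resource group, and site name are placeholders, and the sending step (Azure CLI or any HTTP client with an ARM bearer token) is left to the pipeline:

```python
import json


def build_vnet_patch(subscription_id: str, resource_group: str, site_name: str):
    """Build the ARM PATCH URL and JSON body for the VNET routing settings."""
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Web/sites/{site_name}"
        "?api-version=2022-03-01"
    )
    body = {
        "properties": {
            "vnetContentShareEnabled": False,
            "vnetImagePullEnabled": True,
            "vnetRouteAllEnabled": False,
            "vnetBackupRestoreEnabled": False,
        }
    }
    return url, json.dumps(body)


# Placeholder values -- substitute your own subscription, group, and site.
url, body = build_vnet_patch("<subscription-id>", "my-rg", "my-logicapp")
print(url)
# The request can then be sent with `az rest --method patch` (as the
# AzureCLI@2 task does) or any HTTP client holding an ARM bearer token.
```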
Special tips: to use the Terraform tasks during an Azure Pipelines run, you must install the Terraform extension, which you can find at the following link: https://marketplace.visualstudio.com/items?itemName=ms-devlabs.custom-terraform-tasks

References:
- Deploy Logic App Standard with Terraform and Azure DevOps pipelines
- https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/app_service
- https://azure.microsoft.com/en-us/products/devops/pipelines

Unable to attach binary files for Azure DevOps REST API
I was trying to upload binary files using the Azure DevOps REST API.

Reference: https://docs.microsoft.com/en-us/rest/api/azure/devops/wit/attachments/create?view=azure-devops-rest-6.0#upload-a-binary-file

I was trying to upload "ATTACHMENT_TEST.zip" (ref: https://drive.google.com/file/d/15Y3IS0BWoCaMo7kjt6t1ZCdHNH_PGT65/view?usp=sharing).

I converted ATTACHMENT_TEST.zip to base64:

UEsDBBQAAAAAALZIcVSXxhNiBAAAAAQAAAAIAAAAVEVTVC50eHRTSUREUEsBAhQAFAAAAAAAtkhxVJfGE2IEAAAABAAAAAgAAAAAAAAAAQAgAAAAAAAAAFRFU1QudHh0UEsFBgAAAAABAAEANgAAACoAAAAAAA==

and tried to add the base64 as JSON in the payload. The URL produced by the output is giving me an invalid zip.

Code:

```python
import requests
import json

url = "https://dev.azure.com/{Organization}/{ProjectName}/_apis/wit/attachments?uploadType=Simple&api-version=6.0&fileName=app.zip"

payload = json.dumps("[UEsDBBQAAAAAALZIcVSXxhNiBAAAAAQAAAAIAAAAVEVTVC50eHRTSUREUEsBAhQAFAAAAAAAtkhxVJfGE2IEAAAABAAAAAgAAAAAAAAAAQAgAAAAAAAAAFRFU1QudHh0UEsFBgAAAAABAAEANgAAACoAAAAAAA==]")

headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Basic $AuthKey',
    'Cookie': 'VstsSession=%7B%22PersistentSessionId%22%3A%22fe6c3302-6671-4bfc-9cbe-0d33f145a31f%22%2C%22PendingAuthenticationSessionId%22%3A%2200000000-0000-0000-0000-000000000000%22%2C%22CurrentAuthenticationSessionId%22%3A%2200000000-0000-0000-0000-000000000000%22%2C%22SignInState%22%3A%7B%7D%7D'
}

response = requests.request("POST", url, headers=headers, data=payload)
print(response.text)
```

Getting News articles using REST API
Hello,

We have a newsroom site where we display all news articles. I need to create a REST API call to read this information and display it in our React-based app. I am unable to find a URL that would provide me with this information. I have tried the following URLs, but I am not getting the desired results:

https://<tenant>.sharepoint.com/sites/iNewsroom/_api/Web/Lists/getByTitle('Site%20Pages')/items?$select=*
https://<tenant>.sharepoint.com/_api/search/query?querytext=%27IsDocument:True%20AND%20FileExtension:aspx%20AND%20PromotedState:2%27
https://<tenant>.sharepoint.com/search/_api/search/query?querytext=%27IsDocument:True%20AND%20FileExtension:aspx%20AND%20PromotedState:2%27

Is there any other URL I can try to get the information? The newsroom site has a few site pages.

Query web API and return JSON data
```
curl -X GET "https://api.server.com/v1/markets/quotes?symbols=AAPL,VXX190517P00016000&greeks=false" \
  -H 'Authorization: Bearer <TOKEN>' \
  -H 'Accept: application/json'
```

How do I run this REST/JSON API call in SQL Server directly? I believe it is a combination of the procedures below, but I could not figure out the exact syntax:

sp_OACreate
sp_OAMethod
sp_OAGetProperty

The Python version is here:

```python
# Version 3.6.1
import requests

response = requests.get(
    'https://api.server.com/v1/markets/quotes',
    params={'symbols': 'AAPL,VXX190517P00016000', 'greeks': 'false'},
    headers={'Authorization': 'Bearer <TOKEN>', 'Accept': 'application/json'}
)
json_response = response.json()
print(response.status_code)
print(json_response)
```

DevSum Special: Don't Write Code Like "Legenden Leo" - Distributed Systems - Season 3, Ep. 42
In this week's episode, recorded live at the DevSum conference (https://devsum.se), we are happy to welcome Dylan Beattie! Dylan has an impressive track record, with experience ranging from speaking at hundreds of conferences to teaching complex architecture courses. Perhaps his most unique contribution, though, is his creative reinterpretation of the song "We didn't start the fire", reworked to be about JavaScript frameworks! In this episode we touch on everything from Dylan's view of YouTube comments to the human ego; with a hint of IT history, we dive deep into topics such as distributed systems and event-driven architecture, and explore how team structure can affect workflows. But that is not all: we also look at many other exciting topics, covering everything from REST, SOAP, and gRPC to telemetry and "Legenden Leo" (Legenden Leo - Kungliga slotten). So sit back and enjoy a discussion that offers both insights and entertainment, whether you are an experienced developer or just have a general interest in the tech world. Don't miss this episode! Listen to the episode here.

Cascading dropdown on quick edit mode?
Hi,

Is it possible for a cascading dropdown to work in quick edit mode? I have asked around about how to make it work, and this guide was given to me: https://www.c-sharpcorner.com/blogs/cascading-dropdownlist-in-sharepoint?fbclid=IwAR3zoSBZRiIr1gg6lNBPEmNdj3buiqeRmtQ13feE2t4dnp_JPpmFeCfJM4w

The problem is that every time I paste the given code into a Script Editor web part, it shows as-is and the cascade does not work.

Any advice to lead me in the right direction, please? Thanks!
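For reference, the cascading behaviour itself is just a small piece of JavaScript: when the parent dropdown changes, look up the child options for the selected value and rebuild the child dropdown. A minimal sketch of that core lookup logic (the parent/child data here is made up for illustration; in the linked guide the values come from SharePoint lists, and wiring this into quick edit mode is the open question):

```javascript
// Made-up parent -> child mapping for illustration only.
const childrenByParent = {
  Europe: ["Sweden", "France"],
  Asia: ["Japan", "India"],
};

// Core cascading logic: return the child options for the selected parent.
function getChildOptions(parent) {
  return childrenByParent[parent] || [];
}

// In the browser this would run on the parent <select>'s "change" event,
// clearing the child <select> and appending one <option> per returned value.
console.log(getChildOptions("Europe"));
```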