Latest Discussions
Azure OpenAI with SharePoint
Dear Forum Members, our intranet is built on SharePoint Online. I am looking for ways to improve the SPO search results with the help of Azure OpenAI. Does anyone have experience with this, or guidance on how to get started? Thanks.
nhtkid · Dec 16, 2022 · Iron Contributor · 3K Views · 3 Likes · 0 Comments
Issues with the new Azure ML Notebook VM
Hi Team, I'm assisting a Partner in Germany that has started working with the new Azure Machine Learning compute instance (NC6): https://docs.microsoft.com/en-us/azure/machine-learning/concept-compute-instance Because this is in preview, I thought it would be a good idea to relay the Partner's feedback to you. Please take a quick look over these issues and let me know whether some of them are known, or whether I should redirect the Partner to technical support. Here are the Partner's questions:
How should I manage conda environments in the notebook VM's JupyterLab? Currently I create environments and add them to the selectable kernels in JupyterLab via "ipykernel install". I also tried nb_conda, but that didn't work. Now I'm seeing some strange behaviour: after activating the environment, it seems like I can only use the base env's pip. E.g. if I install a PyPI module inside the env, it is not available in the notebook; if I install the module via "conda install", it is. "pip install abc" -> not working; "python -m pip install abc" -> not working; "conda install abc" -> working. Currently I want to use spaCy, and for that I need to install a language model ("python -m spacy download abc"), which is then not accessible in the notebook.
Is it possible to trigger a shutdown of the VM after a notebook execution has finished, similar to Databricks notebooks?
Is it possible to use JupyterLab extensions? After activating the extension manager I get an error message in the extension tab: "Error communicating with server extension. Consult the documentation for how to ensure that it is enabled. Reason given: Error: 500 (Internal Server Error)". I tried the docs and the suggested commands; that didn't help.
I suggested that the Partner look into the first-experiment tutorial mentioned on our site, to get familiar with an already tested workflow: https://docs.microsoft.com/en-us/azure/machine-learning/tutorial-1st-experiment-sdk-train The Partner replied that they take another route: in our tutorial the Jupyter notebooks are run in the Azure ML Notebooks section, but the Partner starts JupyterLab through the Compute section of Azure ML and does their dev setup (conda envs, module installations, etc.) through the JupyterLab terminal. Is this particular practice atypical, or not a best practice?
For any suggestions, or contact info for people who might be interested in this topic, thank you in advance. Thanks, Sorin
sorinM · Jun 09, 2020 · Microsoft · 2.5K Views · 0 Likes · 1 Comment
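On the pip-versus-conda point above, the usual culprit is that the selected Jupyter kernel is not running the interpreter of the activated env, so a plain pip from the terminal installs into the base environment. A minimal sketch for checking and working around this from inside a notebook cell; the env path in the comment is only an example of what you might see, and en_core_web_sm stands in for whichever spaCy model is needed:

```python
import subprocess
import sys

# Show which interpreter the active kernel is really using. If this prints the
# base environment (for example something like /anaconda/envs/azureml_py36/bin/python)
# rather than the custom conda env, a terminal "pip install" lands in the wrong place.
print(sys.executable)

# Installing through the kernel's own interpreter puts the package (and the spaCy
# language model) where this notebook can actually import it.
subprocess.check_call([sys.executable, "-m", "pip", "install", "spacy"])
subprocess.check_call([sys.executable, "-m", "spacy", "download", "en_core_web_sm"])
```

If sys.executable does point at the base env, re-registering the kernel from inside the activated env (python -m ipykernel install --user --name <env>) usually brings the kernel and the environment back in line.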
Model Catalogue - Azure Machine Learning Lab
During the Microsoft Build 2023 event, the concept of a Model Catalog was introduced in the Azure Machine Learning Lab. It is one of the easiest ways to develop and deploy a machine learning model: readily available models can be immediately deployed and consumed by end users. The Azure Machine Learning team is making things easier for developers and consumers. But I have a few questions for them: 1. We already have some readily consumable AI services under Azure Cognitive Services. What additional benefit will the Azure Machine Learning model catalog bring? 2. How frequently will the Azure ML team update the catalog? Regards PK
praveenkumar558 · Jun 04, 2023 · Copper Contributor · 2.3K Views · 0 Likes · 0 Comments
Can we use Azure Machine Learning with Azure Analysis Services?
Hello, this is my first post ever, since I was hired as a Data Scientist after graduating this month. Of course, I have no experience with any Microsoft cloud technologies (yet), and I ask for forgiveness in advance if I mix up concepts. Also, I tried to search for similar answers in the forum but sadly found nothing. In our business, we deployed an Azure Analysis Services instance with our data models (I think it's called a data lake). Most of these models are used for reporting in Power BI. Now we would like to explore deeper kinds of analysis using machine learning techniques in Azure Machine Learning. The basic problem is: how do I access the information in the tabular models in Azure Analysis Services from Azure Machine Learning? Is this new platform (Azure ML) able to do that easily, without any tricks? For example, we tried to run some queries (against Azure Analysis Services) from our local machines, using Python and the pyodbc library. This never worked, and there is no information about it (at least none I've found) on the internet. The reason to use Python for the queries is practical: you make the query and can keep working in the same notebook without downloading anything externally, then run some machine learning algorithm (classification, regression, ...) on the data returned by the query as part of an exploratory data analysis. So our idea would be to replace these local machines with something in the cloud that I hope has a direct way of working with that data, and also better hardware. Am I wrong, right...? Do you have any suggestions on how we should do things? Please, correct me! Thank you for your answer!
RogerPou · Dec 24, 2020 · Copper Contributor · 2.2K Views · 1 Like · 1 Comment
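On the pyodbc attempt above: pyodbc speaks ODBC/SQL, while Azure Analysis Services tabular models are queried with DAX or MDX over the XMLA endpoint through the ADOMD.NET client, which is why that connection never worked. One community option is the pyadomd package, a thin Python wrapper around that client. A rough sketch only, assuming pyadomd and the ADOMD.NET client library are installed (which effectively ties this to a Windows machine or DSVM), and with the server, model, credentials and table name as placeholders:

```python
import pandas as pd
from pyadomd import Pyadomd  # community wrapper around the ADOMD.NET client

# Placeholders: replace with your own Analysis Services server, model and credentials.
conn_str = (
    "Provider=MSOLAP;"
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;"
    "Initial Catalog=MyTabularModel;"
    "User ID=someone@contoso.com;Password=<password>;"
)

# A DAX query that returns an entire table from the tabular model.
dax_query = "EVALUATE 'Sales'"

with Pyadomd(conn_str) as conn:
    with conn.cursor().execute(dax_query) as cur:
        df = pd.DataFrame(cur.fetchall(), columns=[col.name for col in cur.description])

print(df.head())
```

From there the DataFrame can feed any scikit-learn-style workflow in an Azure ML notebook. If the ADOMD dependency is a blocker, the other common pattern is to have the model's source data land in a data lake or SQL layer that Azure ML datasets can read directly.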
Azure ML Inference Cluster - AKS with Private IP
I have an AKS cluster in a VNet/subnet, and the cluster is attached to Azure ML. I successfully deployed an Azure ML service to that AKS cluster. However, I see that the azureml-fe service is responding on a public IP rather than a private IP from my VNet/subnet. How can I make my Azure ML inference service exposed on a private IP?
aferland · Mar 12, 2021 · Copper Contributor · 1.8K Views · 0 Likes · 1 Comment
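For reference, the v1 Python SDK exposes an internal-load-balancer option when provisioning the AKS inference cluster, which is the usual way to give azureml-fe a private IP from your subnet. A rough sketch from memory; the resource names are placeholders, and the exact parameter set is worth re-checking against the current AksCompute docs since this area changed between SDK releases:

```python
from azureml.core import Workspace
from azureml.core.compute import AksCompute, ComputeTarget

ws = Workspace.from_config()

# Placeholders: substitute your own resource group, VNet and subnet names/ranges.
prov_config = AksCompute.provisioning_configuration(
    vnet_resourcegroup_name="my-rg",
    vnet_name="my-vnet",
    subnet_name="aks-subnet",
    service_cidr="10.0.0.0/16",
    dns_service_ip="10.0.0.10",
    docker_bridge_cidr="172.17.0.1/16",
    load_balancer_type="InternalLoadBalancer",  # azureml-fe answers on a private IP
    load_balancer_subnet="aks-subnet",
)

aks_target = ComputeTarget.create(ws, name="aks-private", provisioning_configuration=prov_config)
aks_target.wait_for_completion(show_output=True)
```

An already-attached cluster can reportedly also be switched to an internal load balancer, but that path has moved around between releases, so the private-IP AKS deployment article in the Azure ML docs is the safer reference for an existing cluster.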
Stable Diffusion WebUI on Azure ML compute instance endpoint connection
Hello everybody, I'm trying to make the following setup work on Azure ML compute instances. On my local client machine (which could be Mac, Windows or Linux) I have a Node.js-based Electron desktop app, and I want to send HTTPS GET and POST requests to an Azure ML compute instance so that the Azure cloud machine runs the Python-based API version of AUTOMATIC1111's Stable Diffusion WebUI, which I successfully set up following this article from Vlad Iliescu: How to run Stable Diffusion Web UI on Azure ML Compute Instances | Vlad Iliescu. In the end it is a compute instance where I have git-cloned the open-source GitHub repo of the AUTOMATIC1111 SD WebUI, created a conda env, pip-installed all of its torch cudatoolkit 18 packages and 40 other Python packages, and run the webui.sh file, which launches a localhost server with a Gradio-based web UI where you can generate Stable Diffusion images. This Azure compute instance generates the images and sends back a base64 image response, which I convert on the client's local machine to a PNG and save in the client's local storage. When the Stable Diffusion WebUI is launched on the Azure machine, it exposes a REST API (built on FastAPI) on localhost port 7861. I'm not able to reach the Azure compute instance's localhost:7861 from my client's local machine to send HTTPS requests back and forth. The AUTOMATIC1111 SD WebUI launch also provides a public gradio.live address, but I don't want to use it since it's shared and public, so I turn the web UI off, which leaves me with the API-only localhost:7861 port. In the end I want to make this work for client companies in Germany, which are known to have very strict security protocols, and the public gradio.live link, which looks something like "https://2e11d22a72837b885b.gradio.live", will not work at a professional production level for multiple B2B clients. Ideally, the user should click a "connect" button in the Electron app, which connects to the Azure compute instance and communicates with Stable Diffusion's FastAPI on the compute instance's localhost port 7861. I want to access or communicate with the localhost of the compute instance in a way that keeps it isolated, private and secure for the client, while following Azure standards and proper ways of working with REST APIs. The customer should get a unique private, or secured public, IP or link, ideally something like "https://customerXYZ-onAzure-port1234.com". Having said all of that: how would I achieve this? I have researched many things, like creating web apps (but I don't have a website, since this is a desktop app), creating an application gateway, setting up ACR or ACI, container registries for Docker images, SSH-ing into the machine (which I tried, but request calls were not working, nor was stdin, and it didn't feel like a proper way to do it), creating load balancers, virtual networks, etc. Thank you for your time, Emre
etekinalp · Aug 28, 2023 · Copper Contributor · 1.5K Views · 0 Likes · 0 Comments
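One low-ceremony way to keep the traffic private while this is still in development is an SSH local port forward from the client to the compute instance (SSH has to be enabled on the instance), so the Electron app talks to the AUTOMATIC1111 API as if it were local. A rough sketch of the client side, written in Python for brevity; the same two calls translate directly to fetch/axios in the Electron app, and the prompt, user, host and ports are placeholders:

```python
# Assumes a tunnel like the following is already open from the client machine
# (user, host and SSH port are placeholders; SSH must be enabled on the compute instance):
#   ssh -N -L 7861:localhost:7861 azureuser@<compute-instance-ip> -p 50000
import base64

import requests

payload = {"prompt": "a lighthouse at sunset", "steps": 20}

# txt2img is part of the AUTOMATIC1111 WebUI REST API when the server runs in API mode.
resp = requests.post("http://127.0.0.1:7861/sdapi/v1/txt2img", json=payload, timeout=600)
resp.raise_for_status()

# The API returns base64-encoded PNGs; decode the first one and save it locally.
with open("output.png", "wb") as f:
    f.write(base64.b64decode(resp.json()["images"][0]))
```

For the multi-client B2B setup described above, the longer-term shape is usually to put the model behind something built for serving, such as an Azure ML managed online endpoint or a container behind an Application Gateway with its own DNS name and authentication, rather than exposing the compute instance itself.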
Tech Minutes Video - Project Trove
This post is authored by Trinh Duong, Christian Liensberger and Giampaolo Battaglia, Office of the CTO Team & AI/Innovation at Microsoft. We recently launched the Innovation Tech Minutes series: short, snackable, informative tidbits from Microsoft researchers, developers and engineers around the world on some of the latest and future technologies. In our latest episode, Christian Liensberger, Principal Program Manager and Advisor to Microsoft's CTO, shares new insights into Project Trove, a crowdsourcing marketplace where you can gather high-quality images for your AI models. Images are responsibly sourced from regular individuals and adhere to a rigid licensing and privacy framework, resulting in a more responsible data collection platform. In this Tech Minutes video, Christian shares the advantages of Trove and also provides a walkthrough of the Trove web app from an AI developer's standpoint (selecting the right images for your model training), as well as showing how photo takers can upload their images through the Trove app on Android. Watch the Tech Minutes video. Happy viewing & happy end of year! Trinh, Christian and Giampaolo
1.4K Views · 1 Like · 1 Comment
Jumping from Google's Teachable Machine to Azure. Help
I've been using Google's Teachable Machine for experiments for months, using two classes of images to train for recognition. I now need to switch the data source to tabular data (TM doesn't support this), and with Azure I feel as though I've walked into Costco, Home Depot and Walgreens combined. I've reviewed libraries of demos at studio.azureml.net and signed up for something else related to Azure, but, beyond uploading data, I have yet to find a way to replicate the workflow and simplicity of setup that Teachable Machine offered. Any guidance is appreciated, as (already knowing 9 programming languages) I'm not keen on learning yet another "ecosystem" over the course of X months.
InspireMe · Jun 12, 2021 · Copper Contributor · 1.2K Views · 0 Likes · 0 Comments
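The closest thing in Azure ML to Teachable Machine's point-it-at-data workflow for tabular data is automated ML, which is available with no code at all from the studio's Automated ML page, or from a few lines of the v1 SDK. A rough sketch, with the workspace config, dataset name, label column and compute cluster as placeholders:

```python
from azureml.core import Dataset, Experiment, Workspace
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()

# Placeholders: a registered tabular dataset whose "label" column holds the two classes,
# and an existing compute cluster to run the model sweep on.
train_data = Dataset.get_by_name(ws, name="my-tabular-data")

automl_config = AutoMLConfig(
    task="classification",
    training_data=train_data,
    label_column_name="label",
    primary_metric="accuracy",
    compute_target="cpu-cluster",
    experiment_timeout_hours=1,
)

run = Experiment(ws, "teachable-machine-replacement").submit(automl_config)
run.wait_for_completion(show_output=True)

# The best model comes back fitted and ready to call predict() on new rows.
best_run, fitted_model = run.get_output()
```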
Tags
- azure machine learning · 9 Topics
- ai · 6 Topics
- ML Studio · 4 Topics
- machine learning · 3 Topics
- BUG Report · 1 Topic
- DP-100 · 1 Topic
- sharepoint · 1 Topic
- openai · 1 Topic
- mobile · 1 Topic
- anaconda3 · 1 Topic