SQL Migration to Azure Cloud
I have implemented a basic C# application connected to an on-premises SQL Server database, and I am going to migrate that database and its data to the Azure cloud using the Microsoft Data Migration Assistant (DMA). After the migration I will debug the same application without touching the code, changing only the connection string.

The on-premises database is hosted on SQL Server. In Management Studio I run a COUNT query against the database and note the result; the same query is then issued from my C# application.

First I test the connection with SQL authentication credentials:

    string source = @"Data Source=" + textBox1.Text
        + ";Initial Catalog=CheckPostingDb"
        + ";User Id=" + textBox2.Text
        + ";Password=" + textBox3.Text;
    SqlConnection con = new SqlConnection(source);
    con.Open();
    MessageBox.Show("Db Connected");

Once the connection succeeds, I run the same query through a SqlCommand and get the same result in the text box:

    string sqlSelectQuery = "SELECT COUNT(*) AS MREQUESTS FROM MREQUESTS WHERE REQSTATE=1";
    SqlCommand cmd = new SqlCommand(sqlSelectQuery, con);
    SqlDataReader dr = cmd.ExecuteReader();
    if (dr.Read())
    {
        textBox4.Text = Convert.ToString(dr["MREQUESTS"]);
    }
    con.Close();

Let's migrate to the cloud. I have deployed a sample database in Azure with SQL authentication. It is just a blank database with no tables yet, so running the same query there fails with errors.

In DMA, start a new project with the project type Migration. In this step I specify the source and target server details. In my scenario the source server is localhost and the target SQL server is in the Azure cloud: Source Server: localhost, Target Server: gohulan.database.windows.net.

Select the correct database on the source server to migrate to the cloud, and select the correct target database in Azure; in my Azure cloud I have only one database, named CheckPostingDb. Once DMA is connected, I select the objects or tables from the source database that I would like to migrate. In my test environment I select only one table, MREQUESTS, since my C# application reads results only from this table.
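As an aside before deploying the schema: the connection string above is assembled by plain string concatenation. A slightly more robust variant, shown here as a minimal sketch (my addition, not part of the original walkthrough; it assumes the System.Data.SqlClient provider, and GetOpenRequestCount is a hypothetical helper name meant to live inside the form class), builds the string with SqlConnectionStringBuilder and reads the single COUNT(*) value with ExecuteScalar:

    // Hypothetical helper (my naming), equivalent to the form code above but using
    // SqlConnectionStringBuilder and ExecuteScalar instead of raw concatenation.
    // Requires: using System; using System.Data.SqlClient;
    static string GetOpenRequestCount(string server, string user, string password)
    {
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = server,               // "localhost" before the migration
            InitialCatalog = "CheckPostingDb",
            UserID = user,                     // SQL authentication credentials
            Password = password
        };

        using (var con = new SqlConnection(builder.ConnectionString))
        {
            con.Open();
            var cmd = new SqlCommand(
                "SELECT COUNT(*) AS MREQUESTS FROM MREQUESTS WHERE REQSTATE = 1", con);
            return Convert.ToString(cmd.ExecuteScalar());
        }
    }

Inside the form this could be called as textBox4.Text = GetOpenRequestCount(textBox1.Text, textBox2.Text, textBox3.Text);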
Once the table is ticked, I generate the SQL script. When the script has been generated, I deploy the schema:

    /******** DMA Schema Migration Deployment Script    Script Date: 2/24/2020 12:50:55 PM ********/
    /****** Object: Table [dbo].[MREQUESTS]    Script Date: 2/24/2020 12:50:55 PM ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND type in (N'U'))
    BEGIN
    CREATE TABLE [dbo].[MREQUESTS](
        [ID] [bigint] IDENTITY(1,1) NOT NULL,
        [RID] [uniqueidentifier] NOT NULL,
        [ReqTime] [datetime] NOT NULL,
        [ReqState] [tinyint] NOT NULL,
        [RecordType] [int] NOT NULL,
        [Data1] [bigint] NULL,
        [ServiceID] [int] NULL,
        [FirstRequestTime] [datetime] NULL,
        [OfflinePosting] [bit] NULL,
        [ServiceHostInfo] [nvarchar](80) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
        CONSTRAINT [PK_MREQUESTS] PRIMARY KEY CLUSTERED
        (
            [ID] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
    )
    END
    GO
    IF NOT EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND name = N'AK_MREQUESTS_RID')
    CREATE UNIQUE NONCLUSTERED INDEX [AK_MREQUESTS_RID] ON [dbo].[MREQUESTS]
    (
        [RID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
    GO
    IF NOT EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'[dbo].[MREQUESTS]') AND name = N'IX_MREQUESTS_2')
    CREATE NONCLUSTERED INDEX [IX_MREQUESTS_2] ON [dbo].[MREQUESTS]
    (
        [ReqTime] ASC,
        [ReqState] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
    GO

The generated script runs without any errors or warnings and executes successfully. Once the script has been deployed, I can see that the table was created in the Azure cloud, but my query returns 0: the MREQUESTS table exists, but the data has not been copied yet. I migrate the data in the next step.

When the data migration starts, DMA begins sending the data to the cloud; the duration depends on the data volume and the network speed. Since my table does not hold much data, it finished quickly without any warnings or errors.

Running the same query in the Azure Query Editor to check the data, it succeeds and returns the same value that the on-premises query returned earlier.

Back in my C# application I make no code changes; I only update the connection string with the new server name and SQL authentication credentials. Debugging the application to confirm it functions properly, I test the connection with the correct SQL credentials and get the expected results: the migration has succeeded.
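For reference, the only application-level change described above is the connection string. Here is a minimal sketch of what it might look like after the migration (my reconstruction from the server and database names used in this walkthrough, not the author's exact value; the Encrypt and TrustServerCertificate settings are my addition, reflecting that Azure SQL Database expects encrypted connections):

    // Hypothetical post-migration connection string; the placeholders stand in for
    // the SQL authentication credentials entered in the form.
    // Requires: using System.Data.SqlClient;
    string azureSource = "Data Source=gohulan.database.windows.net;"
                       + "Initial Catalog=CheckPostingDb;"
                       + "User Id=<sql-admin-user>;"
                       + "Password=<password>;"
                       + "Encrypt=True;TrustServerCertificate=False;";

    using (SqlConnection con = new SqlConnection(azureSource))
    {
        con.Open();   // the rest of the application code stays unchanged
    }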
Microsoft Azure Hub-Spoke model by Enterprise Design part 3 of 4 Data Migration
Hyper-V clusters in the front tier with SQL clusters in the back end: SQL assessment and data migration to Azure. This blogpost is about SQL assessment and data migration to your Azure design in the cloud in a secure way. Before you begin with your data assessment and map your workloads with Microsoft Azure Service Map, I wrote these blogposts about the Microsoft Azure Hub-Spoke model by Enterprise Design:
Microsoft Azure Hub-Spoke model by Enterprise Design 1 of 4
Microsoft Azure Policy and Blueprints Overview (Extra Blogpost)
Microsoft Azure Hub-Spoke model by Enterprise Design 2 of 4 "Lift and Shift"
For Microsoft SQL databases, different Azure solutions are possible in the cloud, but first you need to know which SQL Server versions you have and how they are running in your datacenter today (a quick way to check is sketched below). Read more on my blogpost here about Azure data migration.
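As an illustration of that first inventory step (my sketch, not taken from the linked post; the connection string and the use of Windows authentication are assumptions), a small C# snippet can ask each instance for its version and edition, which is the information you need when choosing a target Azure SQL option:

    using System;
    using System.Data.SqlClient;

    // Quick inventory check: query the instance for its product version and edition.
    string connectionString = "Data Source=localhost;Initial Catalog=master;Integrated Security=True";

    using (var con = new SqlConnection(connectionString))
    {
        con.Open();
        var cmd = new SqlCommand(
            "SELECT SERVERPROPERTY('ProductVersion') AS Version, " +
            "SERVERPROPERTY('Edition') AS Edition", con);
        using (var dr = cmd.ExecuteReader())
        {
            if (dr.Read())
            {
                Console.WriteLine("SQL Server " + dr["Version"] + " (" + dr["Edition"] + ")");
            }
        }
    }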
Start your move to Azure with Azure Migrate
Today the Azure Migrate service became generally available. The service discovers virtual machines running on VMware in your on-premises environment and provides guidance and insights to help you move the VMs running your applications to Azure. Azure Migrate lets you plan your move to Azure along three primary dimensions:
Readiness - is the VM and application suitable to run in an Azure VM?
Right-sizing - what is the correct Azure VM size to use?
Cost - what will it cost to run the VM in Azure?
The Azure Migrate team announced the milestone in a blog post today. Learn more about the Azure Migrate service, where you can review the capabilities of the service, access the service details, and try it out in a lab.
Welcome to the Azure Migration Tech Community
Welcome to the new Azure Migration IT Forum. Here we will discuss and talk about all things migration. Migrating to Azure is not simply lifting and shifting your existing VMs (although you can easily do this if you want); there are many tools and services available to help you discover, right-size and optimize your VMs before they move to Azure, saving you time and money. However, migration is not all about virtual machines. Much of the time you are looking at an end-to-end application, including app services, data, and infrastructure tiers. Azure migration services are there to help you in all scenarios: app migration, data migration and VM migration, with new tools, services and a strong partner ecosystem.
To kick-start your journey, on April 12th we released a new Azure TCO calculator. The tool enables you to understand the initial cost comparison of migrating your on-premises workloads to Azure. Through one of three input mechanisms, you can model the cost of your on-premises physical and virtual servers. Further inputs for storage and networking usage offer deeper costing analysis, providing an initial comparison report to identify savings when moving your on-premises environment to Azure. Getting started is easy; simply choose the best way to input your server and workload data:
Manual Import - the manual assessment enables you to enter information about your server environment, such as processors, cores and memory, directly into the Azure TCO analysis tool.
Custom Import - the custom inventory assessment enables you to use the output of your existing discovery tools, providing the Azure TCO calculator with the data it needs for the assessment. In this scenario, you would use the custom inventory template provided by the Azure TCO tool and import your existing data into it.
Automated discovery and import - the automated inventory assessment uses the Microsoft Assessment and Planning Toolkit to automatically collect your server hardware configuration and make it available as input to the Azure TCO tool.
You can access the new TCO tool at http://tco.microsoft.com/. Check out the complete Getting Started guide for a step-by-step, screen-captured walkthrough of the TCO tool, including discovery and analysis. You can also see it in action in our video tutorial. If you have any questions, issues or problems with the new TCO tool, make sure you post them here. We have experts on standby awaiting your questions and will get back to you promptly!
I hope you see terrific value in these tools for migration to Azure. In the coming weeks we will delve more into existing migration tools such as Azure Site Recovery, excellent partner offerings for discovery and right-sizing, as well as new services we are on the verge of announcing!
Many thanks
Michael Leworthy