Simple Guide to implement Off-Site/Cloud backup in SQL Server 2017 using Azure Blob Storage in 2 Steps


In this post, we will see how to implement Off-Site/Cloud backup in SQL Server 2017 using Azure Blob Storage.

Requirements

1. SQL Server 2017

2. Azure Account

3. Azure Blob Storage with a container provisioned and a Shared Access Signature generated

4. SQL Server Management Studio (optional if you prefer to run the queries using SQLCMD)

 

Step 1: Create Credential – Use the query below to create a credential to access Azure Blob Storage using the Shared Access Signature

CREATE CREDENTIAL [https://sqlxpertise.blob.core.windows.net/sqlbackups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
    SECRET = 'sv=INSERT YOUR SHARED ACCESS SIGNATURE';
GO
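
If you want to confirm the credential was created, you can query sys.credentials, as in the minimal sketch below; the credential name should match the container URL used above.

SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'https://sqlxpertise.blob.core.windows.net/sqlbackups';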

 

Step 2: Backup Database to Cloud – Run the query below to back up the database to Azure Blob Storage. Change the storage account, container, and file name to meet your needs; note that the container URL must match the name of the credential created in Step 1.

BACKUP DATABASE SQL2017_DEMO
    TO URL = 'https://XYZ.blob.core.windows.net/sqlbackups/BackupFileName.bak'
    WITH STATS = 10, SKIP, NOREWIND, NOUNLOAD
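
As an optional sanity check, you can verify that the backup written to Azure Blob Storage is readable. The sketch below assumes the same credential created in Step 1 covers the container that holds the backup file.

RESTORE VERIFYONLY
    FROM URL = 'https://XYZ.blob.core.windows.net/sqlbackups/BackupFileName.bak';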

 

Output:

[Screenshot: backup progress and completion messages]

If we log in to the Azure Portal and navigate to the storage account and the sqlbackups container, we can see the backup file.

[Screenshot: backup file in the sqlbackups container in the Azure Portal]

You can also use SQL Server Management Studio 2017 to connect to Azure Storage and see the backup file.

[Screenshot: backup file viewed through SQL Server Management Studio]

Having an offsite backup is always important and helpful.

Hope you find this post helpful !!!


Monitoring Transactional Replication: Replicating On-Premise SQL Server databases to SQL Azure databases


In the previous post, we saw how to set up Transactional Replication to replicate On-Premise SQL Server databases to SQL Azure databases. In this post, we will see how to monitor the replication process, make changes to data on the On-Premise server, and verify the changes in the SQL Azure database.

Replication Monitoring

Step 1: Right-click on Replication –> Local Publications –> Launch Replication Monitor

[Screenshot: launching Replication Monitor]

Step 2: We can see the summary of subscriptions, current average performance, and current worst performance

[Screenshot: Replication Monitor subscription summary]

Step 3: Expand the publication to see the list of subscribers, along with each subscriber's performance, latency, and last synchronization time

[Screenshot: subscriber list with performance, latency, and last synchronization time]

Step 4: If we need to review the status of each agent, we can use the Agents tab

[Screenshot: Agents tab]

The different agent status types available are as follows

[Screenshot: agent status types]

Step 5: To view the details of a particular publisher/subscriber, right-click on the Publication/Subscription and click on "View Details"

[Screenshot: View Details option]

In the details view, we can see the completed (success and failure) and current sessions; if we click on a particular session, we can also see the steps that ran in that session

Details of the publication session

[Screenshot: publication session details]

Details of the subscriber session

[Screenshot: subscriber session details]
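
If you prefer T-SQL over the Replication Monitor UI, similar latency and status information can be pulled from the distribution database. The sketch below is a minimal example assuming the default distribution database name; the publisher server and publication names are placeholders for your own.

USE distribution;
GO
EXEC sp_replmonitorhelpsubscription
    @publisher        = N'MyOnPremServer',   -- placeholder publisher server name
    @publisher_db     = N'Demo',             -- published database used in this demo
    @publication      = N'MyPublication',    -- placeholder publication name
    @publication_type = 0;                   -- 0 = transactional publication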

On-Premise Data Update verification in SQL Azure database

We will make data changes on the On-Premise server using the query below.

Query to insert a new record and update one record in the On-Premise SQL Server database

INSERT INTO dbo.Customers
(
    CustomerID,
    CustomerName,
    CustomerAddress,
    Comments,
    Value
)
VALUES
(
    80001,                  -- CustomerID - int
    'CustomerName80001',    -- CustomerName - char(100)
    'CustomerAddress80001', -- CustomerAddress - char(100)
    'Comments80001',        -- Comments - char(185)
    80001                   -- Value - int
);

UPDATE dbo.Customers
SET Comments = 'Update Comments 80000'
WHERE CustomerID = 80000;

After 2 minutes, we ran the query below in the SQL Azure database

Query to verify data changes in SQL Azure database

SELECT [CustomerID],
       [CustomerName],
       [CustomerAddress],
       [Comments],
       [Value]
FROM [dbo].[Customers]
WHERE CustomerID >= 80000;

Output: We can see that the changes made on the On-Premise server are replicated to the SQL Azure database

[Screenshot: replicated rows in the SQL Azure database]

Hope you find this post helpful !!!

Transactional Replication: Replicate On-Premise SQL Server databases to SQL Azure databases


Transactional replication to SQL Azure Database is available as of Oct 23, 2017. This feature helps us replicate/migrate data from On-Premise SQL Server databases to SQL Azure databases with minimal impact and downtime.

In this post we will see the following

1. Set up a new publisher and configure a new distributor

2. Add SQL Azure Database as Subscriber

For this demo, we are using the following tools/servers:

On-Premise SQL Server 2017 Developer Edition

SQL Azure Database

SQL Server Management Studio 2017 v17.3

How to set up a new publisher and configure a new distributor

 

Step 1: Connect to the On-Premise SQL Server from which we are planning to publish the data, right-click on Replication, and click New –> Publication

[Screenshot: New Publication menu]

Step 2: Click "Next" to continue

[Screenshot: New Publication Wizard welcome page]

Step 3: We are going to configure the publisher to act as its own Distributor, but if you prefer to use a different server, use the second option

After selecting, click "Next" to continue

[Screenshot: Distributor selection page]

Step 4: By default SQL Server will create the ReplData folder under the Data folder, but if you prefer to use a different folder, change it here

Please make sure the process account you are planning to use and the SQL Server service account have rights to read/write to this folder, or your Replication Snapshot job will fail to generate scripts

[Screenshot: snapshot folder location]

Step 5: Select the database from which we need to publish data; in this demo we are using the database "Demo"

After selecting the database, click "Next" to proceed

[Screenshot: publication database selection]

Step 6: Select "Transactional publication" and click "Next" to proceed

[Screenshot: publication type selection]

Step 7: Select the objects you wish to replicate; we will use the Customers table for this demo. After selecting the required objects, click "Next" to continue

[Screenshot: article selection]

Step 8: This is an optional step; if you want only filtered data to be replicated, click "Add…" and add the relevant filters

In this demo, we are going to replicate the entire table, so we will skip this and click "Next" to continue

[Screenshot: filter table rows page]

Step 9: Select the first option to create the initial snapshot, and then use the second option to set up the schedule on which the Snapshot Agent needs to run

Click "Change…" to set up the schedule details

[Screenshot: Snapshot Agent options]

For this demo, we are setting it to run every 30 minutes, 24 hours a day

[Screenshot: Snapshot Agent schedule]

After configuring the schedule, click "Next" to proceed

Step 10: In this step we will configure the Snapshot Agent and Log Reader Agent security settings

Click on "Security Settings…" to configure them

If you want to use two different accounts for the Snapshot Agent and the Log Reader Agent, unselect "Use the security settings from the Snapshot Agent" and use "Security Settings…" for the Log Reader Agent

[Screenshot: Agent Security page]

Please provide the Windows user account to run the Snapshot Agent

[Screenshot: Snapshot Agent security settings]

After providing the details, click "OK" to close, then review the user name provided and click "Next" to continue

Step 11: By default the "Create the publication" option will be selected; select "Generate a script file with steps to create the publication" if you prefer to run the script manually

[Screenshot: wizard actions page]

If you selected the second option, specify the path where you want the script to be created and click "Next" to continue

[Screenshot: script file properties]

Step 12: Provide the publication name, review the settings, and click "Finish" to start the publisher/distributor configuration process

[Screenshot: publication summary]

We can see the progress of each step

[Screenshot: publication creation progress]

After all steps are completed, we should see the success message; click "Close" to continue

[Screenshot: publication created successfully]

Step 13: To verify the created publication, refresh the Local Publications node under Replication and you should see the newly created publication

[Screenshot: new publication under Local Publications]

We have now successfully created the publication. You can verify the status of the initial snapshot script generation by checking the job status in SQL Server Agent – Job Activity Monitor

We can see the highlighted job completed successfully

[Screenshot: Job Activity Monitor showing the snapshot job]
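
If you would rather check the Snapshot Agent job from T-SQL instead of Job Activity Monitor, a quick look at the job history in msdb works as well. The sketch below is an example; the LIKE filter is an assumption and should be adjusted to match your snapshot job name.

SELECT TOP (10)
       j.name       AS job_name,
       h.run_date,
       h.run_time,
       h.run_status -- 1 = succeeded, 0 = failed
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
    ON h.job_id = j.job_id
WHERE j.name LIKE '%Demo%'  -- assumed filter on the published database name
  AND h.step_id = 0         -- job outcome rows
ORDER BY h.run_date DESC, h.run_time DESC;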

How to set up a SQL Azure Database as Subscriber

We will now see how to create a new SQL Azure Database subscriber

Step 1: Right-click on Replication –> Local Publications –> <Publication Name> and click "New Subscriptions…"

[Screenshot: New Subscriptions menu]

Step 2: Click "Next" to continue

[Screenshot: New Subscription Wizard welcome page]

Step 3: Review the selected Publisher and Publication and click "Next" to continue

[Screenshot: publication selection]

Step 4: Select the "Run all agents at the Distributor" option and click "Next" to continue

[Screenshot: Distribution Agent location]

Step 5: Click on "Add Subscriber"

[Screenshot: Add Subscriber button]

Click on "Add SQL Server Subscriber"

[Screenshot: Add SQL Server Subscriber option]

Provide the SQL Azure Database server name, select SQL Server Authentication, provide the user name and password, and click "Connect" to continue

[Screenshot: Connect to Server dialog]

Step 6: Select the subscription database and click "Next" to continue

[Screenshot: subscription database selection]

Step 7: Click the "…" button to configure the Distribution Agent security

[Screenshot: Distribution Agent Security page]

Provide the Windows user account under which the Distribution Agent process will run

For "Connect to the Subscriber", provide the SQL Server login account to connect with

[Screenshot: Distribution Agent Security dialog]

Click "OK", then review the selected settings and click "Next" to continue

Step 8: Specify the agent schedule and click "Next" to continue

[Screenshot: synchronization schedule]

Step 9: Select "Initialize" and choose "Immediately" for Initialize When; adjust this depending on whether your Snapshot Agent job has already completed

Click "Next" to continue

[Screenshot: initialize subscriptions page]

Step 10: By default "Create the subscription(s)" will be selected; if you prefer to create the subscription manually using a script, select the Generate Script option and click "Next" to continue

[Screenshot: wizard actions page]

Provide the script path and click "Next" to continue

[Screenshot: script file properties]

Step 11: Review the settings and click "Finish" to start the process

[Screenshot: subscription summary]

Once the process is complete, refresh the publication to see the subscription

[Screenshot: new subscription under the publication]

After the Replication Distribution Agent job has run, connect to the SQL Azure database to see that the table has been created

[Screenshot: replicated table in the SQL Azure database]
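
A quick way to confirm from T-SQL that the initial snapshot was applied is to query the replicated table in the SQL Azure subscription database, for example:

SELECT COUNT(*) AS ReplicatedRows
FROM dbo.Customers;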

In the next post, we will see how to monitor the replication; we will also make changes to local data and see the changes being replicated to SQL Azure.

Hope you find this post helpful !!!

SQL Server 2017 Cumulative Update 1 is now available, and steps to install the Cumulative Update for SQL Server on Ubuntu Linux


Microsoft has released SQL Server 2017 Cumulative Update 1 today. The build number for the release is 14.0.3006.16.

For the full list of fixes included in the release, please use this link: https://support.microsoft.com/en-us/help/4038634/cumulative-update-1-for-sql-server-2017

To download the update for Windows, please use this link: https://www.microsoft.com/en-us/download/details.aspx?id=56128

To update SQL Server on Linux, run the following commands in a Bash shell (or your preferred Linux shell)

# Update package lists
sudo apt-get update

# Upgrade installed packages (including mssql-server)
sudo apt-get upgrade

# Restart SQL Server
sudo systemctl restart mssql-server

# Check SQL Server status
sudo systemctl status mssql-server
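
Once SQL Server is back online, you can confirm the new build from T-SQL; after CU1, ProductVersion should report 14.0.3006.16.

SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion, -- should show 14.0.3006.16 after CU1
       SERVERPROPERTY('Edition')        AS Edition;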

Hope you find this post helpful !!!

How to import CSVs from Azure Blob Storage to SQL Server 2017?


SQL Server 2017 introduces an option to import CSV files stored in Azure Blob Storage.

In this article, we will see how to implement it step by step.

Step 1: Create the Database Master Key

This step is optional; if you have already created your database master key, please skip it.

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'PleaseUseStrongPasswordHere';
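
If you are not sure whether a database master key already exists, it appears in sys.symmetric_keys under the fixed name ##MS_DatabaseMasterKey##:

SELECT name, create_date
FROM sys.symmetric_keys
WHERE name = '##MS_DatabaseMasterKey##';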

Step 2: Create Database Scoped Credential

CREATE DATABASE SCOPED CREDENTIAL DemoAzureCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET =
	'INSERT YOUR SHARED ACCESS SIGNATURE HERE';

Please note that when you copy the Shared Access Signature from the Azure Portal, it is copied with a leading ? character, which you will need to remove.

Step 3: Create External Data Source

Please make sure to replace sqlxpertise with your own blob storage account name and democsvs with your container name

CREATE EXTERNAL DATA SOURCE DemoAzureBlobCSVs
    WITH  (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://sqlxpertise.blob.core.windows.net/democsvs',
        CREDENTIAL = DemoAzureCredential
    );
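
As an optional check, you can confirm that the credential and the external data source are in place by querying the catalog views:

SELECT name, credential_identity FROM sys.database_scoped_credentials;
SELECT name, type_desc, location FROM sys.external_data_sources;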

Step 4: Test the external data source connection using the OPENROWSET command

SELECT * FROM OPENROWSET(
   BULK  'DemoProducts.txt',
   DATA_SOURCE = 'DemoAzureBlobCSVs',
   SINGLE_CLOB) AS DataFile;

Sample output – this is based on the input file I used; it will vary based on the input file you uploaded to Azure Blob Storage

[Screenshot: OPENROWSET sample output]

Step 5: Import Data to table

CREATE TABLE tmpProducts
(
	[productid] [int] IDENTITY(1,1) NOT NULL,
	[productname] [nvarchar](40) NOT NULL,
	[supplierid] [int] NOT NULL,
	[categoryid] [int] NOT NULL,
	[unitprice] [money] NOT NULL,
	[discontinued] [bit] NOT NULL
);

BULK INSERT tmpProducts
	FROM 'DemoProducts.txt'
	WITH
	(
		DATA_SOURCE = 'DemoAzureBlobCSVs',
		FORMAT = 'CSV'
	);

Output

[Screenshot: BULK INSERT output]

Once the data is successfully imported, let us review the imported data.

SELECT productid,
       productname,
       supplierid,
       categoryid,
       unitprice,
       discontinued
FROM dbo.tmpProducts;

Output

[Screenshot: imported rows in dbo.tmpProducts]

Note: BULK INSERT from an external data source works only with permanent tables, not temporary tables. You will get the error message below if you try importing into a temporary table.

CREATE TABLE #tmpProducts
(
	[productid] [int] IDENTITY(1,1) NOT NULL,
	[productname] [nvarchar](40) NOT NULL,
	[supplierid] [int] NOT NULL,
	[categoryid] [int] NOT NULL,
	[unitprice] [money] NOT NULL,
	[discontinued] [bit] NOT NULL
);

BULK INSERT #tmpProducts
	FROM 'DemoProducts.txt'
	WITH (DATA_SOURCE = 'DemoAzureBlobCSVs', FORMAT = 'CSV');

Error Message

Msg 12703, Level 16, State 1, Line 57
Referenced external data source “DemoAzureBlobCSVs” not found.

Yes, the message is misleading, but I am sure the Microsoft SQL Server team will realize this and change it.
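
If you do need the data in a temporary table, one workaround (a sketch, not part of the original steps) is to load the permanent staging table first, as in Step 5, and then copy it into a temp table:

SELECT productid, productname, supplierid, categoryid, unitprice, discontinued
INTO #tmpProducts
FROM dbo.tmpProducts;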

Hope you find this post helpful !!!
