
Archive for October, 2017

Simple Guide to implement Off-Site/Cloud backup in SQL Server 2017 using Azure Blob Storage in 2 Steps

October 31, 2017 2 comments

In this post we will see how to implement Off-Site/Cloud backup in SQL Server 2017 using Azure Blob Storage

Requirements

1. SQL Server 2017

2. Azure Account

3. Azure Blob Storage with a container provisioned and a Shared Access Signature generated

4. SQL Server Management Studio (optional, if you prefer to run the queries using SQLCMD)

 

Step 1: Create Credential – Please use the query below to create a credential to access Azure Blob Storage using a Shared Access Signature. Note that for SAS credentials, the credential name must be the URL of the container itself.

CREATE CREDENTIAL [https://sqlxpertise.blob.core.windows.net/sqlbackups]
	WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET =
	'sv=INSERT YOUR SHARED ACCESS SIGNATURE';
GO
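To confirm the credential was created, you can query sys.credentials; for a Shared Access Signature credential, the name column holds the container URL:

```sql
-- List SAS credentials; the name column holds the container URL
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE credential_identity = 'SHARED ACCESS SIGNATURE';
```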

 

Step 2: Backup Database to Cloud – Run the query below to back up the database to Azure Blob Storage; please change the database, storage account and file names to meet your needs. The URL must point to the container for which the credential was created in Step 1.

BACKUP DATABASE SQL2017_DEMO
    TO URL = 'https://sqlxpertise.blob.core.windows.net/sqlbackups/BackupFileName.bak'
    WITH STATS = 10, SKIP, NOREWIND, NOUNLOAD

 

Output:

capture20171030101644076

If we log in to the Azure Portal and navigate to the storage account and the sqlbackups container, we can see the backup file

capture20171030104711661

You can also use SQL Server Management Studio 2017 to connect to Azure Storage and see the backup file

capture20171030104822250

Having an offsite backup is always important and helpful.
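The same credential also lets you restore directly from the URL. A sketch is shown below; the logical file names and local paths in the MOVE clauses are illustrative and should be replaced with your own (use RESTORE FILELISTONLY to find the logical names):

```sql
-- Verify the backup without restoring it
RESTORE VERIFYONLY
    FROM URL = 'https://sqlxpertise.blob.core.windows.net/sqlbackups/BackupFileName.bak';

-- Restore to a new database name (logical names and paths are illustrative)
RESTORE DATABASE SQL2017_DEMO_RESTORED
    FROM URL = 'https://sqlxpertise.blob.core.windows.net/sqlbackups/BackupFileName.bak'
    WITH MOVE 'SQL2017_DEMO'     TO 'D:\Data\SQL2017_DEMO_RESTORED.mdf',
         MOVE 'SQL2017_DEMO_log' TO 'D:\Data\SQL2017_DEMO_RESTORED.ldf',
         STATS = 10;
```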

Hope you find this post helpful !!!


Monitoring Transaction Replication : Replicating On-Premise SQL Server databases to SQL Azure databases

October 30, 2017

In the previous post, we saw how to set up Transactional Replication to replicate On-Premise SQL Server databases to SQL Azure databases. In this post, we will see how to monitor the replication process, make changes to data in the On-Premise database, and verify those changes in the SQL Azure database.

Replication Monitoring

Step 1: Right click on Replication –> Local Publications –> Launch Replication Monitor

capture20171025144517902

Step 2: We can see the summary of subscriptions, current average performance and current worst performance

capture20171026151048695

Step 3: Expand the Publication to see the list of subscribers, we can see the list of subscribers and each of their Performance, Latency and Last Synchronization time

capture20171026151138103

Step 4: If we need to review the status of each agent, we can use the Agents tab

capture20171026151212465

The different types of Agents status available are as follows

capture20171026151221076

Step 5: To view the details of a particular publisher/subscriber, please right click on the Publication/Subscription and click on “View Details”

capture20171026151412435

In the details view we can see the completed (success and failure) and current sessions; if we click on a particular session, we can also see the steps that ran in that session

Details of the publication session

capture20171026151457561

Details of the Subscriber session

capture20171026151512972
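The latency numbers shown in Replication Monitor can also be pulled with T-SQL on the distributor. A sketch using sp_replmonitorsubscriptionpendingcmds, with the server, database and publication names as placeholders you would replace with your own:

```sql
USE distribution;
GO
-- Pending commands and estimated delivery time for one subscription
EXEC sp_replmonitorsubscriptionpendingcmds
     @publisher         = 'OnPremServerName',
     @publisher_db      = 'Demo',
     @publication       = 'DemoPublication',
     @subscriber        = 'azureserver.database.windows.net',
     @subscriber_db     = 'DemoAzure',
     @subscription_type = 0;  -- 0 = push subscription
```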

On-Premise Data Update verification in SQL Azure database

We will make data changes using below query in On-Premise server

Query to insert a new record and update one record in On-Premise SQL Server database

INSERT INTO dbo.Customers 
( 
CustomerID, 
CustomerName, 
CustomerAddress, 
Comments, 
Value 
) 
VALUES 
( 80001, -- CustomerID - int 
'CustomerName80001', -- CustomerName - char(100) 
'CustomerAddress80001', -- CustomerAddress - char(100) 
'Comments80001', -- Comments - char(185) 
80001 -- Value - int 
);

UPDATE dbo.Customers 
SET Comments = 'Update Comments 80000' 
WHERE CustomerID = 80000; 

After 2 minutes, we ran the below query in SQL Azure database

Query to verify data changes in SQL Azure database

SELECT [CustomerID], 
[CustomerName], 
[CustomerAddress], 
[Comments], 
[Value] 
FROM [dbo].[Customers] 
WHERE CustomerID >= 80000; 

Output: We can see that the changes made in the On-Premise server are replicated to the SQL Azure database

capture20171026151906330

Hope you find this post helpful !!!

Transactional Replication: Replicate On-Premise SQL Server databases to SQL Azure databases

October 27, 2017

Transactional replication to SQL Azure Database is available as of Oct 23, 2017. This feature will help us replicate/migrate data from On-Premise SQL Server databases to SQL Azure databases with minimal impact and downtime.

In this post we will see the following

1. Setup new publisher and configure new distributor

2. Add SQL Azure Database as Subscriber

For this demo we are using

Tools/Servers Used

On-Premise SQL Server 2017 Developer Edition

SQL Azure Database

SQL Server Management Studio 2017 v 17.3

How to Setup new publisher and configure new distributor

 

Step 1: Please connect to the On-Premise SQL Server on which we are planning to publish the data, right click on Replication, and click on New –> Publication

capture20171026125922682

 

Step 2: Click next to continue

capture20171025093756535

Step 3: We are going to configure the publisher to act as Distributor as well, but if you prefer to use a different server, use the 2nd option

After selecting, Please click Next to continue

capture20171025093802802

Step 4: By default SQL Server will create the ReplData folder under Data folder, but if you prefer to use a different folder, Please change it

Please make sure the process account which you are planning to use and the SQL Server service account have rights to read/write to this folder, else your Replication Snapshot job will fail to generate scripts

capture20171025093808736

Step 5: Please select the database from which we need to publish data, In this demo we are using database “Demo”

After selecting the database click “Next” to proceed

capture20171025093817202

Step 6:  Please select “Transactional Publication” and click “Next” to proceed

capture20171025093823858

Step 7: Please select the required objects, you wish to replicate, we will use Customers table for this demo, after selecting the required objects, Please click “Next” to continue

capture20171025094712912

Step 8: This is an optional step; if you want only filtered data to be replicated, please click on “Add…” to add the relevant filters

In this demo, we are going to replicate the entire table, so we will skip this and click “Next” to continue

capture20171025093902119

Step 9: Please select the first option to create the initial snapshot and then use the 2nd option to setup the schedule in which the snapshot agent needs to run

Please click “Change…” to setup the schedule details

capture20171025094004761

For this demo, we are setting it up to run every 30 minutes, 24 hours a day

capture20171025093958343

After configuring the schedule, Please click “Next” to proceed

Step 10: In this step we will be configuring the Snapshot and Log Reader Agent security settings

Click on “Security Settings…” to configure

If you want to use 2 different accounts for the Snapshot Agent and Log Reader Agent, please unselect “Use the security settings from the Snapshot Agent” and use “Security Settings…” for the Log Reader Agent

capture20171025094552399

Please provide the Windows User account to run the Snapshot Agent

capture20171025094537848

After providing the details, click “Ok” to close and then review the user name provided and click “Next” to continue

Step 11: By default “Create the publication” option will be selected, Please select “Generate a script file with steps to create the publication” if you prefer to run the script manually

capture20171025094603896

If you selected the 2nd option, Please specify the path where you want the script to be created and click “Next” to continue

capture20171025094612146

Step 12: Please provide the Publication Name and review the settings and click “Finish” to start the publisher/distributor configuration process

capture20171025094629258

We can see the progress of each step

capture20171025094635981

After all steps are completed, we should see the Success message and click “Close” to continue

capture20171025094745397

Step 13: To verify the created publication, Please refresh the Local Publications node under Replication and you should see the newly created Publication

capture20171026124457112

We have successfully created the publication; you can verify the status of the initial snapshot script creation by checking the job status in SQL Server Agent – Job Activity Monitor

We can see the highlighted job as completed Successfully

capture20171026132659759
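Under the hood the wizard calls the replication stored procedures; if you chose “Generate a script file” in Step 11, the generated script will look roughly like this sketch (the publication name and options shown are illustrative, not the wizard's exact output):

```sql
USE Demo;
GO
-- Enable the database for publishing
EXEC sp_replicationdboption
     @dbname  = N'Demo',
     @optname = N'publish',
     @value   = N'true';

-- Create the transactional publication
EXEC sp_addpublication
     @publication       = N'DemoPublication',
     @sync_method       = N'concurrent',
     @repl_freq         = N'continuous',
     @status            = N'active',
     @independent_agent = N'true';

-- Add the Customers table as an article
EXEC sp_addarticle
     @publication   = N'DemoPublication',
     @article       = N'Customers',
     @source_owner  = N'dbo',
     @source_object = N'Customers';
```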

How to setup SQL Azure Database as Subscriber

We will now see how to create a new SQL Azure Database subscriber

Step 1: Right click on the Replication –> Local Publications –> Publication Name and click on “New Subscriptions…”

 

capture20171026124759159

 

Step 2: Click “Next” to continue

capture20171025094823157

 

Step 3: Please review the selected Publisher and Publication and click “Next” to continue

capture20171025094828599

Step 4: Please select “Run all agents at the Distributor” option and click “Next” to continue

capture20171025094835370

Step 5: Click on “Add Subscriber”

capture20171025094903713

Click on “Add SQL Server Subscriber”

capture20171025094910631

Please provide the SQL Azure Database server name, select SQL Server Authentication, provide the user name and password, and click “Connect” to continue

capture20171025095231591

 

Step 6: Please select the Subscription Database and click “Next” to continue

capture20171025143325718

Step 7: Please click the “…” button to configure the Distribution Agent security

capture20171025143431898

Please provide the Windows user account under which the Distribution Agent process will run

For “Connect to the Subscriber”, Please provide the SQL Server Login account to connect

capture20171025143419741

Please click “Ok” and then review the selected settings and click “Next” to continue

Step 8: Please specify the Agent Schedule and click “Next” to continue

capture20171025143445700

Step 9: Please select “Initialize” and select “Immediately” for Initialize When; choose these settings according to whether your Snapshot Agent job has completed

Click “Next” to continue

capture20171025143505239

Step 10: By default “Create the subscription(s)” will be selected, if you prefer to create the subscription manually using script, Please select Generate Script option and click “Next” to continue

capture20171025143513271

Please provide the script path and click “Next” to continue

capture20171025143518278

Step 11: Please review the settings and click “Finish” to start the process

capture20171025143522741

Once the process is complete, Please refresh the publication to see the Subscription

capture20171025144459332

After the Replication Distribution job is run, Please connect to SQL Azure database to see the Table is created in SQL Azure database

capture20171025144542960
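The subscription side can likewise be scripted; a rough sketch, with the Azure server name, database and login as placeholders you would replace with your own:

```sql
USE Demo;
GO
-- Create a push subscription to the SQL Azure database
EXEC sp_addsubscription
     @publication       = N'DemoPublication',
     @subscriber        = N'azureserver.database.windows.net',
     @destination_db    = N'DemoAzure',
     @subscription_type = N'push',
     @sync_type         = N'automatic';

-- Agent job that pushes changes; SQL Server Authentication for the Azure subscriber
EXEC sp_addpushsubscription_agent
     @publication              = N'DemoPublication',
     @subscriber               = N'azureserver.database.windows.net',
     @subscriber_db            = N'DemoAzure',
     @subscriber_security_mode = 0,  -- 0 = SQL Server Authentication
     @subscriber_login         = N'azureadmin',
     @subscriber_password      = N'StrongPasswordHere';
```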

In the next post, we will see how to monitor the Replication; we will also make changes to the local data and see the changes being replicated to SQL Azure

Hope you find this post helpful !!!

SQL Server 2017 Cumulative Update 1 is available now and Steps to install Cumulative Update in SQL Server on Ubuntu Linux

October 26, 2017

Microsoft has released SQL Server 2017 Cumulative Update 1 today. The build number for the release is 14.0.3006.16

For the full list of fixes included in the release, please use this link https://support.microsoft.com/en-us/help/4038634/cumulative-update-1-for-sql-server-2017

To download for Windows, Please use this link https://www.microsoft.com/en-us/download/details.aspx?id=56128

To update SQL Server on Linux, please run the following commands in Bash shell or your favorite choice of Linux shell

# Update packages
sudo apt-get update

# Upgrade updated packages
sudo apt-get upgrade

# Restart SQL Server
sudo systemctl restart mssql-server

# Check SQL Server status
sudo systemctl status mssql-server
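After the restart, you can confirm the new build from sqlcmd or SSMS:

```sql
-- Should report 14.0.3006.16 / CU1 after the update is applied
SELECT SERVERPROPERTY('ProductVersion')     AS ProductVersion,
       SERVERPROPERTY('ProductUpdateLevel') AS UpdateLevel;
```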

Hope you find this post helpful !!!

How to import CSVs from Azure Blob Storage to SQL Server 2017?

October 26, 2017

SQL Server 2017 introduces an option to import CSV files stored in Azure Blob Storage

In this article we will see how to implement that step by step

Step 1: Create Database Master Encryption Key

This step is optional; if you have already created your database master encryption key, please skip this

CREATE MASTER KEY ENCRYPTION
	BY PASSWORD = 'PleaseUseStrongPasswordHere'

Step 2: Create Database Scoped Credential

CREATE DATABASE SCOPED CREDENTIAL DemoAzureCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET =
	'INSERT YOUR SHARED ACCESS SIGNATURE HERE';

Please note that when you copy the Shared Access Signature from the Azure Portal, it is copied with ? as a prefix, and you will need to remove it

Step 3: Create External Data Source

Please make sure to replace sqlxpertise with your own Blob storage account name and democsvs with your container name

CREATE EXTERNAL DATA SOURCE DemoAzureBlobCSVs
    WITH  (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://sqlxpertise.blob.core.windows.net/democsvs',
        CREDENTIAL = DemoAzureCredential
    );

Step 4: Test External Data source connection using OPENROWSET command

SELECT * FROM OPENROWSET(
   BULK  'DemoProducts.txt',
   DATA_SOURCE = 'DemoAzureBlobCSVs',
   SINGLE_CLOB) AS DataFile;

Sample output – This is based on the input file I used and will vary based on the input file you uploaded to Azure Blob Storage

capture20171024154723249

Step 5: Import Data to table

CREATE TABLE tmpProducts
(
	[productid] [int] IDENTITY(1,1) NOT NULL,
	[productname] [nvarchar](40) NOT NULL,
	[supplierid] [int] NOT NULL,
	[categoryid] [int] NOT NULL,
	[unitprice] [money] NOT NULL,
	[discontinued] [bit] NOT NULL,
 ) 

BULK INSERT tmpProducts
    FROM 'DemoProducts.txt'
    WITH (
        DATA_SOURCE = 'DemoAzureBlobCSVs',
        FORMAT = 'CSV'
    );

Output

capture20171024155032516

Once data is successfully imported, let us review the data imported

 SELECT productid,
        productname,
        supplierid,
        categoryid,
        unitprice,
        discontinued FROM dbo.tmpProducts

Output

capture20171024155144082

Note: BULK INSERT from an external data source works only for permanent tables, not temporary tables. You will get the error message below if you try importing into a temporary table

CREATE TABLE #tmpProducts
(
	[productid] [int] IDENTITY(1,1) NOT NULL,
	[productname] [nvarchar](40) NOT NULL,
	[supplierid] [int] NOT NULL,
	[categoryid] [int] NOT NULL,
	[unitprice] [money] NOT NULL,
	[discontinued] [bit] NOT NULL,
 ) 

 BULK INSERT #tmpProducts
	FROM 'DemoProducts.txt'
	WITH (DATA_SOURCE = 'DemoAzureBlobCSVs', FORMAT = 'CSV');

Error Message

Msg 12703, Level 16, State 1, Line 57
Referenced external data source “DemoAzureBlobCSVs” not found.

Yes, the message is misleading, but I am sure the Microsoft SQL Server team will realize this and change it.
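A simple workaround for the limitation is to bulk insert into a permanent staging table first and then copy the rows into the temporary table; a sketch using the tables from this post:

```sql
-- Load into the permanent table first (DATA_SOURCE works here)...
BULK INSERT tmpProducts
    FROM 'DemoProducts.txt'
    WITH (DATA_SOURCE = 'DemoAzureBlobCSVs', FORMAT = 'CSV');

-- ...then copy the rows into the temporary table
SELECT productid, productname, supplierid, categoryid, unitprice, discontinued
INTO #tmpProducts
FROM dbo.tmpProducts;
```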

Hope you find this post helpful !!!

Temporal Tables – Part 5- Simple Guide to implement History Tables using SQL Server 2016/2017– Storing History data in Azure using Stretch Database

October 25, 2017

In previous posts, we saw

Part 1 – How to create Temporal Tables

Part 2 – Accessing History data from Temporal Tables

Part 3 – Modifying Temporal Tables, Disabling System versioning and why we need to disable it

Part 4 – Managing Temporal Tables and History Data

In this post we will see how to store the history data in Azure using Stretch database

Step 1: Right click on Database name and click on Tasks –> Stretch –> Enable

capture20171023145111550

Step 2: Select the tables which you would like to enable Stretch

capture20171023145132469

Step 3: Click on “Entire Table” hyperlink in Migrate column to customize the criteria for Stretch, by default the Entire table will be stored in SQL Azure

Please specify the name of the Stretch criteria, select the column on which you want to apply the filter criteria, select the condition, and then specify the value

capture20171023145208563

capture20171023143520236

After specifying the criteria, click on “Check” to validate; once you see “Success”, please click “Done” to continue

Step 4:  Please review the selected tables and click “Next” to continue

capture20171023145217186

Step 5: Please sign in to your Azure account if you are not already signed in, then select the Subscription and Azure region

We can create a new SQL Azure server or use an existing server, in this demo I am using an existing server

After providing details, click “Next” to continue

capture20171023145308669

Step 6: Please provide the public IP address range for which you need to enable access. If you have only one public IP address, please specify it for both From and To, and then click Next to proceed

FYI, If you created a new server, by default this step will enable the Firewall for Azure services

capture20171023145338248

Step 7: Review the selected settings and also review the estimated pricing; this will change based on the number of tables and the size of data you are planning to store in SQL Azure

Please click “Finish” to start the Stretch enabling process

capture20171023145344372

We can review the Stretch deployment process in the screenshot below; based on your selections, the steps shown may differ

capture20171023145350517

Once all the steps are completed, we can see the status as “Passed”. If you want to see verbose details on what happened in each step, review the logs by clicking on the hyperlink; this is very helpful when an error occurs

capture20171023145653549
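The same configuration can also be done in T-SQL; a rough sketch, where the Azure server name, credential and filter-function names are placeholders you would replace with your own (the inline table-valued function used as the filter predicate must be created beforehand):

```sql
-- Allow Stretch on the instance
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;

-- Link the database to the Azure server (credential created beforehand)
ALTER DATABASE SQL2017_DEMO
    SET REMOTE_DATA_ARCHIVE = ON
        (SERVER = 'yourstretchserver.database.windows.net',
         CREDENTIAL = [StretchCredential]);

-- Stretch the history table; the inline TVF implements the filter criteria
ALTER TABLE dbo.DemoHistory
    SET (REMOTE_DATA_ARCHIVE = ON
            (FILTER_PREDICATE = dbo.fn_stretchpredicate(SysEndTime),
             MIGRATION_STATE = OUTBOUND));
```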

How to monitor Stretch databases?

We can now see the Icon for the database has changed

capture20171023145727577

To monitor the progress of Stretch, Please right click on Database Name –> Tasks –> Stretch –> Monitor

capture20171023145943961

In Monitor window, we can see the Source server, Target Azure server, Database and list of tables being stretched and number of rows uploaded

capture20171023145858728
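The same progress can be queried with the Stretch (remote data archive) DMVs; a sketch, with column names as of SQL Server 2016/2017:

```sql
-- Recent migration batches: rows moved and any errors per table
SELECT table_id, migrated_rows, start_time_utc, end_time_utc, error_number
FROM sys.dm_db_rda_migration_status
ORDER BY end_time_utc DESC;

-- Which tables are stretched, and their remote table names
SELECT OBJECT_NAME(object_id) AS table_name,
       remote_table_name,
       migration_direction_desc
FROM sys.remote_data_archive_tables;
```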

We can also connect to the SQL Azure server and access the history data as shown below

capture20171023150934383

capture20171023150943585

capture20171023151039526

How to disable Stretch for a particular table / database ?

To disable Stretch for a table, right click on table name –> Stretch –> Disable

You have 2 options

1. Bring data back from Azure – This will download all the data from SQL Azure and update your local server

2. Leave data in Azure – This will leave the data as is in SQL Azure server and new data will be stored locally

Please choose the option based on your requirement

capture20171023150056965

In this demo, we are using “Bring data back from Azure” option, Please click “Yes” to confirm the disable process

capture20171023151116898

Depending on the size of the data, the process will take some time to run, and you will see the message below

capture20171023151124884

After disabling Stretch on all the tables, we can disable the Stretch at the database level by right clicking on Database name –> Tasks –> Stretch –> Disable

capture20171023151619546

FYI, if you try disabling Stretch on your database before disabling Stretch on the tables, you will see the validation message below

capture20171023150002544

Once the Stretch disable process is completed, we will see the completion message below

capture20171023151629941

FYI, you can use the Stretch option not just for temporal history tables; you can use it for any other tables where you want to store historical or rarely accessed data remotely for reference or compliance purposes

Hope you find this post helpful !!!

Temporal Tables–Simple Guide to implement History Tables using SQL Server 2016/2017–Part 4–Managing Temporal Tables and History Data

October 24, 2017 1 comment

In previous posts, we saw

Part 1 – How to create Temporal Tables

Part 2 – Accessing History data from Temporal Tables

Part 3 – Modifying Temporal Tables, Disabling System versioning and why we need to disable it

In this post we will see how to manage Temporal tables and History

To get list of Temporal Tables – Snippet 1

SELECT name,
       object_id,
       temporal_type,
       temporal_type_desc,
       history_table_id,
       is_remote_data_archive_enabled,
       is_external,
       history_retention_period,
       history_retention_period_unit,
       history_retention_period_unit_desc
FROM sys.tables
WHERE temporal_type > 0
ORDER BY name;

 

Output

capture20171023102432733

If we see the output, we can identify the following

1. Type of Temporal Table – System Versioned or History table using “temporal_type” and “temporal_type_desc” columns

2. For System Versioned tables, we can also see the History Table ID

3. History Retention Period can be determined using history_retention_period, history_retention_period_unit and history_retention_period_unit_desc

history_retention_period = -1 = History is kept forever

A positive integer = the number of Days/Weeks/Months/Years

history_retention_period_unit will have one of these values

-1 = INFINITE

3 = DAY(S)

4 = WEEK(S)

5 = MONTH(S)

6 = YEAR(S)

By default, when we create temporal tables without specifying a retention period, they are created with an infinite retention period

Modifying Retention Period

Let us see how we can change Retention Period for an existing Temporal Table

Code to alter table to set Retention Period for 2 days – Snippet 2

ALTER TABLE dbo.Demo
	SET (SYSTEM_VERSIONING = ON
			(
				HISTORY_TABLE = dbo.DemoHistory,
				HISTORY_RETENTION_PERIOD = 2 DAYS
			)
		)

If we run query from Snippet 1, we will see below output

capture20171023103757653

To get list of Databases with Retention Enabled – Snippet 3

Once a retention period is enabled for any table in the database, SQL Server sets the is_temporal_history_retention_enabled flag to 1 for the database by default and also creates background jobs to clean up the history data

SELECT name,
       database_id,
       is_temporal_history_retention_enabled
FROM sys.databases
WHERE is_temporal_history_retention_enabled = 1
ORDER BY name;

Output

capture20171023104055758

If you want to disable automatic cleanup task, Please use below command

Query to disable cleanup task – Snippet 4

ALTER DATABASE SQL2017_DEMO
     SET TEMPORAL_HISTORY_RETENTION OFF

If we run the query in Snippet 3, we can see the changes

capture20171023104605524

How to clean up History tables manually?

If you prefer to clean up the History tables manually rather than rely on the automated task, please follow the steps below

1. Make your application offline

2. Run query to disable System versioning on required tables

3. Run query to delete data from History using SysEndTime as Filter criteria

4. Run query to enable System versioning for tables disabled in Step 2

5. Make your application online

If your application has to be online 24/7, the better option is to use the built-in automated cleanup task
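Steps 2–4 above can be sketched as follows, using the Demo/DemoHistory tables from the earlier snippets and a 90-day cutoff as an example:

```sql
-- 2. Disable system versioning so the history table can be modified
ALTER TABLE dbo.Demo SET (SYSTEM_VERSIONING = OFF);

-- 3. Delete history rows older than the retention cutoff (90 days here)
DELETE FROM dbo.DemoHistory
WHERE SysEndTime < DATEADD(DAY, -90, SYSUTCDATETIME());

-- 4. Re-enable system versioning against the same history table
ALTER TABLE dbo.Demo
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.DemoHistory));
```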

Hope you find this post helpful !!!
