Generate SAS For Backup to Azure Storage

#Import-Module dbatools
#Import-Module Az

# Create a new Container or use an existing one
$ContainerName = "devbackup"
# Create a new Resource Group or use an existing one
$ResourceGroupName = "rgdba"
# Create a new Storage Account or use an existing one
$AccountName = "backupacount"
# Choose data center
$Location = "centralus"
# Choose type
$SKUName = "Standard_GRS"

# Connect-AzAccount
# WARNING: To sign in, use a web browser to open the page and enter the code shown to authenticate.
# Check you're in the correct subscription

Get-AzResourceGroup -Name $ResourceGroupName -ErrorVariable notPresent -ErrorAction SilentlyContinue
if ($notPresent) {
    # Resource group doesn't exist
    New-AzResourceGroup -Name $ResourceGroupName -Location $Location
}

# Use splatting
$NewStorageAccount = @{
    ResourceGroupName      = $ResourceGroupName
    AccountName            = $AccountName
    Location               = $Location
    SKUName                = $SKUName
    Kind                   = "StorageV2"
    AccessTier             = "Hot"
    EnableHttpsTrafficOnly = $true
}

$sa = Get-AzStorageAccount -ResourceGroupName $ResourceGroupName | Where-Object StorageAccountName -eq $AccountName
if ($null -eq $sa) {
    "Creating Storage Account $AccountName"
    $sa = New-AzStorageAccount @NewStorageAccount
}

$StorageKeys = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $AccountName
$StorageContext = New-AzStorageContext -StorageAccountName $AccountName -StorageAccountKey $StorageKeys[0].Value

Get-AzStorageContainer -Context $StorageContext -Name $ContainerName -ErrorVariable notPresent -ErrorAction SilentlyContinue
if ($notPresent) {
    # "Creating New Storage Container $ContainerName"
    $sc = New-AzStorageContainer -Context $StorageContext -Name $ContainerName
} else {
    $sc = Get-AzStorageContainer -Context $StorageContext -Name $ContainerName
}

# Create a Shared Access Policy giving (r)ead, (w)rite, (l)ist and (d)elete permissions for 10 years from now
$NewSharedAccessPolicy = @{
    Context    = $StorageContext
    Policy     = $StorageContext.StorageAccountName + "sasPolicy10y"
    Container  = $ContainerName
    ExpiryTime = (Get-Date).ToUniversalTime().AddYears(10)
    Permission = "rwld"
}

Get-AzStorageContainerStoredAccessPolicy -Context $StorageContext -Container $ContainerName -ErrorVariable notPresent -ErrorAction SilentlyContinue
if ($notPresent) {
    # Stored access policy doesn't exist
    New-AzStorageContainerStoredAccessPolicy @NewSharedAccessPolicy -ErrorAction SilentlyContinue
}

# Get the Shared Access Token
$Sas = New-AzStorageContainerSASToken -Policy $NewSharedAccessPolicy.Policy -Context $StorageContext -Name $ContainerName

# We need the URL to the blob storage container we've created:
$ContainerUrl = $sc.CloudBlobContainer.Uri.AbsoluteUri

# Build the CREATE CREDENTIAL statement SQL Server needs for backup to URL
# (assumed construction; SAS-based credentials use IDENTITY = 'SHARED ACCESS SIGNATURE'
# with any leading '?' stripped from the token)
$SasSql = "CREATE CREDENTIAL [$ContainerUrl] WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = '$($Sas.TrimStart('?'))';"
Write-Host $SasSql
# Invoke-DbaQuery -SqlInstance instance1,instance2 -Database Master -Query $SasSql

Managed_backup function fn_is_master_switch_on() was missing

While working with managed backup on SQL Server 2019 to back up to Azure,
I ran into a situation where I could not find this listed system scalar function:


From MSDN:
But it occasionally appears in the IntelliSense context menu.

The function is returned by this query on most of my instances:

SELECT name, type_desc
FROM msdb.sys.objects
WHERE SCHEMA_ID = SCHEMA_ID('managed_backup')
and name = 'fn_is_master_switch_on'

But I cannot find it on a few machines.

The related stored procedures are working fine.


After a couple of hours, I can see the function again:

select msdb.managed_backup.fn_is_master_switch_on()

Custom Split With JSON and String_Agg function

SQL Server 2017 or 2019
Create table test (Col1 varchar(900))
Insert into test values
('sql4 13434 22 test 39480'),
('sql2 DEF 39 tests'),
('data 123 258 256 tests 4587'),
('sql2 FED tests')
;with mycte as (
select Col1, [value], Cast([key] as int) k,
case when lead([value]) over(partition by Col1 order by [key]) like '%test%'
          and try_cast([value] as int) is not null
     then 1
     when [value] like '%test%'
          and lag(try_cast([value] as int)) over(partition by Col1 order by [key]) is null
     then 1
     else 0 end newkey
from TEST
cross apply openjson('["'+(replace(Col1,' ','","')+'"]'))
)
,mycte2 as (
select Col1, k, [value]
, Sum(newkey) over(partition by Col1 order by k) grp
from mycte
)
,mycte3 as (
select Col1, grp
, (case when grp=0 then string_agg([value],' ') WITHIN GROUP (ORDER BY k) else null end) col1a
, (case when grp=1 then string_agg([value],' ') WITHIN GROUP (ORDER BY k) else null end) col1b
from mycte2
group by Col1, grp
)
select Col1, max(col1a) col1a, max(col1b) col1b
from mycte3
group by Col1
order by 1,2
drop table test

Custom Schedule for an Agent Job

When you need to run an Agent job on a quarterly schedule, you will find that there is no template for it in the UI.

One way to reach this goal is to add a first step that checks the month before the normal job step(s) run.

You can modify your existing job to add a new start step that uses the following T-SQL to raise an error by design.

If the run date is not in a quarter-start month, the code raises an error.

Configure the step to continue to the next step on success, and to quit reporting success on failure.

The other steps in the job will then not run when it is off schedule.

Here is the T-SQL code for step one to check for quarter-start months:

declare @rundt datetime = getdate();
If month(@rundt) not in (1,4,7,10)
THROW 51000, 'Job cannot be executed in this month', 1;
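The same month gate can be sketched outside SQL Agent too, for jobs driven by cron or similar schedulers. This is a hypothetical shell equivalent, not part of the Agent setup:

```shell
# Sketch: allow work only in quarter-start months, mirroring the T-SQL THROW gate.
is_quarter_start() {
  case "$1" in
    1|4|7|10) return 0 ;;   # January, April, July, October
    *)        return 1 ;;
  esac
}

m=$(date +%m)
m=${m#0}                    # strip a leading zero: "07" -> "7"
if is_quarter_start "$m"; then
  echo "Quarter-start month ($m): continue with the job"
else
  echo "Month $m is not a quarter start: stop here"
fi
```

With cron you could also schedule quarterly directly (`0 2 1 1,4,7,10 *`), which is exactly the template the SQL Agent UI is missing.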

Use JSON to Simulate String_Agg in SQL Server 2016

Declare @table1
table(TableId int, SId int, Data varchar(200), DataOwner int)

insert into @table1 values
(1, 10, 'aaaa', 1001)
,(2, 10, 'aaaa', 1000)
,(3, 20, 'bbbb', 1000)
,(4, 10, 'ccc', 2000)

;with mycte as (
select Distinct SId, Data, DataOwner
from @table1
)
,mycte2 as (
SELECT SId, Data,
(SELECT Cast(DataOwner as varchar(30)) DataOwner
 FROM mycte t2 WHERE t2.SId = t1.SId and t2.Data = t1.Data
 FOR JSON PATH, ROOT('Vals')) jsCol
FROM mycte t1
Group by SId, Data
)
select SId, Data,
Concat(JSON_VALUE(jsCol,'$.Vals[0].DataOwner'),
','+JSON_VALUE(jsCol,'$.Vals[1].DataOwner'),
','+JSON_VALUE(jsCol,'$.Vals[2].DataOwner'),
','+JSON_VALUE(jsCol,'$.Vals[3].DataOwner')) DataOwner
from mycte2

SUM SUM OVER Sample Code

create table WeekSales(dt datetime, Amount decimal(8,2))
insert into WeekSales values(getdate()-15, 100),(getdate()+0, 100)
,(getdate()-14, 100),(getdate()-23, 100),(getdate()-3, 100),(getdate()-3, 100)
,(getdate()-23, 100),(getdate()+41, 100),(getdate()+11, 100),(getdate()+1, 100)
,(getdate()-52, 100),(getdate()+10, 100),(getdate()+20, 100),(getdate()+0, 100)
,(getdate()-61, 100),(getdate()+53, 100),(getdate()+33, 100),(getdate()+3, 100)
,(getdate()+64, 100),(getdate()+43, 100),(getdate()+46, 100),(getdate()+6, 100)
,(getdate()+53, 100),(getdate()+33, 100),(getdate()+13, 100),(getdate()+12, 100)
,(getdate()+42, 100),(getdate()+23, 100),(getdate()+23, 100),(getdate()+24, 100)
,(getdate()+2, 100),(getdate()+4, 100),(getdate()+5, 100),(getdate()+6, 100)
,(getdate()+3, 100)

SELECT datename(weekday,dt) Day_Name
,SUM(Amount) Amt
,SUM(SUM(Amount)) OVER() AS sumsumTotal
,SUM(Amount)*1. / SUM(SUM(Amount)) OVER() [Weekdaypercent]
,SUM(SUM(Amount)) OVER(Order by datepart(weekday,dt)) sumsumRunningTotal
FROM WeekSales f
WHERE dt >= dateadd(day,-15,getdate()) and dt < dateadd(day,15,getdate())
GROUP BY datename(weekday,dt),datepart(weekday,dt)
drop table WeekSales
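The `SUM(SUM(Amount)) OVER (ORDER BY ...)` column is simply a running total of the per-group sums. The same mechanic, sketched with awk over made-up (day, amount) pairs:

```shell
# Running total over ordered rows, like SUM(...) OVER (ORDER BY ...).
printf '%s\n' "Mon 100" "Tue 200" "Wed 50" |
  awk '{ running += $2; print $1, $2, running }'
# prints:
# Mon 100 100
# Tue 200 300
# Wed 50 350
```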

How to Delete a Folder with Files from Bitbucket

From the Bitbucket repository menu on the left, select Source.
On the right, select the folder and expand it to see its files.

Select a file, click the arrow beside Edit, and select Delete >> Commit.
Repeat this process to delete each file.

From the repository menu on the left, select Source again.

Observe that the now-empty folder is automatically removed from the git repository: git tracks files, not folders.
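The UI route works, but the same cleanup can be done in a single commit from a local clone. A self-contained sketch in a throwaway repository (the folder and file names are made up for the demo):

```shell
# Demo: delete a folder with files in one commit using git directly.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git config user.email "dba@example.com" && git config user.name "Demo DBA"
mkdir olddocs && echo "obsolete" > olddocs/readme.txt
git add . && git commit -q -m "initial"
git rm -r -q olddocs                 # stage deletion of every file in the folder
git commit -q -m "Remove olddocs folder"
git ls-files                         # olddocs/readme.txt is gone
```

Because git tracks files rather than directories, removing the last file removes the folder, which is exactly what Bitbucket shows after the per-file commits.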

Parse Data with JSON

SQL Server 2016 or later:

Declare @rept table(Id int Identity(1,1), ReptDesc varchar(max))

Insert into @rept (ReptDesc) values
('Category: Projects; Accounting and Operations; Sales
Description: this is a transaction report'),
('Category: Technical, Accounting
Description:'),
('Category: Accounting'),
('Category: Accounting, Sales')

;with mycte as (
Select *, replace(replace(replace(replace(replace(replace(ReptDesc,char(10),''),char(13),''),
'Category:','{"Category":["'),',','","'),';','","'),'Description:','"], "Description":["')+'"]}' jsonCol
from @rept
)
select id
,c.value as Category
,d.value as Description
from mycte
CROSS APPLY OPENJSON(jsonCol, '$.Category') AS c
OUTER APPLY OPENJSON(jsonCol, '$.Description') AS d
id Category Description
1 Projects this is a transaction report
1 Accounting and Operations this is a transaction report
1 Sales this is a transaction report
2 Technical
2 Accounting
3 Accounting NULL
4 Accounting NULL
4 Sales NULL
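The chained REPLACE calls are only delimiter surgery: rewrite the markers into JSON syntax, then let OPENJSON do the splitting. The same rewrite for a single Category line, sketched with sed:

```shell
# Turn 'Category: a; b; c' into '{"Category":["a","b","c"]}'.
line='Category: Projects; Accounting and Operations; Sales'
json="$(printf '%s' "$line" | sed -e 's/Category: /{"Category":["/' -e 's/; /","/g')\"]}"
echo "$json"
# prints {"Category":["Projects","Accounting and Operations","Sales"]}
```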

Custom Reports from SSMS for SQL Server Assessment

It is our pleasure at the Iowa SQL Server Users Group (IASSUG) to have Dave Bland as the speaker for the Des Moines PASS August 10 meeting:

“How to do a Complete Server Assessment”

He created a series of custom SSRS reports that can be used from SSMS with a few clicks to access a wealth of information about your server and databases.

You can download the source reports from a zipped folder at Dave Bland’s blog site:

SQL Server Assessment Using SSMS and SSRS

After you download the file, unzip it to a local drive, for example the SSMS file folder, for easy access.

Launch SSMS and connect to an instance (server). Right-click on the instance name (or a database name), choose Reports > Custom Reports…, and browse to the files you unzipped.

Click on the first report, _LaunchingPage.rdl, and you are on the main report page.

Your journey starts here.

I will run these reports on a few new SQL Server 2019 servers I set up recently to view a lot of useful information in a very short time.

During the presentation, Dave walked through some interesting reports and their use cases. I have come across many of these scenarios in my DBA career, and I know Dave has done the SQL Server community a great favor with this handy report tool.

Thanks Dave!

Split with JSON in One Column

CREATE TABLE #TEST (OrderITEM varchar(200), CustomerId int, OrderDate datetime) -- definition inferred from the INSERT below

INSERT INTO #TEST(OrderITEM, CustomerId, OrderDate)
VALUES('A-100 A-105 C-100', 1, Getdate()-30),
('A-111 A-102 D-201', 2, Getdate()-5),
('D-400 G-10 F-220', 3, Getdate()-20),
('H-801 M-451 D-201', 1, Getdate()-3),
('F-701 A-102 D-201', 4, Getdate()-6),
('G-11 A-102 D-201', 5, Getdate()-10),
('B-710', 6, Getdate())
select CustomerId,OrderDate
,value as OrderITEM
from #TEST
cross apply openjson('["'+(replace(OrderITEM,' ','","')+'"]'))
DROP TABLE if exists #TEST
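The openjson split hinges on one string rewrite: each space becomes `","` and the whole value is wrapped in `["…"]`. The same construction outside SQL, sketched in shell:

```shell
# 'A-100 A-105 C-100' -> ["A-100","A-105","C-100"]
items='A-100 A-105 C-100'
json="[\"$(printf '%s' "$items" | sed 's/ /","/g')\"]"
echo "$json"
# prints ["A-100","A-105","C-100"]
```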