Azure Cost Management – SQL Database Example

A key selling point of public cloud computing is that you “pay only for what you use”. For many Microsoft Azure resources, this is easily accomplished via the Azure Portal. For example, auto-scaling of virtual machine scale sets can be configured through the portal to adjust the number of instances to match load or a schedule. For other Azure resources or more complex workloads, it is more involved: you will need to write code to provide the intended capability. In this example we will explore automating the deployment of the resources needed to complete acceptance testing.


You are leveraging an Azure SQL elastic pool for your production databases. To test rollouts of new functionality you need an exact copy of the production database environment. You also need to minimize cost.

Production Resources

SQL Servers

  • prodexample

SQL Elastic Pool

  • prodpool

SQL Databases

  • db1
  • db2
  • db3


The approach is to create the testing environment at the moment it is needed, as an exact copy of running production. An Azure SQL server will be created and remain in place so that the server name stays the same (and therefore SQL connection strings won’t have to be updated before completing tests). There is no cost for a SQL server with no databases, so costs will only be incurred during the actual testing.

Each time the test environment is needed, a PowerShell script will be run to:

  • Create a SQL elastic pool named testpool on the testexample SQL server

  • Copy each database from the prodpool elastic pool to the testpool elastic pool on the testexample SQL server


The script leverages the Microsoft Azure PowerShell module and goes through the following steps:

  • Get information on the production elastic pool so we know what size and edition to create for the test pool and what databases to copy

$ProdPool = Get-AzureRmSqlElasticPool -ServerName "prodexample" `
-ResourceGroupName "prod" `
-ElasticPoolName "prodpool"
$ProdPoolDBs = $ProdPool | Get-AzureRmSqlElasticPoolDatabase

  • Create the test elastic pool

$TestPool = New-AzureRmSqlElasticPool -ResourceGroupName "testsql" `
-ServerName "testexample" `
-ElasticPoolName "testpool" `
-Edition $ProdPool.Edition `
-Dtu $ProdPool.Dtu

  • Copy each database from the production elastic pool to the testing elastic pool

foreach ($DB in $ProdPoolDBs) {
New-AzureRmSqlDatabaseCopy -ResourceGroupName "prod" `
-ServerName "prodexample" `
-DatabaseName $DB.DatabaseName `
-CopyResourceGroupName "testsql" `
-CopyServerName "testexample" `
-CopyDatabaseName $DB.DatabaseName `
-ElasticPoolName "testpool"
}

When the testing is completed, the test environment can be deprovisioned by removing the databases and then removing the test elastic pool.
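As a sketch, the teardown can be scripted with the same module. This assumes the testsql resource group and testexample server names used above; Remove-AzureRmSqlDatabase and Remove-AzureRmSqlElasticPool are the corresponding removal cmdlets:

```powershell
# Sketch: remove each copied database from the test server, then the pool itself.
# Assumes the same names as above ("testsql" resource group, "testexample" server).
$TestPoolDBs = Get-AzureRmSqlElasticPool -ResourceGroupName "testsql" `
    -ServerName "testexample" `
    -ElasticPoolName "testpool" |
    Get-AzureRmSqlElasticPoolDatabase

foreach ($DB in $TestPoolDBs) {
    Remove-AzureRmSqlDatabase -ResourceGroupName "testsql" `
        -ServerName "testexample" `
        -DatabaseName $DB.DatabaseName -Force
}

Remove-AzureRmSqlElasticPool -ResourceGroupName "testsql" `
    -ServerName "testexample" `
    -ElasticPoolName "testpool" -Force
```

The now-empty SQL server then remains in place, at no cost, ready for the next test run.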

The result is that you pay nothing until your test environment is needed, and you can provision that environment as an exact replica of production in minutes (depending on the number and size of the databases).

Cloud 9 maintains scripts to automate this scenario and many other Azure resource management scenarios. Please contact us at (855) 225-6839 or by email for information on our services and capabilities.


Ken Channon
Principal, Cloud Infrastructure Architect
Cloud 9 Infosystems, Inc.



Posted in Uncategorized | Leave a comment

Let Microsoft Flow handle automatic notification among your team when Power BI data driven alert is triggered

Microsoft Flow has been introduced to automate the workflows between your favorite applications. With flows you can collect data, send/get notifications, access files and more.

Relying on humans to track changes in a dataset can slow the notification process. Communication within the team does not happen immediately when the dataset changes, and this breakdown keeps essential information from reaching the team in a timely manner. Today I am creating a flow that uses Power BI to send an alert to my sales team when sales cross a threshold value.

Every time company sales fall below the threshold value, the sales team should be notified immediately. The team can then address the problem at an early stage rather than being surprised by the fluctuations later. Microsoft Flow can communicate with a huge set of applications, including Power BI and Office 365 Outlook, so this problem can be solved with a flow: one that sends out an email with ample information, immediately and automatically, when company sales fall below the threshold value.


To start off with, I have already created a gauge in Power BI Service that is connected to live stream sales data.


  • Set an alert by opening the gauge’s menu and clicking the bell icon, which opens the alert pane. You can add as many alerts as needed. Click Add alert, set the alert name, condition, and frequency as required, and you have successfully created an alert.
  • Now that the alert is created, it’s time to set up the flow. Open Microsoft Flow and create a new flow.
  • Start by adding Power BI to the flow; to do this, search for Power BI. Since I want to notify my team when a data driven alert is triggered, I will select the “When a data driven alert is triggered” trigger.
  • Set the Alert ID by picking from the drop-down menu, which lists all the alerts configured in the Power BI service.
  • Add an action that sends out an email to the sales team with ample information.


I am using the Office 365 Outlook “Send an email” action to communicate with the sales team; there are around 30 actions under Office 365 Outlook to pick from. Enter the email address(es) the alert should be sent to. A side pane pops up while you fill out the email details; it surfaces the information from the Power BI alert and lets you add it to the flow with a single click. Hit Create flow and the flow is created and starts working immediately.


The alert monitoring team can now sit back and relax without worrying about manually checking sales data on a regular basis. With a few simple steps, an automated flow to email the sales team was created. With a good understanding of the flow you want to set up, it is easy to use Microsoft Flow to create automated workflows.



PowerShell and Azure REST API

List Blob Encryption Status

As you write PowerShell scripts to automate management of your Azure environment, you may find limitations in the Azure SDK cmdlets. For example, I recently enabled encryption on a customer’s storage account. Since it only encrypts newly written blobs, I wanted to provide a list of which blobs were encrypted and which were not. The cmdlets do not expose the blob encryption status even though it is available via the Azure REST API (i.e. the ServerEncrypted property of the List Blobs operation).

I wrote a PowerShell script to make a request to list the blobs in a container and then display the blobs based upon encryption status (green for encrypted, red for not encrypted).

Here’s a breakdown of the script:

1. Set the Storage Account Name, Key and Container

$StorageAccount = "<storage account name>"
$ContainerName = "<container name>"
$Key = "<access key>"

2. Prepare the request

a. Create the string to sign

$date = [System.DateTime]::UtcNow.ToString("R")
$stringToSign = "GET`n`n`n`n`n`n`n`n`n`n`n`nx-ms-date:$date`nx-ms-version:2016-05-31`n/$StorageAccount/$ContainerName`ncomp:list`nrestype:container"

This string is for a List Blobs operation using shared key authorization. The format of the string is described in the Azure Storage documentation on authorizing with shared key.

b. Create a hash object

 $sharedKey = [System.Convert]::FromBase64String($Key)
 $hasher = New-Object System.Security.Cryptography.HMACSHA256
 $hasher.Key = $sharedKey

c. Sign the string

 $signedSignature = [System.Convert]::ToBase64String($hasher.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($stringToSign)))

d. Create the headers

$authHeader = "SharedKey ${StorageAccount}:$signedSignature"
$headers = @{"x-ms-date"=$date; "x-ms-version"="2016-05-31"; "Authorization"=$authHeader}

e. Create the URI

 $URI = "https://$StorageAccount.blob.core.windows.net/$ContainerName" + "?restype=container&comp=list"

3. Make the request

 $FileName = "bloblist.xml"
 Invoke-RestMethod -Method GET -Uri $URI -Headers $headers -OutFile $FileName

4. Parse output to display encryption status

 [xml]$BlobList = Get-Content $FileName
 foreach ($blob in $BlobList.EnumerationResults.Blobs.Blob) {
     if ($blob.Properties.ServerEncrypted -eq $False) {
         Write-Host $blob.Name -ForegroundColor Red
     }
     else {
         Write-Host $blob.Name -ForegroundColor Green
     }
 }
Example output from the script:


Ken Channon
Cloud 9 Infosystems, Inc.


Microsoft Azure has overtaken AWS as public cloud provider of choice

A new survey of IT professionals shows Microsoft Azure has overtaken Amazon Web Services (AWS) as the public cloud provider of choice, although there is considerable overlap.

The survey was commissioned by Sumo Logic, a data analytics provider, and was performed by UBM Research. It surveyed 230 IT professionals from companies with 500 or more employees.

The survey found 80 percent of enterprises currently use or plan to use at least one public cloud provider. And given the figures, a large number are clearly using more than one: around two-thirds (66 percent) of respondents said they use Azure, while 55 percent said they use AWS. Salesforce App Cloud comes in third at 28 percent, IBM fourth at 23 percent, and Google fifth at 20 percent.

Read More


Microsoft Gaining Ground As Alternative To Amazon Web Services

Microsoft’s (MSFT) Azure cloud computing platform is gaining ground as an alternative to Amazon.com’s (AMZN) Amazon Web Services, one analyst says.

“Heavy investments from Microsoft now appear to be paying off this year, based on positive feedback from channel partners that were more skeptical of Azure six months ago,” Pacific Crest Securities analyst Brent Bracelin said in a report late Thursday. “One partner called 2017 ‘The Year of Azure’ based on increasing level of interest and activity.”

Bracelin reiterated his overweight rating on Microsoft stock with a price target of 70.

Microsoft was up 0.6%, near 65, in afternoon trading on the stock market today. The stock has formed a flat base over the last seven weeks with a buy point of 66.01.

Microsoft is doing especially well with large enterprise customers because it offers products that span on-premise and cloud environments, Bracelin said.


The migration of enterprise information technology from on-premise hardware to cloud-based services is still in the early innings, he said.

“While AWS has a multiyear lead in IaaS (infrastructure as a service), we see higher potential for Azure to become an upside lever for Microsoft this year and argue it has unparalleled product breadth across both applications and infrastructure within on-premise and public cloud environments,” Bracelin said.

Azure has an estimated $2.5 billion annual run rate. Microsoft’s overall commercial cloud business, which also includes Office 365 and Dynamics Online, is on a $14 billion run rate, he said.

Those figures are small compared with total annual tech spending, including communications, of $1.3 trillion, Bracelin said.

“The shift to cloud could be a decade-long tailwind for Microsoft, AWS and Google,” he said. Alphabet (GOOGL)-owned Google is a distant third in the cloud computing market.

Read More



The cloud comes through in crises, drones detect diseases and Cortana Intelligence elevates IoT — Weekend Reading, Feb. 3

We may be in the chilly depths of February, but there’s no winter hibernation going on at Microsoft. Join us for an overview of some of the big happenings over the past week.


Microsoft breaks through in the Gartner Magic Quadrant for Business Intelligence and Analytics Platforms

Today I’m thrilled to share that for the second year in a row, Microsoft has been positioned furthest to the right for completeness of vision within the Leaders quadrant of Gartner’s 2017 Magic Quadrant for Business Intelligence and Analytics Platforms – the 10th consecutive year Microsoft has been positioned as a leader.

We’re humbled by this recognition for the innovation we’ve delivered with Microsoft Power BI in the past year, including significant growth in both the vision axis and execution axis since the 2016 report. But, more importantly, we’re encouraged by the progress we’ve made as a community in executing against the ambitious goal set when Power BI was made generally available only a short time ago in July 2015: Provide BI to more people than ever before, across all roles and disciplines within organizations.

Power BI is the modern business intelligence solution you can bet on. We will continue to deliver innovation and value with precision and speed. The mantra that drives us is “five minutes to wow” – our relentless focus to enable a user to sign up for Power BI within five seconds and get business value from the service within five minutes. Customers around the world are realizing this vision and capitalizing on the promise of self-service BI by making it a reality at massive scale.

Read More
