PowerShell, Azure, Azure SQL & Power BI working in Harmony…

Power BI…

As most of you are aware by now, I do most of my day-to-day tasks with PowerShell. I have recently spent a lot of time on a project where I gather data from an API with the help of PowerShell and the .NET Framework. I then store that data in an Azure Storage container for redundancy purposes. Once everything is prepared, I also upload the API data, in JSON format, to an Azure SQL database. A colleague of mine then turns that data into opportunities and provides interactive visualisation and business intelligence capabilities with Power BI.

  • First, I connect to the Azure Automation account with the help of my Easy Authentication method and the Az module. I then gather information from the Variables and Credentials sections in Azure so that I can connect to both the SQL Server and the Azure Storage account. See the sketch below.
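A minimal sketch of that step, assuming the runbook authenticates with a managed identity and that the variable and credential names below are placeholders rather than the actual assets in my Automation account:

# Authenticate inside the Azure Automation runbook (assumes a system-assigned managed identity)
Connect-AzAccount -Identity | Out-Null
# Pull configuration from the Automation account's Variables and Credentials sections
# (all names below are hypothetical placeholders)
$StorageAccountName   = Get-AutomationVariable -Name 'StorageAccountName'
$StorageContainerName = Get-AutomationVariable -Name 'StorageContainerName'
$SqlServerName        = Get-AutomationVariable -Name 'SqlServerName'
$SqlCredential        = Get-AutomationPSCredential -Name 'SqlServerCredential'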
  • With a little help from the error-handling code my colleague wrote – Invoke-CodeInWrapper – and some changes to make it fit this purpose, I’m able to reliably get the data from an API whose connection is otherwise not entirely trustworthy. A sketch in the same spirit follows.
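Invoke-CodeInWrapper itself is my colleague’s function and isn’t shown here, but a hedged sketch of a retry wrapper in the same spirit might look like this (the function name, parameters and retry policy are my own assumptions):

# A simplified retry wrapper in the spirit of Invoke-CodeInWrapper (not the original implementation)
function Invoke-WithRetry {
    param (
        [scriptblock]$ScriptBlock,
        [int]$MaxAttempts = 5,
        [int]$DelaySeconds = 10
    )
    for ($Attempt = 1; $Attempt -le $MaxAttempts; $Attempt++) {
        try {
            return & $ScriptBlock
        }
        catch {
            if ($Attempt -eq $MaxAttempts) { throw }
            Write-Warning "Attempt $Attempt failed: $($_.Exception.Message). Retrying in $DelaySeconds seconds..."
            Start-Sleep -Seconds $DelaySeconds
        }
    }
}

# Example: fetch one page of API data through the wrapper ($ApiUri and $ApiHeaders are placeholders)
$RawContent = Invoke-WithRetry -ScriptBlock {
    Invoke-RestMethod -Uri $ApiUri -Headers $ApiHeaders -Method Get
}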
  • Once I have that, I use Set-AzStorageBlobContent to save the information to the Azure Storage account. Everything happens in a foreach loop that contains a while loop, so that I can page through and gather all the data from several different API entities. See the code below.
# Write the raw API response to a local file, then upload it as a blob to the storage container
$File = New-Item -ItemType File -Name $BlobName
$RawContent | Out-File -FilePath $File -Append
$AzureStorageBlob = Set-AzStorageBlobContent -File $File -Container $StorageContainerName -Blob $BlobName -Force -Context $DestinationContext
  • If everything goes accordingly, I run an INSERT INTO query against the Azure SQL database using a function I wrote a while back, Handle-SQL. It is a .NET Framework-based function that uses the System.Data.SqlClient namespace and its SqlConnection, SqlCommand and SqlDataAdapter classes. A rough sketch of the idea follows.
    (Maybe I’ll write about that function in a later blog post – if so, I’ll try to remember to link that post here.)
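Handle-SQL itself isn’t published here, but a minimal sketch of the same idea – running an INSERT through System.Data.SqlClient from PowerShell – could look like this (the server, database, table and column names are placeholders, reusing the variable names from the earlier sketch):

# Minimal System.Data.SqlClient sketch (not the actual Handle-SQL function; all names are placeholders)
$ConnectionString = "Server=tcp:$SqlServerName,1433;Database=ApiData;User ID=$($SqlCredential.UserName);Password=$($SqlCredential.GetNetworkCredential().Password);Encrypt=True;"

$SqlConnection = New-Object System.Data.SqlClient.SqlConnection($ConnectionString)
$SqlCommand    = New-Object System.Data.SqlClient.SqlCommand
$SqlCommand.Connection  = $SqlConnection
$SqlCommand.CommandText = "INSERT INTO dbo.ApiJson (EntityName, JsonPayload) VALUES (@Entity, @Json)"
$null = $SqlCommand.Parameters.AddWithValue('@Entity', $EntityName)
$null = $SqlCommand.Parameters.AddWithValue('@Json', $RawContent)

# Open the connection, run the insert, and always close the connection afterwards
$SqlConnection.Open()
try     { $null = $SqlCommand.ExecuteNonQuery() }
finally { $SqlConnection.Close() }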
  • When we’ve gone through all the different API entities, and everything has been uploaded to the SQL table in Azure, I run an EXECUTE query against the SQL database to “clean” the JSON information that was just uploaded, roughly as sketched below.
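The cleaning itself lives on the database side; from PowerShell the EXECUTE query is just another command, roughly like this (the procedure name is hypothetical, and the connection is reused from the previous sketch):

# Call the clean-up routine once all entities have been uploaded (procedure name is a placeholder)
$CleanCommand = New-Object System.Data.SqlClient.SqlCommand
$CleanCommand.Connection  = $SqlConnection
$CleanCommand.CommandText = "EXECUTE dbo.CleanApiJson"

$SqlConnection.Open()
try     { $null = $CleanCommand.ExecuteNonQuery() }
finally { $SqlConnection.Close() }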

Afterword

I hope you found this article somewhat useful. I write these posts mostly to get people’s heads spinning when it comes to automation, so if you’ve had a similar idea for something while reading this article, I see that as a success. Please share those thoughts and ideas in the comment section below. Even though the above might not count as an “automation job” in the traditional sense, it is at its core a job that runs each day, gathers data from another source, and makes it possible to show the history of that information in a beautiful and interactive visualisation within Power BI.
