Tag Archives: Reporting

Querying Azure DevOps REST API with PowerShell

In previous posts we have talked about consuming Azure DevOps from PowerShell by way of the Azure CLI. In particular, my post titled Use PowerShell to consume your Azure CLI DevOps result set painted a rather frustrating picture of manipulating the tabular dataset returned by the Azure CLI. Furthermore, we are restricted to only those commands implemented by the Azure CLI's Azure DevOps add-in, which (as will become increasingly obvious) is limited to say the least.

There is a better way to query Azure DevOps – the Azure DevOps REST API to the rescue.


Probably the first thing you will want to do is understand what kinds of queries and actions you can make against the Azure DevOps REST API. These are not limited to reporting on existing configuration; the API can also be used to change it. For example, through the REST API we could POST a call to create a brand new release. For simplicity, in this article we will stick to querying Azure DevOps.

To understand all the potential Azure DevOps queries and actions you can make through the Azure DevOps REST API, refer to the Microsoft Azure DevOps Services REST API Reference. We will return to this reference when we look to make specific calls, but first we will break down the steps you need to take to make a successful call.

Create your PAT token

In order to communicate securely with Azure DevOps, you will first need to create a PAT (Personal Access Token), which allows your code to make authorized calls to the REST API. You can create one by clicking the configuration icon in the Azure DevOps toolbar.


In my specific example I am going to create a PAT with Full access, but it is recommended that you define a Custom scope to limit the security surface area. Also note that you must set an expiration date on the token; once it expires, you will need to either regenerate it or create a new one to meet your requirements.


We can now use this PAT in our REST API calls, but it is important to ensure the string is Base64 encoded.

Assign and encode your PAT token

# Paste your PAT here and treat it like a password
$personalToken = "tiksj25oumfavuzr4316vhpxw2mywzbapxj7sw3x2xet3dml1ygy"

# Base64 encode the PAT, prefixed with a colon as required for Basic authentication
$token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($personalToken)"))
$header = @{authorization = "Basic $token"}

Make your REST API call

From the Microsoft Azure DevOps Services REST API Reference select the REST API call URI that you need to use.

In this first example, the chosen URI queries all existing Azure DevOps Projects in your organisation. The following code invokes the REST API call and iterates through each project returned:

$url = "https://retracement.visualstudio.com/_apis/projects?api-version=5.0"

$output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header

$output.value | ForEach-Object {
    Write-Host $_.name
}

In my case I get the following Projects returned:

Parts Unlimited
ACME Corp
Main Project
My Test Project

Other examples

In the following examples I will perform some common queries against our Azure DevOps project. I will do my best to expand on these with further REST API calls in follow-up posts.

Query all build definitions

In this example we will return the builds sorted in descending order of build id. A little further work is also needed to parse the definition output to improve the quality of the result set, as shown after the output below; I've left a few extra fields commented out for brevity.

# Builds API call
$url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/build/builds?api-version=5.0"
$output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$output.value | Sort-Object id -Descending | ForEach-Object {
    Write-Host $_.buildNumber - $_.status - $_.reason # - $_.definition - $_.url
}

This returns the following builds:

26740 - completed - schedule
20021 - completed - schedule
17436 - completed - schedule
14701 - completed - manual
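As a pointer for that extra parsing work: the definition field is itself a nested object, so (assuming, as the Builds API response suggests, that it exposes a name property) a slightly richer output line might look like this:

$output.value | Sort-Object id -Descending | ForEach-Object {
    # definition is a nested object; drill into its name property
    Write-Host $_.buildNumber - $_.status - $_.reason - $_.definition.name
}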

Query all release pipeline definitions

In this example we will pull back a sorted list of all release pipeline definitions.

# Release Definitions API call
$url = "https://retracement.vsrm.visualstudio.com/ACME%20Corp/_apis/release/definitions?api-version=5.0"
$output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$output.value | Sort-Object name | ForEach-Object {
    Write-Host $_.name
}

And we get all current release definitions:

Big Daddy Release Pipeline
Little Tom Release Pipeline
Widgets Release Pipeline

Query all repositories in a project

# Repositories API call
$url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories?api-version=6.0-preview.1"
$output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$output.value | ForEach-Object {
    Write-Host $_.id,$_.name
}

The following repositories are returned:

ba008565-118a-41e6-878c-d7a8180bf734 Widget Database
49ebc167-8b48-4202-af4e-f8fd885aede1 Widget Notebooks
682c7ebf-11d1-443f-b0b0-fbd7f2bfdd71 ACME dotNet master
112635ba-c5e7-4c91-bae7-ff014cf36be4 ACME Helper

Query a repository's branches

In this next example we will take a repository id and use it in our REST API call to query the repository's branches.

# Repository API call
$repoId = "ba008565-118a-41e6-878c-d7a8180bf734"
$url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories/$repoId/refs?api-version=6.0-preview.1"
$output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$output.value | ForEach-Object {
    Write-Host $repoId $_.name
}

The branches returned for this repo are as follows:

ba008565-118a-41e6-878c-d7a8180bf734 refs/heads/mybrillfeature
ba008565-118a-41e6-878c-d7a8180bf734 refs/heads/development
ba008565-118a-41e6-878c-d7a8180bf734 refs/heads/master

Query all branches for all repositories in a project

We can put the previous two API calls together to query all branches for all repositories.

#branches for each repo
$url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories?api-version=6.0-preview.1"
$repo = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$repo.value | ForEach-Object {
    $repoId = $_.id
    $repoName = $_.name
    $url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories/$repoId/refs?api-version=6.0-preview.1"
    $output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
    $output.value | ForEach-Object {
        Write-Host $repoId - $repoName - $_.name
    }
}

This returns:

ba008565-118a-41e6-878c-d7a8180bf734 - Widget Database - refs/heads/mybrillfeature
ba008565-118a-41e6-878c-d7a8180bf734 - Widget Database - refs/heads/development
ba008565-118a-41e6-878c-d7a8180bf734 - Widget Database - refs/heads/master
49ebc167-8b48-4202-af4e-f8fd885aede1 - Widget Notebooks - refs/heads/development
49ebc167-8b48-4202-af4e-f8fd885aede1 - Widget Notebooks - refs/heads/master
682c7ebf-11d1-443f-b0b0-fbd7f2bfdd71 - ACME dotNet master - refs/heads/development
682c7ebf-11d1-443f-b0b0-fbd7f2bfdd71 - ACME dotNet master - refs/heads/master
112635ba-c5e7-4c91-bae7-ff014cf36be4 - ACME Helper - refs/heads/development
112635ba-c5e7-4c91-bae7-ff014cf36be4 - ACME Helper - refs/heads/master

Tabular query of all branches for all repositories in a project

And finally, an Azure DevOps REST API result set is far more useful as a tabular object, since we can then manipulate it further in PowerShell (should we so wish) and apply various filters and sorts to it. So, extending the previous example, we will load our result set into a table object.

#branches for each repo in a table
$table = New-Object System.Data.DataTable #create table and columns
$table.Columns.Add("Id")
$table.Columns.Add("Repository")
$table.Columns.Add("Branch")

$url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories?api-version=6.0-preview.1"
$repo = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
$repo.value | ForEach-Object {
    $repoId = $_.id
    $repoName = $_.name
    $url = "https://retracement.visualstudio.com/ACME%20Corp/_apis/git/repositories/$repoId/refs?api-version=6.0-preview.1"
    $output = Invoke-RestMethod -Uri $url -Method Get -ContentType "application/json" -Headers $header
    $output.value | ForEach-Object {
        $table.Rows.Add($repoId, $repoName, $_.name)|Out-Null
    }
}
$table | Select-Object Repository, Branch | Sort-Object Repository, Branch | Format-Table

Projecting two columns from the table object and sorting on both returns:

Repository          Branch                                         
----------          ------
ACME dotNet master  refs/heads/development
ACME dotNet master  refs/heads/master
ACME Helper         refs/heads/development
ACME Helper         refs/heads/master
Widget Database     refs/heads/development
Widget Database     refs/heads/master
Widget Database     refs/heads/mybrillfeature
Widget Notebooks    refs/heads/development
Widget Notebooks    refs/heads/master
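Because PowerShell enumerates the DataTable rows for us, filtering is just as easy as sorting. As a quick sketch against the same table, to show only the master branches:

$table | Where-Object { $_.Branch -eq "refs/heads/master" } | Select-Object Repository, Branch | Format-Table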

Conclusion

As you have seen, the Azure DevOps REST API is not only very easy to use and consume through PowerShell (when you know how), but also provides a much more comprehensive interface to Azure DevOps than the other techniques I have previously written about (such as the Azure CLI).

In future posts I will talk about implementing other queries and actions, such as listing all outstanding pull requests across repositories and even how to create a release.
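As a taster for that last one, here is a minimal, untested sketch of what creating a release might look like through the REST API, reusing the $header built earlier; the release definition id of 1 is purely illustrative:

# Sketch only: trigger a release from release definition id 1 (id is illustrative)
$url = "https://retracement.vsrm.visualstudio.com/ACME%20Corp/_apis/release/releases?api-version=5.0"
$body = @{ definitionId = 1 } | ConvertTo-Json
Invoke-RestMethod -Uri $url -Method Post -ContentType "application/json" -Headers $header -Body $body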

I hope you find this post useful. Please leave your comments below!

What are the impersonation rights for my database users?

From time to time I get thrown the odd request to provide various bits of information from the SQL Server environment, and most recently I was asked about a group of databases that required migrating to another instance. For some reason there had been a little bit of a panic about the impersonation rights configured in each of the databases, and the request was *ahem* kindly passed to me. The number of database users in each database was huge, and performing a manual lookup through the SSMS GUI got rather tiresome very quickly (after about the second user).

My reply was that as long as each database was migrated with its associated logins and mapped correctly (to prevent orphaned users), we didn't really need to worry about impersonation, since it is contained within the databases and would be unaffected by the move. Unfortunately there was no getting away from it; they still wanted a report.

I first decided to perform a web search for the term "list impersonate rights", which led me to quite a useful post by Kendal Van Dyke (blog|twitter) called "Hey Mr. DBA, What Permissions Do I Have On This Database?", but I was already familiar with the examples listed, having previously come across the fn_my_permissions and fn_builtin_permissions functions in Books Online. They did what I wanted, but really the wrong way around: to use them I would have to write a cursor-based solution, which would be a little long-winded.
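For reference, those functions answer the question one principal at a time, along these lines (a quick sketch; user1 is illustrative and you need IMPERSONATE rights for the EXECUTE AS):

-- What can the current user do in this database?
SELECT * FROM fn_my_permissions(NULL, 'DATABASE');

-- Repeat for another user by impersonating them first
EXECUTE AS USER = 'user1';
SELECT * FROM fn_my_permissions(NULL, 'DATABASE');
REVERT;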

Thankfully, most SQL Server problems are relatively easy to solve by applying a very small amount of grey matter to them. I decided to turn my attention to the system catalog views and started off with sys.database_principals. Using the Microsoft SQL Server 2008 R2 System Views poster, which is helpfully stuck to my office wall, I focused in on this view, and my attention was drawn to the sys.database_permissions catalog view directly underneath it. Surely this would be the most likely candidate for the IMPERSONATE grants?

Wall posters can come in handy!

After a very quick query of several of the databases provided to me, I noticed that in one of them several rows in the permission_name column contained the entry "IMPERSONATE" with a state_desc of "GRANT". Bingo! That was just what I wanted. Time to write the query…

So that we can test the query, let us first create a new database, several logins, several users from those logins, and grant some of the users impersonation rights over some of the others:-

CREATE DATABASE Impersonation;
GO
USE Impersonation;
GO

CREATE LOGIN login1 WITH PASSWORD = 'Password1';
CREATE LOGIN login2 WITH PASSWORD = 'Password2';
CREATE LOGIN login3 WITH PASSWORD = 'Password3';
CREATE LOGIN login4 WITH PASSWORD = 'Password4';

CREATE USER user1 FROM LOGIN login1;
CREATE USER user2 FROM LOGIN login2;
CREATE USER user3 FROM LOGIN login3;
CREATE USER user4 FROM LOGIN login4;

GRANT IMPERSONATE ON USER::user4 TO user1;
GRANT IMPERSONATE ON USER::user4 TO user2;
GRANT IMPERSONATE ON USER::user1 TO user3;
GRANT IMPERSONATE ON USER::user1 TO user2;

Next we should connect as login1 and change context to our Impersonation database, then attempt to impersonate user4 to test that everything works:-

user4 impersonation works

USE Impersonation
GO
EXECUTE AS USER = 'user4';
SELECT USER_NAME();
REVERT; -- switch back to our own security context

OK, so we know all this works and we know how the rights are assigned. Let's write our query:-

USE Impersonation
GO
SELECT DB_NAME() AS 'database'
	,pe.permission_name
	,pe.state_desc
	,pr.name AS 'grantee'
	,pr2.name AS 'grantor'
FROM sys.database_permissions pe
	JOIN sys.database_principals pr
		ON pe.grantee_principal_id = pr.principal_Id
	JOIN sys.database_principals pr2
		ON pe.grantor_principal_id = pr2.principal_Id
WHERE pe.type = 'IM'

This query produces a rather simple result set, as you can see in the next picture:-


And there we have it: all impersonation rights are listed for the database in question. I was able to go on to query the five databases in question and provide a very simple report, something which would have taken me a very long time using the SSMS GUI.
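If you need the same report across every database on an instance, one quick (if unofficial) route is the undocumented sp_MSforeachdb procedure; a sketch:

-- Sketch: run the impersonation report in every database (sp_MSforeachdb is undocumented)
EXEC sp_MSforeachdb 'USE [?];
SELECT DB_NAME() AS [database], pe.permission_name, pe.state_desc,
	pr.name AS grantee, pr2.name AS grantor
FROM sys.database_permissions pe
	JOIN sys.database_principals pr ON pe.grantee_principal_id = pr.principal_id
	JOIN sys.database_principals pr2 ON pe.grantor_principal_id = pr2.principal_id
WHERE pe.type = ''IM'';'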

Whilst nothing I have demonstrated in this article is particularly difficult, it does highlight the multitude of different ways to perform operations in SQL Server. Just make sure you choose the easiest route!

T-SQL Tuesday #25 – The Best Tool You’ve Never Used

Wow, it’s been a long time since my last T-SQL Tuesday post and I have disappointingly missed it 5 times for one reason or another. Still, the important thing is that I am back people!

This month’s T-SQL Tuesday is brought to you by Allen White (blog|twitter) who many of you will know for his PowerShell, SMO and SQL Server expertise.


Ironically, given the subject of this post, I was saving material up for a larger Data Visualisation article, but when I read the subject of this month's T-SQL Tuesday I just knew it was time to spill the beans.

Let me give you the backstory…

Around four years ago, week in and week out, I used to religiously buy a technology publication just on the off chance that it would have something of use in its pages. Very occasionally a little gem would crop up, and eventually one day I found what I was looking for. This particular little gem used a rather ingenious way to visualise folder and file space usage, and most importantly it was free and fast.

One of the biggest problems I used to experience as a production DBA was disk drives filling up. Whether backup, log, data, tempdb or even the system drive, we would get it all, and having a substantial number of servers made it a common problem.

When these issues occurred I was very happy to receive them, because I knew I could identify the problem area in minutes (if not seconds), and it was always very satisfying to whip out my secret tool (ooo errr missus!).

So why is it so good…

There are of course lots of other space analysis tools on the market, many commercial and many shareware or even free, but most of those I have seen have been slow, used poor visualisation and usually required a local or remote installation. My tool is a simple Windows executable that does not require installation and can live in your network home folder, which can be mapped across to any server you Remote Desktop into, so hey presto, it is always available to you 🙂

One of the biggest usability points of the tool is the ability to zoom into and out of folders very easily and quickly. It is this ability which makes finding problem files and folders incredibly simple.

Imagine the following scenarios :-

  1. Your system drive is close to filling up; you are not sure why, and your Windows Administrator is struggling to locate the problem folder.
  2. You have thousands of databases across many instances on a server, and your:
    1. Data drive is full
    2. Log drive is full
  3. Your backup drive is getting very full, yet backup and archive routines appear to be working.

… and some real-world explanations for those situations that I have uncovered in seconds using this awesome utility :-

  1. A DBA left a long-running Profiler trace on the SQL Server itself. It transpired that even though the trace file can be saved to any drive, Profiler unfortunately generates an ever-growing temporary trace file on the C: drive until the trace is stopped (a recipe for disaster!).
  2. In turn for the data and log drives:
    1. A misbehaving bulk-loading process caused a database data file to grow out of control.
    2. A long-running transaction prevented the log file VLFs from being truncated and caused the log to grow out of control.
  3. DBAs did not follow standards and practices and stored VLDB ad hoc backups in random locations.

Remember, fixing a SQL Server problem is usually pretty easy; the hard part is identifying where the actual problem lies. With the best tool you have never used, even identifying your space issues becomes a breeze.

Ladies and gentlemen, I give you SpaceMonger! Ensure that you download the old v1.4.0 version located on the Free Software tab, since this is the superior release. Now imagine using this on a multi-TB disk and being able to zoom straight into your problem. You will be astounded how fast and easy it is.

The more data you have the easier those problem files are to find!

Automating your snapshots

A little while ago I was talking to someone about options for reporting from SQL Server without impacting the production server, and the very first thing that sprang to mind was for them to use snapshots of a database mirror. They had already considered this option, but it was not really viable because an almost real-time reporting solution was required. It wasn't until later that I wondered whether I had been precise enough in my explanation.

Since creating a snapshot can be pretty much instantaneous, due to the way snapshot mechanics work, there is no reason why snapshotting cannot be the solution for you, as long as "real-time" means the state of the data at one instant. The only problem that then remains is how you are going to manage your snapshots, namely their creation and destruction. Since the overhead of setting up (and then removing) a snapshot is relatively low, it makes perfect sense in my opinion to generate snapshots for reporting and tear them down once each report has been created.

When you create a snapshot you must specify all data file locations, so auto-generating snapshots may at first seem more complicated than it really is. In truth it is not particularly difficult, and I have written some fairly useful code which can create a snapshot for you easily. Obviously you can customise the naming convention of the snapshot database to your requirements. Your reporting solution could then call a procedure to create the report snapshot (using the code below) on your mirrored database, and all that would remain is to tear the snapshot down once the report has completed. I shall leave it to your imagination how you want to tear down your snapshots, since the method might differ slightly depending on your naming convention, but querying the snapshots in existence is very simple:-

SELECT d.name, f.physical_name
FROM sys.databases d
	JOIN sys.master_files f ON d.database_id = f.database_id
WHERE d.source_database_id IS NOT NULL;
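For context, here is what such a procedure automates. Created by hand, a snapshot of a database with a single data file looks something like this (the database, logical file name and path are all illustrative):

-- Create the snapshot (every data file must be listed; just one here for brevity)
CREATE DATABASE MyDatabase_ss
ON (NAME = MyDatabase_data, FILENAME = 'D:\Snapshots\MyDatabase_data.ss')
AS SNAPSHOT OF MyDatabase;

-- Tear it down once the report has completed
DROP DATABASE MyDatabase_ss;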

The procedure does, however, have an output parameter that returns the created snapshot name, and you can use it in your reporting T-SQL code to tear the snapshot down post report generation. It's up to you, but it is there if you want it!

We then move on to the main code itself to auto-generate your database mirror snapshot. The first thing you need to do is decide which database you are going to deploy this stored procedure to; I personally prefer a dedicated database administration database housing all the custom-written procedures, functions and the like. Once you have changed context into the database of your choice, you then need to execute the usp_createsnapshot creation script, which can be found here, to deploy it.

Once this procedure is deployed, creating a quick snapshot of any database or mirror on the server is as simple as calling the procedure and passing in the name of the database (or mirrored database), and hey presto, a database snapshot has been created for you 🙂
Below are a few examples of how the procedure works, including its error handling and feedback. The final example is valid input.

EXEC dbo.usp_createsnapshot @help=1
GO
EXEC dbo.usp_createsnapshot master
GO
EXEC dbo.usp_createsnapshot invaliddbname
GO
DECLARE @ss sysname
EXEC dbo.usp_createsnapshot msdb,@ss OUT
PRINT 'Snapshot ' + @ss + ' created.'

The results are shown in the following screenshot; in particular, the output within the red box demonstrates the correct use of the stored procedure.
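And to close the loop, here is a hedged sketch of using that output parameter to tear the snapshot down once your report has finished (same usp_createsnapshot as deployed above):

DECLARE @ss sysname
EXEC dbo.usp_createsnapshot msdb, @ss OUT
-- ... run your report against the snapshot here ...
-- Illustrative teardown: drop the snapshot by the name returned in @ss
DECLARE @sql nvarchar(max) = N'DROP DATABASE ' + QUOTENAME(@ss) + N';'
EXEC sys.sp_executesql @sql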

Well that’s the end of this post ladies and gentlemen. I hope you find this stored procedure as useful as I do, and remember ….Happy Reporting!