Category Archives: SQLServerPedia Syndication

Using Azure CLI to query Azure DevOps

In previous posts, I have touched upon the use of Azure Cloud Shell for generic querying of Azure resources, and I thought it would be useful to quickly document its use for something a little more specific, such as querying or manipulating Azure DevOps through the command line.

For my example, I will focus on something as mundane and straightforward as querying the Azure DevOps repository metadata (so that I can look at and compare branch settings against each other), but I hope you get the idea that this is just scratching the surface, and that the Azure CLI is a powerful tool to add to your arsenal of scripting languages.

The whole end-to-end process required to query Azure DevOps is a relatively straightforward affair, especially when you know exactly what you are doing (isn’t everything!), but before we get there, you will first need access to the Azure CLI. You have two ways of using it: the first is to install it locally, and instructions to do this can be found in an earlier post titled “AzureRM, Azure CLI and the PowerShell Az Module“. Alternatively, you may use the Azure CLI through Azure Cloud Shell (i.e. directly from Azure) as detailed in another of my posts titled “Introduction to Azure Cloud Shell“.

Configure az devops pre-requisites

Once you are up and running with the Azure CLI and have access to its az command, there are a few pre-requisites needed before you can query Azure DevOps directly. These are detailed as follows:
1. You must ensure that you are running Azure CLI version 2.0.49 or higher. You can check this by simply running the following command:
az --version
2. Your Azure CLI must have the azure-devops extension added to it. To check if this is already available run the following command to list your extensions:
az extension list
If the extension is not listed you can add it as follows:
az extension add --name azure-devops
For further information on this extension, you can view the Microsoft documentation titled “Use extensions with Azure CLI“.
3. Your az session must be signed in to your Azure tenant, and to do this use the az login command and provide the relevant credentials:
az login
4. Finally, to avoid having to provide a project context every time you run an az devops command, you should set a default context as follows (obviously use your own organization and project):
az devops configure --defaults organization= project="ACME Corp"

You are now ready to go!

Querying DevOps through Azure CLI

In order to find out all the commands now made available to you with your new extension, you can execute the following command:
az devops -h

By doing so, you will note that the extension provides devops subgroup commands, such as teams - for example, to list your current DevOps teams:
az devops team list

As the help context shows, the extension also provides “related groups” (such as repos) to manage other facets of Azure DevOps. In our specific example, we want to query all available repos for our Azure DevOps project. We can do this as follows:
az repos list
Notice that your results come back in JSON format by default. We can override this and return results in tabular format by using the output parameter:
az repos list --output table
The Azure CLI also provides a query option so that you can provide a JMESPath query string to filter your results. For instance, in the most basic scenario we can return the first element from our results (using zero-based index notation):
az repos list --query [0]

That is clearly not so useful, so instead I want to return specific properties from all repos. In this case, I want each repo's name, Azure repo URL, and the default branch that is set:
az repos list --query [].[name,webUrl,defaultBranch]

In our final example we will return the results in a tabular format and alias our property names (for our column headings):
az repos list --query "[].{Name:name, Url:webUrl, DefaultBranch:defaultBranch}" --output table
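If JMESPath feels opaque, it can help to see the same transformation written out longhand. The following Python sketch (the repo data here is invented purely for illustration) applies the equivalent of the projection above to a parsed JSON result:

```python
import json

# Invented sample of what `az repos list` might return (heavily trimmed).
raw = json.loads("""
[
  {"name": "acme-api", "webUrl": "https://dev.azure.com/acme/_git/acme-api", "defaultBranch": "refs/heads/main"},
  {"name": "acme-web", "webUrl": "https://dev.azure.com/acme/_git/acme-web", "defaultBranch": "refs/heads/develop"}
]
""")

# Equivalent of: --query "[].{Name:name, Url:webUrl, DefaultBranch:defaultBranch}"
projected = [
    {"Name": r["name"], "Url": r["webUrl"], "DefaultBranch": r["defaultBranch"]}
    for r in raw
]

# Equivalent of: --query [0]  (hence the zero-based index notation)
first = projected[0]
print(first["Name"], first["DefaultBranch"])
```

In the same terms, the earlier `--query [0]` example is simply `projected[0]`, which is why the JMESPath index is zero-based.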


Being able to programmatically query Azure DevOps through the Azure CLI is incredibly useful and powerful. It could help you keep your environment standardized (for example, ensuring branch policies across repos are identical) or even provide a method to easily track change. Obviously we are not just restricted to the Azure DevOps repos; we can look at all facets of the environment. For example, to list all current builds in a project we can issue the following command:
az pipelines build list -o table

As a final point of note, I confess to finding JMESPath far less intuitive and simple for querying and filtering results than other languages (especially given the semi-structured nature of the data you are filtering), but with a little bit of trial and error, you can eventually get there!

I hope you find my post useful and please feel free to provide feedback in the comments.


Encouraging diversity at technical events – volunteers and organisers

In the third and final post of this three-part series, I will be exploring a few thoughts and ideas regarding how we can encourage diversity of volunteers and organisers at technical events. I have been organising events as far back as 2010 so I have (no doubt) been responsible for many failures (and hopefully a few successes) in promoting and hosting events to the community.

Before we start talking about diversity, I think it is first important to define what I mean by this term. According to the Merriam-Webster definition, diversity is “the condition of having or being composed of differing elements” and especially “the inclusion of different types of people (such as people of different races or cultures) in a group or organization”. I don’t think this extends far enough, and I personally see diversity as the inclusion of all people regardless of their gender, sexual orientation, ability, physical appearance, age, size, race, or religion. Our aim, therefore, should be to provide a harassment-free conference experience that everyone can enjoy in equal measure.

The following is not a comprehensive list, but I hope it will provide some food for thought when you start planning your next event.

Understand why you are doing this

It is important that you understand why, and with whom, you are going to supplement your organiser and volunteer team. You are not trying to fill quotas here, but instead trying to build a diverse team that truly reflects all views, ideas, outlooks, and abilities. This is a good thing because it will help you implement an event that will appeal to a diverse audience (and we have already discussed why that can be a good thing). So remember, it is not quotas you are filling, but representation.

Look towards encouraging volunteers from other events and groups

It perhaps should go without saying by now that you should reach out into other communities for help with your event. These people will be more attuned to meeting their own communities' needs and wants, and will also be used to coping with the demands of pulling off a successful event.

Give people ownership of their responsibilities

If you are a control freak (like me), you will struggle with the idea of handing over power of any sort and letting your fellow organisers and volunteers run independently of you. That kind of command structure is not conducive to team members growing into their roles and is more likely to fail. No one wants to be bossed around at an event, and if they are, you are probably not going to see them wanting to join your event the next time. Most people will live up to their responsibilities if they have control, so it is important that you give them this and offer your support where they need it.

Understand a person's strengths and appoint accordingly

Do not assume that all of your volunteers will have exactly the same abilities (or capabilities). One person might be able to stand on their feet for a large part of the day and run between rooms, but someone else might have a medical condition that would make these kinds of duties a struggle.

It is important that you communicate with all of your volunteers and understand how they can most effectively participate with volunteering at your event. Assign them roles that they will not only be able to perform, but also ones that they will enjoy.

Get plenty of volunteers onboard

Not everyone can easily deal with the pressure that running an event entails. It is important that you do not put unnecessary and extreme pressure on your team members to deliver or that exposes them to too many problems. Ensure that you bring onboard plenty of volunteers so that one person is not a single point of failure to the event activities, and that everyone has plenty of support.


It is incredibly important that you try to bring onboard lots of diverse representation onto your team so that you can run a more informed and efficient event schedule that meets the demands of your audience. This representation will also act as a great way to encourage further involvement and diversity in future years and ultimately help you achieve your goals (where you alone might fail) of encouraging diversity at technical events.

I hope you have enjoyed listening to my ramblings on this topic and I am very happy to listen to your thoughts and ideas too!

Encouraging diversity at technical events – speakers

In the second of this three-part series, I will be exploring a few thoughts and ideas regarding how we can encourage diversity of speakers at technical events. I have been organising events as far back as 2010 so I have (no doubt) been responsible for many failures (and hopefully a few successes) in promoting and hosting events to the community.

Before we start talking about diversity, I think it is first important to define what I mean by this term. According to the Merriam-Webster definition, diversity is “the condition of having or being composed of differing elements” and especially “the inclusion of different types of people (such as people of different races or cultures) in a group or organization”. I don’t think this extends far enough, and I personally see diversity as the inclusion of all people regardless of their gender, sexual orientation, ability, physical appearance, age, size, race, or religion. Our aim, therefore, should be to provide a harassment-free conference experience that everyone can enjoy in equal measure.

The following is not a comprehensive list, but I hope it will provide some food for thought when you start planning your next event.

Promote your Call For Papers through other events and groups

In a similar way to how you would encourage attendees to your event, it is important that you also reach out to other community events and user groups to announce your Call For Papers. There are lots of groups like Girls Who Code and PASS Women In IT who would be a great starting point, but you should also think about contacting non-technical groups and associations. This will give your event much more reach into communities beyond your own, bridging more than just the gender balance, and will help make your speaking roster incredibly diverse, interesting, and appealing.

Seek non-technical Sessions

If you are seeking speakers from non-technical communities, it probably should go without saying that the vast majority of the potential speakers in them would be non-technical. You should be open to (and actively seek) non-technical sessions on subjects that might be of interest to a technical audience. Topics such as embracing diversity in IT or designing a workplace for mobility could be a perfect fit for your event and will appeal to a more managerial level of audience – these are the very same people that your sponsors will be keen on (those with the power to sign off purchases of products!).

Approach Speakers Directly

As an event organiser, there is always a niggling voice in the back of your mind saying “if someone cannot be bothered to submit a session then they don’t deserve to speak”, but the reality is that many people do not submit to your event for a million and one reasons. It might be a confidence thing, or it might even be a fear that your event won’t be welcoming to them. This mindset will not help encourage diversity of submissions, so you should make the effort to directly approach those individuals who would traditionally not submit to your event. Remember that you are not selecting someone because they simply “fill a quota” but because they bring something extra to your event. It is a bit like a music festival organiser signing some great bands for their schedule - you are almost certainly going to want the populist bands that tour everywhere, but you’ll also want to seek out those exciting new bands so that people can say “I saw them first at your festival!”.

Do not be afraid of setting diversity targets

A persistent concern that I wrestle with is the fear of positively discriminating against someone in order to “fill a quota”. It is not good for anyone if your speakers are not selected on their own merits (and they should be), but you should also not fear aiming towards (and reviewing) your diversity targets year on year. Remember that the whole point of aiming for a good diversity balance today is to encourage others tomorrow and remove the fear of “I can’t do this” or “I shouldn’t be here”. Until your event gets an equal balance of submissions from all-comers and that becomes the norm (which is probably not going to happen any time soon), you cannot truly say that selection was “fair”, and you should work towards making it so.

Blind selection

Blind selection of sessions can be a good way to remove unconscious bias from the selection process, but it can also cause you to miss striking a good diversity balance if (as we talked about earlier) you do not have equal representation among submissions. If you perform blind selection at your event, you should not be afraid to review how that may have skewed your diversity targets - and not be afraid to address them, for the reason given above.

Fast Call For Papers and announce your schedule early

Speakers with a young family are going to have more difficulty speaking at your event if you do not give them enough time to make arrangements for childcare or other such considerations. The earlier you are able to give a speaker notification of selection, the earlier they can purchase flights and accommodation at much cheaper prices. This is especially important for a speaker who does not have a large disposable income, who has dependents to consider, or who is travelling from another country. I generally find that a minimum of 3 months’ notice should be given to minimize the expense to the speaker and make it more likely that they will be able to attend.

Provide Creche facilities

At the last conference I organised, I tried very hard to provide creche facilities, but sadly our venue did not allow children under 16 on-premises. It was my belief that if we could provide a temporary place for children to be looked after during the event, speakers with children would be more likely to attend, dropping their child in the creche whilst they spoke. Obviously a similar argument could be made for having a creche to encourage attendees with children, though I think it is less likely for a parent to bring their child or children to a conference and want to drop them off in the creche all day.

Avoid unisex speaker shirts

Unisex speaker shirts sound like a good idea at the time since (as an event organizer) you would only be dealing with ordering various shirt sizes rather than worrying about cut, but from personal experience, I have heard too many complaints from female speakers (to ignore) that these unisex-style shirts do not fit women very well. In practice, the overhead of ordering a different style of shirt is insignificant, though one problem you might run into is that commercial clothing manufacturers often only supply certain items of clothing in a unisex style. Do your best to go for alternative items of clothing where possible as your official speaker shirt; I know that this is very much appreciated and demonstrates to all speakers that you really care.

Be flexible with timeslots

Speakers with children, medical conditions, or other considerations may have to return home as soon as possible once their speaking engagement has finished. It is important that organizers try to be as flexible and accommodating as possible when drawing up an event agenda and (occasionally) be able to juggle timeslots on the day due to short notice change of circumstances. Liaising closely with your speakers about speaking arrangements will give them confidence in your ability to accommodate any known or unforeseen problems and make them more likely to be able to deliver their session.

Provide a private area for speakers

It is important that speakers have a quiet place to relax, hang out, and get away from everything. I know that many events have started to “do away” with the speaker room to encourage interaction between speakers and attendees, but some speakers might feel socially awkward and uncomfortable doing this. Remember everyone is different. You are trying to encourage people from diverse backgrounds and identities to speak at your event, so you should try and provide what they need to let them be able to be themselves.


Having a diverse speaker line up at your event is a very powerful way of encouraging new and existing talent from other communities to speak at future events as well as promoting your event further to a larger potential audience. In my opinion, it is going to take a lot of time and effort to get to a point where we won’t have to proactively need to go out and look for speakers from other communities to present at our events, but the more effort we make today, the more likely it will become the norm in the future.

Encouraging diversity at technical events – attendees

In the first of this three-part series, I will be exploring a few thoughts and ideas regarding how we can encourage diversity of attendees at technical events. I have been organising events as far back as 2010 so I have (no doubt) been responsible for many failures (and hopefully a few successes) in promoting and hosting events to the community.

Before we start talking about diversity, I think it is first important to define what I mean by this term. According to the Merriam-Webster definition, diversity is “the condition of having or being composed of differing elements” and especially “the inclusion of different types of people (such as people of different races or cultures) in a group or organization”. I don’t think this extends far enough, and I personally see diversity as the inclusion of all people regardless of their gender, sexual orientation, ability, physical appearance, age, size, race, or religion. Our aim, therefore, should be to provide a harassment-free conference experience that everyone can enjoy in equal measure.

The following is not a comprehensive list, but I hope it will provide some food for thought when you start planning your next event.

Implement a Code of Conduct

One of the very first things we can do for our event is to implement a code of conduct for all attendees, speakers, sponsors, volunteers, and organizers to adhere to. This will act as a necessary framework to refer back to. I personally like the ability to amend and change the Code of Conduct where it becomes obvious over time that you have made a mistake or omission – but you should be careful not to change policy on a whim, get team approval, and document what and when a revision was made.

The Code of Conduct should be well publicized and kept as simple as possible to ensure that it is understood and followed by everyone at your event.

Have contact details available for reporting problems, and make everyone aware during the event of who they can speak to about any possible issues.

Ensure your event space is accessible and has adequate parking and drop-off areas

I once held a conference at an event space that had stairs and few lifts. After being contacted by a wheelchair-using attendee in advance of an upcoming event, it became clear that we would have to assign a member of the volunteer staff to assist the attendee in moving from session to session. He was also arriving by train and getting a taxi to and from the event, so we also needed to arrange special dispensation for the taxi to enter and leave the event grounds, since we were not allowed to use the venue's private parking or access, and the nearest public car park was a 15-minute walk away.

Clearly an event’s accessibility is going to affect who is going to want (or be able) to attend your event and it might not be obvious what someone’s ability might be.

You should look to make accessibility both in and around the event, and to and from it, as painless as possible, but it is also critical that you provide a point of contact for inquiries where extra help might be needed.

Establish session delivery rules

Firstly, the event Code of Conduct should already make it clear that speakers must refrain from any inflammatory, derogatory, sexual, racist, or offensive comments or actions.

Furthermore, all speakers should adhere to a common framework of best practices for session delivery to ensure that a consistent and optimal experience is enjoyed by all. This is usually better being explicitly defined by an event.
Speakers can help people with hearing impairments by delivering their session in a calm and clear manner, trying their best not to mumble or rush (regardless of time). For people with visual impairments, slide decks should use large enough fonts and graphics, and should avoid colour combinations that are difficult for those with colour blindness to distinguish. It should go without saying that all demos should be equally large enough for all to see.

Speakers should be mindful of attendees with mobility disabilities needing to move between rooms. They should aim to start their session on time (or just after) and wrap up well within the designated end time to give attendees as much time as possible to get to another room.

Organizers should ensure that there is plenty of time in between sessions and that session rooms have plenty of access between the chairs. Why not even go one step further and leave a front-row clear for wheelchairs? Why not leave a generous amount of space between chairs to make everyone feel a little more relaxed and comfortable during the presentation?

Publicise your event effectively

It can be very difficult to attract diversity at your event since people from different backgrounds often might not move in the same social circles as yourself. You should be mindful of this and think outside the box. Attempt to contact group leaders in these other communities and ask them to promote your event to their attendees. Perhaps you can even offer a financial incentive for every successful referral they manage to make? Every attendee has a small financial value to an event, and sponsors like to see new faces, so why not pass on a small amount of your sponsorship in this way? Communities supporting communities is really why we run these events.

Whatever you do to spread the message far and wide to potential future attendees, the most important thing you need to do is promote your positive event message of diversity, ensuring that all information is available (or easily accessible) from the front page of your event. You might only get one shot to attract new faces who traditionally might avoid these kinds of events, so this is your opportunity to sell it to them.

Set up social and quiet spaces

Not everyone likes talking to complete strangers. For some people this comes easy, but for many (myself included) event break times are usually not something I particularly enjoy. Try to do your best to set up various activities in your social areas to make it easier for people to interact and encourage dialogue and participation (for instance, at some of my past events we have set up large garden games, such as Giant Connect 4 and Giant Jenga, in the social spaces). Also aim to provide several quiet areas where people can go and sit down, be by themselves, and relax. Do not forget that some people might have religious or personal activities that they need to do during break times - perhaps they need to pray or take medication. At all of my recent events, I have made sure we have had at least one quiet room and one prayer room available at all times.

Provide accurate name badges

Name badges are a rather personal thing. Not everyone wants their Twitter handle emblazoned across their chest, nor might they want any assumptions as to their gender, name, or other such nomenclature printed there. You should, therefore, give this some thought and consider providing a display-name field on the event registration, making sure that the attendee is happy with anything else you might want to print about them.

Respect opt-outs

Probably one of the most frustrating aspects of attending an event is getting spammed. Whilst this is annoying to many, to those from diverse backgrounds, the thought of having their personal private details distributed to complete strangers is probably a step too far. If someone states that they want to opt out of sponsor communications, then ensure their details are not passed on to anyone else. Clearly GDPR means you should be handling other people's data with care already!


Having a diverse attendance at your event is not only great to share different ideas from many different viewpoints, but can also be attractive to sponsors of an event who are always wanting to cross-pollinate and promote into different communities. But reaching out to these diverse communities and attracting them to your event can be very difficult. Over time, word of mouth can help to promote your event far and wide, but you first need to lay the groundwork so that your event is a safe welcoming place to be and provides for people with diverse needs and requirements. Remember that attendee diversity can also be encouraged through attendees seeing that there is also diversity in your event organizer/ volunteer teams and speakers chosen for your event. In the last two parts of this series we will explore these things.

Azure subscription is not registered to use Cosmos DB namespace

It is usually the simplest things that often leave me feeling like I am a complete dummy. One such issue I ran into fairly recently when trying to deploy Cosmos DB into an Azure subscription.

From Azure DevOps I received the following error:

2019-05-31T16:28:55.4288261Z ##[error]MissingSubscriptionRegistration : The subscription is not registered to use namespace ‘Microsoft.DocumentDB’. See for how to register subscriptions.
2019-05-31T16:28:55.5005526Z ##[section]Finishing: Deploy Cosmos Account and Database with Shared Capacity

I initially assumed that the error was Azure DevOps related so I attempted to deploy using PowerShell and ran into an almost identical error.

I had deployed this Cosmos DB template successfully many times in our other subscriptions and could not understand why a simple deployment to an alternative subscription would fail. Looking back at the error message I followed the link provided which took me to a Microsoft doc titled Troubleshoot common Azure deployment errors with Azure Resource Manager and linked within I ended up on Resolve errors for resource provider registration.

It turns out that Azure resource providers, which you can almost think of as class libraries, can (like their programmatic counterparts) be in either a Registered or NotRegistered state. When a provider is in the NotRegistered state, we are unable to call it to create its specific resources (such as Cosmos DB in my case).

We can use PowerShell Az or the Azure CLI (both talked about elsewhere in this blog) to report which resource providers are available. In this specific example, I am going to return all providers that match the wildcard pattern of Microsoft.D*. The code searches for and sets the relevant subscription using Azure Cloud Shell (for simplicity's sake), but you can do this through a remote Azure CLI or PowerShell Az session if you would prefer (and connectivity allows).

$subscription = Get-AzSubscription | Where-Object Name -Match "MySubscription1*"
Select-AzSubscription $subscription
$providers = Get-AzResourceProvider -ListAvailable | Select-Object ProviderNamespace, RegistrationState
$providers | Where-Object ProviderNamespace -Match "Microsoft.D*"

We get the following results:

ProviderNamespace               RegistrationState
-----------------               -----------------
Microsoft.DBforPostgreSQL       Registered
Microsoft.DevTestLab            Registered
Microsoft.Databricks            Registered
Microsoft.DataLakeStore         Registered
Microsoft.DataLakeAnalytics     Registered
Microsoft.DBforMySQL            Registered
Microsoft.DevSpaces             Registered
Microsoft.Devices               Registered
Microsoft.DataFactory           Registered
Microsoft.DataBox               NotRegistered
Microsoft.DataBoxEdge           NotRegistered
Microsoft.DataCatalog           NotRegistered
Microsoft.DataMigration         NotRegistered
Microsoft.DataShare             NotRegistered
Microsoft.DBforMariaDB          NotRegistered
Microsoft.DeploymentManager     NotRegistered
Microsoft.DesktopVirtualization NotRegistered
Microsoft.DevOps                NotRegistered
Microsoft.DigitalTwins          NotRegistered
Microsoft.DocumentDB            NotRegistered
Microsoft.DomainRegistration    NotRegistered

Notice that the Microsoft.DocumentDB namespace is NotRegistered. If you are wondering, DocumentDB was the precursor name of the Cosmos DB SQL API (before Cosmos DB supported multiple APIs). Like many other Microsoft products, early names tend to stick with the product :).
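As an aside, the same filter is easy to reproduce outside PowerShell. This Python sketch works against the kind of JSON that `az provider list` emits (the provider sample below is invented and heavily trimmed) and picks out the unregistered Microsoft.D* namespaces:

```python
import json

# Invented sample shaped like `az provider list --output json` output (trimmed).
providers = json.loads("""
[
  {"namespace": "Microsoft.DocumentDB",   "registrationState": "NotRegistered"},
  {"namespace": "Microsoft.DataFactory",  "registrationState": "Registered"},
  {"namespace": "Microsoft.DBforMariaDB", "registrationState": "NotRegistered"},
  {"namespace": "Microsoft.Compute",      "registrationState": "Registered"}
]
""")

# Equivalent of the PowerShell filter: match the Microsoft.D* namespaces,
# then keep only the ones that are not yet registered.
unregistered = sorted(
    p["namespace"]
    for p in providers
    if p["namespace"].startswith("Microsoft.D")
    and p["registrationState"] == "NotRegistered"
)
print(unregistered)
```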

To register this namespace we can simply run the following line of code against the subscription using the Register-AzResourceProvider cmdlet.

Register-AzResourceProvider -ProviderNamespace Microsoft.DocumentDB

The following output is returned:

ProviderNamespace : Microsoft.DocumentDB
RegistrationState : Registering
ResourceTypes     : {databaseAccounts, databaseAccountNames, operations, operationResults…}
Locations         : {Australia Central, Australia East, Australia Southeast, Canada Central…}

If it is not obvious, you would unregister a provider namespace (if you wanted to make it unavailable) using the Unregister-AzResourceProvider cmdlet as follows:

Unregister-AzResourceProvider -ProviderNamespace Microsoft.DocumentDB

Once I had registered the Microsoft.DocumentDB namespace, I was able to deploy my Cosmos DB template into my subscription without error!


Depending upon your subscription and region, your enabled provider namespaces may vary; however, in my case someone had explicitly unregistered Microsoft.DocumentDB from the subscription. You might ask why someone would do that? Well, it is a good way to prevent deployments of certain resource types if they go against your company policy.

As you can see, if you run into a similar problem or want to start using resource types that are by default NotRegistered you can register and start using them incredibly easily.

Introduction to Azure Cloud Shell

In my last couple of posts, I have described the remote management of Azure through the command line from what was essentially a fat (or thick) client. This gives you an awful lot of scripting and automation control over the platform by using either the Azure CLI or PowerShell through the PowerShell Az module. This is perfect for most cases, but what if you are using an unsupported operating system, or you only have access to the browser (perhaps via a thin client)?

Thankfully a solution to this problem already exists and the good folks at Microsoft have made it easy for you to have full scripting ability over your Azure subscription through your web browser of choice! Enter Azure Cloud Shell…

Accessing Azure Cloud Shell

There are two ways to access Azure Cloud Shell, the first being directly through the Azure Portal itself. Once authenticated, look to the top right of the Portal and you should see a grouping of icons, in particular one that looks very much like a DOS prompt (have no fear, DOS is nowhere to be seen).

The second method to access Azure Cloud Shell is to jump directly to it via its dedicated URL, which will require you to authenticate to your subscription before launching. There is an ever so slight difference between the two methods: accessing the Shell via the Azure Portal will not require you to specify your Azure directory context (assuming you have several), since the Portal will have already defaulted to one, whereas with the direct URL method that obviously doesn't happen.


Select your Azure directory context (when launching via the direct URL)

For both methods of access, you will need to select the command line environment to use for your Cloud Shell session (your choice is Bash or PowerShell) and the one you choose will partially depend upon your environment of preference.


I will explain the difference later but, for now, I am going to select the Bash environment.

Configuring Azure Cloud Shell storage

When using Azure Cloud Shell for the first time you will need to assign (or create) a small piece of Azure storage that it will use.  Unfortunately, this will incur a very small monthly cost on your subscription.

ACS storage

The storage is used to persist state across your Cloud Shell sessions.  To get a little more visibility about what is going on I am going to click Show advanced settings:


It is slightly disappointing that, at the time of writing, there are only 7 available Cloud Shell storage regions, which means that your Shell storage might not be able to live in the same region as your other resources (depending upon where they are).


Would it really matter that your Cloud Shell storage might live in a different region? I think it is very unlikely that you will consume much egress data into your Shell region, since Cloud Shell is intended for management rather than data staging, but I suppose it is something to bear in mind when you are scripting.

In my specific case (as you will see above) I decided to use an existing resource group named RG_CoreInfrastructure and, within it, create a new storage account named sqlcloudcloudshell [sic] in the North Europe region, containing a new file share named cloudShellfs.

I don't really like this dialog box, since it is not very intuitive and allows you to submit incorrectly named or existing resources, both leading to creation failure; I'd rather these were caught and reported at input time. For the record, the general rules are that the storage account name must use only lowercase letters and numbers, must begin with a letter or number, must be unique across Azure, and must be between 3 and 24 characters long (phew!). The file share name can only contain lowercase letters, numbers, and hyphens, must begin with a letter or number, and cannot contain two consecutive hyphens. It is a little frustrating, but you will get there in the end with a bit of trial and error!
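You can save yourself some of that trial and error by pre-checking names from a shell before fighting the dialog. A rough sketch follows; the helper names are my own, and I only encode the rules quoted above (Azure enforces more):

```shell
# Storage account: lowercase letters and numbers only, 3-24 characters.
valid_storage_account() {
  printf '%s\n' "$1" | grep -Eq '^[a-z0-9]{3,24}$'
}

# File share: lowercase letters, numbers and hyphens, starting with a
# letter or number, and never two consecutive hyphens.
valid_file_share() {
  case "$1" in *--*) return 1 ;; esac
  printf '%s\n' "$1" | grep -Eq '^[a-z0-9][a-z0-9-]*$'
}
```

For example, `valid_storage_account sqlcloudcloudshell` succeeds, while `valid_file_share bad--name` fails on the consecutive-hyphen rule.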

Whilst it is possible to pre-stage all of this upfront and select an existing storage account (assuming it was in one of the 7 Cloud Shell regions), I was particularly interested in what Azure was going to provision, being mindful of not choosing storage more expensive than it needed to be. As it turns out, Azure created my storage as follows:

Performance/Access tier: Standard/Hot
Replication: Locally-redundant storage (LRS)
Account kind: StorageV2 (general purpose v2)

Also of note: creation tagged this storage resource with the name-value pair ms-resource-usage/azure-cloud-shell and set a file storage quota of 6 GiB.

Running Azure Cloud Shell

Once setup has completed, the Azure Cloud Shell will launch as follows:


If you look at the menu bar at the top of your Azure Cloud Shell window, you will note that there is a dropdown currently set to Bash, which was the type of shell session chosen earlier. If we change this to PowerShell, it will reconnect back into the Cloud Shell container, this time using PowerShell as your entry point of choice.


Within a few seconds, you will now have entered into a PowerShell session.


Bash or PowerShell?

As you may remember, when we first launched Azure Cloud Shell we had to select Bash or PowerShell as our environment of choice. The question is, which should you choose?

The real answer is that it doesn't really matter and is simply down to preference, especially since you can easily switch between the two. However, I would probably argue (especially for Linux fanboys like myself) that Bash (and therefore the Azure CLI via Cloud Shell) is easier and more intuitive, and you can always enter a PowerShell Core for Linux session from Bash using the pwsh command (see also my previous post) if you want.

Whichever way you enter a PowerShell Cloud Shell session, the Az module cmdlets are directly available to you with no further configuration of your environment. I have found that PowerShell seems to suit more heavily scripted Azure deployments, since you can very easily use its object-oriented potential, such as assigning a resource object to a variable and accessing its properties, methods, and collections programmatically through the object.

Basically, have a play and see what works for you.

Pre-Installed Software

The Azure Cloud Shell is thankfully also deployed with many pre-installed tools and runtimes. This is possibly a subject for another day, but just know that the following are available for your use, all within your browser session (!!!):

Development Runtimes available include PowerShell, .NET Core, Python, Java, Node.js, and Go.

Editing tools installed include code (Visual Studio Code), vim, nano, and emacs.

Other tools you can use are git, maven, make, npm, and many more.

I fully expect these lists to keep growing over time.


I hope you have found this post useful and, if you haven't already done so, please start testing Azure Cloud Shell now! It is the perfect place to quickly configure Azure using the Azure CLI or the Azure PowerShell module (through scripting or simple commands) from your browser, without needing to install those runtimes on your local machine.


AzureRM, Azure CLI and the PowerShell Az Module

There is now a variety of Microsoft-provided command-line tools available to connect to (and manage) Azure resources, and the situation can be quite confusing to newcomers or to those who have not kept up to date with new developments. This post is designed to rectify that.

It is probably also worth me mentioning that I am focusing on the usage and deployment of these tools with Linux in mind, however, the following explanation is also applicable to Windows and macOS environments.

PowerShell Core (on Linux)

If you are running a Linux machine, to use AzureRM or the new PowerShell Az module you will first need to install PowerShell Core for Linux. Installation is fairly easy to perform, and you can follow this post using the relevant section for your particular distribution and release. In my specific case I am running Linux Mint 18.1 Serena, so to find the Ubuntu version it is based upon I first run the following:

more /etc/os-release

This returns:

NAME="Linux Mint"
VERSION="18.1 (Serena)"
PRETTY_NAME="Linux Mint 18.1"
UBUNTU_CODENAME=xenial

As you can see, the UBUNTU_CODENAME is xenial. So if I visit the Ubuntu list of releases page, I can see that xenial equates to version 16.04.x LTS. This means that (for me at least) I can follow the PowerShell on Linux installation section titled Ubuntu 16.04 (instructions provided below simply as an example):

# Download the Microsoft repository GPG keys
wget -q
# Register the Microsoft repository GPG keys
sudo dpkg -i packages-microsoft-prod.deb
# Update the list of products
sudo apt-get update
# Install PowerShell
sudo apt-get install -y powershell
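The distro lookup and a post-install sanity check can both be scripted. A minimal sketch follows; the helper function names are my own:

```shell
# Derive the Ubuntu base codename from an os-release style file,
# stripping any surrounding quotes from the value.
get_ubuntu_codename() {
  [ -r "$1" ] || return 1
  sed -n 's/^UBUNTU_CODENAME=//p' "$1" | tr -d '"'
}

# Is a given command available on the PATH?
have_cmd() { command -v "$1" >/dev/null 2>&1; }

codename=$(get_ubuntu_codename /etc/os-release)
echo "Ubuntu base codename: ${codename:-unknown}"

if have_cmd pwsh; then
  # confirm the apt-get steps above put pwsh on the PATH
  pwsh -NoLogo -Command '$PSVersionTable.PSVersion'
else
  echo "pwsh not found on PATH - check the installation steps above"
fi
```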

To enter a PowerShell session on Linux, you simply type the following within a bash prompt (on Windows you would instead use the powershell executable):

pwsh

AzureRM module (aka Azure PowerShell)

Before you read further, you should understand that AzureRM is deprecated, and if you are currently using it to manage Azure resources, then you should seriously consider migrating to one of the other options described in the next sections (and removing AzureRM from your system).

I first came across the AzureRM PowerShell module many years ago when I wanted to manage Azure resources from my Windows laptop through PowerShell. At the time, this was the only way of doing so from the command line, and the functionality of AzureRM was provided (on Windows at least) by a downloadable installer that made the module available to PowerShell. You can check out the latest version and instructions for AzureRM by visiting this link but, as mentioned earlier, you should avoid it and instead use the Azure CLI or the PowerShell Az module as described in the next sections. The last time I tried, attempts to install AzureRM via PowerShell Core on Linux failed with warning messages pointing to the PowerShell Az module, so you are forced onto the newer options anyway.

Upon import into your PowerShell environment, the AzureRM module provided up to 134 cmdlets (in version 2), allowing you to manage all your Azure subscription resources via PowerShell.

Azure CLI

The Azure CLI was intended as the de facto cross-platform command-line tool for managing Azure resources. Version 1 was originally conceived and written using Node.js, and offered the ability to manage Azure resources from Linux and macOS as well as from Windows (prior to PowerShell Core being available on macOS and Linux). Version 2 of the Azure CLI was rewritten in Python for better platform compatibility and, as such, there is not always direct one-to-one compatibility between commands across the two versions. Obviously, you should use version 2 where possible.

Head over to the Microsoft docs article titled Install the Azure CLI for instructions on installing the CLI on your operating system of choice; you might also be interested in how to do so on my personal Linux distribution of choice (Linux Mint) in my post titled Installing Azure CLI on Linux Mint. Installing the Azure CLI enables the az command syntax from your Windows command prompt, your PowerShell prompt, or bash (if you are using Linux).

Once installed, you will prefix every instruction with the az command. Thankfully there is strong contextual help, so you can simply run az for a full list of available subcommands, or run az with a specific subcommand for help on that.

To execute an az command, you will first need to log in with the Microsoft account that you use to access your Azure subscription/s, as follows:

az login

This will return the following message:

Note, we have launched a browser for you to login.
For old experience with device code,
use "az login --use-device-code"

A browser window will automatically be launched for Azure, requiring you to log in with your credentials. Alternatively (as you can see in the message), you can use the old authentication method, which is especially useful if your machine does not have a browser. In this case, you would run the following command:

az login --use-device-code

Then log into your account from a browser entering the device code provided.

Either way, after you have done this, you can issue az commands against your Azure resources, for example:

az vm list

This would list all current Azure IaaS VMs in your subscription.
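A plain az vm list dumps every property of every VM; the CLI's --query argument (which takes a JMESPath expression) lets you trim that down. Below is a dry-run sketch: the run wrapper just echoes the commands so the sketch works without a live subscription, and the resource group name is a placeholder of my own:

```shell
# Dry-run wrapper: echo the az command instead of executing it.
# Swap the body for: "$@"  to run the commands for real.
run() { echo "$@"; }

# All VM names and their resource groups as a compact table
run az vm list --query '[].{name:name, rg:resourceGroup}' -o table

# Only VM names in one (placeholder) resource group, as plain text
run az vm list --query "[?resourceGroup=='RG_CoreInfrastructure'].name" -o tsv
```

The --query and -o (output format) arguments are global, so the same approach works across other az subcommands too.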

Another useful tip with az is the find subcommand, which searches command documentation for a specific phrase. For example, to search for Cosmos DB related commands I would use the following:

az find --search-query cosmos

Returns the following list of commands:

`az cosmosdb database show`
    Shows an Azure Cosmos DB database

`az cosmosdb database create`
    Creates an Azure Cosmos DB database

`az cosmosdb database delete`
    Deletes an Azure Cosmos DB database

`az cosmosdb collection create`
    Creates an Azure Cosmos DB collection

`az cosmosdb collection delete`
    Deletes an Azure Cosmos DB collection

`az cosmosdb collection update`
    Updates an Azure Cosmos DB collection

`az cosmosdb database`
    Manage Azure Cosmos DB databases.

`az cosmosdb collection`
    Manage Azure Cosmos DB collections.

`az cosmosdb delete`
    Deletes an Azure Cosmos DB database account.

`az cosmosdb update`
    Update an Azure Cosmos DB database account.

Az PowerShell module

You might already be wondering: if PowerShell had a way to manage Azure resources (through the AzureRM module) and we now have the Azure CLI (providing cross-platform az functionality), how could there be any synergy between those two environments?

The answer is that there isn't. This is one reason why AzureRM is deprecated.

With the arrival of PowerShell Core on Linux and macOS, it became possible to import the AzureRM module into those environments too, and yet, as we have already found out, the Azure CLI was the newer mechanism to manage Azure resources. The problem was that the two used (however subtly) different commands. In December 2018, Microsoft addressed this situation by introducing the new PowerShell Az module to replace AzureRM, giving a level of synergy between managing Azure resources through the Azure CLI and managing them through PowerShell. This means that if you understand one command-line environment, your scripts will be relatively easily transferable to the other.

If you are looking to migrate from AzureRM to the new Azure Az module then you should check out this excellent post by Martin Brandl. It is also worth you checking out this announcement from Microsoft titled Introducing the new Azure PowerShell Az module.

To install the PowerShell Az module you can follow the Microsoft article titled Install the Azure PowerShell module. As you will read, you must install this module in an elevated prompt, otherwise the installation will fail. On PowerShell Core for Linux, this means that from bash you would first start an elevated PowerShell session as follows:

sudo pwsh

Then you can run the following PowerShell Install-Module command:

Install-Module -Name Az -AllowClobber

Once installed and imported, you can utilize the new PowerShell Az module, but you must first log in to your Azure account, in this case using the Connect-AzAccount cmdlet:

Connect-AzAccount

This will return a message similar to the one below:

WARNING: To sign in, use a web browser to open
the page and
enter the code D8ZYJCM2V to authenticate.

Once you have performed this action, your Az module in PowerShell will be ready to use. For instance, we can now list IaaS VMs in PowerShell as follows:

Get-AzVM

As you may note, there is a correlation between this command and the Azure CLI command (az vm list) that we ran earlier. However, it is important to realize that the functionality and behaviour of the PowerShell Az module and the Azure CLI are not identical. For instance, in this specific case the Azure CLI returns a verbose result set (JSON by default), whereas the PowerShell Az module returns the results in a shortened tabular format.


Hopefully, it is clear by now that the AzureRM module is redundant across all environments and, if you need to manage resources in Azure through PowerShell, the Az module is what you should be using. However, given the ease of multi-platform deployment and use of the Azure CLI, it is probably something you should not ignore, and you might arguably prefer it over PowerShell Az (or at the very least run both alongside each other). For example, at the time of writing, exporting an Azure resource template to PowerShell results in code that uses AzureRM rather than PowerShell Az, whereas exporting to CLI uses (of course) the Azure CLI itself.

There is also an argument that the Azure CLI is far better suited to automated Azure deployments than ARM templates due to its brevity; this is discussed in detail by Pascal Naber in his excellent post titled Stop using ARM templates! Use the Azure CLI instead.

Whether you eventually decide to favour the Azure CLI over PowerShell Az (or use both), I sincerely hope this post has helped clear up any confusion between all the available command-line options for managing Azure resources. I know it had confused me!