How much storage am I using?


We were recently asked how much Azure storage we are actually using - initially I thought this was going to be a simple thing to find out, but it turned out to be a little more complex than that, so here are a few notes about how I went about it.

Still kind of feels like I missed some easier method of doing this - if so please let me know....

My first thought was that this is something that is just available in the portal - and indeed it is for a single storage account, on the Insights blade, which gives a total along with a breakdown of the different storage types.




My next stop was Azure Monitor and its storage insights - I thought this would allow me to just scale up that view and get the overall picture. Navigating there I see this:



So it looks like this might give me what I want - there is a button to download to Excel, which should then allow me to just sum everything up. However there is an issue - we seem to have too many storage accounts for it to cope with: the built-in report is limited to 200 resources, meaning I would have to filter and re-run it multiple times to cover everything - likely to be error prone and very time consuming.

So I thought I'd just create my own copy of the report and tweak it to remove the limits - easy enough. I then discovered that this approach has two issues:

1. There is a hard limit of 1000 items internally in Azure Workbooks - after that, data is simply not shown.

2. A lot of the time I was hitting Insights API errors - not sure if that was due to throttling or some sort of capacity issue at the time, but it was not reliable.

Then I had a rethink - maybe this is available via a Resource Graph query? (Well, the short answer to that is that it isn't.)
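For what it's worth, Resource Graph will happily list the storage accounts themselves (this needs the resource-graph CLI extension) - it just has no usage metrics to go with them:

az extension add --name resource-graph
az graph query -q "Resources | where type =~ 'microsoft.storage/storageaccounts' | project id, name, location"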

So what to do? It looked like I was going to have to write some sort of script - with the az CLI, PowerShell, or something calling REST APIs - in order to gather all this data.

Of these options the easiest for me is the az CLI - so I wrote a short script to extract the data I needed.

This is 'cheap and cheerful' - no error checking and a real bare-bones script - but it gets the job done. The bonus is that you can just run it in Cloud Shell (bash) from a browser.

So here is the script:

# loop over every subscription visible to this login
for ider in $(az account list | jq -r '.[].id')
do
  # switch the CLI context to this subscription
  az account set --subscription "$ider"
  # find every storage account in the subscription
  for rec in $(az resource list --resource-type Microsoft.Storage/storageAccounts | jq -r '.[].id')
  do
    # grab the UsedCapacity metric and append "resource id,bytes" to the output file
    az monitor metrics list --resource "$rec" --interval PT1H --metric "UsedCapacity" | jq -r '.value[].id,.value[].timeseries[].data[].average' | paste - - -d "," >> rich.txt
  done
done

There is nothing particularly clever here - I just loop through all subscriptions, within each subscription loop through all the storage accounts, and then extract the metrics information, sending the output to a plain text file.

I make use of jq to extract values from the JSON data that is returned - it's a really nice utility for that, it just takes a bit of getting your head round the syntax.
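To see what the jq and paste steps are doing, here is the same filter run against a cut-down stand-in for the metrics JSON (the real output has many more fields than this):

echo '{"value":[{"id":"/subscriptions/xxx/.../acct1","timeseries":[{"data":[{"average":123456789}]}]}]}' | jq -r '.value[].id,.value[].timeseries[].data[].average' | paste - - -d ","

which prints /subscriptions/xxx/.../acct1,123456789 - i.e. jq pulls the two values out onto separate lines and paste glues each pair back together with a comma.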

The end result is a plain text file with 2 columns: the full resource ID of the storage account (containing the subscription, resource group and storage account name) along with the used capacity in bytes.

All I then need to do is sum up the 2nd column (the total bytes used) and convert it into a suitable measurement unit - in our case PB.....
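That step can stay on the command line too - a one-liner along these lines does the sum (strictly this gives PiB since it divides by 1024^5; use 1000^5 if you prefer decimal PB):

awk -F, '{ total += $2 } END { printf "%.3f PB\n", total / 1024^5 }' rich.txt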

A useful exercise - but it still feels like it should be easier - lifting the workbook limit would make this far simpler, for example. I also found that my approach is much slower at fetching the data than the workbook - so the workbook is evidently using a much more efficient API than the command line.

If nothing else, jq is definitely something worth knowing about.

P.S. The above report does not include storage for disks attached to VMs - those are handled differently, as we use managed disks for all VMs and that storage sits in Microsoft-managed storage accounts.
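If you did ever need a figure for disks, a rough sketch would be to sum the provisioned managed disk sizes across subscriptions - note this counts allocated rather than used space:

for ider in $(az account list | jq -r '.[].id')
do
  az account set --subscription "$ider"
  # diskSizeGb is the provisioned size, not the bytes actually consumed
  az disk list --query "[].diskSizeGb" -o tsv
done | awk '{ total += $1 } END { print total " GB provisioned" }'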


