Anyone remember Schrödinger's cat - the thought experiment from the 1930s that pondered whether the cat was alive or dead? Well, I'm not going to go into that, but I will use it as a pointless premise to have a picture of a cat. If the picture shows, the cat is alive; if it doesn't, the cat is dead.
Now that's set a little backstory, on to the technical point I want to talk about - Azure Storage. This has been around since Azure was born, as far as I know (I wasn't using it back then). It allows you to store files (among other things) directly in a storage container without the need for any kind of file server to make that access possible.
Access to the files is then made available via HTTP/HTTPS calls to put or get the file - this all works nicely and is one of the core features of modern apps in Azure.
What a lot of people realised early on is that this means you can use Azure storage as a very cheap webserver - as long as you only want to serve static content - the HTTP/HTTPS endpoints mean any files saved to Azure storage can just be displayed directly in a browser via that same endpoint. This means that simple image files or simple HTML (even with JavaScript in, as that would get rendered by the browser) can just be stored in Azure storage and made available as a 'website'. This can all be done very cheaply - see some pricing here https://azure.microsoft.com/en-us/pricing/details/storage/blobs/ - and it's very easy to add caching via CDN and have DR with multiple copies of the storage - quite a compelling case for simple stuff.
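As a quick illustration of that static hosting behaviour, here is a minimal Python sketch that just does an HTTPS GET against a blob URL - it uses the demo account/container/file names that appear later in this post, so treat them as placeholders for your own:

```python
import requests

# Public blob URL - the account, container and file names are just the demo values from this post
blob_url = (
    "https://thisisastoragedemo.blob.core.windows.net"
    "/demo/kitten-3277268_640.png"
)

# With the container set to anonymous (public) read access this returns 200
# and the image bytes, exactly as a browser would see it
response = requests.get(blob_url)
print(response.status_code)                    # 200 -> the cat is alive
print(response.headers.get("Content-Type"))    # image/png

# Save the picture locally
with open("kitten.png", "wb") as f:
    f.write(response.content)
```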
Anyway, let's now get on to what I wanted to show, which is how you can protect access to the storage endpoints when you don't want to let the whole world see this.
Now there are two main ways to do this:
1) Restrict access via some sort of authentication requirement
2) Restrict access via 'firewall' / network restrictions
Let's talk about both of those - but for starters let's create a storage account and a cat picture to set the scene.
So we navigate to the portal and add a storage account - here below I just call it 'thisisastoragedemo'. I make this blob-only storage, as that's all that is required for this 'web hosting' - the other options really just affect the cost, not whether it works or not.
So once that's set up we need to add a container (a folder, basically) which will hold my picture - note here I set the anonymous read access level for blobs to public - this means anyone with internet access can see this file from anywhere in the world.
Now I click on upload to add a file
Choose the kitten picture and upload
And it's added
I now navigate to the ... symbol and click the blob properties option
Amongst other things this gives me the full URL to my pic
If I paste that into a browser, the picture happily displays and Schrödinger's cat is alive!
Now I start getting a little paranoid and decide I only want people who have a special secret token to be able to access this file - how do I do that? Well, if I navigate to the 'shared access signature' area of the storage account I am presented with this screen.
This gives a lot of options that enable me to control access - I can create a special SAS token that only allows certain types of operations, is only valid for a certain time span, can only be used over HTTPS and, in a limited way, only from certain IP addresses (more on this later).
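As an aside, the same kind of account-level SAS that this portal screen produces can also be generated in code. Here is a rough sketch using the azure-storage-blob Python SDK (v12) - the account key is a placeholder, and I'm only granting blob read access for an hour rather than the full set of options the portal shows:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

ACCOUNT_NAME = "thisisastoragedemo"
ACCOUNT_KEY = "<storage-account-key>"   # placeholder - one of the two account keys

# Account-level SAS: read-only, valid for one hour, optionally locked to a single
# source IP address (the same restriction the portal screen offers)
sas_token = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    ip="99.99.99.99",   # optional - limit where the token can be used from
)

print(sas_token)   # this is the query string you append to the blob URL after a '?'
```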
To now disable public access I go back to the main container screen and set the access level to private
Now when I try to access the file it doesn't work - the cat is 'dead'
This also blocks my portal access to the container - so the cat really is 'dead' at this point (i.e. no downloading of the file via the browser interface).
However, if I access the file with the SAS token appended to the end of the URL it works nicely, and we can see the cat is alive and well - so that's accessing it using something like this (with the token appended on the end):
https://thisisastoragedemo.blob.core.windows.net/demo/kitten-3277268_640.png?sv=2017-07-29&ss=b&srt=sco&sp=rwdlac&se=2018-05-01T01:10:00Z&st=2018-04-30T17:10:00Z&spr=https&sig=Win3xzU%2BnsdlGXRWFoEIHZ%2FzGipbRGRDNd0DqacJ9lc%3D
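To sanity-check that behaviour outside a browser, here is a small sketch (again assuming the demo URL above and a SAS token you have already generated) that requests the blob with and without the token appended:

```python
import requests

blob_url = (
    "https://thisisastoragedemo.blob.core.windows.net"
    "/demo/kitten-3277268_640.png"
)
sas_token = "<sas-token-query-string>"   # placeholder - the full SAS query string

# The container is now private, so an anonymous request is refused
# (Azure typically returns 404 ResourceNotFound rather than 403 here)
print(requests.get(blob_url).status_code)                   # the cat is 'dead'

# With the token appended after a '?' the blob is served as before
print(requests.get(f"{blob_url}?{sas_token}").status_code)  # 200 - the cat is alive
```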
All very nice - but what if I only want to allow access from a certain source IP address - how can I do that? We saw earlier that this is possible in the SAS token itself, but that is very basic - we want some way of having much more control. This is solved by a recent addition to the storage service.
If we navigate to the firewalls and virtual networks option we can see there are lots of options here - we can allow particular vnets/subnets within Azure, allow IP addresses/ranges for non-Azure networks, as well as grant some global exceptions that may be useful.
In my case I just want to give a single IP address access - so as I have it above only 99.99.99.99 can access the storage. If I save this, access is blocked from any other device, as you can see in the screenshot below (even if I have the SAS token it would be blocked at this point) - the cat is again 'dead'.
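If you prefer to set that firewall rule in code rather than in the portal, here is a rough sketch using the azure-mgmt-storage and azure-identity Python SDKs - the subscription ID and resource group are placeholders, and the exact model classes may differ slightly between SDK versions:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    IPRule,
    NetworkRuleSet,
    StorageAccountUpdateParameters,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "<resource-group>"     # placeholder
ACCOUNT_NAME = "thisisastoragedemo"

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Deny everything by default, then allow a single public IP address -
# the code equivalent of picking 'selected networks' and adding 99.99.99.99
client.storage_accounts.update(
    RESOURCE_GROUP,
    ACCOUNT_NAME,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",
            ip_rules=[IPRule(ip_address_or_range="99.99.99.99")],
        )
    ),
)
```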
If I access it from my machine (99.99.99.99) then it works - but only if I also specify the SAS token. At this point we are protected in two ways, and we can see the cat is 'alive' again.
A side note to all of this is how do you find out what 'my' IP address is? For a simple laptop/desktop client you can just type 'what's my IP' into Google and it tells you - and you can use this directly in the portal config, as in my case the traffic to Google is going out via the same infrastructure as the traffic to the storage endpoint - this is probably the case in most organisations.
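If you would rather script that check than Google it, any public 'what is my IP' service will do - for example (assuming api.ipify.org is reachable from your network):

```python
import requests

# Ask a public echo service what source address our traffic appears to come from.
# Note this only tells you the IP as seen by *that* service - as described below,
# traffic to Azure storage may leave your network via a different path entirely.
print(requests.get("https://api.ipify.org").text)
```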
However, in our network at least, the routing is a little 'complicated' when servers need internet access - so complicated in fact that it's actually quite hard to find out what Azure will see as the source IP address...
So as an example, if we access Google from the server LAN it goes one way; if we access Azure storage it goes via ExpressRoute public peering and the source IP is hidden behind some other firewall NAT (in fact multiple addresses here). So how can I find out what IP/range I need to add to grant the network access?
There is more than one way of solving this - but the simplest seemed to be to do the following:
First we enable some diagnostics that record what's going on with the storage account - this seems to only be there as a 'classic' feature for some reason.
That then logs everything into a 'hidden' container under the same storage account called $logs - we can't see this in the Azure portal, but we can see it in Storage Explorer. If I open that up and navigate I can see it - underneath that are folders for each month/day and we can find the most recent one.
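You can also get at that $logs container programmatically - here is a sketch with the azure-storage-blob SDK (the account key is a placeholder) that just lists what is in there:

```python
from azure.storage.blob import ContainerClient

# $logs doesn't show up in the portal container list, but it can be opened
# directly like any other container if you know its name
container = ContainerClient(
    account_url="https://thisisastoragedemo.blob.core.windows.net",
    container_name="$logs",
    credential="<storage-account-key>",   # placeholder
)

# Log blobs are organised into date-based folders underneath the service name
for blob in container.list_blobs():
    print(blob.name)
```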
If I open that I can see the access and find the source IP - an example line is shown below:
1.0;2018-04-30T13:55:10.9527589Z;GetBlob;AnonymousSuccess;200;136;136;anonymous;;thisisastoragedemo;blob;"https://thisisastoragedemo.blob.core.windows.net/demo/kitten-3277268_640.png";"/thisisastoragedemo/demo/kitten-3277268_640.png";03d79907-101e-0069-648a-e08b93000000;0;99.99.99.99:33530;2009-09-19;178;0;286;380;0;;;"0x8D5A6C071D199BC";Friday, 20-Apr-18 13:13:01 GMT;;"Wget/1.12 (linux-gnu)"
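The analytics log is just a semicolon-delimited line, so pulling the client IPs out of a downloaded log file is straightforward - a rough sketch (the local file name is a placeholder, and it simply picks out the field that looks like ip:port rather than relying on a fixed column position):

```python
import re

# Matches the client "ip:port" field in a storage analytics log line, e.g. 99.99.99.99:33530
IP_PORT = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3}):\d+$")

client_ips = set()
with open("storage-analytics.log") as logfile:   # a log file downloaded from $logs
    for line in logfile:
        for field in line.rstrip("\n").split(";"):
            match = IP_PORT.match(field)
            if match:
                client_ips.add(match.group(1))

# Unique source IPs that have hit the storage account - candidates for the firewall rules
print(client_ips)
```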
So we could, if we wanted, build some sort of Log Analytics query on top of this to have some kind of access audit of who is looking at what.
Anyway - there you go, some basics on how storage can be accessed and secured - hope that's useful.
* No cats were harmed in the making of this production.......