This week highlighted the importance of knowing who does what in Azure - a resource (in this case a storage account) had been removed and we needed to know who had done it and when.
Helpfully this is provided out of the box and all control plane operations are automatically logged (whether they come from the portal, PowerShell, the REST APIs or any other form of interaction with Azure). To access this information we just need to navigate to the 'Activity log' feature in the portal and use the search interface there to find what has happened.
So if we open that up we get a simple search interface to find what we want - see the pic below
Now this is all great - but the limitation is that the activity log only goes back 90 days - what if we want to check on something that happened before that, or do more in-depth analysis of the data? The good news is that there is a very easy way to enable this and we have to do almost no work to set it up - so let's do that.
Again we navigate inside the portal, this time to the Log Analytics option - from there we create a new workspace
Once we have that we just need to configure it - to do that we navigate to the workspace data sources area and choose Azure Activity log as the input source. Clicking on that gives us a list of all our subscriptions, and we can activate event forwarding from the activity log of each subscription into this central Log Analytics workspace
So in my case I work through and activate all the subscriptions I want to capture information for. It's just a case of clicking Connect on the pop-up window below for each one
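Once everything is wired up (and you have somewhere to run queries against the workspace - more on that below), a quick sanity check I found useful is something like the sketch below. It assumes the activity log data lands in a table called AzureActivity, which is what it was named in my workspace:

// Sketch: confirm each connected subscription is actually forwarding events
// (assumes the connector writes into the AzureActivity table)
AzureActivity
| summarize Events = count(), LatestEvent = max(TimeGenerated) by SubscriptionId
| order by LatestEvent desc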
Now that's in place the data is all being collected - but where do we actually go to query it?
Let's navigate to the OMS workspace that has been auto-created for us:
Once we get to that, we go to the solutions gallery and activate this option - it creates a prebuilt set of charts and also the dataset we can query using the Log Analytics query language.
So that's this solution:
The details for that look like this:
Once that has installed and activated (it takes a few minutes) we get this screen
Now that looks pretty and will no doubt keep management happy - but the real power of this comes with the ability to write custom queries against the data - so for example something like this:
ActivityLog | where some query | where some query
Here's a demo of that
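To give a more concrete (if made-up) example of the syntax, the sort of thing I mean is the sketch below - in my workspace the activity log data sits in a table called AzureActivity, and the caller address here is just a hypothetical placeholder:

// Sketch: what has a particular user been doing over the last 30 days?
// The caller value is hypothetical - substitute one of your own accounts
AzureActivity
| where TimeGenerated > ago(30d)
| where Caller == "someone@example.com"
| summarize Operations = count() by OperationName, ResourceGroup
| order by Operations desc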
For an ex-DBA at least, the syntax is very familiar and easy to pick up (there was an earlier query language inside OMS that was not so easy to use - but this new one is very easy).
So we can now easily write the audit check query we needed.
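For our original question - who deleted the storage account and when - it looks something like the sketch below. The exact operation and column names are my assumption (check what your own activity log actually records), but the shape of the query is the point:

// Sketch: who deleted a storage account, and when?
// Assumes the delete is recorded as a Microsoft.Storage/storageAccounts/delete
// operation - verify the exact values against your own data
AzureActivity
| where OperationNameValue contains "storageAccounts/delete"
| where ActivityStatusValue == "Succeeded"
| project TimeGenerated, Caller, SubscriptionId, ResourceGroup, Resource
| order by TimeGenerated desc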
The tool also lets us save the results as dashboards with charts etc. and drill down into the details - we can also export to Excel/Power BI for further analysis if required.
Microsoft even provides a dedicated website for working further with the data - it can be accessed by clicking the Analytics link near the top right of the page, which opens up this site
There is actually an incredible amount this tool is capable of and I have really only scratched the surface. There are loads of built-in solutions to help you process data from other sources, and you can very quickly get a lot of value from the data you have. The system can also process huge data volumes in a short space of time - I'm not sure how it's architected underneath but it's certainly very impressive.
The retention period for the workspace can be set to anything up to 2 years - which for our purposes is good enough.
One additional comment I would make about this part of the Azure toolset is that the cost model (for this part of OMS at least) can be pretty cryptic. This is one of the things Microsoft needs to be more transparent on - some of the other options in here seem to be per-node billing and it looks like it could become quite expensive, but it just doesn't seem clear (to me at least).
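One thing that does help a little is that the workspace lets you query your own ingestion volumes. The sketch below uses the Usage table (where Quantity is reported in MB, as far as I can tell) to show how much billable data each data type is contributing:

// Sketch: billable data ingested per data type over the last 31 days
// Quantity is in MB, so divide by 1024 for GB
Usage
| where TimeGenerated > ago(31d)
| where IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024 by DataType
| order by IngestedGB desc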
I really like the functionality you get out of it though for very little effort.