PowerShell DCR Log Analytics: Part 2.1 – Overview

Welcome!

This is part one of the second generation of the Log Analytics learning series. Here we will cover…

  • Why a second generation?
  • What is Log Analytics?
  • What can Log Analytics do for you and your organization?
  • Why use custom PowerShell?
  • Why not use an agent?
  • My data team doesn’t want to work inside Log Analytics? And, what about long-term storage?
  • How does it work?
  • Requirements
  • An overview of how the solution works.
  • Credit



Why a Second Generation?

Those familiar with the first generation of this series may be wondering why I would go back and redo so much documentation. Well, in short, a lot has changed. The Function App at the core, the setup process, and the client-side scripts have all been substantially altered as a result of the new HTTP(S) authentication mechanisms explained here. This is a big leap forward!

As a result though, I plan to comb back through ALL of these articles, touch them up, and publish new versions. This includes updated wording to explain the new setup, removal of all the security concern sections, new explanations on how things work, and some new setup pictures and processes because Microsoft can’t leave buttons in Azure the same for more than 5 minutes.

I will be leaving all the prior article versions online, though, so that anyone who was mid-way through, or has yet to upgrade, is not SOL for documentation. As I re-publish each individual article anew, a header will be added to the old version indicating where to find its updated counterpart, such as this one.

What is Log Analytics?

Log Analytics is a Microsoft Azure Cloud tool used for the collection, storage, manipulation, and dashboarding of data. What kind of data? In all honesty, when you go the custom route, this can be anything. You can collect dynamic information like Windows Event logs such as logon/logoff, screen saver events, Windows updates starting and succeeding/failing, etc. You can also collect more static data like device hardware inventory, application inventory, network information, disk information, etc. If you can locate/query the value you want to collect via PowerShell, then you can collect that data.
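For a sense of what that looks like in practice, here is a minimal sketch of the kind of client-side query involved, collecting basic disk inventory. The property names and output shape are purely illustrative, not taken from any provided script.

  # Gather basic disk inventory via CIM and shape it into objects ready for upload.
  $Disks = Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DriveType = 3" | ForEach-Object {
      [PSCustomObject]@{
          ComputerName = $env:COMPUTERNAME
          DeviceID     = $_.DeviceID
          SizeGB       = [math]::Round($_.Size / 1GB, 2)
          FreeGB       = [math]::Round($_.FreeSpace / 1GB, 2)
      }
  }
  # Convert to JSON, ready to hand off to the rest of the pipeline.
  $DiskJson = $Disks | ConvertTo-Json -Depth 3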

Key knowledge: With Log Analytics, you pay to ingest data (a cost per GB ingested) but, on the base plan, are not charged a storage fee. However, the base plan only allows for 30 days of storage. If you need longer-term storage, you can either pay for a longer retention period within Log Analytics or see the “And, what about long-term storage?” section below for options outside of Log Analytics. The actual cost discussion will come in the next part.

What can Log Analytics do for you and your organization?

For those using SCCM/MECM, the information I just listed may not sound all that new. However, if you are moving to modern Intune management (or perhaps starting out there), you will be sorely disappointed by the “shortcomings” of Intune reporting.

You might have heard of Discovered Apps and think to use that, for instance. However, if you read that article you will start to notice the concerning number of purple notes. Making matters worse, a few years ago we contacted Microsoft asking why MANY apps aren’t in this list at all, including some that were Microsoft authored, only to be given a bug # and this article detailing many of the shortcomings we had found – an article which, as of June 2023, hasn’t been updated since 6/29/21. So, it’s been nearly two years.

Updated 6/29/21: We are actively testing and baking a solution to improve the accuracy and timeliness of discovered apps. Though we don’t have any dates to currently share, stay tuned to this post and our In development docs for future updates.

It sounds like some improvement may come to this in Q3 2023, but I for one will not be holding my breath.

So, to answer the question originally posed in this section: what Log Analytics can do is give you the data you need and that, sadly (as of today), Intune cannot provide.

Why use custom PowerShell?

While I have yet to explain it fully, this solution involves a lot of PowerShell. In short, we use PowerShell both to query the information we want on devices and to pass it to the cloud via an Azure Function App. The good news is you don’t have to reinvent every wheel on your own, but knowing PowerShell will be extremely beneficial throughout this guide.

To answer the question directly, though: by using PowerShell and the REST APIs Microsoft provides, rather than any currently available Microsoft agent, we can precisely customize what we target, how often we target it, and apply extremely fine filtering.

For instance, we might want to collect only a few specific event IDs every 4 hours, capture only those where the username in question isn’t the system account, or perhaps only those mentioning a specific path. That is all possible through the magic of PowerShell, and because you only ingest the data you actually want, it means you save money.
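As a rough sketch of what that kind of filtering can look like, here is one way to do it client-side. The event ID, account, and path used below are illustrative placeholders, not part of any provided script.

  # Illustrative only: pull process creation events (ID 4688) from the last 4 hours,
  # skip anything run by the SYSTEM account, and keep only a specific path.
  $Since  = (Get-Date).AddHours(-4)
  $Events = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4688; StartTime = $Since } -ErrorAction SilentlyContinue |
      Where-Object {
          $EventXml = [xml]$_.ToXml()
          $User = ($EventXml.Event.EventData.Data | Where-Object Name -eq 'SubjectUserName').'#text'
          $Path = ($EventXml.Event.EventData.Data | Where-Object Name -eq 'NewProcessName').'#text'
          ($User -ne 'SYSTEM') -and ($Path -like 'C:\Program Files\SomeApp\*')
      }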

Why not use an agent?

I suspect that what I just said may stir questions in some. Is the advanced control of PowerShell really worth it compared to going with a Microsoft-provided agent?

The short answer is, of course, yes.

The long answer is that the existing offerings either…

  1. Can’t collect the same items, such as Windows Security event logs and static data (think device inventory information – things that are not logs), or
  2. Don’t have good overall platform support, including for Microsoft offerings such as Windows 365 and general physical Windows devices.

If you are curious to know more about how these solutions are sadly not yet enough to take over in this process, you can read more about it here: Azure Monitor Agent: The future is not here (yet). – Getting the Most Out of Azure (azuretothemax.net)

My data team doesn’t want to work inside Log Analytics? And, what about long-term storage?

You might be thinking to yourself “Well, it’s great that we can get this data into Log Analytics, but my data team uses other tools. They don’t want to work inside Log Analytics.”

That’s where Azure Event Hubs come into play. Once the data is in Log Analytics, you can ship it to an Event Hub, and from there to any system that can pull out of an Event Hub, such as…

You can then store it long-term in one of those systems, where the storage is likely cheaper. And, if you have a system like SNOW that can pull from Sumo, then the chain of places you can move data to and play with it just keeps going.

You can bring all your company data into one place and finally unite your Graph calls with genuinely accurate application inventory, security logs, local admin reports, registry values, system uptimes, network locations, etc.

And again, you can quite literally collect anything that exists on client devices.

How does it work?

There will be full articles for each step of this process that goes into fine detail as to how each component works, along with how to set it up. This section is just an overview.

In the words of Nickelback, look at this photograph. (Sorry, I just had to move that joke over from the original series.)



That’s how it works. It looks a little scary even for me to look at – and I figured it all out!

Allow me to explain the primary flow.

  1. We use Intune Proactive Remediations to send our Intune clients a PowerShell script. This is basically a fancy, modern, centrally managed task scheduler.

    – Example scripts will be provided for learning purposes, as well as some ready-to-deploy solutions much later on.

  2. That script queries what you want, assembles the data into a usable format, and kicks it off to a Function App in the Azure cloud (see the sketch after this list).
  3. The Function then validates the legitimacy of the upload request, processes the data to be uploaded, and gets it shipped off to the needed Data Collection Endpoint (DCE) and corresponding Data Collection Rule (DCR).

    – The function app will be provided and is very easy to deploy.

    – For those familiar with AADDeviceTrust and my contributions to it, yes, this new series uses a new Function App using the latest and greatest in authentication security.

  4. The DCE and DCR process the data further (if needed) and ship it to a Custom Log Analytics Table.
  5. You can then query the data, build Workbooks in Log Analytics, and create alerts, all to your heart’s content (a small query sketch follows the note below).
  6. If you want to store it elsewhere, you can ship the data to an Azure Event Hub and then to external destinations from there.
  7. If you ship it to, say, Sumo, and you have a system that can pull out of Sumo like SNOW or QRadar, then the chain of data continues.
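To make steps 1 through 3 a little more concrete, here is a heavily simplified sketch of what a client-side upload can look like. The function URL, table name, and payload shape are hypothetical placeholders, and the AADDeviceTrust authentication headers the real scripts attach are omitted entirely.

  # Hypothetical function URL and payload shape, for illustration only.
  $FunctionUri = "https://my-dcr-function.azurewebsites.net/api/LogCollectorAPI"

  $Body = [PSCustomObject]@{
      LogType    = "DeviceInventory_CL"   # illustrative custom table / stream name
      LogPayload = @(
          [PSCustomObject]@{
              ComputerName = $env:COMPUTERNAME
              CollectedUtc = (Get-Date).ToUniversalTime().ToString("o")
          }
      )
  } | ConvertTo-Json -Depth 5

  # The real client scripts also attach the AADDeviceTrust authentication headers here.
  Invoke-RestMethod -Uri $FunctionUri -Method Post -Body $Body -ContentType "application/json"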

Note: Those familiar with my previous series will likely realize this is much less information than last time around. This is true, but only because I didn’t feel an overview was the right place to dump a lot of that information. You can still expect to find those details in more appropriate sections later on.
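And once the data lands in the custom table (steps 4 and 5 above), you are not limited to the portal for pulling it back out. A minimal sketch, assuming the Az.OperationalInsights module, a signed-in session, and a hypothetical workspace ID and table name:

  # Assumes Az.OperationalInsights is installed and you have run Connect-AzAccount.
  # The workspace ID and table name below are hypothetical placeholders.
  $WorkspaceId = "00000000-0000-0000-0000-000000000000"
  $Query       = "DeviceInventory_CL | where TimeGenerated > ago(24h) | take 10"
  $Result      = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceId -Query $Query
  $Result.Results | Format-Table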

Requirements:

Last but not least, I want to mention some requirements.

  • You need to have Azure AD joined devices. These can be fully AAD joined or hybrid joined devices.

    Note: Devices joined only to a local domain are not compatible with this solution due to the authentication methods used by both the original and new Function App.

  • Intune Proactive Remediations have a licensing requirement. Odds are that any enterprise with an interest in this level of data will already have this licensing. There is certainly more than one way to run a script on a schedule, but this is the only option I will be covering, as it’s by far the most enterprise-friendly.

There are, of course, other sources of cost besides licensing. That will be a discussion for the next part of the series.

Additionally, I also want to cover what you do NOT need. Some of these are items that were required in older versions and are no longer needed; others are items I frequently see confusion over, where people believe they need them.

  • You do NOT need PowerShell 7.2 on client devices.
  • You do NOT need a Key Vault for this solution. No secrets or anything like that are used which would require one.
  • You do NOT need an enterprise application registration, nor do you need to worry about a secret key, expiring or otherwise.*

*Note: You may be wondering why Visual Studio Code was on the previous diagram. You may also be wondering why I just specifically called out that you don’t need an enterprise application (or secret keys) yet show an enterprise application registration on the diagram. This will be discussed in further detail as part of the Function App section; in short, those sections of the diagram are for developers. They are not something you need to worry about from a production standpoint.

Credit:

Note: I moved this from section 2.4. While it likely won’t make sense to folks who are very new to this topic, I think it’s important that it be laid out as clearly as possible near the beginning.

Like many works, this project of mine is built on the work of others, and credit is due there… but it’s also now rather difficult for even me to get my head around.

I obtained a copy of Jan Ketil Skanke’s (@JankeSkanke, of the MSEndpointMGR team) Intune Enhanced Inventory project in June 2022, which is what kickstarted my interest in Log Analytics and sent me on this journey.

The original DCR Function App I made was a modified version of Jan Ketil Skanke’s (@JankeSkanke) / the MSEndpointMGR team’s (now legacy) HTTP API Function App used in their Intune Enhanced Inventory project here / here. Again, that API is now being moved away from and has many technical limitations, such as the inability to forward to Event Hubs, hence I upgraded it to the new API. In that old version of my Function App, much of their code remained, as I only upgraded it to use the newer API and left almost everything else (primarily the endpoint authentication) as it was.

However, in this new version we are now deploying, I believe nothing is left of that original MSEndpointMGR Function App outside of the concepts. The sections which handle uploading the material and handling the payload were again upgraded to the Log Ingestion API by me, and the rest of it, which handles the authentication, now uses the work of Nickolaj Andersen (@NickolajA) and his AADDeviceTrust project. Nickolaj is also part of the MSEndpointMGR team.

More confusingly, I then became a contributor to the AADDeviceTrust project, was invited to document how it worked, and while doing that spotted an opportunity to improve part of it. I don’t know when those additions will be rolled into the main project. So, for now, the functions needed to make this authentication work live directly in the code of the Function App rather than in PowerShell modules linking back to that project.

But wait, there’s more! All of the client-side scripts I use are still based on Jan Ketil Skanke’s work to one degree or another. However, much like the original Function App, I had to modify them: creating the functions for using the DCR API, changing the upload/validation procedure to work with my new Function App, and so on. Then I had to add even more to make them work with the new AADDeviceTrust project. Even so, there are still places where I use their code for tasks like pulling installed apps. I will make sure to provide more exact credit on this in the articles directly involving client-side scripts.

All in all, I even get confused trying to figure out who gets credit where nowadays. I just know what I have created on top of the work of others is freaking cool and I want to share it with the community!

If you want to know more about the AADDeviceTrust project and my contributions, you can see this article here.



The Next Steps:

See the index page for all new updates!

PowerShell DCR Log Analytics: Part 2.2 – Cost – Getting the Most Out of Azure (azuretothemax.net)

Log Analytics Index – Getting the Most Out of Azure (azuretothemax.net)




Disclaimer

The following is the disclaimer that applies to all scripts, functions, one-liners, setup examples, documentation, etc. This disclaimer supersedes any disclaimer included in any script, function, one-liner, article, post, etc.

You running this script/function or following the setup example(s) means you will not blame the author(s) if this breaks your stuff. This script/function/setup-example is provided AS IS without warranty of any kind. Author(s) disclaim all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall author(s) be held liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the script or documentation. Neither this script/function/example/documentation, nor any part of it other than those parts that are explicitly copied from others, may be republished without author(s) express written permission. Author(s) retain the right to alter this disclaimer at any time. 

It is entirely up to you and/or your business to understand and evaluate the full direct and indirect consequences of using one of these examples or following this documentation.

The latest version of this disclaimer can be found at: https://azuretothemax.net/disclaimer/
