Webinar Transcript: Working with the new Exchange Cmdlets
This post is a transcript of the webinar that took place in February 2020. You can watch the on-demand version of the webinar here.
Vasil: Welcome everybody. Thanks for joining us for this webinar, on behalf of me and Ingo. Just a quick introduction: my name is Vasil Michev, I am an Office 365 MVP, and I have been working with Office 365 ever since it was released, which is now 9 years ago. A big part of that has been working with the Exchange cmdlets, so I can pretty much say I've been raised playing with those cmdlets for the past 9 years. We've asked Ingo to join us; he's also a fellow Microsoft MVP, and we will use his expertise as an administrator at one of the bigger Office 365 customers with a large number of seats. He will be able to give us the real-life experience. So Ingo, if you can introduce yourself real quick.
Ingo: Yeah, hello, nice to be here. My name is Ingo Gegenwarth, I'm working for SAP and I'm also a Microsoft Certified Master. Just to give you an idea here, our environment currently has roughly 130,000 user mailboxes, plus 10,000 to 15,000 shared mailboxes. We came from on-prem and we went the whole road down to Exchange Online right now, so we know all the environments with their disadvantages and advantages. We have the full-blown experience here, and we hopefully can share some help for you. With that, let's start, Vasil.
Vasil: Thank you, Ingo. So yeah, let’s go ahead and get started. First let’s introduce what’s the issue here. Why do we want to cover this topic, and what’s new on it?
So, as an Exchange administrator you're most likely aware of Exchange Online PowerShell and have been using it for a while, so you're pretty much aware of how it works, I suppose. And those of you who are on-premises administrators as well should be familiar with it. Exchange in particular is probably the primary workload that takes full advantage of PowerShell, compared to all the other ones. So, how does Exchange remote PowerShell work?
Basically, it's designed for a world where you're managing your own infrastructure, not designed for the cloud. So, when the same basic connectivity principles were applied to Exchange Online, we started seeing some issues. Most of you, for example, have seen issues with the amount of time it takes to perform basic tasks, which is caused by the amount of data that actually has to travel over the wire. Or you've seen frequent disconnects and throttling messages. Error messages like this one.
I suppose these are very familiar to most of you; over the course of the past few years we've repeatedly run into issues like that. And while trying our best to make sure that our scripts are able to complete, we've come across several different tips that can help us address the most common issues. So, moving on, here's a quick summary of the most common tips.
Tips to address most common issues
We've actually covered those in a previous webinar of mine, so I'll just give you the short version here. There are several things you can try, basically stuff like using the Invoke-Command method. This allows you to run a given script block directly on the server side, and then you can instruct the server to give you back just the data you need. If we have enough time, I will make sure to do some demos here to give you a real comparison, but in the interest of time, I've taken a quick screenshot of how things differ. So, if you go from the bottom to the top, you will see me running the Get-Mailbox cmdlet 10 times against my own environment. You can see that it usually completes in less than a second.
Now, using the Invoke-Command method, I can still use the same cmdlet, and you can see that there is some improvement, but it's not really visible. The real improvement comes when I use the Invoke-Command method and tell the server to give me back only the attributes that I care about. In this case, I've given the example of just getting back the Name and the ExternalDirectoryObjectId. As you can see from the result, this is pretty much twice as fast, and this is just for a smaller environment; I think I have, like, 29 mailboxes or so. You can of course get a bigger and bigger speed improvement with a larger number of mailboxes. But yeah, that's just one of the tips that we have.
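The technique described above can be sketched as follows. This is a hedged example against the classic (pre-V2) Exchange Online remote PowerShell endpoint that this webinar discusses; `$Credential` is assumed to hold your admin credentials.

```powershell
# Classic Exchange Online remote PowerShell session (pre-V2 module)
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $Credential -Authentication Basic -AllowRedirection

# Slow: import the session locally, then run Get-Mailbox,
# which serializes full mailbox objects over the wire:
# Import-PSSession $Session; Get-Mailbox

# Faster: run the cmdlet server-side and return only the
# attributes we actually need:
Invoke-Command -Session $Session -ScriptBlock {
    Get-Mailbox | Select-Object Name, ExternalDirectoryObjectId
}
```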
Other tips include making sure that you use filtering to get just the data that you care about. For example, if you want to set some property on just the user mailboxes, it doesn't make sense to use the Get-Mailbox cmdlet unfiltered, because it will return all the room mailboxes, shared mailboxes, and so on. And of course, there are many different examples of filters we can talk about. Again, if we have sufficient time we can go over some additional examples there, but in the interest of time, here's a simple screenshot presenting how you can use server-side filters to fetch, for example, just the mailboxes created on or after a given date, or the mailboxes that are currently put on hold, or all the groups a particular user is a member of, and stuff like that.
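As a rough sketch of the server-side filters mentioned above (the date and user name are placeholders; property names follow the standard Exchange Online OPATH filter syntax):

```powershell
# Only user mailboxes, skipping room/shared/equipment mailboxes
Get-Mailbox -RecipientTypeDetails UserMailbox -ResultSize Unlimited

# Mailboxes created on or after a given date (server-side filter)
Get-Mailbox -Filter 'WhenMailboxCreated -ge "02/01/2020"' -ResultSize Unlimited

# Mailboxes currently placed on litigation hold
Get-Mailbox -Filter 'LitigationHoldEnabled -eq $true' -ResultSize Unlimited

# All groups a particular user is a member of
# ("vasil" is a placeholder user name)
$dn = (Get-User "vasil").DistinguishedName
Get-Recipient -Filter "Members -eq '$dn'"
```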
Moving on, one of the common methods we sometimes use is to just pipe several cmdlets together. It's quite easy to do, and it allows you to perform some of the basic tasks. But when it comes to more complex tasks and more complex scenarios, it's not a very good idea to use this method, because you lose a lot of flexibility. You cannot add any meaningful error handling, for example; if something happens, it just breaks, and so on. So, one of the other tips we covered previously is to try and use your own script blocks or functions instead of just piping, where possible.
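A minimal sketch of that idea: replacing a `Get-Mailbox | Set-Mailbox` pipeline with a loop that handles errors per item, so one failure does not break the whole run (the retention policy name here is a placeholder):

```powershell
$mailboxes = Get-Mailbox -RecipientTypeDetails UserMailbox -ResultSize Unlimited
foreach ($mbx in $mailboxes) {
    try {
        # Placeholder policy name; substitute your own
        Set-Mailbox -Identity $mbx.Identity -RetentionPolicy "Default MRM Policy"
    }
    catch {
        # Log the failure and keep going instead of aborting the run
        Write-Warning "Failed to update $($mbx.Identity): $_"
    }
}
```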
Another similar tip is to reuse the data where possible; again, piping doesn't really help you with that. But you can, for example, store the output in a variable or save it to a file and then use it later. You can also take advantage of other methods which are currently available. For example, even back in the day we had the MSOnline module, which we could use to get a list of users and groups in the organization instead of having to rely on the Exchange cmdlets. Nowadays, we also have the Azure AD cmdlets for that, or the Graph API. Ingo, for example, can give you some rough estimates of how much of an improvement you can get from using those methods instead of the Exchange Online cmdlets. Ingo?
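Fetching the data once and reusing it from a variable or file might look like this (the file path is a placeholder, and the AzureAD module cmdlets are shown as the alternative directory source mentioned above):

```powershell
# Fetch once, keep in a variable, and persist to disk for later runs
$allMailboxes = Get-Mailbox -ResultSize Unlimited
$allMailboxes | Export-Csv -Path .\mailboxes.csv -NoTypeInformation

# Later: work from the cached copy instead of re-querying Exchange
$allMailboxes = Import-Csv -Path .\mailboxes.csv

# Directory data can come from the AzureAD module instead of Exchange,
# which is typically much faster for plain user/group listings
Connect-AzureAD
$users  = Get-AzureADUser -All $true
$groups = Get-AzureADGroup -All $true
```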
Ingo: Yeah, I also just wanted to mention, the point here is: use Azure AD. For instance, when you're in a hybrid environment, just retrieve the data you really need directly from Azure AD. It is much faster than going through Exchange, because Exchange is itself going to query Azure AD, which is much slower, so you can combine that stuff. To add here, another common theme for piping and server-side filtering: think about it, for every query you send to the server without filtering, all the responses have to come back to your shell. This means a massive amount of data needs to be transported. The more you narrow your results from whole objects down to just a few selected attributes, the less you have to transport over the wire to your local PowerShell. That speeds up everything and also reduces the amount of memory you're using. Okay, back to you.
Vasil: Thank you, Ingo. And yeah, there's another example here, just some numbers on the screen: you can see a four-fold improvement when you are fetching the list of users and a ten-fold improvement when fetching the list of groups in the organization. But anyway, moving on.
Another tip we can give you is to add some proper error handling where possible and to re-establish sessions periodically. On the screenshots we had on the previous slide, you saw the disconnect error there, which most of you have run into in one form or another. Basically, if you combine all these tips, you can sort of improve this whole situation. One gentleman from Microsoft by the name of Matthew Byrd has actually done this for us and has released a module which basically incorporates most of those tips. It's called the RobustCloudCommand module; previously it was released as just a script. You can find a link here to the original article, and to the more recent article that he released, I think, a month or two back. Also, a link to the GitHub repository where you can get the module from, or you can even contribute to the module.
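A hedged sketch of combining per-item error handling with periodic session renewal; `Connect-ExoSession` is a hypothetical wrapper around your own `New-PSSession`/`Import-PSSession` connection code, and the batch size is an arbitrary example:

```powershell
$batchSize = 500
$count = 0
foreach ($mbx in $mailboxes) {
    # Tear down and rebuild the session every $batchSize items,
    # before the server drops it for us
    if ($count % $batchSize -eq 0) {
        Get-PSSession | Remove-PSSession
        Connect-ExoSession   # hypothetical connection helper
    }
    try {
        Get-MailboxStatistics -Identity $mbx.Identity
    }
    catch {
        # Back off and retry once on throttling/disconnect errors
        Write-Warning "Error on $($mbx.Identity): $_ - retrying in 60s"
        Start-Sleep -Seconds 60
        Get-MailboxStatistics -Identity $mbx.Identity
    }
    $count++
}
```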
And just to share one of the other tips from our previous webinars: this was basically to use multiple accounts when you have to run a script or collect data for a large number of accounts. In a nutshell, you can do things in parallel, split things into batches, and stitch the data back together. All of this is available on demand, if you're interested. But we are gathered here today to cover the newer stuff, so let's finish up with this overview of what was bad in the Exchange Online cmdlets and switch to what's new and good. Again, just to continue setting the stage for the actual problem, a quick overview of how remote PowerShell works in the case of Exchange Online.
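The batch-splitting idea can be sketched like this (4 batches assumed; each batch would then be processed under a separate admin account or session, and the partial results merged afterwards):

```powershell
$all = Import-Csv -Path .\mailboxes.csv
$batchSize = [math]::Ceiling($all.Count / 4)

# Split the list into 4 roughly equal batches
$batches = for ($i = 0; $i -lt $all.Count; $i += $batchSize) {
    $end = [math]::Min($i + $batchSize - 1, $all.Count - 1)
    ,@($all[$i..$end])
}

# Each batch runs in parallel under its own session/account; the
# per-batch result files are then stitched back together, e.g.:
# $results = Get-ChildItem .\results\*.csv | ForEach-Object { Import-Csv $_ }
```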
Basically, we have the client, which connects to the front-end server, and then the front-end server speaks to the back end, which does the execution. The back end then returns the data, and it travels all the way back to the client executing it, so the process can fail at multiple steps. And because it can fail at so many steps, we've seen various types of errors. You are probably familiar not just with those disconnects but also with the annoying messages telling you that there was an error, that some change occurred on the domain controller, or something like that.
One of the reasons here is that those sessions were stateful; they relied on the specific server you were connected to, so if something happened with that server, everything went poof. They didn't have any resume or retry capabilities, so again, if something happened, you had to start from scratch. There was no support for pagination, and as we've already discussed on the previous slides, the server always returns the full object unless you do some tricks, and there is the throttling issue, and so on. So, in summary, it's not that Microsoft was oblivious to these issues; they were pretty much aware of them, but it took them a while to present a solution which can help improve the situation. Those of you who have attended Ignite or have watched the Ignite recordings are probably aware of it. I've actually borrowed those 3 slides from the presentation given at Ignite.
This is the solution proposed by the team at Microsoft, which helps us address most of those issues. We still have, of course, the client connecting to the front-end server, but the front-end server now uses HTTP REST APIs to talk to the back end, and this improves things tremendously. There's also an improvement in the way the actual business logic is executed, and, in a nutshell, we get a bunch of improvements from this model. First, and probably one of the most important: the sessions are now stateless, and there is no server affinity. If something happens to the server, you are just redirected to the next available server, which in turn gives us retry/resume support. We also have support for pagination, and we have this improved logic which only returns a minimal set of attributes and greatly helps with performance, plus some other stuff which we will cover. We'll now switch back to doing some demos, but before doing so, let's first give you an idea of how you can get to the new cmdlets.
For the full content of this webinar, including the reveal of these new cmdlets, please click here to download the full session.