Get ActiveSync device statistics faster

Get-MobileDeviceStatistics is an “expensive” cmdlet, in that it requires Exchange to connect to the server where the mailbox is located and retrieve information that is stored in the mailbox. And for whatever reason, it is more expensive than other cmdlets that do the same kind of thing, like Get-MailboxFolderStatistics. If you have Exchange servers in multiple sites and you run Get-MobileDeviceStatistics from a server in a different site than the target mailbox, it can take minutes to get data back.

A customer has about 25,000 mailboxes on servers in the US, Europe, and APAC. They wanted to find out which users’ mobile devices had synced in the last 30 days, along with details about those devices. Running something like Get-Mailbox -resultsize unlimited | %{Get-MobileDeviceStatistics -Mailbox $_.Identity} is not efficient in such an environment: you end up making expensive calls to remote servers for users who don’t even have a mobile device. The customer’s script would eventually just time out because of network issues with disconnected sessions.

I wrote a script that completes in far less time because it connects directly to a user’s mailbox server to execute the cmdlet. To process tens of thousands of users efficiently, the script checks whether a user has any ActiveSync partnerships (so it doesn’t waste time querying a mailbox that will return no data), keeps track of which server each mailbox database is mounted on, establishes a connection to a server only as needed, and imports only the one cmdlet that will be executed (which completes much faster than importing every cmdlet the admin’s RBAC roles provide).

Getting mobile device statistics for this many users still takes several hours to complete, but it does complete. The script uses progress bars to indicate how far along it is in processing all mailboxes, as well as when it is looking up the active server for a database and when it is connecting to a server (though the latter two only take a few seconds each).
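Here is a minimal sketch of that approach (not the script itself), assuming an on-premises Exchange Management Shell or an already-imported admin session for the directory lookups, Kerberos to each server’s /PowerShell virtual directory, and illustrative variable names:

    # Sketch only: directory cmdlets (Get-CASMailbox, Get-Mailbox, Get-MailboxDatabase) are
    # assumed to come from an existing admin session; Get-MobileDeviceStatistics is imported
    # from the mailbox server that hosts each user's database.
    $serverForDatabase = @{}    # database name -> server hosting the active (mounted) copy
    $currentServer = $null
    $session = $null

    # Only query mailboxes that actually have an ActiveSync partnership
    $casMailboxes = Get-CASMailbox -ResultSize Unlimited -Filter {HasActiveSyncDevicePartnership -eq $true}

    foreach ($cas in $casMailboxes) {
        $mailbox  = Get-Mailbox $cas.Identity
        $database = $mailbox.Database.Name

        if (-not $serverForDatabase.ContainsKey($database)) {
            # -Status populates MountedOnServer with the server hosting the active copy
            $serverForDatabase[$database] = (Get-MailboxDatabase $database -Status).MountedOnServer
        }

        $server = $serverForDatabase[$database]
        if ($server -ne $currentServer) {
            if ($session) { Remove-PSSession $session }
            $session = New-PSSession -ConfigurationName Microsoft.Exchange `
                -ConnectionUri "http://$server/PowerShell/" -Authentication Kerberos
            # Import only the single cmdlet needed; far faster than importing every RBAC cmdlet
            Import-PSSession $session -CommandName Get-MobileDeviceStatistics -AllowClobber | Out-Null
            $currentServer = $server
        }

        Get-MobileDeviceStatistics -Mailbox $mailbox.Identity |
            Where-Object { $_.LastSuccessSync -ge (Get-Date).AddDays(-30) }
    }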

For those who wonder why I didn’t use Invoke-Command and avoid the time spent importing the remote session: I originally wrote this in the customer’s VDI on a Windows 7 workstation with PowerShell 2, which wouldn’t let me execute the command directly in the remote session because the restricted language mode of the Exchange WinRM configuration blocked sending local variables to the remote session. This only happened with PowerShell 2; on a Windows 10 client with PowerShell 5.1, it works fine. I measured the run time of the script several times with a sample of users and compared it to doing the same with Invoke-Command, and the difference was negligible. However, if you prefer to do it that way, here is a version that uses it:

How to register an application in Azure AD (for use with my persistence actions reporting script)

These instructions are for registering an application in Azure AD so that my persistence actions reporting script can use the Graph API to access Azure AD audit logs and the list of applications granted access via user-based consent.

  1. Go to the Azure AD admin center.
  2. Click App registrations (Preview).
  3. Click +New registration.
  4. Enter a name (which can be anything), such as Lateral Movement Script.
  5. At the bottom, click Register.
  6. In the application’s Overview pane, copy the application’s ID by hovering just to the right of the GUID and clicking the copy icon that appears.
  7. Paste the GUID into the script as the value for the $appId variable.
  8. Back in the application’s navigation pane, under Manage, click API permissions.
  9. Click +Add a permission.
  10. Under Commonly used Microsoft APIs, click Microsoft Graph.
  11. For the type of permission to add, click Application permissions.
  12. Expand AuditLog and check AuditLog.Read.All. (This allows the script to get the sign-in time of an application and members added to an Office 365 Group.)
  13. Expand Directory and check Directory.Read.All. (This allows the script to get the list of applications assigned to a user and when the consent was granted.)
  14. At the bottom, click Add permissions.
  15. Because the application (script) won’t be logging in as a user, in the list of permissions you can optionally choose to click User.Read, then Remove permission, then Yes, remove.
  16. The two permissions that you added require admin consent, so under Grant consent, click Grant admin consent for [Company], then Yes.
  17. If you also did step 15, your summary of permissions will look like this:
  18. In the navigation pane, under Manage, click Certificates & secrets.
  19. Under Client secrets, click +New client secret.
  20. Add a description (which can be anything) about who or what will be given this secret, such as John Doe.  (For example, you could choose to create a client secret for each admin who will run the script. This gives you the flexibility to delete the secret for one admin without having to create a new one for the others, as you would with a shared secret.)
  21. Under Expires, you can leave the expiration as the default 1 year, or you can set it to one of the other choices.  (Regardless of the expiration, you can always delete the secret at any time.) Then click Add.
  22. Copy the secret by clicking the copy icon next to the value.  (It is important that you do this now, because once you leave the Certificates & secrets blade, you will never be able to see it again.)
  23. Paste the secret into the script as the value for the $clientSecret variable.  (Because the secret is effectively a password that is used with the “username” that is the app ID, you may choose to store the secret securely in Windows via the DPAPI, and modify the script to get the encrypted secret, decrypt it, and assign it to the $clientSecret variable; a sketch of that option follows this list.)
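If you go the DPAPI route mentioned in step 23, a minimal sketch looks like this (the file path and variable names are illustrative, not part of the script):

    # One-time step, run as the admin who will use the script: encrypt the secret and save it.
    # ConvertFrom-SecureString without a key uses DPAPI (current user, current machine).
    Read-Host -Prompt 'Client secret' -AsSecureString |
        ConvertFrom-SecureString | Out-File "$env:USERPROFILE\persistenceAppSecret.txt"

    # In the script: read the encrypted value and recover the plain text for $clientSecret
    $secureSecret = Get-Content "$env:USERPROFILE\persistenceAppSecret.txt" | ConvertTo-SecureString
    $clientSecret = [System.Runtime.InteropServices.Marshal]::PtrToStringAuto(
        [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($secureSecret))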

Report Office 365 user activities that persist access even after account compromise detection

An activity that provides an attacker access to resources after detection of an account compromise is called persistence.  One example of this is automatic forwarding of email.  Using leaked credentials, the attacker logs into a mailbox and configures mail to forward (usually to an external address).  Even after the account’s password is reset, the forwarding setting remains in effect.

In Office 365, there are numerous methods of achieving persistence.  Many take advantage of collaboration features and features that exist to increase user productivity.  It is important to be aware of these mechanisms and check for them after you become aware of a compromised account.  I wrote a script that not only reports on persistence configurations, but also whether the configuration occurred during the time that you believe the account was compromised.  It reports on these activities:

  • Inbox rules that forward, to whom they forward, and if the rules were created/modified within the search window
  • Forwarding configured via Outlook on the web Options (aka OWA Forwarding), to whom it forwards, and if it was created/modified within the search window
  • Mailbox folder permission changes during the search window
  • Whether anonymous calendar publishing is configured, and if it was enabled within the search window
  • All applications that have access via user-based consent (known as OAuth grants), when the grant first occurred, and the last time the application signed in
  • Files located in SharePoint Online (and OneDrive for Business) that were shared within the search window, whether they were anonymous links or secure links, and if the latter, with whom they were shared
  • Members (internal and guest) added to a Microsoft Team or Office 365 Group within the search window, which Team/Group, and the UPN of the member
  • Flows that were edited within the search window, the admin URL of the Flow, and the connectors used
  • (Optional) Mobile device partnerships with the user’s mailbox, and the first and last sync time of each (This isn’t technically persistence, but can be included to show partnerships, usually to see if a new one was created)

Most of the information for these activities is available in the unified audit log. (Because the log is stored in a mailbox in Exchange Online, you can confirm that unified logging is enabled by running (Get-AdminAuditLogConfig).UnifiedAuditLogIngestionEnabled and checking that the output is True.)  Some of the information, however, is available only via the Graph API:

  • Applications to which the user has granted consent (this is available via PowerShell, but not per user, so it is not practical in a production tenant)
  • The date of the grant for an application’s consent
  • Application sign-ins
  • Members added to an Office 365 Group (though members added to a Team are available in the unified audit log…odd, yes)

Leveraging the Graph API requires that you register an application in Azure AD.  This is because you have to specify what permissions (access) you want the application to have, as well as how the application will authenticate. (There are multiple ways to authenticate, but I am using a client secret.)  I have created a separate post with those instructions.  Some admins may not want, or may not have the ability, to register an application, so it is not required, but the output will obviously not include the application consent information and Office 365 Group member additions.  For those who do, you will need to edit the script, set $useGraphApi to $true, and provide the values for the $appId, $clientSecret, and $tenantDomain variables.
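For context, here is a minimal sketch of the client credentials flow those variables feed (this is the standard Azure AD v2.0 token request, not code lifted from the script):

    # Request an app-only token using the registered app's ID and client secret
    $body = @{
        client_id     = $appId
        client_secret = $clientSecret
        scope         = 'https://graph.microsoft.com/.default'
        grant_type    = 'client_credentials'
    }
    $token = (Invoke-RestMethod -Method Post -Body $body `
        -Uri "https://login.microsoftonline.com/$tenantDomain/oauth2/v2.0/token").access_token

    # The token is then presented in the Authorization header of each Graph request, e.g.:
    Invoke-RestMethod -Headers @{ Authorization = "Bearer $token" } `
        -Uri 'https://graph.microsoft.com/v1.0/auditLogs/signIns?$top=1'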

The script requires you to be connected to Exchange Online and to have the Azure AD module installed and connected.  It will search, by default, for activities in the last seven days, but you can specify a start and/or end date.  Bear in mind that the unified audit log retains entries for 90 days for non-E5 users (and one year for E5-licensed users or those assigned an Advanced Compliance add-on license).  Additionally, Azure AD audit and sign-in log entries are retained for 30 days for P1/P2-licensed users (seven days for Free and Basic users) (source).
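Most of the report is ultimately built from unified audit log searches scoped to that window.  A rough illustration of such a query (not the script’s exact call; the user and operations are placeholders):

    # Example unified audit log search for one user over the last seven days
    $start = (Get-Date).AddDays(-7)
    $end   = Get-Date
    Search-UnifiedAuditLog -StartDate $start -EndDate $end -UserIds jdoe@contoso.com `
        -Operations New-InboxRule, Set-InboxRule, Set-Mailbox -ResultSize 5000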

Handling the output is something I spent a fair amount of time working through.  This is because of how PowerShell displays multiple objects that contain different properties.  (It assumes the properties of the first object in the output apply to all successive objects, so successive objects will only show properties that are also in the first object.)  I experimented with key/value pairs in generic properties, but it is ugly and requires much more work from you if you want to parse the values for further processing.

My compromise solution is that, by default, the script displays the output to the screen as text rather than native objects.  If you use the PassThru switch, the objects are not output to the screen but are sent to the pipeline as native objects so you can do whatever you want with them.  Generally, this means you would assign the output to a variable and then work with each object in the collection, but you can even just output the variable to the screen and the objects will be displayed correctly.  (It is only when they are output directly from the script that objects with different properties are displayed incorrectly.)
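For example (PassThru is from the script; the mailbox and date parameter names are assumptions and may differ):

    # Collect the objects instead of the on-screen text, then work with them as you like
    $results = .\Get-O365PersistenceActions.ps1 -Identity jdoe@contoso.com `
        -StartDate (Get-Date).AddDays(-14) -PassThru

    # Each object keeps its own properties when the collection is displayed later
    $results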

You can download the script below or copy it.

  Get-O365PersistenceActions.ps1 (22.0 KiB)

Meeting cancellation script updated for large result sets

Articles in the "Meeting cancellation script" series

  1. Cancel future meetings in a mailbox
  2. Meeting cancellation script updated
  3. Meeting cancellation script updated for large result sets [This article]

The default for the script is to get all appointment items for the next year.  This includes all occurrences of recurring appointments.  For a really busy person, or if you specify an end date beyond a year, that could mean more than the maximum of 1000 items.  To accommodate more than the search result maximum, you would normally use paging.  A calendar view, however, does not support paging (even though the search result has a property indicating there are more items than were returned).  To work around this, I added pseudo-paging that performs successive loops with a start date set to the start of the last appointment in the previous search result.  This can cause duplicate items in the collection of appointments, though, so I added a step to filter them out.  These changes are transparent, so you don’t need to do anything to accommodate mailboxes that have more than 1000 appointments.
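Here is a minimal sketch of that pseudo-paging (EWS Managed API; $service, $start, and $end are assumed to exist already, and the variable names are illustrative rather than taken from the script):

    $pageSize = 1000
    $windowStart = $start
    $appointments = @{}    # keyed by item ID to filter the duplicates the overlap creates

    do {
        $view = New-Object Microsoft.Exchange.WebServices.Data.CalendarView($windowStart, $end, $pageSize)
        $result = $service.FindAppointments(
            [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Calendar, $view)

        foreach ($item in $result.Items) {
            if (-not $appointments.ContainsKey($item.Id.UniqueId)) {
                $appointments[$item.Id.UniqueId] = $item
            }
        }

        if ($result.Items.Count -gt 0) {
            # The next loop restarts the view at the start time of the last appointment returned
            $windowStart = ($result.Items | Sort-Object Start | Select-Object -Last 1).Start
        }
    } while ($result.MoreAvailable)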

Another change is the addition of the AttendeeMeetingsOnly parameter.  This switch causes the script to only process meetings where the mailbox is an attendee.  Parameter sets have been defined so you cannot use this parameter at the same time as IncludeAttendeeMeetings.  This now means the options are to process meetings where the mailbox is the organizer (the default), to additionally process meetings where the mailbox is an attendee (use IncludeAttendeeMeetings), or to process only the latter (use AttendeeMeetingsOnly).
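For reference, mutual exclusion like this can be declared with parameter sets along these lines (a sketch; the set names and the Mailbox parameter are illustrative, not necessarily the script’s):

    [CmdletBinding(DefaultParameterSetName = 'OrganizerOnly')]
    param(
        [Parameter(Mandatory = $true)]
        [string]$Mailbox,

        # Process organizer meetings plus meetings where the mailbox is an attendee
        [Parameter(ParameterSetName = 'IncludeAttendee')]
        [switch]$IncludeAttendeeMeetings,

        # Process only meetings where the mailbox is an attendee
        [Parameter(ParameterSetName = 'AttendeeOnly')]
        [switch]$AttendeeMeetingsOnly
    )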

I changed the way the script identifies recurring meetings because of an issue that sometimes causes the IsRecurring property to incorrectly have a value of True on a single-instance meeting.  And, lastly, already-canceled meetings are now excluded.  (A meeting in an attendee mailbox that has received a cancellation the user has not yet accepted is already in a canceled state in the mailbox and cannot be declined.)  Thank you to Zafer for recently identifying bugs so they could be fixed.

Here is the updated script:

  Cancel-MailboxMeetings.ps1 (12.7 KiB)

PowerShell module for searching and restoring items in Recoverable Items

A recent addition to the Exchange administrator’s arsenal in Exchange Online is the ability to search and restore items located in the Recoverable Items folder of a mailbox, via Get-RecoverableItems and Restore-RecoverableItems.  (I can’t find them documented yet, so I can’t link to cmdlet references.)  The cmdlets are available only in Exchange Online and have other constraints, so I wrote a module that adds support for Exchange on-premises and additional features:

                                         Module   Native cmdlets
Supports Exchange Online mailboxes       Yes      Yes
Supports Exchange on-premises mailboxes  Yes      No
Supports archive mailboxes               Yes      No
Search in Deleted Items folder           No       Yes
Search in Deletions folder               Yes      Yes
Search in Purges folder                  Yes      No
Search filter can use retention tag      Yes      No
Restore to original folder               Yes      Yes
Restore all item types                   Yes      No

To avoid naming conflicts, my cmdlets are Get-MailboxRecoverableItems and Restore-MailboxRecoverableItems.  The EXO cmdlets use native access to the store and require you to have the Mailbox Import Export role assigned in order to use them.  My module uses EWS and requires either full access or the impersonation right.  (Include the UseImpersonation parameter when you want to use the latter.)  If using on-premises, you can omit the Credential parameter to use your current credentials.  If using Exchange Online (or to use explicit credentials on-premises), you need to use the Credential parameter to provide a credential object.
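For example (hedged: -Identity is an assumed name for the mailbox parameter):

    # On-premises with your current credentials and full access:
    Get-MailboxRecoverableItems -Identity jdoe@contoso.com

    # Exchange Online (or explicit credentials on-premises), using the impersonation right:
    Get-MailboxRecoverableItems -Identity jdoe@contoso.com -Credential (Get-Credential) -UseImpersonation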

My cmdlets replicate the behavior of the native cmdlets as much as possible, such as the property names and format of the output:

  • SourceFolder is where the item is currently located.
  • If the item is stamped with the MAPI property that contains information about the folder it was in when it was deleted, the “short” folder ID is displayed in LastParentFolderId.  (Otherwise, it will have no value.)  This is not the complete entry ID of the folder, but only what is stored in the property, which is why I call it short.  (Restoring the item to that folder, if it still exists, requires building the complete ID, which the cmdlet takes care of.)
  • LastParentPath is where the item will be put if you restore it.  (This is why I mentioned I am replicating the behavior of the native cmdlets, because I wouldn’t have named the property that.)  This path will be the original folder if the LastParentFolderId is populated and that folder still exists, as indicated by the value of OriginalFolderExists.  Otherwise, it will be the path to the default folder of the item’s class.

Because of a bug in EWS when translating folder entry IDs in the archive mailbox, items restored from the archive mailbox will always be put in the primary mailbox’s default folder for the item’s class.  OriginalFolderExists will never have a value for items in the archive mailbox because the translation bug makes it impossible to determine if the original folder still exists.

You can’t pipe the output of Get-MailboxRecoverableItems to Restore-MailboxRecoverableItems (nor can you do so with the native cmdlets).  The Restore cmdlet takes the same arguments as the Get cmdlet.  You can modify the search filter for the restore based on the output of Get-MailboxRecoverableItems, including specifying an item entry ID or folder entry ID, but the Get cmdlet is essentially the same as running the Restore cmdlet with the WhatIf parameter.

The default is to search just the Deletions folder of Recoverable Items (named, confusingly, RecoverableItems), but you can optionally specify any or all of the Purges folder, the archive mailbox’s Deletions folder, and the archive mailbox’s Purges folder.  (The cmdlet will silently skip the request to search an archive mailbox if there isn’t one, allowing you to pipe multiple mailboxes to the cmdlet without regard for whether any or all have archives.)  There are no restrictions on which parameters you use for a search filter (aka, there are no parameter sets), but using some combinations won’t be of value.  For example, providing an entry ID negates any value of also specifying a subject.

You can then run Restore-MailboxRecoverableItems with the parameters you want in order to restore any or all of the matching items.  The output includes these properties:

  • RestoreToFolderId is the short folder ID where the item has been placed.
  • WasRestoredToOriginalFolder will be True if the original folder was known and still exists, or False if the original folder was known but no longer exists or if the original folder was not known.
  • WasRestoredSuccessfully will be True if the item was actually moved to the folder listed in ReturnedToFilePath, or False if an error occurred.
  • ReturnedToFilePath is the folder path where the item can now be found.
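A hedged usage sketch of that workflow (again assuming -Identity as the mailbox parameter name):

    $params = @{
        Identity         = 'jdoe@contoso.com'
        Credential       = (Get-Credential)
        UseImpersonation = $true
    }
    Get-MailboxRecoverableItems @params        # effectively Restore-MailboxRecoverableItems -WhatIf
    Restore-MailboxRecoverableItems @params    # restores the matching items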

You can set whether to use autodiscover and, if not, the EWS URL at the top of the module.  (You can also enable SCP lookup, if using autodiscover, by setting the property on line 41.)  There are comments throughout if you want to see what is being done and why.  Download the module below:

  RecoverableItems.psm1 (26.3 KiB)

Exchange Online MFA module updated to use refresh token

One frustration of the MFA module for connecting to Exchange Online is its inability to use the refresh token it gets from Azure AD.  As a result, you can use the session for 60 minutes before you are prompted again for credentials.  This makes it very difficult to run any scripts or long-running commands without it stopping mid-run to get your username and password, just to have it happen again 60 minutes later.

This limitation has been fixed starting with version 16.00.2015.000.  If you load the module from the desktop shortcut, the updated version is installed automatically.  (If you side-load the module, you’ll want to run the shortcut so it updates, and be sure your code is loading the highest version.)

The other requirement is that you must use the UserPrincipalName parameter when running Connect-ExoPsSession.  It is not a required parameter (like it is for Connect-AzureAD), so you might be used to simply running the cmdlet and entering your UPN in the authentication form.  The reason for the UPN requirement is that, if you provide your UPN only in the authentication form, the cmdlet has no reference for which user’s refresh token to present when the access token expires.  It only knows which user authenticated in the first place if you provide the cmdlet with your username and let it pass that to the authentication form.
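For example, with a placeholder UPN:

    # Supplying the UPN up front lets the module match the refresh token to your account
    Connect-ExoPsSession -UserPrincipalName admin@contoso.com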

The other benefit you get with this fix is that, unlike PowerShell remoting with Basic authentication, the module is able to silently reconnect after the session has been broken.  I successfully tested this by connecting one afternoon, changing networks and putting my laptop in sleep mode overnight, then running a cmdlet in the existing shell the next day.  I briefly saw the modern authentication prompt, which went away on its own; the module then created a new implicit remoting connection and executed the cmdlet, all without my having to type a username or password.

Even if you do not have MFA requirements, you may want to consider using the module to connect to Exchange Online for this added benefit.