Updating AzureAD User Attributes

Hi Everyone!

I recently had the opportunity to use PowerShell to update AzureAD user attributes. This is different from what I normally do as we still leverage an on-prem AD setup.

I’d never used the AzureAD cmdlets before, but I know PowerShell and I’m fairly confident with the on-prem AD PowerShell commands.

The mission at hand was this: update AzureAD user attributes so that the Marketing department had new address information.

The first and rather dirty method I put together as a proof-of-concept is below:

$marketingUsers = Get-AzureADUser -Filter "Department eq 'Marketing'"
foreach ($user in $marketingUsers) {
    Set-AzureADUser -ObjectId $user.ObjectId `
        -StreetAddress '51 River St.' `
        -City 'Ridgefield' `
        -State 'CT' `
        -PostalCode '06877' `
        -Country 'United States'
}
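
A quick note before we go further: all of the snippets in this post assume you already have an authenticated AzureAD session. If you don’t, connect first:

#All snippets below assume an existing AzureAD session
#Connect first if you haven't already
Connect-AzureAD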

I know, right? It’s ugly: it’s lacking any form of error checking, there’s no host output, and it’s hard to read.

What I did next was put my code behind a few checks. You can see the improved code below:

#Get all marketing users
$marketingUsers = $null
try {
    $marketingUsers = Get-AzureADUser -Filter "Department eq 'Marketing'" -ErrorAction Stop
} catch {
    #Output the error message if any
    Write-Host "Failed to collect Marketing users!" -ForegroundColor Red
    Write-Host $_.ScriptStackTrace -ForegroundColor Red
}
#Checking if there are no marketing users found
if (!$marketingUsers) {
    Write-Host "No Marketing users found"
    return
}
#Run through each user and update
foreach ($user in $marketingUsers) {
    try {
        Set-AzureADUser -ObjectId $user.ObjectId `
            -StreetAddress '51 River St.' `
            -City 'Ridgefield' `
            -State 'CT' `
            -PostalCode '06877' `
            -Country 'United States' `
            -ErrorAction Stop
    } catch {
        Write-Host "Failed to update $($user.UserPrincipalName)" -ForegroundColor Red
        Write-Host $_.ScriptStackTrace
    }
}

This was looking much better; it handles errors nicely, but there is still room for improvement…

I want to implement splatting and also look into ways to speed the script up!

I wanted to take a look at speed first. I know there are subtle differences between using the -Filter parameter and piping the results into the Where-Object cmdlet. Let’s run some tests!

I ran each of the below commands 5 times to get an average, using Measure-Command and outputting the total milliseconds:

A: Get-AzureADUser -ErrorAction Stop | Where-Object {$_.Department -eq 'Development'}
B: Get-AzureADUser -Filter "Department eq 'Development'" -ErrorAction Stop

Run    A (pipeline, ms)    B (-Filter, ms)
#1     1311.6669           6630.8861
#2     1769.6253           7973.5126
#3     2122.8749           6060.9699
#4     1963.6512           5315.6691
#5     3437.268            5783.7616
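
If you want to reproduce the test yourself, a timing loop along these lines does the job (swap the script block contents to time the -Filter version):

#Run the pipeline version 5 times, collecting each run's duration
$times = 1..5 | ForEach-Object {
    (Measure-Command {
        Get-AzureADUser -ErrorAction Stop |
            Where-Object { $_.Department -eq 'Development' }
    }).TotalMilliseconds
}
#Average the five runs
($times | Measure-Object -Average).Average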

Crazy! Switching from the -Filter parameter to filtering on the pipeline more than halved the time it took for the command to run!

Next was to build the hashtable of Set-AzureADUser parameters for splatting before putting together the final version. This was simply done using the below code:

#51 River St., Ridgefield, CT 06877
#Randomly generated fake address
#New props in a hashtable for splatting
$newProps = @{
    StreetAddress = '51 River St.'
    City          = 'Ridgefield'
    State         = 'CT'
    PostalCode    = '06877'
    Country       = 'United States'
}

This now means I can simplify the Set-AzureADUser command.
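
With the hashtable in place, the call shrinks to a single line. Note the @ sigil in @newProps, which tells PowerShell to splat the hashtable’s keys as parameters rather than pass it as a single value:

#Splat the hashtable onto the cmdlet's parameters
Set-AzureADUser -ObjectId $user.ObjectId @newProps -ErrorAction Stop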

You can find the full and finished script below:

#51 River St., Ridgefield, CT 06877
#New props in a hashtable for splatting
$newProps = @{
    StreetAddress = '51 River St.'
    City          = 'Ridgefield'
    State         = 'CT'
    PostalCode    = '06877'
    Country       = 'United States'
}
#Get all marketing users
#-All $true makes sure every user is returned (the cmdlet only returns the first 100 by default)
$marketingUsers = $null
try {
    $marketingUsers = Get-AzureADUser -All $true -ErrorAction Stop |
        Where-Object {$_.Department -eq 'Marketing'}
} catch {
    #Output the error message if any
    Write-Host "Failed to collect Marketing users!" -ForegroundColor Red
    Write-Host $_.ScriptStackTrace -ForegroundColor Red
}
#Checking if there are no marketing users found
if (!$marketingUsers) {
    Write-Host "No Marketing users found"
    return
}
#Running through each user, splatting the new properties
foreach ($user in $marketingUsers) {
    try {
        Set-AzureADUser -ObjectId $user.ObjectId @newProps -ErrorAction Stop
    } catch {
        Write-Host "Failed to update $($user.UserPrincipalName)" -ForegroundColor Red
        Write-Host $_.ScriptStackTrace
    }
}

Enjoy!

Mass PST Importing using AzCopy

Ooooofff!! Been a while, man! Long time no write!

So, a little bit of back story – sometimes I need to import a tonne of PSTs into people’s O365 accounts for my job. I used to do this in a VERY manual way, either adding their account to my Outlook and running the import, or just giving them the PST with a guide to importing. Something needed to change!

I found a better way: I could use AzCopy to upload the files to an Azure Storage blob and then import them automatically using the built-in O365 admin tools.

First step!

Login to https://compliance.microsoft.com and head into the Information Governance -> Import section. Create a new ‘Import’ and name it as you like – no special characters though!

Select the option to upload your data, and copy the SAS URL. Then download the Azure AzCopy program – this is what we use to upload the PST files.

Second Step!

Open a command prompt or PowerShell in the same location as the AzCopy program and run the below command:

azcopy.exe copy "path to PST files" "SAS URL" --recursive

I used the recursive option as, without it, my operation wasn’t seeing the PSTs.
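
To make that concrete, here’s roughly what the command looks like with the folder I use later in this post – the SAS URL below is just a placeholder for the one you copied from the wizard:

azcopy.exe copy "E:\Company\over_20gb" "https://<account>.blob.core.windows.net/ingestiondata?<sas-token>" --recursive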

Leave this to run; it can take a while depending on your data. There are other parameters for this command, but I didn’t use them.

Third Step!

Create a PST Import mapping file – this is the step that confused me, but hopefully I can shed some light on it! Download a copy of it from here.

I left the TargetRootFolder, ContentCodePage, SPFileContainer, SPManifestContainer and SPSiteURL columns empty. Since my data was uploaded from the path ‘E:\Company\over_20gb’, I needed to set the FilePath to ‘over_20gb’ for all entries.

This is an example of my file:

Workload,FilePath,Name,Mailbox,IsArchive
Exchange,over_20gb,first.last@company.com.pst,first.last@company.com,FALSE
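
If you have a pile of PSTs, you don’t need to write these rows by hand. Here’s a rough PowerShell sketch that builds the file – it assumes, as in my case, that each PST is named after its target mailbox:

#Build a mapping row for every PST in the upload folder
#Assumes each file is named <mailbox>.pst, e.g. first.last@company.com.pst
Get-ChildItem 'E:\Company\over_20gb' -Filter *.pst | ForEach-Object {
    [pscustomobject]@{
        Workload  = 'Exchange'
        FilePath  = 'over_20gb'
        Name      = $_.Name
        Mailbox   = $_.BaseName   #filename without the .pst extension
        IsArchive = 'FALSE'
    }
} | Export-Csv 'PstImportMappingFile.csv' -NoTypeInformation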

Fourth Step!

Go back to the import from step one and, once the upload has finished, tick the boxes for ‘I’m done uploading my files’ AND ‘I have access to the mapping file’. Use ‘Select mapping file’ to upload your file. Following that, you can hit ‘Validate’.

If all is well after validation, you can choose whether or not to filter your data, and then start the import.

The progress of this import is visible on the Importing page. Be prepared to wait as it can take a looooong time!

Enjoy!