I am currently load testing the afterModify connector Rules for my AD connection.
Even when uploading only 50 accounts (in production there could easily be 50,000 changes at a time), PowerShell is unable to keep up with the flow of information from the connector.
Has anyone else experienced this problem, and if so, how did you resolve it?
Thanks
We have seen this as well. The behavior that we have seen is related to three areas:
- Disk performance - A temporary copy of your afterModify rule gets written to disk for each data instance that is being processed.
- CPU - Each rule instance spawns its own workspace, causing more running processes to compete for processor time.
- Memory - Each rule instance needs its own working area consuming memory.
While your processing is running, you can use Task Manager/Performance Monitor to see whether one of these areas is specifically impacted. If so, you may be able to add resources to the IQ Service host to accommodate the overhead.
Otherwise, we have found that we need to minimize the processing done in these AD rules. One option is to use Before Operation rules to preprocess the data so that it becomes a standard update for the connector. Another option we have used is to set an extension attribute that indicates the processing that needs to be done, and then use a stand-alone script to make the changes in AD. We have also done something similar by writing a script to update the users in the case of a new attribute sync where the data was massively out of sync.
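To illustrate the extension-attribute approach, here is a minimal sketch of the selection step. The attribute name (extensionAttribute10), the marker value, and the in-memory user list are all stand-ins for this example; against real AD the stand-alone script would query with Get-ADUser -Filter instead.

```powershell
# Sketch of the marker-attribute pattern: flag users with an extension
# attribute, then have a stand-alone script pick up the flagged ones.
# Assumptions: extensionAttribute10 and the 'NEEDS-SYNC' marker are
# stand-ins; in production you would query AD (e.g. Get-ADUser -Filter)
# rather than filter an in-memory list.
function Get-PendingUsers {
    param($Users, $Marker)
    # Select only the users flagged for out-of-band processing
    $Users | Where-Object { $_.extensionAttribute10 -eq $Marker }
}

# Example with stand-in user objects
$users = @(
    [pscustomobject]@{ sAMAccountName = 'agutschow'; extensionAttribute10 = 'NEEDS-SYNC' }
    [pscustomobject]@{ sAMAccountName = 'jdoe';      extensionAttribute10 = $null }
)
$pending = Get-PendingUsers -Users $users -Marker 'NEEDS-SYNC'
$pending.sAMAccountName   # -> agutschow
```

The daily scheduled run would process the returned users and clear the marker attribute on success.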
CPU usage does max out repeatedly, so that is an area that we can boost.
The data is as clean as I can make it; it's the number of instances being invoked that is the problem.
I like the idea of setting an extension attribute that indicates a script needs to be run, and might look at doing that in conjunction with what I am currently doing, and then schedule a daily run.
I am also hoping to see if I can get the afterModify rule to create a ‘queue file’ then get the PS script to use that as a source for users to update.
I have had an afterCreate and afterModify that create a “queue” file for later more complex processing.
In the afterCreate/afterModify, I wrote a small stub file for each user with the required data to start the processing. Then I had a scheduled task that would process the files. Two things to keep in mind here:
- From the queue files, you will have to manage your own retries/error handling in your PowerShell. My queue files had a counter that I would process up to 5 times. If I got to the sixth time, I filed the queue file in a “Failed” directory and emailed the admin.
- Also think about how you will handle this in the event of a failure of your IQ Service host. For me, I wrote them to a shared drive where my second IQ Service host could see them and run the scheduled task.
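The retry/failure handling in the two points above could be sketched roughly like this. The function name, paths, and wiring of the five-try cap are illustrative, and the actual AD update is passed in as a script block; alerting via Send-MailMessage is site-specific and left as a comment.

```powershell
# Sketch of the queue-file retry handling described above.
# The function name, parameters, and 5-retry cap are illustrative.
function Invoke-QueueFile {
    param($File, $FailedPath, [scriptblock]$Action, [int]$MaxRetries = 5)
    $user = Import-Csv -Path $File.FullName -Delimiter :
    try {
        & $Action $user                      # the real AD update goes here
        Remove-Item $File.FullName           # success: drop the queue file
        return $true
    }
    catch {
        $retries = [int]$user.RetryCount + 1
        $user.RetryCount = $retries
        if ($retries -ge $MaxRetries) {
            # Too many attempts: park the file in Failed and alert the admin
            Move-Item $File.FullName -Destination $FailedPath
            # Send-MailMessage ...  (site-specific alerting)
        }
        else {
            # Rewrite the stub with the bumped counter for the next pass
            $user | Export-Csv -NoTypeInformation -Delimiter : -Path $File.FullName
        }
        return $false
    }
}
```

The scheduled task would loop over the queue directory and call this once per stub file.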
Hi @phil_awlings ,
We have also seen this issue. We upgraded the CPU and increased the RAM of the IQ Service machine. This decreased the number of errors, but we still experience the issue when there is a large volume of provisioning to AD.
Thanks
Morning,
Do you mind sharing the code that deals with the stub file creation?
I’m not sure if this is the right way to go, but I would like to explore it in detail before I try something else.
Thanks
So in my AfterCreate PowerShell, I had the following function:
Function Write-CreateAppFile ($alias,$UPN,$sAMAccountName,$emailLicense)
{
    # $AppFilePath (the queue folder) is defined earlier in the rule
    $fileName="CREATE-" + $sAMAccountName
    $Retry="0"
    $userFile=[pscustomobject]@{Alias=$alias; UPN=$UPN; sAMAccountName=$sAMAccountName; EmailLicense=$emailLicense; RetryCount=$Retry}
    $userFile | Export-Csv -NoTypeInformation -Delimiter : -Path "$AppFilePath\$fileName"
}
Then I pulled data from the accountRequest object and inserted it into the file:
$sAMAccountName=Get-AttributeValueFromAccountRequest $requestObject "SAMAccountName"
$firstName=Get-AttributeValueFromAccountRequest $requestObject "extensionAttribute1"
$lastName=Get-AttributeValueFromAccountRequest $requestObject "extensionAttribute3"
$upn=Get-AttributeValueFromAccountRequest $requestObject "userPrincipalName"
$alias=$upn.Split("@")[0]
# $emailLicense is pulled the same way from its source attribute (not shown here)
Write-CreateAppFile $alias $upn $sAMAccountName $emailLicense
This would create a small file named CREATE-agutschow, containing colon-delimited data, in a specific folder, like:
Alias:UPN:sAMAccountName:EmailLicense:RetryCount
agutschow:[email protected]:agutschow:E3:0
Then my standalone program could get the list of files and process them like:
$fileList=@(Get-ChildItem -Path $newFilePath\* -File -Include CREATE*)
# For each queue file:
for ($i=0; $i -lt $fileList.Count; $i++)
{
    $fileName=$fileList[$i].Name
    $file = "$newFilePath\$fileName"
    $user = Import-Csv $file -Delimiter :
    $alias=$user.Alias
    $UPN=$user.UPN
    $sAMAccountName=$user.sAMAccountName
    $emailLicense=$user.EmailLicense
    # ... continue custom logic here ...
}
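As a self-contained check that the colon-delimited stub format round-trips cleanly through Export-Csv/Import-Csv (the temp path and sample values here are stand-ins, not production data):

```powershell
# Round-trip check for the colon-delimited stub format used above.
# The temp path and sample values are stand-ins.
$path = Join-Path ([IO.Path]::GetTempPath()) 'CREATE-agutschow'
[pscustomobject]@{
    Alias          = 'agutschow'
    UPN            = 'agutschow@example.com'
    sAMAccountName = 'agutschow'
    EmailLicense   = 'E3'
    RetryCount     = '0'
} | Export-Csv -NoTypeInformation -Delimiter : -Path $path

$user = Import-Csv -Path $path -Delimiter :
$user.UPN   # -> agutschow@example.com
Remove-Item $path
```

Note that Export-Csv may quote field values on disk; Import-Csv strips the quotes on the way back in, so the reconstructed object matches what was written.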
Hope this helps,
Alicia