AD-LogiPlex Connector Aggregation Issue - LDAP Connection has been closed

Hello All,

We have an AD LogiPlex connector in place to perform AD group-based application user data imports. This task has been working fine; however, it recently started failing with the error "LDAP connection has been closed". We have observed that it fails after processing one particular user (approximately 1,000 accounts in), although the behavior is intermittent. Please share any pointers to help identify the root cause.

2025-04-18T13:23:23,534 ERROR QuartzScheduler_Worker-5 sailpoint.connector.LDAPConnector:8031 - 775446018 Exception caught while in ContainerIterator.hasNext().
sailpoint.connector.ConnectorException: [ ConnectorException ]
[ Error details ] LDAP connection has been closed
at sailpoint.connector.LDAPConnector.getSupportedControls(LDAPConnector.java:5893) ~[connector-bundle-directories.jar:8.3p2]
at sailpoint.connector.LDAPConnector$ContainerIterator.getIterateMode(LDAPConnector.java:7814) ~[connector-bundle-directories.jar:8.3p2]
at sailpoint.connector.LDAPConnector$ContainerIterator.getIterateMode(LDAPConnector.java:7792) ~[connector-bundle-directories.jar:8.3p2]

Thanks,
Pallavi

Hello All,

We analyzed this issue further by enabling the aggregator and AD LDAP connector logs. We see the snippet below in the logs just before the LogiPlex connector error:

2025-04-19T13:54:01,956 TRACE QuartzScheduler_Worker-4 sailpoint.connector.ADLDAPConnector:97 - Entering getNewInstance: Arguments => javax.naming.ldap.InitialLdapContext@58460adc
2025-04-19T13:54:01,956 TRACE QuartzScheduler_Worker-4 sailpoint.connector.ADLDAPConnector:108 - Exiting getNewInstance: Arguments => javax.naming.ldap.InitialLdapContext@58460adc, Returns => com.sun.jndi.ldap.LdapCtx@7abe7870
2025-04-19T13:54:01,971 TRACE QuartzScheduler_Worker-4 sailpoint.connector.ADLDAPConnector:115 - Throwing checkForMorePagesOrVirtualPages - sailpoint.connector.ConnectorException: [ ConnectorException ]
 [ Error details ] LDAP connection has been closed
2025-04-19T13:54:01,971 ERROR QuartzScheduler_Worker-4 sailpoint.connector.LDAPConnector:8019 - 1353197612 Exception caught while in ContainerIterator.hasNext().
sailpoint.connector.ConnectorException: [ ConnectorException ] 
 [ Error details ] LDAP connection has been closed
	at sailpoint.connector.LDAPConnector.getSupportedControls(LDAPConnector.java:5896) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.LDAPConnector$ContainerIterator.getIterateMode(LDAPConnector.java:7817) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.LDAPConnector$ContainerIterator.getIterateMode(LDAPConnector.java:7795) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.LDAPConnector$ContainerIterator.access$000(LDAPConnector.java:7067) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.LDAPConnector.checkForMorePagesOrVirtualPages(LDAPConnector.java:1356) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.ADLDAPConnector.checkForMorePagesOrVirtualPages(ADLDAPConnector.java:7278) ~[connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.LDAPConnector$ContainerIterator.hasNext(LDAPConnector.java:7892) [connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.ADLDAPConnector$ADLDAPIterator.hasNext(ADLDAPConnector.java:11275) [connector-bundle-directories.jar:8.3p3]
	at sailpoint.connector.ConnectorProxy$CustomizingIterator.peek(ConnectorProxy.java:1331) [connector-bundle-identityiq.jar:8.3p3]
	at sailpoint.connector.ConnectorProxy$CustomizingIterator.hasNext(ConnectorProxy.java:1358) [connector-bundle-identityiq.jar:8.3p3]
	at sailpoint.services.standard.connector.LogiPlexConnector$LogiPlexIterator.fillQueue(LogiPlexConnector.java:2414) [logiplex-connector-20230417.0.1.jar:?]
	at sailpoint.services.standard.connector.LogiPlexConnector$LogiPlexIterator.hasNext(LogiPlexConnector.java:2519) [logiplex-connector-20230417.0.1.jar:?]

Please share if you have any further pointers.

@pallavi -

Below are the checkpoints I normally walk through when an IIQ AD/LDAP aggregation suddenly starts throwing “LDAP connection has been closed” after roughly 1,000 objects.

1. Confirm the obvious: IIQ & connector patch level

  • I suspect IIQ 8.3p2 / 8.3p3 had a known issue where the paging cookie was not always reused correctly when the connector was pooled by LogiPlex; AD interpreted that as an abandoned search and closed the socket.
    If you are on 8.3p4 or later (or any 8.4/8.5 build), this fix should already be in place. If not, upgrading connector-bundle-directories.jar and the logiplex-connector to 8.3p4+ should be the first action. Please confirm the fix version with SailPoint via a support ticket.

2. Check whether an AD LDAP policy is firing

| LDAP policy (DC-side) | Default | Symptom when hit | Quick test |
|---|---|---|---|
| MaxQueryDuration | 120 s | DC drops the search when the same query runs > 120 s; IIQ reconnects → “LDAP connection has been closed”. | Measure how long the failing aggregation takes to reach the error. |
| MaxPageSize | 1 000 | Connector requests a page larger than the limit → DC closes the connection after exactly 1 000 objects. | Set Page Size in the IIQ application to ≤ 500 and rerun the aggregation. |
| MaxConnIdleTime | 900 s | Long-running aggregation that does no I/O for > 15 min → DC closes the idle socket. | Look for a 15-min (or policy value) pattern in the failures. |

You can see the live values on any DC with:

ntdsutil "ldap policies" "connections" "connect to server <DC‑Name>" q "show values" q q

If you find MaxQueryDuration or MaxConnIdleTime being hit, either:

* raise the limit on the DC (in `ntdsutil`: `set MaxQueryDuration to 600`, then `commit changes`) **or**
* keep the search shorter: lower *Page Size* to 250, narrow the *Search DN*, and avoid pulling heavy multi-valued attributes such as **memberOf** (a sketch of the application setting is shown below).
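
If you go the page-size route, the knob lives on the AD application definition in IIQ. A minimal, hypothetical sketch, assuming the AD connector's standard `pageSize` attribute key; verify the exact key and application name in your own environment before applying:

```xml
<!-- Hypothetical excerpt of an IIQ AD application XML (debug page / iiq console export). -->
<!-- Assumption: the AD connector reads its paged-search size from the "pageSize" entry;  -->
<!-- confirm the key name on your application before changing it.                         -->
<Application name="Active Directory" connector="sailpoint.connector.ADLDAPConnector">
  <Attributes>
    <Map>
      <!-- Keep this well below the DC's MaxPageSize (default 1000) -->
      <entry key="pageSize" value="500"/>
    </Map>
  </Attributes>
</Application>
```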

---

3. Re-run the search outside IIQ to verify it is the server, not the connector

From the IIQ host run:

```bash
ldapsearch -H ldap://<dc>:389 -D "<bindDN>" -w <pwd> \
    -b "<searchBase>" -E pr=500/noprompt "(objectClass=user)" sAMAccountName

Add the paged-results control (`-E pr=`) and request at least two pages (drop `/noprompt` if you want ldapsearch to pause between pages).
If the second page comes back clean, the network path is fine and the problem is likely the IIQ bug mentioned in point 1.
If the command hangs or returns “search res abandoned”, the DC (or a firewall / load-balancer in the path) is closing the socket.

You can follow the roadmap above to narrow down the exact issue.

Cheers!!


Hello @pallavi ,

Quick question:

How did you solve/prevent the duplicate entitlements and accounts while using classic mode?

Thanks,
Shiva

Hi @sukanta_biswas,

Thank you for the detailed explanation and helpful pointers. We will get all of these evaluated soon and post an update. Thanks for your time.

Thanks,
Pallavi

In classic mode, you don’t have to aggregate the master application; it is just used as part of the chain of applications and connectors. It is recommended, though, to switch to Adapter mode if you want to benefit from features like delta aggregation, which need the last aggregation status to be stored.

Sorry, I’m a bit late to the party. Hopefully the issue got resolved using @sukanta_biswas’s suggestions.

Hi @menno_pieters,

LogiPlex account aggregation started working as expected, so it appears the issue was related to the JNDI/LDAP connection timeout. We are evaluating options to upgrade from 8.3p3 to the latest patch level.

For the past 2 years we have been using the LogiPlex connector in PROD in classic mode, and so far it is working as expected for 200+ sub-applications. We are evaluating the option to enable partitioning for the LogiPlex aggregation so that the average run time of the task is reduced. We did not get the expected results when partitioning was enabled together with the ‘detect deleted accounts’ option in the task.
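
For reference, the combination under evaluation looks roughly like the TaskDefinition fragment below. This is an illustrative sketch only: the task and application names are placeholders, and the argument keys (`enablePartitioning` for “Enable Partitioning” and `checkDeleted` for “Detect deleted accounts”) are assumptions that should be verified against the IIQ version in use.

```xml
<!-- Illustrative sketch of an Account Aggregation TaskDefinition fragment.              -->
<!-- Assumptions: "enablePartitioning" and "checkDeleted" are the standard task argument -->
<!-- keys for the partitioning and deleted-account options; names here are placeholders. -->
<TaskDefinition name="LogiPlex Account Aggregation" type="AccountAggregation">
  <Attributes>
    <Map>
      <entry key="applications" value="LogiPlex Master Application"/>
      <!-- Split the aggregation into partitions to reduce the average run time -->
      <entry key="enablePartitioning" value="true"/>
      <!-- "Detect deleted accounts": the option that did not behave as expected with partitioning -->
      <entry key="checkDeleted" value="true"/>
    </Map>
  </Attributes>
</TaskDefinition>
```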

Thanks,
Pallavi


Yup.

  • Check AD server settings for connection timeouts.
  • Look especially at:
    • MaxConnIdleTime
    • MaxConnActiveTime
  • Engage your Active Directory team to confirm these values.

@pallavi - Please mark my post as the solution if this resolved the issue.

Cheers!!!