Using Custom SaaS Connectors on IdentityIQ

Ever since the release of the SaaS Connectivity Framework, I’ve been using it to solve countless use cases that previously had little to no solution—at least not without a dedicated external system. My experience has been great so far, and the framework keeps delivering.

From early on, I thought it would be amazing to use these connectors within IdentityIQ. I knew it was technically possible, but it would require figuring out a middleware layer to interface with it. It wasn’t a pressing matter—until it became one.

Recently, I ran a POC for an IIQ/NERM integration, for which no out-of-the-box solution exists yet. I couldn’t afford to wait, and I didn’t want to reinvent the wheel, especially since I already had a fully functional NERM SaaS connector of my own that did everything I needed.

Let me walk you through how I used that connector in IdentityIQ, in the most standard and reusable way I could come up with.

Interacting with the SaaS Connectivity Framework

The SaaS Connectivity Framework is, at its core, a REST endpoint that handles registered operations in a specific way. It’s built on Node.js, so you can run it virtually anywhere—or deploy it directly on Identity Security Cloud, as originally intended. This endpoint expects POST requests with a JSON payload like the following:

{
  "type": "std:account:create",
  "input": {
    "attributes":{"email":"[email protected]","first_name":"John","last_name":"Doe","name":"john.doe","status":"Inactive","workflows":"b33ca9cc-c9a0-4eb7-91c8-3d6320419a81"}
  },
  "config": {…}
}

Without getting into details like optional accepted keys for this payload, let’s focus on what this means:

  • The standard operation type std:account:create is being invoked. Both standard and custom operations can be called this way.
  • Input parameters for the operation are provided.
  • A configuration for the entire runtime is included. This configuration is typically defined in the ISC source associated with the connector.
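To make the shape concrete, here is a minimal Python sketch that assembles such a payload. The attribute values and config keys are purely illustrative, not taken from any real source:

```python
import json

def build_invoke_payload(operation, attributes, config):
    """Assemble the JSON body expected by a SaaS Connectivity endpoint."""
    return {
        "type": operation,                    # e.g. "std:account:create"
        "input": {"attributes": attributes},  # operation-specific input
        "config": config,                     # runtime config, normally held by the ISC source
    }

payload = build_invoke_payload(
    "std:account:create",
    {"first_name": "John", "last_name": "Doe", "name": "john.doe"},
    {"baseUrl": "https://nerm.example.com"},  # hypothetical config key
)
print(json.dumps(payload, indent=2))
```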

In broad strokes, we now understand how to invoke an operation with all the necessary context—something an ISC source would handle automatically.

At this point, you might be thinking: “Great, but how do I actually use this?” Well, if you plan to deploy the connector yourself, you’ll need the source code. The usual procedure is as follows:

git clone <repo url>
cd <repo_dir>
npm install
npm run debug

After running these commands, your connector should be up and running on the local port 3000. Similarly, you can invoke remotely deployed connectors by calling:
<isc_tenant_api_url>/beta/platform-connectors/<connector_id>/invoke

Keep in mind, though, that in this case you’ll need to provide a valid OAuth token for the tenant.
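Putting that together, the call itself is just an HTTP POST. The Python sketch below builds the request without actually sending it; the local URL and the std:test-connection operation are used only as examples:

```python
import json
import urllib.request

def build_invoke_request(url, payload, token=None):
    """Build (but do not send) a POST request for a connector endpoint.

    For a local runtime, url is typically http://localhost:3000; for an
    ISC-hosted connector it is the /beta/platform-connectors/.../invoke
    endpoint, in which case an OAuth bearer token is required.
    """
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = "Bearer " + token
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

req = build_invoke_request(
    "http://localhost:3000",
    {"type": "std:test-connection", "input": {}, "config": {}},
)
# urllib.request.urlopen(req) would send it; omitted here.
```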

Now that we’ve clarified how to call a connector, let’s take a look at how it responds. Understanding the response is essential for knowing how to process the results. Here’s an example response payload:

{
  "data": <send response1>,
  "type": "output"
}
{
  "data": <send response2>,
  "type": "output"
}
{
  "data": { "commandMs" 2 },
  "type": "end"
}

This is an expanded sample of an ISC connector response. Note that it is JSONL-style output: each line is a standalone JSON object. The special end blocks are only produced by ISC-hosted runtimes. At the time of writing, ISC-hosted runtimes return responses with a text/plain content type, whereas local runtimes return application/x-ndjson. Either way, the key takeaway is that this isn’t standard JSON, but a stream of single JSON objects separated by newlines.
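A small parser makes this concrete. The Python sketch below, assuming one JSON object per line, collects the data from each output object and notes whether an end marker was seen:

```python
import json

def parse_connector_response(raw_text):
    """Parse a JSONL connector response into (outputs, ended)."""
    outputs, ended = [], False
    for line in raw_text.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines between objects
        obj = json.loads(line)
        if obj.get("type") == "output":
            outputs.append(obj.get("data"))
        elif obj.get("type") == "end":
            ended = True  # only ISC-hosted runtimes emit this
    return outputs, ended

sample = (
    '{"data": {"name": "john.doe"}, "type": "output"}\n'
    '{"data": {"commandMs": 2}, "type": "end"}\n'
)
outputs, ended = parse_connector_response(sample)
# outputs -> [{"name": "john.doe"}], ended -> True
```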

Interacting with the Connector from IdentityIQ

If you’ve been following along, you probably know where this is heading. It’s fairly clear that SaaS Connectivity operations are just standard connector operations—similar to those you would define in a Web Services connector. And since these are exposed as REST endpoints, it’s reasonable to conclude that the Web Services connector is the best option to interface with this framework.
However, there are a few challenges to tackle:

  • Response Format: Web Services connectors work best with structured data like JSON or XML. The SaaS Connectivity output is plain text in a JSONL format, which requires some workarounds.
  • Schema Complexity: Replicating desired schemas, parsing responses, and constructing provisioning payloads manually is tedious and error-prone. We need automation.
  • Configuration Management: We still need a way to define and maintain the connector’s configuration. There’s no dedicated UI for this—but considering the value this integration provides, it’s a compromise worth making.

I’ll walk you through the key configuration details. At the bottom of this post, you’ll find the necessary artifacts to get you started.

Connection Options

This part is straightforward. You need to point your connector to the HTTP address it’s listening on—typically port 3000 for local setups.

Schemas

Schemas must be configured manually, much like on ISC. The process is virtually the same, just without the nice interface or automatic discovery. If your connector supports custom schemas, you’ll also need to figure out how to include that information in your requests—similar to how I manage the automatic injection of configuration into each call.

Operations

Operation setup will vary depending on the connector. You’ll need to:

  • Map out each operation you plan to use
  • Define the request structure
  • Map the response accordingly


This can be extremely cumbersome for complex connectors, so I automated several pieces:

  • Config Injection: A dedicated Web Services Before Operation Rule reads the config and injects it into the request payload, so you don’t have to include config manually in every operation.
  • Automatic Update Account Request Injection: This rule (which calls the previous one) parses the provisioning plan and generates the appropriate change instructions.
  • Automatic Response Mapping: A dedicated Web Services After Operation Rule parses the JSONL response into valid JSON, then maps attribute values one-to-one based on the configured schema.
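To illustrate the mapping idea behind that last rule (this is a Python sketch, not the actual Beanshell rule), the one-to-one projection onto a configured schema looks like this; the attribute names are illustrative:

```python
def map_account(data, schema_attributes):
    """Project a parsed 'output' object onto a configured schema.

    Only attributes declared in the schema are kept, mapped
    one-to-one by name; anything else the connector returns
    is ignored.
    """
    return {attr: data.get(attr) for attr in schema_attributes if attr in data}

account = map_account(
    {"name": "john.doe", "email": "jd@example.com", "internal_id": "42"},
    ["name", "email", "status"],
)
# account -> {"name": "john.doe", "email": "jd@example.com"}
```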

Configuration

The Web Services connector doesn’t offer a great way to store configuration data. My workaround? I use an unused operation named config and store the configuration in its Body Text area. Not the most user-friendly solution—but it works.

[Screenshot: the config operation with the configuration stored in its Body Text area]

Bonus: Installing the Connector as a Service (Windows Example)

There are multiple ways to run the connector, but I’ve found one particularly convenient for Windows: NSSM (Non-Sucking Service Manager)—yes, that’s the actual name! It’s simple to set up. Just follow the configuration shown in the screenshot for reference.

Conclusion

I genuinely hope this concept sparks an official effort to integrate this powerful connector framework with IdentityIQ in a more standardized way. As my findings show, there’s nothing fundamentally preventing it. More importantly, I hope this helps you move forward with your own project—whether you’re experimenting or deploying in production.

Artifacts

application.xml (66.1 KB)
SaaS-WSAO Automatic Response Mapping rule.xml (3.2 KB)
SaaS-WSBO Add Automatic Update Changes rule.xml (5.0 KB)
SaaS-WSBO Inject Config rule.xml (1.5 KB)
