Hi everyone, A quick question about ISC.
In IdentityIQ, we could use CustomObject to store custom data types… In ISC, is there any supported equivalent concept to store/manage a custom entity (non-identity domain) as its own object? For example, a “Location/Facilities” entity (site/building/floor) or physical-access data (lockers, visitors, parking).
I know ISC can store location as an identity attribute, but in our scenario, location is master/reference data that needs to be referenced by workflows and access processes; we’re trying to confirm whether ISC can model this as a separate entity, or if it’s intentionally out of scope.
If ISC doesn’t support custom entities/object storage:
What’s the recommended pattern you’ve seen in real ISC projects?
Is the best practice to keep reference/master data in an external system (iPaaS/MDM/CMDB/etc.) and only sync keys/attributes into ISC?
Any supported approach to query reference data from workflows (via HTTP/API) when needed?
Any pointers or docs would be appreciated. Have a great one!
No. ISC is an identity platform, obviously, so it wouldn’t make sense for it to also house other object types. Location is an attribute of an identity that should be sourced from the authoritative source.
To your point, though, there are other related attributes that we could want to assign based on the provided value - for example, we may want to set an address based on the office name. The best way to do that right now would be a lookup transform that returns the address value based on the office name, but that quickly grows out of hand when you need to set the street address, the city, the state, the postal code, the country, the timezone, etc., all based on the same office name. It would be great if there was a way to source all of those from a central lookup table so we don’t have to update 6 separate lookup transforms every time a new office opens. Obviously the best solution is to get all of those values directly from the source system, but that’s not always possible (for instance, in my environment, Workday returns home addresses for remote users as their work address, which makes sense from the Workday perspective but is obviously not what we want to populate downstream for these folks for privacy reasons).
You could write a script (or even a workflow, if you’re feeling brave) to populate a transform based on values you get from a separate database or whatever using the ISC API, but there’s nothing prepackaged for that at the moment.
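To make the “script that populates a transform” idea concrete, here's a minimal sketch. The `lookup` transform shape (an `attributes.table` map with a required `default` key) comes from the ISC transforms API; the office data and transform names are illustrative assumptions, e.g. rows exported from an MDM/CMDB:

```python
# Sketch: rebuild an ISC "lookup" transform's table from external
# reference data, so one office record can drive several per-attribute
# lookup transforms (city, timezone, ...) without hand-editing each one.

def build_lookup_table(rows, key_field, value_field, default=""):
    """Build the attributes.table map for an ISC lookup transform."""
    table = {row[key_field]: row[value_field] for row in rows}
    table["default"] = default  # lookup transforms require a default entry
    return table

def build_transform_body(name, table):
    """Assemble the body expected by the v3 transforms API."""
    return {"name": name, "type": "lookup", "attributes": {"table": table}}

# Illustrative reference rows (in practice, pulled from MDM/CMDB/DB):
office_rows = [
    {"officeCode": "AUS", "city": "Austin", "timezone": "America/Chicago"},
    {"officeCode": "LON", "city": "London", "timezone": "Europe/London"},
]

city_transform = build_transform_body(
    "Office Code to City",
    build_lookup_table(office_rows, "officeCode", "city", default="Unknown"),
)
# To apply: PUT {tenant}/v3/transforms/{id} with this body, authenticated
# with an OAuth bearer token (client-credentials grant).
```

The same `office_rows` dataset can then feed the timezone, country, and postal-code transforms in one pass, which is the maintenance win over six separately edited lookups.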
If you take the notion of machine identity / IoT (non-human) identity further, then you can envision a Location / Site / Building / Facilities as identity types / profiles. And you can configure connectors to read from their respective Location / Site / Building / Facilities authoritative systems, thereby creating ‘local’/staged/mirrored data representations of those entities in ISC.
If you’re coming from an object-oriented mindset: an Identity Profile is like a class, and each ‘identity’ is an instance of that class.
Then you can reference them in transforms using the “Get Reference Identity Attribute” operation. Say, get the ‘address’ attribute of a B-xyz Building identity, where “xyz” is the reference value from the human identity.
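For reference, the “Get Reference Identity Attribute” operation is exposed as a `rule` transform that delegates to SailPoint's Cloud Services Deployment Utility rule. A hedged sketch of that transform body, shown here as a Python dict (the `B-xyz` uid and `address` attribute are the illustrative values from the example above; in a real config the uid would typically be resolved dynamically from the human identity's building reference):

```python
# Sketch of a "Get Reference Identity Attribute" transform body:
# read the 'address' attribute from the Building reference identity.
reference_transform = {
    "name": "Get Building Address",
    "type": "rule",
    "attributes": {
        "name": "Cloud Services Deployment Utility",
        "operation": "getReferenceIdentityAttribute",
        "uid": "B-xyz",              # uid of the Building reference identity
        "attributeName": "address",  # attribute to read from that identity
    },
}
```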
Note: Just make sure these identities are not active, so they don’t count towards your license.
That matches what I suspected: no first-class Custom Object equivalent in ISC, and the clean approach is to treat Location as authoritative data and keep it on the source side.
The lookup transform point is exactly the pain we’re trying to avoid, unfortunately… Once it’s more than one field (street/city/country/timezone, etc.) it becomes a maintenance nightmare.
Two quick follow-ups if you don’t mind:
Have you seen a “supported” pattern for centralizing those lookups (even if external), without having to maintain multiple transforms per attribute?
If we go with “external table + API”, would you recommend doing it via workflow + HTTP to the external system and then writing back to ISC attributes (so transforms stay simple), or do you usually keep it entirely outside ISC and only push resolved values in from the authoritative source/integration layer?
Thanks again so much, this is helpful!
Hi @David_Norris, interesting approach, thanks for sharing. I was thinking about the same NHI management idea, and yes, I’m coming from an OOP mindset, so I had the same thought but need to validate it and learn more about the pros and cons. I really appreciate your interesting angle and approach!
Using Location/Site/Building as separate Identity Profiles and referencing them via “Get Reference Identity Attribute” sounds workable as a staged reference-data model.
A couple quick checks:
In practice, should we treat them like “reference/normal identities” with a stable key (siteCode/buildingId) and simple correlation?
Any gotchas you’ve hit (processing noise, reporting, refresh, etc.)?
On licensing… have you actually validated that keeping them inactive means they don’t count, or is that just a rule of thumb?
If you’ve done this in practice, any guardrails you’d recommend would help a lot.
Thanks again, I really appreciate your time, effort, and approach :)!
Yeah, use a stable key (site code / building ID) so the authoritative record correlates. Any account attribute changes will then continue to flow through correctly (e.g. a site’s telephone number changing).
This notion works well across various IdM/IGA solutions that have a customizable entity/identity object model (e.g. coming from ISIM/ISVG, or IIQ). The IIQ custom object was nice… but you had to code around how that custom object’s data comes into and out of existence. Handling the entities as identities generally tends to scale better (and also gives you graphical searchability, visibility, and reporting).
Unfortunately, no. If the attribute values can’t be sourced from the authoritative source, you’ll need to use a transform (or multiple transforms) that you’d have to update either by hand or via the API. I don’t know if I’d use a workflow for this, but I guess it depends on what other options you have for running automated tasks. For something like this, the workflow functionality is very limited and difficult to work with, in my opinion; you’d be better off writing a script in PowerShell or whatever and throwing it on your IQService host or GitHub Actions or something.
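A sketch of the update step such a scheduled script would run, assuming the ISC v3 transforms API (`PUT /v3/transforms/{id}` on the tenant's API host) and a token obtained elsewhere via the client-credentials grant; `merge_offices` and the sample table contents are hypothetical:

```python
# Sketch: merge new reference rows into an existing lookup transform
# body, then push it back to ISC. Token acquisition is elided.
import json
import urllib.request

def merge_offices(transform_body, updates):
    """Merge new key->value entries into a lookup transform's table."""
    transform_body["attributes"]["table"].update(updates)
    return transform_body

def put_transform(tenant, token, transform_id, body):
    """PUT the updated transform back via the v3 transforms API."""
    req = urllib.request.Request(
        f"https://{tenant}.api.identitynow.com/v3/transforms/{transform_id}",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    return urllib.request.urlopen(req)

# Hypothetical current state fetched earlier via GET /v3/transforms/{id}:
current = {"name": "Office Code to City", "type": "lookup",
           "attributes": {"table": {"AUS": "Austin", "default": ""}}}
updated = merge_offices(current, {"LON": "London"})
# put_transform("acme", token, transform_id, updated)  # network call
```

Running this nightly from GitHub Actions (or the IQService host) keeps the transforms in sync with the external table without anyone editing them by hand.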
On that note, just want to add this:
When it comes to relatively ‘static’ entities like Location / Site / Building, etc. (i.e. it’s not as if they get a new telephone number or postal code, or new buildings pop up, every 6 hours), I tend to shy away from approaches with an external uptime/availability dependency and opt for staged/mirrored data instead, together with the question: “Is the last known info ‘good enough’?”
But yeah, how frequently those entities get updated is a factor to consider.
This is a bigger issue, obviously, in larger orgs. Companies with lots of locations may add, change, and remove locations very frequently - think restaurants, banks, retail, etc.