I am working on a Lookup transform and want to understand whether the values in the table can be supplied some other way at runtime, instead of entering them manually as key-value pairs.
For example, in the lookup table I am passing a key-value pair such as:
UserID (XXYZ): yes
I need to pass this UserID to the table at runtime. Is there any script or API call that can be used? Usually I paste the table values in manually, but I need to automate this.
In SailPoint Identity Security Cloud, the Lookup transform table is static and cannot be populated dynamically at evaluation time through scripts or APIs. The values must be configured manually in the transform definition. If you need dynamic values such as UserID, it is better to fetch them from an identity attribute, or populate them through a workflow or rule and then reference that attribute in the transform.
The way a lookup transform works is that you pass in a value, and that value is checked against the defined table. If the value exists in the table, the corresponding response value is returned; otherwise, the default case is returned. You’d have to define the values to search for as well as the responses for each case; there is no way to check an external source for the key-value pairs dynamically.
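To make that concrete, here is a minimal sketch of what a Lookup transform definition looks like and how its evaluation behaves (the field names follow the documented ISC transform shape; the `default` entry in the table supplies the fallback value, and the key/values come from the original example):

```python
# Minimal sketch of a Lookup transform definition.
# Hedged: structure follows the documented ISC "lookup" transform shape.
lookup_transform = {
    "name": "UserID Lookup",
    "type": "lookup",
    "attributes": {
        "table": {
            "XXYZ": "yes",    # key-value pair from the original example
            "default": "no",  # returned when the input matches no key
        }
    },
}

def evaluate_lookup(table: dict, value: str) -> str:
    """Simulate how the transform resolves an input against the table."""
    return table.get(value, table.get("default"))

table = lookup_transform["attributes"]["table"]
print(evaluate_lookup(table, "XXYZ"))  # yes (key found)
print(evaluate_lookup(table, "ABCD"))  # no  (falls through to default)
```

The point being: the table is part of the transform JSON itself, so the only way to "change" it is to change the definition, not to look values up elsewhere at evaluation time.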
You might want to take a look at this, to brainstorm:
…and yes, you can update the Lookup transform via API (e.g. with the PowerShell SDK)…but you'll need somewhere to store the script, the DB credentials, and the PAT, plus somewhere to run the script.
Is this going to be a one-time effort, or do you want to update the transform regularly by reading the CSV, say daily or once every few hours?
If you are thinking of reading from CSV in real time while identity attributes are calculated, then that’s not possible with Lookup transforms. What @kallajayaram has explained in this response is exactly what they are designed to do.
I am seeing that the responses are focused on your original question of how to update a Lookup transform, and these might not be leading you in the right direction. If you explain your requirement in pure business language, rather than in terms of how to design the solution using a specific type of transform, you might receive better suggestions.
Yes, a Rule Transform can be used for this requirement.
If you have a large number of key-value mappings (for example 100+), maintaining them directly inside a Lookup Transform can become difficult. In such cases, the mappings can be maintained in an external source such as a CSV file or a database table, and the rule can dynamically read those values during execution.
A possible approach would be:
1. Store the key-value mappings in a CSV file or database table.
2. In the rule, read the data and load it into a Map structure.
3. Compare the incoming input value with the keys in the map.
4. Return the mapped value (for example yes/no). If the key is not found, return the default value.
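The steps above can be sketched like this. Note this is an illustrative Python sketch of the logic only; actual ISC rules are written in Beanshell, and the sample data here is assumed:

```python
import csv
import io

def load_mappings(csv_text: str) -> dict:
    """Steps 1-2: read two-column key-value rows into a map."""
    reader = csv.reader(io.StringIO(csv_text))
    return {key.strip(): value.strip() for key, value in reader}

def resolve(mappings: dict, input_value: str, default: str = "no") -> str:
    """Steps 3-4: return the mapped value, or the default if the key is absent."""
    return mappings.get(input_value, default)

# Assumed sample data in "key,value" format.
sample = "XXYZ,yes\nABCD,yes\n"
mappings = load_mappings(sample)
print(resolve(mappings, "XXYZ"))  # yes
print(resolve(mappings, "PQRS"))  # no (default case)
```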
This approach improves scalability and maintainability, as updates can be made directly in the external data source without modifying the transform each time.
Another alternative is to automate the Lookup Transform using the Transforms API, where a script reads the CSV or database entries and updates the lookup table dynamically in the transform JSON.
Both approaches help avoid manually maintaining large lookup tables within ISC and make the solution easier to manage in the long term.
Not sure if I understand this part. Where would the file be stored? And would you be able to read a CSV file from a cloud rule, which is the rule type used for calculating identity attributes?
In ISC, cloud rules cannot directly read external files like CSVs or access databases, because they run in a restricted SailPoint environment.
A more practical approach is to maintain the key-value mappings in an external CSV file or database and use an automation script (for example Python or PowerShell) to convert the mappings into the lookup table JSON format.
The script can then call the SailPoint Transforms API to update the Lookup Transform automatically. This way the mappings can be maintained externally while the transform is updated programmatically.
This approach works well when there are 100+ mappings and avoids manually maintaining large lookup tables in the transform configuration.
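As a rough sketch of that automation, assuming a two-column CSV, a bearer token already obtained from your PAT, and placeholder tenant URL and transform ID (the endpoint shape follows the v3 Transforms API):

```python
import csv
import json
import urllib.request

def csv_to_lookup_table(csv_path: str, default: str = "no") -> dict:
    """Convert two-column CSV rows into the Lookup transform's table dict."""
    with open(csv_path, newline="") as f:
        table = {row[0].strip(): row[1].strip() for row in csv.reader(f) if row}
    table["default"] = default  # lookup tables need a default entry
    return table

def build_transform_body(name: str, table: dict) -> dict:
    """Build the JSON body expected by PUT /v3/transforms/{id}."""
    return {"name": name, "type": "lookup", "attributes": {"table": table}}

def update_transform(base_url: str, transform_id: str, token: str, body: dict) -> None:
    """PUT the updated definition to the v3 Transforms endpoint.

    base_url, transform_id and token are placeholders, e.g.
    "https://{tenant}.api.identitynow.com". Not called in this sketch.
    """
    req = urllib.request.Request(
        f"{base_url}/v3/transforms/{transform_id}",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(req)
```

A scheduled job (cron, Azure Automation, etc.) could run `csv_to_lookup_table` and push the result with `update_transform` whenever the source file changes.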
Thanks for pointing that out. I’m still learning ISC and exploring different approaches, so I appreciate the feedback.
My understanding was that since cloud rules in ISC cannot directly access external files like CSVs or databases, one possible approach could be to maintain the mappings externally and update the Lookup Transform programmatically using the Transforms API.
I’m definitely open to corrections and would really appreciate learning the recommended approach from the community for handling larger lookup mappings in ISC.
While I agree with you, I do not think it's necessary, as you can manually convert a CSV file to the JSON "key": "value" format simply by using a formula in Excel and copy/pasting the result into your transform using VS Code.