Automating delimited file aggregations using the File Upload Utility

Description

Join Expert Ambassador Krishna Mummadi as he showcases a few ways to use the File Upload Utility tool in Identity Security Cloud. Krish shows how to automate delimited file aggregations, compare feed files, avoid unnecessary access gaps, and send emails with workflows.

Find the File Upload Utility and other awesome tools in the CoLab!

Resources

The PowerShell script is in the process of being added to the CoLab.

9 Likes

I am receiving the "Connection Timeout" error below in our environment. What parameters do I need to check to resolve the issue? Please advise.

java.net.SocketTimeoutException: Connect timed out
        at java.base/sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:546)
        at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:597)
        at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:333)
        at java.base/java.net.Socket.connect(Socket.java:648)
        at okhttp3.internal.platform.Platform.connectSocket(Platform.java:130)
        at okhttp3.internal.connection.RealConnection.connectSocket(RealConnection.java:263)
        at okhttp3.internal.connection.RealConnection.connect(RealConnection.java:183)
        at okhttp3.internal.connection.ExchangeFinder.findConnection(ExchangeFinder.java:224)
        at okhttp3.internal.connection.ExchangeFinder.findHealthyConnection(ExchangeFinder.java:108)
        at okhttp3.internal.connection.ExchangeFinder.find(ExchangeFinder.java:88)
        at okhttp3.internal.connection.Transmitter.newExchange(Transmitter.java:169)
        at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:41)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
        at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:94)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
        at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
        at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:88)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
        at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
        at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:229)
        at okhttp3.RealCall.execute(RealCall.java:81)
        at retrofit2.OkHttpCall.execute(OkHttpCall.java:207)
        at sailpoint.service.SailPointService.createSession(SailPointService.java:110)
        at sailpoint.utils.FileUploadUtility.call(FileUploadUtility.java:235)
        at sailpoint.utils.FileUploadUtility.main(FileUploadUtility.java:145)

Where are you running the File Upload Utility command from: some server in your environment, or your computer?

Check by running it from your computer as well; it's just calling a public API. My guess is there's some network connectivity issue on the server you are running it from.
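
For a quick connectivity check from that server, something like the sketch below works (the tenant hostname is a placeholder; substitute your own tenant's API host):

    # Quick check that the API host is reachable on port 443 from this machine.
    # The hostname below is a placeholder for your own tenant's API host.
    Test-NetConnection -ComputerName "tenant.api.identitynow.com" -Port 443

If TcpTestSucceeded comes back False, look at the firewall and proxy rules on that server.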

Cheers
Krish

1 Like

Great video, very informative! Thanks, Krishna.
Quick question: What’s the best way to schedule this PowerShell script to run automatically? Should I use Windows Task Scheduler, or is there a better method?

1 Like

Of course, Windows Task Scheduler
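
A minimal sketch of registering such a task; the script path, task name, and run time below are placeholders for your own:

    # Run the upload script every day at 2 AM via Task Scheduler.
    # The script path, task name, and time are placeholders.
    $action  = New-ScheduledTaskAction -Execute "powershell.exe" `
        -Argument '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\FileUpload.ps1"'
    $trigger = New-ScheduledTaskTrigger -Daily -At 2am
    Register-ScheduledTask -TaskName "ISC File Upload" -Action $action -Trigger $trigger

Run it once from an elevated PowerShell session; the task then runs on schedule without further interaction.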

2 Likes

Hi. What would be the best way to implement HA for the File Upload Utility? Let's say I have two IQService hosts (primary and secondary), and they both connect to a shared drive to pick up the delimited file. If the first IQService host running the File Upload Utility goes down, what is the best practice for failing over to the second?

Regards,
Matt

Where can I find the PowerShell script?

Where can I get the PowerShell script?

I have developed a method for implementing the File Upload Utility via PowerShell that can be installed on a separate server to perform user list standardization and upload to SailPoint via API. I have a GitHub repo here:

If there are any questions, please let me know.

2 Likes

In the current File Upload Utility (FUU) setup, we specify the path of our HR file in the config file, and that file is stored locally before being uploaded to SailPoint. I'm exploring whether we can automate the file selection process. For example, the HR team could place the file in a specific folder accessible to the IdentityIQ service (where my script runs). The script would then automatically detect and pick up the latest file from that folder, without requiring manual updates to the file name or path in the config each time. Any suggestions or guidance you can provide would be appreciated. Thank you.

1 Like
  1. Create a folder and share it with the HR team so that they can push the file there every day.
  2. Give this folder path in your config file.
  3. Your HR team needs to maintain a file naming convention, for example: HR_Feed_DDMMYYYY.csv
  4. In the script, read the home path from the config file and then append the file name plus the current date in that format; see the sketch below.
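
A minimal sketch of step 4 in PowerShell, assuming a plain-text config file whose first line is the shared folder path (the config path and file names are placeholders):

    # Read the shared folder path from the config file (placeholder path).
    $homePath = (Get-Content "C:\FUU\config.txt" -First 1).Trim()

    # Build today's file name using the HR_Feed_DDMMYYYY.csv convention.
    $fileName = "HR_Feed_{0}.csv" -f (Get-Date -Format "ddMMyyyy")
    $filePath = Join-Path $homePath $fileName

    # Optional: fall back to the newest CSV if today's file is not there yet.
    if (-not (Test-Path $filePath)) {
        $filePath = Get-ChildItem $homePath -Filter *.csv |
            Sort-Object LastWriteTime -Descending |
            Select-Object -First 1 -ExpandProperty FullName
    }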
1 Like

We currently manage approximately 100 flat file sources, and our goal is to automate the ingestion of these files using the File Upload Utility. A dedicated team is responsible for maintaining these source CSV files, including updates such as adding or removing accounts.
I have a few questions regarding the implementation:

  • Event-Based Triggering: Instead of scheduling uploads for all 100 sources daily, is it possible to trigger the file upload utility based on changes detected in each source file?

  • Script Consolidation: To streamline the aggregation process, can we implement a single PowerShell script to handle uploads for all sources? Any suggestions or best practices would be greatly appreciated.

  • HR Source Segregation: We already have an HR source utilizing a scheduled file upload utility. For the new sources, I plan to use a separate folder structure and file path to ensure there is no impact on the existing HR aggregation process. Any suggestions or best practices would be greatly appreciated.

  1. Event-Based Triggering: Some program would have to run every minute to detect the events and trigger the File Upload Utility. It's better to use a scheduled approach; this is more of a programming and design decision.
  2. Bulk Uploading: You can use the option below to upload all the files in a folder (a scripted sketch follows):
     -f <arg>, --file <arg>   (Required) File or directories for bulk aggregation. This can be specified multiple times.
     Example: --file /Users/neil.mcglennon/test/resources/
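
A minimal sketch of a single consolidated script, assuming one subfolder per source under a root folder; the jar path, sources root, and auth flags are placeholders for your own setup (only --file is taken from the option above):

    # Sketch only: jar path, sources root, and auth flags are placeholders.
    $jar      = "C:\FUU\fileuploadutility.jar"
    $authArgs = @("--url", "https://tenant.api.identitynow.com")  # plus your credential flags

    # --file accepts a directory, so each source folder is bulk-aggregated
    # in one call; looping the folders covers all ~100 sources in one script.
    foreach ($dir in Get-ChildItem "C:\FUU\sources" -Directory) {
        java -jar $jar $authArgs --file $dir.FullName
    }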
1 Like