Source showing error "Account File Import Failed" with Delimited source while aggregation is running successfully

Hi Everyone,

We are facing an issue where the source shows the error "Account File Import Failed" for a Delimited File source, even though the aggregation runs successfully on a daily basis. If anyone has faced this issue, please let me know how we can get this message removed.



Hi @Deepak_Chaudhary, something to look at is how the final row in your file is terminated.

Are you using the File Upload Utility, or uploading the file manually?
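The final-row check above is easy to do yourself. A minimal sketch (the file name is a placeholder) that classifies how the last row of a delimited file is terminated:

```python
def final_row_terminator(data: bytes) -> str:
    """Classify how the last row of a delimited file ends."""
    if data.endswith(b"\r\n"):
        return "CRLF"
    if data.endswith(b"\n"):
        return "LF"
    return "none"  # no trailing newline on the final row

# For a real file (path is an assumption):
# with open("accounts.csv", "rb") as f:
#     print(final_row_terminator(f.read()))
print(final_row_terminator(b"id,name\n1,Deepak\r\n"))
```

A missing or inconsistent terminator on the last line is a common cause of flat-file import errors even when the data itself is fine.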

We are using the File Upload Utility.

I will check and confirm if this is the case, but the question remains: the daily aggregation is successful.


The source might be using the oldest file that is present. Did you verify whether the data in the latest file is present in ISC? Maybe it is running the aggregation against an old file.

Hi @msingh900, I checked that the data is present in ISC, as per the latest aggregation count.

Hi @Deepak_Chaudhary ,

Since you’re using the file utility, could you please check the logs generated by your configured PowerShell script?

Additionally, is there any difference in the syntax between the new file and the last successfully aggregated file?


Hi @AsGoyal, I have checked the account file and the schema; both look good to me.

I don’t have access to check the logs from the IQService. I think I need to raise a SailPoint support case.


Hi @j_place @msingh900 @AsGoyal

I found that the file contains the special character “Ç” in the last name of one identity, and it is showing up as “?” in the “Account Name” on the source.

Is this the real issue? What are your thoughts?
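A quick way to confirm this is to scan the raw file for bytes that are not plain ASCII and check whether they decode cleanly as UTF-8. A sketch (not the actual connector behaviour, just an illustration): a “Ç” saved by Excel in Windows-1252 becomes the single byte 0xC7, which is invalid UTF-8 and typically ends up rendered as “?”.

```python
def find_suspect_bytes(data: bytes):
    """Return (offset, byte) pairs for non-ASCII bytes, plus a flag
    telling whether the whole payload decodes cleanly as UTF-8."""
    suspects = [(i, b) for i, b in enumerate(data) if b > 0x7F]
    try:
        data.decode("utf-8")
        utf8_ok = True
    except UnicodeDecodeError:
        utf8_ok = False
    return suspects, utf8_ok

# "Ç" encoded as cp1252 is the single byte 0xC7 — invalid on its own in UTF-8:
row = "1,François Ç".encode("cp1252")
print(find_suspect_bytes(row))
```

Running this over the failing account file would show exactly which rows carry bytes in the wrong encoding.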


@Deepak_Chaudhary That’s correct. If an unexpected character is present in the delimited file, SailPoint will usually give this error.

I have observed this behaviour in IIQ, but in ISC I have not come across this kind of issue so far.

This could be the issue.

Hi @msingh900, this is not an extra character but a special character that is legitimately part of the name. What are the possible solutions to handle this, if you have any ideas?

Hi everyone,

If the flat file is saved with the correct character encoding, preferably UTF-8 with BOM, will this resolve the issue?


It should be UTF-8. This might fix the problem.
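If the file is currently being exported in a legacy encoding, it can be re-saved as UTF-8 before upload. A minimal sketch, assuming the source encoding is Windows-1252 (cp1252) — both the encoding and the file names are assumptions to verify against your export:

```python
def reencode_to_utf8(src: str, dst: str, src_enc: str = "cp1252") -> None:
    """Re-write a delimited file as UTF-8 with BOM.

    src_enc is an assumption — many Windows/Excel CSV exports are cp1252.
    Use encoding="utf-8" instead of "utf-8-sig" if the import rejects a BOM.
    """
    with open(src, "r", encoding=src_enc) as f:
        text = f.read()
    with open(dst, "w", encoding="utf-8-sig") as f:
        f.write(text)

# Usage (file names are placeholders):
# reencode_to_utf8("accounts.csv", "accounts_utf8.csv")
```

After this, a “Ç” survives the round trip as valid UTF-8 instead of degrading to “?”.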
