How do you configure a custom location for iiq.properties?

When running SailPoint IdentityIQ under Tomcat, how can you configure it to use an iiq.properties file outside of the WAR file's WEB-INF/classes/ directory?
I’d like to be able to manage the properties file separately from the WAR.

The file must be in the WEB-INF/classes folder; unfortunately there’s no way around that that I’m aware of. That said, the SSB (Services Standard Build) has the ability to use environment-specific properties files and will include the file for the target environment in the output WAR file.
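As a sketch of that environment-specific model (the token syntax and file naming here reflect my understanding of the SSB conventions; check the documentation for your SSB version): configuration files carry %%TOKEN%% placeholders, and a per-environment target.properties file supplies the values at build time, selected via the SPTARGET environment variable.

```
# iiq.properties as checked into the SSB, with %%TOKEN%% placeholders
# (token names here are illustrative):
dataSource.url=%%DATASOURCE_URL%%
dataSource.username=%%DATASOURCE_USERNAME%%

# sandbox.target.properties, applied when building with SPTARGET=sandbox:
%%DATASOURCE_URL%%=jdbc:mysql://sandbox-db:3306/identityiq
%%DATASOURCE_USERNAME%%=identityiq
```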


Thoughts on the following?

After a little digging I came up with this solution.
I enabled runCustomScripts in the SSB and added a scripts/build.custom.external-properties.xml:

 <?xml version="1.0" encoding="UTF-8"?>
 <project name="">

    <target name="post.expansion.hook">
        <unzip src="${build.iiqBinaryExtract}/WEB-INF/lib/identityiq.jar" dest="${build}/ext-tmp">
            <patternset>
                <include name="configBeans.xml"/>
            </patternset>
        </unzip>
        <replace file="${build}/ext-tmp/configBeans.xml" token="" value="${}"/>
        <jar destfile="${build.iiqBinaryExtract}/WEB-INF/lib/identityiq.jar" update="true" basedir="${build}/ext-tmp" includes="configBeans.xml"/>
    </target>

    <target name="post.war.hook"/>

    <target name="clean"/>

 </project>

This, by default, doesn’t change the behavior, but then under Tomcat you can set up a startup script (for example, bin/setenv.sh) that includes

 export CLASSPATH=$CLASSPATH:/path/to/configs

That will then load the alternative file.
This should support upgrades, etc. as long as there aren’t major changes.
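For completeness, here is the export above written out as a full bin/setenv.sh; /path/to/configs is the example directory from the earlier export, so substitute your own location.

```shell
#!/bin/sh
# Tomcat bin/setenv.sh: append the external config directory to the
# classpath so the properties file can be resolved from there.
# /path/to/configs is the example directory from the post.
CLASSPATH="${CLASSPATH:+$CLASSPATH:}/path/to/configs"
export CLASSPATH
```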

In theory it looks like that should work, but note that you’re technically modifying contents inside the primary IIQ application JAR, which means you might be limiting yourself when it comes to official IIQ support (modifying contents inside the IIQ JAR file is not a supported activity, even if it’s done via a custom SSB build script).

If you do go this route, please have a plan in place to use the OOB file location should you be asked by SailPoint Support to do so, as that will likely be one of the first things they’d ask for if you have issues with the DB or DB connectivity.

For me, I have a tokenized iiq.properties file and a startup script that runs before Tomcat starts.

I set the environment variable IIQDBURL (in Azure DevOps) to jdbc:mysql://<host_name>:<port>/<dbname>?useServerPrepStmts=true&tinyInt1isBit=true&useUnicode=true&characterEncoding=utf8&useSSL=false

I unzip the war: jar -xf identityiq.war

The file has tokens like this: dataSource.url=IIQDBURL

Then the script has sed statements like this: sed -i "s|IIQDBURL|$IIQDBURL|g" ./webapps/identityiq/WEB-INF/classes/iiq.properties

The end result is like this: dataSource.url=jdbc:mysql://<host_name>:<port>/<dbname>?useServerPrepStmts=true&tinyInt1isBit=true&useUnicode=true&characterEncoding=utf8&useSSL=false

I do this for all the per-environment variables: IIQDBDRIVER, IIQDBUSER, IIQDBPASS and so on.
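The per-variable sed calls above can be collapsed into a small loop. A minimal sketch, assuming the token names equal the environment variable names and the tokenized file is WEB-INF/classes/iiq.properties; a temp file stands in here so the sketch runs anywhere, and in a pipeline the values would come from the CI environment (e.g. Azure DevOps).

```shell
#!/bin/sh
# Stand-in for ./webapps/identityiq/WEB-INF/classes/iiq.properties
# after extracting the WAR.
PROPS=$(mktemp)
printf 'dataSource.url=IIQDBURL\ndataSource.username=IIQDBUSER\n' > "$PROPS"

# Per-environment values, normally injected by the pipeline.
# Note: & and \ in a value must be escaped before use in a sed
# replacement (unescaped & expands to the matched token).
IIQDBURL='jdbc:mysql://dbhost:3306/identityiq?useSSL=false'
IIQDBUSER='identityiq'

# One sed per token; | as the delimiter because JDBC URLs contain /.
# (-i as used here is GNU sed syntax.)
for VAR in IIQDBURL IIQDBUSER; do
    eval "VAL=\$$VAR"
    sed -i "s|$VAR|$VAL|g" "$PROPS"
done

cat "$PROPS"
```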

Matt’s approach (injecting the properties as part of a CI/CD pipeline) is one that I’ve seen work pretty well at several customers, too. The CI/CD automation here is key to ensuring consistency in the file updates/replacement (and this method is fairly similar to the long-established SSB model of file replacement as part of the build process).


The reason I like the CI/CD approach is that the built WAR file can be deployed to any environment; the WAR is no longer specific to dev, test, or prod. To take this one step further, this works great for the containerized IIQ deployments that we use. The same container can be run in dev/test/prod.

I agree with you on this. I’m trying hard to get our IdentityIQ deployments to fit into the “build once” best practices. Extracting the WAR file and overlaying our templated properties file inside webapps was my next option.
My preference for externalizing the properties file was to make system validation easier.
We can easily do a checksum on our WAR file and templated properties file. But if we try to compare the webapps folder with the WAR file to verify it matches, we would need custom code, because we would be overlaying files in it.
By allowing external properties to be referenced, we can validate the whole system:

  • webapps matches the WAR file
  • the WAR file matches the artifact from the DML
  • the properties file matches our data from our CMDB/vault
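The checks in that list boil down to checksum comparisons. A minimal sketch; the file paths are stand-ins (two temp files with identical bytes) so it runs anywhere, and in practice they would be the deployed WAR and the artifact pulled from the DML.

```shell
#!/bin/sh
# Temp files stand in for the deployed WAR and the DML artifact.
DEPLOYED=$(mktemp)
ARTIFACT=$(mktemp)
printf 'war-bytes' > "$DEPLOYED"
printf 'war-bytes' > "$ARTIFACT"

# Hash a file and keep only the digest column.
sum_of() { sha256sum "$1" | awk '{print $1}'; }

if [ "$(sum_of "$DEPLOYED")" = "$(sum_of "$ARTIFACT")" ]; then
    RESULT=match
else
    RESULT=mismatch
fi
echo "deployed WAR vs DML artifact: $RESULT"
```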

Thank you both for your input!