IDW Rule Runner Plugin: Rapid Development and Troubleshooting in the Browser

Additional Resources

Here is the link to the Rule Runner public repository:

My prototyped customization rule:

// === this is for development, remove when complete
import sailpoint.connector.ConnectorFactory;
import sailpoint.connector.Connector;
import sailpoint.object.ResourceObject;
import sailpoint.tools.CloseableIterator;

Connector ac = ConnectorFactory.getConnector(application, null);
// === end of block

import sailpoint.tools.Util;

try{
  // Verify no issues with the Application configuration
  ac.testConfiguration();  
  
  // If this application has a large number of accounts you can decache as you work through the loop and check for edge cases
  // If you only have one user in mind you can always change this to something like a getObject (sketched below)
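  // A minimal sketch of that single-user alternative; "jdoe" is a placeholder native identity,
  // not part of the original rule:
  // ResourceObject single = ac.getObject("account", "jdoe", null);
  // log.debug(single.getAttributes());
  // For very large applications, a periodic context.decache() inside the while loop keeps the cache in check.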
  
  CloseableIterator<ResourceObject> ro = ac.iterateObjects("account", null, null);
  while(ro.hasNext()){
    ResourceObject object = ro.next();
    // Let's start by seeing what data we have to work with
    log.debug(object.getAttributes());
    
    // This also gives you a chance to check for those pesky edge cases before going live 
    // if(Util.isNullOrEmpty(object.getAttribute("id")) || Util.isNullOrEmpty(object.getAttribute("username")) || object.getAttribute("username").length() < 3){
    //    log.error("null value found or username length too short!");
    //    log.error(object.getAttributes());
    //    object.setAttribute("id","100");
    // }
    
    // OK, that's what we're looking for; let's finish this out
    // log.debug(object.getAttribute("username").substring(0,3) + object.getAttribute("id"));
    // object.setAttribute("id",object.getAttribute("username").substring(0,3) + object.getAttribute("id"));
    // log.debug(object.getAttributes());
    
    // This looks great! let's return the object and verify
    //return object;
    
  }
} catch (Exception e){
  log.error(e);
}

My clean stale workflow case rule:

import sailpoint.object.*;
import sailpoint.api.Terminator;
import java.util.*;
import java.util.concurrent.TimeUnit;

// Demoing the stop function
//TimeUnit.SECONDS.sleep(60);

try{
  // Set this to a name matching the WorkflowCases to be deleted (prefix match)
  String workflowCaseName = "Dead WorkflowCase";

  // Set this to the number of days WorkflowCases should be kept, e.g. 1 = delete all objects older than 1 day
  int daysGapToDelete = 30;

  int countDeletedWorkflows = 0;
  Terminator term = new Terminator(context);
  QueryOptions qo = new QueryOptions();

  // Calculate cutoff date
  Calendar cal = new GregorianCalendar();
  cal.add(Calendar.DATE, -daysGapToDelete);
  Date cutOffDate = cal.getTime();

  // Get WorkflowCases matching the name criteria and created before the cutoff date
  qo.addFilter(Filter.lt("created", cutOffDate ));
  qo.addFilter(Filter.like("name", workflowCaseName, Filter.MatchMode.START));
  term.setTrace(true);

  Iterator it = term.getIds(WorkflowCase.class, qo);

  int limiter=0;
  String listOfWfCase = "";

  while(it.hasNext()){ 
    log.info("Found workflowcases to delete");
    String id = (String)it.next();
    log.info("Found id: " + id);
    WorkflowCase wfCase = context.getObject(WorkflowCase.class, id);
    log.info("Got wfCase:" + wfCase);

    // Pull the WorkflowCase into the termination list if its TaskResult is null; this is the cause of the PPM errors
    if(wfCase != null){
      if(wfCase.getTaskResult() == null){
        listOfWfCase = listOfWfCase + "Found WorkflowCase for termination " + wfCase.getId() + " - " + wfCase.getName() + " - " + wfCase.getTaskResultId() + "\n";
        countDeletedWorkflows++;
        try {
          term.deleteObject(wfCase);
        } catch (Exception e) {
          log.info("Got exception deleting workflowcase: " + e.toString());
          return wfCase;
        }
        limiter++;
        // Adjust this value to tune performance
        if(limiter % 200 == 0){
          context.decache();
        }
      } else {
        log.info("wfCase.getTaskResult(): " + wfCase.getTaskResult());
      } 
    } else {
      log.info("wfCase is null");
    }
  }
  log.info("finished loop with limiter:" + limiter);
  context.commitTransaction();

  if(!"".equals(listOfWfCase)){
    return listOfWfCase +"Total WorkflowCases deleted: " + countDeletedWorkflows;
  }
  else{
    return "No matching WorkflowCases found";
  }
}
catch (Exception e) {
  log.error(e);
}

A rule I use to test connection on all applications in an environment from rule runner:

import sailpoint.object.*;
import sailpoint.connector.Connector;
import sailpoint.connector.ConnectorFactory;

try{
  
  QueryOptions qo = new QueryOptions();
  List applicationList = context.getObjects(Application.class, qo);
  String resultString = "";

  for(Application application : applicationList){
    //log.debug(application);
    try{
      Connector connector = ConnectorFactory.getConnector(application, null);
      connector.testConfiguration();
      log.debug("Test connection successful on: " + application.getName());
      resultString = resultString + "[OK] Test connection successful on: " + application.getName() + "\n";
    }
    catch(Exception e){
      log.debug("Test connection failed on: " + application.getName());
      log.debug(e);
      resultString = resultString + "[FAIL] Test connection failed on: " + application.getName() + " || " + e + "\n";
    }
  }
  
  return resultString;
}
catch(Exception e){
  log.error("Error when retrieving the applicaiton list: " + e);
}

My search object contents rule:

/* The premise of this rule is to search the code currently live in a SailPoint environment for code that can't be found in the SailPoint repo or database for whatever reason. Simply change the searchTerm and iteration variables to search any rules containing the searchTerm. The object type searched can be changed from Rule by adjusting the class passed to the context.getObjects call (e.g. Workflow.class). */

import sailpoint.object.*;

//Set this to limit the number of objects searched for performance purposes
int iteration = 500;

//Set this to the term you are searching for in the live codebase 
String searchTerm = "plan.toXml";



QueryOptions qo = new QueryOptions();
String searched = "";
String foundInstances = "";

try{
  //Make sure to change the object class if you'd like to search something else like workflows
  List ruleList = context.getObjects(Rule.class, qo);
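  // A hedged example (not part of the original rule) of searching Workflows instead:
  // List wfList = context.getObjects(Workflow.class, qo);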

  log.debug(ruleList.size() + " objects ready to search...");
	
  for(Rule rule : ruleList){
    if(rule.toXml().contains(searchTerm)){
      //log.debug below is good when running in rule runner
      //log.debug( "Found it in: " + rule.getName());
      foundInstances = foundInstances + "***Found in: " + rule.getName() + "***\n";
      int lineNumber = 0;
      for(String line : rule.toXml().split("\n")){
        lineNumber++;
        if(line.contains(searchTerm)){
          foundInstances = foundInstances + "Line " + lineNumber + ": "+ line.trim() + "\n\n";
          break;
        }
      }
    }
    else{
      searched = searched + "Not found in: " + rule.getName() + "\n";
    }
    if(iteration == 0){
      return foundInstances + searched;
    }
    else{
      iteration--;
    }
  }

  return foundInstances + searched;
} catch(Exception e){
  log.error("Failed to search for: " + searchTerm + " || Exception: " + e);
}

A rule I've used to create cloned identities and the rule I use to clean them up:

import sailpoint.object.Identity;
import sailpoint.object.Capability;

Identity provisionWorkgroup(){
    Identity clone_workgroup = context.getObjectByName(Identity.class, "IDW Cloned Identities");
    Identity spadmin = context.getObjectByName(Identity.class, "spadmin");

    if(clone_workgroup == null){
        log.debug("Workgroup does not exist, beginning provisioning process...");
        clone_workgroup = new Identity();
        clone_workgroup.setName("IDW Cloned Identities");
        clone_workgroup.setDisplayName("IDW Cloned Identities");
        clone_workgroup.setWorkgroup(true);
        clone_workgroup.setNotificationOption(Identity.WorkgroupNotificationOption.Both);
        clone_workgroup.setDescription("Workgroup used to easily manage cloned Identities from the IDW Clone Identity function. *This is an auto provisioned workgroup*");
        clone_workgroup.setOwner(spadmin);
        log.debug("Committing clone workgroup");

        context.startTransaction();
        context.saveObject(clone_workgroup);
        context.commitTransaction();
    }
    else{
        log.debug("Workgroup already exists...");
    }

    log.debug(clone_workgroup);
    return clone_workgroup;
}

// Use the top line to provision one identity and the bottom line for use with a retrieval script in a multi-threaded rule
// Identity original_identity = context.getObjectByName(Identity.class, "zac_test");
Identity original_identity = object;

Identity workgroup = provisionWorkgroup();
Identity identity = new Identity();
//String name = "dynamic identity2";

String name = original_identity.getName() + "_cloned";
String password = "password1";

log.debug("Creating Identity with the name: " + name);

identity.setName(name.replace(" ", ""));
identity.setPassword(password);
identity.setDisplayName(name);

identity.setProtected(original_identity.isProtected());
identity.setFirstname(original_identity.getFirstname());
identity.setLastname(original_identity.getLastname());
identity.setUIPreferences(original_identity.getUIPreferences());

for(Capability cap : original_identity.getCapabilities()){
    log.debug(cap);
    identity.add(cap);
}

for(Identity wg : original_identity.getWorkgroups()){
    log.debug(wg);
    identity.add(wg);
}

identity.add(workgroup);

log.debug("Committing Identity");

context.startTransaction();
context.saveObject(identity);
context.commitTransaction();

log.debug("Successfully created Identity!");
log.debug(identity);

And the cleanup rule:

import sailpoint.object.Identity;
import java.util.*;
import sailpoint.api.ObjectUtil;
import sailpoint.api.Terminator;

Identity clone_workgroup = context.getObjectByName(Identity.class, "IDW Cloned Identities");
Identity test_identity = context.getObjectByName(Identity.class, "zac_test_cloned");
List<String> deletedIdentitiesList = new ArrayList<String>(); 
Terminator terminator = new Terminator(context);

if(clone_workgroup != null){ 
    Iterator members = ObjectUtil.getWorkgroupMembers(context, clone_workgroup, null);
    while(members.hasNext()){
        Object[] object = (Object[]) members.next();
        Identity clonedIdentityToDelete = (Identity) object[0];
        if(clonedIdentityToDelete.isProtected()){
            log.debug(clonedIdentityToDelete.getName() + " is a protected identity, removing protection status...");
            clonedIdentityToDelete.setProtected(false);
        }
        log.debug("Deleting cloned Identity: " + clonedIdentityToDelete.getName());
        deletedIdentitiesList.add(clonedIdentityToDelete.getName());
        terminator.deleteObject(clonedIdentityToDelete);
    }
}

return deletedIdentitiesList;

Hi @zac_adams_idw,
Thanks for sharing the rule to clean up stale workflow cases. I just wanted to understand: what is the impact if there are stuck workflow cases and this data sits around for more than 3-4 months?
What performance issues would we face?
Will the Perform Maintenance task timings be impacted if we have stale workflow cases?

Well, usually they don't cause too much of a performance impact, as they are pushed along by the Perform Maintenance task and aren't something like a rule running on a thread in the background. However, this all depends on what your parent workflow is doing! For example, if a workflow case never finishes, a new one may be created in an endless loop. There is also a usability impact: in some cases you'll see all these old workflow cases on your task results page, making it impossible to find what you're looking for. Finally, remember that workflow cases are temporary objects and shouldn't exist in your DB after their purpose has been fulfilled. You'd be doing your DB teams a favor by keeping your tables clean! Hope that answers your question!
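
If you want a quick sense of how many stale WorkflowCases you have before running the cleanup rule, a minimal Rule Runner sketch along these lines counts the ones older than a cutoff (the 90-day cutoff below is just an example, not from the original post):

import java.util.*;
import sailpoint.object.*;

// Count WorkflowCases created more than 90 days ago (adjust the cutoff to taste)
Calendar cal = new GregorianCalendar();
cal.add(Calendar.DATE, -90);

QueryOptions qo = new QueryOptions();
qo.addFilter(Filter.lt("created", cal.getTime()));

int staleCount = context.countObjects(WorkflowCase.class, qo);
return "WorkflowCases older than 90 days: " + staleCount;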

Thanks @zac_adams_idw for addressing the query!

Hi @zac_adams_idw,
This looks great.
I am trying to get this plugin installed in my lab; however, it fails with a "Request Entity Too Large" message when I try uploading the zip file. Any pointers on how to fix this?

Hey @mike_black! It sounds like your Tomcat upload limits are still set too low. Try taking a look at maxHttpHeaderSize and maxPostSize in $TOMCAT_HOME/conf/server.xml on your server and bump them up.

Hi @mike_black, I had the same issue in my lab environment. My issue was related to the nginx proxy I am using in front of Tomcat.

My solution was to import the plugin from the IIQ console:
https://documentation.sailpoint.com/identityiq_83/help/plugins/workingwithpluginsconsole.htm?tocpath=Plugins|_____4

– Remold

Hi @zac_adams_idw

Amazing plugin, I'm making great use of it!

I've had one issue regarding logging, where periodically log.info/log.debug/etc. aren't displayed in the 'Logs' section.

If I create a new blank rule with just log.debug("Test"); it works fine. However, if I then load a rule with multiple log commands, nothing is returned. The rule runs and provides the expected output, just no logs. I even tried adding log.debug("Test"); on the first line of the opened rule, but to no avail.

Any thoughts on how to resolve this would be greatly appreciated, thanks!

In that case, it sounds like you may be leveraging a rule library with its own declaration of "log". This will supersede the log object passed in by the rule runner plugin and prevent the plugin from capturing those log outputs.
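
For illustration, a minimal sketch of the kind of rule library declaration that causes this (the logger name is hypothetical):

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

// A declaration like this inside a referenced rule library shadows the "log" object
// injected by the Rule Runner plugin, so the plugin cannot capture output written through it.
Log log = LogFactory.getLog("com.example.MyRuleLibrary");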

Thanks Zac, that was exactly it, I was referencing a rule library that was declaring log.

Is there a way to override that at all? Possibly by declaring log in the source rule so that it overrides the one from the referenced rule library?

Rule libraries are invoked first, so you should be able to reassign the same variable in your script. In the Rule Runner script, set:
ruleLibraryLog = log
