Login Information Parser

The Smart Service, Import Login Audit, reads your login audit file(s) and populates a database with information about logins. This is provided mainly as an example (source code included), since you may need to modify the smart service to work with your environment.

  • Requires a table named "logins" in the specified datasource; example SQL is provided.
  • Inputs:
    • datasourceName: the datasource where the login information will be populated
    • date: the date suffix of the login-audit.csv* file to read; if null, the latest file is read
  • Supports MySQL, Oracle, and Microsoft SQL Server databases
  • To those using this plugin: I discovered an issue where some of our users log IP addresses longer than the 20 characters set on the table in the documentation. The logged value is two addresses separated by a comma and a space. I recommend extending the ipaddress column length to at least 50 (I used 100 just to be safe) to avoid this issue. The plugin simply writes records up to the point where a record would be truncated and stops writing additional records.

    Here is a link to some information I found regarding why two IP addresses would be logged:
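A simple guard in code can also catch over-long values before they reach the database. This is a hypothetical helper (not part of the plugin); the 50-character limit matches the suggestion above:

```java
// Hypothetical guard: reject an ipaddress value that would be truncated
// by the database column. The 50-character limit follows the suggestion
// above; two IPv4 addresses joined by ", " fit comfortably within it.
public class IpColumnGuard {
    static final int IPADDRESS_COLUMN_LENGTH = 50;

    public static boolean fits(String ipAddressField) {
        return ipAddressField != null
            && ipAddressField.length() <= IPADDRESS_COLUMN_LENGTH;
    }

    public static void main(String[] args) {
        System.out.println(fits("10.0.0.1"));                    // true
        System.out.println(fits("203.0.113.7, 198.51.100.23"));  // true (26 chars)
    }
}
```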


  • Hi, I made an update to the plug-in listing, and the sample DDL for the table now references a length of 50.  Thanks for the suggestion!

  • Hi,

    We are currently on a High Availability (HA) setup and want to get a full picture of all the logins on the system.

    I came across this plugin recently. Do you know whether it caters for an HA multi-instance setup where there may be one login-audit file per instance?



  • As explained in docs.appian.com/.../High_Availability_and_Distributed_Installations.html, using a shared-logs folder is the recommendation in an HA setup.

    When you go to the logs of your cloud site, you should see that the logs are read from the shared-logs location.

    Looking at the code, it seems designed to look for files there, which suggests it should work:

         String logsPath = ConfigurationLoader.getConfiguration().getAeLogs();
         String logParentPath = new File(logsPath).getParent();
         String sharedLogsPath = logParentPath + File.separator + SHARED_LOGS_FOLDERNAME;

         if (new File(sharedLogsPath).exists()) {
             List<File> allSharedLoginAuditFiles = listf(sharedLogsPath);
             if (allSharedLoginAuditFiles.size() > 0) {
                 for (File file : allSharedLoginAuditFiles) {
                     // ...




    I encourage you to review the login-audit.csv files in the logs folder (i.e. shared-logs for HA) and check whether they contain all logins. If so, the plug-in will work, because it will read them from shared-logs.
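The scan described above can be sketched as a small standalone method. This is a hedged sketch, not the plugin's actual code: it mirrors the recursive `listf()` call from the snippet and assumes files are matched by the `login-audit.csv` name prefix.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: recursively collect login-audit.csv* files under a
// folder, mirroring the listf() call in the snippet above. The matching
// rule (name prefix) is an assumption, not the shipped plugin code.
public class SharedLogsScanner {
    public static List<File> findLoginAuditFiles(File dir) {
        List<File> found = new ArrayList<>();
        File[] children = dir.listFiles();
        if (children == null) {
            return found; // not a directory, or unreadable
        }
        for (File child : children) {
            if (child.isDirectory()) {
                found.addAll(findLoginAuditFiles(child)); // recurse like listf()
            } else if (child.getName().startsWith("login-audit.csv")) {
                found.add(child);
            }
        }
        return found;
    }

    public static void main(String[] args) throws IOException {
        // Demo against a temporary directory standing in for shared-logs.
        File root = Files.createTempDirectory("shared-logs").toFile();
        new File(root, "login-audit.csv").createNewFile();
        new File(root, "login-audit.csv.2024-01-15").createNewFile();
        new File(root, "perflog.csv").createNewFile();
        System.out.println(findLoginAuditFiles(root).size()); // 2
    }
}
```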

  • I don't see the above code in the plugin source when I download it from the App Market. Is there an updated version available to download anywhere?

    Below is what I see in com.appiancorp.loginparser-1.0.2.jar, com.appiancorp.loginparser.ImportLoginInformation.java:



         String logsPath = ConfigurationLoader.getConfiguration().getAeLogs();
         String dateString = "";

         if (date != null) {
             DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
             dateString = "." + dateFormat.format(date);
         }

         LOG.debug("Reading file: " + logsPath + "/login-audit.csv" + dateString);
         processLogins(logsPath + "/login-audit.csv" + dateString);

  • Oh, good point. I was looking at a completely different plug-in; I apologize.

    No, the existing version does not work with the shared-logs functionality. It would have to be updated in a similar fashion to the code I posted above.

    Components ship with their source and are available for extension or modification by the participants of this community.
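One way such an update could look: prefer the shared-logs folder when it exists and fall back to the regular logs folder otherwise. This is only a sketch under the assumptions in the snippets quoted earlier in the thread (the `shared-logs` folder name next to the logs directory); it is not the shipped plugin code.

```java
import java.io.File;

// Hypothetical sketch of an HA-aware update: resolve the directory to
// read login-audit.csv* files from, preferring <logs parent>/shared-logs
// when present. Folder name follows the snippets earlier in the thread.
public class AuditPathResolver {
    static final String SHARED_LOGS_FOLDERNAME = "shared-logs";

    public static File resolveLogsDir(String logsPath) {
        File logs = new File(logsPath);
        File sharedLogs = new File(logs.getParent(), SHARED_LOGS_FOLDERNAME);
        // In an HA setup, shared-logs holds the merged logs from all
        // instances, so prefer it whenever it exists.
        return sharedLogs.exists() ? sharedLogs : logs;
    }

    public static void main(String[] args) {
        // Falls back to the plain logs folder unless shared-logs exists.
        System.out.println(resolveLogsDir("/usr/local/appian/logs"));
    }
}
```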

  • Question: login-audit.csv contains information about the browser and device used to log in. Does anyone know why this information was not added to the plugin to insert into the logins table?

  • Has anyone used this plug-in with IBM DB2? Did it work successfully?