Login Information Parser

Overview

The Smart Service, Import Login Audit, reads your login audit file(s) and populates a database with information about logins. It is provided mainly as an example (source code included), as you may need to modify the smart service to make it work in your environment.

  • Requires a table named "logins" in the specified datasource; the example SQL is provided (an illustrative sketch follows this list).
  • Inputs:
    • datasourceName: The datasource where the login information will be populated
    • date: The date suffix of the login-audit.csv.* file to be read. If null, the latest file is read.
  • Supports MySQL, Oracle and Microsoft SQL Server databases
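
The example SQL defines the exact schema; purely as an illustration (the column names login_time, username, and ip_address below are assumptions, not the shipped schema), the smart service conceptually issues inserts of this kind against the "logins" table in the datasource given by datasourceName:

    // Illustrative sketch only -- the real column names come from the example SQL
    // shipped with the plug-in; login_time, username and ip_address are assumptions.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class LoginInsertSketch {
      public void insertLogin(String datasourceName, Timestamp loginTime,
                              String username, String ipAddress) throws Exception {
        // Look up the datasource named by the "datasourceName" input via JNDI.
        DataSource ds = (DataSource) new InitialContext().lookup(datasourceName);
        try (Connection con = ds.getConnection();
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO logins (login_time, username, ip_address) VALUES (?, ?, ?)")) {
          ps.setTimestamp(1, loginTime);
          ps.setString(2, username);
          ps.setString(3, ipAddress);
          ps.executeUpdate();
        }
      }
    }
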
Comments
  • I'd still like to fix the NPE by simply ignoring GZ files and reporting back to the user in the logs. Can you post the full stack trace showing the line that produces the NPE?
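
    A minimal sketch of that approach (the names logsPath, LOG and processLogins mirror the snippet posted further down in this thread; where exactly this would hook into the plug-in is an assumption):

         // Sketch: ignore compressed audit files and report them in the logs
         // instead of failing with an NPE. Integration point is an assumption.
         File[] files = new File(logsPath).listFiles();
         if (files != null) {
           for (File file : files) {
             if (file.getName().endsWith(".gz")) {
               LOG.warn("Ignoring compressed audit file: " + file.getName());
             } else if (file.getName().startsWith("login-audit.csv")) {
               processLogins(file.getAbsolutePath());
             }
           }
         }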

  • A plug-in that extracts files like that is too invasive. What if the unzipping puts the system at risk by making it run out of disk space? That is too much responsibility. I'd say that when you do the auditing, you should unzip beforehand as a conscious decision, taking the available disk space into account.

  • Hi,

    Hope you're doing well!

    We observed the issues below with the plug-in.
    1. When we pass the date as a rule input to the smart service and the log file does not exist for that date, the smart service throws a java.lang.NullPointerException. Is there any chance to handle this in the Java code?
    2. The second is the issue mentioned by the other commenter.

    Can you please help out with this?
    Thanks!!
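
    One way to handle issue 1 in the Java code could be a guard like the sketch below (variable names mirror the snippet posted further down in this thread; this is not the plug-in's actual fix):

         // Sketch: if no login-audit file exists for the requested date,
         // log a warning and skip processing instead of throwing an NPE.
         File auditFile = new File(logsPath + "/login-audit.csv" + dateString);
         if (auditFile.exists()) {
           processLogins(auditFile.getAbsolutePath());
         } else {
           LOG.warn("No login audit file found for the requested date: "
               + auditFile.getAbsolutePath());
         }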

  • Hi,

    Hope you're doing well!

    I could parse the last month of files from the system logs, named like "login-audit.csv.2019-10-23"; however, the files before that are compressed and named like "login-audit.csv.2019-09-05.gz", and the plug-in throws a NullPointerException on them.

    Can you update the plug-in so that it checks for files with this extension and, if they exist, extracts them and reads the file?
    login-audit.csv.2019-09-05.gz > login-audit.csv.2019-09-05

    Or is there any other plug-in that could help me here?

    Thanks in advance
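
    In line with the earlier suggestion to decompress beforehand rather than inside the plug-in, a minimal sketch of expanding such a file with standard Java (the file paths are examples only):

         import java.io.File;
         import java.io.FileInputStream;
         import java.nio.file.Files;
         import java.nio.file.StandardCopyOption;
         import java.util.zip.GZIPInputStream;

         public class GunzipAuditFile {
           public static void main(String[] args) throws Exception {
             // Example paths only -- point these at the actual logs location.
             File gz = new File("login-audit.csv.2019-09-05.gz");
             File out = new File("login-audit.csv.2019-09-05");
             try (GZIPInputStream in = new GZIPInputStream(new FileInputStream(gz))) {
               // Write the decompressed bytes next to the original .gz file.
               Files.copy(in, out.toPath(), StandardCopyOption.REPLACE_EXISTING);
             }
           }
         }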

  • Has anyone used this plug-in with IBM DB2? Did it work successfully?

  • Question: the login-audit.csv has information regarding the browser and device used to log in. Does anyone know why this information was not added to the plug-in to insert into the logins table?

  • Oh! Good point. I was looking at a completely different plug-in. I apologize.

    No, the existing version does not work with the shared logs functionality. It'll have to be updated in a similar fashion (similar to the code I posted above).

    Components come with the source code and are available for extension or modification by members of this community.

  • I don't see the above code in the plug-in source when I download it from the App Market. Is the updated version available anywhere to download?

    Below is what I see in com.appiancorp.loginparser-1.0.2.jar - com.appiancorp.loginparser.ImportLoginInformation.java

    Code:

    ------------

     String logsPath = ConfigurationLoader.getConfiguration().getAeLogs();
     String dateString = "";
     if (date != null) {
       DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
       dateString = "." + dateFormat.format(date);
     }
     LOG.debug("Reading file: " + logsPath + "/login-audit.csv" + dateString);
     processLogins(logsPath + "/login-audit.csv" + dateString);

  • As explained in docs.appian.com/.../High_Availability_and_Distributed_Installations.html, the use of a shared-logs folder is the recommendation in an HA set-up.

    When you go to the logs of your cloud site, you should see that the logs are read from the shared-logs location.

    Looking at the code, it seems designed to look for files there, which suggests it should work.

         String logsPath = ConfigurationLoader.getConfiguration().getAeLogs();
         String logParentPath = new File(logsPath).getParent();
         String sharedLogsPath = logParentPath + File.separator + SHARED_LOGS_FOLDERNAME;
         if (new File(sharedLogsPath).exists()) {
           List<File> allSharedLoginAuditFiles = listf(sharedLogsPath);
           if (allSharedLoginAuditFiles.size() > 0) {
             for (File file : allSharedLoginAuditFiles) {
               processLogins(file.getAbsolutePath());
             }
           }
         }

    I encourage you to review the login-audit.csv files in the logs folder (shared-logs for HA) and check whether they contain all logins; if so, the plug-in will work, because it reads them from shared-logs.
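
    The listf helper used above is not shown in the snippet; a minimal sketch of what such a recursive lister might look like (the filter on the file name is an assumption) is:

         // Sketch of a recursive helper like listf: collects login-audit files
         // under the given directory. Requires java.io.File, java.util.ArrayList,
         // java.util.List. The name filter is an assumption.
         private static List<File> listf(String directoryName) {
           List<File> result = new ArrayList<>();
           File[] entries = new File(directoryName).listFiles();
           if (entries == null) {
             return result;
           }
           for (File entry : entries) {
             if (entry.isDirectory()) {
               result.addAll(listf(entry.getAbsolutePath()));
             } else if (entry.getName().startsWith("login-audit.csv")) {
               result.add(entry);
             }
           }
           return result;
         }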

  • Hi,

    We are currently on a High Availability (HA) setup and want to get a full picture of all the logins on the system.

    I came across this plug-in recently. Do you know whether this plug-in caters for an HA multi-instance setup where there may be one login-audit file per instance?

    Thanks,

    Vishal