Tracking User login

Certified Lead Developer

Hi Everyone,

Can we identify users who have not logged in within a specified time after their accounts were created?


  • What is your use case?

    I have not found a simple/efficient way to do this OOTB. You could compare the user list with the login audit log, but it seems cumbersome to me.
    docs.appian.com/.../Logging.html

    You can configure Appian to deactivate these users, but only if you are using Appian Authentication, which is not common for a Production environment in my experience.
    docs.appian.com/.../Appian_Administration_Console.html

    Perhaps others have ideas on this...

  • Through a combination of the Log Reader plugin and a custom Appian application, you can establish which users have not signed in for XX days and then deactivate those accounts. You could also send them a courtesy message first, e.g. 'You have not signed in for 20 days. If you do not sign in within the next 5 days, your account will be deactivated.' You could run the controlling process on a schedule (e.g. every night) to detect which accounts have not signed in and need notifying, and which can be deactivated.
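
    The notify-then-deactivate logic above boils down to comparing each user's last login against two thresholds. Here is a minimal sketch in Python; the 20/25-day thresholds are just the example values from this reply, and in practice this decision would live in an Appian process model rather than a script:

    ```python
    from datetime import date

    # Example thresholds from the reply above: warn after 20 idle days,
    # deactivate 5 days later (25 idle days total). Adjust as needed.
    NOTIFY_AFTER_DAYS = 20
    DEACTIVATE_AFTER_DAYS = 25

    def account_action(last_login: date, today: date) -> str:
        """Decide what the nightly job should do for one account."""
        idle_days = (today - last_login).days
        if idle_days >= DEACTIVATE_AFTER_DAYS:
            return "deactivate"
        if idle_days >= NOTIFY_AFTER_DAYS:
            return "notify"
        return "none"
    ```

    Running this once per user per night gives each account at most one notification before deactivation.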

  • Hi,

    I have a few questions on the same topic.

    1) The Log Reader plugin function (readcsvlogpagingwithheaders) reads only one log file at a time. To check whether a user has logged in to the Appian system in the last 90 days, do we need to run the function for 90 iterations? Is there any other option here?

    2) The documentation says login-audit.csv is rotated daily. Where can I find the exact rotation time, so that I can schedule my process to run after it? (https://docs.appian.com/suite/help/19.3/Logging.html#managing-log-files)

    3) In my Appian Cloud logs I can see only the last 30 days of files, named login-audit.csv.yyyy-mm-dd; older files have a .gz suffix appended (e.g. login-audit.csv.2019-09-22.gz). What are these files? Are they archived? The function output for such a file is:

    CsvContents logName: "/usr/local/appian/ae/logs//login-audit.csv_2019-09-21"

    totalCount: -1

    rows: null (List of Text String)

    headers: null (List of Text String)

    4) Is there any link to the archival settings for login-audit.csv?

     

  • We have implemented something similar by reading yesterday's log file on a daily basis.

    We have a timed process that runs around 5 am, looks explicitly at yesterday's login audit file, and then updates a DB table with when each user last logged in.

    As the process runs daily, we do not have to worry about the archiving of login files, which makes for a much easier solution.

    Beware though: if you have a High Availability Cloud solution, the log reader functions only have access to the files on the server the process model is running on, so you may not have the complete picture of login information.
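
    The daily step described above, parsing yesterday's login audit file and keeping each user's most recent login, can be sketched as follows. This is illustrative Python rather than the Appian process model itself, and the column names ("Timestamp", "Username") are assumptions; check the actual header row of your login-audit.csv, which may differ by version:

    ```python
    import csv
    import io

    # Assumed column names -- verify against your login-audit.csv header.
    def last_logins(csv_text: str) -> dict:
        """Return the latest login timestamp seen per user in one day's file."""
        latest = {}
        for row in csv.DictReader(io.StringIO(csv_text)):
            user = row["Username"]
            ts = row["Timestamp"]
            # ISO-style timestamps compare correctly as strings.
            if user not in latest or ts > latest[user]:
                latest[user] = ts
        return latest
    ```

    The nightly run would then upsert these values into a table such as a hypothetical user_last_login(username, last_login), e.g. via INSERT ... ON DUPLICATE KEY UPDATE.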

  • Thank you for the response!

    Our requirement is to show the users who haven't logged in to the Appian system for the past 150 days, so if I use the Log Reader plugin function I would have to loop 150 times on every interface hit, which will affect performance.

    So I chose the Login Information Parser plugin, which stores the data in the DB. However, this needs MNI over the last 150 days, at least for the first execution in an environment, and only the last month of login-audit files are unzipped; the older ones are in .gz format, which neither the Log Reader nor the Login Information Parser plugin can read. So I am looking for another plugin that extracts zipped files from the system logs and then parses the file inside.
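
    On the archived files specifically: if the .gz logs can be copied somewhere a script can reach, no special plugin is needed to read them, since gzip is a standard format. A hedged sketch in Python (the path and any column names are hypothetical):

    ```python
    import csv
    import gzip

    # Sketch: stream an archived login-audit.csv.<date>.gz file directly,
    # without a separate unzip step. Path is illustrative.
    def read_gzipped_log(path: str) -> list:
        """Parse one gzipped CSV log file into a list of dict rows."""
        with gzip.open(path, mode="rt", encoding="utf-8", newline="") as fh:
            return list(csv.DictReader(fh))
    ```

    Within Appian itself this would still require a custom plugin or an out-of-band job, since the built-in log reader functions do not transparently decompress archives.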

  • It sounds like you need a different option rather than trying to calculate the last login dynamically. If you use the approach where a database table is updated with the datetime each user last logged in (the daily process that reads yesterday's logins, as mentioned above), you can then easily display, in a report or a record, those users who meet your "not logged in for the past X days" criterion, since the DB already contains the relevant date. Far more performant for real-time queries.
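
    Once such a table is in place, the "not logged in for X days" report reduces to a single date-filtered query. A sketch against SQLite, where the user_last_login table and its column names are assumptions matching the daily-update approach described above:

    ```python
    import sqlite3
    from datetime import date, timedelta

    # Hypothetical table: user_last_login(username TEXT, last_login TEXT),
    # with last_login kept up to date by the nightly job.
    def inactive_users(conn: sqlite3.Connection, days: int, today: date) -> list:
        """Return usernames whose last login is more than `days` days ago."""
        cutoff = (today - timedelta(days=days)).isoformat()
        rows = conn.execute(
            "SELECT username FROM user_last_login "
            "WHERE last_login < ? ORDER BY username",
            (cutoff,),
        ).fetchall()
        return [r[0] for r in rows]
    ```

    In Appian the equivalent would be a query entity or record filter on the same column, so the 150-day report never touches the log files at all.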

  • Yes, that is already implemented; we are fine writing to the DB. The issue is with files older than 30 days, which are zipped and which we are unable to parse.

  • 0
    Certified Lead Developer
    in reply to spandanat

    You should be running this utility nightly (as mentioned by ) and storing the relevant data per user in your own database at that point. There should be no need to look at the archived log files when this is done correctly.