Is there any way to achieve the scenario below, other than log streaming from Appian?
Currently our PROD environment has 3 servers, and users are routed to different servers based on load.
We would like to get the login-audit.csv file from all 3 servers. Currently, with the readcsvlog function, we are able to get data only from the server where the function is run.
Any suggestions are appreciated.
Can't you see user account info in the Admin Console? I imagine this would be across all systems. However, if you want to do something with the data, good luck.
Thanks for the response, Jacobe. We would like to export the data into an Excel file on a daily basis from all three servers.
I would recommend a cron job on the servers that dumps the data where you need it. This data is not usefully exposed at runtime in Appian, so having an Appian process handle it is not ideal. Log streaming would also work, or Splunk to monitor the files into a single log server, which can then create a daily report for you.
I can describe what we are doing: we have a cron job running on the servers that copies the files to one destination, where an Appian process can read them and write the data to a database. We have designed a small Appian app where you can apply different filters to the data and display it in a dashboard. Doing so, we have total transparency about logins (failed and successful) and usage of user agents. We even track our licenses (unique logins) with this app.
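For illustration, the server-side piece can be as small as this. A minimal sketch only, assuming GNU date and hypothetical paths; adjust everything to your own installation:

```
#!/bin/bash
# Minimal sketch of the nightly copy job described above; paths are
# hypothetical and "date -d" assumes GNU date. Example crontab entry:
#   5 0 * * * /opt/scripts/copy-login-audit.sh
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
SRC=/usr/local/appian/ae/logs        # where Appian writes its logs (assumed)
DEST=/mnt/shared/appian-logs         # destination the Appian process reads (assumed)
cp "$SRC/login-audit.csv.$YESTERDAY" "$DEST/$(hostname)-login-audit.csv"
```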
I have a process that uses the "read log" plug-in function. I found in our distributed prod environment that it had roughly a random chance of pulling the logs from any one of the 3 distributed log directories when I ran it several times in a row.
So I made a separate process, with a "Start Process" smart service node calling this subprocess (and passing its data back to the parent). I then set this node to run with MNI 10 times, "run one at a time" (important). All the data gets passed back to the parent process and then deduplicated (a sketch of the dedup step is below), and I have proven that this will pull from all 3 login-audit.csv files when run a sufficient number of times. Even then, this information should be considered "approximate".
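For the deduplication step, a rough sketch in expression form; pv!mergedRows and its field names are hypothetical placeholders for the combined MNI output:

```
a!localVariables(
  /* pv!mergedRows is a hypothetical process variable holding the
     combined output of all 10 MNI runs; field names are assumptions */
  local!keys: pv!mergedRows.username & "|" & pv!mergedRows.loginTime,
  /* union() of a list with itself drops duplicate values */
  local!uniqueKeys: union(local!keys, local!keys),
  /* keep the first row matching each unique username|timestamp key */
  index(
    pv!mergedRows,
    a!forEach(
      items: local!uniqueKeys,
      expression: wherecontains(fv!item, local!keys)[1]
    ),
    {}
  )
)
```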
Also keep in mind that if the environment has fairly few logins for the current day, the login-audit.csv file in some of the distributed directories might still contain a previous day's data. So whatever you implement will need to account for this possibility as well.
Logins for the current day: yes, correct. To keep this simple, our cron job copies only the previous day's files.
Thanks, will try it out. Can you suggest how we should pull the previous day's log? Currently I'm using the code below and it is working fine:
sqlStatement: "SELECT * FROM login-audit",
sqlStatement: "SELECT * FROM login-audit.csv.2020-03-16",
I'm using the "Log Reader" plug-in, which handles it slightly differently and, IMHO, slightly better. With that, instead of writing SQL statements, you can just pull the contents of a named log file.
I'm not entirely sure what you mean by "suggest how we need to pull" - can you clarify?
Read the prior day's log instead of the current day's.
In general you just include the date in the filename like you've done above. But for an environment with distributed logs, your engine 1 might have a file matching yesterday's date, while engine 2 still has yesterday's logins stored in the general "login-audit.csv".
So I set up something in my subprocess with one extra level of complexity: I have it query the named log file for yesterday, then I check if the results were blank, and if so, I query the general login-audit file.
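Something like this, as a rough sketch in expression form. It assumes the readcsvlog() call and sqlStatement parameter shown earlier in this thread, and that an empty dated file comes back as a zero-length result (check your plug-in's actual return type):

```
a!localVariables(
  local!yesterday: text(today() - 1, "yyyy-mm-dd"),
  /* try the dated file first, e.g. login-audit.csv.2020-03-16 */
  local!dated: readcsvlog(
    sqlStatement: "SELECT * FROM login-audit.csv." & local!yesterday
  ),
  if(
    /* assumption: an empty or missing dated file returns an empty result */
    length(local!dated) = 0,
    /* fall back to the general file, which may still hold yesterday's logins */
    readcsvlog(sqlStatement: "SELECT * FROM login-audit"),
    local!dated
  )
)
```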
In my case, after deduplication and all, I write these to my own database table. But because duplicate entries could still be written, I then take the further step of querying the existing table for matching entries and weeding out ones that have already been written (i.e. entries with the same username and login timestamp).
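The existence check could look roughly like this; the entity constant, local!row, and field names are hypothetical, so adjust them to your data model:

```
/* Hypothetical entity constant and field names; local!row is one
   candidate log entry about to be written */
a!queryEntity(
  entity: cons!LOGIN_AUDIT_ENTITY,
  query: a!query(
    logicalExpression: a!queryLogicalExpression(
      operator: "AND",
      filters: {
        a!queryFilter(field: "username",  operator: "=", value: local!row.username),
        a!queryFilter(field: "loginTime", operator: "=", value: local!row.loginTime)
      }
    ),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1)
  )
).totalCount
```

If totalCount comes back as 0, the entry has not been written yet and is safe to insert.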
We are using the Log Reader plug-in too. Our cron job just copies the previous day's login CSV over to the Appian log folder. Appian renames the log files every day, as explained here: https://docs.appian.com/suite/help/19.4/Logging.html#managing-log-files
If you're using an on-prem install where you can just copy the log file, though, I don't believe you will have the same issues with multiple distributed servers as mentioned in the original post. If you do, and you've found a way around this, I'd be curious to hear what technique you used.
We deal with the issue of multiple servers in our on-premise installation. The technique we are using is simple: our cron job copies the *.csv files from the servers and renames them. Because we know how many servers we have, we just read the appropriate *.csv files from the Appian/log directory based on the possible names (e.g. server1-login.csv, server2-login.csv). Caveat: yes, we would have to change the procedure if we added another application server, but that doesn't happen very often.
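In script form, that copy-and-rename step might look something like this; server names and paths are hypothetical, and the loop is what you would extend if another application server were added:

```
#!/bin/bash
# Sketch of the per-server copy-and-rename step described above;
# server names and paths are hypothetical.
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
DEST=/usr/local/appian/ae/logs   # directory the Log Reader plug-in reads (assumed)
for SERVER in server1 server2 server3; do
  scp "$SERVER:/usr/local/appian/ae/logs/login-audit.csv.$YESTERDAY" \
      "$DEST/$SERVER-login.csv"
done
```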
Gotcha. Unfortunately, for cloud installations (which is my use case, even if it wasn't mentioned in the original post), I don't believe an external cron procedure is feasible, as no utilities that I know of are granted direct visibility into the inner workings of the server, including log files.