Hi,
Is there any way to achieve the scenario below other than log streaming from Appian?
Currently our PROD environment has 3 servers, and users are routed to different servers based on load.
We would like to get the login-audit.csv file from all 3 servers. Currently, with the readcsvlog function we are only able to get data from the server where the function is run.
Any suggestions are appreciated.
Thanks.
David Pullen
Can't you see user account info in the Admin Console? I imagine this would span all systems. However, if you want to do something with the data, good luck.
Thanks for the response, Jacobe. We would like to export the data to an Excel file on a daily basis from all three servers.
I would recommend, then, a cron job on the servers that dumps the data where you need it. This data is not usefully exposed at runtime in Appian, so having an Appian process handle it is not ideal. Log streaming would also work, or Splunk to monitor the files into a single log server, which can then create a daily report for you.
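The cron approach above can be sketched roughly as follows. This is a minimal illustration, not a tested deployment: the log and destination paths, the cron schedule, and the file-naming scheme are all assumptions (here they are stand-ins created with mktemp so the sketch runs on its own).

```shell
#!/bin/sh
# Sketch of the per-server copy step a cron job would run, e.g.
#   15 0 * * * /opt/scripts/ship-login-audit.sh   (schedule is illustrative)
# In production, LOG_DIR would be the Appian log directory on each server
# and DEST a shared location all three servers can write to.
LOG_DIR="$(mktemp -d)"   # stand-in for the server's Appian log directory
DEST="$(mktemp -d)"      # stand-in for the shared destination
echo "Timestamp,Username,User-Agent" > "$LOG_DIR/login-audit.csv"  # sample row

HOST="$(hostname -s)"
# Tag each copy with the hostname so the three servers' files don't collide.
cp "$LOG_DIR/login-audit.csv" "$DEST/login-audit-$HOST.csv"
ls "$DEST"
```

With one tagged file per server landing in the same destination, a downstream process (Appian or otherwise) only has to read a single directory.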
I can describe what we are doing: we have a cron job running on the servers that copies the Excel files to a single destination, where an Appian process can read them and write the data to a database. We have designed a small Appian app where you can apply different filters to the data and display it in a dashboard. This gives us total transparency about logins (failed and successful) and user-agent usage. We even track our licenses (unique logins) with this app.
I have a process that uses the "read log" plug-in function. I found in our distributed prod environment that it had roughly a random chance of pulling the logs from one of the 3 distributed log directories when I ran it several times in a row.
So I made a separate process, with a "Start Process" smart service node calling this subprocess (and passing its data back to the parent). I then set this node to run with MNI 10 times, "run one at a time" (important). All the data gets passed back to the parent process and is then deduplicated - and I have verified that this pulls from all 3 login-audit.csv files when run a sufficient number of times. Even then, this information should be considered approximate.
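The deduplication step above can be illustrated outside Appian with plain shell. This is a sketch under assumptions: the three "pull" files stand in for the results of the 10 MNI runs, and the rows are made-up sample data.

```shell
#!/bin/sh
# Combine rows pulled from multiple runs and keep each login event once.
WORK="$(mktemp -d)"
# Sample pulls - note the overlapping rows between files.
printf '2020-03-16 09:00,alice\n2020-03-16 09:05,bob\n' > "$WORK/pull1.csv"
printf '2020-03-16 09:05,bob\n2020-03-16 09:10,carol\n' > "$WORK/pull2.csv"
printf '2020-03-16 09:00,alice\n' > "$WORK/pull3.csv"

# sort -u collapses identical rows regardless of which pull produced them.
cat "$WORK"/pull*.csv | sort -u > "$WORK/combined.csv"
cat "$WORK/combined.csv"   # 3 unique rows remain
```

In the Appian version, the same idea applies: union the arrays returned by the subprocess runs and remove duplicate rows before reporting.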
Also keep in mind that if the environment has relatively few logins for the current day, the login-audit.csv file in some of the distributed directories might still contain data from a previous day. Whatever you implement will need to account for this possibility as well.
Logins for the current day - yes, correct. To simplify this, our cron job copies only the files from the previous day.
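Copying only the previous day's rolled file can be sketched like this. It assumes the rolled logs follow the login-audit.csv.YYYY-MM-DD naming seen later in this thread; the directories are mktemp stand-ins so the sketch is self-contained, and the date fallback covers both GNU and BSD date.

```shell
#!/bin/sh
# Copy only the rolled log for the previous day (paths are illustrative).
LOG_DIR="$(mktemp -d)"   # stand-in for the Appian log directory
DEST="$(mktemp -d)"      # stand-in for the shared destination

# Yesterday's date as YYYY-MM-DD; GNU date first, BSD date as fallback.
YESTERDAY="$(date -d yesterday +%F 2>/dev/null || date -v-1d +%F)"

# Simulate the current and rolled log files.
touch "$LOG_DIR/login-audit.csv" "$LOG_DIR/login-audit.csv.$YESTERDAY"

# Ship only the completed, previous-day file.
cp "$LOG_DIR/login-audit.csv.$YESTERDAY" "$DEST/"
ls "$DEST"
```

Because the rolled file is complete and never written to again, copying it avoids the stale/partial current-day issue mentioned above.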
Thanks, will try it out. Can you suggest how we can pull the previous day's log? Currently I'm using the code below and it is working fine.
queryappianlogs( sqlStatement: "SELECT * FROM login-audit", hasHeader: false() ).data
queryappianlogs( sqlStatement: "SELECT * FROM login-audit.csv.2020-03-16", hasHeader: false() ).data
I'm using the "Log Reader" plug-in, which handles it slightly differently and, IMHO, slightly better. With that, instead of writing SQL statements you can just pull the contents of a named log file.
I'm not entirely sure what you mean by "suggest how we need to pull" - can you clarify?
Read the prior day's log instead of the current day's.