Hi paulc919, did you find a solution for this?
Did you manage this, jeffh0004?
How does the plugin determine which directory it reads the log files from? In 3 of our environments, it reads the log file from:
"/usr/local/appian/ae/logs/login-audit.csv"
In one of our environments, it's trying to read the log file from:
"/usr/local/appian/ae/shared-logs/login-audit.csv"
In this last environment, with the different path, we get no results back.
Is there a way to change this?
Mark Talbot Somnath Dey
I'm interested in knowing if you were able to achieve this, jeffh0004.
For a 3-node system, can I specify which server it reads from?
Thanks Mike. An additional column was the issue; there was an extra attribute after the upgrade.
Download the log file in question and look back through the historical data (assuming it might be one that spans multiple days) and check whether the number of columns has increased since the 20.4 update. As I mentioned in some older posts here, when the number of columns is inconsistent, the function fails (a quick way to check this is sketched below). As Mark Talbot mentioned, you can try the new "tail" functions, which should read the most recent row(s) from the log file and bypass this issue.
If the log file you're trying to read is the type that's rolled over on a daily basis, on the other hand, I'd still start out by downloading one of the current files, but this time maybe just check that its columns are still the ones you're trying to reference in your query. I'm not clear which log file you're looking at since I don't know what's in your cons!SC_AUDIT_FILEPATHS constant.
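To make that column check concrete, here's a minimal sketch (the header list below is copied from the snippet elsewhere in this thread): count the headers you're passing and compare the result against the number of comma-separated values in a data row of the downloaded file. If the file gained an extra column in 20.4, the two numbers won't match.

length(
  {
    "loggedInTime",
    "loggedInUser",
    "attempt",
    "ipAddress",
    "source",
    "agent"
  }
)

This returns 6; if the downloaded rows now contain 7 values, that extra attribute would be what's breaking the call.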
Hi All,
After upgrading our on-premise environment to version 20.4, readcsvlogpagingwithheaders() is not working as expected.
Whenever we pass the headers attribute along with the filter options, the output is null. Below is the code snippet:
readcsvlogpagingwithheaders(
  csvPath: cons!SC_AUDIT_FILEPATHS[1] & if(
    local!date = today(),
    null,
    "." & datetext( local!date, "yyyy-MM-dd" )
  ),
  startIndex: 1,
  batchSize: -1,
  headers: {
    "loggedInTime",
    "loggedInUser",
    "attempt",
    "ipAddress",
    "source",
    "agent"
  },
  filterColumName: "source",
  filterOperator: "=",
  filterValue: "Portal",
  timestampColumnName: null,
  timestampStart: null,
  timestampEnd: null
)
Any suggestions to resolve this issue?
Thanks.
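One way to isolate this is to rerun the same call with the filter arguments nulled out, the way the timestamp arguments already are. This is only a sketch - I'm assuming the function accepts null for the filter parameters, which the snippet above doesn't demonstrate. If the result is still null with no filter applied, the headers list itself (for example, its length no longer matching the file's column count) is a more likely culprit than the filter:

readcsvlogpagingwithheaders(
  csvPath: cons!SC_AUDIT_FILEPATHS[1],
  startIndex: 1,
  batchSize: -1,
  headers: {
    "loggedInTime",
    "loggedInUser",
    "attempt",
    "ipAddress",
    "source",
    "agent"
  },
  /* Assumption: null disables filtering here, mirroring the null
     timestamp arguments in the original call */
  filterColumName: null,
  filterOperator: null,
  filterValue: null,
  timestampColumnName: null,
  timestampStart: null,
  timestampEnd: null
)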
As an aside, it's majorly frustrating and confusing that Community fails to stack the in-thread replies in any comprehensible order here.
Mark Talbot that sounds good - for clarification, does your new function assume the CSV text row will be "quote escaped" as it is in the original CSV file, or with quotes stripped as returned by the current "readCsvLog" functions?
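To illustrate the distinction with a made-up row (these values are hypothetical, not from a real login-audit.csv):

"2021-03-01 10:15:00","jdoe","1","10.0.0.1","Portal","Mozilla/5.0"  <- quote-escaped, as stored in the file
2021-03-01 10:15:00,jdoe,1,10.0.0.1,Portal,Mozilla/5.0  <- quotes stripped, as the current "readCsvLog" functions return it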
Also, do you mean the function has been added to this plug-in (whenever the update is published, at least)?