Overview
Plug-in Notes:
Key Features & Functionality
The Appian Log Reader Plug-in contains functionality to:
Michael Chirlin - are there any updates as to who's in charge of maintenance on this plug-in? Is it being actively maintained, and is there any chance that any of the issues or inconsistencies I've enumerated will be addressed in the foreseeable future?
Checking back in Mark Talbot / Somnath Dey, I can confirm that "csvToTextArray" fails to correctly parse CSV rows as returned by the original "readCsvLogPaging" function, which (as I previously noted quite a while ago) incorrectly strips quote escaping from CSV text containing a comma, causing one cell's worth of data to be treated as two cells whenever it contains a comma.
As I noted elsewhere, that seems to have been fixed in the "tailCsvLogPaging" function (i.e. passing a row returned from that function through "csvToTextArray" yields the expected number of fields, even when a cell includes a comma). However, there still seem to be unresolved issues even with that one: for one, the "headers" value seems to *only* return blank, and additionally, with no "start index" (as I noted a year or so ago), I'm unable to execute my use case of building a grid and allowing users to page through it (starting, of course, with most-recent-first).
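The comma-splitting behavior described above can be illustrated with a minimal sketch in Appian expression syntax. The sample row text and the field counts in the comments are hypothetical illustrations of the reported behavior, not output captured from an actual environment:

```
/* Hypothetical row whose third cell contains an embedded comma.
   Per standard CSV quoting rules, the quoted cell should remain
   a single field. */
a!localVariables(
  local!row: "2023-01-01 00:00:00,ERROR,""Failed, retrying""",
  /* If the row text preserves its quoting (as reported for
     tailCsvLogPaging), csvToTextArray should return 3 fields.
     If the quote escaping was stripped (as reported for
     readCsvLogPaging), the third cell splits and 4 fields
     come back instead. */
  csvtotextarray(local!row)
)
```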
Any chance of some new changes to harden the behavior a bit and get these functions acting consistently? I have an Admin tool that queries error messages from the log, but I have to bend over backwards to parse the rows carefully enough that it doesn't blow up on me, and it seems like every other week I need to add yet another heuristic to handle a comma appearing in a data row in a new way. It's not really scalable.
This has apparently been addressed in a more recent reply. In my environment, though, I'm unable to test or confirm one way or the other.
This is not working for me. When I specify the server, it always appends the path to "/usr/local/appian/ae/logs" and tries to read the file from "/usr/local/appian/ae/logs/mysite-3/authz-audit.csv". As there is no such path, we are not getting the expected output. Can anyone please let us know if anyone has succeeded in reading a log file using this plugin in a distributed environment?
It does not package log4j within it; no update is necessary.
On HA environments you need to specify the node name before the file name. Below is an example using Cloud node naming conventions:
first server:
readcsvlogpaging( csvPath: "mysite/authz-audit.csv", startIndex: 1, batchSize: 10)
second server:
readcsvlogpaging( csvPath: "mysite-2/authz-audit.csv", startIndex: 1, batchSize: 10)
third server:
readcsvlogpaging( csvPath: "mysite-3/authz-audit.csv", startIndex: 1, batchSize: 10)
Mark Talbot, kindly help look into this issue and advise.
The plugin version 2.1.2 (function tailcsvlogpagingwithheaders) is working fine in the test environment, while the same version gives an error in production.
Can someone please advise why the same version behaves differently in the two environments?
The description mentions that this plugin can "Serve as the source for a service-backed record for viewing logs"; however, when I attempted to configure a service-backed record type for viewing a log, I received the following error: "The Record Data Source cannot use plugins, query rules, or any of the following functions: query, queryEntity, queryProcessAnalytics, queryRecord, queryRecordType, executeStoredProcedureForQuery, executeStoredProcedureOnSave."
Also, is there documentation on how to use the various functions included in this plugin? I couldn't find any included in the download. Thanks!