Overview
Plug-in Notes:
Key Features & Functionality
The Appian Log Reader Plug-in contains functionality to:
Mark Talbot - I was figuring a start index would work the same way it does normally, except of course that it implies positions from the end of the file instead of positions from the start of the file.
My use case is that I want to create a paging grid of process errors, showing the most recent ones first and otherwise page-able like normal. Without a relative start index in the function, I have no direct way of doing this, other than increasing my batch size in increments of X and then artificially trimming the resulting query down to my desired number. That seems like an unnecessary pain when the function could just include the ability to pass a "start [from the end] index".
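The over-fetch-and-trim workaround described above can be sketched as index arithmetic. This is a hypothetical illustration (the function name `page_from_end` and the in-memory list of lines are assumptions for the sketch, not part of the plug-in's API): to show a page of the newest rows, fetch enough lines from the end of the log to cover that page, reverse them, and slice.

```python
def page_from_end(lines, start_index, page_size):
    """Return `page_size` rows counting back from the end of `lines`,
    after skipping the most recent `start_index` rows."""
    needed = start_index + page_size
    # Over-fetch: grab everything from the end up to the page we need
    # (this mirrors "increasing my batch size by X increments").
    batch = lines[-needed:] if needed <= len(lines) else lines[:]
    # Newest first, then trim to the requested page.
    newest_first = list(reversed(batch))
    return newest_first[start_index:start_index + page_size]

log = [f"row {i}" for i in range(1, 11)]  # row 1 oldest, row 10 newest
print(page_from_end(log, 0, 3))  # first page: ['row 10', 'row 9', 'row 8']
print(page_from_end(log, 3, 3))  # second page: ['row 7', 'row 6', 'row 5']
```

A native "start from the end" index in the plug-in would collapse this into a single call, with no client-side trimming.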
Hi All,
I just noticed that whenever I read a login-audit file, the first line is always skipped while fetching the data. Is there any way to overcome this?
Thanks for the heads up on this one, I'll investigate.
The lack of a start index was an intentional design decision. The tail plugin should always start from the end and work its way backwards.
Hey Mike, Somnath Dey is the owner of the plugin; I'll let him know about this. I did add a function called csvtotextarray. My expectation is that this function should correctly parse any CSV line returned by the smart service. I submitted the plugin update tonight.
Also, I've found that filtering doesn't seem to work in the "tail..." function. I tried a filter that "readCsvLogPaging" was able to use just fine (making sure to adjust it to use "filterColumNumber" instead of the column name), but it always returns zero results (using either the "=" or "contains" operator on a known-good username).
Edit: I just tried the newer "tailCsvLogPaging()" function and I notice it *does* include the quotes around the text column from the same row in question. Unfortunately, the "tail..." function versions don't seem to have a way of setting a start index for paging use, so I'm not sure how to incorporate that into the tool I previously created for paging through the design_errors log.
Would it be possible to create a new version of the readCsvLog functions that doesn't strip out the quotes around text fields as found in the original CSV file? In particular, the "Error Message" column is liable to contain text with commas, and since each log row is returned as mere plain text, we're forced to use the split() function on commas in order to build a dictionary of data. But the function also removes the quote marks around strings that contain commas, so we have no way to verify we're splitting in the right places without making big guesses. I bring this up now because yet another corner case has cropped up in the heuristics I was using to read the row data, causing extra headaches.
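The problem described above is easy to demonstrate. Below is a hypothetical log row (invented for illustration, not taken from an actual design_errors.csv): with the quotes intact, a standard CSV parser recovers the fields unambiguously, but once the quotes are stripped and the row is returned as plain text, a naive comma split breaks the message field apart.

```python
import csv
import io

# Hypothetical row whose "Error Message" field contains a comma
# and is therefore quoted in the original file.
raw_row = '2021-01-05 10:00:00,ProcessName,"Expression evaluation error, at fn!foo",node1'

# Parsing the line with quotes intact is unambiguous:
fields = next(csv.reader(io.StringIO(raw_row)))
print(fields)   # 4 fields; the comma inside the quoted message survives

# But once the quotes are stripped and the row comes back as plain text,
# split() can no longer tell field commas from message commas:
stripped = '2021-01-05 10:00:00,ProcessName,Expression evaluation error, at fn!foo,node1'
print(stripped.split(','))  # 5 fields: the message has been broken in two
```

This is why preserving the original quoting (or returning structured data instead of plain text) would eliminate the guesswork.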
Honestly, I'm not sure why this function doesn't return a CDT or at least JSON data - that would make it far less of a headache to use.
As a concrete example, here's a row in the CSV file itself showing the error message wrapped in quotes:
Whereas here's the same row, straight out of the readCsvLogPaging() function (notice the quotes have been stripped by the plug-in):
CC: Mark Talbot, Sam Zacks, April Schuppel
Ok great! I'll look for the update from the community.
Mark Talbot - that sounds like a good feature. I won't be able to test it in our case because I had Appian Support archive the older design_errors.csv file in all our environments, which proved out my theory that it was breaking because the number of columns changed over time. But that means I no longer have a version to test this against. However, anyone who has an instance that's been around since the '19 versions or so should be able to reproduce the issue and test whether the new functionality works.