Log Reader

Overview

Plug-in Notes:

  • Allows only System Administrators or members of the Designers group to access log data.  
  • Does not load files outside the logs directory or files that do not end in .log.* or .csv.*.
  • In multi-application-server environments, logs are shown from only a single application server.
  • Custom headers can be specified as an optional parameter. This is meant to be used for .csv files that don't contain a header.
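
For example, a .csv log that has no header row can be read by supplying the column names yourself. This is only a sketch: the file name and column names are placeholders, the function and parameter names are copied from the readcsvlogpagingwithheaders() call in the thread below, and it assumes the filter parameters accept null when unused (as the timestamp parameters do in that call).

  readcsvlogpagingwithheaders(
    /* Placeholder path, relative to the logs directory; files outside it, */
    /* or not matching .log.* / .csv.*, are not loaded (see notes above).  */
    csvPath: "login-audit.csv",
    startIndex: 1,
    batchSize: 50,
    /* Column names supplied here because the file has no header row */
    headers: { "loggedInTime", "loggedInUser", "ipAddress" },
    filterColumName: null,
    filterOperator: null,
    filterValue: null,
    timestampColumnName: null,
    timestampStart: null,
    timestampEnd: null
  )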

Key Features & Functionality

The Appian Log Reader Plug-in contains functionality to:

  • Read files in the Appian /log folder (.csv or .log) and return them as structured data
  • Aggregate data from a .csv file
  • Serve as the source for a service-backed record for viewing logs
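
Because rows come back as structured data, the result can be bound to a local variable and consumed like any other Appian value. The sketch below assumes the result behaves like a datasubset whose data field holds one dictionary per row, keyed by the supplied header names; verify this against the plug-in's documentation. The path and column names are placeholders.

  a!localVariables(
    local!logPage: readcsvlogpagingwithheaders(
      csvPath: "login-audit.csv",
      startIndex: 1,
      batchSize: 25,
      headers: { "loggedInTime", "loggedInUser", "ipAddress" },
      filterColumName: null,
      filterOperator: null,
      filterValue: null,
      timestampColumnName: null,
      timestampStart: null,
      timestampEnd: null
    ),
    /* Extract one column from the returned rows */
    index(local!logPage.data, "loggedInUser", null)
  )
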
  • Hi All,

    After upgrading our on-premise environment to version 20.4, readcsvlogpagingwithheaders() is not working as expected.

    Whenever we pass the headers attribute along with the filter options, the output is null. Below is the code snippet:

    readcsvlogpagingwithheaders(
      csvPath: cons!SC_AUDIT_FILEPATHS[1] & if(
        local!date = today(),
        null,
        "." & datetext(local!date, "yyyy-MM-dd")
      ),
      startIndex: 1,
      batchSize: -1,
      headers: {
        "loggedInTime",
        "loggedInUser",
        "attempt",
        "ipAddress",
        "source",
        "agent"
      },
      filterColumName: "source",
      filterOperator: "=",
      filterValue: "Portal",
      timestampColumnName: null,
      timestampStart: null,
      timestampEnd: null
    )

    Any suggestions to resolve this issue?

    Thanks.

  • Download the log file in question and look back through the historical data (assuming it might be one that spans multiple days) and check whether the number of columns has increased since the 20.4 update.  As I mentioned in some older posts here, when the number of columns is inconsistent, the function fails.  You can also try the new "tail" functions, which read only the most recent row(s) of the log file and would bypass this issue.

    If, on the other hand, the log file you're trying to read is the type that's rolled over daily, I'd still start by downloading one of the current files, but in that case just check that its columns are still the ones you're referencing in your query.  I can't tell which log file you're looking at, since I don't know what's in your cons!SC_AUDIT_FILEPATHS constant.
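
    Something like the following is the shape of that tail approach. Caveat: I'm writing the function name and parameters from memory of the plug-in's naming pattern, so treat both as assumptions and confirm them against the plug-in documentation:

    /* Assumed function name and parameters; verify against the */
    /* plug-in documentation before relying on this.            */
    tailcsvlog(
      csvPath: cons!SC_AUDIT_FILEPATHS[1],
      numberOfLines: 100
    )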
