Log Reader

Overview

Plug-in Notes:

  • Allows only System Administrators or members of the Designers group to access log data.  
  • Does not load files outside the logs directory, or files whose names do not match .log.* or .csv.*.
  • On multi-application-server environments, logs will only be shown from a single application server.
  • Custom headers can be specified as an optional parameter, intended for .csv files that don't contain a header row.
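
For example, a headerless .csv file might be read by supplying the headers explicitly. This is a sketch only; the parameter name customHeaders and the file name are illustrative, so check the plug-in's function documentation for the exact signature:

    readcsvlogpaging(
      csvPath: "my_custom.csv",
      startIndex: 1,
      batchSize: 20,
      customHeaders: {"Timestamp", "User", "Action"}
    )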

Key Features & Functionality

The Appian Log Reader Plug-in contains functionality to:

  • Read files in the Appian /log folder (.csv or .log) and return them as structured data
  • Aggregate data from a .csv file
  • Serve as the source for a service-backed record for viewing logs
  • The Log Reader application provided with the download demonstrates a service-backed record for viewing logs, as well as reports on specific log files. Administrators can view reports on system health over time, including design best practices, system load, and database performance. The application also contains a process that checks details from system.csv and alerts administrators if memory or CPU utilization exceeds a threshold.
  • Tail a log file with tailcsv and taillog. Tail is optimized for reading the last few lines of a large log file, so it is not advisable to tail an entire file from end to beginning; for reading just the last few lines of a very large file, however, it will perform much better than the other log functions. Use the batch size and timestamp filters to limit the number of lines read by the tailcsv and taillog functions.
  • Parse a line of text in CSV format and return it as a text array
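
For instance, tailing the last few lines of a server log might look like the following. This is a minimal sketch; the exact parameter names for taillog are assumptions here, so consult the plug-in's function documentation:

    taillog(
      logPath: "tomcat-stdOut.log",
      batchSize: 50
    )
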
Comments
  • Hi Team,

    I am facing an issue: I am trying the expression below but only getting the header values.

    readcsvlog(
      csvPath: "login-audit.csv",
      startIndex: 1,
      batchSize: 20
    )

    Currently we are working on Appian version 24.1.

    Thanks

  • To include paging info as above, you should use the function readcsvlogpaging(), as in:

    readcsvlogpaging(
      csvPath: "login-audit.csv",
      startIndex: 1,
      batchSize: 20
    )

  • Hi,

    I've been using the plug-in's functions without issue, but I find you have to specify all input values on the function, even if they are null; some functions within the plug-in seem to expect every input to be named. There may also be error output listed in STDOUT that can help determine what the issue is.

    If you try setting the other inputs to {} or null, that may work. For example:

    fn!readcsvlogpaging(
      csvPath: "login-audit.csv",
      startIndex: ri!pagingInfo.startIndex,
      batchSize: ri!pagingInfo.batchSize,
      filterColumName: "Timestamp",
      filterOperator: "startsWith",
      filterValue: local!convertedToday,
      timestampColumnName: "Timestamp",
      timestampStart: now() - intervalds(0, ri!pastMinutes, 0),
      timestampEnd: now()
    )

    In my use case I'm only looking for today's log-in records within a specified time frame, hence the use of timestamp filtering.

    Hope this helps.

  • This is my first time using this function; we haven't used it before. Yes, the problem occurs only with the login audit log.
