Log Reader

Overview

Plug-in Notes:

  • Allows only System Administrators or members of the Designers group to access log data.  
  • Does not load files outside the logs directory or files that do not end in .log.* or .csv.*.
  • In multi-application-server environments, logs are shown from only a single application server.
  • Custom headers can be specified as an optional parameter. This is meant to be used for .csv files that don't contain a header.

Key Features & Functionality

The Appian Log Reader Plug-in contains functionality to:

  • Read files in the Appian /log folder (.csv or .log) and return them as structured data
  • Aggregate data from a .csv file
  • Serve as the source for a service-backed record for viewing logs
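
The plug-in's readcsvlog function (shown throughout the comments below) is the typical entry point for reading a .csv log as structured data. A minimal call might look like the following; the path is one example taken from the comments, and valid paths depend on your environment:

    readcsvlog(
      csvPath: "perflogs/web_api_details.csv"
    )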
Anonymous
  • Hi Team,

    We are trying to read the CSV file design_errors.csv using this plug-in; however, no data is returned. Any idea why?

  • Hi Trong Quan, we are also facing a similar issue after upgrading to 23.3. Could you please let us know whether you were able to get it resolved?

  • Hi Team, we are also facing a similar issue. Could you please let us know whether you were able to get it resolved?

  • Hi Team

    We are unable to read sites_usage.csv from the path "audit/sites_usage.csv" using the readcsvlog plugin after upgrading Appian to 23.3; however, we are able to read audit/user_management.csv.

    This function was working on the previous Appian version (23.2).

    User management CSV

    Site Usage CSV

  • Is the plug-in out of date with the later Appian versions? We use readcsvlogpagingwithheaders to pull data from the logs and establish headers to help parse it. However, that function now only creates the headers and doesn't pull in the data. The other functions from the plug-in are able to pull in the data, but without a pre-defined header they use the first row of data as the headers, which is incorrect, and we can't parse the data that way.

    • We are not in a High Availability environment
    • From what we can tell, the plugin stopped working after version 22.2 or 22.3
    • We are on 23.2 as of this comment
  • Hi Guys,

    We just updated our Appian version to 23.3, and a process that was using the Log Reader plugin stopped working. After a short investigation, it turned out that the paths to our logs had changed to '/shared-logs/.../audit/'. We do not have a High Availability environment, so it seems the log path was also updated for non-HA installations. Are there any plans to update the plugin?

  • Hi Aditya

    It depends on whether you have an HA or a normal install.
    In a single-node installation, csvPath: "perflogs/web_api_details.csv" works for me. The full path would then be "/usr/local/appian/ae/logs/perflogs/web_api_details.csv".
    In the case of an HA setup, you would need to add the node name as well; e.g., for acme.appiancloud.com the csvPath would be "acme/perflogs/web_api_details.csv". In this case the full path would be
    /usr/local/appian/ae/shared-logs/acme/perflogs/web_api_details.csv. Instead of logs we have shared-logs in this case, which is added by the plugin.
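
    To summarize the two cases (paths as given above; the plug-in itself prepends the base directory, so csvPath stays relative):

        /* Single node: the plug-in prepends /usr/local/appian/ae/logs/ */
        readcsvlog(csvPath: "perflogs/web_api_details.csv")

        /* HA: lead with the node name; the plug-in prepends /usr/local/appian/ae/shared-logs/ */
        readcsvlog(csvPath: "acme/perflogs/web_api_details.csv")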

  • I am using readcsvlog(csvPath: "myPath"), but it prepends "/usr/local/appian/ae/logs/" to the beginning of my path, which gives me the wrong result. What should I do to fix it?

  • Hello,

    We have configured our Appian environment in HA, and we have seen in some comments that in order to access the logs, the server prefix has to be put in the log path.

    Is there any way to extract all the log data from the different servers in HA?

    We need to be able to read the logs directly, without putting any server in the path, so that we have all the up-to-date information.

    Thank you very much

  • Hello,

    I'm trying to use this plugin to get values from records_usage.csv in the context of our GDPR solution. I can read the file with readcsvlog, but filtering on Timestamp does not seem to work. When I run this:

    readcsvlog(
        csvPath: "/audit/records_usage.csv",
        timestampColumnName:"Timestamp",
        timestampStart: a!subtractDateTime(startDateTime: today(), days:100),
        timestampEnd: today()
    )

    The function returns 0 rows.

    Other filters work, for example:

    readcsvlog(
       csvPath: "/audit/records_usage.csv",
       filterColumName:"Timestamp",
       filterOperator:"startsWith",
       filterValue:"1 Feb 2023"
    )

    But my goal is to retrieve a period, and this kind of filter doesn't support "Between".

    Regards

    Jean-Alain