Slow interface performance

Team,

I have the code below, which is a mix of rules and a query. The code queries the DB for data on a quarterly basis, which is around 50k rows per quarter, then compares them with rule data and shows a count.

The interface is very slow to load, and it also takes a long time to show the second dropdown once the first one is selected. Can you suggest some improvements to the performance?


  • Marcel is likely correct. Shipping data between a database and Appian will be 'slow' by definition, and will be very slow if the data volume is large and/or you make multiple round-trips to the database. Try and conduct as much processing as you can in the DB and then only ship those results to Appian for display.

  • Thank you for the response. My code queries an Appian group, compares usernames with the DB data, and gives a count.

  • 0
    Certified Lead Developer
    in reply to gauravp0003

    How many users does Appian have to query from the group, and how many does it have to grab from the DB?

    Are you querying the DB for all rows, then for each row checking to see if that user is in the groups?  You could be suffering because you are running that many queries against the groups engine.  An Appian deployment can have as many as 32 pairs of analytics and process engines, but even then it will still have only 1 groups engine.  If you ask for a particular user a few thousand times in a row, it could get bogged down.

    If you pull all users once, pull all rows from the DB, and use Appian to do the compare, that could also be very slow.  Especially if you compare each row of Groups against each row of the DB: that leads to a Cartesian product.  Comparing just 32 users against 32 rows that way is over 1,000 operations.  It's the number of users you're checking TIMES the number of rows in the DB, which I think you said is 50k.  50k times anything is a lot.

    Perhaps try gathering all usernames from Groups.  That will be a huge query, but only 1 huge query.  Then pass that as a parameter to your DB query, returning COUNT FROM yourTable WHERE userName IN (*your list*).  This should have 2 advantages: running only 2 queries and returning the least amount of data possible from your DB.  Make the DBMS do calculations, including counting the rows; it's much faster at data aggregation.
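    The cost gap described above can be sketched outside of Appian. This is a hypothetical Python illustration (the names and data are made up), contrasting a per-user scan of the DB rows with a single counting pass:

```python
from collections import Counter

# Hypothetical stand-ins for the group members and the DB login rows
# (in the real scenario, ~50k rows per quarter).
group_users = ["alice", "bob", "carol"]
db_logins = ["alice", "bob", "alice", "dave"]

# Cartesian-style compare: every user is checked against every DB row,
# costing len(group_users) * len(db_logins) comparisons.
naive_counts = {u: sum(1 for row in db_logins if row == u) for u in group_users}

# Aggregate once, look up per user: one pass over the rows, then O(1) lookups.
login_counts = Counter(db_logins)
fast_counts = {u: login_counts[u] for u in group_users}

assert naive_counts == fast_counts  # identical result, far fewer operations
```

    The WHERE userName IN (...) suggestion goes one step further: the counting pass happens inside the DBMS, so only the aggregated counts travel back to Appian.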

  • It would also be helpful to paste the SAIL code so we can see what expression you are running and provide a more specific recommendation.

  • Here is a snippet of the code. I am sharing the rule for the username and the way I am counting against it; I have other rules for firstname, lastname, and email from the group to show in the grid.

    Rule to get the count:

    load(
      local!paging: ri!paging,
      local!gpusername: touniformstring(rule!test_getUserNamesFromGroup(ri!group)),
      local!dbusername: touniformstring(rule!test_getQ1UserLoginCountFromDB()),
      local!rule1: todatasubset(local!gpusername, local!paging),
      local!rule2: local!dbusername,
      local!data: a!forEach(
        items: local!rule1,
        expression: {
          username: fv!item,
          count: count(
            wherecontains(
              fv!item,
              local!rule2
            )
          )
        }
      ),
      index(
        local!data,
        "count",
        {}
      )
    )

    test_getUserNamesFromGroup(ri!group)

    apply(
      user(
        _,
        "username"
      ),
      getmemberuserspaging(
        ri!group,
        0,
        -1
      )
    )

    test_getQ1UserLoginCountFromDB()

    a!queryEntity(
      entity: cons!TEST_ENTITY_CONSTANT,
      query: a!query(
        logicalExpression: a!queryLogicalExpression(
          operator: "AND",
          filters: {
            a!queryFilter(
              field: "login_date",
              operator: "<=",
              value: todate("31-Mar-" & year(now()) - 1)
            ),
            a!queryFilter(
              field: "login_date",
              operator: ">=",
              value: todate("01-Jan-" & year(now()) - 1)
            )
          }
        ),
        pagingInfo: a!pagingInfo(1, -1)
      )
    ).data.user_name

  • 0
    A Score Level 1
    in reply to gauravp0003

    Are you looping the expression to generate the count? If yes, there is one optimization I can think of:

    1) Do the calls to the DB on the highest interface possible (i.e. on the grid interface) and pass the results down as a reference. Avoid any calls inside loops.

    Also, where() is faster than wherecontains(), so perhaps you can use a logical expression to evaluate instead.
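    The "no calls inside loops" point can be sketched with a hypothetical Python stand-in for the query rule (the function names are made up, not Appian APIs):

```python
# Hypothetical stand-in for an expensive DB/query-rule call.
QUERY_CALLS = 0

def query_db():
    global QUERY_CALLS
    QUERY_CALLS += 1
    return ["alice", "bob", "alice"]  # one row per login

users = ["alice", "bob", "carol"]

# Anti-pattern: the query runs once per user (N round-trips).
slow = {u: query_db().count(u) for u in users}

# Hoisted: query once at the top, reuse the result for every user.
rows = query_db()
fast = {u: rows.count(u) for u in users}

assert slow == fast  # same counts; 3 query calls for the loop, 1 hoisted
```

    In the interface above, this corresponds to calling the DB rule once at the top level and passing its result into the grid, rather than querying per row.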

  • Thank you. I modified my query as below: I added one more parameter, username, which takes usernames from another rule as an input, 10 at a time, and then the query searches for only those users. That made my query fast. Please see the code below.

    test_getQ1UserLoginCountFromDB()


    a!queryEntity(
      entity: cons!TEST_ENTITY_CONSTANT,
      query: a!query(
        logicalExpression: a!queryLogicalExpression(
          operator: "AND",
          filters: {
            a!queryFilter(
              field: "user_name",
              operator: "in",
              value: ri!username
            ),
            a!queryFilter(
              field: "login_date",
              operator: ">=",
              value: todate("01-Jan-" & year(now()) - 1)
            ),
            a!queryFilter(
              field: "login_date",
              operator: "<=",
              value: todate("31-Mar-" & year(now()) - 1)
            )
          }
        ),
        pagingInfo: a!pagingInfo(1, -1)
      )
    ).data.user_name

  • 0
    Certified Lead Developer
    in reply to gauravp0003

    Very good, the IN operator is key.  I would try seeing what you can do with 100 users.  Benchmark going through the whole set in batches of 10 and in batches of 100.  I think you'll find through benchmarking that larger batches are actually much more efficient.

    In general, well-formed queries scale logarithmically, which means that after paying an up-front cost, you can always do double the operations in less than double the time, and you can always do 10 times the operations for less than 10 times the cost.  The more rows you return, the less noticeable any further increase in rows becomes in terms of time spent.  Unless, of course, the DB isn't well optimized.

    You might find it most efficient to query absolutely all the users at once.
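    The batch-size benchmark suggested above could be harnessed roughly like this; a hypothetical Python sketch where fetch stands in for one IN-query round-trip:

```python
def run_in_batches(fetch, usernames, batch_size):
    """Call fetch once per batch of usernames; return the number of round-trips."""
    trips = 0
    for i in range(0, len(usernames), batch_size):
        fetch(usernames[i:i + batch_size])  # one IN-query per batch
        trips += 1
    return trips

# With a fixed per-round-trip overhead, fewer trips usually wins:
usernames = ["user%d" % i for i in range(150)]
noop = lambda batch: None
assert run_in_batches(noop, usernames, 10) == 15   # 15 round-trips
assert run_in_batches(noop, usernames, 150) == 1   # a single round-trip
```

    Wrapping each fetch in a timer (e.g. time.perf_counter()) would give the actual benchmark numbers; the point is that batch size directly controls the round-trip count.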

  • Thank you. I tested the query with 150 users at once and got the results in 803 ms against 50k rows of DB data.
