Huge data fetch - "Memory threshold" problem
saratt
over 7 years ago
Hi All,
I have a requirement where a table has 30 columns and 1 million records. The requirement is to fetch those records and write the data into an Excel document.
1. I created an XSD/CDT, which was verified and mapped to the table in the DB.
2. When I try to query the data with a query rule, the rule fetches data only if the number of records is <= 1400.
If I try to fetch 1500 records, an error occurs on the script node doing the fetch, and I get the following error in the alert:
There is a problem with task “Script Task” in the process “XXX Test”. ERROR:An error occurred while evaluating expression: CCC_CDTs:rule!YYY(pv!cid) (Expression evaluation error in rule 'YYY' at function 'queryAAA': Error evaluating function 'queryAAA' :
Memory threshold reached during output conversion
(rule: [YYY], type: [CCCDT28858], threshold: [1,048,576 bytes], objects successfully conver...
OriginalPostID-252548
saratt
over 7 years ago
...ted: [1,489])) (Data Outputs)
Please let me know a solution to this "Memory threshold" problem.
Colton Beck
over 7 years ago
I'd highly recommend not creating an Excel document that contains one million rows.
One solution for your querying problem is to batch the query and run it multiple times. E.g. return 1000 rows, update document, return the next 1000 rows, update document, etc.
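In Appian terms this would be a looping process that re-runs the query with an advancing start index; the pattern itself is language-agnostic. Here is a minimal sketch in Python against an in-memory SQLite table (the table and column names are hypothetical stand-ins for the real 30-column table):

```python
import csv
import io
import sqlite3

# Throwaway table standing in for the real table in the DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records (id, payload) VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1, 3501)])

BATCH_SIZE = 1000  # each individual query stays well under the memory threshold
out = io.StringIO()  # stands in for the document being updated
writer = csv.writer(out)
writer.writerow(["id", "payload"])

offset = 0
total = 0
while True:
    # Fetch one batch; each query returns at most BATCH_SIZE rows.
    rows = conn.execute(
        "SELECT id, payload FROM records ORDER BY id LIMIT ? OFFSET ?",
        (BATCH_SIZE, offset),
    ).fetchall()
    if not rows:
        break
    writer.writerows(rows)  # append this batch to the output document
    total += len(rows)
    offset += BATCH_SIZE

print(total)  # 3500 rows written, 1000 at a time
```

Each iteration holds only one batch in memory, which is exactly why the per-query memory threshold is never hit.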
sparshs
over 7 years ago
You can fetch this using a simple Appian custom plugin built with Spring JDBC, or even plain JDBC. You cannot handle this requirement with the Appian-provided query tools (query entity, query rules) because the execution time allowed for an Appian database operation is around 8 seconds. You can change that setting in your custom.properties, but that may have performance impacts on your environment. So I think you should create a simple Appian plugin using Java.
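A Java plugin along these lines would typically run a single SELECT and stream the ResultSet with a small fetch size (via `Statement.setFetchSize()` in JDBC), writing rows out as they arrive instead of materializing the full result in memory. The same streaming idea, sketched in Python with sqlite3 (schema and names are illustrative):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1, 2001)])

out = io.StringIO()
writer = csv.writer(out)

# One query, streamed: fetchmany() pulls a small window of rows at a time,
# so memory use is bounded by the window size, not the full result set
# (the JDBC analogue is setting a small fetch size on the statement).
cursor = conn.execute("SELECT id, payload FROM records ORDER BY id")
streamed = 0
while True:
    window = cursor.fetchmany(256)
    if not window:
        break
    writer.writerows(window)
    streamed += len(window)

print(streamed)  # all 2000 rows streamed, never more than 256 in memory
```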
sikhivahans
over 7 years ago
@saratt AFAIK, the best possible way with the available functionality (in terms of OOTB and Shared Components) is to make use of the 'Export SQL data to Excel' smart service at http://bit.ly/2eZoZz6. The benefits of this plugin are:
1. No need to write query entities or implement batching in the queries and process model
2. No need to retrieve a huge amount of data and store it in PVs
So by making use of a native SQL statement (which may be sourced from a table, a view, or an on-the-fly SQL statement making joins) in the above-mentioned smart service, you could pull the data into Excel. If the smart service takes a long time to execute and falls into the category described at https://forum.appian.com/suite/help/16.3/Appian_Health_Check.html#slow-smart-services, then it might be worth implementing batching in the native SQL statement, so that we can at least avoid query rules, which again involve Prepare, Execute, and Transform phases and consume time.
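One way to batch the native SQL itself, assuming the table has a monotonically increasing key, is keyset pagination: instead of OFFSET (which makes the database count past all previously fetched rows on every batch), each batch resumes from the last key seen. A sketch with hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1, 1201)])

BATCH_SIZE = 500
last_id = 0   # resume point; starts before the first key
fetched = 0
while True:
    # Keyset predicate: seek directly to rows after the last key,
    # rather than counting past them with OFFSET.
    rows = conn.execute(
        "SELECT id, payload FROM records WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH_SIZE),
    ).fetchall()
    if not rows:
        break
    fetched += len(rows)
    last_id = rows[-1][0]  # remember the last key in this batch

print(fetched, last_id)
```

With an index on the key column, each batch is a cheap index seek regardless of how deep into the table it is, which matters at a million rows.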
ChristineH
Certified Lead Developer
over 7 years ago
For something this large, I would look at the larger solution. If a million rows of data are needed in Excel, ask what the purpose, audience, and use are. Based on those answers, you may find that bringing in a BI tool or something else to augment your whole solution might be the right approach.
Pulling that much data back into Appian will consume your resources and, if not batched properly, exceed some of the limits (which are there to ensure the system is not put at risk by an errant process).
Mike Cichy
Appian Employee
over 7 years ago
Also, this does not sound like a good idea. Why Excel? Is this to export data? Excel can't handle that many rows, and the file will be extremely large. CSV can be used with a plugin, but even then the file is unviewable. If you are exporting the data to another DB, a CSV may be appropriate, but you may want to explore other technologies (ETL tools, etc.).
saratt
over 7 years ago
Friends, thanks to all of you for the responses. Each suggestion made a valuable contribution to my research on a POC to understand the pros and cons of handling huge data. - Sarat