Appian Community
Export Portal Data to DB
rishub
over 7 years ago
Hi All,
Do we have any plug-in through which we can export Portal Report Data to Database directly?
Thanks
Rishu
OriginalPostID-273584
ChristineH
Certified Lead Developer
over 7 years ago
Are you referring to the in-memory reports?
If so, via a process you could use the Execute Process Report node, save the results to a CDT, and then write them to the data store.
Just one idea...
rishub
over 7 years ago
@Christineh, actually we don't want to pull the data into PVs, since the reports contain a large amount of data. So I was wondering whether there is a plug-in we could use to store the report data in the DB directly.
ChristineH
Certified Lead Developer
over 7 years ago
Ahhhh.
I haven't seen such a plugin on the forum.
Could you update your models to write to the DB as the events occur, so that the footprint is smaller?
Stefan Helzle
A Score Level 3
over 7 years ago
You could create a web API which calls queryProcessAnalytics and creates a CSV. I'm not sure how much data you can export in one call; there are some limits. Another idea: create an expression that builds CSV data from queryProcessAnalytics and use textDocFromTemplate to store it to a file. You could repeat this (MNI) and append to the same file by writing the placeholder back to the file in the last line.
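A rough sketch of what such a Web API expression might look like. This is an assumption-laden illustration, not a tested implementation: cons!MY_PORTAL_REPORT is a hypothetical constant, the c0/c1 column aliases are placeholders for your report's actual columns, and the exact a!queryProcessAnalytics behavior should be verified against your Appian version's documentation.

```
/* Hypothetical Web API expression returning report data as CSV.     */
/* cons!MY_PORTAL_REPORT and the c0/c1 column names are placeholders. */
a!localVariables(
  local!result: a!queryProcessAnalytics(
    report: cons!MY_PORTAL_REPORT,
    query: a!query(
      pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 1000)
    )
  ),
  a!httpResponse(
    statusCode: 200,
    headers: {
      a!httpHeader(name: "Content-Type", value: "text/csv")
    },
    /* One CSV line per report row, columns joined by commas */
    body: joinarray(
      a!forEach(
        items: local!result.data,
        expression: joinarray({ fv!item.c0, fv!item.c1 }, ",")
      ),
      char(10)
    )
  )
)
```

Note that the batch size caps how many rows come back per call, so a caller would need to page through startIndex values (or use MNI as described above) to export a large report in chunks.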
rishub
over 7 years ago
@Stefanh791, thanks for the suggestion. I will see if we can implement this.
@Christineh, we can't update our models. Our application is a product in itself and has almost 300 process models, some of them huge, so it won't be possible to update all of them.
sikhivahans
over 7 years ago
@rishub
In case you are trying to export huge amounts of data, I would suggest considering the upper limit. Without any special configuration in place, Analytics can retrieve only 10,000 records and will fail beyond that count.
Just to add: resources.appian.analytics.application.maxreportrows and server.conf.processcommon.MAXIMUM_REPORT_MS are the settings in custom.properties that control the row limit and timeout of analytics reports.
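For reference, those two properties would be set in custom.properties along these lines. The values below are purely illustrative (the 10,000-row default comes from the limit described above; the timeout value is a made-up example, not a recommendation):

```
# custom.properties -- values are illustrative examples only
# Maximum number of rows an analytics report can return (default limit is 10,000)
resources.appian.analytics.application.maxreportrows=10000
# Maximum time (in milliseconds) a report may take before timing out
server.conf.processcommon.MAXIMUM_REPORT_MS=5000
```

As the docs caution below, any increase should be tested against comparable data sets before being applied to production.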
rishub
over 7 years ago
in reply to
sikhivahans
Yes Sikhi Vahan, we will go that route as a next step. I was just hoping to find a plug-in so that we wouldn't have to make all these configuration changes. My requirement is to get the data from the reports into the DB, then use views for the joins and get the results back out as a flat/txt file. We do have to update the analytics limit as well as the query size and record limits.
sikhivahans
over 7 years ago
in reply to
rishub
rishub, I hope the environment's capacity will be increased to accommodate these changes, or that you will verify it through testing.
A few lines from the docs, in case you are not aware: "Before making such a change, we recommend testing the increased value in an environment that includes the same (or comparable) data sets to determine whether your available JVM memory allocation is sufficient."