<?xml version="1.0" encoding="UTF-8" ?>
<?xml-stylesheet type="text/xsl" href="https://community.appian.com/cfs-file/__key/system/syndication/rss.xsl" media="screen"?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/"><channel><title>Data upload by file</title><link>https://community.appian.com/discussions/f/process/18928/data-upload-by-file</link><description>Hello 
 Let me briefly describe the client's need and the development we have proposed. 
 We need to read some files daily to update business data in our Appian tables; the client leaves them for us in a directory on different SFTP servers, access those</description><dc:language>en-US</dc:language><generator>Telligent Community 12</generator><item><title>RE: Data upload by file</title><link>https://community.appian.com/thread/74568?ContentTypeID=1</link><pubDate>Mon, 08 Jun 2020 10:14:10 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:cea48098-cb41-4fea-818f-1fa1255a1016</guid><dc:creator>alexc0002</dc:creator><description>&lt;p&gt;Thanks,&amp;nbsp;Lejanson.&lt;/p&gt;
&lt;p&gt;I think it&amp;#39;s a good idea.&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item><item><title>RE: Data upload by file</title><link>https://community.appian.com/thread/74486?ContentTypeID=1</link><pubDate>Mon, 01 Jun 2020 22:06:46 GMT</pubDate><guid isPermaLink="false">d3a83456-d57b-489c-a84c-4e8267bb592a:a349be79-a4a1-4a99-80ae-9a28b379006b</guid><dc:creator>legotx</dc:creator><description>&lt;p&gt;Hey Alex,&lt;/p&gt;
&lt;p&gt;I would suggest having a process where you read the file and put it into a staging table before writing into your main tables. This way, you don&amp;#39;t lose any data and your process model won&amp;#39;t get overloaded.&lt;/p&gt;
&lt;p&gt;It also depends on what you are importing. If it is a CSV file, you can use the CSV to SQL plugin, although I am not sure whether it is still available.&lt;/p&gt;
&lt;p&gt;After that, you can have a time-triggered process that pulls those rows from the staging table and writes them into your main tables. You can add a status column to your staging table to keep track of which rows have already been processed.&lt;/p&gt;
&lt;p&gt;Additionally, I would suggest putting a threshold on the number of rows you process per run; if I am not mistaken, the limit is 1,000 rows.&lt;/p&gt;
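&lt;p&gt;The staging-table flow above could be sketched like this (a minimal illustration in Python with SQLite; the table names, the status values, and the 1,000-row default are assumptions, and in a real Appian setup the read and write steps would be Smart Services in a process model rather than Python):&lt;/p&gt;

```python
import sqlite3

# Sketch of the staging-table pattern: land every incoming row first,
# then promote rows to the main table in bounded, trackable batches.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER PRIMARY KEY, payload TEXT, status TEXT)")
conn.execute("CREATE TABLE main_table (id INTEGER PRIMARY KEY, payload TEXT)")

def load_file_into_staging(rows):
    # Step 1: dump every row from the incoming file into staging as NEW,
    # so nothing is lost even if downstream processing fails.
    conn.executemany(
        "INSERT INTO staging (payload, status) VALUES (?, 'NEW')",
        [(r,) for r in rows],
    )
    conn.commit()

def process_batch(limit=1000):
    # Step 2 (run on a timer): move at most `limit` NEW rows into the
    # main table, then flag them PROCESSED via the status column.
    batch = conn.execute(
        "SELECT id, payload FROM staging WHERE status = 'NEW' LIMIT ?",
        (limit,),
    ).fetchall()
    for row_id, payload in batch:
        conn.execute("INSERT INTO main_table (payload) VALUES (?)", (payload,))
        conn.execute("UPDATE staging SET status = 'PROCESSED' WHERE id = ?", (row_id,))
    conn.commit()
    return len(batch)
```

&lt;p&gt;Because the timer job only ever selects NEW rows, re-running it is safe, and the status column doubles as an audit trail of what has been loaded.&lt;/p&gt;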
&lt;p&gt;Hope this helps.&lt;/p&gt;
&lt;p&gt;Lejanson&lt;/p&gt;&lt;div style="clear:both;"&gt;&lt;/div&gt;</description></item></channel></rss>