Overview
The Amazon S3 Utilities Plug-in uses the AWS SDK for Java to connect to Amazon S3 in order to store and retrieve files.
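To make the overview concrete, here is a minimal sketch (not the plug-in's actual code) of storing and retrieving a file with the AWS SDK for Java; the bucket name, object key, and file path are placeholders.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;

import java.io.File;

public class S3StoreRetrieveSketch {
    public static void main(String[] args) {
        // Build a client with the default credential/region chain; the plug-in itself
        // reads its credentials from the Appian Secure Credential Store instead.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Store a local file under a key in the bucket.
        s3.putObject("my-example-bucket", "reports/invoice.pdf", new File("/tmp/invoice.pdf"));

        // Retrieve the same object; the content is exposed as an InputStream.
        S3Object object = s3.getObject("my-example-bucket", "reports/invoice.pdf");
        System.out.println("Content type: " + object.getObjectMetadata().getContentType());
    }
}
```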
Key Features & Functionality
The following smart services are included:
The plug-in also includes a function:
Amazon S3 Utilities supports the following Amazon S3 features:
Note: The plug-in requires the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files (https://www.oracle.com/technetwork/java/javase/downloads/jce8-download-2133166.html) when using client-side encryption.
The Appian Secure Credential Store holds the credentials used to integrate with Amazon S3. Before executing the plug-in, create a new secure credential store with the following 3 attributes; these values are obtained from the Amazon AWS IAM console.
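As a rough illustration of how those IAM values end up being used, here is a minimal AWS SDK for Java sketch that builds an S3 client from an access key and secret key. The attribute names accessKeyId and secretAccessKey and the helper class are assumptions for the example; the actual attribute names required by the plug-in are defined by the plug-in itself.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3CredentialSketch {

    // In the plug-in, the values below would come from the Appian Secure Credential
    // Store; the parameter names here are illustrative, not the documented attributes.
    public static AmazonS3 buildClient(String accessKeyId, String secretAccessKey) {
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessKeyId, secretAccessKey);
        return AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.US_EAST_1) // the region the plug-in is reported to use
                .build();
    }
}
```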
Hi,
Is getPreSignedURLForS3 only for downloading files securely, or can it be used for uploads as well?
Thanks Mike, appreciate your help. We implemented it and it's working as expected. Thanks again. Have a good day!
baratc I don't have documentation explaining this in detail, but the plug-in description includes the line "It can be used in a WebAPI object to redirect a user from Appian to a resource on S3", and that is exactly what you need to do. As you experienced, you cannot generate an S3 link and put it directly on the form, because it expires within 5 seconds. What you need is a Web API that takes in a document id, validates user access, generates the AWS link, and responds with a 302 (redirect). You can then put static links to the Web API on your form.
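To make that mechanism concrete, below is a minimal AWS SDK for Java sketch of the URL-generation step such a Web API would ultimately depend on. The helper name, the expTimeMillis parameter, and the comment about the 302 response are illustrative assumptions, not the plug-in's actual signature.

```java
import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

import java.net.URL;
import java.util.Date;

public class PresignedUrlSketch {

    // Hypothetical helper: generate a time-limited download link for one object.
    // A Web API object would validate the user's access, call something like this,
    // and respond with HTTP 302 using the returned URL as the Location header.
    public static URL downloadUrl(AmazonS3 s3, String bucket, String key, long expTimeMillis) {
        Date expiration = new Date(System.currentTimeMillis() + expTimeMillis);
        GeneratePresignedUrlRequest request = new GeneratePresignedUrlRequest(bucket, key)
                .withMethod(HttpMethod.GET)          // HttpMethod.PUT would presign an upload instead
                .withExpiration(expiration);
        return s3.generatePresignedUrl(request);
    }
}
```

If the expiration were exposed as an input like this, a designer could request a five-minute link (300000 ms) rather than the 5000 ms lifetime discussed in this thread.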
Hi Mike,
Thanks for sharing this plug-in. We are using it to retrieve documents and display them in Appian, with links (S3 URLs) for users to download the documents. Currently the URL expires in 5000 milliseconds, which is too short for the user to navigate to the download. To overcome this scenario we used the a!refresh function, but the life of the URL is still too short. We have around 10+ documents, and the URLs need to stay available for the user to download the PDFs. Please let me know whether there is any option to increase expTimeMillis to 5 minutes, or whether we can pass expTimeMillis as a parameter to the plug-in. Thanks in advance for your help.
Sylvain Furt, I agree with your comments. We are moving forward with the 19.1 plug-in April Schuppel suggested, so I have not made any modifications to this plug-in. Since I was only working on a proof of concept, I would likely have just changed the plug-in to hard-code my own region in place of US_EAST_1, but I agree it is better to parameterize it. I too was able to get the plug-in to upload a file, but only when the S3 bucket was in US_EAST_1.
I do believe the new 19.1 plug-in is superior. Anyone with access to a 19.1 environment with S3 needs should use that plug-in instead. Others will need to modify this plug-in to work with the S3 region they need. Depending on scope and bandwidth, the entire plug-in could use a refresh, as it is using deprecated AWS methods.
community.appian.com/.../amazon-s3-connected-system-plug-in
@All I just tested the plugin on Appian 19.1 and I was successfully able to upload files to an S3 bucket that I just created.
Regarding the regions, this plugin is currently hardcoded to only work with Regions.US_EAST_1.
As an enhancement, you should add an extra input parameter to allow designers to configure which region the bucket is located in (see the sketch below).
Thoughts?
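For what it's worth, here is a sketch of what that enhancement might look like in the client-building code, assuming the AWS SDK for Java v1 that the plug-in appears to use. The regionName input is hypothetical, and Regions.fromName expects the lowercase region id such as "eu-west-1".

```java
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class RegionParameterSketch {

    // Hypothetical input: a region name supplied by the designer, e.g. "eu-west-1".
    // Falling back to US_EAST_1 keeps existing process models working unchanged.
    public static AmazonS3 buildClient(String regionName) {
        Regions region = (regionName == null || regionName.isEmpty())
                ? Regions.US_EAST_1
                : Regions.fromName(regionName);
        return AmazonS3ClientBuilder.standard()
                .withRegion(region)
                .build();
    }
}
```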
Sylvain Furt (sylvain.furt) Yes, I do have the credentials (Access Key and Secret Key) in the third-party credentials store, with S3 Util having access to it as well.
jamesm881 - I don't believe that this plugin has been refreshed in a while. The source code is included in the jar. Could you make the update and share it?
@Ankur - this plugin requires access to a secure credential store that contains the credentials for S3. This is a configuration that needs to be done in the admin console on your environment.