Overview
Manage AWS S3 data stores with Appian! Users can access their S3 objects directly from an Appian interface. Appian documents can be uploaded to an S3 bucket with Server-Side Encryption and configured as Public or Private. The AWS S3 Connected System Plug-in uses the AWS Java SDK to connect to S3.
Key Features & Functionality
Requirements
Access Key Id: the access key ID for connecting to S3 can be retrieved from the AWS Management Console. Navigate to the Users > Summary page and click the "Security credentials" tab.
Secret Access Key: the secret access key can only be viewed once, upon the creation of an access key. See the AWS Access Keys documentation for more information: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys
Notes
Users who want to upload objects as 'Public' must have the correct IAM privileges, or an "Access Denied" error will be returned.
S3 endpoints that restrict traffic based on source will need to allow the IP and VPC endpoints outlined in KB-1582, based on the site's region.
Hi Team,
Is there a way to add custom tags/metadata to uploaded files?
There appears to be a bug in IntegrationExecution: PutObjectRequests are being created without contentLength being set to the size of the document to be uploaded, which results in the library buffering the content of the input stream in order to calculate it. This causes files larger than 2147483647 bytes (~2 GB) to throw an OutOfMemoryError for exceeding the maximum size of a byte[].
For files smaller than 2147483647 bytes (~2 GB) the plug-in will work as expected, but, as AWS's documentation puts it, 'this can be very expensive': https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/PutObjectRequest.html. A second option would be to use the File-based method instead, which sets the content length automatically based on the file's length().
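To make the failure mode above concrete: the ~2 GB ceiling comes from Java's byte[] size limit (Integer.MAX_VALUE = 2147483647), which is what the SDK runs into when it must buffer a stream to compute the length itself. The helper below is an illustrative sketch of that size check, not the plug-in's actual code, and the AWS SDK v1 calls in its comments show the two fixes the post describes.

```java
// Illustrates the ~2 GB ceiling: when no content length is supplied, the AWS
// SDK buffers the stream into a byte[], whose maximum size is
// Integer.MAX_VALUE (2147483647) bytes.
public class S3UploadSizeCheck {

    static final long MAX_BUFFERABLE_BYTES = Integer.MAX_VALUE; // 2147483647

    // True when buffering the stream would overflow a byte[], i.e. when an
    // explicit content length (or a File-based request) is required.
    static boolean requiresExplicitContentLength(long fileSizeBytes) {
        return fileSizeBytes > MAX_BUFFERABLE_BYTES;
    }

    public static void main(String[] args) {
        System.out.println(requiresExplicitContentLength(3L * 1024 * 1024 * 1024)); // 3 GB
        System.out.println(requiresExplicitContentLength(100L * 1024 * 1024));      // 100 MB
        // With the AWS SDK for Java v1, either approach avoids the buffering:
        //   ObjectMetadata meta = new ObjectMetadata();
        //   meta.setContentLength(fileSizeBytes);
        //   s3.putObject(new PutObjectRequest(bucket, key, inputStream, meta));
        // or the File-based constructor, which sets the length from
        // File.length() automatically:
        //   s3.putObject(new PutObjectRequest(bucket, key, new File(path)));
    }
}
```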
Hi TJ, the plug-in has now been updated to fix these issues.
Post-upgrade to 23.2, we are facing an issue with the Amazon S3 bucket. The connected system seems to have a successful connection, but when used in an integration it throws the error below:
Amazon Service Exception Status: Access Denied
Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: HMVPQ97TC6HEE35X; S3 Extended Request ID: OeWoWY9Ibx21AtLqyFiNx7nqic9b5/BfNBdjowlKS7JCp2WwBdboo2CKk4hUPoYHcIrMdthUyf0=; Proxy: null)
Please find below a screenshot of the error received. I have also updated the plug-in to the latest version.
Need inputs on this. Thanks in advance.
Do we have limitations on the file size to be transferred? What is the maximum file size that it currently supports?