AWS S3 Bucket Management

Overview

Manage AWS S3 data stores with Appian! Users can access their S3 objects directly from an Appian interface. Appian documents can be uploaded to an S3 bucket with Server-Side Encryption and configured as Public or Private.
 
The AWS S3 Connected System Plug-in uses the AWS Java SDK to connect to S3.

Key Features & Functionality

  • Create Bucket -- Adds a new bucket to the S3 instance. A bucket is needed in order to store objects (files).
  • Upload File -- Uploads a file to any specified bucket on the S3 instance.
  • Upload Multiple Files -- Uploads multiple files to any specified bucket on the S3 instance.
  • List Buckets -- Returns all available buckets on the S3 instance.
  • List Objects -- Returns all available objects from a specified bucket.
  • Delete Bucket -- Permanently removes a bucket from the S3 instance.
  • Delete Object -- Permanently removes an object from a specified bucket.
  • Cognito Authentication -- Pulls credentials from AWS Cognito Identity Pool.
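Several of the operations above (Create Bucket in particular) depend on the bucket name being valid. As a minimal sketch, assuming AWS's published naming rules for general-purpose buckets (3–63 characters; lowercase letters, digits, dots, and hyphens; must begin and end with a letter or digit; must not look like an IP address), a client-side pre-check might look like this. The class and method names are illustrative, not part of the plug-in:

```java
// Hypothetical client-side check of the core S3 bucket naming rules before
// calling Create Bucket; the S3 service enforces the same rules server-side.
public final class BucketNameCheck {

    /** Returns true if the name satisfies the core S3 bucket naming rules. */
    public static boolean isValidBucketName(String name) {
        if (name == null || name.length() < 3 || name.length() > 63) {
            return false;
        }
        // Only lowercase letters, digits, dots, and hyphens;
        // must start and end with a letter or digit.
        if (!name.matches("[a-z0-9][a-z0-9.-]*[a-z0-9]")) {
            return false;
        }
        // Must not be formatted like an IP address (e.g. 192.168.0.1).
        if (name.matches("\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}")) {
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isValidBucketName("my-appian-documents")); // true
        System.out.println(isValidBucketName("Invalid_Bucket"));      // false
    }
}
```

Validating the name up front gives a clearer error in an Appian expression than the generic failure returned by S3.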

Requirements

Access Key ID: the access key ID for connecting to S3 can be retrieved from the AWS Management Console. Navigate to the Users > Summary page and click the "Security credentials" tab.
Secret Access Key: the secret access key can only be viewed once, at the time the access key is created. See the AWS Access Keys documentation for more information: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys
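Outside of Appian, the same two values are conventionally supplied through the standard AWS environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (the names the AWS SDK itself reads). A minimal, illustrative fail-fast lookup — the class is a hypothetical helper, not part of the plug-in:

```java
import java.util.Map;

// Illustrative only: fail-fast lookup of the two credential values the
// connected system needs. AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are
// the standard environment variable names the AWS SDK also reads.
public final class S3Credentials {
    public final String accessKeyId;
    public final String secretAccessKey;

    private S3Credentials(String accessKeyId, String secretAccessKey) {
        this.accessKeyId = accessKeyId;
        this.secretAccessKey = secretAccessKey;
    }

    /** Reads both keys from the given environment map, throwing if either is absent. */
    public static S3Credentials fromEnv(Map<String, String> env) {
        String id = env.get("AWS_ACCESS_KEY_ID");
        String secret = env.get("AWS_SECRET_ACCESS_KEY");
        if (id == null || id.isEmpty() || secret == null || secret.isEmpty()) {
            throw new IllegalStateException(
                "AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must both be set");
        }
        return new S3Credentials(id, secret);
    }

    public static void main(String[] args) {
        // Stand-in map for demonstration; in practice pass System.getenv().
        S3Credentials creds = fromEnv(Map.of(
                "AWS_ACCESS_KEY_ID", "AKIAEXAMPLE",
                "AWS_SECRET_ACCESS_KEY", "secret-example"));
        System.out.println(creds.accessKeyId);
    }
}
```

Failing fast on missing credentials surfaces a configuration mistake immediately instead of as an opaque authentication error from S3.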

Notes

Users who want to upload objects as 'Public' must have the appropriate IAM privileges; otherwise an "Access Denied" error will be returned.

S3 endpoints that restrict traffic based on source will need to allow the IPs and VPC endpoints outlined in KB-1582 for the site's region.

  • There appears to be a bug in IntegrationExecution: PutObjectRequests are created without contentLength set to the size of the document to be uploaded, which forces the AWS library to buffer the entire content of the input stream in memory to calculate it. Files larger than 2147483647 bytes (~2 GB) therefore fail with an OutOfMemoryError, because the buffered content exceeds the maximum size of a byte[].

    For files smaller than 2147483647 bytes (~2 GB) the plug-in will work as expected, but, as AWS's documentation puts it, 'this can be very expensive'.

    https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/model/PutObjectRequest.html

    A second option would be to use the File-based method instead, which sets the content length automatically from the file's length().
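The limit described above can be sketched in plain Java. The names below are illustrative, not the plug-in's actual API; the hard ceiling comes from the fact that a Java byte[] can hold at most Integer.MAX_VALUE (2147483647) bytes, so any stream larger than that cannot be fully buffered:

```java
import java.io.File;

// Sketch of the failure mode: when a stream is uploaded without a declared
// content length, the SDK buffers it into a byte[] first, and a byte[] tops
// out at Integer.MAX_VALUE bytes.
public final class UploadSizeGuard {
    /** Maximum size of a Java byte[], and so of any fully buffered stream. */
    static final long MAX_BUFFERABLE_BYTES = Integer.MAX_VALUE; // 2147483647 (~2 GB)

    /** Returns true if a stream of this length could be buffered into a byte[]. */
    static boolean canBufferInMemory(long declaredContentLength) {
        return declaredContentLength >= 0
                && declaredContentLength <= MAX_BUFFERABLE_BYTES;
    }

    /**
     * The File-based workaround: a file's length() is known up front,
     * so no buffering is needed regardless of size.
     */
    static long contentLengthFor(File file) {
        return file.length();
    }

    public static void main(String[] args) {
        System.out.println(canBufferInMemory(1_000_000L));     // true: small file
        System.out.println(canBufferInMemory(3_000_000_000L)); // false: ~3 GB stream
    }
}
```

This is why setting the content length up front (or using the File-based method) avoids both the memory cost for small files and the hard failure for files over ~2 GB.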

