AWS S3 Bucket Management

Overview

Manage AWS S3 data stores with Appian! Users can access their S3 objects directly from an Appian interface. Appian documents can be uploaded to an S3 bucket with Server-Side Encryption and configured as Public or Private.
 
The AWS S3 Connected System Plug-in uses the AWS Java SDK to connect to S3.
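
For reference, here is a minimal sketch (not the plug-in's internal code) of how a connection to S3 is typically established with the AWS SDK for Java v1, using placeholder credentials and a placeholder region:

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.regions.Regions;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class S3ConnectionSketch {
        public static void main(String[] args) {
            // Access Key Id and Secret Access Key, as entered on the connected system
            // (placeholder values; never hardcode real credentials)
            BasicAWSCredentials credentials =
                    new BasicAWSCredentials("ACCESS_KEY_ID", "SECRET_ACCESS_KEY");

            // Build an S3 client for the configured region
            AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                    .withRegion(Regions.US_EAST_1)
                    .withCredentials(new AWSStaticCredentialsProvider(credentials))
                    .build();

            System.out.println("Client created for region " + Regions.US_EAST_1.getName());
        }
    }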

Key Features & Functionality

  • Create Bucket -- Adds a new bucket to the S3 instance. A bucket is needed in order to store objects (files).
  • Upload File -- Uploads a file to any specified bucket on the S3 instance.
  • Upload Multiple Files -- Uploads multiple files to any specified bucket on the S3 instance.
  • List Buckets -- Returns all available buckets on the S3 instance.
  • List Objects -- Returns all available objects from a specified bucket (see the SDK sketch after this list).
  • Delete Bucket -- Permanently removes a bucket from the S3 instance.
  • Delete Object -- Permanently removes an object from a specified bucket.
  • Cognito Authentication -- Pulls credentials from an AWS Cognito Identity Pool.
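
A minimal sketch of how several of these operations map to AWS SDK for Java v1 calls (hypothetical bucket and object names; the plug-in wraps equivalent calls behind its integration operations):

    import java.io.File;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.Bucket;
    import com.amazonaws.services.s3.model.ListObjectsV2Result;
    import com.amazonaws.services.s3.model.S3ObjectSummary;

    public class S3OperationsSketch {
        public static void main(String[] args) {
            // Uses the default credentials provider chain for brevity
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Create Bucket
            s3.createBucket("example-appian-bucket");

            // Upload File
            s3.putObject("example-appian-bucket", "reports/report.pdf", new File("report.pdf"));

            // List Buckets
            for (Bucket bucket : s3.listBuckets()) {
                System.out.println(bucket.getName());
            }

            // List Objects
            ListObjectsV2Result objects = s3.listObjectsV2("example-appian-bucket");
            for (S3ObjectSummary summary : objects.getObjectSummaries()) {
                System.out.println(summary.getKey() + " (" + summary.getSize() + " bytes)");
            }

            // Delete Object, then Delete Bucket (a bucket must be empty before it can be deleted)
            s3.deleteObject("example-appian-bucket", "reports/report.pdf");
            s3.deleteBucket("example-appian-bucket");
        }
    }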

Requirements

Access Key Id: the access key ID for connecting to S3 can be retrieved from the AWS Management Console. Navigate to the Users > Summary page and click the "Security credentials" tab.
Secret Access Key: the secret access key can only be viewed once, when the access key is created. See the AWS Access Keys documentation for more information: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys

Notes

Users that want to upload objects as 'Public' must have the correct IAM privileges; otherwise an "Access Denied" error will be returned.
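
As an illustration only (this is not the plug-in's internal code), a 'Public' upload corresponds roughly to a put request with a public-read canned ACL, and Server-Side Encryption to SSE settings on the object metadata; the IAM user needs permissions such as s3:PutObject and s3:PutObjectAcl for the call to succeed:

    import java.io.File;

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.CannedAccessControlList;
    import com.amazonaws.services.s3.model.ObjectMetadata;
    import com.amazonaws.services.s3.model.PutObjectRequest;

    public class PublicUploadSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Request SSE-S3 (AES-256) server-side encryption for the stored object
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);

            PutObjectRequest request =
                    new PutObjectRequest("example-appian-bucket", "public/logo.png", new File("logo.png"));
            request.setMetadata(metadata);
            // 'Public' maps to a public-read canned ACL; without the right IAM
            // privileges this call fails with Access Denied
            request.setCannedAcl(CannedAccessControlList.PublicRead);

            s3.putObject(request);
        }
    }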

S3 endpoints that restrict traffic based on source will need to allow the IP addresses and VPC endpoints outlined in KB-1582 for the site's region.

Anonymous
  • Hello,

    We are currently trying to implement the AWS S3 Connected System. When setting up the connected system, we plug in the Access Key ID, Region, and Secret Access Key, but when we go to test the connection we are getting "No Owner Associated with this Account." I see below it was mentioned that the getS3AccountOwner method was used to verify the connected system. Is there a way around this? Currently we are working with no owner on the bucket we are leveraging.

  • How can I get a version compatible with Appian 19.2?

  • The updated Connected System has been listed. Thank you.

  • Thanks for bringing this to our attention. We have updated the plugin to verify a successful connection using the getS3AccountOwner method. I will let you know when the updated plugin has been added to the AppMarket.

  • The list bucket operation should not assume that it has full access in AWS to do this. Security settings may prevent the client from having this permission (listing all buckets is a security risk in a shared account).

  • So, we are trying to use the AWS S3 connected system. After providing the Access Key ID, Region, and Secret Access Key, we are getting an Access Denied error with 403 as the error code.

    After checking the logs from the AWS console, it seems that on clicking Test Connection, Appian is making a ListBucket request (which lists all available buckets). Since AWS is shared across other accounts and applications, they can't give full access to AWS S3.

    Also, on Test Connection, we are not providing any specific bucket name.

    Just for testing we changed the access to Full Access on S3 and were able to connect, but that is not advisable due to security concerns.

    Please let me know if there is any other way to make it work.
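
For context, a sketch (AWS SDK for Java v1, hypothetical bucket name) contrasting account-wide connection checks such as getS3AccountOwner, which are backed by the ListBuckets API and therefore need account-level list permission, with a check scoped to a single known bucket; whether the connected system offers a scoped test is up to the plug-in itself:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.HeadBucketRequest;

    public class ConnectionCheckSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Account-wide checks: both call the ListBuckets (GET Service) API,
            // so they require s3:ListAllMyBuckets on the whole account
            s3.getS3AccountOwner();
            s3.listBuckets();

            // Bucket-scoped check: only needs access to one known bucket,
            // e.g. s3:ListBucket on "example-appian-bucket"
            s3.headBucket(new HeadBucketRequest("example-appian-bucket"));
        }
    }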

  • The plugin will now only support Appian version 19.3 and above. If you are on version 19.2 or earlier, consider updating to 19.3 for the Download Document functionality.

  • I am getting the following error when adding the connected system to the application. Is anyone else facing the same? "Expression evaluation error [evaluation ID = CBN2NGXG] : Error creating bean with name 'plugin.[com.appian.ps.aws.s3.cst].[AWSS3ConnectedSystemTemplate]@2': Failed to introspect bean class"

  • When will the 19.2 version of this plugin be released? Without the ability to pull files back, the utility of this plugin is very limited.

  • Did you find a solution for your problem? I have the same issue. I need to download the document from my S3 bucket.